solving TF-TRT graph conversion errors

I’ve tried converting a graph in several setups, each of which produced a different error message.
Original environment: Ubuntu 16.04, CUDA 9.0.176, cuDNN 7.0.x, TITAN Xp (compute capability 6.1), NVIDIA driver 384.130, TensorRT 4.0.
Setups tried: the original environment + tensorflow-gpu==1.11.0; the original environment + tensorflow-gpu==1.12.0; an Anaconda environment (which may not match the original environment).
Solution: install cuDNN. Read more…

tensorflow tensorrt import error

The above error shows up whenever I import TensorRT from TensorFlow with the following import statement. Environment: tensorflow-gpu 1.9.0, Python 3.6.6, Ubuntu 16.04, CUDA 9.0. Solution: upgrading to tensorflow 1.10.0 made the problem disappear immediately. This solution was suggested in this thread.
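The post’s exact import statement is not shown in this excerpt, so the line below is a hypothetical illustration: in the tensorflow-gpu 1.9/1.10 era, the TF-TRT Python module was typically imported from `tensorflow.contrib`, and that is the kind of import that would trigger (or stop triggering) such an error across versions.

```python
def tf_trt_importable():
    """Return True if the contrib TF-TRT module imports cleanly.

    Hypothetical sketch: assumes tensorflow-gpu 1.x, where TF-TRT
    lived under tensorflow.contrib. On a broken install (e.g. the
    1.9.0 setup described in the post) this import is what fails.
    """
    try:
        from tensorflow.contrib import tensorrt  # noqa: F401
        return True
    except ImportError:
        return False
```

A quick call to this helper distinguishes "TF-TRT is usable" from "the import itself is broken" before attempting any conversion.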

tensorRT stuff

TensorRT support matrix: https://docs.nvidia.com/deeplearning/dgx/integrate-tf-trt/index.html#matrix
To apply the TensorRT optimizations, you need to call the create_inference_graph function. Check here for more details on this function. The graph that is fed to create_inference_graph must be frozen; to learn what exactly is meant by “freezing”, check here. For using the bare TensorRT Python module, Read more…
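The two steps above (freeze the graph, then hand it to create_inference_graph) can be sketched as follows. This is a minimal sketch of the TF 1.x contrib API, assuming tensorflow-gpu built with TensorRT support; the parameter values (batch size, workspace size, precision) are illustrative defaults, not the post’s actual settings.

```python
def freeze_and_convert(sess, output_names):
    """Freeze a session's graph, then run TF-TRT conversion on it.

    Sketch only: `sess` is assumed to be a tf.Session holding the
    trained model, and `output_names` the names of its output nodes.
    """
    import tensorflow as tf
    from tensorflow.contrib import tensorrt as trt

    # "Freezing" folds the variables' current values into the GraphDef
    # as constants -- create_inference_graph expects a frozen graph.
    frozen = tf.graph_util.convert_variables_to_constants(
        sess, sess.graph_def, output_names)

    # Replace TensorRT-compatible subgraphs with optimized engine ops.
    return trt.create_inference_graph(
        input_graph_def=frozen,
        outputs=output_names,
        max_batch_size=1,                  # largest batch the engines must serve
        max_workspace_size_bytes=1 << 30,  # scratch memory TensorRT may use
        precision_mode="FP32",             # or "FP16" / "INT8"
    )
```

The returned GraphDef can then be imported into a fresh graph and run like any other TensorFlow graph.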