
After countless tries I got confused by this issue, but I finally solved it. Now, back to the initial problem that made me lose my hair: CUDA does not seem to be used when I run my model with PyTorch 2.4.0+cu124 and onnxruntime-gpu.

2024-07-30 14:32:46.5323116 [E:onnxruntime:Default, provider_bridge_ort.cc:1745 onnxruntime::TryGetProviderInfo_CUDA] D:\a\_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1426 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "C:\Users\david\miniconda3\envs\Pose2Sim\Lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"

2024-07-30 14:32:46.5491752 [W:onnxruntime:Default, onnxruntime_pybind_state.cc:895 onnxruntime::python::CreateExecutionProviderInstance] Failed to create CUDAExecutionProvider. Please reference https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements to ensure all dependencies are met.
load C:\Users\david\.cache\rtmlib\hub\checkpoints\rtmpose-m_simcc-body7_pt-body7-halpe26_700e-256x192-4d3e73dd_20230605.onnx with onnxruntime backend
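For reference, error 126 normally means that a DLL which onnxruntime_providers_cuda.dll itself depends on (the CUDA or cuDNN runtime) cannot be found, not that the provider DLL is missing. A minimal sketch that reproduces the failure outside of onnxruntime (the path is copied from the log above):

import ctypes

dll_path = r"C:\Users\david\miniconda3\envs\Pose2Sim\Lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"

try:
    # Loading the provider DLL directly surfaces the same WinError 126
    # when one of its dependencies is not on the DLL search path.
    ctypes.WinDLL(dll_path)
    print("provider DLL loaded")
except OSError as err:
    print(err)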

I run Windows 11 with Python 3.11. I installed the following packages:

pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu124
pip install onnxruntime-gpu

The following seems to be in line with the onnxruntime-gpu requirements. Running pip install onnxruntime-gpu --extra-index-url https://aiinfra.pkgs.visualstudio.com/PublicPackages/_packaging/onnxruntime-cuda-12/pypi/simple/ as suggested there does not make any difference.

import torch
import onnxruntime as ort

print(torch.cuda.is_available(), ort.get_available_providers())
print(f'torch version: {torch.__version__}, cuda version: {torch.version.cuda}, cudnn version: {torch.backends.cudnn.version()}, onnxruntime version: {ort.__version__}')

True ['TensorrtExecutionProvider', 'CUDAExecutionProvider', 'CPUExecutionProvider']
torch version: 2.4.0+cu124, cuda version: 12.4, cudnn version: 90100, onnxruntime version: 1.18.1
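Note that ort.get_available_providers() only lists the providers the package was built with; it does not tell you whether the CUDA provider can actually load its DLLs. A minimal sketch to see what a session really ends up using (the model path is the one from the log above; CPUExecutionProvider is kept as a fallback):

import onnxruntime as ort

model_path = r"C:\Users\david\.cache\rtmlib\hub\checkpoints\rtmpose-m_simcc-body7_pt-body7-halpe26_700e-256x192-4d3e73dd_20230605.onnx"

# Request CUDA first; onnxruntime falls back to CPU (with the warning
# shown above) if the CUDA provider fails to load.
session = ort.InferenceSession(
    model_path,
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
)

# get_providers() on the session reports what actually loaded.
print(session.get_providers())

With the failure above, this should print only ['CPUExecutionProvider'].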



Running Dependency Walker on the missing library gives this. It seems like onnxruntime-gpu is expecting cuDNN 8.x. Why? That is not what the docs say, unless I am missing something.

[Screenshot: Dependency Walker output for onnxruntime_providers_cuda.dll]
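To double-check this, here is a quick sketch that lists the cuDNN DLLs Windows can actually see on PATH. The assumption is only about file names: a build linked against cuDNN 8 looks for cudnn64_8.dll, while cuDNN 9 installs cudnn64_9.dll.

import glob
import os

# List every cudnn64_*.dll reachable through the DLL search path (PATH).
for directory in os.environ.get("PATH", "").split(os.pathsep):
    for hit in glob.glob(os.path.join(directory, "cudnn64_*.dll")):
        print(hit)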

1 Answer


Please check whether you've installed the CUDA toolkit and cuDNN correctly.

Run this command to check your CUDA version: nvcc -V

Then determine whether your CUDA directory includes a compatible version of cuDNN.
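For example, a minimal sketch of those checks from Python (this assumes nvcc is on PATH when the full CUDA toolkit is installed, and uses PyTorch only to report the cuDNN version it manages to load):

import shutil
import subprocess

import torch

# 1) CUDA toolkit / nvcc version, if the toolkit is installed.
nvcc = shutil.which("nvcc")
if nvcc:
    print(subprocess.run([nvcc, "-V"], capture_output=True, text=True).stdout)
else:
    print("nvcc not found on PATH")

# 2) cuDNN version seen by PyTorch. 90100 means cuDNN 9.1.0, which is
#    not the 8.x series the provider DLL appears to be looking for.
print("cuDNN via torch:", torch.backends.cudnn.version())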


1 Comment

This does not provide an answer to the question. Once you have sufficient reputation you will be able to comment on any post; instead, provide answers that don't require clarification from the asker. - From Review
