[Help] WSL Docker cannot use CUDA cores #328
-
Problem Overview
The WSL Docker GPU version fails to transcribe subtitles correctly.
Replies: 11 comments
-
To run NVIDIA GPU Docker under WSL, you need to follow the official documentation: https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/latest/install-guide.html
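After installing the NVIDIA Container Toolkit, a quick way to confirm Docker can reach the GPU from WSL is to run `nvidia-smi` inside a CUDA base image. A minimal sketch (the image tag is only an example; choose one compatible with your host driver):

```shell
# Verify the NVIDIA Container Toolkit wiring in WSL:
# --gpus all exposes the host GPU to the container, and nvidia-smi
# should print the driver version and device table if it works.
docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi
```

If this fails with an error about the `nvidia` runtime or no devices found, the toolkit installation or the Windows-side driver is the problem, not the application image.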
-
Already installed it.
-
OK.
-
nvcc: NVIDIA (R) Cuda compiler driver
-
Try running the recognition module on its own and check its output.
-
Or just pull an image directly and verify there.
-
@timerring Sorry, I've been a bit busy lately.
-
No worries. If the driver works correctly, you can verify directly inside the Docker container whether torch can access the CUDA cores:

```python
import torch

def check_cuda_with_pytorch():
    """Check if the PyTorch CUDA environment is working correctly."""
    try:
        print("Checking PyTorch CUDA environment:")
        if torch.cuda.is_available():
            print(f"CUDA device is available, the current CUDA version is: {torch.version.cuda}")
            print(f"PyTorch version is: {torch.__version__}")
            print(f"Detected {torch.cuda.device_count()} CUDA devices.")
            for i in range(torch.cuda.device_count()):
                print(f"Device {i}: {torch.cuda.get_device_name(i)}")
                print(f"Device {i} total memory: {torch.cuda.get_device_properties(i).total_memory / (1024 ** 3):.2f} GB")
                print(f"Device {i} currently allocated memory: {torch.cuda.memory_allocated(i) / (1024 ** 3):.2f} GB")
                print(f"Device {i} reserved memory: {torch.cuda.memory_reserved(i) / (1024 ** 3):.2f} GB")
        else:
            print("CUDA device is not available.")
    except Exception as e:
        print(f"Error when checking PyTorch CUDA environment: {e}")

if __name__ == "__main__":
    check_cuda_with_pytorch()
```
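If you only need a yes/no answer inside the container, the check above collapses to a couple of lines. A minimal sketch using only public `torch.cuda` APIs:

```python
# Minimal probe: does this environment's torch build see any CUDA device?
import torch

print(f"CUDA available: {torch.cuda.is_available()}")
print(f"Visible CUDA devices: {torch.cuda.device_count()}")
```

`is_available()` returning False inside the container while `nvidia-smi` works on the host usually points at a CPU-only torch build or a missing `--gpus all` flag on `docker run`.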
-
Looks like it's not available.
-
OK, then it's most likely still
-
Pulled the latest image again, and the problem is solved.