#12 How to run inference on multiple GPUs?

Status: open
Owner: nobody
Labels: None
Created: 2022-10-21
Updated: 2022-10-21
Creator: Anonymous
Private: No

Originally created by: Athena-I

I tried to run inference on an A30, but got an error: RuntimeError: CUDA out of memory. How can I run inference on multiple cards?
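A minimal sketch of one common fix, assuming the model is a Hugging Face Transformers checkpoint loadable with `AutoModelForCausalLM` and that the `accelerate` package is installed (the model name below is a placeholder, not this project's checkpoint): passing `device_map="auto"` shards the model's layers across all visible GPUs, so a model too large for a single A30 can still be loaded.

```python
# Minimal multi-GPU inference sketch. Assumptions: the checkpoint loads via
# Hugging Face Transformers, and the `accelerate` package is installed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2-large"  # placeholder; substitute the real checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_name)

# device_map="auto" splits the model's layers across every visible GPU
# (spilling to CPU RAM if needed) instead of loading it all on one card.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    device_map="auto",
    torch_dtype=torch.float16,  # half precision roughly halves GPU memory
)

inputs = tokenizer("Hello, world", return_tensors="pt").to(model.device)
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Note that this layout runs the forward pass layer-by-layer across the cards (pipeline-style model parallelism), so it lowers per-GPU memory rather than speeding things up. If the model already fits on one GPU and the OOM comes from batch size, reducing the batch or using float16 alone may be enough.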

Discussion

