
Revision history

initial version

Okay, so I think I solved the problem. After reinstalling the graphics card driver, everything seems to function properly: both gzclient and gzserver are listed as processes on the GPU when I run nvidia-smi. That said, it looks like only 8-9% of the GPU (about 550 MiB of 7982 MiB) is being utilized. Has anyone achieved better utilization of their GPU, or does this look about right?
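For reference, the memory figures above can be pulled from nvidia-smi in machine-readable form. A minimal sketch, assuming the standard `--query-gpu`/`--format=csv` options (the `gpu_memory_utilization` helper is just for illustration):

```python
import subprocess

def gpu_memory_utilization(csv_line):
    # csv_line looks like "550, 7982" (MiB used, MiB total) when nvidia-smi
    # is called with --format=csv,noheader,nounits
    used, total = (float(x) for x in csv_line.split(","))
    return used / total

# On a machine with an NVIDIA GPU, the line would normally come from:
#   subprocess.check_output(
#       ["nvidia-smi", "--query-gpu=memory.used,memory.total",
#        "--format=csv,noheader,nounits"], text=True)
print(gpu_memory_utilization("550, 7982"))  # fraction of VRAM in use
```

Note this reports memory occupancy; the separate "GPU-Util" column in nvidia-smi reports compute utilization, and the two can differ quite a bit.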

Edit: I wanted to add that the whole reason for this question is that the simulation's RTF started out at 0.9 and would deteriorate to 0.1. I am continuously adding models, which definitely contributes to the problem, but I'm deleting them at the same rate, so the RTF shouldn't deteriorate that much over time. Is there something else I'm missing or could be improving on? I'd appreciate any thoughts.
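For anyone unfamiliar with the metric: the real-time factor Gazebo reports is just elapsed simulation time divided by elapsed wall-clock time. A minimal sketch of the arithmetic:

```python
# Real-time factor (RTF): how fast simulated time advances relative to
# wall-clock time. RTF = 1.0 is real time; 0.1 means the sim runs 10x slower.
def real_time_factor(sim_elapsed_s, wall_elapsed_s):
    return sim_elapsed_s / wall_elapsed_s

# An RTF of 0.9 means 0.9 s of simulated time passes per 1 s of wall time:
print(real_time_factor(0.9, 1.0))   # 0.9
print(real_time_factor(1.0, 10.0))  # 0.1
```

So a drop from 0.9 to 0.1 means each simulated second is taking roughly nine times longer in wall-clock terms than it did at startup, which is consistent with per-step cost growing (e.g. from accumulating collision or contact work) rather than a fixed overhead.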