gzclient no offloading to GPU [closed]
Hi,
Is there an argument one can pass, or a config file setting, to make gzclient use the GPU for rendering?
I have an NVIDIA card installed, and the drivers look fine:
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 390.77 Driver Version: 390.77 |
|-------------------------------+----------------------+----------------------+
| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |
|===============================+======================+======================|
| 0 Quadro P400 Off | 00000000:65:00.0 On | N/A |
| 34% 32C P8 N/A / N/A | 143MiB / 1991MiB | 0% Default |
+-------------------------------+----------------------+----------------------+
+-----------------------------------------------------------------------------+
| Processes: GPU Memory |
| GPU PID Type Process name Usage |
|=============================================================================|
| 0 1143 G /usr/lib/xorg/Xorg 69MiB |
| 0 1180 G /usr/bin/gnome-shell 72MiB |
+-----------------------------------------------------------------------------+
But when I start gzclient and gzserver, they show no GPU load at all.
On another machine I saw that gzclient in particular had a process running on the GPU.
The Gazebo version is:
Gazebo multi-robot simulator, version 9.0.0
Copyright (C) 2012 Open Source Robotics Foundation.
Released under the Apache 2 License.
http://gazebosim.org
Any ideas what I missed?
BR
This issue arose because I logged in via remote desktop. That started a plain Xorg session rather than an NVIDIA-accelerated Xorg session, which caused the problem. I switched to VNC and it is working now.
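A quick way to confirm whether your current session is the problem is to check which OpenGL renderer it exposes. This is a generic diagnostic sketch (it assumes `glxinfo` from the `mesa-utils` package is installed), not something specific to Gazebo:

```shell
# Print the OpenGL renderer the current X session provides.
# On an NVIDIA-accelerated session this should name the card
# (e.g. the Quadro P400); if it reports "llvmpipe" or another
# Mesa software renderer, gzclient will render on the CPU.
glxinfo | grep "OpenGL renderer"
```

Run this in the same session from which you launch gzclient; a remote-desktop session that bypassed the NVIDIA driver will typically report a software renderer here.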
Can you please provide some details on how exactly you did this? I am facing the same problem, so I would be really happy if you could provide the steps.