Running Gazebo under Docker with VirtualGL
I'm trying to run the LRAUV Simulator on a headless system with an NVIDIA GPU, which I access over VNC.
The simulator is published as a Docker container. I modified this container to add VirtualGL, along with the libgl1-mesa-glx and libegl1-mesa packages.
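For reference, the image modification is roughly the following (simplified sketch; the VirtualGL version is just an example, and I rebuild under the same tag):

# Published LRAUV simulator image, rebuilt under the same tag after the changes below
FROM osrf/lrauv:latest

# Switch to root in case the base image defaults to an unprivileged user
USER root

# Mesa client-side GL/EGL libraries that VirtualGL's interposer needs
RUN apt-get update \
 && apt-get install -y --no-install-recommends libgl1-mesa-glx libegl1-mesa \
 && rm -rf /var/lib/apt/lists/*

# VirtualGL .deb downloaded beforehand from the project's release page
COPY virtualgl_3.1_amd64.deb /tmp/
RUN apt-get install -y /tmp/virtualgl_3.1_amd64.deb \
 && rm /tmp/virtualgl_3.1_amd64.deb

# Drop back to the image's regular user
USER developer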
Then, following these examples of running a hardware-accelerated program inside a container with VirtualGL, I invoke Gazebo like this:
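# On the host: merge VirtualGL's xauth key, then write a wildcard-family copy of the X cookie for the container to use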
xauth merge /etc/opt/VirtualGL/vgl_xauth_key
rm -f ~/.Xauthority.docker
xauth nlist $DISPLAY :0.0 | sed -e 's/^..../ffff/' | xauth -f ~/.Xauthority.docker nmerge -
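# Run the simulator with GPU access, the host X socket, and the generated cookie mounted read-only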
docker run --rm -it \
--user $(id -u):$(getent group vglusers | cut -d : -f 3) \
--gpus all \
--device=/dev/dri \
--ipc host \
-e DISPLAY \
-e MESA_GL_VERSION_OVERRIDE=3.3 \
--mount=type=bind,src=$HOME/.Xauthority.docker,dst=/home/developer/.Xauthority,ro \
--mount=type=bind,src=/tmp/.X11-unix,dst=/tmp/.X11-unix,ro \
osrf/lrauv:latest \
vglrun gz sim -v4
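As a sanity check, replacing vglrun gz sim -v4 in the command above with the line below (glxinfo comes from the mesa-utils package, which may need to be added to the image) should report the NVIDIA GPU as the OpenGL renderer rather than llvmpipe if VirtualGL is doing its job:

vglrun glxinfo -B | grep -i "opengl renderer"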
This appears to work, and I can see the simulated environment if I load some of the examples like multi_lrauv_race.sdf.
However, if I try to run the tethys_at_portuguese_ledge.sdf or tethys_at_empty_environment.sdf files, Gazebo starts and then terminates. Here is the complete log, with the part that looks like the error shown below.
I am unsure where to go from here, given that other SDF files work fine. Is this some unexpected behaviour caused by the way I'm running it with VirtualGL? Is this a bug in Gazebo? Or is it an issue with the SDF files I'm trying to load?
libEGL warning: DRI2: failed to create dri screen
libEGL warning: DRI2: failed to create dri screen
[Err] [Ogre2RenderEngine.cc:1098] Unable to create the rendering window: OGRE EXCEPTION(3:RenderingAPIException): Unable to create a suitable GLXContext in GLXContext::GLXContext at /home/jenkins/workspace/ogre-2.3-debbuilder/repo/RenderSystems/GL3Plus/src/windowing/GLX/OgreGLXContext.cpp (line 61)
(... above repeated a few times ...)
[Err] [Ogre2RenderEngine.cc:1106] Unable to create the rendering window after [11] attempts.
[Err] [Ogre2RenderEngine.cc:1017] Failed to create dummy render window.
Stack trace (most recent call last) in thread 125:
#24 Object "[0xffffffffffffffff]", at 0xffffffffffffffff, in
#23 Object "/usr/lib/x86_64-linux-gnu/libc.so.6", at 0x7fa89a64f132, in clone
#22 Object "/usr/lib/x86_64-linux-gnu/libpthread.so.0", at 0x7fa89a119608, in
#21 Object "/usr/lib/x86_64-linux-gnu/libstdc++.so.6", at 0x7fa896708de3, in
#20 Object "/home/developer/gz_ws/install/lib/gz-sim-7/plugins/libgz-sim-sensors-system.so", at 0x7fa85c1a85b7, in gz::sim::v7::systems::SensorsPrivate::RenderThread()
#19 Object "/home/developer/gz_ws/install/lib/gz-sim-7/plugins/libgz-sim-sensors-system.so", at 0x7fa85c1a6f08, in gz::sim::v7::systems::SensorsPrivate::WaitForInit()
#18 Object "/home/developer/gz_ws/install/lib/libgz-sim7-rendering.so.7", at 0x7fa85c0c4c23, in gz::sim::v7::RenderUtil::Init()
#17 Object "/home/developer/gz_ws/install/lib/libgz-rendering7.so.7", at 0x7fa85bec9b57, in gz::rendering::v7::RenderEngineManager::Engine(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::map<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::less<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char ...