
Gazebo System Requirements

asked 2017-01-01 19:02:14 -0600

Weiwei

Hi all,

I intend to buy a new laptop and configure a PC, and I wish to run Gazebo under Ubuntu. My experimental background is quadrotor swarms (more than 15 vehicles) whose software is based on PX4 SITL with Gazebo 6.6.

I searched the internet and found similar answers, but they were a little outdated, so I would like to know:

  • Is the NVIDIA GeForce card the bottleneck for Gazebo performance (real-time factor)?
  • In your experience with Gazebo-based UAV swarm simulation, which processor and graphics card would you recommend?


I'm running Gazebo on an octa-core machine with 8 GB RAM and an NVIDIA Quadro 5000, and it's already kind of laggy for my single-robot simulation :D. The simulation takes up roughly 3 cores (running i3 instead of KDE), so I suspect my graphics card is what slows it down.

Lyndwyrm ( 2017-01-02 08:44:26 -0600 )

@Lyndwyrm Thank you for your reply. :)

Weiwei ( 2017-01-03 19:38:36 -0600 )

1 Answer


answered 2017-01-04 08:58:51 -0600

nkoenig

There are multiple bottlenecks in simulation, which depend on your use case. For example:

  1. If you have lots of camera sensors, or a few high-resolution cameras, then rendering can be a bottleneck.
  2. If you have a robot with many degrees of freedom, then physics (CPU) will be a bottleneck.
  3. It is also likely that you have a combination of 1 & 2.

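A rough way to see which bottleneck you are hitting is to watch the real-time factor that the `gz stats` command prints while the simulation runs. Below is a minimal sketch that averages the factor over several samples; the `Factor[...] SimTime[...]` line format is an assumption based on classic Gazebo's console output, so check what your version actually prints:

```python
import re

# Hypothetical sample of `gz stats` output (format assumed from classic
# Gazebo; adjust the regex if your version prints differently).
sample = """\
Factor[0.85] SimTime[10.00] RealTime[11.76] Paused[F]
Factor[0.88] SimTime[10.50] RealTime[12.33] Paused[F]
Factor[0.90] SimTime[11.00] RealTime[12.88] Paused[F]
"""

def average_rtf(stats_output: str) -> float:
    """Average the real-time factor across all sampled lines."""
    factors = [float(m) for m in re.findall(r"Factor\[([\d.]+)\]", stats_output)]
    return sum(factors) / len(factors)

print(round(average_rtf(sample), 3))  # prints 0.877
```

If the factor drops mainly when cameras are enabled, rendering (GPU) is the likely culprit; if it drops as you add vehicles with no cameras, physics (CPU) is.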
We recommend, and develop Gazebo on, a fairly modern computer (within the last two years). Not bleeding edge, but not 5 years old either.

Keep in mind that throwing more expensive hardware at a problem is not usually the best solution. Chances are you should modify simulation parameters to achieve the desired results.
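For example, the physics settings in your world's SDF are a common place to trade accuracy for speed. The element names below follow the SDF physics specification; the values are illustrative starting points, not recommendations:

```xml
<!-- Inside your .world file -->
<physics type="ode">
  <!-- Larger step = faster simulation, less accurate contacts -->
  <max_step_size>0.004</max_step_size>
  <!-- step_size * update_rate = 1.0 targets a real-time factor of 1 -->
  <real_time_update_rate>250</real_time_update_rate>
  <ode>
    <solver>
      <type>quick</type>  <!-- iterative solver; faster than "world" -->
      <iters>30</iters>   <!-- fewer iterations = faster, softer contacts -->
    </solver>
  </ode>
</physics>
```

For a 15+ vehicle swarm, reducing sensor update rates and camera resolutions in each model's SDF usually helps more than any single hardware upgrade.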



@nkoenig Thank you for your kind reply. Does the NVIDIA graphics card speed up the rendering or the ODE calculation, or does the whole ODE burden fall on the CPUs?

Weiwei ( 2017-01-04 16:52:42 -0600 )

Currently ODE uses only the CPU.

nkoenig ( 2017-01-04 21:46:29 -0600 )

@nkoenig Great. So the NVIDIA graphics card only accelerates the rendering workload?

Weiwei ( 2017-01-04 22:32:16 -0600 )


Seen: 2,868 times

Last updated: Jan 04 '17