
2016-03-09 14:00:17 -0500 asked a question sensors and physics synchronization

Hi, I'm currently working with Gazebo, trying to do some reinforcement learning, using only a camera for the moment. As a consequence, I want Gazebo to run as fast as possible during learning. As I understand it, the best way to do this is to set "real_time_update_rate" to 0 in the world file. I've been doing some tests, and it appears that the simulation largely outpaces the camera: when I dump the frames, I see a massive number of lost frames, which is not very convenient from a stability point of view for reinforcement learning. As far as I understand the code behind the sensors, at each iteration of the sensor thread there is a check whether enough simulation time has elapsed to update the sensor, and if too much has elapsed, it can compensate by updating faster next time.
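For reference, this is roughly how I set it (a minimal sketch of the relevant `<physics>` block in the world file; the surrounding tags and step size are just an example):

```xml
<!-- In the .world file: let the physics engine run unthrottled -->
<physics type="ode">
  <!-- 0 means no real-time throttling: step as fast as the CPU allows -->
  <real_time_update_rate>0</real_time_update_rate>
  <max_step_size>0.001</max_step_size>
</physics>
```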

Apparently the sensor thread is so far behind here that it simply gives up catching up and moves on, hence the lost frames.

So is there a way to make the physics engine wait for the sensor to be updated? It will certainly make the simulation slower, but it would be useless otherwise.

Furthermore, I've been playing with real_time_update_rate, max_step_size, and update_rate for the camera, to check how many engine steps there were between each frame I dumped.

In theory, the real-time factor is real_time_update_rate * max_step_size, so if for example I set the three parameters like this:

real_time_update_rate = 100
max_step_size = 0.01
update_rate = 10

I should get a frame every 10 steps, which I do. Now when I set it like this:

real_time_update_rate = 10
max_step_size = 0.1
update_rate = 10

(the parameters are not realistic, that's just for the test) I should get a frame every step, right? In fact I get approximately 3 frames per step. How is that possible? Isn't the sensor supposed to wait until enough simulation time has elapsed?
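To make the arithmetic explicit, here is a tiny sanity check (plain Python; the function names are my own, it just restates the simple model "camera period divided by step size"):

```python
def steps_per_frame(max_step_size, update_rate):
    """Engine steps expected between two camera frames:
    the camera period (1 / update_rate) divided by the physics step size."""
    camera_period = 1.0 / update_rate
    return camera_period / max_step_size

def real_time_factor(real_time_update_rate, max_step_size):
    """Target ratio of simulation time to wall-clock time."""
    return real_time_update_rate * max_step_size

# First configuration: one frame every 10 steps, which matches what I observe.
assert abs(steps_per_frame(max_step_size=0.01, update_rate=10) - 10) < 1e-9
assert abs(real_time_factor(100, 0.01) - 1.0) < 1e-9

# Second configuration: one frame per step is what this model predicts,
# yet I observe roughly 3 frames per step instead.
assert abs(steps_per_frame(max_step_size=0.1, update_rate=10) - 1) < 1e-9
assert abs(real_time_factor(10, 0.1) - 1.0) < 1e-9
```

Both configurations have the same real-time factor of 1; only the expected steps-per-frame differs, which is why the ~3 frames per step surprises me.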

TL;DR: How can I synchronize the engine and the sensors to make sure that every simulation run is deterministic, even with different computing speeds?

Thanks in advance !