Gazebo | Ignition | Community

Custom sensors and materials simulation in Gazebo: how to simulate specific sensor behaviors for heterogeneous materials / buildings.

asked 2022-02-28 12:20:17 -0500

Anil Narassiguin gravatar image

We are currently working on a simulation for a drone that would operate in an urban environment. It would fly in front of facades and must therefore avoid collisions with walls, balconies, windows, etc. To fulfil this task, we added several sensors (lidars, sonars and cameras) to the drone, which jointly understand the surrounding environment by fusing the sensors' information. One interesting case would be to have windows that are invisible to lidars and stereo cameras but visible to sonar. Even though the use case seems simple, we couldn't find any resources or information on simulating different materials (especially their opacity or transparency) or on modeling sensor behavior according to the object being faced (beyond transparency, it would also be interesting to specify different error noise depending on the surface seen by a sensor).

Do you think Gazebo is the correct simulation environment to test this kind of behavior? If yes:

  • Is there a way to directly specify textures or opacity characteristics that have physical meaning and are not only visual?
  • Is there a way to program the sensors to behave differently according to the material they are (or are not) detecting?

Thank you!


1 Answer


answered 2022-05-16 00:36:05 -0500

Veerachart gravatar image

updated 2022-05-16 00:36:32 -0500

Cameras see the <visual> part of your models, taking the transparency into account.

GPU lidars also see the <visual>s, but transparent meshes are still sensed by them. CPU lidars, on the other hand, detect <collision>s. I think sonars also use <collision>s, but I'm not sure.

A simple way (though possibly not great in terms of performance) is to use different meshes for collisions and visuals, i.e. something may be visible to the camera but not collidable for the robot, and something may be invisible but still block the robot's way. This may help you in some cases, but perhaps not all.
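To make the split between visual and collision geometry concrete, here is a minimal SDF sketch of a window pane. It is an illustrative example, not from the original thread: the model and element names are made up, and the exact sensor behavior depends on your Gazebo version. The highly transparent <visual> lets cameras mostly see through the pane (while a GPU lidar still ray-casts against the mesh), and the separate <collision> is what CPU-based ray sensors and the physics engine react to.

```xml
<model name="window">
  <static>true</static>
  <link name="pane">
    <!-- Visual geometry: cameras render this, and with high
         transparency they mostly see through it. -->
    <visual name="glass_visual">
      <geometry>
        <box><size>1.0 0.01 1.0</size></box>
      </geometry>
      <transparency>0.9</transparency>
    </visual>
    <!-- Collision geometry: CPU ray sensors detect this, and it
         blocks the drone physically. Omit it (or the visual) to
         make the pane one-sidedly "invisible". -->
    <collision name="glass_collision">
      <geometry>
        <box><size>1.0 0.01 1.0</size></box>
      </geometry>
    </collision>
  </link>
</model>
```

Dropping the <collision> element would make the pane invisible to CPU lidars while cameras still render it faintly; dropping the <visual> instead hides it from cameras and GPU lidars but keeps it detectable by collision-based sensors.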

Beyond this, you may need to modify the sensors, or add your own sensor plugin.
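If you go the plugin route, in Gazebo classic a custom sensor behavior is attached to the sensor in SDF. The snippet below is a hypothetical sketch: the plugin name and library filename are invented for illustration. Such a plugin would typically subclass the lidar's sensor plugin class (e.g. RayPlugin in Gazebo classic) and post-process the raw ranges, for instance dropping or adding noise to returns that hit known glass regions.

```xml
<sensor name="front_lidar" type="ray">
  <ray>
    <scan>
      <horizontal>
        <samples>360</samples>
        <min_angle>-3.14159</min_angle>
        <max_angle>3.14159</max_angle>
      </horizontal>
    </scan>
    <range>
      <min>0.1</min>
      <max>30.0</max>
    </range>
  </ray>
  <!-- Hypothetical custom plugin that rewrites the raw ranges
       according to the material being hit. -->
  <plugin name="material_aware_lidar" filename="libMaterialAwareLidar.so"/>
</sensor>
```

This keeps the material-dependent logic in one place (the plugin) rather than scattered across the world file.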



