Custom sensors and materials simulation in Gazebo: how to simulate specific sensor behaviors for heterogeneous materials / buildings.
We are currently working on a simulation for a drone that operates in an urban environment. It will fly close to building facades and must therefore avoid colliding with walls, balconies, windows, etc. To accomplish this, we added several sensors (lidars, sonars and cameras) to the drone, which jointly perceive the surrounding environment through sensor fusion. One interesting case would be windows that are invisible to lidars and stereo cameras but visible to sonar. Although the use case seems simple, we couldn't find any resources or information on simulating different materials (especially their opacity or transparency) or on modeling sensor behavior according to the object being observed (beyond transparency, it would also be interesting to specify different noise models depending on the surface a sensor sees).
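For context, the closest mechanisms we found in SDF are the `<transparency>` element of a visual and the `<laser_retro>` element of a collision, but as far as we understand, neither gives a window the behavior we want. A minimal sketch of a window link (all geometry values here are illustrative):

```xml
<!-- Sketch of a window link in SDF. As far as we can tell,
     <transparency> affects rendering only, and <laser_retro> only sets
     the intensity returned to a ray sensor; neither makes the surface
     actually invisible to a simulated lidar ray, which is the gap we
     are asking about. -->
<link name="window">
  <visual name="glass_visual">
    <geometry><box><size>1 0.01 1.5</size></box></geometry>
    <transparency>0.9</transparency>  <!-- visual rendering only -->
  </visual>
  <collision name="glass_collision">
    <geometry><box><size>1 0.01 1.5</size></box></geometry>
    <laser_retro>0.0</laser_retro>    <!-- returned intensity, not visibility -->
  </collision>
</link>
```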
Do you think Gazebo is the correct simulation environment to test this kind of behavior? If yes:
- Is there a way to directly specify textures or material properties (e.g. opacity or transparency) that have physical meaning for sensors, rather than being purely visual?
- Is there a way to program the sensors to behave differently according to the material they are facing (including not detecting it at all)?
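To make the second question concrete, here is a standalone sketch (plain Python, not any Gazebo API) of the kind of per-material sensor response we would like to inject, e.g. from a custom sensor plugin. All material names, visibility flags and noise values are assumptions for illustration:

```python
import random

# Hypothetical lookup table: material -> per-sensor visibility and range noise.
MATERIAL_MODEL = {
    "concrete": {"lidar": True,  "sonar": True, "sigma": 0.02},
    "glass":    {"lidar": False, "sonar": True, "sigma": 0.05},  # invisible to lidar
    "metal":    {"lidar": True,  "sonar": True, "sigma": 0.01},
}

def simulate_return(sensor, material, true_range, rng=random):
    """Return a noisy range reading, or None when the surface is
    invisible to this sensor type (the ray passes through)."""
    model = MATERIAL_MODEL.get(material)
    if model is None or not model[sensor]:
        return None  # no echo: sensor does not see this material
    return true_range + rng.gauss(0.0, model["sigma"])

# A lidar ray hitting a glass window yields no return, a sonar ping does:
assert simulate_return("lidar", "glass", 4.0) is None
assert simulate_return("sonar", "glass", 4.0) is not None
```

This is the behavior we would like to express inside the simulator itself, rather than re-implementing it as post-processing on the sensor topics.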
Thank you!