unexpected results from laser sensors

Hi

I've configured my robot with two inclined laser scanners, then I wrote a plugin to record the laser data and the ground-truth pose of the robot. I've tried a very simple environment containing only the z = 0 plane.

I noticed that when I take the data acquired by the sensors, convert them from polar to Cartesian coordinates and compose them with the robot pose, the reconstructed plane is slightly off from the z = 0 plane: the minimum height of the point cloud is about 0.002 m (~2 mm), while the maximum height is about 0.015 m (~1.5 cm).
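
To make the reconstruction step concrete, here is a minimal self-contained sketch of what my pipeline does (this is not the actual plugin; the mounting pose, tilt and scan angles are placeholder values). It converts each beam from polar to Cartesian in the sensor frame and chains the sensor-to-robot and robot-to-world transforms. When the ranges are computed analytically against the z = 0 plane, the reconstructed heights come out zero up to floating-point error, so the offsets I observe seem to enter through the simulated ranges rather than through the transform math.

```cpp
#include <cmath>
#include <cstdio>

struct Pose { double x, y, z, roll, pitch, yaw; };
struct Vec3 { double x, y, z; };

// Rotate v by roll/pitch/yaw (R = Rz(yaw) * Ry(pitch) * Rx(roll)).
static Vec3 Rotate(const Pose &p, const Vec3 &v) {
  const double cr = std::cos(p.roll),  sr = std::sin(p.roll);
  const double cp = std::cos(p.pitch), sp = std::sin(p.pitch);
  const double cy = std::cos(p.yaw),   sy = std::sin(p.yaw);
  return {cy*cp*v.x + (cy*sp*sr - sy*cr)*v.y + (cy*sp*cr + sy*sr)*v.z,
          sy*cp*v.x + (sy*sp*sr + cy*cr)*v.y + (sy*sp*cr - cy*sr)*v.z,
          -sp*v.x   +  cp*sr*v.y             +  cp*cr*v.z};
}

// Express a point given in the pose's frame in the parent frame (rotate, then translate).
static Vec3 Transform(const Pose &p, const Vec3 &v) {
  const Vec3 r = Rotate(p, v);
  return {r.x + p.x, r.y + p.y, r.z + p.z};
}

int main() {
  // Placeholder mounting: laser 0.5 m above the base link, pitched 0.3 rad downwards.
  const Pose sensorInRobot = {0.1, 0.0, 0.5, 0.0, 0.3, 0.0};
  const Pose robotInWorld  = {0.0, 0.0, 0.2, 0.0, 0.0, 0.0};  // ground-truth robot pose

  // Sensor origin expressed in the world frame.
  const Vec3 origin = Transform(robotInWorld, Transform(sensorInRobot, {0.0, 0.0, 0.0}));

  for (int i = 0; i <= 20; ++i) {
    const double a = -1.0 + 0.1 * i;  // scan bearing [rad]

    // Beam direction in the world frame (rotation only).
    const Vec3 dir = Rotate(robotInWorld,
                            Rotate(sensorInRobot, {std::cos(a), std::sin(a), 0.0}));
    if (dir.z >= 0.0) continue;  // this beam never reaches the floor

    // Ideal range to the z = 0 plane; the recorded scan would supply this value instead.
    const double range = -origin.z / dir.z;

    // Reconstruction as described above: polar -> Cartesian -> robot frame -> world frame.
    const Vec3 pSensor = {range * std::cos(a), range * std::sin(a), 0.0};
    const Vec3 pWorld  = Transform(robotInWorld, Transform(sensorInRobot, pSensor));
    std::printf("bearing %+.2f rad  range %.3f m  z %+.2e m\n", a, range, pWorld.z);
  }
  return 0;
}
```

Feeding the ranges recorded from Gazebo into this same pipeline is what gives the 2 mm to 1.5 cm heights described above.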

The curious thing is that the height is largest for beams acquired at small scan angles and decreases towards the lateral angles. Since no noise is added, I was expecting all the points to lie exactly at z = 0.

Why doesn't this happen? Is it related to the ray-tracing process performed in the simulation? Does it depend on the surface of the objects?
