
Defining a Stereo Camera in URDF for usage in Gazebo Simulations

asked 2016-08-18 18:11:26 -0600 by atomoclast

Hello all,

I have been struggling for the past few days to put a Gazebo simulation together, and the hardest part has proven to be getting sensor data integrated into the simulation. I am trying to write a chunk of URDF that represents a stereo camera. It launches with no errors, but none of the sensor-related topics I expect ever show up when I list the available topics. Can anyone help me sort out the issue?

Thanks in advance!

The relevant section from my URDF is as follows:

<link
name="ZedCamera">
<inertial>
  <origin
    xyz="-0.015875 1.2526E-17 1.4088E-05"
    rpy="0 0 0" />
  <mass
    value="0.16007" />
  <inertia
    ixx="0.00039322"
    ixy="-3.7717E-21"
    ixz="1.1677E-22"
    iyy="2.5042E-05"
    iyz="-3.0636E-21"
    izz="0.00039512" />
</inertial>
<visual>
  <origin
    xyz="0 0 0"
    rpy="0 0 0" />
  <geometry>
    <mesh
      filename="package://dante_eeTest/meshes/ZedCamera.STL" />
  </geometry>
  <material
    name="">
    <color
      rgba="0.50196 0.50196 0.50196 1" />
  </material>
</visual>
<collision>
  <origin
    xyz="0 0 0"
    rpy="0 0 0" />
  <geometry>
    <mesh
      filename="package://dante_eeTest/meshes/ZedCamera.STL" />
  </geometry>
</collision>
<map name="sensor" flag="gazebo">
  <verbatim key="zed_camera">
    <controller:ros_stereo_camera name="$zed_gz_controller" plugin="libros_stereo_camera.so">
      <alwaysOn>true</alwaysOn>
      <updateRate>60.0</updateRate>
      <leftCamera>l_sensor</leftCamera>
      <rightCamera>r_sensor</rightCamera>
      <topicName>/raw_stereo</topicName>
      <frameName>optical_frame</frameName>
      <CxPrime>960</CxPrime>
      <Cx>960</Cx>
      <Cy>540</Cy>
      <focal_length>1.4</focal_length> <!-- image_width / (2*tan(hfov_radian /2)) -->
      <distortion_k1>-0.16</distortion_k1>
      <distortion_k2>0</distortion_k2>
      <distortion_k3>0</distortion_k3>
      <distortion_t1>0</distortion_t1>
      <distortion_t2>0</distortion_t2>
      <baseline>.12</baseline> <!-- home pos. of robot has +x forward, +y left -->
      <interface:stereocamera name="zed_iface" />
    </controller:ros_stereo_camera>
  </verbatim>
</map>

</link>


Comments

I have used a stereo camera once, but it was not a ZED. I used the camera plugin from Gazebo (http://gazebosim.org/tutorials?tut=ros_gzplugins) and a ROS node (stereo_image_proc) to combine the two cameras: http://wiki.ros.org/stereo_image_proc. A minimal launch sketch for that node is shown below this comment.

Brosseau.F ( 2016-08-19 01:24:56 -0600 )
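For reference, a minimal launch sketch of that stereo_image_proc setup, assuming the Gazebo camera plugin publishes the left/right image_raw and camera_info topics under a zed/stereo namespace (the namespace is illustrative and has to match whatever the plugin actually uses):

<launch>
  <!-- stereo_image_proc looks for left/image_raw, right/image_raw and the matching
       camera_info topics in the namespace it runs in, and publishes disparity and
       point cloud topics alongside them -->
  <node ns="zed/stereo" pkg="stereo_image_proc" type="stereo_image_proc"
        name="stereo_image_proc" output="screen" />
</launch>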

Hi, is it possible to tell me where to get the code for the stereo camera controller?

Rajnunes ( 2017-05-12 04:31:28 -0600 )

1 Answer


answered 2017-05-17 10:24:49 -0600 by yah130s, updated 2017-05-17 10:26:06 -0600

You're only defining the physical parameters of the link. To actually get sensor data published, you also need to write a plugin that "mocks" your sensor's output values, or use one of the existing plugins, as @Brosseau.F said in the comments. A sketch of the existing-plugin route is shown below.
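As a sketch of the existing-plugin route, assuming the Multicamera plugin (libgazebo_ros_multicamera.so) described in the gazebo_ros_pkgs tutorial linked above: the link name ZedCamera, the 0.12 m baseline, the 1920x1080 image size and the optical_frame frame come from your URDF, while the field of view, clip distances and the zed/stereo camera namespace are placeholders. A <gazebo> block along these lines would replace the <map>/<verbatim> section; note it sits alongside the <link> element in the URDF rather than inside it:

<gazebo reference="ZedCamera">
  <sensor type="multicamera" name="zed_stereo_camera">
    <update_rate>60.0</update_rate>
    <camera name="left">
      <horizontal_fov>1.3962634</horizontal_fov> <!-- placeholder, ~80 deg -->
      <image>
        <width>1920</width>
        <height>1080</height>
        <format>R8G8B8</format>
      </image>
      <clip>
        <near>0.02</near>
        <far>300</far>
      </clip>
    </camera>
    <camera name="right">
      <pose>0 -0.12 0 0 0 0</pose> <!-- baseline offset along -y -->
      <horizontal_fov>1.3962634</horizontal_fov>
      <image>
        <width>1920</width>
        <height>1080</height>
        <format>R8G8B8</format>
      </image>
      <clip>
        <near>0.02</near>
        <far>300</far>
      </clip>
    </camera>
    <plugin name="zed_stereo_controller" filename="libgazebo_ros_multicamera.so">
      <alwaysOn>true</alwaysOn>
      <updateRate>0.0</updateRate>
      <cameraName>zed/stereo</cameraName> <!-- illustrative namespace -->
      <imageTopicName>image_raw</imageTopicName>
      <cameraInfoTopicName>camera_info</cameraInfoTopicName>
      <frameName>optical_frame</frameName>
      <hackBaseline>0.12</hackBaseline>
      <distortionK1>0.0</distortionK1>
      <distortionK2>0.0</distortionK2>
      <distortionK3>0.0</distortionK3>
      <distortionT1>0.0</distortionT1>
      <distortionT2>0.0</distortionT2>
    </plugin>
  </sensor>
</gazebo>

If the plugin loads, the left and right image_raw and camera_info topics should then appear under /zed/stereo/ once the model is spawned, and stereo_image_proc can be run in that namespace as in the comment above.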
