Multisense SL Transform Tree


The Multisense-SL unit stores a unique calibration used to transform laser range data into the left camera optical frame. This calibration consists of two static transforms: one from the fixed motor frame to the left camera frame, and one from the laser frame to the spindle frame. A third transform, between the spindle frame and the motor frame, accounts for the rotation of the laser. The Multisense-SL ROS driver automatically generates a transform tree based on these three transforms. Additionally, both the ROS driver and the underlying API offer mechanisms to perform custom transformations of laser data. The following sections detail this transform tree and provide a basic example of how to perform this transformation.

1.1.         Multisense-SL Transform Overview

[Image: MultisenseFrames_ROS_Colors.PNG – the left camera optical, fixed motor, spindle, and laser coordinate frames]

The Multisense-SL has three primary transforms between the four coordinate frames shown above, which are used to transform laser range data into the left camera optical frame. The first is a static transform between the fixed motor frame and the left camera frame, described by the homogeneous transform matrix H_CM; in the ROS driver this transform is referred to as cameraToSpindleFixed. The second is a transform between the spindle frame and the fixed motor frame, described by the homogeneous transform matrix H_MS; it accounts for the spindle frame's rotation about the z-axis of the fixed motor frame as the laser spins. The final transform is a static transform between the laser frame and the spindle frame, described by the homogeneous transform matrix H_SL; in the ROS driver this transform is referred to as laserToSpindle. Multiplying these homogeneous transform matrices together transforms a laser range point in Cartesian space into the left camera frame:

d_C = H_CM H_MS H_SL d_L

where d_L is a 4x1 homogeneous vector representing a single laser point in the x-z plane of the laser's local frame, and d_C is the same point expressed as a 4x1 homogeneous vector in the left camera's optical frame.
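
As a concrete illustration of this chain, the short sketch below composes three KDL::Frame objects and applies them to a single range return. The numeric calibration values and angles are placeholders invented for the example; on an actual unit H_CM and H_SL come from the stored calibration and H_MS from the current spindle angle.

#include <kdl/frames.hpp>
#include <cmath>

int main()
{
    // H_CM: fixed motor frame to left camera frame (placeholder calibration values)
    const KDL::Frame H_CM(KDL::Rotation::RPY(0.0, 0.0, 0.0), KDL::Vector(0.0, -0.1, 0.0));

    // H_MS: spindle frame to fixed motor frame, a rotation about the motor
    // z-axis by the current spindle angle (placeholder angle)
    const double spindleAngle = 0.5;
    const KDL::Frame H_MS(KDL::Rotation::RotZ(spindleAngle), KDL::Vector(0, 0, 0));

    // H_SL: laser frame to spindle frame (placeholder calibration values)
    const KDL::Frame H_SL(KDL::Rotation::RPY(0.0, 0.0, 0.0), KDL::Vector(0.0, 0.0, 0.02));

    // d_L: a single laser return of 2 m at a 0.3 rad mirror angle, in the
    // x-z plane of the laser's local frame
    const double range = 2.0, mirrorAngle = 0.3;
    const KDL::Vector d_L(range * std::sin(mirrorAngle), 0.0, range * std::cos(mirrorAngle));

    // d_C = H_CM * H_MS * H_SL * d_L
    const KDL::Vector d_C = H_CM * (H_MS * (H_SL * d_L));

    return 0;
}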

1.2.         ROS Transform Tree

The Multisense-SL ROS driver uses the robot_state_publisher ROS package to create and update the transform tree. The robot_state_publisher uses individual joint rotations and translations published on the /joint_states topic to construct the transform tree. Since the transforms H_CM and H_SL each consist of a rotation and a translation, six separate joint states – x, y, z, roll, pitch, yaw – are published, one per degree of freedom. This can be seen in the attached full Multisense-SL ROS TF tree as the pre_spindle_cal and post_spindle_cal transform segments.
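
Once the driver and robot_state_publisher are running, the composed transform can also be queried directly from tf instead of being rebuilt by hand. The sketch below shows one way this lookup might be done; the frame names are assumptions and should be replaced with the frame ids actually present in your tree (rosrun tf view_frames will list them).

#include <ros/ros.h>
#include <tf/transform_listener.h>

int main(int argc, char** argv)
{
    ros::init(argc, argv, "laser_transform_lookup");
    ros::NodeHandle nh;
    tf::TransformListener listener;

    ros::Rate rate(10.0);
    while (nh.ok()) {
        tf::StampedTransform transform;
        try {
            // Frame names are assumptions; substitute the camera and laser
            // frame ids published by your driver configuration
            listener.lookupTransform("/left_camera_optical_frame", "/head_hokuyo_frame",
                                     ros::Time(0), transform);
            ROS_INFO("Laser origin in camera frame: [%f, %f, %f]",
                     transform.getOrigin().x(),
                     transform.getOrigin().y(),
                     transform.getOrigin().z());
        }
        catch (const tf::TransformException& ex) {
            ROS_WARN("%s", ex.what());
        }
        rate.sleep();
    }
    return 0;
}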

1.3.         ROS Laser Interface

The Multisense-SL ROS driver offers three primary mechanisms to process and view laser data: the /laser/scan, /laser/points2, and /laser/calibration/raw_lidar_data topics. In addition to laser data, the driver publishes the raw calibration matrices on the /laser/calibration/raw_lidar_cal topic.

1.3.1.         /laser/scan Topic

The /laser/scan topic is a sensor_msgs/LaserScan message containing the raw laser range and intensity data in polar coordinates, along with the information needed to interpolate the Cartesian position of each range reading. The transformation of the laser data into the left camera frame is managed by the transform tree constructed by the robot_state_publisher. When the laser is spinning, the motion between individual range returns is compensated for internally in ROS using SLERP. This interpolation can also account for external motions of the head, provided the Multisense-SL's transform tree is a child of the frame in which the motion occurs.
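
If the goal is simply to obtain each scan as points in the camera frame, the laser_geometry package can perform this per-beam tf interpolation. The sketch below is one possible consumer of /laser/scan; the target frame name is an assumption and should match the left camera optical frame id in your transform tree.

#include <ros/ros.h>
#include <tf/transform_listener.h>
#include <laser_geometry/laser_geometry.h>
#include <sensor_msgs/LaserScan.h>
#include <sensor_msgs/PointCloud2.h>

class ScanToCamera
{
public:
    ScanToCamera(ros::NodeHandle& nh) :
        sub_(nh.subscribe("/laser/scan", 10, &ScanToCamera::scanCallback, this)),
        pub_(nh.advertise<sensor_msgs::PointCloud2>("scan_in_camera", 10))
    {}

private:
    void scanCallback(const sensor_msgs::LaserScan::ConstPtr& scan)
    {
        // Wait until the transform at the end of the scan is available, then
        // project each beam into the (assumed) left camera optical frame
        const ros::Time scanEnd = scan->header.stamp +
            ros::Duration(scan->ranges.size() * scan->time_increment);
        if (!listener_.waitForTransform("/left_camera_optical_frame",
                                        scan->header.frame_id,
                                        scanEnd, ros::Duration(1.0)))
            return;

        sensor_msgs::PointCloud2 cloud;
        projector_.transformLaserScanToPointCloud("/left_camera_optical_frame",
                                                  *scan, cloud, listener_);
        pub_.publish(cloud);
    }

    ros::Subscriber sub_;
    ros::Publisher pub_;
    tf::TransformListener listener_;
    laser_geometry::LaserProjection projector_;
};

int main(int argc, char** argv)
{
    ros::init(argc, argv, "scan_to_camera");
    ros::NodeHandle nh;
    ScanToCamera node(nh);
    ros::spin();
    return 0;
}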

1.3.2.         /laser/points2 Topic

The /laser/points2 topic is a sensor_msgs/PointCloud2 message containing laser data transformed into the left camera frame. Each full laser scan is published as an individual point cloud. When transforming the laser data into Cartesian space, both the scan angle and the spindle angle are linearly interpolated. Because this transformation is done with no knowledge of the head's state, it cannot compensate for external movements of the sensor head.
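
A minimal consumer of this topic might look like the following sketch, which assumes the cloud carries the standard x, y, and z float32 fields and simply iterates over each point in the left camera frame.

#include <ros/ros.h>
#include <sensor_msgs/PointCloud2.h>
#include <sensor_msgs/point_cloud2_iterator.h>

void pointsCallback(const sensor_msgs::PointCloud2::ConstPtr& cloud)
{
    // Iterate over the x, y, z fields of each point, already expressed in
    // the left camera optical frame
    sensor_msgs::PointCloud2ConstIterator<float> iter_x(*cloud, "x");
    sensor_msgs::PointCloud2ConstIterator<float> iter_y(*cloud, "y");
    sensor_msgs::PointCloud2ConstIterator<float> iter_z(*cloud, "z");

    for (; iter_x != iter_x.end(); ++iter_x, ++iter_y, ++iter_z)
        ROS_DEBUG("point: [%f, %f, %f]", *iter_x, *iter_y, *iter_z);
}

int main(int argc, char** argv)
{
    ros::init(argc, argv, "points2_listener");
    ros::NodeHandle nh;
    ros::Subscriber sub = nh.subscribe("/laser/points2", 1, pointsCallback);
    ros::spin();
    return 0;
}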

1.3.3.         /laser/calibration/raw_lidar_data and /laser/calibration/raw_lidar_cal Topics

The /laser/calibration/raw_lidar_data and /laser/calibration/raw_lidar_cal topics are custom message types which contain all the information necessary to manually transform the laser data into the left camera optical frame. The /laser/calibration/raw_lidar_data topic contains the raw laser data as well as the angle of the spindle frame at the start and end of the laser scan. The /laser/calibration/raw_lidar_cal topic contains the two static calibration transforms, H_CM and H_SL, flattened into 16-element arrays in row-major order. Using these two topics, the laser data can be manually transformed with a custom interpolation scheme external to the Multisense-SL ROS driver.

1.4.         External Transformation Example

The following excerpts of code use ROS's KDL and angles packages to transform laser data into the left camera frame. The example consists of two subscribers: one to the /laser/calibration/raw_lidar_data topic and the other to the /laser/calibration/raw_lidar_cal topic. The full example code is attached at the bottom of this post.

 

1.4.1.         /laser/calibration/raw_lidar_cal Subscriber

 

void RawScanListener::rawLaserCalCallback(const multisense_ros::RawLidarCal::ConstPtr& msg)
{
    //
    // Extract the rotation components from the flattened 16-element
    // homogeneous transform arrays (row-major order)
    for (int i = 0; i < 3; i++) {
        for (int j = 0; j < 3; j++) {
            laserToSpindle.M(i,j) = msg->laserToSpindle[i*4 + j];
            motorToCamera.M(i,j)  = msg->cameraToSpindleFixed[i*4 + j];
        }
    }

    //
    // Extract the translation components from the flattened 16-element
    // homogeneous transform arrays
    for (int i = 0; i < 3; i++) {
        laserToSpindle.p[i] = msg->laserToSpindle[i*4 + 3];
        motorToCamera.p[i]  = msg->cameraToSpindleFixed[i*4 + 3];
    }
}

The above code constructs two KDL::Frame objects, laserToSpindle and motorToCamera, from a single /laser/calibration/raw_lidar_cal message. These Frames correspond to H_SL and H_CM, respectively.

1.4.2.         /laser/calibration/raw_lidar_data Subscriber

void RawScanListener::rawLaserDataCallback(const multisense_ros::RawLidarData::ConstPtr& msg)
{
    //
    // Full scan angle range of the laser
    const double fullScanAngle = 270 * M_PI / 180.;

    //
    // The laser scan starts at -135 degrees and sweeps counterclockwise
    const double startScanAngle = -fullScanAngle / 2.;

    //
    // Get the spindle angles for this scan. angle_start and angle_end are in micro-radians.
    // Use the ROS angles package to normalize them to the range -PI to +PI
    const double spindleAngleStart = angles::normalize_angle(1e-6 *
                                         static_cast<double>(msg->angle_start));
    const double spindleAngleEnd   = angles::normalize_angle(1e-6 *
                                         static_cast<double>(msg->angle_end));
    const double spindleAngleRange = angles::normalize_angle(spindleAngleEnd -
                                                             spindleAngleStart);

    //
    // Loop over all the range data
    for (unsigned int i = 0; i < msg->distance.size(); i++) {

        //
        // Percent of the way through the scan. Used for linear interpolation
        const double percent = static_cast<double>(i) /
                               static_cast<double>(msg->distance.size() - 1);

        //
        // Linearly interpolate the laser scan angle from -135 to +135 degrees
        const double mirrorAngle = startScanAngle + (percent * fullScanAngle);

        //
        // Linearly interpolate the spindle angle
        const double spindleAngle = spindleAngleStart + (percent * spindleAngleRange);

        //
        // Generate the homogeneous transform from the spindle frame to the motor frame:
        // a pure rotation about the motor z-axis by the interpolated spindle angle
        const double cosSpindle = std::cos(spindleAngle);
        const double sinSpindle = std::sin(spindleAngle);
        spindleToMotor = KDL::Frame(KDL::Rotation(cosSpindle, -sinSpindle, 0,
                                                  sinSpindle,  cosSpindle, 0,
                                                  0,           0,          1),
                                    KDL::Vector(0, 0, 0));

        //
        // Convert the range reading to meters and project it into the x-z laser plane
        const double rangeMeters = 1e-3 * static_cast<double>(msg->distance[i]);

        const KDL::Vector rangePoint(rangeMeters * std::sin(mirrorAngle),
                                     0,
                                     rangeMeters * std::cos(mirrorAngle));

        //
        // Transform the point into the left camera frame
        const KDL::Vector pointInCamera = motorToCamera * (spindleToMotor *
                                                           (laserToSpindle * rangePoint));
    }
}

 

 

The above code generates a KDL::Vector pointInCamera which contains the x, y, and z coordinates of a single laser range point transformed into the left camera frame. The code linearly interpolates the laser scan angle as it spins counterclockwise from -135 degrees to 135 degrees. The angle of the spindle, originally published in micro-radians, is also linearly interpolated to compensate for motion due to the rotation of the spindle.
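
For completeness, the two callbacks could be wired into a node along the lines of the sketch below. The RawScanListener class definition is not reproduced here (see the attached example for the full source), so this snippet only illustrates the subscription setup and assumes a default-constructible RawScanListener.

#include <ros/ros.h>

int main(int argc, char** argv)
{
    ros::init(argc, argv, "raw_scan_listener");
    ros::NodeHandle nh;

    // RawScanListener as defined in the attached example code
    RawScanListener listener;

    // Subscribe to the calibration topic and the per-scan raw data topic
    ros::Subscriber calSub  = nh.subscribe("/laser/calibration/raw_lidar_cal", 1,
                                           &RawScanListener::rawLaserCalCallback, &listener);
    ros::Subscriber dataSub = nh.subscribe("/laser/calibration/raw_lidar_data", 10,
                                           &RawScanListener::rawLaserDataCallback, &listener);

    ros::spin();
    return 0;
}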


Comments

  • Thomas Koletschka

    Maybe it's just me, but using red for x, green for y, and blue for z might be less confusing for the reader, as those are the color assignments predominantly used for those axes (which would then also conform to the way RViz colorizes them).

    In addition to that, the axes drawn in your diagram seem to be inconsistent with the ones in the URDF. The URDF has the lidar rotate about the X axis - i.e. using the color assignment I outlined above, those axes would match the ones in the URDF. However, this would then require the axes for the camera optical frame to change colors (which would also be less work than changing the colors for the 3 lidar/motor related coordinate systems ;) ).

  • Matt Alvarado

    Hi Thomas,

    Thanks for the catch on the colors. I will make sure to update the image.

    You are correct that the URDF has the laser rotating about the X axis. That is because ROS and RViz assume laser data is published in the X-Y plane, not the X-Z plane. There are two extra rotations added to the laserToSpindle transform so the laser data will be displayed correctly in RViz. I will make sure to add that clarification to the post.

    -Matt