Realsense T265 - hands on review

Thank you for this. The coordinate convention makes my head spin; hopefully this will help.

While impressive, so far I see the T265 as better suited for integration with an external VIO/VSLAM system, which is kind of ironic. It is really amazing how well it works out of the box, but I can’t quite figure out a good way to translate that into a production asset.

For me the greatest promise of the T265 was the plug’n’play aspect. I was hoping I could plug it into my robot and receive good feedback from it that would allow me to detect wheel slip, especially on uneven or slippery terrain.

I’m really looking forward to seeing what happens to it in the future, and I really hope Intel doesn’t abandon it.

I don’t get your problem with the TFs. For me, the T265 publishes TFs in compliance with REP 105.

Interesting! Did you set the publish_odom_tf parameter to true in the driver and manage to get a solid tree? Would you be able to share your settings, TF tree, etc.? I spent close to a week just on this and I’d love to learn how to set it up properly!
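For context, this is roughly how I was trying to override it — a sketch of a wrapper launch file, assuming the realsense2_camera ROS 1 driver and its rs_t265.launch (argument names may differ between driver versions):

```xml
<launch>
  <!-- Wrap the stock T265 launch file and enable the odom TF broadcast. -->
  <include file="$(find realsense2_camera)/launch/rs_t265.launch">
    <arg name="publish_odom_tf" value="true"/>
  </include>
</launch>
```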

I just used the default settings :wink:

Yes, this looks good for a system where you use just the camera. How would you integrate it on your robot when it comes to the TF structure?

Where would you connect the camera to base_link? The only way I can think of that won’t violate REP-105 is a camera_link -> base_link transform, which is quite backwards from what I’m used to.

If you want your standard odom -> base_link -> camera_link chain, you have to handle TF yourself or use the robot_localization package.
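To illustrate the “handle TF yourself” route: given the camera’s reported odom -> camera_link pose and the fixed mounting transform base_link -> camera_link, you can recover odom -> base_link as T_odom_base = T_odom_cam · T_base_cam⁻¹ and broadcast that yourself. A minimal planar (2D) sketch of just the math in plain Python — the ROS broadcasting boilerplate is omitted and the numbers are made up:

```python
import math

def make_tf(x, y, theta):
    """Homogeneous 3x3 transform for a 2D pose (x, y, heading theta)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, x],
            [s,  c, y],
            [0.0, 0.0, 1.0]]

def compose(a, b):
    """Matrix product a @ b, i.e. apply b first, then a."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def invert(t):
    """Inverse of a rigid 2D transform: R^T, -R^T * p."""
    c, s = t[0][0], t[1][0]
    x, y = t[0][2], t[1][2]
    return [[ c,  s, -(c * x + s * y)],
            [-s,  c, -(-s * x + c * y)],
            [0.0, 0.0, 1.0]]

# Camera mounted 0.2 m ahead of base_link, no rotation (example values).
t_base_cam = make_tf(0.2, 0.0, 0.0)
# T265 reports the camera 1.2 m ahead in odom -> the base moved 1.0 m.
t_odom_cam = make_tf(1.2, 0.0, 0.0)

# odom -> base_link = (odom -> camera) * (base -> camera)^-1
t_odom_base = compose(t_odom_cam, invert(t_base_cam))
print(t_odom_base[0][2], t_odom_base[1][2])  # -> 1.0 0.0
```

The same composition works in 3D with 4x4 matrices (or quaternions); in practice robot_localization or a small tf2 broadcaster node does exactly this bookkeeping for you.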

Can we take pictures with the Intel tracking camera T265? I am interested in getting pictures as well as the tracking records.

Yes, you can. It is a monochrome fisheye stereo camera, and you can access both stereo streams.

This is what you get:

I’ve had some pretty amazing results with a SLAM implementation called Basalt. There’s a wrapper for ROS 1 and for ROS 2.

Basalt looks super cool! I’ll actually feature it in Weekly Robotics!

@allenh1 any more information you can give on that? We’re looking at VSLAM integrations with Nav2, so that sounds like something we should consider (assuming it’s a full SLAM with loop closure).

I glanced over the paper, but they didn’t really compare it with any recently published systems like Kimera or the T265 (or ORB-SLAM, OKVIS, etc.) to get an actual impression of how it lines up.

Basalt looks really nice! I’ve had some good results using VINS Fusion on the T265. Often the results are better than the T265’s internal visual SLAM, but sometimes worse. Could be my parametrization, though; I didn’t do a quantitative evaluation.

I hope someone has the time and inclination to do some quantitative comparisons of the options on the same hardware. It would be interesting to see, over a variety of situations and datasets, whether any stick out.

Isn’t that basically what the KITTI odometry benchmark tries to show? http://www.cvlibs.net/datasets/kitti/eval_odometry.php
Although I realize that there is a difference between feeding V(I)O into a robot system that has a separate SLAM method to do loop closures and letting the V(I)O(SLAM) do its own loop closures. I don’t know for sure, but I think all algorithms in the KITTI ranking are doing their own loop closures.

Also, I don’t actually know whether the KITTI dataset has IMU data well synced to its camera data, so maybe some of the results would be better with newer cameras that have rigidly coupled, hardware-synced IMUs?

Isn’t that basically what the KITTI odometry benchmark tries to show? http://www.cvlibs.net/datasets/kitti/eval_odometry.php

Yes, I believe Basalt has an executable for running on KITTI.

The bag file sizes are very large; even using compression didn’t reduce them much. They are 300 MB+ for just 8 seconds. How can I store the tracking video in less space?
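For what it’s worth, that size is about what you’d expect for raw streams: the T265’s two fisheye imagers are 848x800 8-bit mono at 30 fps (per Intel’s specs), and the back-of-the-envelope math lands right around the size you’re seeing. If you only need the trajectory, recording just the odometry topic (e.g. /camera/odom/sample in the ROS wrapper) instead of the image topics shrinks bags by orders of magnitude. A quick sanity check:

```python
# Back-of-the-envelope size of raw T265 fisheye streams in a bag.
width, height = 848, 800   # fisheye resolution per Intel's T265 specs
bytes_per_pixel = 1        # 8-bit monochrome
cameras = 2                # stereo pair
fps = 30
seconds = 8

raw_bytes = width * height * bytes_per_pixel * cameras * fps * seconds
print(f"{raw_bytes / 1e6:.0f} MB for {seconds} s of raw stereo video")
# -> 326 MB, which matches the ~300 MB bags reported above
```

Bag compression helps little here because the images are already noisy raw pixels; dropping the image topics (or recording at a reduced rate) is the effective lever.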

  • It’s accurate on AGVs traveling below 2 m/s.
  • Ideal for slow-moving bots.
  • Odometry gaps are evident when used on a drone at 4 m/s or above, and it can’t fly above 20 m from the ground.
  • Tried to capture images to a bag, but could do so only for the left camera; the right camera was dropping 90% of frames (even on USB 3).
  • It’s the most rugged device I have ever seen; it survived a horrible drone crash.
  • I could successfully calibrate my IMU (Xsens MTi) and T265 for use with VINS Fusion, but the image resolution is too low for use on drones.

Thanks for reporting your experiences, @vasanth_reddy1! This kind of hands-on information from actual users is really valuable and hard to get from manufacturers’ websites.

I’ve seen that too on a colleague’s brand-new (2020) laptop. On the other hand, on my old 2015 laptop I don’t have any problems whatsoever. Might be related to what USB chipset you’re using.