Robust Odometry and Fully Autonomous Driving Robot | ATCrawler x FAST-LIO x Navigation2

  • Published 25 Sep 2024

Comments • 37

  • @thomasluk4319
    @thomasluk4319 3 months ago +2

    Can you explain a bit how you extract the laser scan from the FAST-LIO registered point cloud? And what's the relay?

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  3 months ago +1

      The laser scan is extracted with the pointcloud_to_laserscan package.
      The “relay” in my package just repeats the Odometry topic from FAST-LIO, replacing the frame_id with “odom”, and also publishes the odom → base_link TF.
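
      For readers wondering what such a relay looks like, here is a minimal rclpy sketch. It assumes FAST-LIO's default /Odometry output topic and the frame names above; it is an illustration, not the author's actual fl_nav2_helper code.

      ```python
      # Hypothetical relay: republish FAST-LIO odometry under the "odom" frame
      # and broadcast the odom -> base_link transform for Nav2.
      import rclpy
      from rclpy.node import Node
      from nav_msgs.msg import Odometry
      from geometry_msgs.msg import TransformStamped
      from tf2_ros import TransformBroadcaster

      class FastLioRelay(Node):
          def __init__(self):
              super().__init__('fastlio_relay')
              self.tf_broadcaster = TransformBroadcaster(self)
              self.pub = self.create_publisher(Odometry, 'odom', 10)
              # '/Odometry' is FAST-LIO's default output topic.
              self.create_subscription(Odometry, '/Odometry', self.cb, 10)

          def cb(self, msg: Odometry):
              # Rewrite the frames so Nav2 sees a standard odom -> base_link chain.
              msg.header.frame_id = 'odom'
              msg.child_frame_id = 'base_link'
              self.pub.publish(msg)
              # Publish the same pose as the odom -> base_link transform.
              t = TransformStamped()
              t.header = msg.header
              t.child_frame_id = 'base_link'
              t.transform.translation.x = msg.pose.pose.position.x
              t.transform.translation.y = msg.pose.pose.position.y
              t.transform.translation.z = msg.pose.pose.position.z
              t.transform.rotation = msg.pose.pose.orientation
              self.tf_broadcaster.sendTransform(t)

      def main():
          rclpy.init()
          rclpy.spin(FastLioRelay())

      if __name__ == '__main__':
          main()
      ```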

  • @F2MFireDraftingDesignsIn-ch4tw
    @F2MFireDraftingDesignsIn-ch4tw 20 days ago

    Hi, great video. Do you have any videos on how to program LOAM, SLAM, or FAST-LIO using a Livox Avia lidar?

  • @wongpokhim
    @wongpokhim 1 month ago

    Very simple explanation! Thank you

  • @fra5715
    @fra5715 3 months ago +2

    Dude, this is amazing. Thank you for video.

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  3 months ago +1

      Hope it will be useful information 😁

    • @fra5715
      @fra5715 3 months ago

      @@stepbystep-robotics6881 it definitely is 👍👍

  • @nicholastan170
    @nicholastan170 19 days ago

    Great video! Could you also share the code for your fl_nav2_helper package please?

  • @madeautonomous
    @madeautonomous 3 months ago +1

    Great video. We tried FAST-LIO2 before, but it drifted quickly.

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  3 months ago +1

      I have seen that a couple of times, but after restarting it and mounting it on a proper vehicle, it works most of the time.

  • @김영욱-n5u1r
    @김영욱-n5u1r 1 month ago

    Nice video!
    As I understand it, AMCL publishes the map → odom TF, while FAST-LIO publishes the odom → robot base frame TF. Will this method work well even if the robot platform experiences significant roll or pitch, such as on a two-wheeled balancing robot?

  • @PCBWay
    @PCBWay 3 months ago +3

    Well done!

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  3 months ago

      Thank you to PCBWay for the nice 3D-printed part. It's really neat, and delivery was fast!

  • @hrmny_
    @hrmny_ 3 months ago +1

    What's the minimum vertical angle of the lidar?
    As in, if it's mounted flat, can it see the ground? It seems like you have some ground data in your neighborhood scan.

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  3 months ago

      The min. FOV angle is -2 degrees. The lidar is mounted flat, and yes, with this min. FOV angle it will see the ground, but only at some distance from the robot (a quick sketch of that geometry follows).
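
      A back-of-the-envelope check of that geometry, assuming an example mount height of 0.4 m (the actual height is not given in the thread):

      ```python
      # A beam that leaves a flat-mounted lidar at -2 degrees below horizontal
      # first hits flat ground at d = h / tan(2 deg), where h is the mount height.
      import math

      h = 0.4                                       # assumed mount height, metres
      d = h / math.tan(math.radians(2.0))
      print(f"ground first visible at ~{d:.1f} m")  # ~11.5 m
      ```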

  • @TNERA
    @TNERA 3 months ago

    Very nice explanation of the tech and how you put it together!

  • @nanastos18060
    @nanastos18060 3 months ago

    Super sweet technology, thank you for sharing!!

  • @TinLethax
    @TinLethax 3 months ago +1

    Cool video! I wonder if FAST-LIO can be used with a depth camera (with depth-image-to-point-cloud conversion). I have a special version of the Asus Xtion that is depth-only instead of the typical RGB-D.

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  3 months ago +1

      If the point cloud from the depth camera comes in fast enough, I think it might work. And you still need an IMU too.
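
      For the depth-image-to-point-cloud step, a minimal pinhole back-projection sketch; in ROS 2 the depth_image_proc package does this conversion for you, and the intrinsics below are made-up example values:

      ```python
      # Back-project a depth image to 3D points with the pinhole model:
      # X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy,  Z = depth.
      import numpy as np

      def depth_to_points(depth, fx, fy, cx, cy):
          """depth: HxW array in metres; returns Nx3 points in the camera frame."""
          v, u = np.indices(depth.shape)
          x = (u - cx) * depth / fx
          y = (v - cy) * depth / fy
          pts = np.stack([x, y, depth], axis=-1).reshape(-1, 3)
          return pts[pts[:, 2] > 0]                 # drop invalid zero-depth pixels

      # Example with fabricated VGA intrinsics:
      depth = np.full((480, 640), 2.0)              # fake 2 m planar depth image
      print(depth_to_points(depth, fx=570.0, fy=570.0, cx=319.5, cy=239.5).shape)
      ```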

    • @sencis9367
      @sencis9367 3 months ago +1

      Do you want to use an RGB-D point cloud instead of lidar? The lidar has a large 360-degree field of view versus about 90 degrees for the camera. More importantly, the lidar's accuracy is roughly the same at 40 metres as at 10 centimetres, around +/- 2 centimetres, while an RGB-D camera's accuracy decreases with distance and is only comparable to lidar out to about 5 metres, which is quite short. Also, computer-vision algorithms like SGM in RealSense cameras admit more triangulation error than a lidar, which physically measures the time of flight of light.

    • @TinLethax
      @TinLethax 3 months ago

      @@sencis9367 That's what I already had. Also, it's depth-only, so visual odometry was not available here. I plan to mount the cameras on the front and back of the robot, as I have two of them, and then write a C++ ROS 2 node to combine the point clouds into a single cloud message (sketched below).
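
      A sketch of that combining node in Python (the commenter plans C++; same idea). It assumes both cameras publish on the hypothetical topics front/points and rear/points in a shared frame; a real node would first transform each cloud with tf2 and time-synchronize the pair:

      ```python
      # Concatenate the latest clouds from two depth cameras into one message.
      import rclpy
      from rclpy.node import Node
      from sensor_msgs.msg import PointCloud2
      from sensor_msgs_py import point_cloud2

      class CloudMerger(Node):
          def __init__(self):
              super().__init__('cloud_merger')
              self.front = None
              self.pub = self.create_publisher(PointCloud2, 'points_merged', 10)
              self.create_subscription(PointCloud2, 'front/points', self.on_front, 10)
              self.create_subscription(PointCloud2, 'rear/points', self.on_rear, 10)

          def on_front(self, msg):
              self.front = msg                      # keep the latest front cloud

          def on_rear(self, msg):
              if self.front is None:
                  return
              # Read xyz from both clouds and republish as one cloud.
              xyz = ('x', 'y', 'z')
              pts = [tuple(p) for p in point_cloud2.read_points(self.front, xyz, skip_nans=True)]
              pts += [tuple(p) for p in point_cloud2.read_points(msg, xyz, skip_nans=True)]
              self.pub.publish(point_cloud2.create_cloud_xyz32(msg.header, pts))

      def main():
          rclpy.init()
          rclpy.spin(CloudMerger())
      ```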

    • @sencis9367
      @sencis9367 3 months ago +1

      @TiNredstoner As far as I understand, the key condition for lidar odometry to work is correlation between the visible lidar point cloud and points on the map. In contrast to visual odometry, which orients itself by matching ORB, KLT, or SURF descriptors, the lidar orients itself solely by the shape of the objects around it. For better matching of those shapes, measurement accuracy plays a key role, and range affects reliability in sparse environments and open space. So theoretically you can count on it working inside small rooms with the camera's IR projector turned on, in case there are no visual features for triangulation. Also, the camera generates a very dense cloud compared to a lidar: one camera can create up to 1,000,000 points every 33.3 milliseconds at 1280x720 resolution and 30 fps, which requires a lot of computing power.
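
      Checking that point-rate arithmetic:

      ```python
      # Points per frame and per second for a 1280x720 depth stream at 30 fps.
      w, h, fps = 1280, 720, 30
      per_frame = w * h                  # 921,600 -> "up to 1,000,000" per ~33.3 ms
      print(per_frame, per_frame * fps)  # 921600 27648000, i.e. ~27.6 M points/s
      ```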

    • @TinLethax
      @TinLethax 3 months ago

      @@sencis9367 I have tested the Asus Xtion (640x480 depth resolution) with various laser-odometry SLAM packages, though not all of them. I ported SSL_SLAM to ROS 2, and this seems to give the best result. Still, the problem remains when the robot rotates: the estimated odometry just flies off as the robot starts to rotate. My guess is the narrow horizontal FoV compared to a 3D LiDAR, but something like the Livox Horizon or MID-40 seems to work despite a similarly narrow FoV. I have yet to test it with KISS-ICP and other IMU-coupled SLAM.

  • @Flynn3778
    @Flynn3778 3 months ago

    I wonder what the point cloud would look like with the LIVOX MID-360 for a wire-mesh fence. Our baseball field is surrounded by 6-foot-high mesh fencing. I'm in the process of building a mower to do the field and need a way for the mower to see the perimeter.

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  3 months ago

      I believe the wire mesh could be detected by the MID-360. It can detect some power lines, so a planar fence should be no problem.
      The lidar itself is not too expensive, so you'd better give it a try. 🙂

  • @mr2tot
    @mr2tot 3 months ago +2

    Hi, what is the maximum speed that your robot can achieve? And what maximum speed have you tested with this algorithm? Thanks :D

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  3 months ago +2

      Max speed with an 18V battery is 1.3 m/s.
      They're DC brushed motors, so with a higher voltage it could go faster. The AT Motor Driver I am using can go up to 24V, i.e. a 6-cell battery.
      I have tried running it at full speed outdoors, and the odometry from FAST-LIO is still pretty reliable :) they really did a good job.
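
      A rough estimate of the headroom that implies, assuming the no-load speed of a brushed DC motor scales roughly linearly with voltage (an assumption, not a measured figure):

      ```python
      # Crude upper-bound estimate: brushed-DC no-load speed ~ proportional to voltage.
      v_now, v_max, speed_now = 18.0, 24.0, 1.3        # volts, volts, m/s
      print(f"~{speed_now * v_max / v_now:.1f} m/s at {v_max:.0f} V")  # ~1.7 m/s
      ```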

    • @mr2tot
      @mr2tot 3 months ago +1

      @@stepbystep-robotics6881 Thank you for your answer. Besides the battery voltage, does the robot's speed also depend on the ROS processing speed? If I'm not mistaken, robots developed on ROS only have a maximum speed of about 3 m/s?

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  3 months ago

      @@mr2tot I don't think any ROS robot is limited to 3 m/s, because the robot's physical speed doesn't relate to ROS processing speed. The vehicle's physical speed depends on the ESC or motor driver.
      From my point of view, I don't use ROS at the control level at all; sometimes we need real-time data at the low level, and then it's better to use an RTOS. ROS is better from the middle to the higher levels of the software stack, I feel.

    • @sencis9367
      @sencis9367 3 months ago

      @stepbystep-robotics6881 Am I understanding this correctly: the lidar is highly susceptible to a rolling-shutter effect due to its slow mechanical sweep (the Livox MID-360's rotating mirror spins at only 10 Hz), so an accurate IMU or wheel odometry may be required to predict the displacement of the point cloud on a moving platform and thus obtain an accurate lidar-odometry XYZ position and orientation quaternion? Perhaps wheel odometry could help determine the exact position of the robot on the move. Have you tried solutions like LIW-OAM to obtain a more accurate position?

  • @sebastianrada4107
    @sebastianrada4107 3 months ago +1

    Great video!
    Does it work with 2D lidars?

    • @stepbystep-robotics6881
      @stepbystep-robotics6881  3 months ago

      FAST-LIO only works with 3D lidars, but for Nav2 a 2D laser scan is enough (see the launch sketch below).
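
      A minimal ROS 2 launch sketch for that 3D-to-2D conversion with the pointcloud_to_laserscan package mentioned earlier; the topic remappings and parameter values are illustrative, not the author's exact setup:

      ```python
      # Flatten a 3D cloud into a 2D scan for Nav2 with pointcloud_to_laserscan.
      from launch import LaunchDescription
      from launch_ros.actions import Node

      def generate_launch_description():
          return LaunchDescription([
              Node(
                  package='pointcloud_to_laserscan',
                  executable='pointcloud_to_laserscan_node',
                  remappings=[('cloud_in', '/cloud_registered'),  # e.g. FAST-LIO output
                              ('scan', '/scan')],
                  parameters=[{
                      'target_frame': 'base_link',  # project the scan into this frame
                      'min_height': 0.1,            # vertical slice of the cloud to keep (m)
                      'max_height': 1.0,
                      'range_min': 0.3,
                      'range_max': 30.0,
                  }],
              ),
          ])
      ```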

  • @sutanmuhamadsadamawal715
    @sutanmuhamadsadamawal715 3 months ago

    Nice project! I'm working on a similar project for an agriculture robot here in Japan. It would be nice if I could contact you to get some advice.

  • @FactorSocial4797
    @FactorSocial4797 1 month ago

    Nice one

  • @salh2665
    @salh2665 3 months ago

    ❤❤❤❤