
Open-source project name:

vectr-ucla/direct_lidar_odometry

Open-source project URL:

https://github.com/vectr-ucla/direct_lidar_odometry

Open-source language:

C++ 98.3%

Open-source introduction:

Direct LiDAR Odometry:
Fast Localization with Dense Point Clouds

[IEEE RA-L] [ArXiv] [Video] [Code]

DLO is a lightweight and computationally-efficient frontend LiDAR odometry solution with consistent and accurate localization. It features several algorithmic innovations that increase speed, accuracy, and robustness of pose estimation in perceptually-challenging environments and has been extensively tested on aerial and legged robots.

This work was part of NASA JPL Team CoSTAR's research and development efforts for the DARPA Subterranean Challenge, in which DLO was the primary state estimation component for our fleet of autonomous aerial vehicles.



Instructions

DLO requires an input point cloud of type sensor_msgs::PointCloud2 with an optional IMU input of type sensor_msgs::Imu. Note that although IMU data is not required, it can be used for initial gravity alignment and will help with point cloud registration.
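
As a rough illustration, the minimal node below (hypothetical node and topic names) subscribes to the two message types DLO consumes; it is only a sketch for verifying that your sensor topics carry the expected types, not part of DLO itself:

#include <ros/ros.h>
#include <sensor_msgs/Imu.h>
#include <sensor_msgs/PointCloud2.h>

// Required input: one dense LiDAR scan per message.
void cloudCallback(const sensor_msgs::PointCloud2::ConstPtr& msg) {
  ROS_INFO("scan received: %u x %u points", msg->height, msg->width);
}

// Optional input: raw IMU data, used by DLO for initial gravity alignment.
void imuCallback(const sensor_msgs::Imu::ConstPtr& msg) {
  ROS_INFO("imu accel z: %.2f m/s^2", msg->linear_acceleration.z);
}

int main(int argc, char** argv) {
  ros::init(argc, argv, "dlo_input_check");  // hypothetical node name
  ros::NodeHandle nh;
  ros::Subscriber cloud_sub = nh.subscribe("/robot/velodyne_points", 1, cloudCallback);
  ros::Subscriber imu_sub = nh.subscribe("/robot/vn100/imu", 1, imuCallback);
  ros::spin();
  return 0;
}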

Dependencies

Our system has been tested extensively on both Ubuntu 18.04 Bionic with ROS Melodic and Ubuntu 20.04 Focal with ROS Noetic, although other versions may work. The following configuration with required dependencies has been verified to be compatible:

  • Ubuntu 18.04 or 20.04
  • ROS Melodic or Noetic (roscpp, std_msgs, sensor_msgs, geometry_msgs, pcl_ros)
  • C++ 14
  • CMake >= 3.16.3
  • OpenMP >= 4.5
  • Point Cloud Library >= 1.10.0
  • Eigen >= 3.3.7

Installing the library dependencies as binary packages via apt should work:

sudo apt install libomp-dev libpcl-dev libeigen3-dev 

Compiling

Create a catkin workspace, clone the direct_lidar_odometry repository into the src folder, and compile via the catkin_tools package (or catkin_make if preferred):

mkdir ws && cd ws && mkdir src && catkin init && cd src
git clone https://github.com/vectr-ucla/direct_lidar_odometry.git
catkin build

Execution

After sourcing the workspace, launch the DLO odometry and mapping ROS nodes via:

roslaunch direct_lidar_odometry dlo.launch \
  pointcloud_topic:=/robot/velodyne_points \
  imu_topic:=/robot/vn100/imu

Make sure to edit the pointcloud_topic and imu_topic input arguments with your specific topics. If an IMU is not being used, set the dlo/imu ROS param to false in cfg/dlo.yaml. However, if IMU data is available, please allow DLO to calibrate and gravity align for three seconds before moving. Note that the current implementation assumes that LiDAR and IMU coordinate frames coincide, so please make sure that the sensors are physically mounted near each other.
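
For intuition, the snippet below sketches the idea behind the three-second standstill: average accelerometer samples collected while the robot is static to estimate the gravity direction, then compute the rotation that aligns it with the world z-axis using Eigen. This is an illustration of the concept only, not DLO's actual implementation:

#include <Eigen/Dense>
#include <vector>

// Estimate an initial, gravity-aligned orientation from accelerometer samples
// gathered while the platform is stationary.
Eigen::Quaterniond gravityAlign(const std::vector<Eigen::Vector3d>& accel_samples) {
  Eigen::Vector3d mean = Eigen::Vector3d::Zero();
  for (const Eigen::Vector3d& a : accel_samples) mean += a;
  mean /= static_cast<double>(accel_samples.size());
  // While stationary the accelerometer measures the reaction to gravity, so the
  // averaged vector points "up" in the sensor frame; rotate it onto world +z.
  return Eigen::Quaterniond::FromTwoVectors(mean.normalized(), Eigen::Vector3d::UnitZ());
}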

If successful, RViz will open and you will see terminal output similar to the following:

(screenshot: RViz visualization alongside DLO terminal output)

Services

To save DLO's generated map into .pcd format, call the following service:

rosservice call /robot/dlo_map/save_pcd LEAF_SIZE SAVE_PATH
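
Here, LEAF_SIZE is the voxel leaf size (in meters) used to downsample the saved map and SAVE_PATH is the output location. Conceptually the service does something like the following PCL-based sketch; the point type, argument handling, and helper name are assumptions for illustration, not DLO's actual service handler:

#include <pcl/filters/voxel_grid.h>
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <string>

// Voxel-downsample the accumulated map and write it to disk as a .pcd file.
bool saveMapPCD(const pcl::PointCloud<pcl::PointXYZ>::Ptr& map,
                float leaf_size, const std::string& save_path) {
  pcl::PointCloud<pcl::PointXYZ> filtered;
  pcl::VoxelGrid<pcl::PointXYZ> voxel;
  voxel.setInputCloud(map);
  voxel.setLeafSize(leaf_size, leaf_size, leaf_size);
  voxel.filter(filtered);
  // save_path is assumed here to be the full output file path;
  // savePCDFileBinary returns 0 on success.
  return pcl::io::savePCDFileBinary(save_path, filtered) == 0;
}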

Test Data

For your convenience, we provide example test data here (9 minutes, ~4.2GB). To run, first launch DLO (with default point cloud and IMU topics) via:

roslaunch direct_lidar_odometry dlo.launch

In a separate terminal session, play back the downloaded bag:

rosbag play dlo_test.bag


Citation

If you found this work useful, please cite our manuscript:

@article{chen2022direct,
  author={Chen, Kenny and Lopez, Brett T. and Agha-mohammadi, Ali-akbar and Mehta, Ankur},
  journal={IEEE Robotics and Automation Letters}, 
  title={Direct LiDAR Odometry: Fast Localization With Dense Point Clouds}, 
  year={2022},
  volume={7},
  number={2},
  pages={2000-2007},
  doi={10.1109/LRA.2022.3142739}
}

Acknowledgements

We thank the authors of the FastGICP and NanoFLANN open-source packages:

  • Kenji Koide, Masashi Yokozuka, Shuji Oishi, and Atsuhiko Banno, “Voxelized GICP for Fast and Accurate 3D Point Cloud Registration,” in IEEE International Conference on Robotics and Automation (ICRA), IEEE, 2021, pp. 11054–11059.
  • Jose Luis Blanco and Pranjal Kumar Rai, “NanoFLANN: a C++ Header-Only Fork of FLANN, A Library for Nearest Neighbor (NN) with KD-Trees,” https://github.com/jlblancoc/nanoflann, 2014.

License

This work is licensed under the terms of the MIT license.

