Localization using only 2D LiDAR scan point clouds with deep learning techniques

Pushed something cool to GitHub: localization using only 2D LiDAR scan point clouds with deep learning techniques. If you are interested, help me design a better NN architecture and beat the baseline. Here is the link:

https://github.com/donymorph/deep_2Dlidar_localization


— Dony (@dony_morph) January 18, 2025

It seems like you are putting the tool (NN) before the problem (localization). What’s the motivation here? What’s the hypothesis for why a NN will do better than the Monte Carlo techniques that have been developed for over 30 years? Of course, I’m not against trying a new approach, but I don’t see existing approaches to localization in your comparison.

Also, won’t there be a big risk that your NN may overfit to the features of the environments you train on? The pure statistical approaches being used for localization in robotics right now don’t have that problem, because they are mathematically sound and these properties can be proven mathematically (theorems), rather than just measured empirically (“it works 99% of the time”).

In the beginning, I was thinking like you too, but in the end I decided to try this approach.
The primary motivation is to explore the potential of neural networks (NNs) in addressing the localization problem using 2D LiDAR scan point clouds as input, even though traditional methods like Monte Carlo Localization (MCL) and other probabilistic approaches already exist.

Neural networks can learn features and relationships directly from raw data, potentially bypassing the need for hand-engineered features or manually tuned parameters. NNs do require hyperparameter tuning, but once trained they can provide faster inference times and may handle complex environments. For now, this project targets indoor robots operating in a fixed area. I will compare it with other methods and run a benchmark. I shared the link because perhaps someone out there is working on, or thinking about, applying NNs to the localization problem.
The project skeleton is ready and there is sample data to try out as well; the remaining problem is designing a network architecture that fits this task.
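To make the task concrete, here is a minimal sketch of one candidate architecture: an MLP that regresses a pose directly from raw scan ranges. The class name, beam count (360), hidden size, and the (x, y, yaw) output convention are all assumptions for illustration, not the repo's actual code.

```python
import torch
import torch.nn as nn

class LidarPoseNet(nn.Module):
    """Hypothetical MLP regressor: raw 2D LiDAR ranges -> robot pose (x, y, yaw)."""
    def __init__(self, num_beams: int = 360, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_beams, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3),  # x, y, yaw
        )

    def forward(self, ranges: torch.Tensor) -> torch.Tensor:
        return self.net(ranges)

model = LidarPoseNet()
scan = torch.rand(8, 360)   # batch of 8 fake scans
pose = model(scan)
print(pose.shape)           # one 3-DoF pose per scan
```

A 1D-CNN over the beam dimension would be a natural next step, since neighboring beams are spatially correlated.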

Hand-engineered features? The whole idea of statistical methods is that they avoid subjectivity by avoiding the need for hand coding.

Yes, this can be a problem. So perhaps it would make sense to just train a NN to tune MCL methods?

I think that remains to be seen. It’s important to remain critical and ask where any such efficiency could come from. The BIG issue with NNs is that they are very opaque, so most of the time you actually cannot inspect them and tell why they are more efficient. As a result you cannot rule out that efficiency is just coming from memorization of specific training situations, which would not generalize well.

In any event, if this is meant as a research project then by all means, it’s great that you are willing to try it and I’ll be curious to see the results. If I were your research advisor, I would recommend focusing on a multi-sensor situation though (e.g., lidar + video + IMU), because those are much harder to unify under one well-defined statistical model (mostly due to the video) and are also more relevant to practical applications these days. Many robotics companies would like to shed their lidars in favor of much cheaper cameras if it were safe to do so.

“Hand-engineered features”: I was referring to the manual design of specific feature representations, such as scan matching (e.g., ICP), feature descriptors, grid histograms, and so on, which NNs can bypass by directly learning from raw data.
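For context, this is roughly what one such hand-engineered step looks like: a single ICP alignment iteration (nearest-neighbor matching plus an SVD rigid-transform solve). This NumPy sketch is illustrative only; a real ICP loops this step to convergence.

```python
import numpy as np

def icp_step(src: np.ndarray, dst: np.ndarray):
    """One ICP iteration in 2D: match nearest neighbors, solve R, t via SVD."""
    # Brute-force nearest-neighbor correspondences.
    d = np.linalg.norm(src[:, None, :] - dst[None, :, :], axis=2)
    matched = dst[d.argmin(axis=1)]
    # Center both point sets.
    mu_s, mu_d = src.mean(axis=0), matched.mean(axis=0)
    # Cross-covariance; its SVD yields the optimal rotation (Kabsch).
    H = (src - mu_s).T @ (matched - mu_d)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    return R, t

# Toy sanity check: aligning a cloud to itself gives identity rotation, zero shift.
src = np.random.rand(50, 2)
R, t = icp_step(src, src.copy())
```

Every piece here (the matching rule, the outlier handling a real pipeline would add, the convergence schedule) is a manual design decision; that is the tuning burden an end-to-end NN tries to sidestep.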

I see, that is the huge problem with NNs you mentioned: they need to be inspected to find out what is under the hood and why they work. I prefer the practical approach of learning by building something, rather than reading someone else’s work and believing false claims. Not always, though; there are brilliant articles, books, and frontier papers.

I appreciate the advice. Don’t worry, I’ll get there: start simple and evolve with it.

update:

  1. Implemented Optuna (https://optuna.org/) for a better hyperparameter search over a given NN. Based on the terminal output, you can design a NN that fits your dataset.

Dependencies: pip install optuna optuna-dashboard

Increase the number of trials and epochs to get a better NN.
reference video: