ALMA: Algebraic Machine Learning, ROS 2 & Fast DDS

During the last ROS World, we presented ALMA, a new AI project based on a new technology: Algebraic Machine Learning (AML).

This new machine learning algorithm proposes a totally different approach to AI. Instead of a statistics-based approach in which the learning process is a “black box”, it is based on an algebra, leading to very interesting features.

Using this algebra, you can describe what you already know about the problem, and unlike statistical learning, AML algorithms are robust with respect to the statistical properties of the data and are parameter-free.

eProsima, the company behind Fast DDS, is the coordinator of the ALMA project and is also in charge of the middleware: AML can be implemented in a distributed way, facilitating a new distributed, incremental, collaborative learning method that goes beyond the dominant offline and centralized data-processing approach.

And of course, part of our work is to integrate AML with ROS 2 and Fast DDS. The goal is to enable distributed learning in a ROS 2 system, creating robots with the ability to share their knowledge and learn together. Cool, right?
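To make the idea of robots sharing knowledge a bit more concrete, here is a minimal toy sketch of the distributed, incremental learning pattern. This is NOT the AML engine (which is not public) and not real ROS 2 code: the topic bus is an in-memory stand-in for a ROS 2 topic over Fast DDS, and the “model” is just a set of learned facts. All names here are hypothetical illustrations.

```python
class KnowledgeTopic:
    """In-memory stand-in for a ROS 2 topic carrying model updates."""
    def __init__(self):
        self.subscribers = []

    def publish(self, sender, update):
        # Deliver the update to every peer except the sender,
        # mimicking a pub/sub topic with no central server.
        for robot in self.subscribers:
            if robot is not sender:
                robot.on_update(update)

class Robot:
    def __init__(self, name, topic):
        self.name = name
        self.knowledge = set()   # toy model: a set of learned facts
        self.topic = topic
        topic.subscribers.append(self)

    def learn(self, fact):
        # Incremental local learning step, then share the update with peers.
        self.knowledge.add(fact)
        self.topic.publish(self, {fact})

    def on_update(self, update):
        # Merge a peer's update into the local model.
        self.knowledge |= update

topic = KnowledgeTopic()
a = Robot("robot_a", topic)
b = Robot("robot_b", topic)
a.learn("obstacle at dock 3")
b.learn("charger free at bay 1")
print(a.knowledge == b.knowledge)  # True: both hold the union of local learning
```

In a real deployment, the `publish`/`on_update` pair would map onto a ROS 2 publisher and subscription, with Fast DDS handling discovery and transport between robots.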

The ALMA consortium includes some of the best AI research institutes in Europe, as well as the original authors of the Algebraic Machine Learning theory, and has an initial budget of €4,000,000.

ALMA consortium members: eProsima, Algebraic.ai, Champalimaud, DFKI, INRIA, KAISERSLAUTERN, VTT, UC3M and Fiware Foundation.


I also want to highlight the huge impact of this project on ROS.

For the European Commission, robotics and AI go together, and they are funding projects combining both areas, such as ALMA.

In this case, we also count on DFKI, the biggest European research institute on AI, which is in charge of many European research projects in this area, with an investment of billions of euros.

ALMA already counts on a team of 25 engineers and researchers, combining fundamental research on AI with robotics.


Where’s the code hosted? I can’t find any links to it. Any examples of using it for some canonical ML tasks, with comparisons?


As a mathematician-statistician, I must strongly protest against such nonsense. There’s nothing black-box in statistics itself, unless by “black box” we mean “complicated”. Today’s ML is indeed a bit of a black box, because it evolved into a black box of simple basic statistical models like linear regression, decision trees, or perceptrons, “ensembled” into “random” structures whose only evidence of correctness is good performance on some dataset.
I don’t think we can explain or predict anything with pure algebra without taking into account “randomness”.

Hi @smac,

The code of the initial AML engine is not public, but the theory has already been published.

In this article you have a couple of examples. Algebraic Machine Learning is a new theory. It is not a standardized and widely used technique like neural networks, so we are working on the foundations.

AML shows comparable performance on typical problems such as MNIST, but we are not following a statistical path. Currently, many ML papers are based on training combinations of different types of neural networks on canonical problems, but that is not our goal. Instead, ALMA will further develop the math behind these ideas. In the coming months, we will publish several articles in this direction.

I think we are saying exactly the same:

Today’s ML is indeed a bit of a black box, because it evolved into a black box of simple basic statistical models like linear regression, decision trees, or perceptrons, “ensembled” into “random” structures whose only evidence of correctness is good performance on some dataset.

Please read the main article. It is hard to follow at times, but it is sure to surprise you. We are organizing some presentations about the theory itself, and you are very welcome to join if you are interested.

Ah, got it. I think it’s worth a bigger announcement once there are some publicly available resources. It’s hard to think much of this if I can’t “touch or feel” it. This sounds like an interesting concept, but I can’t spend much time digging into it until there’s something tangible to work with, even if it starts off basic like MNIST. That’s probably enough to start working on ML based on signal detections, even if we’re not at full-scale image detection yet.


Let me ask the main authors of the theory to see what more I can provide you now. They are thinking of including a small demo of the concept in our next article; I will keep you posted.

The black-boxness of today’s ML is not because it’s rooted in some basic probability theory, but rather because it diverged from this mathematical formalism. Of course, there are some advantages to the brute-force approach, but I hope people will start to realize how poorly it generalizes. Nevertheless, good luck with your research; maybe I’ll find some time to at least familiarize myself with the main concepts. Algebra is definitely less friendly and intuitive than statistics, and as such is a better candidate for a black box for the masses :slight_smile:
