I’ve spent over a hundred hours connecting robots with LTE - Ask Me Anything

Hi Val,

Happy to hear you are finding the discussion useful!

I’m glad you found Peplink SpeedFusion! We highly recommend their routers for robotics applications. I have an upcoming post about routers we have seen customers use successfully, where I talk a bit more about Peplink and SpeedFusion.

It is really easy to configure and deploy a ROS 1 robot that will handle a loss of connectivity poorly in a tele-operation scenario. To combat this, we recommend adding safety at the motor controller level, so that your motor controllers never execute commands with a stale timestamp, and always testing. The UDP Bridge ROS package looks really cool! I didn’t know that existed.
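To make that concrete, here is a minimal sketch of the kind of command watchdog we mean, written for ROS 1 with rospy. The topic names (`cmd_vel` in, `cmd_vel_safe` out to the motor controller) and the 200 ms cutoff are assumptions for the example; tune them to your control loop and put the final check as close to the motor controller as you can.

```python
#!/usr/bin/env python
# Minimal teleop watchdog sketch: forwards velocity commands only while they
# are fresh, and publishes zero velocity once they go stale.
# Topic names and the 200 ms timeout are assumptions for this example.
import rospy
from geometry_msgs.msg import Twist

TIMEOUT = rospy.Duration(0.2)  # treat commands older than 200 ms as stale


class CmdVelWatchdog(object):
    def __init__(self):
        self.last_rx = rospy.Time(0)
        self.last_cmd = Twist()
        self.pub = rospy.Publisher('cmd_vel_safe', Twist, queue_size=1)
        rospy.Subscriber('cmd_vel', Twist, self.on_cmd)
        rospy.Timer(rospy.Duration(0.05), self.on_timer)  # 20 Hz output

    def on_cmd(self, msg):
        self.last_rx = rospy.Time.now()
        self.last_cmd = msg

    def on_timer(self, _event):
        if rospy.Time.now() - self.last_rx > TIMEOUT:
            self.pub.publish(Twist())  # stale: command zero velocity
        else:
            self.pub.publish(self.last_cmd)


if __name__ == '__main__':
    rospy.init_node('cmd_vel_watchdog')
    CmdVelWatchdog()
    rospy.spin()
```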

That’s fantastic that you are monitoring the SNR. Another member of our Robotics team at Freedom just posted about our new resource management feature that will automatically log lots of great data, including network errors and network bandwidth. If your radio publishes the SNR, you could also log that along with GPS location and use our API to build a heat map of connectivity to optimize the routing of your vessels away from areas prone to poor signal.
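As a very rough local sketch of that heat-map idea (this does not use our API; the CSV file name and column names are made up for the example), logged SNR and GPS samples can be plotted directly with matplotlib:

```python
# Rough sketch of the connectivity heat-map idea: plot logged SNR samples
# over their GPS positions. Assumes a CSV log with columns
# "lat", "lon", "snr_db" -- file name and columns are placeholders for this
# example, not part of any particular API.
import pandas as pd
import matplotlib.pyplot as plt

log = pd.read_csv('connectivity_log.csv')

plt.scatter(log['lon'], log['lat'], c=log['snr_db'], cmap='RdYlGn', s=10)
plt.colorbar(label='SNR (dB)')
plt.xlabel('Longitude')
plt.ylabel('Latitude')
plt.title('Cellular SNR along the vessel track')
plt.show()
```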

Cheers,
Alex


@vschmidt - what does your SNR look like on the radio? I’m curious to see some graphs. How do you record and plot this in real time, and what kind of insights can you gather from it? You mentioned anticipated losses - I’m curious how you view and predict those in the data.

We recently released a feature that we built to track network dropouts along with CPU utilization and bandwidth in a robot-specific way - we’d love to add this if it’s relevant! https://www.freedomrobotics.ai/blog/robotics-resource-monitor

Would love to learn more!


Hi!

I’ve built a solar rover with an RPi Zero, running standard Raspbian, and a Sierra Wireless AirCard 320U - consumer-grade stuff.

I have a problem with connectivity. I power my robot up for 5 minutes on the hour, every hour. It either boots and connects to our VPN perfectly after 1 minute 45 seconds, or it never finds signal and doesn’t connect.

This seems too perfect; I’d expect a bigger spread of time taken to find signal and connect. Any pointers as to why sometimes my 'bot boots but can’t find signal? Should I change a Linux thing to make it look harder/with a lower timeout?

Hi @whatuptkhere, Great question!

Does your robot always power on in the same location? If that is the case, then the signal strength should be relatively constant, and the issue likely resides inside your robot.

The AirCard 320U is a USB device, and many computers have trouble recognizing a USB device on initial startup. You are seeing a good connection established at very consistent times, so it sounds to me like there is an issue in the boot sequence: the AirCard does not start communicating at exactly the same point in each startup, which causes inconsistencies in establishing a connection.

One solution would be to write a simple script that, 2:30 after startup (or whatever point at which you are confident a connection will not establish on its own), checks for an established connection and, if there is none, resets or power-cycles the USB device. This lets the AirCard connect after the compute is fully up and running, and should make the connection much more reliable.
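Here is a rough sketch of that script in Python, assuming the connectivity check is a single ping and that the AirCard can be re-enumerated by unbinding and rebinding its USB port through sysfs. The port ID and ping target are placeholders; look up the real port with `lsusb -t` and run the script as root (e.g. from a systemd unit or `@reboot` cron entry).

```python
#!/usr/bin/env python3
# Sketch of a boot-time connectivity watchdog: wait a while after startup,
# and if there is still no connection, unbind/rebind the modem's USB port.
# The port ID and ping target are placeholders for this example.
import subprocess
import time

USB_PORT = '1-1.2'        # placeholder: find yours with `lsusb -t`
PING_TARGET = '8.8.8.8'   # placeholder: any host reachable over the VPN/LTE
WAIT_SECONDS = 150        # 2:30 after startup, per the suggestion above


def is_connected():
    """Return True if a single ping succeeds."""
    return subprocess.call(
        ['ping', '-c', '1', '-W', '5', PING_TARGET],
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL) == 0


def reset_usb(port):
    """Unbind and rebind the USB port so the modem re-enumerates."""
    for action in ('unbind', 'bind'):
        with open('/sys/bus/usb/drivers/usb/{}'.format(action), 'w') as f:
            f.write(port)
        time.sleep(2)


if __name__ == '__main__':
    time.sleep(WAIT_SECONDS)
    if not is_connected():
        reset_usb(USB_PORT)  # needs root (or suitable udev permissions)
```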

If your robot is booting up in a different location each time, then signal strength could be causing the issue, in which case I would recommend mounting the antenna as high as possible and maybe switching to a router that supports an external antenna to boost your signal. If you are looking to stay in the consumer-grade product range, the Huawei E8372h-517 is a great option: it has two TS-9 connectors for an external antenna.

Cheers,
Alex

Hi,

You may try this cool new Pi HAT:
https://www.avnet.com/wps/portal/us/products/new-product-introductions/npi/monarch-go-pi-hat/
It should allow every Pi user to get cellular connectivity with almost no effort.

Gadi


Thanks for the help!

It’s a combination of both. In high signal areas, the robot works on 99% of boot ups. When parked in a low signal area, it’ll miss 50% or more. I’m implementing the USB reset idea and am keen to see if it works!

Another question - if I want to track my robot in real time on a map - what web app or service would be good to use? I have the GPS data and can send it to whatever app or service, I just want a web page where I can see my robot’s little dot moving in real time.

Hi, this is a great discussion on a timely topic. Thanks, Alex, for starting it and writing an excellent blog post.

5G core networks have a feature called the Network Exposure Function (NEF) [1]. Third-party applications can programmatically (over HTTP REST) request the NEF to assign a specific QoS to specific IP flows. This removes the need to use multiple SIM cards with different plans. A similar feature, called the Service Capability Exposure Function (SCEF), exists in LTE core networks too. Your operator may provide such exposure features, so talk to them.
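To give a feel for what such a request looks like, here is a hedged sketch loosely modeled on the 3GPP AsSessionWithQoS exposure API (TS 29.522). The base URL, credentials, QoS reference name, and flow description are all placeholders; your operator’s exposure gateway defines the real values and its structure may differ.

```python
# Hedged sketch of asking the NEF for a specific QoS on an IP flow, loosely
# modeled on the 3GPP "AsSessionWithQoS" exposure API (TS 29.522).
# Base URL, AF identifier, token, QoS reference and flow description are all
# placeholders -- your operator's exposure gateway defines the real ones.
import requests

NEF_BASE = 'https://nef.example-operator.com/3gpp-as-session-with-qos/v1'
AF_ID = 'my-robot-backend'             # placeholder application identifier
TOKEN = 'REPLACE_WITH_OPERATOR_TOKEN'  # placeholder credential

subscription = {
    'ueIpv4Addr': '10.45.0.23',            # robot's IP on the cellular link
    'qosReference': 'low-latency-teleop',  # operator-defined QoS profile name
    'notificationDestination': 'https://backend.example.com/qos-events',
    'flowInfo': [{
        'flowId': 1,
        'flowDescriptions': [
            'permit out udp from 10.45.0.23 to any 9000',  # teleop UDP flow
        ],
    }],
}

resp = requests.post(
    '{}/{}/subscriptions'.format(NEF_BASE, AF_ID),
    json=subscription,
    headers={'Authorization': 'Bearer {}'.format(TOKEN)},
    timeout=10,
)
resp.raise_for_status()
print('QoS subscription created:', resp.headers.get('Location'))
```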

I struggle to differentiate between the network QoS assigned to publishers and subscribers in communicating nodes. Say, for example, a publisher P1 requires normal-latency QoS while P2 requires low-latency QoS, with P1 and P2 in the same node. The solution I usually adopt is to break the publishers out into dedicated applications or assign the highest-quality QoS to both. What is your experience in handling such varied QoS requirements in communicating nodes via ROS interfaces?
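For completeness, ROS 2 QoS profiles can at least be set per publisher, so P1 and P2 can carry different delivery settings inside one node, as in the minimal rclpy sketch below (topic names are hypothetical). These profiles only control DDS delivery semantics, though, not the network’s per-flow QoS, which is why I still end up splitting publishers into separate applications when the IP flows need to be distinguished.

```python
# Minimal rclpy sketch: two publishers in one node, each with its own ROS 2
# QoS profile. These profiles control DDS delivery semantics (reliability,
# queue depth), not the cellular network's per-flow QoS.
# Topic names are hypothetical.
import rclpy
from rclpy.node import Node
from rclpy.qos import QoSProfile, QoSReliabilityPolicy
from std_msgs.msg import String

rclpy.init()
node = Node('mixed_qos_node')

# P1: periodic status, losses are tolerable -> best effort
status_qos = QoSProfile(depth=10, reliability=QoSReliabilityPolicy.BEST_EFFORT)
p1 = node.create_publisher(String, 'status', status_qos)

# P2: latency-sensitive commands -> reliable, shallow queue
cmd_qos = QoSProfile(depth=1, reliability=QoSReliabilityPolicy.RELIABLE)
p2 = node.create_publisher(String, 'teleop_cmd', cmd_qos)

rclpy.spin(node)
```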

Hi,
I’m trying to remotely control a bot using ROS and ESP32.

The bot will have an ESP32 camera onboard and will transmit the video stream via Wi-Fi.
Is it possible to receive this video stream via ROS, which will be running on my laptop?

Thanks in advance.

Hi @Ajaykumaar_S! Ever heard of micro-ROS? In a nutshell, it is a library that lets you bring a ROS 2 node onto microcontrollers. Specifically, we already have a port to the ESP32 that supports WiFi. Moreover, since micro-ROS is natively interoperable with the ROS 2 ecosystem, you can leverage any of the ready-to-use solutions that bridge ROS 2 with ROS in order to put your micro-ROS node in communication with your ROS-operated bot. To this end, see e.g. how to do so with SOSS, for which a detailed tutorial can be found here. If you want to know more, don’t hesitate to contact us!


Hi @FraFin
Great suggestion! I’ll check it out.
Thank you :smiley:


I’ve been looking at micro-ROS for the past few days, and frankly it’s driving me crazy. :-) I’m trying to interface with a 6-axis arm that was run through an Arduino. I’m upgrading some old projects to ROS 2 and stumbled upon micro-ROS. Found out I’ll have to upgrade my uC - fine, I’ve got some ESP32s sitting around. Learned all about DDS, learned the new build systems for ROS 2 / micro-ROS… and then found out there’s basically zip out there for off-the-shelf stepper motion control in the ESP32/FreeRTOS ecosystem.

Honestly, it’s looking like I’ll probably pick another approach, like MQTT to a vanilla ESP32 sketch.
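For reference, the ROS 2 side of that MQTT route ends up being a small bridge node anyway; something like this rough sketch with paho-mqtt and rclpy (the broker address and topic names are made up for the example):

```python
# Rough sketch of the bridge node the MQTT route ends up needing:
# subscribe to an MQTT topic from the ESP32 and republish into ROS 2.
# Broker address, MQTT topic and ROS topic names are placeholders.
import rclpy
from rclpy.node import Node
from std_msgs.msg import String
import paho.mqtt.client as mqtt

rclpy.init()
node = Node('mqtt_bridge')
ros_pub = node.create_publisher(String, 'arm/joint_cmd', 10)


def on_message(client, userdata, mqtt_msg):
    # Forward the raw MQTT payload into a ROS 2 message.
    out = String()
    out.data = mqtt_msg.payload.decode('utf-8')
    ros_pub.publish(out)


client = mqtt.Client()  # paho-mqtt 1.x style; 2.x also wants a CallbackAPIVersion argument
client.on_message = on_message
client.connect('broker.local', 1883)     # placeholder broker address
client.subscribe('esp32/arm/joint_cmd')  # placeholder MQTT topic
client.loop_start()                      # MQTT network loop in a background thread

rclpy.spin(node)  # keep the ROS 2 node alive
```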

Hi @billieblaze,

It seems like an interesting case. I am the coordinator of the micro-ROS project, together with @FraFin.

Please contact us, and we can have a chat about your case.

Hey Jaime,
Thanks for reaching out! This is basically what I’m trying to convert at the moment: https://github.com/jesseweisberg/moveo_ros/blob/master/moveo_moveit/moveo_moveit_arduino/moveo_moveit_arduino.ino

I’d love to hear any thoughts you have on this. There’s really not a lot to it - just take MoveIt messages and update the steppers.

Hi @billieblaze, and welcome to the community! The problem with MQTT is that, in order to integrate your project with the ROS 2 environment, you’d have to build a dedicated bridge from scratch. Interoperability with ROS 2, by contrast, is a feature micro-ROS comes with natively, with no effort on the user’s side. As a matter of fact, we believe that migrating your example application to our user API should be super easy. You can take a look at our micro-ROS component for the ESP-IDF to get an idea of how to use it.
Good luck and come back to us if you need further guidance!

Thank you, @FraFin! I will definitely check that out, just sitting down to play with it for the weekend. I’ve spent the last week getting my idea to work if only to understand the ESP32 + libraries and the stepper motor configuration.

It is working, but I see what you’re saying: I’ll still have to run a node to translate somewhere. I’ve got a small Kubernetes homelab that I’m planning to run some of the compute on, and it could easily be a pod there, but it would certainly be much better to subscribe to the raw topics!

Hello, my team and I are trying to control a rover over long distances using ROS.
However, we have been having some issues with the radio transmitter/receiver because we don't know which one to use.
The idea is to control the robot and receive its camera video so we can see what it is looking at; there are 3 cameras that we want to view simultaneously.

Hope someone can help!
Thanks

Offhand, you might be able to get away with CBRS and avoid expensive connectivity bills. I haven’t been able to get good rate plans from the carriers for my image-acquisition devices; they typically want up to $10 a GB, when I need it to be pennies for my use case to be feasible. If you have good line of sight, you might be able to cover long distances with fixed wireless.

Is there a micro-ROS TDOA example that I can use with the ESP32? Thank you.

Hi @FraFin, I am trying to flash an ESP32 camera board with micro-ROS. Where can I find an ESP32 camera example that I can use, something like the int32_publisher program?
Thank you.


For controlling stepper motors with an ESP32, you could look at the new robolaunch Cloudy robot, as it uses two stepper motors to drive the robot.