Hi guys. I'm currently working on an agricultural rover whose function is to identify whether a plant detected by the mounted camera is a crop or a weed, and to eliminate any weed it finds using a high-intensity laser. The design consists of a base rover and a robotic arm on top of it, with cameras attached for the computer-vision aspect.
I'm looking for some guidance on the autonomous navigation of the rover. I'm planning to use ROS for this. The main goal is to navigate the rover on agricultural terrain without any human intervention (completely autonomous). If someone could guide me on how to go about this, it would be very helpful.
Quite honestly, and with the best of intent, that's a big ask, and we'd need much more information about the robot, the environment and terrain you'll have to traverse, and your requirements to give you any good advice. 'Agricultural environments' is a very broad category, ranging from trivially easy to cases where SOTA solutions aren't quite as reliable as I'd like. This might not be a reasonable thing for us to answer for you in full.
This is potentially a very big task, and there may not be an easy tutorial or documentation to walk you through it step by step. I'm not sure if this is for a company, a school project, or just a hobby, but I'd suggest consulting resources appropriate for your stage. Outdoor and especially agricultural robots are hard for various and potentially subtle autonomy, electromechanical, and environmental reasons. Books could be written about it.
(But you could also be talking about something easy and doable, I'm not sure.)
As Steve said, this is potentially a huge undertaking, but a couple of comments on how I’d approach the problem (without too much thought):
- Make sure you have the physical operation/control of the mobile platform and the arm fully working and understood, independently
- Make sure your sensor data is appropriately transformed, especially if you are planning on using arm-mounted sensors for navigation
- Achieve autonomous navigation in an indoor/controlled environment with the arm in a fixed pose
- As above, but in an outdoor environment
- Begin working on the arm manipulation in a controlled environment, then outdoors
- Have a simulation set up to make this easier!
So much can become more difficult when testing things on rough ground with changing environmental conditions, and field testing is generally costly, so you want to make sure any “easy problems” are solved before doing that (while being aware of the assumptions being made), then tackle your domain-specific problems.
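On the second point, getting frame transforms right is worth nailing down early. In ROS you would let tf2 handle this from your URDF, but the underlying idea is just composing homogeneous transforms. A minimal pure-Python sketch, with made-up mounting offsets standing in for your real robot's geometry:

```python
import math

def make_transform(x, y, z, yaw):
    """4x4 homogeneous transform: translation (x, y, z) plus a rotation about z by yaw."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [
        [c,  -s,  0.0, x],
        [s,   c,  0.0, y],
        [0.0, 0.0, 1.0, z],
        [0.0, 0.0, 0.0, 1.0],
    ]

def matmul(a, b):
    """Compose two 4x4 transforms."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)] for i in range(4)]

def apply(t, p):
    """Transform a 3D point (as a tuple) by a 4x4 transform."""
    v = [p[0], p[1], p[2], 1.0]
    return tuple(sum(t[i][k] * v[k] for k in range(4)) for i in range(3))

# Hypothetical mounting geometry -- replace with your robot's actual offsets.
base_to_arm = make_transform(0.30, 0.0, 0.50, 0.0)            # arm mount 30 cm forward, 50 cm up
arm_to_camera = make_transform(0.10, 0.0, 0.20, math.pi / 2)  # camera on the wrist, yawed 90 deg

base_to_camera = matmul(base_to_arm, arm_to_camera)

# A weed detected 1 m straight ahead of the camera, expressed in the camera frame,
# ends up at (0.4, 1.0, 0.7) in the rover's base frame:
weed_in_base = apply(base_to_camera, (1.0, 0.0, 0.0))
```

In a real setup these transforms come from your URDF via robot_state_publisher and tf2, so you would look up base_link → camera rather than hard-coding it; the sketch only shows what that lookup computes.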
I am Venkat from GURUV.
We are also working on the same problem, and on something a bit more advanced.
Are you from India? We are from Hyderabad, Telangana.
Could we meet to discuss this in a bit more detail and work out a solution?
With best regards
Hi. Thanks for the reply, I too am from Hyderabad. I'm currently focusing on simulating this in a virtual environment using RViz/Gazebo. I need to figure out the autonomous navigation using the Nav2 stack. Maybe we could use a SLAM algorithm for this, if I'm not wrong. For your information, this is a college project and we are only looking at simulating it.
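For a feel of what the mapping half of SLAM produces, here is a toy occupancy grid in plain Python. This is not the full SLAM problem (the rover's pose is assumed known here, which is precisely what SLAM has to estimate), and none of it is Nav2 API; it is just a sketch of the map structure a package like slam_toolbox maintains and Nav2 plans over:

```python
import math

# Toy occupancy grid, the data structure a SLAM package builds up.
# 0 = unknown, 1 = free, 2 = occupied.
SIZE, RES = 20, 0.5          # 20x20 cells, 0.5 m per cell
grid = [[0] * SIZE for _ in range(SIZE)]

def mark_ray(x0, y0, bearing, rng):
    """Trace one lidar ray from pose (x0, y0): free space along it, a hit at the end."""
    steps = int(rng / RES * 2)
    for i in range(steps + 1):
        d = i * rng / steps
        cx = int((x0 + d * math.cos(bearing)) / RES)
        cy = int((y0 + d * math.sin(bearing)) / RES)
        if 0 <= cx < SIZE and 0 <= cy < SIZE:
            grid[cy][cx] = 2 if i == steps else max(grid[cy][cx], 1)

# Rover at (5 m, 5 m) sweeps a full scan; everything it sees 4 m away is an obstacle.
for k in range(36):
    mark_ray(5.0, 5.0, k * 2 * math.pi / 36, 4.0)
```

In your simulation you would get this for free by running slam_toolbox against Gazebo's simulated lidar and watching the map build up in RViz; Nav2 then plans paths over the free cells.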
Thanks for your reply. So we have two aspects to this project:
1) Actuating the robotic arm on the rover. We have worked out the kinematic transformation matrices for the arm mounted on the rover and have also verified that its motion is repeatable.
2) Navigation of the base rover, which is where I need help. We are going ahead with a design similar to the Husky robot from Clearpath Robotics. For now we are only looking at getting the simulation working; we are not building a physical model, since this is a college project. I was wondering whether we could use the Nav2 stack, with a GPS sensor to localize the rover and SLAM to build a map of the terrain for autonomous navigation. Any suggestions or improvements to this idea are absolutely welcome, and I would be much obliged for any guidance from your end. Please note that we are only looking at performing the simulation for now.
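One note on the GPS idea: GPS gives latitude/longitude, while Nav2 plans in a local metric frame, so you need a conversion between the two. In a real stack this is typically handled by robot_localization's navsat_transform_node, but the underlying math for a field-sized area is a simple flat-earth approximation. A sketch, with a made-up origin coordinate:

```python
import math

EARTH_RADIUS = 6371000.0  # metres, mean Earth radius

def gps_to_local(lat, lon, origin_lat, origin_lon):
    """Convert a GPS fix to x (east) / y (north) metres from a chosen origin,
    using a flat-earth (equirectangular) approximation -- fine for field-sized areas."""
    d_lat = math.radians(lat - origin_lat)
    d_lon = math.radians(lon - origin_lon)
    x = EARTH_RADIUS * d_lon * math.cos(math.radians(origin_lat))
    y = EARTH_RADIUS * d_lat
    return x, y

# Hypothetical field corner as the map origin (roughly Hyderabad, just as an example),
# and one waypoint 0.001 deg (~111 m) north of it:
origin = (17.3850, 78.4867)
x, y = gps_to_local(17.3860, 78.4867, *origin)
```

Over field-scale distances the error of this approximation is negligible; over many kilometres you would switch to a proper UTM conversion, which is also what navsat_transform_node effectively gives you inside the Nav2/robot_localization pipeline.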