I’m just getting started with ROS, and was finally able to get ROS installed on Linux Mint 17.1 rather than in VirtualBox or on some older, slower hardware I had lying around.
I’ve been working with MyRobotLab but have had a lot of issues with Java not running well on 64-bit Linux Mint: lots of random crashes. Running it on a Win10 box brought other problems, especially when trying to use the OpenCV service and tracking.
Ultimately I want to do some experimentation with physical object manipulation and computer vision, but I’m taking baby steps at first.
I have a head constructed with dual cameras for eyes, and using MyRobotLab I’ve been able to do some basic object tracking (on a good day, with the Buffalo running on the west side of the mountain :)). But between the Java issues I’ve been experiencing and the shifting MRL APIs, it has been difficult to get the reliability I need.
So I’ve moved to ROS, and have done some of the earlier tutorials. I’ve written some Arduino code for controlling the servos in the head and have successfully tested that using rostopic.
In MyRobotLab the Servo service had the ability to specify MinMax values for limiting the servo output, as well as a mapping function that maps the logical 0–180 degree range onto the servo’s physical limits. For example, in the head I may need to limit eye movement on the Y axis to a fairly small range.
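To illustrate what I mean, here is a minimal sketch of that clamp-plus-map behavior (the function name and the 40–70 degree limits are just made-up examples, not anything from MRL or ROS):

```python
def map_servo(angle, in_min=0.0, in_max=180.0, out_min=40.0, out_max=70.0):
    """Map a logical angle (0-180) onto a restricted physical range.

    The input is clamped first, so the physical output can never
    exceed the configured out_min/out_max limits.
    """
    # Clamp the commanded angle to the logical input range.
    angle = max(in_min, min(in_max, angle))
    # Linear interpolation from the logical range to the physical range.
    return out_min + (angle - in_min) * (out_max - out_min) / (in_max - in_min)
```

So a command of 90 would land in the middle of the physical range (55 here), and anything outside 0–180 would be pinned to the nearest limit.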
I’m assuming that there may be some code already available in ROS for this type of implementation but I haven’t located it yet.
So before I reinvent the wheel, I thought I would ask whether there is generic code for implementing a mapping layer that manages the servo(s), with the mapped output then sent to the Arduino code via rosserial to handle the low-level movement of the servos.
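If nothing exists, something like the following is roughly what I would write myself: a small per-servo mapping layer whose output would then be published to the Arduino over rosserial (e.g. as a std_msgs/UInt16 per servo topic). All names and limits here are hypothetical, just to show the shape of it:

```python
class ServoMapper:
    """Hypothetical mapping layer: holds per-servo physical limits and
    converts logical 0-180 commands into clamped physical angles.

    In a real node, map() would be called from a subscriber callback
    and its result published on the topic the Arduino listens to.
    """

    def __init__(self):
        # name -> (out_min, out_max) physical limits in degrees
        self.limits = {}

    def add_servo(self, name, out_min, out_max):
        self.limits[name] = (float(out_min), float(out_max))

    def map(self, name, angle):
        out_min, out_max = self.limits[name]
        # Clamp the logical command, then interpolate into the
        # physical range so the servo can never be driven past its limits.
        angle = max(0.0, min(180.0, angle))
        return out_min + angle * (out_max - out_min) / 180.0
```

For the eye-Y case I mentioned, I’d register something like `add_servo("eye_y", 70, 110)` so a full 0–180 sweep only moves the servo between 70 and 110 degrees.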