
Robotic Humanoid hand

I am trying to build a robotic humanoid hand, using fingers actuated by servos.

I am working on a small system to measure the force on every wire rope with FlexiForce sensors, to provide force feedback for the servos. I use ROS on Raspberry Pis to control the servos and read the FlexiForce sensors.
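For the reading side, here is a rough sketch of how raw ADC counts could be turned into a force value, assuming the FlexiForce sensor sits in a voltage divider read by a 10-bit ADC (for example an MCP3008 on the Pi's SPI bus) and that two calibration points have been measured with known weights beforehand. All the constants are made-up placeholders, not real calibration data:

```python
def adc_to_force(raw_counts, zero_counts=15, ref_counts=620, ref_force_n=10.0):
    """Convert raw 10-bit ADC counts to force in newtons.

    FlexiForce conductance is roughly linear in applied force, so a
    two-point linear calibration (no-load reading and a reading under a
    known reference weight) is a reasonable first approximation.
    zero_counts / ref_counts / ref_force_n are hypothetical values and
    must be measured for the actual sensor and divider resistor.
    """
    if raw_counts <= zero_counts:
        return 0.0
    scale = ref_force_n / (ref_counts - zero_counts)
    return (raw_counts - zero_counts) * scale
```

In a ROS node, the result would typically be published on a topic (e.g. as a `std_msgs/Float32` per wire rope) so the servo controllers can subscribe to it.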

I am also searching for a good method to detect the angle of every phalanx with a computer, for example with an RGBD camera, a data glove, or anything else. Perhaps someone has a good idea.


Do you mean for input or for servoing feedback?

In case you want to be able to map the motions of your own hand to the robotic hand, I would suggest looking at the Leap Motion.

Thanks, I see they seem to have a good API as well.

Yes, I have one and was thinking of trying to use it in combination with an Oculus Rift for remote operation of a robot hand I am helping my girlfriend build, if I ever find the time…

If you have a Leap Motion device, does it detect all the angles of every phalanx well? I mean, if you bend a finger just a little, does it detect that? Or is it rougher, only detecting whether the finger is open or closed?

It has good accuracy for most positions and usually detects even small changes in angles.

The sensor is not very impressive, but they have a good hand-model that they map the readings to and good filtering software. It works pretty well for hand tracking but not much else…

Cool, thanks for the info.


Here is my first prototype of a force tester on a wire rope. It's very big, but it works very well. The wire rope presses against a FlexiForce sensor.

And the backside:

Perhaps someone has some good ideas?


Now it's smaller, but still quite big.

Have you considered measuring the current used by the servos, or reading the load from a smart servo like the Dynamixel, to determine tension?

Yes, I have considered this, but I have not yet looked at which boards can provide this reading. I cannot see how you can read the load from the servo you mentioned; nothing about reading the load is mentioned in the PDF documentation?

P.S. I bought a Leap Motion, but I don't have much time to look at it right now.


The Dynamixel servos use a serial protocol.

If you want, you can probably use whatever board you like together with a current-sensing module, or something similar.
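As a sketch of what reading such a module would involve, assuming an ACS712-style hall-effect current sensor (5 A variant: 2.5 V output at zero current, 185 mV/A sensitivity) read by a 10-bit ADC; the constants are the typical datasheet values, not measured ones:

```python
def adc_to_current(raw, vref=5.0, adc_max=1023, zero_v=2.5, sens_v_per_a=0.185):
    """Convert a raw 10-bit ADC reading of an ACS712-style sensor to amps.

    The sensor outputs Vcc/2 at zero current and swings up or down by
    its sensitivity per amp, so positive and negative currents can be
    distinguished. zero_v drifts in practice and should be calibrated
    with the servo unpowered.
    """
    volts = raw * vref / adc_max
    return (volts - zero_v) / sens_v_per_a
```

Servo current correlates with load torque, which in turn correlates with wire-rope tension, but the mapping is nonlinear and servo-specific, so it would still need calibration against the FlexiForce readings.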

There is a ROS stack for interfacing with Dynamixel servos, but I am not sure whether the XL-320 is supported…
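For reference, here is a minimal sketch of building a read-request packet in Dynamixel Protocol 1.0, as used by the AX/MX series (note that the XL-320 speaks the newer Protocol 2.0, which has a different packet layout). On the AX-12, Present Load sits at address 40 in the control table as a 2-byte value, with the magnitude in bits 0-9 and the direction in bit 10:

```python
def dxl1_read_packet(servo_id, address, num_bytes):
    """Build a Dynamixel Protocol 1.0 READ_DATA (0x02) instruction packet.

    Layout: 0xFF 0xFF <id> <length> <instruction> <params...> <checksum>,
    where length = number of params + 2 and the checksum is the inverted
    low byte of the sum of everything after the two-byte header.
    """
    body = [servo_id, 4, 0x02, address, num_bytes]
    checksum = (~sum(body)) & 0xFF
    return bytes([0xFF, 0xFF] + body + [checksum])

# Request the 2-byte Present Load register (address 40) from servo ID 1.
pkt = dxl1_read_packet(1, 40, 2)
```

The packet would be written to the serial port and the servo's status packet read back; in practice a library such as the ROBOTIS Dynamixel SDK handles this framing for you.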

Nice info again, thanks, I need to test this. My thinking behind having a stand-alone tension tester is that someone could use other methods of actuating the wire ropes, like fluidic muscles or linear motors.

But trying these sensors is worth it anyway, thanks.


I made it smaller. It could get smaller still, but it already produces good responses.

I changed it a little and printed 8 of them. Here is the full setup.

Now I need to connect the FlexiForce sensors. After that, and with the information the Leap Motion device should deliver about the finger, I think I will have enough information to train a controller with reinforcement learning, so that the fingers can learn by themselves how to move in order to close or open the hand.
But first, there is a bit of work to do.
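As an illustration of what that learning loop could look like, here is a toy tabular Q-learning agent (deliberately not a neural network) on a completely made-up one-joint model: the state is a discretised finger angle, the two actions are "release wire" and "tighten wire", and the reward favours reaching a target angle. A real setup would use the measured tensions and Leap Motion angles instead:

```python
import random

def train_finger_policy(n_states=10, target=9, episodes=300, seed=0):
    """Tabular Q-learning on a toy 1-DOF finger.

    State: discretised angle 0..n_states-1.
    Actions: 0 = release wire (angle - 1), 1 = tighten wire (angle + 1).
    Reward: +1 on reaching the target angle, -0.01 per step otherwise.
    """
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(n_states)]
    alpha, gamma, eps = 0.5, 0.9, 0.2  # learning rate, discount, exploration
    for _ in range(episodes):
        s = rng.randrange(n_states)
        for _ in range(50):
            # Epsilon-greedy action selection.
            a = rng.randrange(2) if rng.random() < eps else max((0, 1), key=lambda x: q[s][x])
            s2 = max(0, min(n_states - 1, s + (1 if a == 1 else -1)))
            r = 1.0 if s2 == target else -0.01
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
            if s == target:
                break
    return q

def greedy_rollout(q, start=0, target=9, max_steps=30):
    """Follow the learned policy greedily; True if the target is reached."""
    s = start
    for _ in range(max_steps):
        if s == target:
            return True
        a = max((0, 1), key=lambda x: q[s][x])
        s = max(0, min(len(q) - 1, s + (1 if a == 1 else -1)))
    return s == target
```

The same structure (observe state, pick action, apply reward, update) carries over when the state comes from real sensors, though continuous angles and eight tendons would push towards a function approximator rather than a table.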

That's prototyping: I made a new one, combining the servo holder and the force tester into one small piece.

With this setup I reduced the self-made parts, like the aluminium pipe for winding and the aluminium connector to the servo. The force tester still needs some metal/aluminium parts, but these are neither complex nor time-consuming. Now I need 8 of them.

Again, I made it smaller. It won't get much smaller than this with the FlexiForce sensor. I took a picture which shows that only a minimum of self-made parts is needed for this combined servo holder with force/tension sensor, and it works pretty well.

Wow, I like this setup. It is easy to handle (even if it perhaps doesn't look it) and works pretty well. And it is relatively small. I printed a ring, so I can easily plug every single combined servo holder into a hole in this ring.

Which, seen from the bottom, looks a little bit like the ATLAS experiment at the LHC (Large Hadron Collider) :slight_smile:

And fully assembled (ok, without the wires to the FlexiForce sensors) it looks like this:

You can see that it has a very small base.

I want the computer to know which position every phalanx is in. @samiam: I tried the Leap Motion device, but it seemed not well supported at all, and it does not recognize the fingers as well as described in their info.

But I have an Xtion Pro and a Kinect One (v2); the Kinect I got running with iai_kinect2. But now I can't find a way to extract a single object, get its coordinates, and map them onto a URDF model. Perhaps someone could help me.
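Within ROS, the usual route for this is PCL-based filtering on the PointCloud2 topic (for example a crop-box filter followed by Euclidean cluster extraction), with the result broadcast as a TF frame that can then be related to the URDF link frames. As a library-free sketch of the core extraction step, here it is in plain Python, with the bounding box assumed to be known in advance (for instance, the volume above a table):

```python
def object_centroid(points, box_min, box_max):
    """Crude object extraction from a point cloud.

    Keeps only the (x, y, z) points inside an axis-aligned bounding box
    and returns their centroid, or None if the box is empty. In a ROS
    pipeline the centroid would be published as a TF frame in the camera
    frame, then transformed into the robot's base frame for comparison
    with the URDF model.
    """
    inside = [p for p in points
              if all(lo <= c <= hi for c, lo, hi in zip(p, box_min, box_max))]
    if not inside:
        return None
    n = len(inside)
    return tuple(sum(p[i] for p in inside) / n for i in range(3))
```

A real point cloud from iai_kinect2 has hundreds of thousands of points and NaN entries, so in practice the PCL implementations (which also handle downsampling and clustering) are the better choice; this just shows the idea.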