I am working on a little system to measure the force on every wire rope with FlexiForce sensors, to get force feedback for the servos. I use ROS on Raspberry Pis to control the servos and read the FlexiForce sensors.
I am also searching for a good method to detect the angle of every phalanx with a computer, e.g. with an RGB-D camera, a digital glove, or anything else. Perhaps someone has a good idea.
thanks
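Since the Raspberry Pi has no analog inputs, FlexiForce sensors are typically read through an external ADC, wired as one leg of a voltage divider (the sensors are piezoresistive: conductance rises roughly linearly with force). Below is a minimal sketch of the counts-to-force conversion; every constant (reference voltage, divider resistor, calibration slope) is an assumption for illustration and would have to be measured for the real setup:

```python
# Hedged sketch: convert a raw ADC reading from a FlexiForce sensor
# (wired as the upper leg of a voltage divider) into an estimated force.
# All constants below are ASSUMPTIONS for illustration only -- they must
# be measured/calibrated for the actual sensor and circuit.

V_REF = 3.3         # ADC reference voltage (assumed)
ADC_MAX = 1023      # 10-bit ADC, e.g. an MCP3008 (assumed)
R_FIXED = 10_000.0  # fixed divider resistor in ohms (assumed)
K_CAL = 2.0e5       # calibration constant: force [N] per unit conductance (assumed)

def adc_to_force(raw: int) -> float:
    """Estimate force in newtons from a raw ADC count.

    Counts -> voltage -> sensor resistance -> conductance, then scale
    by a measured calibration constant.
    """
    voltage = raw / ADC_MAX * V_REF
    if voltage <= 0 or voltage >= V_REF:
        return 0.0  # open circuit or saturated reading
    # divider: voltage = V_REF * R_FIXED / (R_FIXED + r_sensor)
    r_sensor = R_FIXED * (V_REF - voltage) / voltage
    return K_CAL / r_sensor  # conductance * calibration constant
```

With this wiring, more force means lower sensor resistance and therefore a higher ADC reading, which the function maps to a larger force estimate.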
Yes, I have one and was thinking of trying to use it in combination with an Oculus Rift for remote operation of a robot hand I am helping my girlfriend build, if I ever have the time that is… https://www.leapmotion.com/product/vr
Since you have a Leap Motion device: does it detect the angles of every phalanx of a finger well? I mean, if you bend your finger just a little, does it detect that? Or is it rougher, and it only detects whether your finger is open or closed?
The sensor itself is not very impressive, but they have a good hand model that they map the readings to, and good filtering software. It works pretty well for hand tracking but not much else…
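In case it helps: a hand model like Leap's exposes per-bone direction vectors, and the bend at a phalanx joint is just the angle between two consecutive bone directions. A small sketch of that computation (the vectors below are made up for illustration, not real Leap output):

```python
import math

# Hedged sketch: estimate a phalanx bend angle from two consecutive bone
# direction vectors, as exposed by a hand model such as Leap Motion's.
# The example vectors are ASSUMPTIONS for illustration.

def joint_angle(u, v):
    """Angle in degrees between two bone direction vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    c = max(-1.0, min(1.0, dot / (nu * nv)))  # clamp for float safety
    return math.degrees(math.acos(c))

# straight finger: consecutive bones parallel -> angle 0
print(joint_angle((0, 0, 1), (0, 0, 1)))  # 0.0
# a slightly bent phalanx gives a small but nonzero angle
print(joint_angle((0, 0, 1), (0, 0.26, 0.97)))
```

So even a small bend shows up as a measurable angle, provided the underlying tracking actually resolves the bone directions that finely.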
Here is my first prototype of a force tester on a wire rope. It's very big, but it works very well. The wire rope presses against a FlexiForce sensor.
Yes, I have considered this, but I have not yet looked into which boards can provide this reading. I also cannot see how you can read the load from the servo you mentioned; nothing about reading the load is mentioned in the servo's PDF documentation?
P.S. I bought a Leap Motion, but don't have much time to look at it right now.
Nice info again, thanks, I need to test this. My thinking behind a stand-alone tension tester is that someone could use other methods of actuating the wire ropes, perhaps fluidic muscles, or linear motors, or, or, or.
But these sensors are worth a try anyway, thanks.
I changed it a little bit and printed 8 of them. Here is the full setup.
Now I need to connect the FlexiForce sensors. After that, and with the information the Leap Motion device should give me about the finger, I think I have enough information to build a reinforcement-learning neural network, so the finger can learn by itself how it should move to open or close a hand.
But first, there is a bit of work to do.
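As a toy illustration of the "learn by itself" idea, here is a tiny tabular Q-learning sketch for a made-up one-joint finger with five discrete bend positions. The states, actions ("relax wire" / "tension wire"), and reward are all assumptions for illustration, not the real hand, and a real version would work on continuous sensor readings:

```python
import random

# Hedged sketch: toy Q-learning on an ASSUMED 1-joint finger with
# discrete bend positions 0..4. Reaching state 4 (fully closed) gives
# reward 1; everything else gives 0. Illustration only.

N_STATES, GOAL = 5, 4
ACTIONS = (-1, +1)  # relax wire / tension wire
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, eps = 0.5, 0.9, 0.2  # learning rate, discount, exploration
rng = random.Random(0)

def step(s, a):
    """Apply an action; the joint position is clamped to the valid range."""
    s2 = min(max(s + a, 0), N_STATES - 1)
    return s2, (1.0 if s2 == GOAL else 0.0)

for _ in range(500):                      # training episodes
    s = 0                                 # start fully open
    while s != GOAL:
        if rng.random() < eps:            # epsilon-greedy exploration
            a = rng.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda x: Q[(s, x)])
        s2, r = step(s, a)
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, x)] for x in ACTIONS) - Q[(s, a)])
        s = s2

# after training, the greedy action in every non-goal state should be
# "tension the wire" (+1), i.e. the finger has learned to close itself
policy = [max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_STATES - 1)]
print(policy)
```

The real problem is of course much harder (continuous angles, eight tendons, contact forces), but the structure, state from the sensors, action on the servos, reward for reaching a hand pose, stays the same.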
That's prototyping: I made a new one, combining the servo holder and the force tester into one small piece.
With this setup I reduced the self-made parts, like the aluminium pipe for winding and the aluminium connector to the servo. The force tester still needs some metal/aluminium parts, but those are neither complex nor time-consuming. Now I need 8 of them.
Again, I made it smaller. It won't get much smaller than this with the FlexiForce sensor. I took a picture that shows that only a minimum of self-made parts is needed for this combined servo holder with force/tension sensor, and it works pretty well.
Wow, I like this setup. It's easy to handle (even if it perhaps doesn't look like it) and works pretty well. And it is relatively small. I printed a ring, so I can easily plug every single combined servo holder into a hole of this ring.
Seen from the bottom, it then looks a little like the ATLAS experiment at the LHC (Large Hadron Collider).
And fully assembled, ok, without the wires to the FlexiForce sensors, it looks like this:
I want the computer to know the position of every phalanx. @samiam: I tried the Leap Motion device, but it does not seem well supported at all, and it looks like it does not recognize the fingers as well as described in their info.
But I got an Xtion Pro and a Kinect One (v2); the Kinect One I got running with iai_kinect2. But now I can't find a way to extract one object from the data, get its coordinates, and map it to a URDF model. Perhaps someone could help me.
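Not a full answer, but the basic idea of "extract one object and get its coordinates" can be sketched without ROS: keep only the cloud points inside an assumed depth window around the object and take their centroid as the object position, which you could then transform into the frame of your URDF model. The point cloud and depth window below are made up; with iai_kinect2 the real points would come from the published PointCloud2 topic:

```python
# Hedged sketch: segment one object out of a depth point cloud by an
# ASSUMED depth window and return its centroid as the object position.
# The example cloud and the window values are made up for illustration.

def object_centroid(points, z_min, z_max):
    """Centroid of all (x, y, z) points whose depth z lies in [z_min, z_max]."""
    roi = [p for p in points if z_min <= p[2] <= z_max]
    if not roi:
        return None  # nothing inside the depth window
    n = len(roi)
    return tuple(sum(p[i] for p in roi) / n for i in range(3))

# fake cloud: background wall at z ~ 2.0 m, object at z ~ 0.6 m
cloud = [(0.50, 0.50, 2.00), (-0.50, 0.20, 2.10),
         (0.10, 0.00, 0.60), (0.12, 0.02, 0.62), (0.08, -0.02, 0.58)]
print(object_centroid(cloud, 0.4, 0.8))  # ~ (0.10, 0.00, 0.60)
```

A fixed depth window is the crudest possible segmentation; in practice you would cluster the cloud (e.g. with PCL's Euclidean clustering) and then transform the centroid into the robot's base frame with tf before comparing it to the URDF model.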