Using SOLIDWORKS, I designed a robotic hand with wire-controlled prehension and elastic retraction. When the wires fed through the fingers are pulled, the hand closes, and when they are released, the elastics open the hand again.
Using a 3D printer, I printed prototypes, iterating on and redesigning each one with small improvements along the way. Shown to the left is Mk. III, which, while similar, had some design flaws that prevented smooth finger motion.
Using TensorFlow, OpenCV, and MediaPipe, I performed real-time hand gesture recognition through classification and communicated the results to an Arduino Uno microcontroller. The image to the left shows the computer vision system running, distinguishing between an open and a closed hand.
The robotic hand was driven via serial communication between the computer vision software and the Arduino, which controlled the HS-422 servo motors.
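The serial link can be sketched as follows. The one-byte-per-gesture protocol and command values here are assumptions for illustration; the project only states that serial communication connects the vision software to the Arduino. In practice the port would be opened with pyserial (`serial.Serial("/dev/ttyUSB0", 9600)`), which shares the `write()` interface used below.

```python
# Sketch: send classified gesture states over serial to the Arduino.
# The mapping and byte values are hypothetical, not the project's exact protocol.
import io

GESTURE_COMMANDS = {
    "open": b"O",
    "closed": b"C",
}

def send_gesture(port, gesture: str) -> None:
    """Write the command byte for `gesture` to any object with write(),
    e.g. an open pyserial port."""
    if gesture not in GESTURE_COMMANDS:
        raise ValueError(f"unknown gesture: {gesture}")
    port.write(GESTURE_COMMANDS[gesture])

if __name__ == "__main__":
    buf = io.BytesIO()  # stand-in for serial.Serial("/dev/ttyUSB0", 9600)
    send_gesture(buf, "closed")
    print(buf.getvalue())  # b'C'
```

On the Arduino side, the matching sketch would read one byte per loop and move the HS-422 servos to the corresponding preset positions.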
One thing keeping the previous iteration from perfection was its slightly jagged movement. Increasing the diameter of the joints smoothed out the motion, but there was still a ways to go before the fingers moved completely smoothly.
Going into my next phase of iterations, I wanted to do two things:
Perfect Joint Smoothness
Give the Fingers Another Degree of Freedom
Once again, heading back to the drawing board, this time I decided to try my hand at something different. If I wanted a robot hand that can truly replicate human hand gestures, I'd need one that can move the same way.
When you add another axis of movement, the next concern is figuring out how to let the robot take advantage of it. Having another degree of freedom would've been useless with my current wire setup, because the elastic + fishing wire just didn't allow for the control I needed.
This led me to a 3-wire setup: the back wire mainly controls retraction, while the front two wires control both prehension and tilting.
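One way to picture the 3-wire control is as a differential mix: curl is shared by the two front wires while the back wire pays out, and tilt is the difference between the front wires. The mixing function below is an illustrative assumption, not the project's actual tuning.

```python
# Minimal differential model for the 3-wire finger (illustrative assumption).
# curl and tilt are normalized commands in [-1, 1]; output is the relative
# pull on each wire (negative = release).

def wire_pulls(curl: float, tilt: float) -> dict:
    """Mix curl and tilt into per-wire pull values."""
    return {
        "front_left":  curl + tilt,   # tilting toward the right releases this side
        "front_right": curl - tilt,
        "back":        -curl,         # back wire releases as the finger curls
    }

print(wire_pulls(1.0, 0.0))  # full curl, no tilt: both front wires pull equally
```

With this model, pure curl pulls both front wires together, and pure tilt pulls one front wire while releasing the other, which matches the extra degree of freedom the single-wire setup couldn't express.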
By 3D printing iteration upon iteration, I could find what was wrong with the current design and fix it immediately. I mostly changed the wire-rope pass-through on each part and made the overall design bigger.
I'm currently iterating on the hand design, finalizing motor placement and figuring out the route each cable will be driven through.
Thumb placement and attachment had been a daunting prospect from the start, and it proved to be the biggest challenge. To position the thumb the way I want, I had to relegate the thumb motors to the arm for now.
Immediate next steps include implementing wire routing in the CAD model. This covers both the cable routing for the fingers and the electrical wiring for the motors.
I currently use a multi-class classification CNN to detect gestures, which outputs discrete states for the hand to assume. Those states are communicated to the Arduino, which drives the motors.
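The last step of that pipeline, turning the CNN's class probabilities into one discrete hand state, can be sketched as below. The state names and the confidence threshold are assumptions for illustration; in the real pipeline the probabilities come from the TensorFlow model run on MediaPipe hand landmarks.

```python
# Sketch: collapse classifier probabilities into a discrete hand state.
# STATES and the 0.6 threshold are hypothetical values for illustration.

STATES = ["open", "closed"]

def probabilities_to_state(probs, threshold=0.6):
    """Return the most likely state, or None if the model is unsure."""
    best = max(range(len(probs)), key=lambda i: probs[i])
    return STATES[best] if probs[best] >= threshold else None

print(probabilities_to_state([0.1, 0.9]))  # closed
```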
Since this is a classification algorithm, it can only give the hand discrete gestures to move to; it can't translate hand motion into real-time robotic motion.
Using an RGB-D sensor, I will track motion with a 2D hand-tracking algorithm and then lift that data to 3D using the depth channel. This will let me build a full 3D representation of the hand.
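The 2D-to-3D step follows the standard pinhole camera model: a tracked pixel plus its depth reading back-projects to a point in the camera frame. The intrinsics (`fx`, `fy`, `cx`, `cy`) below are placeholder values; the real numbers would come from the RGB-D sensor's calibration.

```python
# Sketch: back-project a 2D landmark (u, v) with depth z into camera-frame 3D.
# fx, fy are focal lengths in pixels; (cx, cy) is the principal point.
# All intrinsic values here are placeholders, not a real calibration.

def backproject(u, v, depth, fx, fy, cx, cy):
    """Pinhole back-projection: pixel + depth (metres) -> (x, y, z)."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# A landmark at the principal point lies on the optical axis:
print(backproject(320, 240, 0.5, 600, 600, 320, 240))  # (0.0, 0.0, 0.5)
```

Applying this to every tracked landmark per frame yields the full 3D hand pose that the discrete classifier can't provide.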
Sebastian Levy - Portfolio