Sebastian Levy
  • Home
  • Projects
    • Seal and Serve
    • Bio-Robotics
    • SmartCrowd
    • Thesis
    • Capstone
    • Robotic Hand
  • Experience
    • Professional
    • Academic
    • Research
    • Extra-Curricular
  • Fun
    • Photography
    • DIY
  • Resume
  • LinkedIn
  • GitHub

Gesture Controlled Robotic Hand

Using SOLIDWORKS, I designed a robotic hand with wire-controlled prehension and elastic retraction: pulling the wires fed through the fingers closes the hand, and releasing them lets the elastics open it back up.

Rapid Prototyping: Mk. I - Mk. IV

Using a 3D printer, I printed successive prototypes, redesigning each one with small improvements. Shown to the left is Mk. III, which, while similar to later versions, had design flaws that prevented smooth finger motion.

Programming

Using TensorFlow, OpenCV, and MediaPipe, I performed real-time hand gesture recognition through classification and communicated with an Arduino Uno microcontroller. The image to the left shows the computer vision system running, distinguishing between an open and a closed hand.
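As a rough sketch of the recognition step, the open/closed decision can be made from MediaPipe Hands landmarks. The heuristic below is a simplified stand-in for the actual classifier (the write-up doesn't show the model), but the landmark indices are the real MediaPipe Hands ones: index 0 is the wrist, and 8/12/16/20 are the four fingertips.

```python
# Simplified open/closed-hand check on MediaPipe-style landmarks.
# MediaPipe Hands returns 21 (x, y) landmarks per hand; a fingertip that
# sits farther from the wrist than its middle joint counts as extended.

FINGERTIPS = [8, 12, 16, 20]   # index, middle, ring, pinky tips
MID_JOINTS = [6, 10, 14, 18]   # corresponding PIP joints

def classify_hand(landmarks):
    """Return 'open' if at least three fingers are extended, else 'closed'.

    landmarks: list of 21 (x, y) tuples in normalized image coordinates.
    """
    wx, wy = landmarks[0]  # wrist

    def dist_sq(i):
        x, y = landmarks[i]
        return (x - wx) ** 2 + (y - wy) ** 2

    extended = sum(
        dist_sq(tip) > dist_sq(mid)
        for tip, mid in zip(FINGERTIPS, MID_JOINTS)
    )
    return "open" if extended >= 3 else "closed"
```

In the full pipeline, the landmark list would come from `mediapipe.solutions.hands` running on each OpenCV camera frame, and the returned label would be what gets sent to the Arduino.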

Robotic Control

The robotic hand was driven over a serial link between the computer vision software and the Arduino, which controlled the HS-422 servo motors.
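The PC side of that serial link might look like the sketch below, using the pySerial package. The one-byte-per-servo framing (`<` ... `>`) is an assumption for illustration; the actual protocol between the vision software and the Arduino isn't described in the write-up.

```python
# Hypothetical PC-side serial link to the Arduino (assumes pySerial and a
# made-up framing: b'<' + one angle byte per servo + b'>').

def encode_command(angles):
    """Pack servo angles (0-180 degrees) into a framed byte message."""
    if any(not 0 <= a <= 180 for a in angles):
        raise ValueError("servo angle out of range 0-180")
    return b"<" + bytes(angles) + b">"

def send_gesture(port, gesture):
    """Map a recognized gesture to servo angles and write them to the port."""
    import serial  # pySerial; opens e.g. 'COM3' or '/dev/ttyACM0'
    angles = {"open": [0] * 5, "closed": [180] * 5}[gesture]
    with serial.Serial(port, 9600, timeout=1) as link:
        link.write(encode_command(angles))
```

On the Arduino side, a matching sketch would read bytes between the `<` and `>` markers and call `Servo.write()` for each finger's HS-422.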

Finger Changes

One thing keeping the previous iteration from perfection was its slightly jagged movement. Increasing the diameter of the joints smoothed the motion, but there was still a ways to go before the fingers moved completely smoothly.

Next Steps

Going into my next phase of iterations, I wanted to do two things:

  • Perfect joint smoothness

  • Give the fingers another degree of freedom

Abduction & Adduction: Adding a Second Degree of Freedom

Heading back to the drawing board once again, I decided to try my hand at something different: if I wanted a robotic hand that could truly replicate human hand gestures, I'd need one that could move the same way.

Understanding Movement

Adding another axis of movement raises the next concern: how to let the robot actually take advantage of it. Another degree of freedom would have been useless with my existing wire setup, because the elastic-and-fishing-wire arrangement just didn't allow the control I needed.

This led me to a three-wire setup: the back wire mainly controls retraction, while both prehension and tilting are controlled by the front two wires.
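The three-wire layout above can be sketched as a simple differential mapping. Everything here is a hypothetical model for illustration (the pull lengths, the tilt scaling, and the `max_pull` constant are all assumptions, not measured values): the two front wires share a common pull for flexion, their difference produces the tilt, and the back wire pays out as the finger curls.

```python
# Hypothetical differential mapping from a desired finger pose to the three
# wire pulls. The constants are illustrative, not taken from the real hand.

def wire_pulls(flexion, tilt, max_pull=10.0):
    """Return (front_left, front_right, back) pull lengths in mm.

    flexion: 0.0 (straight) to 1.0 (fully curled)
    tilt:   -1.0 (full left) to 1.0 (full right)
    """
    if not (0.0 <= flexion <= 1.0 and -1.0 <= tilt <= 1.0):
        raise ValueError("flexion in [0, 1], tilt in [-1, 1]")
    common = flexion * max_pull                       # shared flexion pull
    diff = tilt * (1.0 - flexion) * max_pull / 2      # less tilt room when curled
    front_left = max(common - diff, 0.0)
    front_right = max(common + diff, 0.0)
    back = (1.0 - flexion) * max_pull                 # back wire pays out as finger flexes
    return front_left, front_right, back
```

For example, a straight finger keeps all the pull on the back wire, while a fully curled one splits it evenly across the two front wires.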

Rapid Prototyping: Mk. VII - Mk. IX

By 3D printing iteration after iteration, I could quickly find what was wrong with each design and fix it. The changes mostly involved the wire-rope pass-throughs on each part and scaling up the overall design.

  • Mk. VII

  • Mk. VIII

  • Mk. IX

Final Finger Design

A Handy Solution

Prototyping

I'm currently iterating on the hand design, finalizing motor placement and working out the route each cable will take.

Thumb-thing's Wrong

Thumb placement and attachment have been a daunting notion from the start, and proved to be the biggest challenge. To position the thumb the way I want, I've had to relegate the thumb motors to the arm for now.

Next Steps

Immediate next steps include modeling the wire routing in CAD: both the cable routing for the fingers and the electrical wiring for the motors.

Current Design

Another Sub-System Overhaul Planned: Software

Current Method


I currently use a multi-class classification CNN to detect gestures; it outputs discrete states for the hand to be in, which are sent to the Arduino, which in turn drives the motors.

What Was Wrong?

Since this is a classification algorithm, it can only give the hand discrete gestures to move to; it can't translate hand motion into real-time robotic motion.

The Idea

Using an RGB-D sensor, I will track motion with a 2D hand-tracking algorithm and then lift that data into 3D using the depth channel, giving me a full 3D representation of the hand.
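The 2D-to-3D step described above amounts to standard pinhole-camera back-projection. The sketch below assumes placeholder intrinsics `fx, fy, cx, cy` (a real RGB-D SDK such as pyrealsense2 reports these per device) and a depth lookup you'd implement against the depth frame.

```python
# Back-projecting 2D hand landmarks to 3D using a depth frame and the
# standard pinhole camera model. Intrinsics here are placeholders; a real
# RGB-D SDK provides them per device.

def deproject(u, v, depth_m, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth in meters to camera-frame XYZ."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

def lift_landmarks(landmarks_px, depth_lookup, intrinsics):
    """Lift (u, v) pixel landmarks to 3D; depth_lookup(u, v) reads the depth frame."""
    fx, fy, cx, cy = intrinsics
    return [deproject(u, v, depth_lookup(u, v), fx, fy, cx, cy)
            for u, v in landmarks_px]
```

Run per frame over the 2D tracker's landmark list, this yields the 3D hand pose that continuous (rather than discrete, classification-based) control would be driven from.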

We've reached the end... for now.

Sebastian Levy - Portfolio
