Raspberry Pi Gesture Control
I've probably spent more time on this than on anything else: being able to control my Raspberry Pi with the wave of my hand. It wasn't easy at first.
This proof of concept, made with broken toys and popsicle sticks, was so inspirational to me that I had to take it as far as I could.
Yea. That's a Raspberry Pi controlling a 3D printed hand using nothing but my muscle signals. No buttons. No wires. No popsicle sticks.
Controlling your Raspberry Pi can now be as natural as moving.
Finding a System
So first, how does it work?
There are a few ways to pull off gesture control, each with its own strengths and weaknesses. Here are the ones I looked into:
–OpenCV: With a decent Haar cascade trained on hand gestures, I could easily program a machine to react to what it sees using computer vision (there's a quick sketch of what that loop looks like right after this list). The problems with that (for me) were:
1: Since the camera has to be facing its target, I would need a stationary setup, and I wanted freedom of movement with minimal gear.
2: Said station would also need very little detail around it, since OpenCV tends to get distracted by things in the background, and I wanted my system to be responsive in any location and condition.
And
3: Cascades wouldn't be a good way to customize on the fly. I wanted a modular system where I could change the recognizable poses on a whim.
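For the curious, here's roughly what that OpenCV approach looks like in practice. This is only a sketch of the idea, not code from this project, and "fist_cascade.xml" is a placeholder: OpenCV doesn't ship a hand cascade, so you'd have to find or train one.

# Sketch of the Haar-cascade approach I passed on.
# "fist_cascade.xml" is a placeholder -- supply your own hand cascade file.
import cv2

cascade = cv2.CascadeClassifier("fist_cascade.xml")   # placeholder cascade file
cap = cv2.VideoCapture(0)                             # Pi camera or USB webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # one bounding box per region that matches the cascade
    hands = cascade.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)
    if len(hands) > 0:
        print("Hand detected -- call whatever function you want here")
    for (x, y, w, h) in hands:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("gesture", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()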
Next I looked into Leap Motion:
I haven't actually tried it myself, but with its smooth integrations, live 3D representations and user-friendly design, not to mention the decent price, it looks like a solid piece of gear and I still want one. Except!
1: Again, I didn’t want to be confined to a control station.
And
2: It needs to be hooked up to another computer and I didn’t want extra hardware.
When I see something like Leap Motion,
I think: long-distance control stations for construction machines.
I think: performing robotic surgery from home over WiFi, lol.
It’s great for many things, but not for a natural, plug n’ play, lightweight and fully mobile kind of project.
And then I ran into Thalmic Labs’ Myo Armband:
Game Changer
It comes with 5 built-in, universally recognizable commands to use out of the box, for drones, rovers and mouse control, to name a few.
The reason for such a short command set on such high-grade hardware is that no two people produce the same muscle signals, so these things can't really be trained universally out of the box. The developers did what they could and shipped the most universal gestures they found.
The first thing I noticed, of course, was that it was a bit more expensive than Leap Motion, with a price tag of $200.
But if I can get it to work, it'll add unlimited value to my projects from now on and be totally worth the investment. Not only that, it'll also expand my overall capability as a tinkerer: I'll be able to do so much more at a higher level of computing.
Since it's wireless, I'd have all the freedom of movement I could imagine.
Since it's a wearable device, there's no station to haul around and set up, and it can be used "hands free," so to speak.
And since it's attached to my arm, there's no way for it to confuse me with anyone else.
But its true majesty is how the armband itself works: rather than scanning your hand somehow, like most methods of gesture control do, it reads the electrical signals of the user's muscles, which makes it irrelevant whether the user even has a hand at all. The device truly becomes one with its user, making the Myo Armband an honest-to-goodness game changer.
State of the art
Yea. Cyborgs.
This thing has certainly proven itself to be more than just an expensive toy by opening up the possibility of non-invasive bionic prosthetics. Its interface is so thoroughly connected to its user that it even works for amputees and other people with limited muscle activity, by working with the muscles the user DOES have.
Now THAT’S state of the art!
Sold!
Hacking Myo
The Myo Armband is one serious piece of technology, and the hacks people have written for it are just as serious.
One such hack uses machine learning to conveniently "teach" the bracelet any kind of gesture the user desires (yes, all the "F*** you" gestures too). It also gets around the 5-command limitation by doubling the capacity to a max of ten poses per memory folder, so you can theoretically store unlimited systems and gestures and simply swap folders to swap applications, making a perfectly modular system.
Dzhu's code connects the armband to your Pi via Bluetooth, takes muscle readings from your forearm and stores them for later use, ready to be applied to any code you have or will have 🙂
And nowadays it's even easier to connect to the Pi for a near-seamless experience, as long as your system is up to date.
Plug it into your movie or music player for media control, play your Python video games with it, or hook a robot up to it!
The 3 main files, myo_raw.py, Classify.py and myo.py, may look alien to you at first, but it gets easier as you work with them and learn how they fit together, which is pretty simple. The 3 files work together as such:
–myo_raw.py connects the armband and reads raw EMG data from your arm.
–Classify.py takes that raw data and trains on it so it can be used with…
–myo.py, to actually execute that trained data when you perform your gestures, from whatever programs you want to integrate.
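If it helps to see that pipeline stripped to the bone, here's a tiny self-contained sketch of what the trio does conceptually: Classify.py's job is to collect labelled EMG samples, the nearest-neighbour classifier matches a new reading against them, and myo.py fires a function for the winning pose number. The real scripts work on live 8-channel data from the armband; the numbers below are made up purely to show the flow.

import numpy as np

# --- the "Classify.py" side: store labelled samples ----------------------
# Each sample is one 8-channel EMG reading; the label is the pose number
# you held down on the keypad while training. These values are invented.
train_data = np.array([
    [12,  8, 40, 55, 30,  9,  4,  3],   # pose 1 (fist), sample 1
    [10,  9, 42, 60, 28,  7,  5,  2],   # pose 1, sample 2
    [70, 65, 12,  8,  5, 20, 33, 41],   # pose 2 (open hand), sample 1
    [68, 60, 10,  9,  6, 22, 30, 44],   # pose 2, sample 2
])
train_labels = np.array([1, 1, 2, 2])

# --- the nearest-neighbour classifier ------------------------------------
def classify(sample):
    dists = np.linalg.norm(train_data - sample, axis=1)
    return int(train_labels[np.argmin(dists)])

# --- the "myo.py" side: map the winning pose number to an action ---------
actions = {1: lambda: print("fist -> close the claw"),
           2: lambda: print("open hand -> open the claw")}

new_reading = np.array([11, 9, 41, 58, 29, 8, 4, 3])   # pretend this just came off your arm
actions[classify(new_reading)]()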
Connecting the Raspberry Pi to Myo
So the first thing you want to do, of course, is make sure your system is up to date:
$ sudo apt-get update
$ sudo apt-get -y upgrade
$ sudo apt-get -y dist-upgrade
Now let’s make sure this thing connects.
Plug in the Bluetooth dongle it came with and test your connection by running myo_raw.py.
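I run everything through a plain Python 3 IDE myself, but from a terminal it's just something along the lines of:
$ python3 myo_raw.py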
Now, if the screen comes up blank like this:
your armband is not active.
To activate it, perform the wake-up gesture: touch your index finger to your thumb twice, or outstretch your arm like a Power Ranger.
You'll know it's active when the armband vibrates and your screen looks like this:
That blank Pygame screen wakes up.
So what is that screen? The Myo reads 8 spots around your forearm, and each horizontal section is a Pygame representation of the muscle activity at one of those spots. You may have already noticed the bars reacting as you move your arm.
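If you'd rather see those 8 readings as raw numbers instead of bars, myo_raw.py lets you register a handler that gets called for every EMG packet. The class and method names below (MyoRaw, add_emg_handler, connect, run) are from the copy of dzhu's library I used, so double-check them against your own file if anything has shifted.

# Minimal EMG dump -- assumes myo_raw.py sits in the same folder and exposes
# a MyoRaw class with add_emg_handler/connect/run like dzhu's original.
from myo_raw import MyoRaw

m = MyoRaw(None)            # None = auto-detect the Bluetooth dongle's serial port

def on_emg(emg, moving, times=[]):
    # emg is a tuple of 8 readings, one per pod around your forearm
    print(emg)

m.add_emg_handler(on_emg)
m.connect()

try:
    while True:
        m.run(1)            # pump Bluetooth packets (1-second timeout)
except KeyboardInterrupt:
    m.disconnect()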
And that's when you know you're ready to start using this thing.
Training
So to train it to your arm:
-Run Classify.py
-Decide on a pose and which number you want that pose to be.
-Strike the pose and, with the number pad, hold that number down while maintaining the pose. You should see the corresponding number grow on the Pygame screen, as well as a bar that rises to indicate how well it recognizes that gesture over the others.
While you train, take these tips into consideration.
-When you pose, be as natural as possible so that your movements don't have to be 100% exact to work. It gives you room for error.
-It helps if each pose is very distinct, so the system has no doubts. Every pose should have its own full green line without jumping to other numbers. But if your gestures ARE close in nature, you can reinforce its recognition ability with more training time.
-The longer you train a pose, the more accurate its recognition becomes. You really start to notice a difference when the numbers get up around 600+.
-If your screen happens to freeze while you train, close the application and start over; your data is still there. It seems to happen because of some Pygame error, so if it happens to you, don't let it slow you down. Everything's okay.
And when you think you're ready for a test, run myo.py to execute the test function.
Let’s recap:
–Run Classify.py
-Hold the pose while holding down that pose's number.
-Run myo.py
Take that! “hello world” example 🙂
When you start training and testing extensively, you may at times get this error:
The "not a valid pose" error occurs when your gesture somehow confuses the system. It can be fixed by making each pose more distinct and/or adding more training time.
Also, if you think you ruined a pose by accidentally training multiple poses into the same number, delete that data from the memory folder and start again.
Once you've got the idea, it's time to integrate our robot into this thing, plug in some functions, and have ourselves a test!
Drawbacks
Have you spotted anything that the code doesn’t do yet..?
As it is now, the code can pull off many amazing feats, except for these small things that may not even matter for your purposes:
–No live control– The code is designed to recognize a gesture, then obey the command. What it doesn't do is full-on live mimicry: it cannot mirror exactly what you are doing and follow along with you.
–No quaternions– It cannot measure rotation, pitch or yaw, so you wouldn't be able to program elaborate movement-based gestures or gestures based on angle.
–Individual users– One size does not fit all. For the same reason the original SDK only had 5 commands (no two people's muscles are the same), you can't just share your bracelet once you've trained it. Every user would need their own training data.
Like I said, these things may not even matter to you, and if they do, you can always learn and apply that logic yourself now that you have the code to study.
But other than that, these scripts are a very strong start, good enough for most ideas.
Good enough for an easy to use, affordable, practical bionic prosthetic 😉
Gesture Controlled Servos
Installing The Fun
Now that my head was wrapped around this particular Myo hack, all I needed was a program to integrate into it.
One of my favorite expansions of the Raspberry Pi is the Adafruit 16-Channel Servo HAT:
The Adafruit Servo HAT is a modular Raspberry Pi add-on that lets you control up to 16 servos, for use with any Pi project requiring movement, from automatic light switches to full-on robots.
Read my servo post for a total breakdown of capabilities and installation.
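For a sense of scale, driving a servo on the HAT is only a few lines. This sketch uses Adafruit's current ServoKit library; my original build used their older PCA9685-based code, so treat the import as a stand-in for whatever library my servo post set you up with.

# Wiggle the servo plugged into channel 0 of the 16-channel HAT
import time
from adafruit_servokit import ServoKit

kit = ServoKit(channels=16)

kit.servo[0].angle = 0      # one end of travel
time.sleep(1)
kit.servo[0].angle = 180    # the other end
time.sleep(1)
kit.servo[0].angle = 90     # back to centre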
Merging the Code
So I attached the HAT and imported my servo controller into myo.py.
Next I changed lines 31, 37 and 40 to point at my desired training-data path. You create a new folder for every system you implement; that way you can add as many programs as you want, each with its own set of gestures.
Then I modified the “page” function around line 104 to use my imported servo functions.
Then I changed the pose that gets called up around line 110 to the correct function and gave it a spin.
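Here's the general shape of that change. The names below are illustrative stand-ins for my actual servo functions and for however your copy of myo.py is laid out; what matters is the pattern: every trained pose number maps to a function, and the handler that fires when a pose is recognized just looks the number up.

# Inside myo.py (roughly) -- names are illustrative, not the real lines
from ServoController import open_hand, close_hand, point   # my own servo functions

TRAINING_DIR = 'data/servo_hand/'   # the folder I pointed lines 31/37/40 at

# one entry per trained pose number
pose_actions = {
    1: close_hand,   # fist
    2: open_hand,    # spread fingers
    3: point,        # index finger out
}

def on_pose(pose_number):
    # called whenever the classifier settles on a pose
    action = pose_actions.get(pose_number)
    if action:
        action()
    else:
        print('not a valid pose:', pose_number)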
Code
If you're still having trouble understanding the code, then for a small donation you can have my fully commented version, totally set up and merged with the Servo HAT, as an example to help you see how it all works.
And as a thank-you bonus, I've also integrated code that allows full gesture control of DC and stepper motors (CARS!) on top of the servo control, all merged, commented and ready to go.
Bionic Prosthetics with Myo and the Raspberry Pi
Can you imagine having the knowledge to possibly replace a lost hand for an amputee?
My little claw and that video I saw really had me curious about how far I could actually go with my current knowledge and resources.
And with my logic pretty much tested and polished, the only way to really expand on the idea and see what this system is truly capable of is with better physical designs built with servos in mind. 3D printing lets me do that, and if it didn't, I could always buy a hand like this LewanSoul 5DOF Metal Humanoid Robot Hand.
Now, I really don't have the time to learn 3D modeling, so thank goodness for Thingiverse, which has millions of user-uploaded models for just about anything you could want, ready to be downloaded and printed right at home.
I went through a few promising yet failed attempts at hands that were either a pain to print or a pain to put together.
But I finally ended up with the Servo Powered Robotic Hand by HatsyFlatsy, which has an elegant, practical design that's easy to print and put together, requiring nothing but screws and common sense.
Testing the Prosthetic Hand
I had a blast putting this thing together. Now for a quick servo calibration test.
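For the calibration, I basically sweep each finger's servo slowly and note where the string goes slack or the joint starts to bind. A throwaway script along these lines does the trick (again using the current ServoKit library; the channel number is whatever you wired the finger to):

# Sweep one finger's servo slowly so you can watch for binding or slack string
import time
from adafruit_servokit import ServoKit

kit = ServoKit(channels=16)
finger = kit.servo[0]                       # channel the finger is wired to
# finger.set_pulse_width_range(500, 2500)   # uncomment/tweak if your servo's range seems off

for angle in range(0, 181, 10):
    finger.angle = angle
    print('angle:', angle)
    time.sleep(0.25)                        # pause so you can see/feel each step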
Sick! Now to test it out for the first time.
Not bad! I can see where some screws on the hand could be loosened or tightened, and where in my code to make minor adjustments to the degrees of movement.
Other than that, one more good training session oughtta get it where it needs to be.
It won't be bending steel anytime soon, but for my first bionic prosthetic, it came out great, I'd say! It's not as good as a real hand (yet), but it can hold coffee mugs, perform small daily tasks and even play guitar a bit. And it can be further enhanced with rubber tips on the fingers for grip.
I am thoroughly impressed with myself…
But what else can I do with this? I started updating my old projects. 🙂
Gesture Controlled Fighting Robots
By simply plugging in another project like my NasBot and applying new training data, I came up with a fun little boxing program reminiscent of Real Steel and Rock 'Em Sock 'Em Robots.
Wicked… Note that I'm controlling both arms with only one armband on my left arm, by training a pose in the position my arm would be in if I were throwing a right punch.
Gesture Controlled Raspberry Pi Cars
My Car. The MotorBliss 🙂
Built with the Adafruit DC/Stepper motor shield, it's every bit as useful and easy to use as the Servo HAT, only it's mostly meant for wheels and propellers.
Now, for SOME reason, I wasn't able to connect my Pi to a PlayStation 3 controller, and it never occurred to me to use keyboard commands :p. So I just went ahead and applied Myo.
Works the same way, works perfectly. (Though the video could use an update)
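Wiring Myo poses into the car works exactly like the servo hand: you just swap the servo functions in the pose table for motor ones. A rough sketch using Adafruit's current MotorKit library (the original MotorBliss ran on their older motor HAT code), with the motor channels as assumptions:

# Drive functions for the DC motor HAT -- drop these into the pose_actions dict
import time
from adafruit_motorkit import MotorKit

kit = MotorKit()

def forward():
    kit.motor1.throttle = 1.0    # left wheels
    kit.motor2.throttle = 1.0    # right wheels

def stop():
    kit.motor1.throttle = 0
    kit.motor2.throttle = 0

def spin_left():
    kit.motor1.throttle = -1.0
    kit.motor2.throttle = 1.0

# e.g. fist = forward, open hand = stop, wave out = spin
forward(); time.sleep(1); stop()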
Sayonara Suckers!
Well, that's about as far as I've gotten with gesture control for now, and these are only a few applications for it. There are plenty of other ways to apply this kind of remote, so feel free to share yours if you have one.
I’m currently looking into drones and home automation myself.
Well I hope you got a kick out of this one 🙂
Don’t forget to like, share and comment.
Cheers!
Nice project! I'm not experienced in robotics but have some coding knowledge, and a Myo armband for my partly paralysed right arm. My wish is a Myo-controlled 3D printed prosthetic arm (hand open/close, rotating wrist, elbow lift/stretch) for training purposes. Do you think that's possible with the Pi and servos? I read that the rotating wrist will be a problem since the armband doesn't register it?
Greetings Sebastiaan
Hey Bas! Thanks a lot! Yes, most of this IS possible with enough time and the right coding. The Pi probably won't be strong enough to, say, do push-ups with, but it'd be great for physical therapy and muscle-assistance purposes. As far as a rotating wrist goes, it can be a bit wonky, but results can be improved somewhat with enough training time.
*I'll also be updating this post soon to include an alternate library that allows for smooth, simultaneous movement and tends to be stronger than the code that's up right now, so stay tuned for that. And WHEN you build your exoskeleton, I wanna see pictures and videos! 🙂 Thanks again!
Great work… it's very beautiful…
Can you help me with more information about training new gestures, please?
Sure thing. What do you wanna know? The system can learn most hand gestures, and if you get my code, I can better help you one-on-one 🙂
Thank you, by the way.* It's great to know that this work is appreciated 🙂 The goal is to fine-tune my projects and spread the word that this knowledge is out there, it's affordable, and I'm willing to teach anybody. So of course I'll help you.
Please help me with the code for when you edit myo.py; I'm not getting the idea.
Are you running myo_raw.py through the Linux terminal?
How are you calling up your trained data? Are you using an ArgumentParser?
🙂 Great questions. I'm running the whole thing through a regular ol' Python 3 IDE. The trained data is stored in its own folder and called up through the NNClassifier class in myo.py. It's actually a very user-friendly system. No argument parsing.