Embedded

My Arms! They Are Here!

They are here! My arms! They are here!

Wait, maybe I should start from the beginning. No… that involves installing Ubuntu no fewer than 83 times, so it is utterly boring.

I want to make a robot typist. I will tell it what to type, then the robot hands will type. Like a human typist, I may start out with single-finger hunt-and-peck but I hope to someday get to fluent touch typing. I want it to work on any keyboard. This will hopefully let me play with machine learning, machine vision, and robotics. I suppose if I use voice to tell it what to type, then voice recognition too, but I’m not as excited about that. We’ll see.

I suppose the beginning is really that a friend from Nvidia sent me a Jetson TX2 developer kit. (Yes, this is a generous friend.) What to do with it? Machine learning and vision have been on my to-learn list for a while now. I had a really good time with my Raspberry Pi and its camera, though I found development to be a pain: on-board compiling was slow, and cross-compiling was hard to set up and keep stable.

I unboxed the Jetson, excited to try it. I found the Two Days to a Demo page, and read the background material and all the steps. I spent a lot of time on step 2 (Download JetPack 2.3.1 and TensorRT), not realizing that the link covered steps 2-10. Anyway, I got out my Jetson and started installing things.

But it needed an Ubuntu host to flash the board, and Ubuntu in VirtualBox wouldn’t work. I run a Windows machine for development. And since the start of 2017, I have a Windows 10 machine that thinks it is in charge of me. I got an external SSD and tried to install Ubuntu to it, many times (many, many times). Then a friend suggested I re-partition my Windows drive and boot locally, saying it was possible to do that without hurting Windows. That did not work until Chris gave it a shot and noticed some BIOS settings: the drive was in RAID mode for Windows, which kept Ubuntu from accessing the local drive. A few weeks of cursing later, it worked. I finished Step 2. Yay!

The next steps went quickly and I got the demo working (Step 5: Classify images with ImageNet). And it blew my mind. I find it hard to describe.

Imagine, you have this chunk of hardware on your desk. It runs Linux and has an HDMI output. So you hook up a USB keyboard and mouse. Then an external monitor. As you’ve been doing for an hour or two, you faithfully and painstakingly type in a command that you don’t really understand:

./imagenet-camera googlenet

On your monitor now is a camera output, you can see your desk and office. So, not having an apple or an orange as is shown in the docs, you go to the kitchen, get the most iconic of all fruits. You return and put a banana in front of it. And it says “Banana” and puts up high confidence. Even when you rotate the banana and try to trick it. Neat! So, you get an avocado from the kitchen. It thinks that is a fig, actually pretty close from the outside.
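The “high confidence” number the demo shows next to each label comes from a softmax over the network’s raw class scores. Here is a minimal sketch in Python with made-up logits for three classes (the class names and score values are invented for illustration; the real GoogLeNet outputs a score for each of 1000 ImageNet classes):

```python
import math

# Hypothetical raw scores (logits) for a few classes.
logits = {"banana": 9.2, "fig": 2.1, "pool table": 0.5}

def softmax(scores):
    """Convert raw scores into probabilities that sum to 1."""
    exps = {k: math.exp(v) for k, v in scores.items()}
    total = sum(exps.values())
    return {k: v / total for k, v in exps.items()}

probs = softmax(logits)
label = max(probs, key=probs.get)   # the class the demo would display
print(label, round(probs[label], 3))
```

Because the exponential exaggerates gaps between scores, one class can dwarf all the rest, which is why the banana shows up with such lopsided confidence.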

You notice that it says “Pool table” a lot when you take the banana away. And you realize it is because the camera sees your green ESD mat. Funny. So, you move the camera around. And now it is identifying a pen and a stuffed animal. You run around the house collecting random things, showing them to the Jetson board, wondering what it is going to say. Hold the sleepy dog up to the camera, yes! (And a great estimate of the dog’s breeds.) My small weight is indeed a dumbbell, yes! Hammer, yes! Scissors, never! Ha! Fooled it! Screwdriver is sometimes right, sometimes a missile, hilarious! Stuffed animal, usually! Stuffed frog is a tree frog or broccoli, hee hee!

It wasn’t rock-solid but was scary, creepy good. And it wasn’t working hard; the Jetson’s fan never turned on.

This really seemed possible. I started looking at robot arms. I talked to Shaun Meehan about robotic options and decided on the small MeArm as a path (Shaun recommended very large or very small arms, comedic value either way).

Now, we sound the forbidding music. I wanted to start working on the keyboard image recognition. I plugged in that external drive and noted the drive that appeared in the Files app. I removed it, the name disappeared, inserted it, the name appeared. Ok, let’s format this and start downloading keyboard images so it can learn all kinds of keyboards (I wonder if I should include a music-keyboards Easter Egg, hmm).

Yeah, I formatted my work Windows partition and overwrote it with images of keyboards. This was bad. Very bad.

A week later, I was mostly back to everything working on Windows. Plus, with listener Frank Duignan’s suggestions about using VirtualBox to install to the external drive and a bit more BIOS-fu, I really do run Ubuntu from my external drive.

I’m going to set my computer up to start pulling down keyboard images. Then I’ll start building these arms.
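Once the keyboard images are on disk, a typical first chore before any training is separating the actual image files from everything else and splitting them into training and validation sets. A rough sketch of that step, with hypothetical filenames standing in for the real downloads:

```python
import random

# Hypothetical downloaded files; real ones would come from the image crawl.
files = [f"kbd_{i:04d}.jpg" for i in range(100)] + ["notes.txt", "readme.md"]

# Keep only image files.
images = [f for f in files if f.endswith((".jpg", ".jpeg", ".png"))]

# Shuffle reproducibly, then hold out 20% for validation.
random.seed(0)
random.shuffle(images)
split = int(0.8 * len(images))
train, val = images[:split], images[split:]
print(len(train), len(val))
```

The fixed seed keeps the split reproducible between runs, so the validation set stays untouched while the model trains.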