
GestureBots: Intuitive Control for Small Robots

Posted in 11. Project Final Reports on December 14, 2009 by Zhiquan Yeo

GestureBots is a hardware platform that allows users to control small robots using gestures. Gestures are more intuitive and easier to remember than commands on a keyboard, and can be more expressive with minimal effort.
The GestureBot system consists of a gesture recognition glove and a series of GestureBots, which are small, microcontroller-controlled, servo-actuated robots. The system is standalone and can be used without a computer. Communication between the glove and the GestureBots takes place over a wireless connection, which provides flexibility in deployment and usage.

Full details of the project can be found in the paper: GestureBots

Schematics: Controller Schematic | GestureBot Schematic

Bill of Materials: Controller Parts List | GestureBot Parts List

Code: Controller Code | Example GestureBot Code

Images of the GestureBot System

The spoon-bot

Close up of the spoon-bot’s servomotors

The GestureBot Controller board and glove

I intend to carry out some of the future work specified in the paper, and this does seem like an interesting project to conduct a long-term study on. More updates will be posted to http://www.gesturebots.com (once the site is put up).

Building boards

Posted in Uncategorized on November 14, 2009 by Zhiquan Yeo

After the crit, where someone (I think it was Mark) said something about having a room full of robots that respond to gestures, I decided that the GestureBot board (the thing that provides the brains for the robots) should be made as small but as extensible as possible. I managed to shove a uController, oscillator, ICSP header, 2 power supplies and an XBee radio onto a board measuring 1.65″ x 2.0″ (God bless you, surface mount components), and it provides 5 servo outputs. Since control is done in code, the servo ports can double as digital output ports (just don’t connect power and ground together!). Power can be supplied either through a 2-pin external power jumper or from a regulated 5V supply. I have a battery boost board done up that will take a single Lithium Polymer battery (3.7V nominal) and boost the voltage up to 5V, which can then supply the GestureBot.
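
Roughly, using a servo header both ways looks like this (a minimal Arduino-style sketch; pin 9 is just a placeholder, not the board’s real pin mapping):

#include <Servo.h>

const int PORT_PIN = 9;  // placeholder pin, not the actual board mapping

Servo s;

void setup() {
  s.attach(PORT_PIN);         // drive the header as a servo output...
  s.write(90);                // center the servo
  delay(1000);
  s.detach();                 // ...then release it
  pinMode(PORT_PIN, OUTPUT);  // and reuse the same pin as a digital output
  digitalWrite(PORT_PIN, HIGH);
}

void loop() {}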

I’ll be sending the boards for production soon (a little behind schedule), and will be making multiple GestureBot boards, so maybe I’ll have a whole bunch of little robots doing weird things come demo day.

Window-Based Gesture Recognition on a uController

Posted in 10. Build the Hardest Part on November 5, 2009 by Zhiquan Yeo

As part of my final project, I intend to build a gesture recognizer on a glove instrumented with a 3D accelerometer and flex sensors. As an added challenge, I intend to build the gesture recognizer software on the microcontroller, instead of offloading the processing to a computer. But why do the recognition on the glove? By doing away with the computer, I have one less set of wireless communications to deal with, and fewer components overall.

The gesture recognizer is being built as a window/frame-based system. What this means is that samples are collected every VEC_FRAME_WIDTH milliseconds and stuffed into a sliding window. The gesture recognition engine then uses whatever data is in the window to determine the gesture being made. A smoothing algorithm is also applied to the data prior to recognition.
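
In rough sketch form, the sampling loop looks something like this (the window size, smoothing method and sensor channel here are placeholder assumptions, not the actual implementation):

#define VEC_FRAME_WIDTH 20  // ms between samples (placeholder value)
#define WINDOW_SIZE 32      // samples in the sliding window (assumed)

int window[WINDOW_SIZE];
int head = 0;
unsigned long lastSample = 0;

// Push a new sample into the circular buffer; the oldest one drops off
void addSample(int raw) {
  window[head] = raw;
  head = (head + 1) % WINDOW_SIZE;
}

// Simple moving average standing in for the smoothing step
int smoothedAverage() {
  long sum = 0;
  for (int i = 0; i < WINDOW_SIZE; i++) sum += window[i];
  return (int)(sum / WINDOW_SIZE);
}

void setup() {}

void loop() {
  if (millis() - lastSample >= VEC_FRAME_WIDTH) {
    lastSample = millis();
    addSample(analogRead(A0));  // e.g. one flex sensor channel
    // the recognizer would then run over the (smoothed) window here
  }
}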

To have everything fit on the microcontroller (and make life a little bit easier), I opted to have specific states in which the data for gesture recognition is captured. This eases the computational burden on the microcontroller, which no longer needs to constantly analyze the window to determine the gesture being made.
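
The gating itself is simple; in sketch form it might look like this (the button trigger is hypothetical, since the real trigger condition is still up in the air):

enum Mode { IDLE, CAPTURING };
Mode mode = IDLE;

const int TRIGGER_PIN = 2;  // hypothetical capture trigger

void setup() {
  pinMode(TRIGGER_PIN, INPUT);
}

void loop() {
  switch (mode) {
    case IDLE:
      // Do nothing expensive; only watch for the capture trigger
      if (digitalRead(TRIGGER_PIN) == HIGH) mode = CAPTURING;
      break;
    case CAPTURING:
      // Fill the window, smooth, classify, then drop back to idle
      mode = IDLE;
      break;
  }
}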

While the gesture recognition engine is not complete as of the time of writing, here are the items that have been completed so far:

  • Interfacing of flex sensors + accelerometer with ATMega328 (Using a previously built data capture board)
  • Implementation of sliding window
  • Implementation of gesture capture
  • Implementation of smoothing function
  • Implementation of peak detection using vectors (a rough sketch follows these lists)

Items left to do:

  • Peaks -> gesture conversion
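
To give a sense of the peak detection (a sketch of one simple approach, treating “vectors” as 3-axis accelerometer samples, not my actual code): a sample counts as a peak when its vector magnitude is a local maximum above a noise floor.

// Squared magnitude of a 3-axis accelerometer sample (avoids a sqrt)
long magSq(int x, int y, int z) {
  return (long)x * x + (long)y * y + (long)z * z;
}

// True if the middle of three consecutive magnitudes is a local maximum
// above a noise floor (the threshold value is a placeholder)
bool isPeak(long prev, long curr, long next) {
  const long THRESHOLD = 100000L;
  return curr > THRESHOLD && curr > prev && curr >= next;
}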

Again, to make life simpler, I have opted for a small library of 10 – 20 gestures, using only 4 fingers. This gives a sufficient range of gestures for controlling a robot. Once a gesture is recognized, it will be converted into a gestureID and sent wirelessly to the robot, which will then execute the programmed action for that gesture.
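
In outline, the hand-off can be as simple as a one-byte message over the XBee’s serial link (the gesture IDs and actions below are made up for illustration; both ends shown in one sketch):

// Both ends: the XBee shows up as a plain serial port
void setup() {
  Serial.begin(9600);  // 9600 is the XBee's default baud rate
}

// Glove side: transmit the recognized gesture's ID
void sendGesture(byte gestureID) {
  Serial.write(gestureID);
}

// Robot side: poll the radio and dispatch to the programmed action
void loop() {
  if (Serial.available() > 0) {
    byte id = Serial.read();
    switch (id) {
      case 1: /* e.g. wave an arm */    break;
      case 2: /* e.g. waddle forward */ break;
      // ...one case per gesture in the library
    }
  }
}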

GestureBot

Posted in 9. Project Proposals on November 3, 2009 by Zhiquan Yeo

Note: Images, sketches, etc. to come… real soon now.

Motivation

Controlling robots is sometimes hard to do, and often the means to control them are complicated and have a steep learning curve. Commands are also easily forgotten and need to be looked up in a manual of sorts. Gestures, on the other hand, are easily remembered and, with some thought, can be mapped nicely to robot actions. This may have benefits in the field of Human-Robot Interaction.

Implementation

The project will be divided into 2 parts. Part 1 is a gesture-sensing glove, and part 2 is a wirelessly controlled robot. The form factor of the robot has yet to be determined, but I am strongly leaning toward a bipedal robot with articulated arms (a.k.a. robopenguin).

The glove will contain flex sensors and an accelerometer to measure approximate finger position and hand orientation. These readings will be converted into a feature vector and, using a frame-based gesture recognition system, mapped to one of a finite set of gestures. The gesture ID will then be sent wirelessly to the robot, which will execute actions based on the gesture.

The glove will use an ATMega328 uController, a 74HC4051 multiplexer/demultiplexer, 5 flex sensors, an accelerometer and an XBee module. This will be built on a custom circuit board with predominantly surface mount components.
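
For a rough idea of how the 4051 lets one ADC pin service all the flex sensors (pin assignments here are placeholders, not the final board layout): three digital lines select a channel, and the mux’s common pin feeds the ADC.

const int SEL[3] = {4, 5, 6};  // placeholder pins for the 4051's S0..S2
const int MUX_OUT = A0;        // 4051 common output into the ADC (assumed)

void setup() {
  for (int i = 0; i < 3; i++) pinMode(SEL[i], OUTPUT);
}

// Read one of the 8 mux channels (the flex sensors occupy 5 of them)
int readChannel(byte ch) {
  for (int i = 0; i < 3; i++) {
    digitalWrite(SEL[i], (ch >> i) & 1);  // set S0..S2 to the channel bits
  }
  delayMicroseconds(10);  // let the mux output settle before sampling
  return analogRead(MUX_OUT);
}

void loop() {
  int flex0 = readChannel(0);  // first flex sensor, for example
}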

The robot will utilize 6 servos for actuation, an ATMega328 uController (maybe 2) and an XBee module.

Both components will support in circuit programming to enable quick changes to the software.

Components Required:

  • ATMega328 x2 [Already Have]
  • 74HC4051 x1 [Already Have]
  • Accelerometer [Already Have]
  • Flex Sensor x5 [Already Have]
  • XBee modules x2 [Already Have]
  • Misc Resistors [Already Have]
  • Misc Capacitors [Already Have]
  • Power supply [May Have, need to look]
  • Servos x6 [Have 5, need 1 more]
  • Printed Circuit Boards [Not Yet]

Rough Timeline

Week 1

Obtain components, design circuit boards, start work on gesture recognizer

Week 2

Continue work on gesture recognizer, begin work on robot controller, order PCBs

Week 3

Complete gesture recognizer, continue work on robot controller, complete assembly of boards

Week 4

Complete robot controller, stuff mechanics into penguin suit, test, test, test, curse, test, test

Finite State Machine, Arduino style

Posted in 8. Finite State Machines on October 29, 2009 by Zhiquan Yeo

So I set out to build an FSM in C on the Arduino, using events.

The FSM is modeled as a set of states and a set of transitions between states. This is the state diagram for the state machine that was implemented:

StateMachine

The system transitions between states depending on its current state and the event that was triggered. Unfortunately there are no events in straight-up C for the ATMega328, so we have to “fake it”. Events are fired when a button or light-dependent resistor changes state (low to high). Thus, if a button was unpressed, an EVENT_BUTTON_DOWN event is fired when the button is pushed.

During each iteration of the main loop, the system checks to see if a new event was fired. If it was, it looks up the transitions available at the current state in the transition table. If a transition at the current state has an event that matches the most recently fired event, then that transition is taken and the current system state is changed. This allows the system to move between states.
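
The core of a table-driven machine like this looks roughly as follows (the states, events and pin numbers here are generic stand-ins, not the ones in the attached code; the LDR event would be edge-detected the same way as the button):

enum State { STATE_A, STATE_B, STATE_C };
enum Event { EVENT_NONE, EVENT_BUTTON_DOWN, EVENT_LDR_HIGH };

struct Transition {
  State from;
  Event on;
  State to;
};

// Transition table: (current state, fired event) -> next state
Transition table[] = {
  {STATE_A, EVENT_BUTTON_DOWN, STATE_B},
  {STATE_B, EVENT_LDR_HIGH,    STATE_C},
  {STATE_C, EVENT_BUTTON_DOWN, STATE_A},
};

const int BUTTON_PIN = 2;  // placeholder input pin
State current = STATE_A;
int lastButton = LOW;

// "Fake" an event by watching for a low-to-high edge on the button
Event pollEvents() {
  int b = digitalRead(BUTTON_PIN);
  Event e = (b == HIGH && lastButton == LOW) ? EVENT_BUTTON_DOWN : EVENT_NONE;
  lastButton = b;
  return e;
}

void setup() {
  pinMode(BUTTON_PIN, INPUT);
}

void loop() {
  Event e = pollEvents();
  if (e == EVENT_NONE) return;
  for (unsigned int i = 0; i < sizeof(table) / sizeof(table[0]); i++) {
    if (table[i].from == current && table[i].on == e) {
      current = table[i].to;  // take the matching transition
      break;
    }
  }
}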

This is the way the circuit is set up:

circuit

And here’s a video of it in action:

Finally, here’s the code: MTI_EX_8

I highly recommend reading the code to see exactly how it works; it’s commented and gives a more in-depth explanation of how the system works.

Auto-lego-Maton

Posted in 7. A Mechanical Automaton on October 15, 2009 by Zhiquan Yeo

I built a hand-cranked automaton out of LEGO. It looks like 2 frogs racing (kind of… squint real hard and you’ll see it). The crank is made out of a wheel and operates a cam which moves one of the arms up and down. The arm in turn moves the “frog”. To get the other cam and arm moving, there is a system of gears that transmits rotational power along a shaft. The video shows it in action, as well as some close-up shots.

(Under)powered, light sensitive, propeller driven thingamajiggy

Posted in 6. Form & Motion on October 14, 2009 by Zhiquan Yeo

So I had a brilliant idea of making a light-activated flying object (and actually had a brushless motor + ESC ready), but those technically didn’t count as an external power circuit. So I ripped the brushed DC motor out of the gearbox and attached a propeller to it. The motor is powered by a 9V battery and driven using a TIP120 transistor. The motor + prop assembly is strapped to a box, which is in turn strapped to a breadboard.

On the Arduino end, a light sensor is connected to an analog input port and is used to determine when to turn the motor on.
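
The control logic boils down to a threshold check, roughly like this (the pin numbers and threshold are guesses, not the values in the attached sketch):

const int LIGHT_PIN = A0;   // light sensor voltage divider (assumed wiring)
const int MOTOR_PIN = 9;    // drives the TIP120's base through a resistor
const int THRESHOLD = 500;  // placeholder light level for "on"

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  // Turn the prop on whenever enough light hits the sensor
  digitalWrite(MOTOR_PIN, analogRead(LIGHT_PIN) > THRESHOLD ? HIGH : LOW);
}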

MTI_EX_6

The motor was SEVERELY underpowered, and as you will see in the video, it moved ridiculously slowly across the table. If I had a brushed motor for powering an airplane, then it would be a completely different story…

Arduino Sketch: MTI_EX_6