Author Archive

Final Project Report (Rhino Glove)

Posted in 11. Project Final Reports on December 16, 2009 by mehrdadgh

The goal of this project is to develop a wireless glove and a software framework for manipulating a parametric model in Rhino 3D. The glove reads the user’s hand movements and gestures using sensors mounted on it. A LilyPad Arduino gathers the sensor data as input, processes it, and sends it wirelessly to the computer through an XBee module. The software framework on the computer receives the data through another XBee module connected to the computer. Finally, the framework translates the data into information for controlling the parametric model in Rhino 3D.

Link to documentation: https://mtifall09.files.wordpress.com/2009/12/rhinoglove.pdf

Link to Arduino sketch: http://code.arc.cmu.edu/~cheng/uploads/RhinoGloveSketch.pde

Link to Grasshopper definition: http://rapidshare.com/files/321715436/Rhino_Glove.ghx

Link to video: http://www.youtube.com/watch?v=6A5I48RLtx4
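
The linked Arduino sketch contains the actual implementation. As a rough illustration of the read-and-send loop on the LilyPad, here is a minimal sketch of my own; the pin assignments (bend sensors on A0–A2, accelerometer axes on A3–A5) and the 10 Hz sample rate are assumptions for illustration, not the project’s real values. The XBee in transparent mode simply forwards whatever is written to the serial port.

    // Hypothetical pins: three bend sensors on A0-A2, accelerometer X/Y/Z on
    // A3-A5, XBee wired to the LilyPad's hardware serial pins.
    const int bendPins[3]  = {A0, A1, A2};
    const int accelPins[3] = {A3, A4, A5};

    void setup() {
      Serial.begin(9600);   // the XBee in transparent mode forwards this stream
    }

    void loop() {
      // One comma-separated line per sample: bend0,bend1,bend2,ax,ay,az
      for (int i = 0; i < 3; i++) {
        Serial.print(analogRead(bendPins[i]));
        Serial.print(",");
      }
      for (int i = 0; i < 3; i++) {
        Serial.print(analogRead(accelPins[i]));
        if (i < 2) Serial.print(",");
      }
      Serial.println();
      delay(100);           // roughly ten samples per second
    }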


Build the Hardest Part

Posted in 10. Build the Hardest Part on December 16, 2009 by mehrdadgh

It may be too late to figure out the hardest part, yet it is a good time to explain the hardest part of the Rhino Glove. After a long process of loving and hating programming the Arduino sketch for this project, I understand that writing proper code to handle six inputs and to retrieve information from them is the frustrating part of such a project. In my project, three bend sensors with three different output ranges are used to help the program figure out what the accelerometer outputs mean. For example, bending the index finger, and as a result the bend sensor attached to it, should be interpreted as a trigger for extruding an object, while bending the middle-finger sensor should trigger the rotation control. However, when the user moves one finger, the other fingers move as well and their bend sensors also send data to the Arduino, so writing proper code that finds the user’s intended action from three different bend sensor outputs is a very hard job. Likewise, when the user moves the hand in an arbitrary direction, the Arduino reads three different acceleration values, one for each axis, and figuring out the exact movement of the hand from these three streams of data is a difficult task. In fact, I found this the hardest part to build. Unfortunately, I faced this problem in the last phase of my project, the demonstration, and I have not yet found a way to overcome it. In the future, I will focus on solving this problem with more dedication.
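
To make the problem concrete, here is a simplified sketch of the kind of disambiguation logic involved. The pins, thresholds, and the “most-bent finger wins by a margin” rule are my own assumptions for illustration, not the code actually used in the project.

    // Hypothetical disambiguation: only the finger that is bent beyond its own
    // threshold AND clearly more bent than the others triggers an action.
    const int bendPins[3]  = {A0, A1, A2};    // index, middle, ring
    const int threshold[3] = {600, 580, 620}; // per-sensor, since the ranges differ
    const int margin       = 40;              // winner must beat the runner-up by this much

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      int reading[3];
      for (int i = 0; i < 3; i++) reading[i] = analogRead(bendPins[i]);

      // Find the most-bent and the second most-bent finger.
      int best = 0;
      for (int i = 1; i < 3; i++) if (reading[i] > reading[best]) best = i;
      int second = (best == 0) ? 1 : 0;
      for (int i = 0; i < 3; i++)
        if (i != best && reading[i] > reading[second]) second = i;

      // Trigger only if the winner is clearly bent and clearly ahead of the rest.
      if (reading[best] > threshold[best] && reading[best] - reading[second] > margin) {
        if (best == 0)      Serial.println("EXTRUDE");  // index finger
        else if (best == 1) Serial.println("ROTATE");   // middle finger
      }
      delay(50);
    }

Even with a margin, coupled finger motion and the differing sensor ranges make the thresholds hard to tune, which is exactly the problem described above.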

EMGH BAND

Posted in 9. Project Proposals on November 15, 2009 by mehrdadgh

The goal of this project is to develop the “EMGH band”. The EMGH band is a wristband that consists of EMG electrodes, recognizes finger and wrist movements, and, based on those movements, animates a parametric model of waves in Grasshopper®. As an illustration, rotating the fist will rotate an object in Grasshopper®, and waving the fingers will create a wave motion on screen based on the intensity of the finger motion.

Electromyography (EMG) is a technique for evaluating and recording the activation signal of muscles.

Grasshopper® is a graphical algorithm editor tightly integrated with Rhino’s 3-D modeling tools.

In order to achieve this goal, developing two frameworks is essential:

1- Hardware Framework

2 – Software Framework

Details of each framework and points that should be considered can be found below:

1- Hardware Framework

– The user makes a movement.

– The muscles related to that movement react.

(Which muscle is related to which movement?)

– The EMG sensor reads the muscle impulse and outputs a signal no stronger than a millivolt.

– Amplifiers amplify the EMG sensor signal into an input for the Arduino.

– Filters remove noise from the signal.

(How do we filter the different noise from different muscles and different movements?)

– The Arduino reads the amplified, filtered signals as analog inputs.

– The Arduino sends the data it reads to the digital computer (a rough sketch of these two steps follows this list).

(Should the data be sent over a cable connection or a wireless module?)
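
As referenced above, here is a minimal Arduino-side sketch of the read-and-send steps. The channel count, pin numbers, and the simple rectify-and-smooth step are assumptions of mine for illustration; the real amplifier and filter design is still an open question.

    // Hypothetical setup: three amplified, filtered EMG channels on A0-A2.
    // Each channel is rectified around an assumed mid-rail offset and smoothed,
    // so the computer receives a slowly varying "muscle activity" value per channel.
    const int NUM_CHANNELS = 3;
    const int emgPins[NUM_CHANNELS] = {A0, A1, A2};
    const int restLevel = 512;              // assumed amplifier mid-rail offset
    float envelope[NUM_CHANNELS] = {0, 0, 0};

    void setup() {
      Serial.begin(9600);                   // over a USB cable now; an XBee could replace it
    }

    void loop() {
      for (int i = 0; i < NUM_CHANNELS; i++) {
        int raw = analogRead(emgPins[i]);
        int rectified = abs(raw - restLevel);               // full-wave rectify
        envelope[i] = 0.9 * envelope[i] + 0.1 * rectified;  // smooth (crude low-pass)
        Serial.print((int)envelope[i]);
        if (i < NUM_CHANNELS - 1) Serial.print(",");
      }
      Serial.println();
      delay(20);                            // about fifty lines per second
    }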

2 – Software Framework

– The Arduino software on the computer stores the data in a text file (a rough sketch of this hand-off follows this list).

– Grasshopper® reads the values from the text file.

– Grasshopper® animates the parametric model based on the values it reads.

– The parametric model evolves on the screen in front of the user based on the Grasshopper® output.

(The signal values should be processed so that the parametric model receives meaningful information. At which step should the data be processed?)
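
As referenced above, here is one possible shape for the hand-off between the serial stream and the text file that Grasshopper® polls. It is only a sketch in C++ that reads sample lines from standard input; the file name values.txt and the one-line-per-sample format are my own assumptions, not part of the proposal.

    // Hypothetical bridge: read comma-separated sample lines from standard input
    // (e.g. piped from the serial port) and keep values.txt holding the newest
    // line, so the Grasshopper definition can poll and parse one short file.
    // Example (POSIX, port already configured with stty): ./bridge < /dev/ttyUSB0
    #include <fstream>
    #include <iostream>
    #include <string>

    int main() {
      std::string line;
      while (std::getline(std::cin, line)) {
        if (line.empty()) continue;                 // ignore blank lines
        std::ofstream out("values.txt", std::ios::trunc);
        out << line << '\n';                        // overwrite with the latest sample
      }
      return 0;
    }

Keeping only the latest sample in the file keeps the read on the Grasshopper® side cheap; logging every line instead would only require opening the file in append mode.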

The hardest part is improving the weak EMG signals, analyzing them, and deriving information for the parametric model from that data.

– Where should the data be processed?

– What is the relation between the separate EMG signals?

– How can the separate pieces of information be combined to simulate a movement like wrist rotation?

Here is the shopping list:

– Reusable EMG sensors (5 – 10)

– EMG sensor connection wires (same count as sensors)

– Amplifiers (same count as sensors)

– Arduino Board

– USB Connection for Arduino or XBee Module

– Wires

– Wrist Band

Here is the project schedule:

Week 1

– Ordering sensors

– Developing the framework for transferring data from the Arduino board to Grasshopper

– Working on amplifiers and sensors

Week 2

– Working with EMG Sensor

– Working on amplifiers and sensors

– Interpreting EMG signals

Week 3

– Working on amplifiers and sensors

– Interpreting filtered, amplified signals

– Finishing the software framework

Week 4

– Assembling components

– Testing the product

– Enhancing signal interpretation

– Developing and enhancing parametric model

– (If there is spare time) Adding the wireless module

You can find the in-class presentation here:

http://rapidshare.com/files/307525686/EMGH_Band.pdf

Love & Hate Robot

Posted in 6. Form & Motion on October 19, 2009 by mehrdadgh

For the assignment in form and motion, I built a robot called the love and hate robot. The robot is assembled on a basswood chassis with two wheels in the rear and a sliding wheel in the front. It basically uses two IR sensors as input devices and controls its two DC motors using the input information: the input of the left sensor controls the right motor, and the input of the right sensor controls the left motor. The following image is an illustration:

[Image: braitenberg]

The robot has a push button on it that sends an input to the Arduino; using it, the user can change the robot’s mode. The uploaded code controls the Arduino so that for an odd number of pushes the robot goes into love mode, and for an even number of pushes it goes into hate mode.

In love mode, when an IR sensor detects an object at close range, the related motor generates torque, so the robot moves toward the object. In hate mode, when an IR sensor detects an object at close range, the related motor stops generating torque while the other motor keeps generating torque, so the robot moves away from the object. (Refer to the following book: http://books.google.com/books?id=7KkUAT_q_sQC&dq=braitenberg+vehicles)
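
The uploaded code is not reproduced here; the following is only a rough sketch of the behavior described above, with pin numbers, thresholds, and the pull-down button wiring as my own assumptions.

    // Hypothetical wiring: left/right IR sensors on A0/A1, left/right motors
    // driven through transistors on PWM pins 5/6, mode button on pin 2.
    const int irLeft = A0, irRight = A1;
    const int motorLeft = 5, motorRight = 6;
    const int buttonPin = 2;

    const int nearThreshold = 400;   // assumed "object is close" reading
    bool loveMode = true;            // first (odd) press selects love mode
    bool lastButton = false;
    int presses = 0;

    void setup() {
      pinMode(buttonPin, INPUT);     // assumes an external pull-down resistor
      pinMode(motorLeft, OUTPUT);
      pinMode(motorRight, OUTPUT);
    }

    void loop() {
      // Count button presses: odd = love mode, even = hate mode.
      bool button = digitalRead(buttonPin) == HIGH;
      if (button && !lastButton) {
        presses++;
        loveMode = (presses % 2 == 1);
        delay(20);                   // crude debounce
      }
      lastButton = button;

      // Crossed wiring: the left sensor drives the right motor and vice versa.
      bool leftNear  = analogRead(irLeft)  > nearThreshold;
      bool rightNear = analogRead(irRight) > nearThreshold;

      if (loveMode) {
        // Love: a close object makes the crossed motor run, turning toward it.
        analogWrite(motorRight, leftNear  ? 255 : 0);
        analogWrite(motorLeft,  rightNear ? 255 : 0);
      } else {
        // Hate: a close object stops the crossed motor while the other keeps
        // running, so the robot turns away from the object.
        analogWrite(motorRight, leftNear  ? 0 : 255);
        analogWrite(motorLeft,  rightNear ? 0 : 255);
      }
      delay(30);
    }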

[Photos of the robot: DSCN6670, DSCN6672, DSCN6671]

Watch the story of this robot:

What if …?

Posted in Uncategorized on October 19, 2009 by mehrdadgh

Here is the late version of my “What if …?” Assignment.

Treasure Hunting and Some DayDreaming Late at Night

Posted in 5. Treasure Hunt on October 2, 2009 by mehrdadgh

Since I don’t want to terrify you with a long list of items that is readable only by students of this course, I have created a simple Excel file that contains each item, its price, a link, and other information. See the file yourself, and by the way feel free to edit it (yes, the file is reachable and editable online; just sign up (select the lite plan) using this link:

http://www.box.net/shared/c04fg2ub8g

it is free, and I assure you it is not an advertisement) in order to improve our list of items.

As a bizarre sensor or device, the eNose is a good technical example.

http://science.nasa.gov/headlines/y2004/06oct_enose.htm

There is a commercial version of a similar device; you can find the link in the Excel spreadsheet.

Finally, a project where we could use all this stuff would be a tangible dance or theater scene. Suppose actors could perform magical lighting using bend sensors on their hands or stretch sensors in their costumes that change the EL wires’ light. Using conductive paint or fabric on their bodies, dancers or actors could trigger the backstage mechanisms, which use solenoids, linear actuators, tiny potentiometers, shape-memory alloy wire, neodymium magnets, copper tape, copper foil, tilt sensors, Peltier junctions, and all the other fascinating stuff, and change the scene depending on their choreography or piece. The scene might use thermochromic paint as a heat-driven color effect. And we could use an alcohol sensor to detect artists and audience members who cannot drive home after the theater.

Anyway, here is the video that inspired me (the movie Moulin Rouge! is another source; just watch both and let your mind fly in the MTI class. Enjoy your daydreaming.)

Assignment 3 Using a Pendulum (Pandool) Switch

Posted in 3. Digital Input-Output on September 27, 2009 by mehrdadgh

This cup includes two switches: a push-button switch on top of the cup and a pendulum (“pandool”) switch inside the inner cup. As you can see in the video, there are several ways to activate these two switches. The idea for the switch just came into my head while sitting in the architecture lounge drinking tea – no wonder! Here are the codes, and you can see what they do in the video.


Arduino Sketch: Assignment_3_part_2_mehrdadgh
Arduino Sketch: Assignment_3_part_3
Arduino Sketch: Assignment_3_part_4
Arduino Sketch: Assignment_3_Additional
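
The linked sketches above are the actual code. As a minimal illustration of reading the two switches, here is a rough sketch with hypothetical pin assignments and an LED output that is not part of the original cup.

    // Hypothetical wiring: push-button switch on pin 2, pendulum switch on pin 3,
    // both with external pull-down resistors; the built-in LED on pin 13 reports
    // whether either switch is closed.
    const int buttonPin   = 2;
    const int pendulumPin = 3;
    const int ledPin      = 13;

    void setup() {
      pinMode(buttonPin, INPUT);
      pinMode(pendulumPin, INPUT);
      pinMode(ledPin, OUTPUT);
    }

    void loop() {
      bool buttonClosed   = digitalRead(buttonPin) == HIGH;
      bool pendulumClosed = digitalRead(pendulumPin) == HIGH;
      digitalWrite(ledPin, (buttonClosed || pendulumClosed) ? HIGH : LOW);
    }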