Project Post #3: Theremin Jacket

Project Team

Junda Chen, Jeff Ma, Yudong Huang, William Black

Weekly Accomplishments

([x]: Finished Task)

  • 3D-print Leap Motion case
  • [x] First software prototype for theremin
    • [x] Motion trace: proximity and height change
    • [x] Data Transfer and MIDI encode/decode (see the sketch after this list)
    • [x] Run on Arduino/Raspberry Pi
  • Leap Motion Optimization
    • [x] Add an infrared light source
    • Determine where the light should be
      • On the wrist
      • On the jacket
  • Jacket
    • [x] Jacket and light
    • Select a jacket.
    • [x] Design the jacket.
    • Design the jacket's light effects
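
To make the MIDI encode/decode item concrete, here is a minimal sketch of the message format we are targeting: a MIDI Note On/Off is a three-byte message sent over a 31250-baud serial link. This illustrates standard serial MIDI rather than our actual prototype code.

```cpp
// Minimal serial-MIDI encoding demo (illustrative, not our prototype code).
const byte NOTE_ON  = 0x90;   // Note On, channel 1
const byte NOTE_OFF = 0x80;   // Note Off, channel 1

void setup() {
  Serial.begin(31250);        // standard MIDI baud rate
}

void sendNoteOn(byte pitch, byte velocity) {
  Serial.write(NOTE_ON);
  Serial.write(pitch);        // 0-127; 60 = middle C
  Serial.write(velocity);     // 0-127
}

void sendNoteOff(byte pitch) {
  Serial.write(NOTE_OFF);
  Serial.write(pitch);
  Serial.write((byte)0);
}

void loop() {
  sendNoteOn(60, 100);        // strike middle C
  delay(500);
  sendNoteOff(60);            // release it
  delay(500);
}
```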

Image/Video

Material list

  • [x] Circuit board (potentially): MIDI encoder/decoder, Leap Motion image processor
  • [x] Leap Motion (1): $96
  • [x] LED strip lights (2, TBD)
  • A Jacket (1, TBD)
  • [x] (Eye-safe) infrared LEDs (20)

Areas of Concern

  • Infrared LED eye safety: designing better hand tracking without shining infrared light directly into the user's eyes is a design challenge, and it may require some research in the area.

——

Development Log

Cylon.js: a JavaScript framework that can drive both Arduino and the Leap Motion

Adafruit LED strip: $17.99

Leap Motion installation: troubleshooting on Windows.

Project Post #2 – Theremin Jacket

@Postdate: Mar 9th (Sat)

Project Title

Theremin Jacket

Project Team

Junda Chen, Jeff Ma, Yudong Huang, William Black

Major Aspects for Development

  • Sensor

    • Sonar Sensor

    • Leap Motion

  • Arduino/Other interface & Software Design

    • MIDI Software

    • Storage

    • Data Transfer

  • Clothes Design

    • Sensor/Leap Motion embedding

    • Light Design

      • LED

      • Covering material / diffusing material (potentially)

    • Jacket

Weekly Accomplishments

  • Set up sonar sensor tracking on an Arduino Mega.
  • Use the sonar sensor to build a prototype MIDI device (see the sketch after this list).
  • Leap Motion Mechanism
    • How the Leap Motion works: accuracy and general applications
    • HW and SW compatibility with IoT devices
  • 3D-print Leap Motion case
  • First software prototype for theremin
    • Motion trace: proximity and height change
    • Data Transfer and MIDI encode/decode
    • Run on Arduino/Raspberry Pi
    • (Optimization) De-noise.
  • Select a jacket.
  • Design the jacket.
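
For reference, here is a sketch of the sonar-to-MIDI prototype idea; the pin numbers, playable range, and note mapping are illustrative assumptions rather than our exact Mega code. An HC-SR04-style sonar measures hand distance, which is mapped to a MIDI note and sent as serial MIDI.

```cpp
// Sonar theremin sketch (illustrative): hand distance -> MIDI pitch.
const int TRIG_PIN = 9;    // assumed wiring
const int ECHO_PIN = 10;   // assumed wiring

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(31250);            // serial MIDI
}

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);   // 10 us trigger pulse
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);  // 30 ms timeout
  return duration / 58;           // microseconds -> centimeters
}

void loop() {
  long cm = readDistanceCm();
  if (cm > 5 && cm < 60) {                 // assumed playable range
    byte note = map(cm, 5, 60, 84, 48);    // closer hand = higher pitch
    Serial.write(0x90); Serial.write(note); Serial.write(100);  // Note On
    delay(100);
    Serial.write(0x80); Serial.write(note); Serial.write((byte)0);  // Note Off
  }
}
```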

Image/Video

Changes to our approach

We originally wanted to design a primitive circuit and sensors to make the sensing work. William just got the sonar working on Wednesday, so as a backup plan and first approach we will build a theremin using the sonar sensors and integrate it into the jacket.

In search of better gesture recognition, we are also focusing our attention on the Leap Motion. With it, we can capture richer and more sensitive gesture information (grabbing, trembling, rapid up-and-down movement) within its well-defined sensing range.

Material list

  • Circuit board (potentially): MIDI encoder/decoder, Leap Motion image processor
  • Leap Motion (1): $96
  • LED Strip light (2, TBD)
  • A Jacket (1, TBD)

Development Log

Cylon.js: a JavaScript framework that can drive both Arduino and the Leap Motion

Adafruit LED strip: $17.99

Leap Motion installation: troubleshooting on Windows.

Lit Lehenga

Individual: Jessica Fernandes

What is it:

The structure of this project is a lehenga, an Indian cultural skirt worn by women for traditional events and celebrations. This piece incorporates cultural textiles and light components to create the illusion of a cloud of light, celebrating and honoring the experience of growing up as a first-generation Indian American.

What it does:

The garment illuminates to create the impression of airy color diffusion. Light sensors trigger the LEDs to turn on at a certain level of darkness, and the lights can also be controlled by a switch embedded in the structure of the garment.
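
A minimal sketch of that trigger logic, assuming a photoresistor voltage divider on A0, the embedded switch on pin 2, and the LED drive on pin 6; all pin choices and the darkness threshold are placeholders to be tuned on the real garment.

```cpp
// Darkness-or-switch LED trigger (illustrative; pins/threshold assumed).
const int LIGHT_SENSOR   = A0;   // photoresistor divider
const int SWITCH_PIN     = 2;    // waistband switch, closes to ground
const int LED_PIN        = 6;    // LED drive transistor
const int DARK_THRESHOLD = 300;  // tune against the actual sensor

void setup() {
  pinMode(SWITCH_PIN, INPUT_PULLUP);
  pinMode(LED_PIN, OUTPUT);
}

void loop() {
  bool manualOn = (digitalRead(SWITCH_PIN) == LOW);           // switch flipped
  bool isDark   = (analogRead(LIGHT_SENSOR) < DARK_THRESHOLD); // ambient dark
  digitalWrite(LED_PIN, (manualOn || isDark) ? HIGH : LOW);
  delay(50);
}
```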

Who it’s for:

This garment is designed to be a statement piece for Indian women in search of unique, traditional clothing for Diwali (the festival of lights), among other cultural events.

How it’s used:

To activate the garment, the user simply wears the skirt and flips the switch that arms the light sensor, which then triggers the lights. This switch is discreetly integrated into the waistband for ease of use. The power source will also be embedded in the structure of the skirt and will need to be charged when not in use.

How it’s unique: 

Because the piece illuminates in darkness, it creates a moment of fantasy for the wearer. This is especially fitting for celebrations like Diwali.

Inspiration & Sketch

Lehengas and dresses with lights

Materials

  • End-emitting optical fibers
  • Side-emitting fibers or fiber optic fabric
  • LEDs
  • Light sensors
  • Micro-controller
  • Power source
  • Super glue/glue gun
  • Fabric/(conductive) thread

Skills

  • Sewing/embroidery
  • Programming
  • Soldering

Timeline

Milestone 1 (March 25)

Darkness triggers the light sensor to turn on the other lights.

Milestone 2 (April 8)

Network of lights and sensors function in a form that can be draped onto the garment.

Milestone 3 (April 22)

The sensors, lights, and power source function and are integrated into the garment in an aesthetically pleasing finish.

Fallback Plan

If the initial plan does not succeed, I will adjust the features implemented based on what will accomplish the best functionality. This means potentially reducing the number of lights, changing the type of power source, or changing the light trigger from sensors to a switch. These adjustments aim to simplify the design or solve functionality problems with more direct solutions.

Project Post #1: Theremin Jacket

Project Title

Theremin Jacket

Team Members

Jeff Ma, Junda Chen, William Black, Yudong Huang

Project Description

1) What does our project do?

The concept of the Theremin Jacket comes from the theremin, an electronic musical instrument that is controlled without physical contact from the performer. Likewise, the Theremin Jacket we want to make allows the wearer to control an externally connected MIDI device to play music without any physical contact.

2) Who is our project for?

The Theremin Jacket is for people who are fans of music, or more specifically, fans of the theremin. It also gives those without any experience playing a musical instrument a chance to make music.

3) Describe how someone would use the developed device. What are the steps that a user would go through to interface with the technology?

First, there will be a switch on the jacket that lets wearers turn the circuit on and off. When the circuit is on, sensors on the jacket will read the positions of the wearer's hands and arms in real time. That data will then be transmitted over Bluetooth to the externally connected MIDI device, which will receive it and play the corresponding tones.

Besides the features mentioned above, we might also want to let wearers play different types of sounds by moving parts of the body other than the arms (e.g., playing a drum by moving one foot up and down). We could also try letting users store customized MIDI instrument sounds on an external device and play the sounds through a speaker or earphones.

4) What makes your project different from existing products?

In our research so far, we have not found a wearable that supports theremin-style MIDI music and lets the user move freely while controlling the flow of the music. There are products that let the user move but produce whimsical music (e.g., the movement jacket), and products that let the user control music but only in a fixed, not fully interactive way (e.g., the arm MIDI keyboard, MIDI shirt, etc.).

Our project aims to provide an easy-to-control interface that detects the user's hand movement. Users adjust pitch by holding their hands at different heights (or different positions relative to the sensing device) and control the flow of the music through buttons and proximity to the body.

Inspirations

12th December 1927: Professor Leon Theremin demonstrating his theremin. The theremin was the world’s first electronic musical instrument. It is played without actually touching any part of the instrument. Film scores of the 40s and 50s used the instrument to eerie effect and it makes a famous appearance in the chorus of the Beach Boys hit ‘Good Vibrations’. (Photo by Topical Press Agency/Getty Images)

Sketch

Material/Tools Needed

  • Base Jacket
  • Accelerometers / Infrared Sensors / Sonars (for position detection)
  • Arduino Board
  • Thread
  • Machine Needles
  • Battery

Skills/Concept to Master

  • Coding in Arduino
  • Connecting with MIDI
  • Making sensors work
  • Mounting sensors
  • Data transmission through Bluetooth

Timeline

Milestone 1 (March 25)

  • Try out different types of position detectors
  • Determine which type of position detectors to finally use

Milestone 2 (April 8)

  • Get the base jacket
  • Mount sensors on the jacket
  • Data transmission through Bluetooth
  • Connect with MIDI
  • Improve overall precision

Milestone 3 (April 22)

  • Improve overall precision
  • Aesthetic adjustments

Fallback Plan

We are planning to implement a jacket that allows wearers to control different parameters of music (i.e., pitch, amplitude, or duration). If later in the semester we find that we have fallen behind expectations, we could: 1) instead of making both arms work, implement and verify just one arm and decrease the number of musical parameters we control, or 2) instead of making a Theremin Jacket, simply make a position detector for parts of the body.

In-Glove + Applications (SRF and Home Assistant)

The In-Glove (Intelligent Glove)


TEAM: Curt, Shruthi, Vedant

1. DESCRIPTION:

Our project is composed of three subprojects: the hand gesture / position glove and two applications for said glove.

For the first part of the final class project, we would like to make a smart glove that helps the user send remote signals using hand gestures as commands. While many similar gloves exist as projects, a lot of them do not make use of the wide variety of sensors available on the market. Additionally, some of those projects seem inefficiently built, in that they are pretty bulky for how much they can do. We want to explore how those existing gloves could be improved in not just functionality but also aesthetics. Thus, we approach this project not as an invention, but rather as a demonstration/experiment.

1.1 FUNCTIONALITY:

The glove will include pressure, motion, and flex sensors to capture various gestures, and these gesture-driven commands will control the application subprojects detailed below. Communication will be handled over Bluetooth or WiFi depending on the application being implemented.

1.2 USAGE:

The typical process for using the glove will start with the user placing the glove on their hand and performing calibration. While the details of calibration will need to be worked through as we develop the software, it will most likely take the form of the user positioning their hand in a predefined pose. Depending on the end application, we could either provide the user with a list of predefined guidelines that map gestures to tasks or allow customization of that mapping. In the latter case, we would have to run experiments on the kinds of gestures that are read most reliably and provide the user with a set of guidelines on how to make customization most efficient.
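
As one possible calibration scheme (pin assignments, timing, and thresholds are all assumptions rather than a settled design), the sketch below records a per-finger baseline while the wearer holds the predefined pose, then emits a gesture command whenever a flex reading departs far enough from its baseline.

```cpp
// Calibration-then-threshold gesture sketch (illustrative assumptions).
const int NUM_FINGERS = 5;
const int FLEX_PINS[NUM_FINGERS] = {A0, A1, A2, A3, A4};  // assumed wiring
int baseline[NUM_FINGERS];

void setup() {
  Serial.begin(9600);
  delay(2000);                          // wearer holds the neutral pose
  for (int i = 0; i < NUM_FINGERS; i++) {
    long sum = 0;
    for (int s = 0; s < 32; s++) sum += analogRead(FLEX_PINS[i]);
    baseline[i] = sum / 32;             // averaged neutral reading
  }
}

void loop() {
  for (int i = 0; i < NUM_FINGERS; i++) {
    int delta = analogRead(FLEX_PINS[i]) - baseline[i];
    if (abs(delta) > 80) {              // finger curled past threshold
      Serial.print("GESTURE:FINGER");   // command string for the app
      Serial.println(i);                // real code would debounce this
    }
  }
  delay(100);
}
```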

We now describe separately the idea and plan for Curt's supernumerary robotic finger and for Vedant and Shruthi's wearable home automation.


2. SUPERNUMERARY ROBOTIC FINGER

Using the glove as hand-position input, Curt will construct a supernumerary robotic finger mounted next to the left hand's pinky finger. This digit will mirror the biological thumb's location and joint structure. The system will map the hand gesture to user intention, which in turn maps to a joint configuration for the finger. By the end of the term, a simple hard-coded heuristic function will be developed to perform this mapping, as sketched below.
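
The sketch below shows what such a hard-coded heuristic might look like; the servo pins, joint ranges, and the mapping itself are illustrative assumptions, not the final control law. The SRF's two joints simply track how far the biological fingers are curled.

```cpp
// Heuristic curl-mirroring sketch for the SRF (illustrative assumptions).
#include <Servo.h>

Servo jointBase, jointTip;
const int FLEX_PIN = A0;      // representative/averaged flex input

void setup() {
  jointBase.attach(5);        // assumed servo pins
  jointTip.attach(6);
}

void loop() {
  int flex = analogRead(FLEX_PIN);            // 0-1023 raw reading
  int curl = map(flex, 300, 700, 0, 100);     // percent curl, tuned range
  curl = constrain(curl, 0, 100);
  // Heuristic: the SRF mirrors the grasp; the base joint leads the tip.
  jointBase.write(map(curl, 0, 100, 10, 90));
  jointTip.write(map(curl, 0, 100, 0, 60));
  delay(20);
}
```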

Curt is primarily developing this project out of his own curiosity. Specifically, Curt would like to wear the device for a full day to record hand-position data, record failures and inconveniences, record interactions with others and their perceptions, and explore contexts of applicability. These in turn allow Curt to develop a machine learning algorithm, iterative design improvements, HCI insight, and a more general SRF usage taxonomy, respectively.

As for the eventual end user, this technology could potentially augment any life task; however, Curt is mostly interested in applying it to the manufacturing and construction spaces, where the ability to do self-handovers is an important aspect of the task. An example would be screwing in an object overhead while on a ladder. The constraint is that a person should keep three points of contact while holding both the object and the screwdriver. If they need to do all three, they may lean their abdomen against the ladder, which is less effective than a grasp. Instead, with several robotic fingers (or a robotic limb), the object and screwdriver could be effectively held and manipulated while grasping the ladder. Another example that should relate to this class is soldering, where the part(s), soldering iron, and solder all need to be secured. This could be overcome with an SRF thumb that feeds the solder to the tip of the soldering iron while one's other hand holds the parts down.

Within the SRF literature, Curt proposes a modest incremental improvement. Wu & Asada worked with flex sensors; however, they were only interested in the first three fingers and did not attempt to model the hand position directly. Ariyanto, Setiawan, & Arifin focused on developing a lower-cost version of Wu & Asada's work. One of Leigh and Maes' works uses a Myo EMG sensor, which is not included in this project. They also present work with modular robotic fingers, though they never explore the communication between finger and human in that piece. Finally, Meraz, Sobajima, Aoyama, et al. focus on body schema, where they remap a wearer's existing thumb to the robotic thumb.

2.1 WEARABLE HOME AUTOMATION:

While these gloves could have many different functions, we would like to focus on using them as part of a smart home: controlling the TV and smart lights, as well as a Google Home/Alexa (through a speaker). This could be especially useful for people with disabilities (scenarios where voice-based communication is not easy), or even just for a lazy person.

This project applies computational gesture recognition to create a wearable device. We use sensor systems and microcontrollers in conjunction with a mobile application (preferably Android), along with fundamentals of circuit design. Our project management technique will mix waterfall and iterative models, keeping in mind the available timeline, so as to create a viable home automation solution.

The idea is to have the glove communicate with an Android application via Bluetooth or a WiFi module, and the phone in turn can control several other devices. Since applications like Google Assistant have powerful AI integrated into them, can we extend those capabilities beyond the phone onto a wearable fabric?
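
One possible shape for that link, offered as an assumption rather than a settled protocol: gesture IDs sent as newline-delimited strings over an HC-05-style serial Bluetooth module, which the Android app can parse line by line.

```cpp
// Glove-to-phone command link sketch (module, pins, and command
// strings are assumptions for illustration).
#include <SoftwareSerial.h>

SoftwareSerial bt(10, 11);   // RX, TX to the Bluetooth module

void setup() {
  bt.begin(9600);            // common default for HC-05-style modules
}

void sendCommand(const char* cmd) {
  bt.print(cmd);             // e.g. "LIGHTS_ON", "TV_VOL_UP"
  bt.print('\n');            // newline-delimited so the app can parse
}

void loop() {
  // ... gesture recognition runs here and calls sendCommand(...) ...
  delay(100);
}
```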

It also brings in the concept of a portable Google Home of sorts: we would not need to install a Google Home in every room. This project is meant to be pragmatic, and the consumer is the general population. It could also be of extra help to people with disabilities.


3. INSPIRATION

The SRF takes inspiration from Wu & Asada (along with other work using flex sensors to detect finger movement); Meraz, Sobajima, Aoyama, et al. provide the inspiration for adding a thumb as the extra digit; and Leigh and Maes' work on modular fingers is the inspiration for how Curt will construct the wrist connection. The novelty is bringing these pieces together into a wearable on which one can run a long-term usability test.

https://ieeexplore.ieee.org/document/6629884

http://maestroglove.com/

 

SRF:

  • [1] F. Y. Wu and H. H. Asada
  • [2] M. Ariyanto, R. Ismail, J. D. Setiawan and Z. Arifin
  • [3] Sang-won Leigh and Pattie Maes
  • [4] S. Leigh, H. Agrawal and P. Maes
  • [5] F. Y. Wu and H. H. Asada
  • [6] Segura Meraz, N., Sobajima, M., Aoyama, T. et al.

We found significant work in the general area of gesture recognition using a glove, but those gesture interpretations have been applied in different domains. We did find a project or two where gloves were used for home automation; however, their choice of sensors differs from what we plan to use (flex and pressure).

https://pdfs.semanticscholar.org/48d1/f42a04c14eaac14f0339666d610309a3ff58.pdf

https://ieeexplore.ieee.org/document/5573128

https://ieeexplore.ieee.org/document/7931887

 


4. SKETCHES

What will the final product look like?

The following sketches were developed for the project pitch and are compiled here due to their similarity. The important aspects to note on the glove are the flex sensors used to capture finger movement, the touch sensors on the fingertips, and the inertial measurement unit (IMU) that captures hand orientation.

 

Glove:

 

SRF:


5. MATERIALS

Electronics:

  • Microcontroller (w/ Bluetooth and WiFi, e.g. ESP32, Adafruit Flora, Particle Photon, Arduino; possibly an external Bluetooth or WiFi module)
  • Flex sensors
  • Resistive pressure/force sensors
  • Vibration motor
  • IMU (Gyroscope, accelerometer)
  • Micro servos (For SRF)
  • Infrared LED emitter / receiver

Clothing / Physical Materials:

  • Glove
  • Wrist brace (For SRF)
  • 3D printed components (case, mounting)

6. SKILLSET LIST:

Curt:

I come into this project with experience in soldering, circuit design, and programming. My talent is mainly embedded-systems programming in C. I will need to master sewing and other soft-materials skills to hack a glove in an aesthetically pleasing fashion. Another skill hurdle will be 3D printing: I gained some familiarity during undergrad, where I worked with a 3D printer as a hobbyist, but I was never formally trained. Finally, I will need to further hone my project management skills given the ambitious scope laid out here.

Vedant:

While I have some experience with programming, this project would require a lot of microcontroller programming, so that would be the main thing I would have to master. The project would also require some knowledge of IoT so I would be looking more into that as well.

Additionally, I have experience using tools in the makerspace (3D printing, laser cutting), so if I have time to focus on aesthetics, I could use those skills to improve the looks. However, it is a given that I will also have to learn to solder well to compact the design, as well as to sew to make the glove look nice.

Shruthi:

I have prior experience with Arduino programming and a decent understanding of C and Python. I can also contribute to integrating the sensor data into a mobile application. I am more comfortable debugging in an IDE than directly on a hardware device, and I need to improve my skill set in this direction. I could also help with the aesthetic aspects of the glove, such as sewing and stitching; however, I do not have significant experience in these areas.


7. TIMELINE

7.1 MILESTONE 1

Technology shown to work (March 25)

  • Glove
    • Glove w/ all flex sensors
    • Position of hand captured, data transmitted to PC/phone for processing / visualization
    • IMU captures absolute orientation, data transmitted to PC/phone for processing / visualization
    • Power supply and integration started
  • SRF
    • Robotic finger 3D printed
  • Assistant
    • Test whether a Bluetooth or WiFi module would work best for connecting to the Google Assistant
    • Look into the Google Assistant API and see what functions/commands can be issued by a third-party app

7.2 MILESTONE 2

Technology works in wearable configuration (April 8)

Note: after this point, project focus splits from being glove-focused to application-focused. Curt will take on the SRF subproject, and Shruthi & Vedant will take on the home assistant subproject.

  • Glove
    • Refinement of aesthetics
    • Power supply and integration complete
  • SRF
    • Robotic finger controlled
    • Brace developed
  • Assistant
    • Basic functioning app developed
    • Required IR codes and assistant commands obtained

7.3 MILESTONE 3

Technology and final wearable fully integrated (April 22)

  • Glove
    • Error reduction, final iteration on design
  • SRF
    • If time allows, a one-person “user study” to capture longer-term data
    • Otherwise, refinement of the finger control / design
  • Assistant
    • Fully functioning app that receives commands from gloves and sends commands to Google Assistant app to control smart home
    • Aesthetic gloves that house the electronics, battery and sensors ergonomically

8. FALLBACK PLAN

Glove:

We will be starting with an experiment with one finger to see if one flex sensor is sufficient to capture the gesture information necessary. The answer found may show that one flex sensor is sufficient for the assistant subproject but not the SRF project. Alternatively it could be that flex sensors themselves are not sufficient. To that end, this experiment will be done early in order to integrate the findings into our design.

The minimum that we will consider a success is a glove that can output preprogrammed gesture commands that are loosely coupled to the state of the fingers. For example, curling the index finger will be a command.

SRF:

Curt will have to learn more about 3D printing to develop the SRF's physical components. Additionally, a mounting system needs to be developed. Depending on how time-consuming this step is, Curt may need to settle for a simple proof-of-concept heuristic for the final demonstration. This could result in non-ideal finger movement that does not track the wearer's intent; however, failure of the algorithm in some cases will be deemed acceptable, as this is still an open research area.

 

Assistant:

We will integrate the sensor data and establish communication with a phone via either Bluetooth or WiFi. We will probably begin by controlling a smart light or another single device, and eventually try to build larger integration with the Google Home app. We will first develop a rather simple proof-of-concept prototype and try to get a single end-to-end communication channel working. However, since this is still at an experimental stage, we cannot guarantee the accuracy and reliability of its function.

Project Post #1

Project title: Qi Jeans

What should we call your project? Qi Jeans

Who you are (are you an individual, a team, etc)? By Gregg Van Dycke

A description of what you would like to create.

I would like to create a pair of jeans that allows you to wirelessly charge your phone from your pocket.

1) What does your project do? (1-2 sentences)

My project is to have a pair of jeans that is capable of charging your phone wirelessly while it is in the pocket.
2) Who is your project for? (1-2 sentences)

My project is for people looking to extend their phone's battery life while staying flexible in their day-to-day activities.

3) Describe how someone would use the developed device.  What are the steps that a user would go through to interface with the technology? (at least a paragraph)
To use the device, the user will first need a smartphone capable of wireless charging. Second, the user will need to charge the device so it can supply their phone with power. Next, the user will insert the device into the jeans' special holder pocket and make sure the transmitter faces the normal pants pocket. Then the user puts their phone in that pocket with the back of the phone facing the transmitter.
4) What makes your project different from existing products? (2-4 sentences)

My project is different from other products because it is removable where others are not. It is also built into a pair of pants, while most others are in jackets, and it uses wireless charging where existing products use a cable.
What is already out there that is similar to what you are trying to do?

Nokia actually did something exactly like this, which I only recently saw. They disassembled one of these wireless chargers and integrated it into a pair of pants. That might also be the best option for me. But there are other types of clothing that can recharge electronic devices, although they have different form factors or use a different form of charging.

Digital or scanned sketches of your project

A bulleted list of the materials/tools you’ll use/need

Milestone 0 (March 15): Have received all bought materials

Milestone 1 (March 25):  Have a working transmitter from materials purchased.

Milestone 2 (April 8): Have jeans pocket built with package functioning outside of pants.

Milestone 3 (April 18): Have everything integrated in the pants.

Milestone 4 (April 22): Have a fully functioning unit.


What can you do to recover your project if it doesn't go as planned? Use an already-built wireless charger that sits in the pants where the custom-made transmitter would have gone. This would still allow wireless charging, which is a key component, while also letting me work on the jeans portion.
What is the bare minimal outcome that you would consider a success? To have a functioning charging unit that can be used outside the pants, or to have the pants pocket sewn to the correct size so that the phone sits where it needs to be.

Initial Project Pitch

Project Diagram

Project Description:

My project will use an electromyography (EMG) sensor and the Adafruit Circuit Playground Express development board I got from the class. Here is how it works: the EMG sensor picks up the electrical signal from a human muscle and outputs it in analog form. The development board, which has an ARM Cortex-M0 processor, converts the analog signal to digital and visualizes it on the onboard LEDs as a bar-like gauge. If the sensor senses that the muscle is exerting maximum strength, the buzzer sounds to warn the person about muscle-related injuries.
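
A minimal sketch of that pipeline using the Adafruit Circuit Playground library; the input pin and thresholds are assumptions to be tuned against the MyoWare's actual output range.

```cpp
// EMG bar gauge + over-exertion alert (illustrative; tune thresholds).
#include <Adafruit_CircuitPlayground.h>

const int EMG_PIN = A1;   // assumed EMG input pad

void setup() {
  CircuitPlayground.begin();
}

void loop() {
  int raw = analogRead(EMG_PIN);            // 0-1023 from the EMG sensor
  int lit = map(raw, 0, 1023, 0, 10);       // how many of the 10 pixels

  CircuitPlayground.clearPixels();
  for (int i = 0; i < lit; i++) {
    CircuitPlayground.setPixelColor(i, 0, 255, 0);  // green bar segment
  }
  if (raw > 950) {                          // near-maximum strength (assumed)
    CircuitPlayground.playTone(880, 200);   // alert tone on the buzzer
  }
  delay(50);
}
```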

Inspiration Project

  • Muscle Sensor + LED Shield
  • The muscle sensor outputs its analog signal directly into the LED Shield, which can display it as a 10-segment bar graph

Project Sketches

  • The final product will look like an armband with an LED indicator and a buzzer and/or vibration module

Materials and tools

  • MCU: LilyPad Arduino 328 Main Board
  • EMG Sensor: MyoWare Muscle Sensor + Biomedical Sensor Pad
  • LED: MyoWare LED Shield/LilyPad Pixel Board/SparkFun RGB LED Breakout – WS2812B
  • Buzzer: LilyPad Buzzer
  • Vibration Motor: LilyPad Vibe Board
  • Tools: Soldering Iron, Breadboard
  • Miscellaneous: male & female header connectors, jumper wire

MCU: LilyPad Arduino 328 Main Board

The LilyPad Arduino consists of an ATmega328 with the Arduino bootloader and a minimum number of external components to keep it as small (and as simple) as possible. This board will run from 2V to 5V and offers large pin-out holes that make it easy to sew and connect. Each of these pins, with the exception of (+) and (-), can control an attached input or output device (like a light, motor, or switch).

EMG Sensor: MyoWare Muscle Sensor + Biomedical Sensor Pad

This is the MyoWare Muscle Sensor, an Arduino-powered, all-in-one electromyography (EMG) sensor from Advancer Technologies. The MyoWare board works by measuring the filtered and rectified electrical activity of a muscle, outputting 0 to Vs volts depending on the amount of activity in the selected muscle, where Vs is the voltage of the power source. It's that easy: stick on a few electrodes (not included), read the voltage out, and flex some muscles!

 

Skills and concepts

  • Soldering
  • A/D Converter Basics
  • Coding on Arduino Platform

Timeline

  • Milestone 1 (March 16): Make the project basic functions work on breadboard
  • Milestone 2 (April 6): Tune the threshold values and test in wearable scenarios
  • Milestone 3 (April 20): Fully test and ensure the project works well in real-life situations

Fallback Plan

If it is not easy to find workable threshold values for the project to function properly, I can change the project to detect sudden peaks in the muscle's electrical signal and count repetitions of exercises such as push-ups or sit-ups.
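
A sketch of that fallback counter (the thresholds are illustrative assumptions): count one repetition each time the signal rises through a high threshold, and re-arm only after it falls back below a low threshold so a single contraction is not counted twice.

```cpp
// EMG peak-based rep counter with hysteresis (illustrative thresholds).
const int EMG_PIN        = A1;
const int HIGH_THRESHOLD = 700;   // contraction peak
const int LOW_THRESHOLD  = 300;   // relaxed again
int repCount = 0;
bool armed = true;

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(EMG_PIN);
  if (armed && raw > HIGH_THRESHOLD) {
    repCount++;                   // one push-up/sit-up counted
    armed = false;                // ignore this contraction from now on
    Serial.println(repCount);
  } else if (!armed && raw < LOW_THRESHOLD) {
    armed = true;                 // wait for the next contraction
  }
  delay(20);
}
```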

Project Member

Sin-Ying Lin

 

 

Initial Project Pitch

Jeff Brandt

My design is for a shirt that can change colors in response to an electrical stimulus. This will be accomplished with magnetized beads on the shirt that flip their orientation, showing the half of each bead that is colored differently from the other half. This aims to solve a problem for over-packers on work trips or vacations by letting the user pack less clothing while still having a different appearance each day.

I am confident in the materials selection process as well as some very basic programming.

I am less confident in the design aspects such as stitching or weaving together fabrics.