Project Post #2 – Theremin Jacket

@Postdate: Mar 9th (Sat)

Project Title

Theremin Jacket

Project Team

Junda Chen, Jeff Ma, Yudong Huang, William Black

Major aspects for Development

  • Sensor

    • Sonar Sensor

    • Leap Motion

  • Arduino/Other interface & Software Design

    • MIDI Software

    • Storage

    • Data Transfer

  • Clothes Design

    • Sensor / Leap Motion embedding

    • Light Design

      • LED

      • Covering Material / Diffusing Material (Potentially)

    • Jacket

Weekly Accomplishments

  • Set up sonar sensor tracking on an Arduino Mega.
  • Used the sonar sensor to build a prototype MIDI device (see the sketch after this list).
  • Studied the Leap Motion mechanism
    • How Leap Motion works, its accuracy, general applications
    • HW and SW compatibility with IoT devices
  • 3D-printed a Leap Motion case
  • First software prototype for the theremin
    • Motion trace: proximity and height change
    • Data transfer and MIDI encode/decode
    • Runs on Arduino/Raspberry Pi
    • (Optimization) De-noising
  • Selected a jacket.
  • Designed the jacket.
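As a rough illustration of the sonar-based MIDI prototype, here is a minimal Arduino-style sketch that reads an HC-SR04-class sonar sensor and maps hand distance to MIDI notes over the serial port. The pin numbers, note range, and the assumption of an HC-SR04 sensor are for illustration only and are not the exact wiring or tuning used in our prototype.

```cpp
// Minimal sonar-theremin sketch: distance from an HC-SR04 sensor -> MIDI note-on/off.
// TRIG_PIN, ECHO_PIN, and the note mapping are assumed values, not the project's wiring.

const int TRIG_PIN = 9;
const int ECHO_PIN = 10;
byte lastNote = 0;

void sendNoteOn(byte note, byte velocity) {
  Serial.write(0x90);          // note-on, MIDI channel 1
  Serial.write(note);
  Serial.write(velocity);
}

void sendNoteOff(byte note) {
  Serial.write(0x80);          // note-off, MIDI channel 1
  Serial.write(note);
  Serial.write((byte)0);
}

long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000UL);  // echo time in microseconds
  return duration * 0.034 / 2;                       // convert to centimeters
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(31250);         // MIDI baud rate; a serial-to-MIDI bridge on a PC also works
}

void loop() {
  long cm = readDistanceCm();
  if (cm > 2 && cm < 60) {                        // playable range in front of the jacket
    byte note = map(cm, 2, 60, 84, 48);           // closer hand -> higher pitch
    if (note != lastNote) {
      if (lastNote != 0) sendNoteOff(lastNote);
      sendNoteOn(note, 100);
      lastNote = note;
    }
  } else if (lastNote != 0) {
    sendNoteOff(lastNote);
    lastNote = 0;
  }
  delay(30);
}
```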

Image/Video

Changes to our approach

We originally wanted to design the basic circuit and sensors ourselves to make the sensing work. William got the sonar working on Wednesday, so as a backup plan and first approach we will build a theremin using the sonar sensors and integrate it into the jacket.

While looking for ways to improve gesture recognition, we have also turned our attention to the Leap Motion. With it we can capture richer and more sensitive gesture information, such as grabbing, trembling, and large up-and-down movements, within its well-defined sensing range.
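To show what that gesture data looks like, below is a small sketch against the classic Leap Motion C++ SDK (v2) that reads palm height and depth each frame and converts them to rough pitch and volume values. The mapping constants are placeholders for illustration; our actual prototype encodes the values as MIDI and sends them to the Arduino / Raspberry Pi side.

```cpp
#include <algorithm>
#include <iostream>
#include "Leap.h"

using namespace Leap;

// Listener that turns palm position into theremin-style pitch/volume values.
class ThereminListener : public Listener {
public:
  void onFrame(const Controller& controller) {
    const Frame frame = controller.frame();
    if (frame.hands().isEmpty()) return;

    const Hand hand = frame.hands().frontmost();
    float height = hand.palmPosition().y;   // mm above the controller -> pitch
    float depth  = hand.palmPosition().z;   // toward/away from the player -> volume

    // Placeholder mapping into MIDI-style ranges; the real prototype will need tuning.
    int note     = std::min(84, std::max(48, 48 + (int)(height / 400.0f * 36.0f)));
    int velocity = std::min(127, std::max(0, 127 - (int)((depth + 200.0f) / 400.0f * 127.0f)));
    std::cout << "note=" << note << " velocity=" << velocity << std::endl;
  }
};

int main() {
  ThereminListener listener;
  Controller controller;
  controller.addListener(listener);

  std::cout << "Press Enter to quit..." << std::endl;
  std::cin.get();

  controller.removeListener(listener);
  return 0;
}
```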

Material list

  • Circuit board: (potentially) MIDI encoder/decoder, Leap Motion image processor
  • Leap Motion (1): $96
  • LED Strip light (2, TBD)
  • A Jacket (1, TBD)

Development Log

Cylon.js: a JavaScript robotics framework that can bridge the Leap Motion and Arduino-class hardware

Adafruit LED strip: $17.99

Leap Motion installation: troubleshooting on Windows.

Michael Leykin: Initial Project Pitch

Name: Michael Leykin

Idea#1: Defense

The 3-factor Bluetooth authentication bracelet!

Sketch:

Purpose: Add additional layers of security to any physical device.

The project is meant to be a pragmatic solution for companies with a large number of workstations.

How It Would Work: After the application is installed on a machine, the Bluetooth bracelet would be calibrated to identify you; from then on, you would have to be wearing the bracelet to log into your machine.
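As a purely conceptual illustration of the presence check (not the planned implementation), the sketch below scans for a known bracelet address from an ESP32 dev board using the Arduino BLE scanning API. The bracelet address, scan timing, and the choice to run the check on a microcontroller instead of inside the workstation application are all assumptions.

```cpp
#include <BLEDevice.h>
#include <BLEScan.h>
#include <BLEAdvertisedDevice.h>

// Hypothetical address of the paired bracelet; a real system would use a
// cryptographic challenge rather than a spoofable MAC address.
static const char* BRACELET_ADDR = "aa:bb:cc:dd:ee:ff";

bool braceletPresent() {
  BLEScan* scan = BLEDevice::getScan();
  scan->setActiveScan(true);
  // Classic ESP32 Arduino BLE API (core 2.x); newer cores return a pointer here.
  BLEScanResults results = scan->start(3, false);   // 3-second scan
  for (int i = 0; i < results.getCount(); i++) {
    BLEAdvertisedDevice d = results.getDevice(i);
    if (d.getAddress().toString() == BRACELET_ADDR) return true;
  }
  scan->clearResults();
  return false;
}

void setup() {
  Serial.begin(115200);
  BLEDevice::init("auth-checker");
}

void loop() {
  Serial.println(braceletPresent() ? "bracelet in range: allow login"
                                   : "bracelet missing: deny login");
  delay(5000);
}
```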

Confident Skills: Programming, security knowledge.

Not Confident Skills: Product design, hardware design, electrical engineering, and biometric sensors.

Idea#2: Offense

Penetration of Endpoints and Networks Infiltration System

Sketch:

Purpose: To perform a vulnerability assessment on a physical location/organization.

This project is not pragmatic at all; the only realistic use for such a project would be malicious, so this is mainly experimental and playful.

Again, the only people I could see using this would be malicious actors or a very dedicated security team.

How it would work: The wearer of this jacket could use the multitude of pen-testing tools built into it to gather a variety of information (password hashes, metadata from workstations, etc.) and send it back to their home machine.

Confident Skills: Programming, Pen testing tool knowledge.

Non-confident Skills: Component integration into a garment, sewing.

 

My idea for a wearable tech project is a mobile game with a wearable component and plushies. The intent is to build the game around dirt-cheap wearable components that can be given away practically for free.

The main summary of the game is that users find and collect animals by going to locations that have the plushies and scanning their pendants against a plushie's body to unlock that animal in their game.

This idea can be broken down into 3 components, each of which is considered a milestone:

  1. The Mobile App
    1. The game itself will be free to download and will not require any wearable tech (although most of the fun comes from owning the subsequent wearable tech products). The game features the user having a collection of pets which they send on missions throughout the day. Each mission lasts a few hours, the player receives updates on what their pet is doing (“playing in a puddle,” “swinging across a rickety bridge,” etc.), and once the pet completes its mission it earns some experience points and a small in-game reward (an apple, a trinket, etc.).
  2. The Animal Pendants
    1. Animal Pendants are wearable components of the game. Each pendant holds within it an RFID/NFC tag which contains a unique scannable number associated with its owner. The pendant is used to collect more animals. Because the hardware within the pendant is cheap ($1-2), small (around 1cm), and flexible, pendants can resemble anything ranging from necklaces to earrings to clip-on gadgets. Animal pendants are mainly stylized based on the animal designs and symbols created for the game.
  3. The Animal Plushies
    1. The animal plushies are stuffed animals that resemble animals that can be found in the game. Each plushie contains a Raspberry Pi Zero W, an RFID/NFC reader/writer, and a battery. When a player scans their pendant against a plushie, they receive that animal in their game and can use it for adventuring in the gameplay (see the sketch after this list). Each animal plushie costs around $15 in hardware (excluding the cost of fabric/stuffing), and the intent is to market these toward small businesses, museums, libraries, and other areas to draw in crowds.
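As a rough illustration of the scan-pendant-and-report flow, the sketch below assumes an Arduino-compatible board with an MFRC522 RFID reader (a common hobbyist part); the plushies as described would actually use a Raspberry Pi Zero W, and the pin choices, reader model, and reporting step here are assumptions for illustration.

```cpp
#include <SPI.h>
#include <MFRC522.h>

const int SS_PIN  = 10;   // assumed SPI chip-select pin for the reader
const int RST_PIN = 9;    // assumed reset pin

MFRC522 rfid(SS_PIN, RST_PIN);

void setup() {
  Serial.begin(9600);
  SPI.begin();
  rfid.PCD_Init();        // initialize the MFRC522 reader
}

void loop() {
  // Wait until a pendant's tag is in range and its UID can be read.
  if (!rfid.PICC_IsNewCardPresent() || !rfid.PICC_ReadCardSerial()) return;

  // Build the pendant's unique ID from the tag UID bytes.
  String pendantId = "";
  for (byte i = 0; i < rfid.uid.size; i++) {
    pendantId += String(rfid.uid.uidByte[i], HEX);
  }

  Serial.print("pendant scanned: ");
  Serial.println(pendantId);
  // At this point the plushie would report (its animal ID, pendantId) to the
  // game backend so the owner's account unlocks that animal.

  rfid.PICC_HaltA();      // stop talking to this tag until it is re-presented
}
```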

Below are some mockups for the game's UI and some ideas for the plushie designs. The two plushie ideas that people seemed to really like were a Sheep and an Otter, although close runners-up were a Corgi and "A Loaf Of Cat."

I feel fairly confident about building the mobile app, and feel as though the hardware components will be rather easy to set up once I have one working model.

I feel like I could use some improvement in setting up the networking aspect of this, as well as the plushie designs.

SRF Glove

Final Project

Author: Curt Henrichs

I propose to make a supernumerary robotic finger (SRF) with a position-detection glove. This will consist of a robotic thumb mounted near the pinky, mirroring the biological thumb's location and movement. The thumb will be constructed with 3D-printed mounting brackets and high-torque micro servos. It will be mounted to a modified wrist guard of the kind used in skateboarding in order to distribute the weight of the thumb onto the wrist. The wrist guard must not impede normal hand function. Finally, there will be a series of flex sensors mounted on a glove to detect the joint state of the wearer's fingers. These, along with an inertial measurement unit, will report the hand pose to an algorithm that controls the SRF. For this project the algorithm will be constrained to a simple hard-coded heuristic, but future work will use data-driven AI to learn the wearer's intent from their hand position.
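To make the hard-coded heuristic concrete, here is a minimal Arduino-style sketch (written for a standard Arduino board and the stock Servo library rather than the ESP32 named in the materials list) in which one flex sensor's reading directly drives one joint of the robotic thumb. The pin numbers and the straight/bent calibration values are assumptions that would need to be measured on the actual glove.

```cpp
#include <Servo.h>

const int FLEX_PIN = A0;        // assumed analog pin for one flex sensor
Servo thumbJoint;               // one joint of the robotic thumb

// Assumed raw ADC readings for a straight vs. fully bent finger;
// these must be calibrated on the real glove.
const int FLEX_STRAIGHT = 300;
const int FLEX_BENT     = 700;

void setup() {
  thumbJoint.attach(9);         // assumed servo signal pin
}

void loop() {
  int raw = analogRead(FLEX_PIN);
  // Simple hard-coded heuristic: the robotic thumb mirrors how far the finger is curled.
  int angle = map(constrain(raw, FLEX_STRAIGHT, FLEX_BENT),
                  FLEX_STRAIGHT, FLEX_BENT, 0, 120);
  thumbJoint.write(angle);
  delay(20);
}
```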

Target Audience

I am primarily developing this project for my own curiosity. Specifically, I would like to wear the device for a full day to record hand position data, record failures and inconveniences, record interactions with others and their perceptions, and explore contexts of applicability. This in turn allows me to further develop a machine learning algorithm, iterative design improvements, HCI insight, and a more general SRF usage taxonomy, respectively.

As for the eventual end-user, this technology could potentially augment any life task; however, I am mostly interested in applying it to the manufacturing and construction spaces, where the ability to do self-handovers is an important aspect of the task. An example would be screwing an object overhead while on a ladder. The constraint is that a person should keep three points of contact while holding both the object and the screwdriver. If they need to do all three, they may lean their abdomen against the ladder, which is less secure than a grasp. Instead, with several robotic fingers (or a robotic limb), the object and screwdriver could be effectively held and manipulated while grasping the ladder. Another example that should relate to this class is soldering, where the part(s), soldering iron, and solder all need to be secured. This could be overcome with an SRF thumb feeding the solder to the tip of the soldering iron while the other hand holds the parts down.

My Motivation

Academically, I am motivated by the research opportunities in this space; there are many unanswered questions, as this technology has not been popularized yet. More personally, my interest stems from a philosophy I adhere to: humans are not the end state of evolution. I am excited to construct technology that affects not only my physical appearance but also my physical capabilities.

Inspiration and Novel Proposal

As far as I am aware, within the SRF literature I am providing a modest incremental improvement. Wu & Asada worked with flex sensors, but they were only interested in the first three fingers and did not attempt to model the hand position directly [1,5]. Ariyanto, Setiawan, & Arifin focused on developing a lower-cost version of Wu & Asada's work [2]. One of Leigh & Maes' projects uses a Myo EMG sensor, which is not included in this project [3]. They also present work with modular robotic fingers, though they never explore the communication between finger and human in that piece [4]. Finally, Meraz, Sobajima, Aoyama, et al. focus on body schema, where they remap a wearer's existing thumb to the robotic thumb [6].

My project will take inspiration from Wu & Asada (along with other work using flex sensors to detect finger movement); Meraz, Sobajima, Aoyama, et al. provide the inspiration for using a thumb as the added digit; and Leigh & Maes' work on modular fingers is the inspiration for how I construct the wrist connection. The novelty is in bringing these pieces together to form a wearable with which I can run a long-term usability test, with myself as the subject.

Figure 1: Supernumerary robotic fingers found in literature. 

Design

Figure 2 displays my thoughts on how the glove will be laid out. My first experiment will be to determine whether one flex sensor is sufficient to capture the joint position of a finger. The IMU is mounted on the glove instead of the wrist in order to capture a more accurate absolute orientation of the fingers. Figure 3 shows the mounting location of the finger along with sketches of the robotic finger itself. This is inspired by Wu & Asada's work, though I am going to build mine with micro servos instead of standard-scale ones. These sketches are subject to change as I construct the glove.

Figure 2: Glove sketch with alternate approaches.

 

Figure 3: Robotic finger sketch (top) fusion 3D image, (bottom) top and bottom view of finger mounting position.

Materials

  • Electronics
    • ESP32 – Microcontroller w/ Bluetooth and Wi-Fi
    • Flex sensors
    • Micro servo motors
    • Resistive pressure sensor
    • Vibration motor
    • IMU
  • Clothing
    • Glove (Light weight, breathable)
    • Snowboarding / skateboarding arm brace

My Skills

  • Have experience in:
    • Soldering
    • Circuit design
    • Programming
  • Need to master:
    • Sewing / other soft materials knowledge and skills
    • Project management
    • 3D printing

Timeline

  • Milestone 0 – Initial Prototype (February 24)
    • Glove w/ several flex sensors.
    • Determine either approach 1 or 2
    • Detect both span between fingers and flex of a finger
    • Power supply not a concern
  • Milestone 1 – Technology shown to work (March 16)
    • Glove w/ flex sensors
    • Position of hand captured, data transmitted to PC for processing / visualization
    • IMU captures absolute orientation, data transmitted to PC for processing / visualization
    • Power supply and integration started
    • (If time) Robotic finger 3D printed
  • Milestone 2 – Technology works in wearable configuration (April 6)
    • Full integration
    • Glove w/ all flex sensors, IMU mounted
    • Wrist-brace w/ finger mounting, processor mounting
    • Power supply complete
    • (If time) Robotic finger controlled
  • Milestone 3 – Technology and final wearable fully integrated (April 20)
    • (If time) “user” study

Potential Challenges

The first major challenge I have accounted for is that one flex sensor may not be enough to determine the joint state of a finger. Thus, as I outline later, my first experiment is to test whether this is the case. The next is that I may not have time to develop all components. This would be the worst case, but if it does happen, I will prioritize the sensor glove over the fingers. While I have more to learn about 3D printing, I am not a novice, so given access and time I should be able to print the parts for a robotic finger. Finally, the algorithm to convert from hand pose to finger position will most likely be a simple gesture-based heuristic. This could mean a movement is triggered inadvertently even when that was not the intent. While that would be a failure of the algorithm, it is the aspect I am least concerned with for the term.

First Step

I have already purchased the parts I need for my first hardware experiment (though, as of writing, they are still in shipping). I need to figure out whether I can determine which of three finger joints is bending using one 4.5” flex sensor. If this is successful, I will use 5 of these to capture pose information. Otherwise, I will need to purchase ten 2” flex sensors to detect the finger position in one direction. As for the spread between fingers, I plan on using 1” flex sensors, but for the initial prototype 2” flex sensors will work. I also plan on using an IMU to determine absolute orientation with respect to the Earth. All of this will be mounted on a relatively thin winter glove that I have purchased.

I will need to sew on the flex sensors for the one finger and one spread measurement, along with a mount for the IMU. These will be connected to an Arduino Uno clone that I have, which will report the data back to my PC for visualization (a rough sketch of that reporting loop follows). The most challenging portion of this prototype is developing the code to determine position, followed by visualizing the hand state.
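As a minimal sketch of that data-reporting step, the code below reads one finger flex sensor and one spread sensor on an Arduino Uno and streams the raw values to the PC as CSV over serial. The pin assignments, baud rate, and sample rate are assumptions; the real prototype will add the IMU and more sensors.

```cpp
// First-step prototype sketch: two flex sensors streamed to a PC for visualization.

const int FINGER_FLEX_PIN = A0;  // assumed pin: flex sensor along one finger
const int SPREAD_FLEX_PIN = A1;  // assumed pin: flex sensor across the finger spread

void setup() {
  Serial.begin(115200);
  Serial.println("finger_flex,spread_flex");   // CSV header for the visualizer
}

void loop() {
  int fingerFlex = analogRead(FINGER_FLEX_PIN); // raw 0-1023 ADC reading
  int spreadFlex = analogRead(SPREAD_FLEX_PIN);
  Serial.print(fingerFlex);
  Serial.print(',');
  Serial.println(spreadFlex);
  delay(20);                                    // roughly 50 Hz sample rate
}
```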

Inspiration References

  • [1] F. Y. Wu and H. H. Asada, Implicit and Intuitive Grasp Posture Control for Wearable Robotic Fingers: A Data-Driven Method Using Partial Least Squares, IEEE Transactions on Robotics, vol. 32, no. 1, pp. 176-186, Feb. 2016.
    doi: 10.1109/TRO.2015.2506731
  • [2] M. Ariyanto, R. Ismail, J. D. Setiawan and Z. Arifin, Development of low cost supernumerary robotic fingers as an assistive device, 2017 4th International Conference on Electrical Engineering, Computer Science and Informatics (EECSI), Yogyakarta, 2017, pp. 1-6. doi: 10.1109/EECSI.2017.8239172
  • [3] Sang-won Leigh and Pattie Maes. 2016. Body Integrated Programmable Joints Interface. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI ’16). ACM, New York, NY, USA, 6053-6057. DOI: 10.1145/2858036.2858538
  • [4] S. Leigh, H. Agrawal and P. Maes, Robotic Symbionts: Interweaving Human and Machine Actions, IEEE Pervasive Computing, vol. 17, no. 2, pp. 34-43, Apr.-Jun. 2018. doi: 10.1109/MPRV.2018.022511241
  • [5] F. Y. Wu and H. H. Asada, “Hold-and-manipulate” with a single hand being assisted by wearable extra fingers, 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, 2015, pp. 6205-6212. doi: 10.1109/ICRA.2015.7140070
  • [6] Segura Meraz, N., Sobajima, M., Aoyama, T. et al. Modification of body schema by use of extra robotic thumb, Robomech J (2018) 5: 3. doi: 10.1186/s40648-018-0100-3

Project_2

Lydia Schweitzer

CONCEPT IMAGE >>>>

 

RELATED IMAGES/INFO >>>>

interesting link

another interesting link

DESCRIPTION >>>>

An NIRS- or EEG-based hat that gathers brain responses and changes color / produces a visualization.

A different way of communicating.

People can see elements of the wearer's emotional/mental state.

For creative, poetic, experimental output and research purposes.

Future research: collect data over time for the mental health field.
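As a very rough sketch of the color-changing idea (the sensing and signal-processing side is left out), the code below maps a single placeholder "mental state" value onto an addressable LED strip using the Adafruit NeoPixel library. The pin, LED count, the use of an analog input as a stand-in for the processed NIRS/EEG feature, and the calm-to-excited blue-to-red mapping are all assumptions for illustration.

```cpp
#include <Adafruit_NeoPixel.h>

const int LED_PIN  = 6;     // assumed data pin for the strip sewn into the hat
const int NUM_LEDS = 24;    // assumed number of LEDs

Adafruit_NeoPixel strip(NUM_LEDS, LED_PIN, NEO_GRB + NEO_KHZ800);

void setup() {
  strip.begin();
  strip.show();             // start with all LEDs off
}

void loop() {
  // Placeholder: an analog input stands in for a processed NIRS/EEG feature.
  int signal = analogRead(A0);
  int level  = map(signal, 0, 1023, 0, 255);

  // Calm -> blue, excited -> red, blended across the whole hat.
  for (int i = 0; i < NUM_LEDS; i++) {
    strip.setPixelColor(i, strip.Color(level, 0, 255 - level));
  }
  strip.show();
  delay(50);
}
```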

PROJECT ASPECTS >>>>

Confident:

  • Art experience
  • Some visualization experience
  • Music experience
  • Research on emotion/visualization material within health and psychology field

Not so Confident

  • Hardware, software
  • Data analysis

Project_1

Lydia Schweitzer

CONCEPT IMAGE >>>>

RELATED IMAGES/INFO >>>>

interesting link

DESCRIPTION >>>>

Gloves or a conducting stick that sense sound and movement, connecting to a screen to generate a visualization.

Creates the ability to provide a visual component in response to the environment, accounting for human interpretation and creativity.

Could be used at concerts, performances, conducting an orchestra, DJing.

For creative, poetic, experimental output.

PROJECT ASPECTS >>>>

Confident:

  • Art experience
  • Some visualization experience
  • Music experience

Not so Confident

  • Hardware
  • Data analysis