Final Project Post – InGlove

Project Title: InGlove

Project Team:  Vedant Agrawal and Shruti Nambiar

Project One Liner: Smart glove that helps the user control their TV and smart switches/lights using hand gestures

Video: 

Poster:

Describe what your project does and how it works

This is a smart glove that lets the user control smart (WiFi-connected) devices and infrared devices (TVs, music receivers) in their home. The glove has fabric flex sensors integrated onto the fingers that allow the Particle Photon microcontroller to recognize when the user flexes their fingers to make specific hand gestures. For IR devices, the microcontroller sends out the IR signal using an IR LED transmitter circuit integrated onto the glove. For smart devices, the microcontroller publishes an event to the Spark cloud over WiFi. IFTTT, a web-based app gateway service, then recognizes the published event and tells the smart device's app to execute the command corresponding to the gesture.

The glove targets users with physical disabilities who find it difficult to move around the house to control electronic devices, as well as users with speech disabilities who cannot necessarily communicate with their Google Home/Alexa to control their smart devices.

Describe your overall feelings on your project. Are you pleased, disappointed, etc.?

We started out with the goal of building a gesture-based alternative to the Google Assistant. We are happy that we did indeed build a prototype that accepts gesture-based input to control smart home devices. However, there was quite a bit of room for improvement. We spent a large part of our time on coding and the technical aspects, and less on assembly and on the design integration of the circuits into our glove.

Describe how well did your project meet your original project description and goals.

Our original goal was to build a gesture-based home automation glove. I believe we did a good job of meeting our initial goals, though more as a proof of concept than as a finished product. We showed that it is very much possible to connect wearable fabrics to smart devices over WiFi and control several devices at a time using gestures. However, there was room for improvement in the consistency of our results. We had a tough time with our IR circuit, which only worked intermittently. We could also have done a better job of integrating these components into the fabric.

Describe the largest hurdles you encountered.  How did you overcome these challenges?

One of the earliest challenges we faced was the flex sensors themselves. We first bought commercial long and short flex sensors from Adafruit. However, the commercial flex sensors cost $12 apiece and were made of a stiff, plastic-like material, so it was difficult to tack them down to the fabric and have them follow its movement. We then figured out how to build our own DIY flex sensors with Velostat and electrical tape, which cost only a few cents apiece. However, the electrical tape did not adhere well to the material of our glove either. After discussing with Marianne, we figured we could try building flex sensors with fabric, and it worked!

Once we did this, our next challenge was getting the flex sensor to communicate with the smart device. We used a Particle Photon as our microcontroller, and there was a small learning curve to understanding how it worked. Once we figured that out, we integrated it with the smart devices using a gateway service called IFTTT. However, since IFTTT is a free platform catering to a large market, service is only guaranteed about once per minute. This meant that when the Particle Photon published an event to the Spark cloud after a finger flex, IFTTT would only check for that event once per minute, causing a large delay between the finger flex and the smart light turning on. We tried to work around this by building our own app that could read from the Spark cloud on the Particle servers, but the Android API integration was not well documented and support from online communities was sparse, so we decided to stay with IFTTT.
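
One alternative we explored was reading the Particle event stream directly instead of waiting for IFTTT's polling. Below is a minimal sketch of that idea in Python, assuming a valid Particle access token; the event name "gesture" and the forwarding step are placeholders, not our actual implementation.

```python
# Subscribe to the Particle (Spark) cloud event stream over HTTP server-sent
# events, avoiding IFTTT's roughly once-per-minute polling.
import json
import requests

ACCESS_TOKEN = "your-particle-access-token"  # placeholder
URL = ("https://api.particle.io/v1/devices/events/gesture"
       f"?access_token={ACCESS_TOKEN}")

with requests.get(URL, stream=True) as resp:
    resp.raise_for_status()
    for raw in resp.iter_lines(decode_unicode=True):
        # SSE payload lines are prefixed with "data:"
        if raw and raw.startswith("data:"):
            event = json.loads(raw[len("data:"):])
            print("gesture event:", event.get("data"))
            # ...forward the command to the smart-device app here...
```

Even with a direct stream, one still needs a bridge to the smart switch's own API, which is the part we could not find well documented.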

Another challenge we faced throughout the project was hardware debugging. Since neither of our backgrounds included a strong base in embedded systems and electronics, we had a somewhat tough time with it. Integrating and assembling all of the components onto the glove was also quite challenging, since we had to ensure that there were no cross connections or short circuits and that all the wires were tacked down to make consistent contact. Over the course of the semester, however, we became better at hardware debugging and began to understand what to look for and how to diagnose circuit problems.

Describe what would you do next if you had more time

Since neither of us had experience with app development, we spent a lot of time trying to figure out how to make an app that connects to the Spark cloud to read events published by the Particle and then connects to the smart switch/light app that controls the smart device. We found that those two functions had each been done independently, but not in the same app (except by IFTTT). Thus, we would use the extra time to develop such an app for this glove, reducing the delay between the finger flex and the smart light turning on/off.
We would also like to integrate more flex sensors to capture a wider range of gestures, and thus more commands. Additionally, we would use that time to better integrate the circuits we have into the glove, making it comfortable yet functional.

Materials Used:

  1. 1 x Thin Glove
  2. 1 x Particle Photon
  3. 3 x DIY Fabric Flex Sensors:
    i. Velostat – Force-sensitive fabric
    ii. Conductive Thread
    iii. Copper Tape
    iv. Pieces of thin cotton fabric
  4. 1 x Infrared LED
  5. General electrical components (wires, resistors, general-purpose transistor)

Final Project Post

Project Title: Táltos-oid (Formerly posted under InGlove)

Project Team: Curt Henrichs

Sentence Description:

Human augmentation device providing users with an extra thumb for everyday tasks.

Video:

Poster Image:

The following image is the poster presented at the Wearable Technology showcase.

Project Description:

The long-term goal of this project is to experiment with augmenting the human body. Specifically, I am interested in understanding how a supernumerary robotic finger interacts with the user to complete the typical tasks of everyday life (in essence, what it means to live with this device) in order to better understand the design challenges and applications in a generalized manner. This is motivated by a gap in the nascent literature on supernumerary robotic limbs, which tends to take a narrow application focus rather than exploring these broad application questions.

To achieve this goal, the supernumerary robotic finger must be a functional appendage, meaning a well-articulated finger and an intuitive interface to command it. I consider a well-articulated finger to be one that provides a sufficient range of motion; a user should be able to understand the appendage as a finger or thumb, thereby building on existing grasping knowledge. As for intuitive control, the finger must capture the user's intentions in a manner that facilitates the task being augmented. Thus the intention signal-to-noise ratio must be high enough to be actionable, yet capturing it must not burden the user.

Táltos-oid is designed to address the outlined goal by providing a four-degree-of-freedom robotic finger and a flex sensor glove that supplies gesture data to control the finger. Táltos-oid is designed from a function-first perspective, though the aesthetics of the design should still be reasonably thought out. In the following paragraphs I will discuss the robotic finger, flex sensor glove, and data-processing subsystems.

The robotic finger is composed of custom-designed 3D printed plastic components actuated by four high-torque micro-servos. The rotational joints roughly map to the articulation of a human thumb, with greater range of motion where the micro-servos make it feasible. A soft interface layer was constructed with neoprene and leather to provide comfortable mounting of the finger to the human hand. The design of the interface is inspired by wrist and thumb braces, with a 3D shape held by the plastic and leather. Furthermore, the interface is adjustable with two Velcro straps, providing some invariance to hand size. Finally, control of the servo motors is handled through an I2C servo driver.

For the flex sensor glove, I started by miniaturizing and iterating on the DIY flex sensors documented by Plusea. Specifically, I introduced a node in the center of the flex sensor to provide voltage, thereby allowing two joints to be captured as parallel resistors. I sewed on the flex sensors using a manikin hand to support the material. I then used silver-nylon conductive thread and Bare Conductive electronic ink to connect the flex sensors to wires coming from the microcontroller (through a wire-to-thread interface). The wire-to-thread interface is composed of either a 3-pin or 2-pin male header with conductive thread tied and wrapped around the long end. I applied conductive ink to the threads to secure them. After the ink dries, the long ends of the pins are pushed through a piece of electrical tape, thereby sandwiching the conductive thread between the electrical tape and the plastic spacer of the header. I then apply a bit of hot glue to secure the interface. This interface is then sewn into the glove, the wires attached, and more hot glue applied to secure the assembly. Finally, to read the flex sensor signal, the wires are connected to an analog mux and 3.3V. When the microcontroller selects a channel, that sensor is placed in series with a resistor to ground, creating a voltage divider that the ESP32's ADC can read. For the custom flex sensors, I found 1K Ohms to be a reasonable resistor value. If using a commercial flex sensor this may change (e.g., the Adafruit 4.5″ flex sensor works well with 10K Ohms).
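
To make the divider concrete, here is a back-of-the-envelope sketch in Python of recovering the sensor's resistance from an ADC reading. The values are illustrative, and it assumes the ESP32 ADC maps 0-3.3V linearly onto a 12-bit count, which glosses over its real non-linearity.

```python
# Voltage divider: 3.3V -> flex sensor -> ADC node -> 1K resistor -> GND.
# The ADC measures the voltage across the fixed resistor.
VCC = 3.3
R_FIXED = 1_000.0   # 1K for the DIY sensors; ~10K suits the Adafruit 4.5" one
ADC_MAX = 4095      # 12-bit reading

def flex_resistance(adc_count: int) -> float:
    """Estimate the flex sensor's resistance from a raw ADC count."""
    v_out = VCC * max(adc_count, 1) / ADC_MAX   # volts across fixed resistor
    return R_FIXED * (VCC - v_out) / v_out      # divider solved for R_flex

# A mid-scale reading implies the sensor resistance equals the fixed resistor:
print(round(flex_resistance(2048)))  # ~1000 ohms
```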

The microcontroller and data-processing subsystem is the last major component of this project. It consists of an ESP32 microcontroller and a PC connected through Bluetooth. The ESP32 was selected because it provides Bluetooth onboard along with WiFi (the WiFi is unused in this project but could be used in the future). The ESP32 connects to the I2C servo driver and the analog mux as noted previously. The firmware is built around a serial JSON API that pushes current-state updates to the connected PC and provides polling functions for sensor data. The JSON API also provides a command to set the joint states of the finger.
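
A minimal sketch of what the PC side of such a serial JSON API can look like is below (using pyserial); the field names "cmd", "joints", and "sensors" are hypothetical stand-ins, not the actual firmware schema.

```python
# Read pushed state updates (one JSON object per line) and send a joint
# command back over the same serial link.
import json
import serial  # pyserial

port = serial.Serial("/dev/ttyUSB0", 115200, timeout=1.0)

def set_joints(angles):
    """Command the four finger joints (degrees)."""
    msg = {"cmd": "set_joints", "joints": angles}
    port.write((json.dumps(msg) + "\n").encode())

set_joints([0, 0, 0, 0])  # move to a rest pose
while True:
    line = port.readline().decode(errors="ignore").strip()
    if not line:
        continue  # read timed out with no pushed state
    state = json.loads(line)
    print("flex sensors:", state.get("sensors"))
```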

On the PC I wrote several Python scripts to assist in development and in operating the machine learning model that controls the finger. The first script worth noting is a training script, which provides a command-line interface to set the joint state and capture sensor and joint data to a CSV file. The second is the model generation script, which takes in the CSV file and outputs a trained model, selected by the user, as a Python pickle file. Finally, there is a multi-threaded model runner script that captures the pushed state from the serial port, calculates joint values from the trained model, and commands the finger joints through the serial JSON API.

For the machine learning models, I worked with linear regression, AdaBoost random forest regression, and a neural network. The linear regression and random forest approaches each consisted of four models, one trained per joint, whereas the neural network was trained to generate all joints in one model. All of these models are trained on the supervised problem of mapping raw sensor data to joint states. Additionally, I explored a couple of pre- and post-processing steps on this data. First I applied PCA to the sensor data and compared the model trained on PCA features against the raw-data model. With no noticeable difference, I opted for the raw-data model, if only for simplicity. Then I applied a moving average to the sensor data, again comparing against the default case. I found the moving average helped the random forest and linear regression approaches, but the neural network did not benefit from it. Then I tried post-processing the joint values to prevent some of the jitter produced by the models (except the neural network, which was already fairly stable); a moving average on the joint values did benefit the random forest and linear regression approaches. Finally, I tried both a sensor and a joint moving average, which made for a stable but lagged response from the SRF. After seeing the results, I concluded that the neural network was the best approach. In terms of implementation, I used Python's scikit-learn library to develop these models.
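
The sketch below illustrates this comparison with scikit-learn. The data files, array shapes, window size, and hyperparameters are placeholders for illustration, not the values actually used.

```python
import pickle
import numpy as np
from sklearn.ensemble import AdaBoostRegressor, RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

def moving_average(data, window=5):
    """Smooth each sensor channel with a simple moving average."""
    kernel = np.ones(window) / window
    return np.apply_along_axis(
        lambda c: np.convolve(c, kernel, mode="same"), 0, data)

X = moving_average(np.load("sensors.npy"))  # (N, n_sensors) raw flex readings
Y = np.load("joints.npy")                   # (N, 4) joint angles

# Linear regression and AdaBoost random forests: one model per joint.
linear = [LinearRegression().fit(X, Y[:, j]) for j in range(Y.shape[1])]
forest = [AdaBoostRegressor(RandomForestRegressor(n_estimators=20),
                            n_estimators=10).fit(X, Y[:, j])
          for j in range(Y.shape[1])]

# Neural network: one model predicts all four joints at once.
nn = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000).fit(X, Y)

sample = X[-1:]
print("linear :", [m.predict(sample)[0] for m in linear])
print("forest :", [m.predict(sample)[0] for m in forest])
print("network:", nn.predict(sample)[0])

with open("model.pkl", "wb") as f:   # the runner script loads this pickle
    pickle.dump(nn, f)
```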

Given the description of what Táltos-oid is, the next topic is how to wear and use it. While it is technically a modular system, with the glove, finger, and microcontroller connected by wires that mate with header pins, I have found it far easier to keep the connections intact and simply put the components on in the following order: glove -> microcontroller wrist-strap -> robotic finger. After putting on the device, connect the two USB power cables to the USB ports on the microcontroller wrist-strap. When powered up, the finger holds a predefined resting position. The user can then start the model runner script with the neural network model and begin grasping objects. The model was trained such that a fully closed hand causes the SRF to rest its tip on the fingers, mirroring a typical biological thumb pose. When the hand is fully open, the model stretches out the finger, assuming it is grasping a large object. Between these extremes the finger moves to anticipate the size of the object being grasped (even if no object is actually being grasped, as there is no way to discern that state).

Feelings toward Project:

I am very pleased with the way the robotic finger turned out, generally pleased with the microcontroller subsystem, and a bit disappointed with the glove. While I will cover these in the successes and challenges sections, it is worth noting that the disappointment with the glove is due to a failure of my design and implementation, not of the concept behind it. Specifically, the glove failed during the showcase event, which left no time in the term to correct it. Thus, I feel the project is not the best that it could be. From a different perspective, though, I am glad that the glove failed, as I can now learn from this failure with the intention of not making the same design decisions in the future (regardless of project).

Success:

Though the glove did fail, as mentioned, I was able to construct a simple replacement that demonstrated some of the capabilities of the project. Recall that my long-term goal is to develop a human augmentation device for everyday tasks and to understand the challenges, design considerations, etc. that the space offers. To that end, I would consider this a good step in that direction. The technical platform of the finger is a good first iteration; the microcontroller system works well enough for the task at hand; and the simplified glove made for the final demo illustrates the intuitive control. The challenges I faced during the design are also worth capturing, as they are part of the journey to answering my questions.

If I had to give a quantitative score against my expectations, I would give an 8.5 out of 10. The missing piece is a user study to start asking the questions outlined.

Challenges:

Like any ambitious project, one does not know ahead of time all that will be encountered. While this can present challenges, it also affords the opportunity to pivot and prove out the concepts one brings into the project. My project was no exception, with pivots on several key decisions, listed below in roughly chronological order.

The first challenge encountered was that the flex sensors available for purchase are built on a flexible PCB that restricts movement and forces the glove into a shape instead of adapting to the wearer's hand. An InGlove team member (when I was on that team) found Plusea's work on a flex sensor that is soft and conforms to the body.

The second challenge was redesigning the robotic finger, as it initially had three joints instead of the current four. While three joints are sufficient for grasping most smaller objects, they are insufficient for grasping large or awkward objects that need support below the palm of the wearer. Fortunately, I designed the robotic finger in anticipation of such changes (by using a modular 3D printed component system). As an aside, there is also a piece of sanded-down Command hook providing the necessary angle between two flex sensors; this could easily be replaced by a 3D printed component.

The third challenge was the interaction between Bluetooth and the ADC readings. The ESP32 sells itself as having two ADCs onboard, giving approximately half of the pins ADC capability. However, when using the radio (either Bluetooth or WiFi), ADC 2 cannot be used for external sensing. I also found issues with ADC 1 when using the radio, though it generally worked. To handle this challenge I used an analog mux that I had purchased on a whim for a different project.

The fourth challenge and pivot was using an I2C servo driver for the PWM signals to the micro-servos instead of direct control by the ESP32. While working with the ESP32, I found that the PWM signal would dip in voltage when the servos were moving. Using a second power source helps with this (and I did implement it), but I also went a step further and purchased a driver board that offloads the control task from the microcontroller. I have tested with a single battery and the system still functions correctly, so this change was not pointless.

The current failures that I am facing (and will address) are torque limitations at the robotic fingertip and the need for a new glove approach.

The torque issue is a consequence of directly driving the joint from the servo. One approach is gearing: if I reduced the joint's range to 90 degrees, I could double the torque output. Another approach would be to purchase micro-servos with an even higher torque rating (though this would be more expensive than adding two gears). For this version of the finger, I think the torque is sufficient to express the concept.
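
The trade-off is just the gear ratio; a quick illustrative calculation (the stall torque figure is a placeholder, not the actual servo spec):

```python
# Trading range of motion for torque with a gear stage: halving the joint's
# range (180 -> 90 degrees) gives a 2:1 ratio and doubles joint torque.
SERVO_TORQUE_KG_CM = 2.0   # hypothetical micro-servo stall torque
SERVO_RANGE_DEG = 180

for joint_range in (180, 90):
    ratio = SERVO_RANGE_DEG / joint_range
    print(f"{joint_range:3d} deg joint range -> {ratio:.0f}:1 gearing, "
          f"{SERVO_TORQUE_KG_CM * ratio:.1f} kg*cm at the joint")
```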

Designing a new glove approach means more exploration of technologies, implementation strategies, and testing. As I mentioned before, I am glad the glove failed, as it instructs me on the next steps I need to take to improve the design. So far I understand the issues with the glove to be:

  • Megaohm-scale resistance between the flex sensors and the wire terminals.
  • Some of the terminals failed outright.
  • Some of the conductive threads, while still attached, seem to have increased in resistivity (perhaps from friction and skin contact).
  • The conductive ink used as a conductive glue turned out to be water soluble, which is not great if the user's sweat can break down the glove.

The last challenge is more of a hypothesis. Context and environment awareness matter if the machine learning algorithm is to make sense of grasp intention and the resulting joint states. A user can grasp different objects that look nearly identical at the level of the flex sensors yet require different sets of joint states from the robotic finger. This made machine learning challenging, as seemingly valid data would cause performance to drop drastically.

Next Steps:

If the course were extended another week or two, I would rework the broken glove to get a fully functional demo for the blog. Additionally, I would try to run a mini UX study with convenience sampling from the class.

In the shorter term, I would rework my approach to the glove / finger tracking with either a new interface design, COTS flex sensors, or IMUs on a flex PCB. I would also like to explore haptic feedback signaling the joint configuration to the user.

Further into the future, I would like to explore environmental context with 3D sensing and other interface modalities to gather intention. With 3D sensing the finger can make grasp adjustments, avoid collisions with non-target objects, and start to make predictions on workflow. With respect to other interface modalities I would be interested in EMG instead of flex sensors to capture the gesture intention. I also would be interested in a brain based approach like EEG or fNIRS (perhaps to augment the finger control captured by other means).

Finally, I want/need to present this project in an experimental setting in order to evaluate the research goals I laid out. I am currently thinking about a day-long autobiographical study, user experience studies, and field studies (in manufacturing, health care, etc.).

Materials List:

  1. Gloves [Prototype and Main] – Free (N/A) and $14.99 (N/A)
  2. ESP32 Dev Board – $15.00 (1)
  3. High Torque Micro Servo – $9.95 (4)
  4. Flex sensor – 4.5 inches – $12.95 (2)
  5. 3D printed SRF – ~$14.00 (N/A)
  6. PCBs – N/A (1)
  7. I2C Servo Driver Board – $14.95 (1)
  8. Analog Mux – $5.50 (1)
  9. Velostat – Free (N/A)
  10. Electrical Tape – N/A (N/A)
  11. Wires, Resistors, etc. – Free (N/A)

InGlove – Post #7

Curt, Shruthi, Vedant

Project Sentence

Home Automation

Smart glove that helps the user control their TV and smart switches/lights using hand gestures

Taltos-oid (SRF)

Human augmentation device providing users with an extra thumb for everyday tasks.

Weekly Accomplishments

Curt –

[Note for showcase: I will need power for my laptop / backup for SRF power supply]

First major accomplishment this last week was putting in the time to construct the flex sensors for the glove. My current status with this task is that all of the flex sensors are constructed, tested, and sewn onto the glove.

Pinky and ring finger lower flex sensors are completely sewn and have been tested with my Analog Discovery for a valid flex signal.

I then moved on to sewing the conductive thread that connects the flex sensors to the wires that will lead to the microcontroller. After several approaches / experiments, I found a relatively simple solution for interfacing between the wires (with female headers) and the conductive fabric. My current status is that the first one has been sewn on and tested. I plan on completing the rest this week by end of day Wednesday.

On a separate thread of work, I have been developing the firmware for the SRF. Currently I am debugging the firmware as I continue the physical development efforts. Additionally, I have experimented with the AdaBoost random forest approach to determining joint states. I still need to write a program that runs on my laptop and communicates with the firmware level that I have written; to that end, I defined and implemented a JSON interface in the firmware.

Finally, on Thursday last week I received my final shipment of parts, which included the fourth servo needed to rework the finger. After reworking the design several times (as documented in the pictures below), I came up with a solution that works mechanically.

Added a rotation joint at the bottom. The finger collides with the hand when rotating, and finger movement is restricted as it sits in an uncomfortable spot in the space around the hand.

Rotation joint in the middle. This allows better movement in 3D space, but the rotation joint is too close to the hand, causing collisions with the protrusions from the finger.

Changed the angle the finger sits at to prevent rotation collisions and make it feel more natural in the space around the hand. The problem is that this makes for a larger finger that extends further from the hand.

Also, the piece I built is made of broken 3D printed hinges, copious hot glue, and part of a Command hook taken from my wall. It looks fine from several feet away but is not as aesthetically pleasing close up.

There are several issues with this final version of the robotic finger; namely, the grip torque is further reduced, the size/bulk is increased, and there is an abundance of hot glue. While I acknowledge these issues as failures of the current design, they should nonetheless inform future iterations.

Remaining Tasks

  • Construct wire to conductive thread adapters (4x)
  • Sew on conductive thread adapters
  • Debug / verify correct operation of flex sensor circuit
  • Build microcontroller wrist-strap with velcro
  • Build microcontroller board using solder protoboard
  • Debug / verify correct operation of firmware for both finger and glove
  • Write PC program for controlling finger joint states from flex data. Requires collecting and training ML data
  • [Optionally] Get either Bluetooth serial or WiFi websocket for wireless operation
  • Poster design and language complete

Shruthi –

This week we spent time discussing and developing ideas for the poster and its contents. I made an initial draft and am working on improving it. We also made a few more fabric flex sensors and sewed them on, and planned the layout of the circuitry; the idea is to have no crossovers or short circuits. Another challenge we have run into is figuring out the best way to power the Particle Photon. We could use a power bank, but we aren't sure how that would affect the overall user experience and portability. We may also consider sewing on a LiPo battery connected to the Photon's VIN pin. We also discussed with Kevin how best to demo the functionality; I am looking at pulling the data from the Particle website using either a browser emulator or JavaScript, and working toward building a local server to do this. Here is a photograph of the sewn-on flex sensors.

Vedant –

This week we worked on sewing the DIY fabric sensors onto the glove, on the poster, and on finalizing the code for the IR transmitter. I figured out how to convert the IR hex codes I had into raw codes, which are then played back as pulse/delay pairs in a for loop over the raw code list. I was able to get the IR LED connected to the Photon to emit IR signals that turn my TV on/off and increase the volume.
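
For reference, here is a hedged Python sketch of the hex-to-raw conversion, assuming the TV uses the common NEC protocol; other protocols use different timings, and some remotes send bits LSB-first. The example code 0x20DF10EF is a commonly cited LG power code, used here only for illustration.

```python
# Expand a 32-bit NEC-style hex code into raw mark/space timings (in
# microseconds) that the Photon can replay as a pulse/delay loop.
HDR_MARK, HDR_SPACE = 9000, 4500
BIT_MARK, ZERO_SPACE, ONE_SPACE = 562, 562, 1687

def nec_to_raw(code: int) -> list:
    raw = [HDR_MARK, HDR_SPACE]
    for i in range(31, -1, -1):      # MSB-first here; flip for LSB-first remotes
        bit = (code >> i) & 1
        raw += [BIT_MARK, ONE_SPACE if bit else ZERO_SPACE]
    raw.append(BIT_MARK)             # trailing stop mark
    return raw

print(nec_to_raw(0x20DF10EF)[:8])    # first few mark/space pairs
```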

Material List

Home Assistant Sub-Project

  1. Particle Photon – $19.00 (1)
  2. Flex sensor – 4.5 inches – $12.95 (1)
  3. Flex sensor – 2.5 inches – $7.95 (1)
  4. IR LEDs

*We decided to go ahead with more of the DIY flex sensors, so we might need more Velostat, copper sheets, and conductive thread.

SRF Sub-Project

Already purchased / owned

  1. Glove for prototype [final version subject to change based on prototype]
  2. Sparkfun IMU – $14.95 (1)
  3. Flex Sensor – 4.5 inches  – $15.95 (1)
  4. ESP32 Dev Board – $15.00 (1)
  5. High Torque Micro Servo – $9.95 (3)
  6. Resistive Force Sensor – $7.00 (1)
  7. Flex sensor – 4.5 inches – $12.95 (1)
  8. 3D printed  SRF – ~$14.00 (N/A)
  9. Glove [Final design] – $14.99 (1 ordered)
  10. PCBs – N/A (N/A ordered)
  11. High Torque Micro Servo – $9.95 (1)
  12. 3.3V to 5.0V Level Shifter IC – ~$1.00 (1)
  13. I2C Servo Driver Board [in case issue with microcontroller persists] – $14.95 (1)

Need to Purchase / Being Shipped

  1. Resistive Force Sensor – $7.00 (5)
  2. Flex sensor – 4.5 inches – $12.95 (3) and/or Flex Sensor 2.5 inches – $7.95 (8)

Areas of Concern

Curt –

Primary concern for this week is finishing the sewing of conductive thread on the glove, wiring the microcontroller with all components, and writing/debugging the software for the project. There is plenty of work to do and not much time.

Secondary concern is the poster which I will probably have time to work on Tuesday evening / night. Again not much time to complete these items.

Shruthi –

A few areas of concern at this stage are how best to integrate the circuit and battery into the fabric, keeping in mind the aesthetic and portability requirements.

Vedant  – 

Areas of concern as of now include getting the code for the flex sensor integrated with the IR emitter and making sure we are able to sew the circuit onto the glove in time.

InGlove – Post #6

InGlove

Curt, Vedant, Shruthi

Project Sentence

Home Automation

Smart glove that helps the user control their TV and smart switches/lights using hand gestures

SRF

Human augmentation device providing users with an extra thumb for everyday tasks.

Weekly Accomplishments

Curt –

This past week I worked primarily on constructing a stable brace for the SRF finger. The brace now has a leather support, which reduces the elastic effect of the neoprene. Additionally, I sewed several tucks into the brace's neoprene in order to have it better conform to the hand.

I have received the glove liner from Amazon (though later than expected). This next Monday I will be focusing all of my effort on constructing / applying all of the flex sensors.

Using fake data, I created several ML Python scripts in preparation for the flex sensors. The current plan is to have the ESP32 connect to a laptop via Bluetooth in order to run the ML solution for the showcase, unless the model turns out to be computationally cheap enough to run on the ESP32 itself (on either the main or secondary core).

Working with the servos, I have noticed the torque could be improved, which is something I somewhat anticipated. I am not going to address it in this iteration of the project, though the solution would be to add a custom gearbox (perhaps a worm gear?) to the output of the servo.

As for my power system, I have a USB battery pack that I tested with the servos; it seems to handle the current draw of all three. It has two USB ports, so it is possible to power both the controller and the servos from it; however, I may purchase a secondary supply for the controller in order to isolate it from the voltage sag caused by the servos.

For the poster I have started to lay out the information and determine which pictures I want to include. I need to find someone to assist me, either as a hand model or to take the pictures.

Finally, I have found a name for my project that captures the essence of this step in embracing my transhuman philosophy. While I will still use the term SRF to describe it, the device shall be labeled Táltos-oid. The Táltos comes from Hungarian mythology and is a shaman who has innate supernatural power and can be identified by several possible abnormalities; the one I am drawing upon is that a Táltos can be identified by an extra finger. What I like about this name is that the device, poetically, allows me to take a step toward transcending my humanity.

Vedant –

This week I worked on looking up how to implement an IR transmitter using the Particle Photon. I was able to build the circuit for doing so, but had some trouble with the code, as there were multiple errors in the IR codes.

Additionally, we sewed one of the DIY flex sensors onto a glove for the in-class demonstration and were able to successfully turn on a smart light with the flex of a finger (with a 20-second delay). However, we had to look into a DIY sensor that used fabric instead of electrical tape in order to integrate it into the glove well, and we were able to get it to work successfully.

DIY Flex Sensor integrated into glove:

IR Circuit :

DIY Fabric Flex Sensor

Shruthi –

This week I worked mainly on the physical aspects of the glove and assembling the pieces together. We initially sewed on a flex sensor we had made out of Velostat and tape; however, it did not integrate well with the fabric of the glove, and the sensor often shifted out of place, reducing the consistency of our output. After talking to the professor, we created a similar sensor using fabric as the insulator instead of tape. Although the readings changed with much smaller granularity, it sewed on really well and was a much better aesthetic choice.

I also kept trying to build the Android app; however, the REST API error persisted and the app did not work. We may fall back on IFTTT to the extent possible.

Material List

Home Assistant Sub-Project

  1. Particle Photon – $19.00 (1)
  2. Flex sensor – 4.5 inches – $12.95 (1)
  3. Flex sensor – 2.5 inches – $7.95 (1)

*We decided to go ahead with more of the DIY flex sensors, so we might need more Velostat, copper sheets, and conductive thread.

SRF Sub-Project

Already purchased / owned

  1. Glove for prototype [final version subject to change based on prototype]
  2. Sparkfun IMU – $14.95 (1)
  3. Flex Sensor – 4.5 inches  – $15.95 (1)
  4. ESP32 Dev Board – $15.00 (1)
  5. High Torque Micro Servo – $9.95 (3)
  6. Resistive Force Sensor – $7.00 (1)
  7. Flex sensor – 4.5 inches – $12.95 (1)
  8. 3D printed  SRF – ~$9.00 (N/A)
  9. Glove [Final design] – $14.99 (1 ordered)
  10. PCBs – N/A (N/A ordered)

Need to Purchase / Being Shipped

  1. Resistive Force Sensor – $7.00 (5)
  2. Flex sensor – 4.5 inches – $12.95 (3) and/or Flex Sensor 2.5 inches – $7.95 (8)
  3. [Expected 4-18-19] High Torque Micro Servo – $9.95 (1)
  4. [Expected 4-18-19] 3.3V to 5.0V Level Shifter IC – ~$1.00 (1)
  5. [Expected 4-18-19] I2C Servo Driver Board [in case issue with microcontroller persists] – $14.95 (1)
  6. [Submitted request 4-13-19] Updated 3D printed SRF components – $5.00 (N/A)

Areas of Concern

Curt –

Main concern is getting the flex sensors and algorithm done on time. I will be focusing exclusively on the flex sensors Monday and Wednesday so that I can make final adjustments the week after. Secondary concern (though very much related) is time. I have parts in shipment from Adafruit due Thursday to fix the lack of the rotational joint in the finger. Also I need to receive the 3D printed parts for this upgrade.

Vedant –

My main area of concern right now is successfully getting the Particle to send the IR signal I want it to send. I am still working on fixing the bugs in the code for the Particle, but any help on that end would be great.

Shruthi –

InGlove – Post 5

InGlove

Curt, Shruthi, Vedant

Weekly Accomplishments

Curt –

I am going to present this in a different tone relative to previous posts that I have written. Specifically, I am going to allow more digressions / stream-of-consciousness comments in order to capture some of the context of last week (hopefully for entertainment value). If you don't care for the longer version, I have included a summary.

(Summary)

  • 3D printed all prototype finger parts
  • Constructed first version of the finger. Noting all construction issues.
  • Developing electronics of the finger (and struggling with a servo jitter issue along with sporadic movements)
  • Figured out the erratic servo issues are due to the ESP32 being a 3.3V microcontroller that has trouble controlling 5V servos once the Vcc voltage drops (when the motors start moving). The current solution is to use a different 5V microcontroller until I get a 5V level shifter. The servos will also be driven off a separate USB power supply.
  • Programmed a servo driver interface to determine appropriate starting joint values and joint limits. This will hopefully be shown in the demo.
  • Wore the finger to understand how it feels (weird).
  • Ordered a glove that I like (also some through-hole PCBs for soldering later in the project).
  • Goal for next week is to crank out the flex sensors and integrate everything together.

(Long Version)

My work this week focused on finishing the prototype robotic finger and its mounting in order to start wearing the device and thereby gather insight into improvements for future versions. Below are several pictures from early in the week as I started exploring the integration process. During this time I was also printing parts at both the 3D printer on the second floor and the 3D printer at the College Library InfoLab.

After picking up the 3D parts on Friday (during the much-needed break I had on the day of the UIST deadline), I was able to get started constructing the full finger. More specifically, I started construction after a quick break to decompress from submitting the paper I was coauthoring (with a night's deficit of sleep and a fresh dose of caffeine mixing with the adrenaline already keeping me awake). I don't have a picture of this version, as it never took complete form; instead I found I had to modify the components to get the individual pieces to work together. Unfortunately this meant going through all of the extra hinges I had printed (in anticipation of this). Furthermore, I ended up cracking and warping most pieces (pretty sure this was not the sleep deprivation, just poor part design). In the end, I did manage to create a finger that came together as a complete piece (along with plenty of notes on what to change in future versions). Below is a picture of the finger in all its glory (note it is actually a later picture, taken once it started working).

On Saturday I resolved to complete the finger for the demo on Monday. In comparison to my induced manic state on Friday (solely a response to the unhealthy cocktail of stress, sleep deprivation, and caffeine), I was less productive. I was able to wire up the finger and start programmatic control of the servos. During this time I found an issue with using the ESP32 as my microcontroller: the ESP32 is a 3.3V device, whereas the servos are ~5.5V. Ideally this is not a problem, since 3.3V should be just high enough to register as logic high, but once a servo starts to move (thereby drawing current), the voltage rail sags, causing the IO voltage of the microcontroller (connected to the same USB hub as the servo motors' voltage regulator) to sag as well. This manifests as unpleasant jittering with the occasional sporadic spasm (which made me question whether the thing I had created was even safe to wear; nevertheless, I continued on). I ended up banishing the finger to a plastic box so that it would not make a freedom-focused break for the edge of the desk while I was debugging. Saturday ended with me using an old Arduino-clone microcontroller that runs at 5V to get rid of some of the odd behavior. I am planning on getting a 3.3V-to-5V level shifter IC, which should work even when the voltage rail sags.

On Sunday I finished some debugging of the servo control using the Arduino clone and then used the serial interface I wrote to capture the joint values that I think are acceptable for the finger. After this I placed the finger onto my hand to get a feel for the device. As a short description: it feels very weird when the servo moves. Though this was nothing like an earlier moment when, with only the base on, I reached into a cabinet and got caught on an edge that I frankly had not noticed before. That event caused a moment of panic, as my body had not adopted the base into its mapping. Lastly, on Sunday I ordered the glove that I intend to use for the final version. The final figure is an action shot with the finger.

My goal for this next week is to develop all of the flex sensors. I purchased the electrical tape I need, and so long as the Velostat, conductive thread, and conductive fabric last, I should be able to complete all eight sensors. Then it is on to implementing the algorithm. I have put some further thought into this: I will be using PCA to make sense of all the data coming in from the flex sensors, in addition to the orientation and acceleration vectors captured for the hand. From the PCA output I will construct a supervised learning task, starting with linear regression (we can all dream that the solution is simple) but moving to something more complex like nonlinear regression, random forest regression, etc. Specifically, to make it a supervised learning task, I will create a program that requests a set of gestures, and I will then enter the servo joint values associated with each state; a sketch of this pipeline is below.
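
As a rough sketch of the planned pipeline (file names, shapes, and the component count are placeholders):

```python
# PCA over the flex/IMU channels feeding a supervised regressor that maps
# gesture features to servo joint values.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

X = np.load("gesture_features.npy")  # (N, n_channels): flex + orientation + accel
Y = np.load("servo_joints.npy")      # hand-entered joint values per gesture

model = make_pipeline(PCA(n_components=5), LinearRegression()).fit(X, Y)
print(model.predict(X[:1]))          # predicted joint values for one gesture
```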

Shruthi –

For this week I worked on building an Android application. I spent some time simply understanding the basics of the Android app development process. After that I read through the Particle documentation for its API and tried it out. My goal was to have an app read the data off a Particle Photon. However, the Particle documentation is a bit vague and assumes a lot of knowledge of Android development, and I found it hard to build something from that documentation alone. I then started looking around for examples and eventually found one. In the process I realized that, in order to establish a connection with the Particle server, I need to use sockets to establish an open SSL connection using the REST interface. To set this up I installed all the binaries required to set up curl on Windows; however, it currently gives me an error when I try to POST a request. I plan on spending time on this until about Wednesday before trying alternate approaches.
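
For reference, the same REST read can be exercised from Python before committing to the Android plumbing. A minimal sketch (the device ID, variable name, and token are placeholders):

```python
# Read a cloud-exposed variable from a Particle Photon over the REST API.
import requests

DEVICE_ID = "your-device-id"
ACCESS_TOKEN = "your-access-token"
url = f"https://api.particle.io/v1/devices/{DEVICE_ID}/flexValue"

resp = requests.get(url, params={"access_token": ACCESS_TOKEN})
resp.raise_for_status()
print(resp.json().get("result"))  # current value of the exposed variable
```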

Vedant – 

This week we looked more into developing an app that can relay information from the Particle Photon to the smart switch/bulb. I looked into the API for the smart switch app and how to obtain information about the switch, such as its device ID, local key, and IP address. I also worked on making five more of the DIY flex sensors, which we will sew onto the glove. Additionally, I started looking into how to send IR signals using a Photon and how to convert the IR codes into the format the Photon can use.

Materials Lists

Home Assistant Sub-Project

  1. Particle Photon – $19.00 (1)
  2. Flex sensor – 4.5 inches – $12.95 (1)
  3. Flex sensor – 2.5 inches – $7.95 (1)

*We decided to go ahead with more of the DIY flex sensors, so we might need more Velostat, copper sheets, and conductive thread.

SRF Sub-Project

Already purchased / owned

  1. Glove for prototype [final version subject to change based on prototype]
  2. Sparkfun IMU – $14.95 (1)
  3. Flex Sensor – 4.5 inches  – $15.95 (1)
  4. ESP32 Dev Board – $15.00 (1)
  5. High Torque Micro Servo – $9.95 (3)
  6. Resistive Force Sensor – $7.00 (1)
  7. Flex sensor – 4.5 inches – $12.95 (1)
  8. 3D printed  SRF – ~$9.00 (N/A)

Need to Purchase / Being Shipped

  1. Resistive Force Sensor – $7.00 (5)
  2. Flex sensor – 4.5 inches – $12.95 (3) and/or Flex Sensor 2.5 inches – $7.95 (8)
  3. Glove [Final design] – $14.99 (1 ordered)
  4. PCBs – N/A (N/A ordered)
  5. 3.3V to 5.0V Level Shifter IC – ~$1.00 (1)

Areas of Concern

Curt –

I am falling behind on the glove portion; I need to just build all of the sensors and get them mounted on the glove along with the IMU. I also need to work through the issues with the finger that the prototype demonstrates. Some of these may not be addressed by the showcase, as I would need to test alternate approaches to find the best one. One important note in particular: I am going to need another servo in order to get a fourth degree of freedom.

Shruthi –

The Particle Photon community is not very extensive, and hence debugging has become quite challenging as we run into new errors.

Vedant –

I was able to get the device ID and IP address, but I am having trouble getting the local key for the smart switch. I did some more research and found a couple of other ways to get it, so I will be working on that next week. Additionally, I was having trouble finding an IR code converter (from hex to raw) for the IR codes for my TV, so I'll have to look a little more into that as well.

Project Post #4 – InGlove

Weekly accomplishments:

 

Curt:

This week I started 3D printing the finger. The servo mountings were printed along with the fingertip. However, the printer broke, so I was unable to print the hinges and finger interconnect pieces. I am going to reach out to the library for printing if the 3D printer I was using is still broken. On another note, the finger construction is proceeding with fair results for a first prototype; I have been preparing the parts to screw into each other. In the next week I plan on printing the rest of the finger and getting the initial prototype constructed. I will also be shopping for a wrist / hand brace that I can use to provide a soft yet sound base for the finger.

I have also pursued the custom flex sensor that Shruthi and Vedant worked on last week. Specifically, I have produced two flex sensors joined by a central node in order to capture flex for two different joints in a finger. My next steps on this portion are to purchase a glove liner, construct flex sensors for the fingers, sew on the flex sensors, and start capturing hand data.
For the algorithm to convert a gesture into finger joint positions, I still need to work out the specifics. To discuss this, I first must detail the two applications of the finger that I have in mind: an assistive grasping device and a hand-remapping device. For an assistive grasping device, a labeled cluster may be enough to understand the intention, which then serves as a lookup for the rough location of the thumb (a sketch of this idea follows below). For example, an open hand will cause the finger to fully open, while a fully closed hand will result in the finger being fully bent. The mapping for other gestures is not entirely clear, but following Asada's previous work, I would constrain the clustering to the thumb and first two fingers of the user's hand. An alternate approach is to attempt full-hand clustering and augment the predicted values with the artificial synergies from Asada's work. As for hand remapping, whereby the thumb and first two fingers are considered a distinct hand from the last two fingers and the SRF thumb, I plan on using artificial synergies as a direct signal to the SRF digit.
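
A minimal sketch of the labeled-cluster idea (the data file, cluster count, and joint presets are invented placeholders):

```python
# Cluster flex readings into gestures, then look up a rough thumb pose for
# each cluster label.
import numpy as np
from sklearn.cluster import KMeans

X = np.load("hand_captures.npy")            # (N, n_sensors) flex readings
kmeans = KMeans(n_clusters=3, n_init=10).fit(X)

# Hand-labeled joint presets per cluster (degrees for the three joints).
POSE_LOOKUP = {0: [0, 0, 0],       # open hand   -> finger fully open
               1: [80, 70, 60],    # closed fist -> finger fully bent
               2: [40, 35, 30]}    # intermediate grasp

gesture = int(kmeans.predict(X[-1:])[0])
print("commanded joints:", POSE_LOOKUP[gesture])
```

Note that k-means labels are arbitrary, so in practice each cluster would need to be labeled by inspecting the training captures before building the lookup table.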

Future work using human computation to help define the gesture-to-finger mapping, and on a physical therapy procedure for hand remapping, is needed but will not be explored by the end of the course.

Vedant:

This week we did some more testing with the DIY and Adafruit flex sensors. We realized the two have very different resistances, so we needed different resistors in our circuit to test them. I was also able to use the Particle and connect it to the IFTTT app (which connects to smart switches). We wrote a program that publishes an 'event' when the flex sensor is bent; the event is read by IFTTT, which then instructs the smart plug app to turn the smart plug on. However, there was a large delay in this process, as the IFTTT app checks for the published event only once every minute. So we are starting to look into other options, such as making our own Android app that relays the information from the published event to the smart plug app. I am looking into how the Particle will send/receive information from the Android app directly.

 

Shruthi:

This week Vedant and I worked on trying to build a pipeline from the flex sensor to end devices using IFTTT. We did manage to successfully connect sensor data to a smart plug via IFTTT and the smart plug's app. However, we noticed that this pipeline requires multiple reads and pings to the server, with no guarantee of immediate service, which means the latency between a flex and the resulting action is high. We went ahead and debugged what was causing the lag and realized that, although the Particle Photon uploads data to the cloud quickly, IFTTT is slow in reading data from the Particle servers. Another constraint is that IFTTT does not allow communicating to the Google Assistant; it only allows communication from the Google Assistant (which is not what we want). Customization on IFTTT is very restrictive, so we plan to use the cloud API exposed by the Particle Photon and build our own Android application. This may involve a bit of a learning curve, and I am looking into it.

Materials List

Home Assistant Sub-Project

  1. Particle Photon – $19.00 (1)
  2. Flex sensor – 4.5 inches – $12.95 (1)
  3. Flex sensor – 2.5 inches – $7.95 (1)

*We decided to go ahead with more of the DIY flex sensors, so we might need more Velostat, copper sheets, and conductive thread.

SRF Sub-Project

Already purchased / owned

  1. Glove for prototype [final version subject to change based on prototype]
  2. Sparkfun IMU – $14.95 (1)
  3. Flex Sensor – 4.5 inches  – $15.95 (1)
  4. ESP32 Dev Board – $15.00 (1)
  5. High Torque Micro Servo – $9.95 (3)
  6. Resistive Force Sensor – $7.00 (1)
  7. Flex sensor – 4.5 inches – $12.95 (1)

Future / After initial prototype

  1. Resistive Force Sensor – $7.00 (5)
  2. Flex sensor – 4.5 inches – $12.95 (3) and/or Flex Sensor 2.5 inches – $7.95 (8)

Areas of concern:

For areas of concern, the only major one right now is having enough time to get everything complete, mainly the algorithm, as I need to understand the signal-to-noise ratio in my data, the precision of the finger, etc., before I can determine the best course of action.

InGlove – Project Post #3

Curt, Shruthi, Vedant

Weekly Accomplishments

Curt

In the week before spring break I primarily worked on designing the robotic finger and experimenting with mounting locations for it. In the images section I cover the limited prototype that I constructed to see if the mounting location made sense, along with a couple of screen captures of the current finger design. My next step is to focus on 3D printing the finger and to work on a custom flex sensor that can measure two joints in the glove. I also need to explore a mounting mechanism for the finger that is comfortable yet secure.

Vedant

The week before spring break, we were able to make a DIY flex sensor using some duct tape, conductive thread, and a few pieces of Velostat (pressure-sensitive fabric). We tested it using a voltage divider circuit and got some promising results from the changing voltage. I connected that voltage divider circuit to a Circuit Playground, used the serial plotter in the Arduino IDE, and found that there actually wasn't very much noise from the DIY flex sensor.

In addition, the Particle Photon and flex sensors came in, and we got started learning how to use the Particle, how to connect some basic circuits to it, and how to use the Particle cloud IDE. We were also trying to get readings from the flex sensors onto the Particle cloud over WiFi but were unable to, so I helped diagnose the problem and found that one of the sensors we ordered was defective. Trying the same code and circuit with the other sensor, we were able to get some promising readings.

Shruthi

Over the past week Vedant and I worked together on trying out different kinds of flex sensors. We ordered a batch of long and short ones from Adafruit, and we also tried making one of our own using Velostat and conductive thread. We built voltage divider circuits to measure the change in voltage drop with the varying resistance of the flex sensor. Both of us were fairly new to hardware debugging, but we eventually figured out that one of the flex sensors we ordered was defective; we were able to successfully get readings from the other one. In this whole process we also found that our DIY flex sensor detected flex comparably to the store-bought one. We might use DIY sensors for the remaining fingers, as they are more affordable.

Images

SRF Sub-Project

In Figures 1 and 2 I am presenting the physical mockup to get a sense of the hand and how the finger takes up space around it. The goal of this is to make the design process more concrete.

Figure 1: Simple mockup of the finger to get a sense of how the technology interacts with hand in 3D space.

Figure 2: Mockup of finger constructed with the servo motors, tape and a piece of plastic.

Figures 3 and 4 depict the current CAD model (made in Fusion 360) for the finger. Note that I am still thinking about the mounting to the hand; while it should be rigid, it must also be comfortable (though the finger is worn over the glove, so there is already some cushioning on the hand).

Figure 3:  View of finger backside.

Figure 4:  View of back of hand, note the finger mounting location that I am considering.

Materials List

Home Assistant Sub-Project

  1. Particle Photon – $19.00 (1)
  2. Flex sensor – 4.5 inches – $12.95 (1)
  3. Flex sensor – 2.5 inches – $7.95 (1)

*After initial prototype, purchase more flex sensors.

SRF Sub-Project

Already purchased / owned

  1. Glove for prototype [final version subject to change based on prototype]
  2. Sparkfun IMU – $14.95 (1)
  3. Flex Sensor – 4.5 inches  – $15.95 (1)
  4. ESP32 Dev Board – $15.00 (1)
  5. High Torque Micro Servo – $9.95 (3)
  6. Resistive Force Sensor – $7.00 (1)
  7. Flex sensor – 4.5 inches – $12.95 (1)

Future / After initial prototype

  1. Resistive Force Sensor – $7.00 (5)
  2. Flex sensor – 4.5 inches – $12.95 (3) and/or Flex Sensor 2.5 inches – $7.95 (8)

Areas of Concern

For the glove we will continue to explore mounting strategies for the flex sensor. Once we complete this we will need to capture gesture data. Currently there is no major concern with this aspect; however, feedback and suggestions for mounting are welcome.

In regards to the SRF subproject, Curt still needs to figure out how to mount the finger. The original idea was to modify a skateboard / snowboard arm brace, but a suggestion that was brought up was to modify a soft-support brace to allow full wrist movement. Again, suggestions and feedback are welcome.