This week I started 3D printing the finger. The servo mountings were printed along with the fingertip. However, the printer broke, so I was unable to print the hinges and the finger interconnect pieces. If the 3D printer I was using is still broken, I will reach out to the library for printing. On another note, the finger construction is proceeding with fair results for a first prototype; I have been preparing the parts to screw into each other. In the next week I plan to print the rest of the finger and construct the initial prototype. I will also shop for a wrist / hand brace that can provide a soft yet sound base for the finger.
I have also pursued the custom flex sensor that Shruthi and Vedant worked on last week. Specifically, I have produced two flex sensors joined by a central node in order to capture flex at two different joints of a finger. My next steps on this portion are to purchase a glove liner, construct flex sensors for the remaining fingers, sew them on, and start capturing hand data.
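As a rough sketch of how such a two-segment sensor could be read, assuming each segment is wired as a voltage divider with its tap brought out to its own analog pin (pins, baud rate, and sample rate below are illustrative, not final choices):

    // Illustrative Wiring/C++ snippet for reading the two-segment flex
    // sensor. Assumes each segment forms a voltage divider with a fixed
    // resistor and each tap is on its own analog pin.
    const int JOINT1_PIN = A0;  // segment spanning the first knuckle
    const int JOINT2_PIN = A1;  // segment spanning the second knuckle

    void setup() {
        Serial.begin(9600);
    }

    void loop() {
        int joint1Raw = analogRead(JOINT1_PIN);  // 0-4095 on the Photon
        int joint2Raw = analogRead(JOINT2_PIN);
        Serial.printlnf("joint1=%d joint2=%d", joint1Raw, joint2Raw);
        delay(100);  // ~10 Hz is plenty while characterizing the sensor
    }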
For the algorithm to convert gesture into finger joint position, I still need to work out the specifics. To discuss this, I first must detail the two applications of the finger that I have in mind: an assistive grasping device and a hand-remapping device. For the assistive grasping device, a labeled cluster may be enough to infer the user's intention, which can then serve as a lookup for the rough location of the thumb. For example, an open hand would fully extend the finger, while a fully closed hand would fully bend it. The mapping for other gestures is not entirely clear, but following Asada's previous work, I would constrain the clustering to the thumb and first two fingers of the user's hand. An alternate approach is to attempt full-hand clustering and augment the predicted values with the artificial synergies from Asada's work. As for hand remapping, whereby the thumb and first two fingers are treated as a distinct hand from the last two fingers and the SRF thumb, I plan on using artificial synergies as a direct signal to the SRF digit.
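To make the labeled-cluster idea concrete, here is a minimal nearest-centroid sketch; the gesture set, centroid values, and joint angles are placeholders until real hand data is captured:

    // Illustrative nearest-centroid gesture lookup. Each labeled gesture
    // has a centroid in flex-sensor space and a corresponding pair of
    // SRF joint angles. All numbers are placeholders.
    const int NUM_SENSORS  = 5;
    const int NUM_GESTURES = 2;

    const float CENTROIDS[NUM_GESTURES][NUM_SENSORS] = {
        {0.1, 0.1, 0.1, 0.1, 0.1},  // open hand
        {0.9, 0.9, 0.9, 0.9, 0.9},  // closed fist
    };
    const int JOINT_ANGLES[NUM_GESTURES][2] = {
        {0, 0},    // open hand   -> SRF fully extended
        {90, 75},  // closed fist -> SRF fully bent
    };

    // Returns the index of the gesture centroid closest to the reading.
    int classifyGesture(const float reading[NUM_SENSORS]) {
        int best = 0;
        float bestDist = 1e9;
        for (int g = 0; g < NUM_GESTURES; g++) {
            float dist = 0;
            for (int s = 0; s < NUM_SENSORS; s++) {
                float d = reading[s] - CENTROIDS[g][s];
                dist += d * d;
            }
            if (dist < bestDist) {
                bestDist = dist;
                best = g;
            }
        }
        return best;  // JOINT_ANGLES[best] gives the SRF configuration
    }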
Future work on using human computation to help define the gesture-to-finger mapping, and on a physical therapy procedure for hand remapping, is needed but will not be explored by the end of the course.
Vedant:
This week we did some more testing with the DIY and the Adafruit flex sensors. We realized the two have very different resistances, so we needed different resistors in our circuits to test them. I was also able to use the Particle and connect it to the IFTTT app (which connects to smart switches). We wrote a program that publishes an 'event' when the flex sensor is bent; the event is read by IFTTT, which then tells the smart plug app to turn the smart plug on. However, there was a large delay in this process, as the IFTTT app checks for the published event only once every minute. We are therefore starting to look into other options, such as making our own Android app that relays the information from the published event to the smart plug app. I am looking into how the Particle will send/receive information from the Android app directly.
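For reference, the publishing side of this pipeline is only a few lines of Particle firmware; the pin, threshold, and event name below are illustrative assumptions rather than a final design:

    // Minimal Particle Photon sketch publishing an event when the flex
    // sensor is bent. Pin, threshold, and event name are assumptions.
    const int FLEX_PIN = A0;
    const int BEND_THRESHOLD = 2500;  // tune against real readings (0-4095)
    bool wasBent = false;

    void setup() {
    }

    void loop() {
        bool isBent = analogRead(FLEX_PIN) > BEND_THRESHOLD;
        if (isBent && !wasBent) {
            // Any subscriber (IFTTT today, our own app later) sees this.
            Particle.publish("flex_bent", "on", PRIVATE);
        }
        wasBent = isBent;
        delay(50);
    }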
Shruthi:
This week Vedant and I worked on building a pipeline from the flex sensor to end devices using IFTTT. We did manage to successfully connect sensor data to a smart plug via IFTTT and a smart plug app. However, we noticed that this pipeline requires multiple reads and pings to the server, with no guarantee of immediate service. This implied that the latency between a flex and the resulting task was high. We went ahead and debugged what was causing the lag, and realized that although the Particle Photon uploads data to the cloud quite quickly, IFTTT is slow in reading data from the Particle servers. Another constraint is that IFTTT does not allow communicating to the Google Assistant; it only allows communication from the Google Assistant, which is not what we want. Customization on IFTTT is very restrictive, so we plan to use the cloud API exposed by the Particle Photon and build our own Android application. This may involve a bit of a learning curve, and I am looking into it.
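As a hedged sketch of that route, the firmware can expose a cloud function that our future Android app would call directly through Particle's REST API, bypassing IFTTT's polling; the function name and endpoint usage below are illustrative:

    // Illustrative sketch: expose the flex reading through the Particle
    // cloud so a custom Android app can reach it directly over REST,
    // avoiding IFTTT's once-a-minute polling. Names are assumptions.
    int flexValue = 0;

    // An app could call this with, e.g.:
    //   POST https://api.particle.io/v1/devices/<device_id>/ping
    // (with an access token) and get the latest flex reading back.
    int pingHandler(String arg) {
        return flexValue;
    }

    void setup() {
        Particle.function("ping", pingHandler);
    }

    void loop() {
        flexValue = analogRead(A0);
        delay(50);
    }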
As for areas of concern, the only major one right now is the time needed to get everything complete, mainly the algorithm, as I need to understand the signal-to-noise ratio in my data, the precision of the finger, etc. before I can determine the best course of action.
Our project is composed of three subprojects: the hand gesture / position glove and two applications for said glove.
For the first part of the final class project, we would like to make a smart glove that helps the user send remote signals using hand gestures as commands. While many similar gloves exist as projects, a lot of them do not make use of the wide variety of sensors available on the market. Additionally, some of those projects seem inefficiently made, in that they are pretty bulky for how much they can do. We want to explore how those existing gloves could be improved in not just functionality but also aesthetics. Thus, we would work on this project not as an invention, but rather as a demonstration/experiment.
1.1 FUNCTIONALITY:
The glove will include pressure, motion, and flex sensors to capture various gestures, and will associate these gesture-controlled commands with the application subprojects detailed below. Communication will be handled with Bluetooth or WiFi, depending on the application being implemented.
1.2 USAGE:
The typical process to use the glove will start with the user placing the glove onto their hand and performing calibration. While the details of calibration will need to be worked through as we develop the software, it will most likely take the form of the user positioning their hand into a predefined pose. Depending on the end application, we could either provide the user with a list of predefined guidelines that map gestures to tasks, or allow customization of that mapping. In the latter case, we would have to perform experiments on the kinds of gestures that are read most reliably and provide the user with a set of guidelines on how to make customization most efficient.
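As a rough sketch of what pose-based calibration might look like in firmware (the two-pose scheme, pins, and timings are assumptions, not a settled design):

    // Rough calibration sketch: the user holds an open hand, then a
    // closed fist, and we record each sensor's range so later readings
    // can be normalized to 0.0-1.0.
    const int NUM_SENSORS = 5;
    const int SENSOR_PINS[NUM_SENSORS] = {A0, A1, A2, A3, A4};
    int sensorMin[NUM_SENSORS];
    int sensorMax[NUM_SENSORS];

    void capturePose(int values[]) {
        delay(3000);  // give the user time to settle into the pose
        for (int i = 0; i < NUM_SENSORS; i++) {
            values[i] = analogRead(SENSOR_PINS[i]);
        }
    }

    void setup() {
        capturePose(sensorMin);  // pose 1: open hand (least flex)
        capturePose(sensorMax);  // pose 2: closed fist (most flex)
    }

    // Maps a raw reading to 0.0 (open) .. 1.0 (fully bent).
    float normalizedFlex(int sensor) {
        float range = sensorMax[sensor] - sensorMin[sensor];
        if (range == 0) return 0;
        float n = (analogRead(SENSOR_PINS[sensor]) - sensorMin[sensor]) / range;
        return constrain(n, 0.0, 1.0);
    }

    void loop() {
        // e.g., feed normalizedFlex(0..4) into gesture classification
        delay(20);
    }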
We now proceed to describe separately the idea and plan for Curt's supernumerary robotic finger as well as Vedant and Shruthi's wearable home automation.
2. SUPERNUMERARY ROBOTIC FINGER
Using the glove developed as hand position input, Curt will construct a supernumerary robotic finger mounted next to the left hand's pinky finger. This digit will mirror the biological thumb's location and joint structure. The robotic finger will map the hand gesture to user intention, which in turn maps to a joint configuration for the finger. By the end of the term, a simple hard-coded heuristic function will be developed to perform this mapping.
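As a placeholder for that heuristic, something as simple as driving the SRF joint servos in proportion to overall hand flex may suffice for a first demo; the pins, angle limits, and plain-average mapping below are guesses rather than design decisions:

    // Simple hard-coded heuristic sketch: drive the SRF's two joint
    // servos in proportion to overall hand flex. All values are
    // placeholder choices for a first demo.
    Servo baseJoint;
    Servo tipJoint;

    // Placeholder: average flex across three glove sensors, 0.0-1.0.
    float averageFlex() {
        const int pins[3] = {A0, A1, A2};
        float sum = 0;
        for (int i = 0; i < 3; i++) {
            sum += analogRead(pins[i]) / 4095.0;  // Photon ADC is 12-bit
        }
        return sum / 3;
    }

    void setup() {
        baseJoint.attach(D0);  // Photon pins with PWM support
        tipJoint.attach(D1);
    }

    void loop() {
        float flex = averageFlex();
        baseJoint.write((int)(flex * 90));  // 0 = extended, 90 = fully bent
        tipJoint.write((int)(flex * 75));
        delay(20);
    }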
Curt is primarily developing this project for his own curiosity. Specifically, Curt would like to wear the device for a full day to record hand position data, record failures and inconveniences, record interactions with others and their perceptions, and explore contexts of applicability. This in turn allows Curt to further develop a machine learning algorithm, iterative design improvements, HCI insight, and a more general SRF usage taxonomy, respectively.
As for the eventual end user, this technology could potentially augment any life task; however, I am mostly interested in applying it to the manufacturing and construction spaces, where the ability to do self-handovers is an important aspect of the task. An example would be screwing an object overhead while on a ladder. The constraint is that a person should keep three points of contact while holding both the object and the screwdriver. If they need to do all three, they may lean their abdomen onto the ladder, which is less effective than a grasp. Instead, with several robotic fingers (or a robotic limb), the object and screwdriver could be effectively held and manipulated while grasping the ladder. Another example, which should relate to this class, is soldering, where the part(s), soldering iron, and solder all need to be secured. This could be overcome with an SRF thumb feeding the solder to the tip of the soldering iron while holding the parts down with one's other hand.
In the SRF literature, Curt is proposing a modest incremental improvement. Wu & Asada worked with flex sensors; however, they were only interested in the first three fingers and did not attempt to model the hand position directly. Ariyanto, Setiawan, & Arifin focused on developing a lower-cost version of Wu & Asada's work. One piece of Leigh & Maes' work uses a Myo EMG sensor, which is not included in this project. They also present work with modular robotic fingers, though they never explore the communication between finger and human in that piece. Finally, Meraz, Sobajima, Aoyama, et al. focus on body schema, where they remap a wearer's existing thumb to the robotic thumb.
2.1 WEARABLE HOME AUTOMATION:
While these gloves could have many different functions, we would like to focus on using them as part of a smart home, including functions like controlling the TV, smart lights, and a Google Home/Alexa (through a speaker). This could be especially useful for people with disabilities (scenarios where voice-based communication is not easy), or even just for a lazy person.
This project utilizes ideas from computational gesture recognition to create a wearable device. We use components of sensor systems and microcontrollers in conjunction with a mobile application (preferably Android), along with fundamentals of circuit design. Our project management technique would involve a mix of waterfall and iterative models, keeping in mind the timeline available, so as to create a viable solution to home automation.
The idea is to have the glove communicate with an Android application via either Bluetooth or a WiFi module, and the phone in turn can control several other devices. Since applications like the Google Assistant have powerful AI technology integrated into them, can we extend those capabilities beyond the phone and onto a wearable fabric?
It also brings in the concept of a portable Google Home of sorts, meaning we would not need to install a Google Home in every room. This project is meant to be pragmatic, and the consumer is the general population. It could also be of extra help to people with disabilities.
3. INSPIRATION
The SRF takes inspiration from Wu & Asada (along with other work using flex sensors to detect finger movement); Meraz, Sobajima, Aoyama, et al. provide the inspiration for using a thumb as the digit being added; and Leigh & Maes' work on modular fingers is the inspiration for how Curt constructs the wrist connection. The novelty is bringing these pieces together to form a wearable on which one can run a long-term usability test.
https://ieeexplore.ieee.org/document/6629884
http://maestroglove.com/
SRF:
[1] F. Y. Wu and H. H. Asada
[2] M. Ariyanto, R. Ismail, J. D. Setiawan and Z. Arifin
[3] Sang-won Leigh and Pattie Maes
[4] S. Leigh, H. Agrawal and P. Maes
[5] F. Y. Wu and H. H. Asada
[6] Segura Meraz, N., Sobajima, M., Aoyama, T. et al.
We find significant work done in the general area of gesture recognition using a glove; however, those gesture interpretations have been applied in different domains. We did find a project or two where gloves were used for home automation, but the choice of sensors differs from what we plan to use (flex and pressure).
4. SKETCHES
The following sketches were developed for the project pitch and are compiled here due to their similarity. The important aspects to note with the glove are the presence of flex sensors used to capture finger movement, touch sensors on the fingertips, and an inertial measurement unit to capture hand orientation.
Glove:
SRF:
5. MATERIALS
Electronics:
Microcontroller with Bluetooth and WiFi (e.g., ESP32, Adafruit Flora, Particle Photon, Arduino; possibly an external Bluetooth or WiFi module)
Flex sensors
Resistive pressure/force sensors
Vibration motor
IMU (Gyroscope, accelerometer)
Micro servos (For SRF)
Infrared LED emitter / receiver
Clothing / Physical Materials:
Glove
Wrist brace (For SRF)
3D printed components (case, mounting)
6. SKILLSET LIST:
Curt:
I come into this project with experience soldering, designing circuits, and programming. My talent is mainly in embedded systems programming with C. I will need to master sewing and other soft-materials skills and knowledge to hack a glove in an aesthetically pleasing fashion. Another skill hurdle will be 3D printing, as I gained some familiarity during undergrad, where I worked with a 3D printer as a hobbyist but was never formally trained. Finally, I will need to further hone my project management skills due to the ambitious scope as laid out.
Vedant:
While I have some experience with programming, this project would require a lot of microcontroller programming, so that would be the main thing I would have to master. The project would also require some knowledge of IoT so I would be looking more into that as well.
Additionally, I have experience using tools in the makerspace (3D printing, laser cutting), so if I have time to focus on the aesthetics, I could use those skills to improve the looks. However, it is a given that I will also have to learn soldering well to compact the design, as well as sewing to make the glove look nice.
Shruthi:
I have prior experience with Arduino programming and a decent understanding of C and Python. I can contribute to integrating the sensor data into a mobile application. I am more comfortable debugging in an IDE than directly on a hardware device, and I need to improve my skillset in this direction. I could also help with the aesthetic aspect of the glove in sewing and stitching; however, I do not have significant experience in these areas.
7. TIMELINE
7.1 MILESTONE 1
Technology shown to work (March 25)
Glove
Glove w/ all flex sensors
Position of hand captured, data transmitted to PC/phone for processing / visualization
IMU captures absolute orientation, data transmitted to PC/phone for processing / visualization
Power supply and integration started
SRF
Robotic finger 3D printed
Assistant
Test whether the Bluetooth or WiFi module would work best to connect to the Google Assistant
Look into the Google Assistant API and see what functions/commands can be given using a third-party app
7.2 MILESTONE 2
Technology works in wearable configuration (April 8)
Note: after this point, the project focus splits from being glove-focused to application-focused. Curt will take on the SRF subproject, and Shruthi & Vedant will take on the home assistant subproject.
Glove
Refinement of aesthetics
Power supply and integration complete
SRF
Robotic finger controlled
Brace developed
Assistant
Basic functioning app developed
Required IR codes and assistant commands obtained
7.3 MILESTONE 3
Technology and final wearable fully integrated (April 22)
Glove
Error reduction, final iteration on design
SRF
If time allows, a one-person “user study” to capture longer-term data
Otherwise, refinement of the finger control / design
Assistant
Fully functioning app that receives commands from gloves and sends commands to Google Assistant app to control smart home
Aesthetic gloves that house the electronics, battery, and sensors ergonomically
8. FALLBACK PLAN
Glove:
We will start with an experiment on one finger to see whether a single flex sensor is sufficient to capture the necessary gesture information. The answer may show that one flex sensor is sufficient for the assistant subproject but not the SRF project. Alternatively, it could be that flex sensors themselves are not sufficient. To that end, this experiment will be done early so the findings can be integrated into our design.
The minimum that we will consider success is a glove that can output preprogrammed gesture commands that are loosely coupled to the state of the fingers. For example, curling the index finger will issue a command.
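In firmware, this minimal version reduces to little more than a debounced threshold per finger, along the lines of the following sketch (pin, threshold, and hold time are illustrative):

    // Minimal fallback sketch: a gesture "command" is just the index
    // finger's flex sensor crossing a threshold and holding briefly.
    const int INDEX_PIN = A0;
    const int THRESHOLD = 2500;
    const unsigned long HOLD_MS = 300;  // require a deliberate curl
    unsigned long bentSince = 0;
    bool fired = false;

    void setup() {
    }

    void loop() {
        if (analogRead(INDEX_PIN) > THRESHOLD) {
            if (bentSince == 0) bentSince = millis();
            if (!fired && millis() - bentSince > HOLD_MS) {
                Particle.publish("index_curl", "command", PRIVATE);
                fired = true;
            }
        } else {
            bentSince = 0;
            fired = false;
        }
        delay(20);
    }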
SRF:
Curt will have to learn more about 3D printing to develop the SRF physical components. Additionally, a mounting system needs to be developed. Depending on how time-consuming this step is, Curt may need to settle for a simple proof-of-concept heuristic for the final demonstration. This could result in non-ideal finger movement that does not track the intent of the wearer; however, failure of the algorithm in some cases will be deemed acceptable, as this is still an open research area.
Assistant:
At a minimum, we will integrate the sensor signal data and establish communication to a phone via either Bluetooth or WiFi. We would probably begin by controlling a smart light or a single device and eventually try to build larger integration with the Google Home app. We would first develop a rather simple proof-of-concept prototype and try to get a single end-to-end communication channel functional. However, since this is still in the experimental stage, we cannot guarantee the accuracy and reliability of its function.