A MIDI theremin jacket that lets you create, record, and perform your musical ideas, as an instrument for synchronized acoustic and visual performances.
Video
Poster Image
What can you do with it?
When you have a sudden flash of musical inspiration walking down the street, turn on the jacket and start recording the idea with the theremin attached to the front of the jacket with Velcro. The piece of music will be stored, and you can recreate it when you come back home. You can also turn on the light strip and show off your music to the public.
How does it work?
The jacket has three components: the theremin, the light strip, and a mini computer. The theremin is built on the Leap Motion, a gesture recognizer that tracks the motion of your hands. The data is transmitted to and stored on the mini computer (a Raspberry Pi) attached to the back of the jacket. When the light strip is on, the mini computer also drives it in accordance with the hand motion: the higher the note being played, the higher up the strip the active light sits, and vice versa.
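To illustrate the mapping, here is a minimal Python sketch of the pitch-to-light logic described above. The height range, note range, pixel count, and helper names are assumptions for illustration, not the project's actual code.

```python
# Sketch of the theremin's pitch-to-light mapping (illustrative only).
# Reading hand position from the Leap Motion and driving the NeoPixels
# are left as hypothetical helpers, since the project's code isn't shown.

NUM_PIXELS = 30                 # LEDs on the strip (assumed)
NOTE_LOW, NOTE_HIGH = 48, 84    # MIDI note range (assumed)

def height_to_note(height_mm, lo=100.0, hi=400.0):
    """Map hand height above the sensor (mm) to a MIDI note number."""
    t = min(max((height_mm - lo) / (hi - lo), 0.0), 1.0)
    return round(NOTE_LOW + t * (NOTE_HIGH - NOTE_LOW))

def note_to_pixel(note):
    """Higher notes light a pixel higher up the strip, and vice versa."""
    t = (note - NOTE_LOW) / (NOTE_HIGH - NOTE_LOW)
    return round(t * (NUM_PIXELS - 1))
```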
Development Experience
Overall, the group worked collaboratively and efficiently. We embraced each other’s ideas, made the most of our development time, and had good intra-group communication. All of our best ideas came out of heated discussion (using Velcro to modularize the work, light strips with dynamic designs, etc.), and we all felt engaged in those discussions and pleased with the ideas and the outcome.
Did we meet our goals?
Overall, the project meets our original goal of creating a prototype wearable theremin for the user to play with. The core functionality (theremin, light strip, and programs) does what we wanted the theremin to do. There are certainly a few things we tried and failed to deliver in the short period of time, including wearability (the Raspberry Pi), stability (the connection between the light strip and the Circuit Playground), and user experience (accuracy of Leap Motion sensing, the software interface, instrument types). But we still think that, as a course project, we built a good prototype that people embraced and were excited about.
Hurdles and Resolution
Challenge 1: Modularization. For a while we found it hard to divide our work because of the design of the jacket, which directly slowed progress on the visual-effect development. After a heated discussion, we decided to use Velcro to modularize the jacket into different components, each easy to attach and remove.
Challenge 2: Leap Motion on Raspberry Pi. Running the Leap Motion on a Raspberry Pi was a headache. We encountered a hardware problem that burned out both of our available Pis. Even after getting standard hardware and equipment, we ran into limited system support, not to mention huge power consumption. At the last moment, we gave up on the wearable functionality and used a computer for the showcase.
Challenge 3: Light Strip and Data Transfer. The light strip and Circuit Playground turned out to be pretty hard to integrate. We spent a chunk of time getting the light strip to work with the Circuit Playground, and another significant chunk getting the data transfer working alongside the application.
Next Step
Spend more time developing the Theremin Software Interface
Develop our own infrared sender/receiver instead of using Leap Motion
Improve the hardware connection of the light strip
Final Materials
Velcro
Amazon sweatshirt
Leap Motion
Adafruit NeoPixel
USB Cables
Computer (a MacBook Pro, since we did not get the Raspberry Pi running)
A pair of wings that responds to sounds and colors for artists, performers, and adventurers.
Video
Poster
Describe what your project does and how it works (2 points) (Min. one Paragraph)
The project is a pair of light-up wings that reacts to external color and sound. The wings themselves are constructed with floral wire and cellophane. The light comes from four RGB LEDs that feed into 8 fiber optic cables on each wing. These LEDs are controlled by a Circuit Playground Express microcontroller. The color of the lights is determined by a color sensor attached to the sleeve of the project; the microcontroller collects the color data from the sensor and changes the LEDs to match. Meanwhile, the microcontroller’s microphone collects the decibel level and maps it to LED brightness, so the louder the surroundings, the brighter the LEDs.
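For the sound side, the Circuit Playground Express's PDM microphone can be read roughly like this. This is a sketch in CircuitPython (the project's actual code is Arduino, per the hurdles section below), and the quiet/loud thresholds are assumptions; the color-sensor and LED wiring are omitted.

```python
# Sketch: estimate loudness from the CPX's PDM mic and map it to a
# 0.0-1.0 brightness value. Quiet/loud thresholds are assumptions.
import array
import math
import audiobusio
import board

mic = audiobusio.PDMIn(board.MICROPHONE_CLOCK, board.MICROPHONE_DATA,
                       sample_rate=16000, bit_depth=16)
samples = array.array("H", [0] * 160)

def sound_rms():
    """Record a short buffer and return its RMS amplitude."""
    mic.record(samples, len(samples))
    mean = sum(samples) / len(samples)
    return math.sqrt(sum((s - mean) ** 2 for s in samples) / len(samples))

def rms_to_brightness(rms, quiet=10.0, loud=500.0):
    """Louder surroundings -> brighter LEDs, clamped to [0, 1]."""
    return min(max((rms - quiet) / (loud - quiet), 0.0), 1.0)
```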
Describe your overall feelings on your project. Are you pleased, disappointed, etc.? (2 points)(Min. one Paragraph)
We are very proud of our project overall. Through a lot of trial and error, we were able to get a functional product. Some pieces of our project could have been done better had we had more time or another iteration – for example, the spray adhesive we used to attach the layers of cellophane ended up not holding as tightly as we would have liked and left visible splotches of glue – but for the time we were given, and considering neither of us had made a project like this before, we’re very proud of our work.
Describe how well your project met your original project description and goals. (2 points)(Min. one Paragraph)
Our original goal was to incorporate reactions to both a color sensor and a microphone in an aesthetically pleasing way. We accomplished this goal; our project correctly responds to that information, and it is aesthetically pleasing both in darkness and in light. Some details could be improved; for example, we originally wanted a material that would diffuse the light rather than making it obvious where the light is emitted. In the final project this doesn’t work as well as we wanted: you can clearly see where the fiber optic cables carrying the light are placed. The light is also dimmer than we would have liked, requiring complete darkness to be properly seen. Overall, however, we think we accomplished our goals.
Describe the largest hurdles you encountered. How did you overcome these challenges? (2 points)(Min. one Paragraph)
Yiting:
I encountered two hurdles. The first was the difficulty of securing the optic fibers along the wires of the wings. Before I started constructing them, I thought the adhesive spray and the cellophane would hold the optic fiber in place. It turned out to be more difficult than I expected, since the fibers move around a lot; it was very hard to put cellophane on top of the wires and optic fiber while keeping everything placed neatly. In the end, I had to use hot glue to secure the optic fibers, but the glue spots are not consistently placed, making the wings less aesthetically pleasing. If I were to improve the wings, I would use a 3D-printed model to replace the wire and run the optic fiber or NeoPixels along the printed model. Or I would sew the optic fiber along the wire to make the attachment less visible than the glue spots.
The other difficulty was figuring out the circuit design for the color sensor, from the sleeve to the extended fabric attached to the cardboard panel, while making sure there was enough space for the microcontroller and the soldering to the LEDs. If I had more time, I would 3D print the panel and make it firm but flexible toward the back of the model, so the panel won’t sag downward when worn.
Julia:
The biggest struggle I encountered was in the coding. I used Arduino to write to the Circuit Playground, which introduced a couple of weird difficulties. For example, to change the color of the RGB LEDs, you need to be able to analog-write to them; it took trial and error with Kevin to discover that Arduino’s analog write function doesn’t work properly unless you refer to the pins by their digital numbers (6, 9, 10) rather than their analog numbers (A1, A2, A3). I also struggled with coding for the color sensor; there are few resources online for it, and I spent a long time trying to debug why the code occasionally stopped working before realizing the code wasn’t at fault: the color sensor is very sensitive. If it loses connection with its SDA or SCL wire even momentarily, or if a short circuit occurs, then even after it reconnects it will only read 0’s (thus turning off the lights) until the Circuit Playground is reset.
Describe what you would do next if you had more time (2 points) (Min. one Paragraph)
Yiting:
If I had more time, I would sew the optic fiber onto the floral wire rather than securing it with hot glue, create a more stable panel to hold the wings using a 3D-printed model, design a better vest that is adjustable to everyone, and implement a better circuit design using conductive thread to connect the color sensor to the microcontroller.
Julia:
Given more time, I would try to fine-tune the code for truer color representation; the brightnesses of the red, green, and blue lights within the LED differ, and even with different-strength resistors connected to each pin the color is still slightly off. I’d also like to try sanding the sides of the fiber optic cables, since we found that even the side-emitting cables are very dim unless in complete darkness.
List of materials:
Side Glow Fiber Optic Cable 1.5mm~8mm Optical Fiber [1.5mm for 15 meters]
Color Sensor
16 Gauge Floral Wire
Fabrics
Cellophane
Heat Adhesive Spray
Elastic
3D Printing Model
My project is an NFC coil, constructed out of copper foil, that powers devices or LEDs off of a phone or other NFC reader/writer. It is essentially an air-cored transformer, as the coil in the phone creates a magnetic field with alternating intensity. This time-varying magnetic field, due to Faraday’s Law, creates a voltage potential over the coil. This voltage potential can be used to power very-low-power devices such as an LED or (if rectified) microcontrollers. I stuck to using LEDs for this project because of their ease of use and visual demonstration.
I’m pretty pleased that the NFC power works as well as it does, especially with the Android phones many people already have in their pockets. I would have liked the “powerable” distance to be larger, but given the NFC specification of <10 cm, my device performs about as well as commercial ones.
The fact that the LEDs work so well is great. I was guesstimating the design of my NFC coil, trying to estimate the inductance I needed. The coil generates ~3.5 Vpp when close to a phone, and the current produced averaged about 10 mA, giving about 17 mW of power. Whether the LEDs would work was up in the air until I tested them.
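For reference, the quoted power figure follows from taking the coil voltage at its peak (a rough estimate rather than a proper RMS calculation):

$$ V_{\text{peak}} = \frac{V_{pp}}{2} = \frac{3.5\,\text{V}}{2} \approx 1.75\,\text{V}, \qquad P \approx V_{\text{peak}} \cdot I \approx 1.75\,\text{V} \times 10\,\text{mA} \approx 17.5\,\text{mW} $$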
I accomplished my goal of a remotely powered, wearable circuit. I wasn’t able to get knee-deep into the antenna design needed for higher-frequency RFID, but that would have taken a considerable amount of time and might have been nearly impossible.
I didn’t wind up having the time to do any sensor design. I realize that programming my own sensors would have taken a serious amount of time and money. The common RFID / NFC sensors need some sort of devboard or breakout board to be programmable. Figuring out their firmware would have been another massive hurdle.
The largest challenge was getting the vinyl cutter to cooperate and not tear up the copper when cutting. It would also sometimes not complete edges or cut quite right.
Through trial and error, I started laminating the copper with a layer of vinyl to protect it. This stopped the cutter from tearing up the surface of the copper. The issue with the cutter not completing edges was a little more persistent. I eventually figured out that small details were (mostly) the cause: any small corner or other detail in my SVG file could give the cutter trouble. I took to simplifying my SVGs before cutting them.
This mostly solved the issue, but there were still some times when I had to use an Exacto knife to weed part of the circuit.
I would put a lot of effort into getting my hands on some of that Lumilor paint. I would potentially see if I could get the Makerspace or another organization to sponsor purchasing it. I think it could be really interesting in combination with coils that generate induced AC voltage. Finally, I would try to design my own NFC “power” output PCB with increased and variable output for testing.
One Sentence: On-premises vulnerability assessment tools.
Video:
[Note: The entire state of Wisconsin can’t handle the level of cheese in this video]
Final Poster:
Description of what the project does and how it works:
The purpose of the hackerman jacket is to reveal security flaws in the infrastructure of a physical location. It accomplishes this with three tools. The first is the RFID thief, which uses a basic RFID reader in combination with a microcontroller to copy the data from a 13.56 MHz tag and paste it onto a blank card. The second is the bash bunny, made up of a Raspberry Pi Zero W with some push buttons and a DIP switch. This Raspberry Pi was programmed to be recognized as a keyboard, mouse, or other external device by the victim machine; I could then select one of my payloads with the DIP switch, which would execute on the host machine. The third tool is the Pumpkin Pi, a retrofitted Raspberry Pi 3B+ with WiFi adapters programmed to set up a fake access point; once a victim connects to it, I can monitor their internet traffic if it is unencrypted. The hope is that, when these tools are used in tandem, one can achieve more in an assessment than with any of them alone.
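As a sketch of how the bash bunny's payload selection might look: the pin numbers, payload names, and the assumption that the Pi is already configured as a USB HID gadget are all mine, not taken from the project.

```python
# Hypothetical sketch: read a 2-position DIP switch and pick a payload.
# Assumes the Pi Zero W is already set up as a USB HID keyboard gadget
# (e.g. via libcomposite). Pin numbers and payload names are examples.
import RPi.GPIO as GPIO

DIP_PINS = [17, 27]          # example BCM pins wired to the DIP switch
PAYLOADS = {
    0: "payload_recon.sh",
    1: "payload_exfil.sh",
    2: "payload_persist.sh",
    3: "payload_prank.sh",
}

GPIO.setmode(GPIO.BCM)
for pin in DIP_PINS:
    GPIO.setup(pin, GPIO.IN, pull_up_down=GPIO.PUD_UP)

def selected_payload():
    """Treat the DIP switch as a 2-bit number selecting one of 4 payloads."""
    index = sum((GPIO.input(pin) == GPIO.LOW) << i
                for i, pin in enumerate(DIP_PINS))
    return PAYLOADS[index]
```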
Overall Feelings on Project:
Overall I am disappointed with my endeavors in this project. I knew that doing such a hardware-focused project would be difficult, as I had no prior experience, but I did not expect such unreliable performance and extreme difficulty. I had to cut back the versatility of these tools from the start because I could not obtain the hardware for more ambitious uses, which already limited the scope of my project. The most saddening part was that one of my tools was completely bricked right before the showcase, and I had to restart the software work on my Pumpkin Pi at least 12 times because I kept crashing its SD card. The fact that I couldn’t get two of the three tools working reliably was a disappointment for the whole project.
How well did the project meet the original project description and goals:
Technically speaking, I did accomplish what I set out to do. All of the tools that I originally wanted to make were made and did work in a limited fashion. However, as I’ve said, I did not make these tools better than their commercial counterparts; rather, I downgraded their versatility and increased their size, so this aspect of my original intention was not realized. In addition, I intended for my tools to be easily taken in and out of the jacket, but due to my difficulties with 3D printing this did not happen; my tools ended up permanently attached to the jacket.
Largest hurdles and how they were overcome:
My biggest hurdles were definitely the hardware challenges. I did not know how to solder, how to put a circuit together, or how to 3D print; all of these things were brand new to me. To overcome these challenges I spent a lot of time experimenting and training myself in the ways of soldering (I purchased my own soldering iron and ruined many good circuit boards). For this and the 3D printing, my main method of coping was simply trial and error, and to my surprise this kinda worked: I did get all my circuits together, on, and working (kinda). Even though some of these tools broke, I still consider these challenges overcome.
If I had more time:
If I had more time, I would have acquired more versatile hardware and made my tools accomplish the scope I originally set out for. I would also have had more chances to get the holders/inserts for my tools to the perfect size (and they would finally have been removable and protected!). Finally, I wish I had had more time to come up with a better demonstration for these tools instead of them just (working); a well-shot video of all of these tools working in tandem in a real-life scenario would have been an easy way to show them off.
Project Title: Táltos-oid (Formerly posted under InGlove)
Project Teams: Curt Henrichs
Sentence Description:
Human augmentation device providing users with an extra thumb for everyday tasks.
Video:
Poster Image:
The following image is the poster presented at the Wearable Technology showcase.
Project Description:
The long-term goal of this project is to experiment with augmenting the human body. Specifically, I am interested in understanding how a supernumerary robotic finger interacts with the user to complete typical tasks that occur in everyday life (in essence, what it means to live with this device), in order to better understand the design challenges and applications in a generalized manner. This is motivated by a gap in the nascent literature on supernumerary robotic limbs, which tends to focus narrowly on single applications instead of these broad questions.
To achieve this goal, the supernumerary robotic finger must be a functional appendage, meaning a well-articulated finger and an intuitive interface to command it. I consider a well-articulated finger one that provides a sufficient range of motion: a user should be able to understand this appendage as a finger or thumb, thereby building on existing grasping knowledge. As for intuitive control, the finger must capture the intentions of the user in a manner that facilitates augmenting the task. Thus the intention signal-to-noise ratio must be high enough to be actionable, yet capturing it must not burden the user.
Táltos-oid is designed to address the outlined goal by providing a four-degree-of-freedom robotic finger and a flex sensor glove that provides gesture data to control the finger. Táltos-oid is designed with a function-first perspective, though the aesthetics of the design should be reasonably thought out. In the following paragraphs I will discuss the robotic finger, flex sensor glove, and data-processing subsystems.
The robotic finger is composed of custom designed 3D printed plastic components actuated by four high-torque micro-servos. The rotational joints roughly map to the articulation of a human thumb where greater range of motion is feasible with the micro-servos. A soft interface layer was constructed with neoprene and leather to provide comfortable mounting of the finger to the human hand. The design of the interface is inspired by wrist and thumb braces with a 3D shape held by the plastic and leather. Furthermore, the interface is adjustable with two velcro straps providing some invariance to hand size. Finally, control of the servo motors is handled through an I2C servo driver.
For the flex sensor glove I started by miniaturizing and innovating on the DIY flex sensors documented by Plusea. Specifically, I introduced a node in the center of the flex sensor to provide voltage, thereby allowing two joints to be captured as parallel resistors. I sewed on the flex sensors using a manikin hand to support the material. I then used silver-nylon conductive thread and Bare Conductive electronic ink to connect the flex sensors to wires coming from the microcontroller (through a wire-to-thread interface). The wire-to-thread interface is composed of either a 3-pin or 2-pin male header with conductive thread tied and wrapped around the long end. I applied conductive ink to the threads to secure them. After drying, the long ends of the pins are pushed through a piece of electrical tape, thereby sandwiching the conductive thread between the electrical tape and the plastic spacer of the header. I then apply a bit of hot glue to secure the interface. This interface is sewn into the glove; wires are attached; and more hot glue is applied to secure the assembly. Finally, to read the flex sensor signal, the wires are connected to an analog mux and 3.3V. When the microcontroller selects a channel, that sensor is placed in series with a resistor to ground, creating a voltage divider that the ESP32’s ADC can read. For the custom flex sensors, I found 1K Ohm to be a reasonable resistor value. If using a commercial flex sensor this may change (e.g. the Adafruit 4.5″ flex sensor works well with 10K Ohms).
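The divider arithmetic behind that resistor choice, for reference: with the flex sensor tied to 3.3V and the fixed resistor to ground, the ADC sees

$$ V_{ADC} = 3.3\,\text{V}\cdot\frac{R_{\text{fixed}}}{R_{\text{flex}} + R_{\text{fixed}}} $$

Sensitivity is best when the fixed resistor is near the sensor's mid-range resistance, which is consistent with 1K Ohm for the DIY sensors and 10K Ohms for the Adafruit one.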
The microcontroller and data-processing subsystem is the last major component of this project. It consists of an ESP32 microcontroller and a PC connected over Bluetooth. The ESP32 was selected because it provides Bluetooth onboard along with WiFi (which, though unused in this project, could be useful in the future). The ESP32 connects to the I2C servo driver and the analog mux as noted previously. The firmware is built around a serial JSON API that pushes the current state to the connected PC and provides polling functions for sensor data. The JSON API also provides a command to set the joint states of the finger.
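To make the API concrete, here is the kind of message shape it implies. The field names and values are my guesses, not the project's actual schema.

```python
import json

# Hypothetical message shapes for the serial JSON API (field names assumed).
state_push = {
    "type": "state",
    "sensors": [512, 498, 730, 610, 402],  # raw ADC readings via the mux
    "joints": [90, 45, 30, 60],            # current servo angles in degrees
}
set_joints_cmd = {"type": "set_joints", "joints": [95, 50, 30, 55]}

# One JSON object per line over the serial link.
wire_bytes = (json.dumps(set_joints_cmd) + "\n").encode()
```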
On the PC I wrote several Python scripts to assist in developing and operating the machine learning model that controls the finger. The first script worth noting is a training script, which provides a command-line interface to set the joint state and capture sensor and joint data to a CSV file. The second is the algorithm generation script, which takes in the CSV file and outputs a trained model, selected by the user, as a Python pickle file. Finally, I have a multi-threaded model-runner script that captures the pushed state from the serial port, calculates joints with the trained model, and commands the finger joints through the serial JSON API.
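A condensed sketch of the model-runner loop under those assumptions (the port name, model file, and message fields are illustrative, and the real script is multi-threaded):

```python
# Sketch of the model-runner: read pushed state, predict joints, command SRF.
import json
import pickle

import serial  # pyserial

with open("model.pkl", "rb") as f:       # trained model from the pickle file
    model = pickle.load(f)

port = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)
while True:
    line = port.readline().decode().strip()
    if not line:
        continue
    msg = json.loads(line)
    if msg.get("type") != "state":       # only react to pushed state updates
        continue
    joints = model.predict([msg["sensors"]])[0]   # scikit-learn regressor
    cmd = {"type": "set_joints", "joints": [float(j) for j in joints]}
    port.write((json.dumps(cmd) + "\n").encode())
```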
For the machine learning models, I worked with linear regression, AdaBoost random forest regression, and a neural network. The linear regression and random forest approaches consisted of four models, each trained on a single joint, whereas the neural network was trained to generate all joints in one model. All of these models are trained on the supervised problem of mapping raw sensor data to joint states. Additionally, I explored a couple of pre- and post-processing steps on this data. First I applied PCA to the sensor data and compared the model trained on PCA features against the raw-data model. With no noticeable difference, I opted for the raw-data model, if only for simplicity. Then I applied a moving average to the sensor data, again comparing against the default case. I found the moving average helped the random forest and linear regression approaches, but the neural network did not benefit from it. Then I tried post-processing the joint values to reduce the jitter produced by the models (except the neural network, which was already fairly stable); a moving average on the joint values did benefit the random forest and linear regression approaches. Finally, I tried both a sensor and a joint moving average, which made for a stable but lagged response from the SRF. Given these results I concluded that the neural network was the best approach. In terms of implementation, I used Python’s scikit-learn library to develop these models.
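The moving averages mentioned here are simple streaming filters; a minimal version (the window size is an assumption), applied per sensor channel for pre-processing or per joint for post-processing:

```python
from collections import deque

class MovingAverage:
    """Streaming moving average over the last `window` values."""
    def __init__(self, window=5):
        self.buf = deque(maxlen=window)

    def update(self, value):
        self.buf.append(value)
        return sum(self.buf) / len(self.buf)

# e.g. one filter per flex-sensor channel (pre) or per joint (post)
smoothers = [MovingAverage(window=5) for _ in range(4)]
```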
Given the description of what Táltos-oid is, the next topic is how to wear and use it. While it is technically a modular system, with the glove, finger, and microcontroller connected by wires that mate with header pins, I have found it far easier to keep the connections intact and simply put the components on in the following order: glove -> microcontroller wrist-strap -> robotic finger. After putting on the device, connect the two USB power cables to the USB ports on the microcontroller wrist-strap. When powered up, the finger holds a predefined resting position. The user can then start the model-runner script with the neural network model and start to grasp objects. The model was trained such that a fully closed hand causes the SRF to rest its tip on the fingers, mirroring a typical biological thumb state. When the hand is fully open, the model stretches out the finger, assuming it is grasping a large object. Between these extremes the finger moves to anticipate the size of the object being grasped (even if no object is being grasped, as there is no way to discern that state).
Feelings toward Project:
I am very pleased with the way the robotic finger turned out, generally pleased with the microcontroller subsystem, and a bit disappointed with the glove. While I will cover these in the successes and challenges sections, it is worth noting that the disappointment with the glove is due to a failure in my design and implementation, not in the concept behind it. Specifically, the glove failed during the showcase event, which left no time in the term to correct it. Thus, I feel the project is not the best it could be. From a different perspective, though, I am glad the glove failed: now I can learn from this failure, with the intention of not making the same design decisions in the future (regardless of project).
Success:
Though the glove did fail, as mentioned, I was able to construct a simple replacement that demonstrated some of the capabilities of the project. Recall that my long-term goal is to develop a human augmentation device for everyday tasks and to understand the challenges, design considerations, etc. that the space offers. To that end, I would consider this a good step in that direction. The technical platform of the finger is a good first iteration; the microcontroller system works well enough for the task at hand; and the simplified glove made for the final demo illustrates the intuitive control. The challenges I faced during the design are also worth capturing, as they are part of the journey to answering my questions.
If I had to give a quantitative score, I would give it an 8.5 out of 10 against my expectations. The missing piece is a user study to start asking the questions outlined.
Challenges:
Like any ambitious project, one does not know ahead of time all that will be encountered. While this can present challenges, it also affords the opportunity to pivot and prove out the concepts coming into the project. My project was no exception, with pivots on several key decisions, described below in roughly chronological order.
The first challenge encountered was that the flex sensors available for purchase are built on a flexible PCB, which restricts movement and forces the glove’s shape instead of adapting to the wearer’s hand. An InGlove team member (when I was on that team) found Plusea’s work on a flex sensor that is soft and conforms to the body.
The second challenge was redesigning the robotic finger, as it initially had three joints instead of the current four. While three joints are sufficient for grasping most smaller objects, they are insufficient for grasping large or awkward objects that need support below the palm of the wearer. Fortunately, I had designed the robotic finger in anticipation of such changes (using a modular 3D-printed component system). As an aside, there is also a piece of sanded-down Command hook providing the necessary angle between two flex sensors; this could easily be replaced by a 3D-printed component.
The third challenge was the interaction between Bluetooth and ADC readings. The ESP32 sells itself as having two ADCs onboard, giving approximately half of the pins ADC capability. However, when using the radio (either Bluetooth or WiFi), ADC 2 cannot be used for external sensing. I also found issues with ADC 1 when using the radio, though it generally worked. To handle this challenge I used an analog mux that I had purchased on a whim for a different project.
The fourth challenge and pivot was using an I2C servo driver for the PWM signals to the micro-servos instead of direct control by the ESP32. While working with the ESP32, I found that the PWM signal would dip in voltage when the servos were moving. Using a second power source helps with this (and I did implement it), but I also went a step further and purchased a driver board that offloads the control task from the microcontroller. I have tested with a single battery and the system still functions correctly, so this change was not pointless.
Current failures that I am facing (and will address) are torque issues at the robot fingertip and the need for a new glove approach.
The torque issue is a consequence of directly driving the joint from the servo. One approach is gearing: if I were willing to reduce the joint’s range of motion to 90 degrees, a 2:1 gear reduction would double the torque output. Another approach would be to purchase micro-servos with an even higher torque rating (though this would be more expensive than adding two gears). For this version of the finger, I think the torque is sufficient to express the concept.
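The trade-off is the usual ideal-gear relation: a reduction of ratio N multiplies torque and divides travel,

$$ \tau_{\text{joint}} = N\,\tau_{\text{servo}}, \qquad \theta_{\text{joint}} = \frac{\theta_{\text{servo}}}{N} $$

so with N = 2, a 180-degree servo sweep becomes 90 degrees of joint travel at twice the torque.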
Designing a new glove approach means more exploration of technologies, implementation strategies, and testing. As I mentioned before, I am glad the glove failed, as it instructs me on the next steps needed to improve the design. So far I understand the issues with the glove to be:
Megaohm resistance between flex sensor and wire terminals.
Some of the terminals failed.
Some of the conductive threads, while still attached, seemed to have increased resistivity (perhaps from friction and skin contact).
The conductive ink, which was used as a conductive glue, turned out to be water soluble, and thus not great once the user’s sweat breaks it down.
The last challenge is more of a hypothesis. Specifically, context and environment awareness are important for the machine learning algorithm to make sense of grasp intention and the resulting joint states. A user can grasp different objects that look nearly identical at the level of the flex sensors but require different sets of joint states from the robotic finger. This made machine learning challenging, as seemingly valid data would cause performance to drop drastically.
Next Steps:
If the course were extended another week or two, I would rework the broken glove to get a fully functional demo for the blog. Additionally, I would try to run a mini UX study with convenience sampling from the class.
In the nearer future, I would rework my approach to the glove / finger tracking with either a new interface design, COTS flex sensors, or IMUs on a flex PCB. I would also like to explore haptic feedback signaling the joint configuration to the user.
Further into the future, I would like to explore environmental context with 3D sensing and other interface modalities to gather intention. With 3D sensing the finger can make grasp adjustments, avoid collisions with non-target objects, and start to make predictions on workflow. With respect to other interface modalities I would be interested in EMG instead of flex sensors to capture the gesture intention. I also would be interested in a brain based approach like EEG or fNIRS (perhaps to augment the finger control captured by other means).
Finally, I want/need to test this project in an experimental setting in order to evaluate the research goals I laid out. I am currently thinking about a day-long autobiographical study, user experience studies, and field studies (in manufacturing, health care, etc.).
Materials List:
Gloves [Prototype and Main] – Free (N/A) and $14.99 (N/A)
Exploring the ways in which we can see our mind respond to our environment by gathering brain data in a more user-integrated technological experience.
Video:
Description: All IR emitter and photodiode detector hardware is built into the front 3D-printed plastic compartment. The compartment is attached to a static strap, which routes all wires from the front component to the back of the strap near the back of the head. Wires are attached to this strap via zig-zag stitching. The hardware in the back includes a protoboard with all wire connections to the emitter/detector setup and the Arduino Due, which transfers data via a USB cable to a computer. These components are housed in a neoprene pocket with drawstrings to conceal visible wires. The other strap is flexible neoprene to accommodate different head shapes/sizes and is attached to the end of the static strap with velcro.
The prefrontal cortex is located at the front of the head, behind the forehead. This area of the brain is known to play a critical role in human emotional response and is also one of the most successful areas for NIRS data gathering.
IR emitters send infrared light about 2 cm deep into the skull, and detectors observe the refracted IR light that bounces back out. These emitter-detector channels measure changes in oxy-hemoglobin levels in the brain, similar to fMRI scans, but using more localized technology for a more personal experience.
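For background (this is standard NIRS theory, not specific to this project), the measurement rests on the modified Beer-Lambert law, which relates the change in optical density seen by a detector to the change in concentration of a chromophore such as oxy-hemoglobin:

$$ \Delta OD = \varepsilon \,\Delta c \, d \,\mathrm{DPF} $$

where ε is the chromophore's extinction coefficient, Δc is the concentration change, d is the emitter-detector separation, and DPF is the differential pathlength factor accounting for light scattering in tissue.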
Data is visualized in real time using Unity 3D/VR software, with increases in activity rendered as changes in heat/light. These heat and light changes reflect activity changes detected within 2 cm of the surface mount on the forehead/skull/brain.
Future applications include cognitive research, memory analysis, and visualization communication.
Project Feelings/Evaluation:
Considering our prior knowledge of the related subjects, we feel accomplished with our final product. We each, in our own ways, dove deeply into academic research outside our own fields and brought those conclusions together to create something that accomplished our goal: to manipulate a virtual environment using data from the head. We’re each proud of our own work put toward the project, and of each other’s.
Goal Meeting Description:
Our initial goal of building a more wearable device that could monitor activity-level changes near certain parts of the head for visualization purposes was definitely met. Because of the complexity of sourcing high-quality materials and the accuracy expectations that come with building a comparable device in a medical environment, we were unable to reach that same level of sensitivity. Overall, this was a great learning process for us all, and given all we discovered along the way, our final result definitely exceeded our expectations of our capabilities within our experience and time frame.
Hurdles Encountered and Overcoming Challenges:
Lack of knowledge designing hardware
Lack of knowledge regarding brain physiology
We overcame the hardware challenges step by step. We first tested all the components we needed and made sure our proof of concept worked. Then we created the first version of the circuit and discussed the VIS HAT design. After all that testing, we felt more confident about integrating all the components together. In the end, we think our hardware design turned out to be quite satisfying.
As for the brain physiology, we spent a lot of time doing research and reading through a number of relevant papers in order to build up basic knowledge of the brain and gradually work out how to build our VIS HAT, which also ties back into the hardware design.
Approach if we were to do it again/With more time:
With more time, we would all want to keep testing our VIS HAT and gather a large amount of data from it. With that data we could experiment with analysis, characterize brain activity, and attempt some classification, and it would be great to show a VIS HAT functionality demo driven by real brain activity.
One sentence that describes my project: A device that transmits location via radio when activated by the user in an emergency.
Image of Poster:
Video:
What the project does and how it works:
My project is a Radio Outdoor Emergency Transmitter. It transmits a radio message containing a generic call for help and specific GPS coordinates when the user triggers it. The user triggers the transmission by pressing the patch 3 times; the device then vibrates twice, waits 1 minute (30 seconds for demo purposes), vibrates once more, and sends the radio transmission. The transmission then repeats once per minute.
In detail: when the user presses the button three times, the vibration motor is activated for 2 pulses and the GPS module is powered up; then, if there is no user input during the 1-minute window, the radio module is activated and the generic help message with specific GPS coordinates is sent (over the 89.00 FM frequency).
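A sketch of that activation sequence as a simple state machine: the helper functions for the button, vibration motor, GPS, and FM module are hypothetical stand-ins for the real drivers, and the timings come from the description above.

```python
# Sketch of the patch's activation sequence. read_button(), vibrate(),
# gps_fix(), and transmit_fm() are hypothetical stand-ins for the real
# button, vibration motor, GPS module, and FM radio module drivers.
import time

PRESSES_TO_ARM = 3
CANCEL_WINDOW = 60      # seconds (30 in the demo)
REPEAT_PERIOD = 60      # distress call repeats once per minute

def run():
    presses = 0
    while presses < PRESSES_TO_ARM:     # arm: three presses on the patch
        if read_button():
            presses += 1
            time.sleep(0.3)             # crude debounce
    vibrate(pulses=2)                   # confirm arming; GPS powers up
    deadline = time.monotonic() + CANCEL_WINDOW
    while time.monotonic() < deadline:
        if read_button():               # any input in the window cancels
            return
    vibrate(pulses=1)                   # final warning
    while True:                         # then broadcast once per minute
        transmit_fm(89.0, "HELP", gps_fix())
        time.sleep(REPEAT_PERIOD)
```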
Overall thoughts:
Overall, I am pleased with my project. I have always been interested in radio, so this project was really interesting to work on. The only things I am really disappointed in are that I was not able to broadcast the radio waves very far (mostly because I was afraid of breaking laws regarding how far one can transmit for recreational purposes) and that I relied heavily on the radio module to transmit the messages.
Comparison to Original Project Description:
Measured against the original description, this project does what I initially set out to do: when activated, a distress signal containing GPS coordinates is transmitted over the radio. However, the device is not practical if the radio transmitter and receiver have to be close together. The original goal was to have the transmitter and receiver far from one another, so in that respect my goal was not met (primarily for the reason stated above).
Latest Hurdles:
The biggest challenge I encountered toward the end of the project was getting all of my connections to stay connected. I found that the single-core wire I used (kinda as a last resort) worked the best at not pulling out of the solder joints. In the video above it is the white wire with the red stripe on the side; I ended up replacing the most challenging connections with it (3-4 connections, as shown in the video).
If I had more time:
There is actually quite a lot I would do if I had more time. The first thing would be replacing all of the connections with single-core wire. The second would be replacing the push button with a slide on the front of the patch that pushes the battery into the device (this would help the battery be used only when needed, rather than powering the Circuit Playground all the time). The third would be fixing the soft button’s over-sensitivity (pressing when it is not supposed to). The fourth would be finding a way to say the emergency message more clearly over the radio. The fifth would be figuring out how to transmit the radio waves farther without infringing on the law (this is the fix I would focus on most, because it’s an operational fix; the others are mostly design fixes).