| # | Title | Team Members | TA | Documents | Sponsor |
|---|-------|--------------|----|-----------|---------|
| 79 | Universal Gesture Interface | Connor Michalec, Kenobi Carpenter, Kobe Duda | Lukas Dumasius | design_document1.pdf, proposal1.pdf | |

# Universal Gesture Interface

Team members:
- Kenobi Carpenter (joseph48)
- Kobe Duda (ksduda2)
- Connor Michalec (connor15)
# Problem

Since the invention of the personal computer, the interface between humans and computers has remained relatively unchanged. The keyboard and mouse layout has proven highly effective for the majority of use cases, but its mostly-discrete nature greatly restricts the possible ways humans can interact with computer applications.

Much of the way we interact with the world requires expressive, free-flowing modes of interaction. Activities like playing an instrument, martial arts, dancing, or sculpting often can’t simply be described by a series of inputs in the correct order at the correct time. They take place in continuous, 3D space—yet, the most complex expression we typically get with a computer is the 2D plane that a mouse movement provides.

Some solutions exist to address this need, most notably VR headsets. However, these headsets tend to be expensive and bulky, and they cause fatigue and nausea for many users. As it currently stands, there is no low-cost, low-fatigue, desk-friendly input device that allows continuous spatial interaction on PC. Such a device would open new possibilities for how users interface with programs while also improving accessibility for those with limited fine motor skills, such as reduced finger dexterity.
# Solution

We propose a wearable gesture-detecting glove that allows users to interface with computer applications through hand and finger motions. This glove will have a wired USB connection (though wireless would be ideal, we are omitting it for the sake of scope) with two interfaces. The first interface is an HID compliant mouse, allowing the glove to be generally used for regular applications, while the second interface streams live 3D movement data to be interpreted by specialized applications. This dual-interface approach allows the glove to stand on its own as a general-purpose tool while also granting the extensibility to be leveraged to its full potential by specialized applications.

The sensor layout will consist of a 9-DOF IMU placed on the back of the hand for broad movements, three flex sensors on the index finger, middle finger, and thumb, and three force-sensitive resistors (FSRs) on the fingertips to detect touch.

Finally, the device will feature on-board DSP on the MCU. It will process raw sensor data and interpret a predefined set of gestures, then send those interpreted actions as discrete inputs to the target USB device.
# Solution Components

## Subsystem 1: IMU Unit

Components: ICM-20948

This 9-axis IMU (accelerometer, gyroscope, and magnetometer) will be used for detecting broad-phase translational and rotational movements of the hand. It will be mounted to the back of the hand, and raw sensor data will be sent over SPI to the MCU for processing.
## Subsystem 2: Flex sensors

Components: Adafruit Industries Short Flex/Bend Sensor

We will mount three flex sensors on the thumb, index finger, and middle finger. Each will be connected to an ADC input through a voltage divider with a 50 kOhm resistor, with a 0.1 uF capacitor for noise reduction. These sensors will be used for detecting specific hand positions.
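As a sanity check on the divider choice, the MCU can recover the sensor's resistance from a raw ADC count. A minimal sketch in C, assuming a 12-bit ADC and the fixed 50 kOhm resistor on the low side of the divider (the exact topology and reference voltage are still open design choices):

```c
#include <stdint.h>

#define ADC_MAX     4095.0f   /* 12-bit ADC, as on the STM32F405 */
#define R_FIXED_OHM 50000.0f  /* divider resistor from this proposal */

/* Convert a raw ADC count to the flex sensor's resistance.
 * Assumes the fixed resistor is on the low side of the divider,
 * so Vout = Vcc * R_fixed / (R_flex + R_fixed). */
float flex_resistance_ohm(uint16_t adc_count)
{
    float v_ratio = (float)adc_count / ADC_MAX;  /* Vout / Vcc */
    if (v_ratio <= 0.0f)
        return -1.0f;                            /* open circuit / fault */
    return R_FIXED_OHM * (1.0f / v_ratio - 1.0f);
}
```

A mid-scale reading (count 2048) maps to roughly 50 kOhm, confirming the divider is centered on the sensor's nominal resistance.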
## Subsystem 3: Touch sensors

Components: Geekcreit FSR402 Force Sensitive Resistor

Three force-sensitive resistors will be attached to the tips of the thumb, index finger, and middle finger. Like the flex sensors, they will be wired to ADC inputs through voltage dividers (22 kOhm) to be read by the MCU. These sensors will be used for detecting pinching, tapping, and pressing.
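Once the FSR readings are in, distinguishing a light tap from a firm press can be as simple as two thresholds. The sketch below is illustrative only; `TOUCH_THRESHOLD` and `PRESS_THRESHOLD` are placeholder values that would need per-finger calibration on real hardware:

```c
#include <stdint.h>

/* Hypothetical thresholds (raw 12-bit ADC counts); real values would be
 * calibrated once the 22 kOhm dividers are on hardware. */
#define TOUCH_THRESHOLD  600
#define PRESS_THRESHOLD 2500

typedef enum { FSR_IDLE, FSR_LIGHT_TAP, FSR_FIRM_PRESS } fsr_event_t;

/* Classify one fingertip sample into the proposal's touch events. */
fsr_event_t classify_fsr(uint16_t adc_count)
{
    if (adc_count >= PRESS_THRESHOLD) return FSR_FIRM_PRESS;
    if (adc_count >= TOUCH_THRESHOLD) return FSR_LIGHT_TAP;
    return FSR_IDLE;
}
```

In practice some debouncing or hysteresis would sit on top of this so a press does not flicker through the tap state.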
## Subsystem 4: Microprocessor

Components: STM32F405 Microcontroller

This microcontroller takes as input all of the aforementioned sensor data and outputs USB HID reports. The processor itself has been chosen for its DSP capabilities, as processing sensor inputs and identifying them as gestures will constitute a considerable portion of this project. Attached to the PCB will be a USB port for connecting to a computer, over which identified gestures are sent as inputs to the computer.

This is also where most of our design decisions will be integrated. For example, the IMU is prone to drift, meaning we’ll have to make UX decisions that mitigate its influence, such as moving the mouse only while a finger is down on the desk.
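One way to sketch that clutch-style mitigation: integrate IMU acceleration into cursor motion only while a fingertip FSR reports desk contact, and clear the integrator otherwise so drift never accumulates into movement. All names and scale factors below are illustrative, not the final design:

```c
#include <stdbool.h>
#include <stdint.h>

/* Integrated velocity state for the cursor (arbitrary units). */
typedef struct {
    float vx, vy;
} cursor_state_t;

/* Update cursor state from accelerometer readings (ax, ay) over a
 * timestep dt. Motion is emitted only while finger_down is true;
 * otherwise the integrator is cleared so IMU drift cannot leak into
 * cursor movement. The 100.0f scale to HID mouse counts is a placeholder. */
void cursor_update(cursor_state_t *s, float ax, float ay,
                   bool finger_down, float dt,
                   int8_t *out_dx, int8_t *out_dy)
{
    if (!finger_down) {
        s->vx = s->vy = 0.0f;   /* clutch released: drop any drift */
        *out_dx = *out_dy = 0;
        return;
    }
    s->vx += ax * dt;
    s->vy += ay * dt;
    *out_dx = (int8_t)(s->vx * dt * 100.0f);
    *out_dy = (int8_t)(s->vy * dt * 100.0f);
}
```

The `out_dx`/`out_dy` deltas map naturally onto the relative X/Y fields of a standard HID mouse report.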
## Subsystem 5: Physical Frame

Another important aspect of the project will be the physical design itself. In order for our project to be even moderately successful, it has to be wearable. This presents the unique challenge of designing a glove that is both comfortable and can house the electronic components in a way that does not impede movement.
## Subsystem 6: Associated Software

This is not part of the actual project, but a testbed to demonstrate its capabilities. We will use Unreal Engine 5 to create a basic flight simulation that lets the user control the plane with the orientation of their hand.

For basic testing, we will also have a barebones program that receives gesture inputs and prints them to the screen when received over serial.
# Criterion for success
- Hand movements are able to reliably move a mouse on the attached device
- The following gestures/actions can be reliably detected and mirrored to the test program:
  - Hand closed
  - Hand open
  - Light tap (index/middle/thumb)
  - Firm press (index/middle/thumb)
  - Pinching fingers (index-thumb, middle-thumb)
  - Thumbs up
  - Thumbs down
- User can successfully navigate a plane in the testbed program through a basic course using hand orientation

# PROJECT TITLE: Bracelet Aid for deaf people/hard of hearing

# TEAM MEMBERS:

- Aarushi Biswas (abiswas7)

- Anit Kapoor (anityak3)

- Yash Gupta (yashg3)

# PROBLEM

We are constantly hearing sounds around us that notify us of events occurring, such as doorbells, fire alarms, phone calls, alarms, or vehicle horns. These sounds are not enough to catch the attention of a d/Deaf person and sometimes can be serious (emergency/fire alarms) and would require the instant attention of the person. In addition, there are several other small sounds produced by devices in our everyday lives such as washing machines, stoves, microwaves, ovens, etc. that cannot be identified by d/Deaf people unless they are observing these machines constantly.

Many people in the d/Deaf community combat some of these problems, such as the doorbell, by installing devices that cause the light in a room to flicker. However, these devices are generally not installed in every room and obviously cannot notify people who are asleep. Another common solution is purchasing devices like smartwatches that interact with mobile phones to deliver notifications. However, these smartwatches are usually expensive, do not fulfill all of their needs, and require nightly charging cycles that diminish their usefulness in the face of the aforementioned issues.

# SOLUTION

A low-cost bracelet aid with the ability to convert sounds into haptic feedback in the form of vibrations will give d/Deaf people the independence of recognizing notification sounds around them. The bracelet will recognize some of these sounds and create different vibration patterns to catch the attention of the wearer as well as inform them of the cause of the notification. Additionally, there will be a visual component to the bracelet in the form of an OLED display, which will provide visual cues in the form of emojis. The bracelet will also have buttons for stopping the vibration and for showing the battery level on the OLED.

For instance, when the doorbell rings, the bracelet will pick up the doorbell sound after filtering out any other unnecessary background noise. On recognizing the doorbell sound, the bracelet will vibrate with the pattern associated with the sound in question which might be something like alternating between strong vibrations and pauses. The OLED display will also additionally show a house emoji to denote that the house doorbell is ringing.

# SOLUTION COMPONENTS

Based on this solution we have identified that we need the following components:

- INMP441 (Microphone Component)

- Brushed ERM (Vibration Motor)

- Powerboost 1000 (Power subsystem)

- 1000 mAh LiPo battery x 2 (hot swappable)

- SSD1306 (OLED display)

## SUBSYSTEM 1 → SOUND DETECTION SUBSYSTEM

This subsystem will consist of a microphone and will be responsible for picking up sounds from the environment and conducting a real-time FFT on them. After this, we will filter out lower frequencies and use a frequency-matching algorithm to infer whether a pre-programmed sound was picked up by the microphone. This inference will be output to the main control unit in real time.

## SUBSYSTEM 2 → VIBRATION SUBSYSTEM

This subsystem will be responsible for vibrating the bracelet on the wearer’s wrist. Using the vibration motor mentioned above, we should have a frequency range of 30 Hz to 500 Hz, which should allow for the generation of a variety of distinguishable patterns. This subsystem will be responsible for the generation of the patterns and control of the motor, as well as prompting the Display subsystem to visualize the type of notification detected.
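The distinguishable patterns could be encoded as small on/off step tables that the firmware loops until the user dismisses the alert. The "doorbell" timings below are placeholders mirroring the strong-vibration/pause alternation described in the solution, not final values:

```c
#include <stdbool.h>
#include <stdint.h>

/* One step of a vibration pattern: motor on or off for a duration. */
typedef struct { bool on; uint16_t ms; } vib_step_t;

/* Hypothetical "doorbell" pattern: two strong buzzes, then a pause. */
static const vib_step_t doorbell_pattern[] = {
    { true, 400 }, { false, 200 },
    { true, 400 }, { false, 800 },
};

/* Return the motor state at time t_ms into the (looping) pattern. */
bool pattern_state(const vib_step_t *p, int steps, uint32_t t_ms)
{
    uint32_t total = 0;
    for (int i = 0; i < steps; i++) total += p[i].ms;
    t_ms %= total;
    for (int i = 0; i < steps; i++) {
        if (t_ms < p[i].ms) return p[i].on;
        t_ms -= p[i].ms;
    }
    return false;
}
```

A timer interrupt polling `pattern_state` would then drive the ERM motor's enable pin, with distinct tables for each notification type.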

## SUBSYSTEM 3 → DISPLAY SUBSYSTEM

The Display subsystem will act as a set of visual cues in addition to the vibrations, as well as a visual feedback system for user interactions. This system should not draw a lot of power as it will be active only when prompted by user interaction or by a recognized sound. Both of these scenarios are relatively uncommon over the course of a day, which means that the average power draw for our device should still remain low.

## SUBSYSTEM 4 → USER INTERACTION SUBSYSTEM

This subsystem is responsible for the interaction of the user with the bracelet. This subsystem will include a set of buttons for tasks such as checking the charge left on the battery or turning off a notification. Checking the charge will also display the charge on the OLED display thus interacting and controlling the display subsystem as well.

## SUBSYSTEM 5 → POWER SUBSYSTEM

This subsystem is responsible for powering the device. One of our success criteria is long battery life and low downtime. In order to achieve this we will be using a power boost circuit in conjunction with two rechargeable 1000 mAh batteries. While one is charging, the other can be used, so the user doesn’t have to go without the device for more than a few seconds at a time. We are expecting our device to draw anywhere from 20-50 mA, which would give an effective use time of more than a day. The power boost circuit and the LiPo batteries’ JST connectors allow for secure and quick battery swaps as well.

# CRITERION FOR SUCCESS

- The bracelet should accurately identify only the crucial sounds in the wearer’s environment, with each type of sound having a fixed, unique vibration + OLED pattern associated with it

- The vibration patterns should be distinctly recognizable by the wearer

- Should be relatively low cost

- Should have prolonged battery life (so the power should focus on only the use case of converting sound to vibration)

- Should have a small profile and a sleek form factor
