39: Hand Gesture Controlled Audio Effects System

TA: Zicheng Ma
Team Members:
Sarthak Singh (singh94)
Zachary Baum (zbaum2)
Sergio Bernal (sergiob2)

Problem
Audio production, in both amateur and professional settings, lacks intuitive, hands-free control over audio effects. This limitation restricts users' creativity and efficiency, particularly in live performance or in situations where physical interaction with equipment is difficult.

Solution Overview
Our project aims to develop a gesture-controlled audio effects processor. This device will allow users to manipulate audio effects through hand gestures, providing a more dynamic and expressive means of audio control. The device will use a camera-based gesture detection system to recognize hand movements, which will then adjust various audio effect parameters in real time.

Solution Components:

Gesture Detection Subsystem:
The Gesture Detection Subsystem uses a camera to track hand movements and orientations. The camera will be connected to a Raspberry Pi, which processes the video in real time, minimizes latency, filters out inaccuracies, and sends control signals to our custom PCB. Users can customize gesture-to-effect mappings, allowing for personalized control schemes. This subsystem is integrated with the Audio Processing Subsystem, ensuring that gestures are seamlessly translated into the desired audio effect changes.
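As a rough illustration of the Raspberry Pi side, the sketch below assumes MediaPipe Hands and OpenCV for hand tracking and pyserial for the UART link to the custom PCB; the wrist-height-to-intensity mapping, the serial port, and the two-byte message format are placeholders rather than our final design.

```python
# Minimal gesture-tracking sketch (Raspberry Pi side).
# Assumes: MediaPipe Hands + OpenCV for landmark detection, pyserial for the
# UART link to the custom PCB. The mapping (wrist height -> 0-127 intensity)
# stands in for the user-configurable gesture-to-effect table.
import cv2
import mediapipe as mp
import serial

ser = serial.Serial("/dev/serial0", 115200, timeout=0.1)  # assumed UART port to the PCB
hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
cap = cv2.VideoCapture(0)

try:
    while True:
        ok, frame = cap.read()
        if not ok:
            continue
        # MediaPipe expects RGB; OpenCV captures BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            wrist = results.multi_hand_landmarks[0].landmark[0]  # wrist landmark
            # Map normalized vertical position (0 = top of frame) to 0-127.
            intensity = int((1.0 - wrist.y) * 127)
            intensity = max(0, min(127, intensity))
            # Placeholder 2-byte frame: parameter ID, value.
            ser.write(bytes([0x01, intensity]))
finally:
    cap.release()
    ser.close()
```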


Audio Processing Subsystem:

The Audio Processing Subsystem uses DSP algorithms to modify audio signals in real time. It includes audio effects such as reverb and delay, whose parameters change based on the hand gestures detected by the Gesture Detection Subsystem. Users can easily customize these effects, and because the DSP responds directly to the gesture data, the effects can be controlled entirely through hand movements. Specifically, we are using an STM32 microcontroller on a custom PCB to handle this subsystem.
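The production firmware will run in C on the STM32; purely to illustrate the kind of delay-line algorithm a gesture value would modulate, here is a NumPy sketch in which `delay_ms`, `feedback`, and `mix` stand in for gesture-controlled parameters.

```python
# Illustrative feedback-delay algorithm (the STM32 firmware would implement the
# same structure in C on fixed-size sample blocks). "mix" and "feedback" are
# the kind of parameters a value from the Gesture Detection Subsystem would set.
import numpy as np

def delay_effect(x, sample_rate=48000, delay_ms=350, feedback=0.4, mix=0.5):
    """Feedback delay for a 1-D NumPy array of samples x:
    y[n] = (1 - mix) * x[n] + mix * delayed, with the delayed signal
    fed back into the delay line scaled by `feedback`."""
    delay_samples = int(sample_rate * delay_ms / 1000)
    buf = np.zeros(delay_samples)   # circular delay line
    y = np.empty_like(x)
    idx = 0
    for n, sample in enumerate(x):
        delayed = buf[idx]
        y[n] = (1.0 - mix) * sample + mix * delayed
        buf[idx] = sample + feedback * delayed
        idx = (idx + 1) % delay_samples
    return y
```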

Control Interface Subsystem:
The Control Interface Subsystem in our audio effects processor provides a user-friendly interface for displaying current audio effect settings and other relevant information. This subsystem includes a compact screen that shows the active audio effects, their parameters, and the intensity levels set by the gesture controls. It is designed for clarity and ease of use, ensuring that users can quickly glance at the interface to get the necessary information during live performances or studio sessions.
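As a sketch of how the screen content could be laid out (the display driver itself is omitted, and the 20-character line width is only an assumption), the helper below formats each active effect, its parameter, and a small intensity bar into one line:

```python
# Sketch of the status-screen layout; a character LCD or small OLED library
# would render the returned lines. The 20-character width is an assumption.
def format_status_lines(effects, width=20):
    """effects: list of (effect_name, param_name, value_0_to_127) tuples."""
    lines = []
    for name, param, value in effects:
        bar_len = round(value / 127 * 8)            # 8-character intensity bar
        bar = "#" * bar_len + "-" * (8 - bar_len)
        lines.append(f"{name[:6]:<6} {param[:4]:<4} {bar}"[:width])
    return lines

print(format_status_lines([("Reverb", "size", 96), ("Delay", "time", 40)]))
```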

Power Subsystem:

The Power Subsystem for our audio effects processor is simple and direct. It plugs into a standard AC power outlet and includes a power supply unit that converts AC to the DC voltages needed for the processor, sensors, and control interface. This design ensures steady and reliable power, suitable for long use periods, without the need for batteries.

Criterion for Success:
Our solution will enable users to intuitively control multiple audio effects in real time through gestures. The device will be responsive, accurate, and capable of differentiating between a wide range of gestures. It will be compatible with a variety of audio equipment and settings, from studio to live performance.

Alternatives:

Existing solutions are predominantly foot-pedal or knob-based controllers. These are limiting in terms of the range of expression and require physical contact. Our gesture-based solution offers a more versatile and engaging approach, allowing for a broader range of expression and interaction with audio effects.

STRE&M: Automated Urinalysis (Pitched Project)

Team Members:

- Gage Gulley (ggulley2)

- Adrian Jimenez (adrianj2)

- Yichi Zhang (yichi7)

The STRE&M: Automated Urinalysis project was pitched by Mukul Govande and Ryan Monjazeb in conjunction with the Carle Illinois College of Medicine.

# Problem

Urine tests are critical tools used in medicine to detect and manage chronic diseases. These tests often span 24 hours and require a patient to collect their own sample and return it to a lab. Because of this inconvenience, many patients do not get tested often, which makes it difficult for care providers to catch illnesses quickly.

The tedious process of going to a lab for urinalysis creates a demand for an “all-in-one” automated system capable of performing this urinalysis, and this is where the STRE&M device comes in. The current prototype is capable of collecting a sample and pushing it to a viewing window. However, once the sample reaches the viewing window, there is currently no automated way to analyze it other than manually looking through a microscope, which greatly reduces throughput. Our challenge is to find a way to automate data collection from a sample and provide an interface for a medical professional to view the results.

# Solution

Our solution is to build an imaging system with integrated microscopy and absorption spectroscopy that is capable of transferring the captured images to a server. When a sample is collected by the existing prototype, our device will magnify and image it and use an absorbance sensor to identify and quantify the casts, bacteria, and cells in the sample. These images will then be uploaded to a server for analysis. We will then integrate our device into the existing prototype.

# Solution Components

## Subsystem 1 (Light Source)

We will use a light source whose wavelength can be varied from 190 nm to 400 nm in 5 nm steps to allow for spectroscopic analysis of the urine sample.

## Subsystem 2 (Digital Microscope)

This subsystem will consist of a compact microscope with auto-focus, at least 100x magnification, and a digital shutter trigger.

## Subsystem 3 (Absorbance Sensor)

To perform the spectroscopy analysis, we also need an absorbance sensor to collect the light that passes through the urine sample. The sensor is therefore installed opposite the light source, on the far side of the sample, to capture the sample's transmission spectrum.
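To illustrate how Subsystems 1 and 3 work together, the sketch below steps the light source from 190 nm to 400 nm in 5 nm increments and computes absorbance with the Beer-Lambert relation A = log10(I0/I); `set_wavelength` and `read_intensity` are placeholder driver hooks, not actual hardware APIs.

```python
# Sketch of the wavelength sweep tying the light source and absorbance sensor
# together. set_wavelength()/read_intensity() are placeholders for the real
# drivers; reference_intensity holds a blank (no-sample) baseline, keyed by
# wavelength, measured the same way beforehand.
import math

def absorbance_sweep(set_wavelength, read_intensity, reference_intensity,
                     start_nm=190, stop_nm=400, step_nm=5):
    spectrum = {}
    for wl in range(start_nm, stop_nm + 1, step_nm):
        set_wavelength(wl)                  # command the tunable light source
        i_sample = read_intensity()         # transmitted light through the sample
        i_blank = reference_intensity[wl]   # baseline intensity at this wavelength
        # Beer-Lambert absorbance: A = log10(I0 / I)
        spectrum[wl] = math.log10(i_blank / i_sample)
    return spectrum
```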

## Subsystem 4 (Control Unit)

The control system will consist of a microcontroller that reads data from the microscope and the absorbance sensor and sends it to the server. We will also write code for the microcontroller to control the light source. An ESP32-S3-WROOM-1 will be used as our microcontroller since it has a built-in Wi-Fi module.
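The firmware will most likely be written in C with ESP-IDF or Arduino; the MicroPython-style sketch below is only meant to illustrate the Wi-Fi upload path, and the credentials and server endpoints are placeholders.

```python
# MicroPython-style sketch of the ESP32-S3 upload path. SSID, password, and
# the server URLs are placeholders; the production firmware may be C instead.
import network
import urequests

def connect_wifi(ssid, password):
    # Bring up the station interface and block until associated.
    wlan = network.WLAN(network.STA_IF)
    wlan.active(True)
    wlan.connect(ssid, password)
    while not wlan.isconnected():
        pass
    return wlan

def upload_capture(image_bytes, spectrum):
    # Send the absorbance spectrum as JSON, then the microscope image as raw bytes.
    resp = urequests.post(
        "http://example-server.local/api/sample",        # placeholder endpoint
        json={"spectrum": spectrum},
    )
    resp.close()
    resp = urequests.post(
        "http://example-server.local/api/sample/image",  # placeholder endpoint
        data=image_bytes,
        headers={"Content-Type": "image/jpeg"},
    )
    resp.close()
```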

## Subsystem 5 (Power System)

The power system mainly supplies the microcontroller; a 9 V battery will be used for this purpose.

# Criterion For Success

- The overall project can be integrated into the existing STRE&M prototype.

- There should be wireless transfer of images and data to a user interface (either phone or computer) for interpretation

- The system should be housed in a water-resistant covering with dimensions less than 6 x 4 x 4 inches
