# Projects
# | Title | Team Members | TA | Professor | Documents | Sponsor |
---|---|---|---|---|---|---|
1 | Ant-weight, 3D Printed Battlebot | Justin Leong, Yuxuan Nan, Zilong Jiang | Haocheng Bill Yang | Viktor Gruev | proposal1.pdf | |
# Ant-weight Battlebot - Scooper

# Group members

- Yuxuan Nan (yuxuann2)
- Justin Leong (jyleong2)
- Zilong Jiang (zjian4)

# Problem

The task at hand is an ant-weight, 3D-printed battlebot competition that each team wants to win. Battlebot designs must satisfy certain constraints to be eligible. To win, a battlebot must outlast or destroy the opposing team's battlebot. Because different teams will field different designs, a winning battlebot must account for as many factors as possible to withstand and outlast the competition.

# Solution

We decided to develop a 3D-printed, Bluetooth-controlled battlebot powered by an STM32 microcontroller. The battlebot communicates with a PC via a Bluetooth module, enabling wireless command and control. It is equipped with two DC motors driven by H-bridge circuits for precise movement and a ramp-shaped weapon system for engaging in battles. The STM32 manages motor control using GPIO and PWM, while the weapon system is activated over GPIO or I2C. The bot integrates real-time communication, robust motor control, and weapon functionality, offering an engaging and functional design. Our goal is to create a responsive and competitive robot for dynamic competitions.

# Solution Components

## Drivetrain

The drivetrain of our battlebot will use two high-torque DC motors to power the two rear wheels, making this a rear-wheel-drive robot. We have selected a motor that outputs 0.58 kgf·cm of torque with a maximum rotational speed of 200 RPM. The rear wheels will be high-friction wheels that provide plenty of traction to handle the speed. If the weight budget allows, we will add an omni-directional front wheel for easy directional movement. Finally, an H-bridge circuit will provide control, stability, and power for the battlebot.
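The H-bridge drive control described above can be sketched as a sign-magnitude PWM mapping. This is only an illustration of the technique (the real firmware would run on the STM32); the function name and the convention of holding one input low while the other carries PWM are our own assumptions, not details from the proposal.

```python
def h_bridge_drive(speed):
    """Map a signed speed in [-1.0, 1.0] to one H-bridge channel.

    Returns (in1_duty, in2_duty): PWM duty cycles in [0.0, 1.0] for the
    two inputs of the bridge. Sign-magnitude drive: one input carries
    the PWM while the other is held low; reversing swaps the roles.
    """
    if not -1.0 <= speed <= 1.0:
        raise ValueError("speed must be in [-1.0, 1.0]")
    if speed >= 0:
        return (speed, 0.0)   # forward: IN1 = PWM, IN2 = low
    return (0.0, -speed)      # reverse: IN1 = low, IN2 = PWM
```

For example, half speed forward gives `(0.5, 0.0)` and 30% reverse gives `(0.0, 0.3)`; a second call with the other motor's speed drives the second bridge channel.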
## Weapon and Chassis

The weapon and chassis must be 3D printed from the list of approved plastic types. Ideally we would use ABS plastic, as it would be the best in terms of weight and durability, but we have opted for PLA+ because it is easier to print. PLA+ also offers decent strength and is not as brittle as the other approved plastics. For the weapon, we will create a skid-bucket design that can potentially be moved up and down pneumatically to attack other battlebots (both ramming and slamming them). For the chassis, we have decided on a box-shaped, rear-wheel-drive body with two motors and an omni-directional front wheel to make steering easy.

## Power System

The power system module includes regulators and converters for the various electronics, the battery, and a battery monitor with a switch/relay. For the battery, we have decided on a 12 V NiMH pack, as it can supply consistent and reliable power. We will also include short-circuit detection that disconnects the battery for safe operation.

## Control System

The control system will be powered by an STM32 microcontroller, which will manage motor control and wireless communication. Using a Bluetooth module, the bot will communicate with a PC, enabling real-time control with minimal latency. The STM32 will use GPIO and PWM to provide precise speed and directional control for the motors, ensuring responsiveness and accuracy during battles.

# Criterion For Success

Our battlebot will be considered successful if we can control it reasonably well via Bluetooth. This includes driving both motors separately for steering as well as being able to apply a different amount of throttle to each motor. Additionally, we would like our weapon to work well enough to be considered functional during a match.
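The per-motor throttle behavior in the success criteria can be sketched as arcade-style mixing: a forward throttle and a steering command are combined into separate left/right motor throttles. The function name and the clamping scheme are our own illustrative choices; the proposal does not specify a mixing algorithm.

```python
def mix_throttle(throttle, steering):
    """Arcade-style mixing: combine a forward throttle and a steering
    command (both in [-1.0, 1.0]) into independent left/right motor
    throttles, each clamped to [-1.0, 1.0]."""
    left = max(-1.0, min(1.0, throttle + steering))
    right = max(-1.0, min(1.0, throttle - steering))
    return left, right

# Straight ahead: both motors equal. Full-rate turn with no forward
# throttle: motors equal and opposite, spinning the bot in place.
ahead = mix_throttle(0.8, 0.0)
spin = mix_throttle(0.0, 1.0)
```

Each returned throttle can then be fed to its own H-bridge channel, satisfying the "different amounts of throttle per motor" criterion.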
2 | Antweight Battlebot | Gauthami Yenne, Jingyu Kang, Nandika Vuyyuri | Haocheng Bill Yang | Viktor Gruev | proposal1.pdf | |
# Antweight Battlebot

Nandika Vuyyuri (vuyyuri2) \ Gauthami Yenne (gyenne2) \ Jingyu Kang (jingyuk2)

# Problem

The goal of this project is to create an antweight battlebot weighing less than 2 lbs in order to participate in the Antweight Battlebot Competition. The given criteria are that all robots must have clearly visible and controlled mobility; must be controlled via either Bluetooth or WiFi using a microcontroller, with a manual override for disconnection; and must have a rotational blade that makes contact 5 inches above ground level and can come to a complete stop within 60 seconds.

# Solution

The battlebot will be mounted with a tombstone attacking mechanism in order to disable the opponent's vehicle.

# Solution Components

## Power System

We need a maximum of 16 V for the motor that moves the robot, so we plan to use a Thunder Power 325 mAh 3S battery (THP 325-3SR70J), which weighs 35 g and is the lightest battery we could find that met our requirements. Other battery options weighed about 65 g to 105 g, which would be too heavy given that the weight limit for the entire battlebot is about 900 g. Another option is to use flat lithium batteries, since they are significantly lighter than regular batteries. However, most flat lithium cells cannot deliver significant power at a single instant; they are designed to be long-lasting rather than high-drain, so the power would not be sufficient for the battlebot to move and perform the required tasks.

## MCU

The ESP32-C3 (ESP32-C3-DevKitM-1), known for its low power consumption, will be used for the connection between the battlebot and the controller, utilizing its built-in WiFi and Bluetooth. We will program the ESP32-C3 with the Arduino IDE to control the robot's mobility and attacking mechanism. We have access to debugging and flashing tools that are compatible with the ESP32-C3 MCU.

## Attacking Mechanism

We plan to use the Emax RS2205 2600KV motor, which weighs 30 g. This motor has a high RPM and is commonly used for drones, which we hope will make it a powerful attacking mechanism.

## Robot Mobility

To maneuver the battlebot we will use a dual H-bridge configuration based on the DRV8833 motor driver paired with high-torque Pololu Micro Metal Gearmotors, and integrate these parts with the ESP32-C3-DevKitM-1.

## Materials

We plan to use a mixture of lightweight PET-G, ABS, and PLA+ materials, chosen primarily because they are durable, flexible, and heat-resistant, which is ideal for the nature of battlebots. Furthermore, since the majority of the parts will be 3D printed, we expect ABS or PEEK filament, which is widely used in 3D printers, to be ideal.

# Criterion For Success

Our high-level goal is to maneuver the robot away from the opponent with precision and control. Another goal is a horizontal spinning attack mechanism that is powerful enough to knock out robots of other shapes: it should not just 'flick' the other robot but make a significant impact that disables the opponent's robot.
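As a sanity check on the roughly 900 g limit discussed above, the parts list can be tracked as a simple weight budget. The battery (35 g) and weapon motor (30 g) masses come from this proposal; every other entry below is an illustrative placeholder, not a measured value.

```python
# Hypothetical weight budget check against the ~900 g antweight limit.
WEIGHT_LIMIT_G = 900

def check_weight_budget(parts):
    """Return (total_grams, margin_grams); a negative margin means
    the robot is over the competition weight limit."""
    total = sum(parts.values())
    return total, WEIGHT_LIMIT_G - total

parts = {
    "battery_thp_325_3s": 35,   # from the proposal
    "weapon_motor_rs2205": 30,  # from the proposal
    "drive_motors_x2": 40,      # placeholder
    "esp32_c3_and_pcb": 25,     # placeholder
    "chassis_and_armor": 350,   # placeholder
    "weapon_blade": 120,        # placeholder
}
total, margin = check_weight_budget(parts)
```

With these placeholder figures the build totals 600 g, leaving 300 g of margin for wiring, fasteners, and wheels.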
3 | CCD image sensor board for film camera retrofit | Ayush Garg, Ethan Greenwald, Jason Guo | Haocheng Bill Yang | Viktor Gruev | proposal1.pdf | |
# CCD Resurrection - modernizing film cameras to digital with salvaged image sensors

Link to idea: [https://courses.grainger.illinois.edu/ece445/pace/view-topic.asp?id=76215](https://courses.grainger.illinois.edu/ece445/pace/view-topic.asp?id=76215)

Team members:

- Jason Guo (jkguo2)
- Ayush Garg (ayushg6)
- Ethan Greenwald (ethanmg4)

# Problem

Several current factors create the need for a cost-effective way to convert film cameras to digital using CCD sensors. The sudden explosion of demand for old CCD-sensor-equipped cameras amidst the digicam trend is mismatched against a supply of 10-20 year old cameras that are increasingly unreliable and outdated. High failure rates of the cameras themselves and of their old, scarce accessories (proprietary batteries and storage cards) are decimating the supply of working examples and creating a glut of broken cameras, which present opportunities for salvage. Remanufacturing old sensors (which are rarely the point of failure) into modernized cameras therefore addresses this demand.

# Solution

Our proposal is to create a PCB that accepts commonly available salvaged CCD sensors and drops them into an advanced film camera. This would satisfy the rising demand for CCD-equipped cameras. Combining salvaged CCD sensors with modern microcontrollers and components keeps BOM cost low and reliability high, and the plentiful supply of advanced film cameras will ensure that this conversion is practical. The rising price of film intensifies this glut of technologically advanced, usable film cameras that are cheap to purchase in working condition but too expensive to operate (akin to an inkjet printer). In practice, our PCB and the resulting conversion will emulate the Kodak DCS460 from 1995, but modernized to 2025 and created for different reasons.

The PCB will contain the Sony ICX-453 CCD sensor; the accompanying power supply, driving, and A->D conversion circuitry; an STM microcontroller with SDRAM buffer; an SD card slot; a Li-Ion BMS for replaceable 2S 18650 cylindrical cells; and buttons and 7-segment displays for the user interface. The PCB will be installed in a 3D-printed enclosure that holds the batteries and interfaces with the Nikon N90s camera ("Host Camera"), and the enclosed PCB ("Module") will synchronize with the Host Camera through its Nikon 10-pin serial interface.

# Subsystem 1: CCD sensor, driving circuitry, and A->D

Subsystem 1 includes one each of: a Sony ICX-453 CCD sensor salvaged from old cameras, a CXD1267 CCD vertical clock driver, an EL7457 horizontal clock driver, an AD811 low-noise op-amp, and an AD9826 16-bit A->D converter. This subsystem will largely borrow its schematic from the open-source CAM86 astrophotography camera project, since driving the CCD sensor is the single most error-prone part of our project and a critical success criterion with no substitute.

# Subsystem 2: Power supply system

The ICX453 CCD sensor requires +15 V, -8 V, and 6 V rails and may draw moderate current at times due to its large capacitances. Additionally, the supporting circuitry in Subsystem 1, Subsystem 3 (microcontroller, SD card, and UI), and Subsystem 4 (BMS) requires 5 V and 3.3 V rails. This totals five discrete voltages in our project, powered by 2S Li-Ion battery cells (a source that varies between 6.0 and 8.4 V).

The power supply architecture primarily targets low noise, and secondarily a small footprint and good power efficiency. Low noise is necessary because Subsystem 1 is partially analog and generates digital picture data, so best picture quality requires careful power-rail noise suppression. Footprint and efficiency considerations are driven by the desire to minimize Size, Weight, and Power (SWaP) on a hand-held, battery-powered device.

Correspondingly, we are considering an architecture that uses DC->DC conversion followed by low-noise LDO regulators for all rails used by Subsystem 1 (+15 V, -8 V, 6 V, and an independent 5 V), and DC->DC conversion alone for the remaining 5 V and 3.3 V rails to maximize power savings. Where possible, we will use compact, efficient, and highly integrated DC->DC converters such as TI 'MicroSIP' modules. We may also add MOSFETs to selectively gate power to these rails, enabling MCU-controlled power-saving functionality.

# Subsystem 3: MCU, camera I/O, SD card, user interface

Subsystem 3 will consist of an STM32H7 MCU, SDRAM buffer, MCU programmer circuit, SD card slot, user-accessible buttons, and a 7-segment or dot-matrix display. The MCU will be responsible for the following:

- Controlling Subsystem 1 and receiving its image data
- Synchronizing with the Host Camera through the camera's Nikon 10-pin interface (including a trigger signal and a serial interface for querying the camera's state and configuration)
- Buffering images to SDRAM to enable rapid shooting
- Processing and saving images to the SD card over SDIO, maximizing readout speed to flush the buffer as quickly as possible
- Accepting user inputs via button press and reacting accordingly, including configuration changes and a delete button that erases the most recently saved file. This includes the user-accessible power button, so that a power-off does not cause data loss.
- Driving the 7-segment or dot-matrix display to show the module's state to the user

Since these tasks require high-speed I/O and significant processing power, the MCU must have native SDIO capability and a JPEG engine to enable real-time JPEG compression while minimizing cost. For these reasons, the STM32H7 family was selected.
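The SDRAM-buffering and SDIO-throughput requirements above can be gauged with a rough sizing sketch. The frame geometry and buffer capacity here are our own assumptions, not figures from the proposal: a 6-megapixel sensor taken as 3000 x 2000 pixels, 16-bit samples (matching the AD9826's resolution), and a hypothetical 32 MB SDRAM part.

```python
# Rough sizing sketch for the SDRAM frame buffer and SD-card write rate.
PIXELS = 3000 * 2000          # ~6 megapixels (assumed geometry)
BYTES_PER_PIXEL = 2           # 16-bit A->D samples
FRAME_BYTES = PIXELS * BYTES_PER_PIXEL

SDRAM_BYTES = 32 * 1024 * 1024             # hypothetical buffer size
frames_buffered = SDRAM_BYTES // FRAME_BYTES

# To sustain 1 picture/s with no data loss, the SD card must absorb
# one uncompressed RAW frame per second on average:
required_mb_per_s = FRAME_BYTES / 1e6
```

Under these assumptions each RAW frame is about 12 MB, a 32 MB SDRAM holds two frames of burst headroom, and the sustained SD write rate must average roughly 12 MB/s, which motivates the native SDIO requirement.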
While JPEG compression and the preceding Bayer conversion are not part of the success criteria, we will target this functionality since it would significantly enhance the camera's practical usability. As Subsystem 1 is largely based on the CAM86 project, some CAM86 code may also be referenced or borrowed to form the interface between Subsystem 3 and Subsystem 1. Any borrowed code will require substantial revision, both because this project integrates substantially more functionality and because it uses an STM MCU as opposed to CAM86's AVR MCU.

# Subsystem 4: Battery Management System

Subsystem 4 will consist of an integrated BMS and a USB-C charging port for the 2S 18650 Li-Ion power source, as well as a power kill switch for use during debug and assembly. This subsystem integrates primarily with Subsystem 2 (power regulation), but also with Subsystem 3 (MCU) to enable a user-visible battery level readout and to ensure that data is not lost during an over-discharge scenario. The BMS will provide battery level readout, short-circuit and overcurrent protection, and overcharge and overdischarge protection for the sensitive and dangerous Li-Ion battery cells. The BMS should be able to charge the battery cells from a USB-C port used exclusively for charging. We may incorporate USB-C fast charging, but it will not be part of this project's success criteria.

# Criterion for Success

In short, the completed project should resemble the Kodak DCS460 from 1995.

Technical specification:

- The completed module will connect to the Nikon N90s camera and save 6-megapixel color images in uncompressed RAW format.
- The UI will consist of, at minimum, 3 buttons (Delete, Navigate, Select) and a 7-segment or dot-matrix display.
- The module will accept SDHC/SDXC cards.
- The N90s camera and module should be able to shoot at a rate of 1 picture per second, with no loss of data.
- The module will be powered by 2S 18650 Li-Ion batteries.

Form factor: The PCB containing the ICX-453 CCD sensor, all accompanying circuitry, and the MCU/BMS/SD card will be integrated into a 3D-printed module that contains the batteries (2S 18650s) and bolts onto (and may protrude from) the back of the host camera, replacing the film door. In form, this will resemble the [DCS410/420/460](https://upload.wikimedia.org/wikipedia/commons/9/9e/Digitalback_dcs420_02.jpg) but should be much smaller.

Usage:

- After installing the module on the camera and connecting the Nikon 10-pin interface cable, the N90s film SLR will behave like a digital SLR: each time the user presses the camera's shutter button, the camera's shutter will fire and the module will save the resulting color image to the SD card in RAW format.
- The RAW images saved to the SD card can then be imported into a computer and viewed or converted to JPEG as desired.
- The user can configure the module and delete previous images using the buttons, and the module status and battery charge will be visible on the 7-segment or dot-matrix UI, but there will NOT be an LCD for image review.

# Resources/Citations

Some schematics and code may be used from the CAM86 project. Since English-language resources are sparse and scattered, all of the following links describe the same project (or the same family of projects):

- https://www.cloudynights.com/topic/497530-diy-astro-ccd-16-bit-color-6mpx-camera/
- https://www.astroclub.kiev.ua/forum/index.php?topic=28929.0
- https://github.com/smr547/cam86
- https://github.com/axsdenied/cam86_fw
- https://github.com/vakulenko/CAM8_software/tree/master
- http://astroccd.org/2016/10/cam86/

Additionally, a resource for interfacing with the Nikon 10-pin interface:

- https://github.com/schoerg/nikonserial/tree/master

While the Sitina camera exists, we consider it more of a parallel development than a useful resource due to differences in sensor and architecture. In any case, we declare it here in case there is a coincidental commonality.

- https://gitlab.com/zephray/sitina1
4 | Coffee Bean Freshness Tracker (CO2ffee) | Abrar Murtaza, Joshua Meier, Nathan Colunga | Surya Vasanth | Arne Fliflet | proposal1.pdf | |
Team Members:

- Joshua Meier (joshua51)
- Abrar Murtaza (abraram2)
- Nathan Colunga (colunga4)

# Problem

Many coffee connoisseurs care about using freshly roasted beans, as fresh beans give the best coffee and depth of flavor! However, when you buy coffee from a roastery, you only get an estimated date by which to use the beans (typically within a month). This means those picky about their coffee quality don't know exactly when their beans are no longer considered "fresh". To solve this issue, we want to make a custom coffee container that detects how fresh the beans are and tells the user when to replace them. This way, the user can always make sure they are using fresh beans to get good coffee!

# Solution

We plan to create a container that tracks the amount of CO2 remaining in the coffee beans, as this correlates directly with bean freshness. The estimate is based on the weight of the beans that were added and the measured concentration of CO2 that builds up in the container over time. When beans are first added, the user indicates through a mobile interface that new beans were entered and specifies their type. The container then records the weight of the beans and the initial CO2 concentration, which should be approximately 420 ppm (atmospheric). Every hour thereafter, the container updates the current CO2 concentration and, using the container's calculated volume and the initial bean mass, computes the milligrams of CO2 released per gram of beans. By comparing this cumulative release against the typical CO2 content of the user-specified bean type, the system calculates the approximate percentage of CO2 remaining in the beans at any given time.
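The hourly calculation described above can be sketched as follows. The conversion assumes ideal-gas behavior at roughly room temperature (molar volume about 24.45 L/mol at 25 °C and 1 atm); the headspace volume, bean mass, ppm readings, and per-roast initial CO2 content in the example are illustrative placeholders, not values from the proposal or the cited study.

```python
CO2_MOLAR_MASS_G = 44.01   # g/mol
MOLAR_VOLUME_L = 24.45     # L/mol, ideal gas at ~25 C and 1 atm (assumption)

def co2_released_mg_per_g(ppm_now, ppm_initial, headspace_l, bean_mass_g):
    """Convert a rise in headspace CO2 concentration (ppm by volume)
    into milligrams of CO2 released per gram of beans."""
    delta_fraction = (ppm_now - ppm_initial) * 1e-6
    moles = delta_fraction * headspace_l / MOLAR_VOLUME_L
    return moles * CO2_MOLAR_MASS_G * 1000.0 / bean_mass_g

def percent_co2_remaining(released_mg_per_g, initial_mg_per_g):
    """Freshness estimate: share of the roast's initial CO2 content
    (taken from published data for the chosen roast) still in the beans."""
    return max(0.0, 100.0 * (1.0 - released_mg_per_g / initial_mg_per_g))

# Illustrative numbers: 2 L headspace, 500 g of beans, CO2 risen from
# 420 ppm to 5000 ppm, and a roast assumed to hold ~10 mg CO2 per gram.
released = co2_released_mg_per_g(5000, 420, 2.0, 500)
remaining = percent_co2_remaining(released, 10.0)
```

In practice the system would accumulate `released` across venting cycles, since the lid opens whenever the concentration nears the sensor's range limit.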
For reference, we plan to use the data collected in this study to determine the approximate CO2 content of each type of roast: https://pubs.acs.org/doi/10.1021/acs.jafc.7b03310. Note that we will assume 100% of the CO2 is still contained in the beans when they are placed in the container. This is reasonable because, in the expected use case, beans are added almost immediately after purchase, so the time since roasting is very short and the escaped CO2 is negligible.

# Solution Components

## Subsystem 1 - Peripheral Devices (i.e., Sensors, Inputs, and Motors)

There will be three main peripherals in this subsystem. The weight sensor determines the mass of beans in the container, which is used to calculate the carbon dioxide released per gram. The carbon dioxide sensor, which measures the CO2 concentration (ppm) in the closed container, is also used in the CO2-per-gram calculation. There will also be a button to open and close the container, along with a motor that rotates the lid on a hinge to open and close it. The motor will additionally be engaged whenever the concentration builds up too much, so that the CO2 sensor is not overwhelmed by the CO2 within the container. For the CO2 sensor, we will most likely use the Infineon PASCO2V15, as it supports I2C and has a fairly reasonable CO2 concentration range. The weight sensor will most likely be made up of a few load cells connected to the microcontroller via an HX711. For the motor, we plan to use the Miuzei waterproof servo motor because of its small size and price.

## Subsystem 2 - Microcontroller

This subsystem will most likely use an ESP32-WROOM-32E-H4 microcontroller, which offers wireless communication capabilities and I2C for the peripheral device connections. The microcontroller will read data from the CO2 and weight sensors to calculate the percentage of CO2 remaining in the coffee beans. Based on this measurement, data will be sent to the mobile interface via the wireless connection module as the freshness report for the coffee beans. The push-button input and motor output are also connected, so on each press the microcontroller can command the motor to open or close the lid, alternating.

## Subsystem 3 - Wireless Connection Module and Mobile Interface

The wireless connection module is built into the ESP32 microcontroller and allows data communication between the mobile interface and the microcontroller. On the mobile interface, there will be an input to indicate which type of coffee beans has been placed in the container. Based on this input, our system will estimate the initial amount of CO2 stored in the beans. Through the mobile interface, the freshness of the beans will be reported as the approximate percentage of CO2 remaining.

## Subsystem 4 - Power System

For power, we plan to use a couple of rechargeable batteries along with a power controller board. The power controller board will manage the power taken in from a USB port to recharge the batteries, and will also manage the output power to all sensors and the microcontroller. We will most likely use rechargeable AA batteries and design our own power board for this subsystem.

## Materials

Most of the body will be 3D printed in PLA, with the coffee itself contained in a removable metal bin. To create an air-tight seal on the container, the lid will be fitted with silicone edges to reduce gas escape.

# Criterion For Success

There are a few goals we will try to meet for this project:

- Create a wireless connection to a phone for the user interface
- Create a rechargeable battery system that correctly powers the rest of the system
- Produce a reasonably accurate measure of the CO2 lost by the beans while in the container, which can be cross-checked against manual weight measurements
- Adapt the calculation to the various bean roast types input by the user
5 | Mesh Network Positioning System | Noah Breit, Peter Giannetos | Michael Gamota | Michael Oelze | proposal1.pdf | |
### Team Members

- Peter Giannetos (PG19)
- Noah Breit (NHBREIT2)

# Abstract

Create a wireless positioning system of meshed stationary nodes that can track moving nodes over a predefined area at long ranges in excess of 1 km. One inspiration for this project comes from high-altitude amateur rocketry, where GNSS-only tracking systems are unable to maintain a lock at high velocities. However, the system is not limited to rocketry and can be extended to drone swarms or other general asset tracking. _([Initial Idea Post](https://courses.grainger.illinois.edu/ece445/pace/view-topic.asp?id=76218))_

# Background

Our engineering team, Spaceshot, from the [Illinois Space Society](https://www.illinoisspacesociety.org/) is working toward being one of the first collegiate teams to build and launch a completely student-designed vehicle 100 km to the edge of space, also known as the Kármán line. A big challenge in achieving this goal is reliably validating altitude, because many commercial GNSS systems cannot operate under those extreme conditions, which is where the inspiration for this project comes from. _(Spaceshot recently broke the University's 7-year-standing altitude record in June of 2024, and is looking to do so again in the summer of 2025. [Kairos II Launch](https://www.youtube.com/watch?v=6WY3OQx-jNs))_

# Objective

The goal of this system is to lay the foundation for an alternative, redundant positioning system that may one day help verify vehicle altitude over long ranges. The scope of this project will not focus on achieving those long ranges, although the link budget appears feasible. Instead, this project will focus on creating a proof of concept for lower-altitude vehicles and a general tracking system that can be used for many other applications beyond vehicle tracking.

## Other Potential Usages

- Drone tracking
- Warehouse asset & robotics tracking
- Car tracking

## Novelty Compared to GNSS

A large part of the novelty is that this system is not entirely reliant on GNSS satellites, meaning it can serve as a redundant backup for assets that require extra reliability. Other potential advantages over GNSS:

- Indoor and/or outdoor usage
- Different frequency band than GPS (2.4 GHz vs 1.57 GHz)*
- Faster update rate than typical consumer GPS (10+ Hz)
- High-velocity tracking**

*_This helps de-conflict with Iridium usage, which has been shown to sometimes interfere with GPS signals._

**_The system is most likely able to track higher velocities than consumer-grade GNSS, but for the purposes of a demo we won't be able to test that in class. However, we may be able to test-fly this system with our RSO in ~April._

# System Overview

## Key Points

- Anchor nodes have a stationary, predetermined location known via GNSS or other methods
- Rover nodes have an unknown, moving location and are the subjects of the tracking
- Anchor nodes with synchronized time use time division to sequentially ping rover nodes
- Distances derived from each anchor node's ToF data are used to calculate position
- 2.4 GHz LoRa modulation is used as the carrier signal, and the radio measures the "Time of Flight" between messages
- 915 MHz LoRa is used for command and control of the anchor nodes and to relay information across their mesh network
- Each node may also have WiFi/Bluetooth connectivity for relaying data to the user
- Each node has a battery charger circuit for charging LiPo batteries
- Anchor nodes have a DC input that can be used with solar panels or other power sources for extended operation

### Diagrams

- [Rover Node](https://drive.google.com/file/d/1s-36r-JjqxyTw7y8X974gufuxaq0_UOH/view?usp=sharing)
- [Anchor Node](https://drive.google.com/file/d/1r33J0ESABEdzihPbGCJ6SNQ7NYmtnM9_/view?usp=sharing)

### Schematics

- [Rover Node](https://drive.google.com/file/d/18-mt-91eqGyq5F5amYsRW0zK7k6PrgOG/view?usp=sharing)
- [Anchor Node](https://drive.google.com/file/d/1wgfl9TMk5EhXdlnkxCsD5ZM0D6RoE0kX/view?usp=sharing)

### Layouts

- [Rover Node Front](https://drive.google.com/file/d/177LjG0lPpOCkkIHr40mCEJd_GgLVCnwK/view?usp=sharing)
- [Rover Node Back](https://drive.google.com/file/d/1zS2saVFtNqWhWLpZFaUtUVLBXwYDhPUr/view?usp=sharing)
- [Anchor Node Front](https://drive.google.com/file/d/1umoRkIO3XNXMvPkvjQ44fK_oXh8rU63C/view?usp=sharing)
- [Anchor Node Back](https://drive.google.com/file/d/1g0U6Ht-ASRbU-8QZHsjqA0HMhdr1DFMh/view?usp=sharing)

_(All files have been shared with @illinois.edu emails)_

# High Level Success Requirements

- Perform 3D trilateration of a rover node
- Read and stream barometer, GNSS, and other data to another node for data logging
- Publish live data to a local WiFi network

# Final Demo Idea

Playing catch with a tennis ball alongside a 3D plot for position visualization, or walking around a field with the rover node, which then displays its position on a phone/computer.

## Intermediary Objectives

- PCB bring-up (validate that all subsystems work separately)
- Perform 1D trilateration of a rover node
- Perform 2D trilateration of a rover node
- Calibrate ranging radios

## Side Objectives

_(For fun & only if we have time)_

- Create an antenna tracker connected to the mesh network to track the moving object. (A good proof of concept for using high-gain antennas to reach 100 km, or for using cameras to record an asset.) ([Adafruit Pan/Tilt Kit](https://www.adafruit.com/product/1967))
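The ToF-ranging and trilateration steps described in the key points can be sketched as follows. This is a 2D version matching the intermediary objective; the function names, the round-trip timing convention, and the exact-distance test values are our own illustration, not the firmware's actual algorithm.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def tof_to_distance(round_trip_s):
    """Convert a measured round-trip time of flight into a one-way
    distance (the signal covers the anchor-rover path twice)."""
    return C * round_trip_s / 2.0

def trilaterate_2d(anchors, dists):
    """Locate a rover from three anchors at known (x, y) positions and
    their measured distances: subtracting the circle equations pairwise
    yields a 2x2 linear system, solved here with Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # zero when the anchors are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Rover at (3, 4) with anchors on a 10 m grid (exact distances):
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dists = [5.0, math.sqrt(65.0), math.sqrt(45.0)]
x, y = trilaterate_2d(anchors, dists)
```

Extending to 3D adds a fourth anchor and a third linear equation; with noisy real-world ranges, a least-squares solve over all anchors would replace the exact 2x2 solution.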
6 | Makeup Color Matcher | Ashley Herce, Shriya Surti, Waidat Bada | John Li | Yang Zhao | proposal1.pdf | |
# **Team Members**

- Ashley Herce (aherce2)
- Shriya Surti (ssurt2)
- Waidat Bada (wbada2)

# **Problem**

The beauty and skincare industry often struggles to accurately analyze human skin tones and match them to appropriate color products, due to variations in lighting, skin conditions, and subjectivity. This can result in mismatched cosmetics, dissatisfied customers, and wasted resources.

# **Solution Overview**

We propose a colorimeter device that uses a Raspberry Pi as the microcontroller and incorporates a color sensor to analyze skin tones. The device will employ RGB, HSV, and YCbCr color models to detect and quantify skin tones and match them to corresponding colors in a predefined database. By reducing external factors such as lighting and accounting for the distribution of melanin within the skin, we can more accurately measure skin color as an exact RGB value from the color sensor. Additionally, we plan to incorporate a moisture sensor to analyze the skin's moisture level. Skin moisture can significantly affect how products perform on the skin, and by collecting this data we can refine product recommendations for foundations, serums, and moisturizers tailored to the individual's skin characteristics. The device will include a display to provide real-time feedback, color-matching suggestions, and skincare recommendations. We aim to collect a wide variety of foundations, skin tints, and serums across brands and price ranges, to make the device accessible to all users' needs.

# **Solution Components**

## **Sensor Subsystem**

To determine a foundation shade and skincare recommendations for a specific person, we need to measure both **color composition** and **skin moisture level** with high accuracy.

### **Color Sensors**

These sensors will determine the RGB, HSV, and YCbCr values for skin tone analysis:

* **APDS-9960 proximity, light, RGB & gesture sensor**
* **AS7341 10-channel light/color sensor for spectral data**
* **Adafruit AS7262 6-channel visible light/color sensor**

### **Moisture Sensor**

A **capacitive moisture sensor** will measure the hydration level of the user's skin. This data will allow the device to recommend not only color-matched foundations but also hydrating or mattifying products based on the user's skin type. Candidates:

* **Adafruit SHT31-D temperature and humidity sensor**
* **Adafruit CAP1188 8-key capacitive touch sensor breakout**
* **Sensirion SHT35-DIS-B**

## **Microcontroller Subsystem**

The microcontroller will process the measured data from the sensor subsystem and compute:

1. The user's skin tone, for foundation matching.
2. The skin's moisture level, for skincare recommendations.

### **Options:**

* **Raspberry Pi (e.g., Raspberry Pi 3B+)** for image and data processing.
* **STMicroelectronics STM32F7** for increased processing power.

### **Software Subsystem**

* **Skin Detection Algorithm:** A threshold-based detection algorithm that uses the RGB-HSV-YCbCr models. Cornell University hosts a human skin detection color model algorithm ([https://arxiv.org/abs/1708.02694](https://arxiv.org/abs/1708.02694)) that extracts RGB values with a high accuracy rate. This algorithm is primarily useful alongside the color sensor, as it breaks down computing the RGB of a skin tone mathematically.
* **Color Matching:** Database integration for matching detected skin tones with product shades, using datasets like [Kaggle Makeup Shades](https://www.kaggle.com/datasets/utkarshx27/makeup-shades) and the [Sephora API](https://www.retailed.io/datasources/api/sephora-product). The database provides the brand, product name, and hex values for products available at Ulta and Sephora retailers.
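The color-matching step described above can be sketched as a nearest-neighbor search over the shade database. The product names and hex values below are illustrative stand-ins, not real database entries, and plain Euclidean RGB distance is a simplification (a perceptual space such as CIELAB would match shades more faithfully).

```python
def hex_to_rgb(hex_code):
    """Parse a '#RRGGBB' hex string into an (R, G, B) tuple."""
    h = hex_code.lstrip("#")
    return tuple(int(h[i:i + 2], 16) for i in (0, 2, 4))

def closest_shade(skin_rgb, shade_db):
    """Return the name of the product whose hex shade is nearest to the
    measured skin RGB, by squared Euclidean distance in RGB space."""
    def dist_sq(item):
        r, g, b = hex_to_rgb(item[1])
        return ((r - skin_rgb[0]) ** 2 + (g - skin_rgb[1]) ** 2
                + (b - skin_rgb[2]) ** 2)
    return min(shade_db.items(), key=dist_sq)[0]

# Illustrative mini-database (placeholder names and hex values):
shades = {
    "Shade A": "#f1c27d",
    "Shade B": "#e0ac69",
    "Shade C": "#8d5524",
}
match = closest_shade((225, 173, 106), shades)
```

In the real system, `shades` would be populated from the Kaggle/Sephora shade data, and the measured `skin_rgb` would come from the color sensor after lighting correction.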
### **Moisture Analysis and Recommendations** A secondary algorithm will analyze the moisture sensor data to classify the user’s skin as **dry**, **combination**, or **oily**, and provide tailored product recommendations. The following research paper goes more in depth on how this will work: [Design of a Handheld Skin Moisture Measuring Device for Application towards Eczema](https://macsphere.mcmaster.ca/bitstream/11375/14416/1/fulltext.pdf#:~:text=Skin%20moisture%20can%20be%20quantified%20as%20water%20content,capacitance%20due%20to%20the%20dielectric%20properties%20of%20water) ## **Display Subsystem** * **LCD Screen:** Displays matched colors and information. Supports HDMI, DisplayPort, or USB-C DisplayPort Alternate Mode. * **User Interface**: We plan to integrate a friendly user interface on the LCD screen to display the user’s skin value and the products that match that value or fall within an appropriate threshold. Alternatively, rather than use an LCD screen, we can create a mobile application, usable separately from this project, that lets makeup enthusiasts find their perfect shade by manually inputting an RGB value for their skin instead of using our device to determine it. ## **Lighting Subsystem** * **Integrated LED Flash:** Provides consistent lighting for accurate color detection under various ambient light conditions. * Utilize an enclosed environment to prevent interference from ambient lighting. ## **Power Subsystem** * This subsystem ensures that all electronic components within the other subsystems are adequately powered during use. * **Power Supply:** Battery-powered operation with support for USB-C charging. * **Regulators:** Onboard 3.3V and 5V regulators to support sensors and peripherals. # **Criterion for Success** **Moisture Level Analysis**: Provide accurate moisture level readings to recommend suitable skincare products. 
* **Skin Detection Accuracy:** Achieve at least 90% accuracy in identifying skin pixels across various lighting conditions using RGB-HSV-YCbCr models. * **Color Matching Precision:** Accurately match detected skin tones to product colors with a precision of 90% or higher. Precision will be evaluated by comparing the expected and measured RGB output of a tested foundation shade. * **User Interface:** Provide an intuitive interface with real-time feedback on matched colors via an LCD screen. * **Portability:** Ensure the device is lightweight, battery-powered, and easy to use. # **Alternatives** There are a few shade matchers on the market that are centered around one specific cosmetic line. Sephora and Ulta utilize a “Shade Finder” quiz in which users determine their match from their current foundation shade, which may not be accurate or may not even exist. The product we develop would rely on an exact match through measured RGB values rather than user perception. Additionally, YSL has developed a product specifically to create a perfect matte lipstick shade based on a custom RGB value ([via application](https://www.yslbeautyus.com/rouge-sur-mesure/rouge-sur-mesure-custom-lip-color-creator/WW-50912YSL.html?srsltid=AfmBOorPcSGqrBeUGsCVU5kpJlStYlK7uLuLsNLrijPrSQz3rsRFZICa)). However, that product is specific to lipstick and to its brand, and the device retails at $350, while our proposed solution draws on pre-existing products already on the market that are accessible to all consumers. |
||||||
7 | Non-Intrusive Smart Unlocking Mechanism for College Dormitory Rooms |
Arnav Mehta Raghav Pramod Murthy Yuhao Cheng |
John Li | Viktor Gruev | proposal1.pdf |
|
# Non-Intrusive Smart Unlocking Mechanism for College Dormitory Rooms Team Members: Raghav Pramod Murthy (raghavp4)\ Arnav Mehta (arnavm7)\ Yuhao Cheng (yuhaoc7) # Problem Many college students living in dorms frequently face the problem of forgetting their keys. For many students, it’s their first time having to manage keys to get into their rooms, and with busy schedules, it’s very easy to forget or even misplace them. This can create a huge hassle. While some systems, like facial recognition systems, can bypass the standard key-lock system, they are not feasible to install on the college dorm doors; they need to be drilled into the interior of doors, which is costly. Other forms of authentication, such as voice recognition, are not easy to add either. This brings us to a more practical and non-intrusive solution: a lock/unlocking mechanism that does not modify the internal locking system of the door. Almost all door locks can be unlocked through the rotation of some exterior component of the door like the lock or the handle. This naturally leads us to explore a solution geared towards a flexible rotation system that can more easily integrate with existing door locks. # Solution We propose a portable system that turns the lock on the door (similar to how a person on the inside of the door would manually turn it to let someone in). This non-intrusive unlocking mechanism will be portable and transferable – it can be easily removed from one door and put onto another. The user attempting to access a room would scan their face on an app, and make a sound for 5 seconds (picked up by a microphone on the cellphone) to initiate voice authentication. The authentication would occur in the backend. If the face and the voice match a face and voice that has been previously registered on the app, the web app will send a signal to the microcontroller to initiate the unlocking process. 
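The backend's accept/reject decision can be sketched as follows. The similarity scores and the 0.8 cutoff are assumptions for illustration; the real thresholds depend on the DeepFace and speech-recognition models chosen:

```python
def authenticate(face_scores, voice_scores, threshold=0.8):
    """Backend unlock decision: some single registered user must clear
    BOTH the face and voice similarity thresholds. Scores are assumed
    to be similarities in [0, 1]; 0.8 is an illustrative cutoff."""
    for user, face in face_scores.items():
        if face >= threshold and voice_scores.get(user, 0.0) >= threshold:
            return user   # unlock signal is sent to the microcontroller
    return None           # no match: the door stays locked
```

Requiring both factors for the same registered user prevents, for example, one roommate's face paired with another's voice from unlocking the door.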
The user will also be able to register other faces and voices (for example for their roommate) to allow multiple people to use this unlocking system. An important note is that this entire unlocking system will not interfere with manual unlocking with a key. # Solution Components ## Subsystem 1: Turning Mechanism This will be the component that physically turns the lock to unlock the door once it receives a signal. ESP32-S3 microcontroller chip\ DRV8825 Stepper Motor Driver\ Stepper Motor: STEPPERONLINE Nema 17 Stepper Motor Bipolar 2A\ Custom PCB\ LM1117-2.5 Voltage Regulator\ 12 V Battery\ Flexible Steel Cable to turn the handle ## Subsystem 2: Facial recognition + Voice Recognition app/User Interface for Authentication Function: Authenticate the user by scanning their faces and analyzing their voice Components: Android app\ Flask backend hosted in GCP\ Google Cloud speech-to-text + recognition API\ DeepFace open source model to compare faces\ MongoDB instance to store face data / voice data # Criterion For Success Unit Test Goals: 1. Desired accuracy of the facial recognition model: 95% (on large online dataset and around 20 of our own pairs of cellphone images) 2. Desired accuracy of the speech-to-text + recognition API model: 90% 3. Processing times (from when user submits voice and face to when the signal is sent to the PCB) under 5 seconds Functionality Goals: Portability/Transferability of Unlocking System: 1. We will achieve this goal if we can mount our contraption onto a door in under ten minutes. Facial Recognition + Voice Recognition: 1. We will achieve this goal if users who authenticate themselves (registering their face and voice), take a picture of themselves, and submit a voice sample can unlock the door without a key. 2. We will achieve this goal if an unauthorized user (a user who has not authenticated themselves with face and voice through the app) is unable to open the door. |
||||||
9 | Antweight Battlebot |
Allan Gu Evan Zhao James Yang |
Michael Gamota | Viktor Gruev | proposal1.pdf |
|
# Antweight Battle Bot Team Members: - Evan Zhao (evanhz2) - Allan Gu (allang2) - James Yang (jamesey2) # Problem We must create a Battlebot that weighs less than 2 lbs out of 3D printed materials in order to compete with other battlebots. It must be controlled through Bluetooth or Wi-Fi and support easy shutdown. To win the competition, the robot must be robust and capable of destroying the opposing robot while withstanding damage from other competitors. # Solution Our battlebot will be 3D printed with PLA+ and use a vertically spinning disk as our weapon. It will have 4-wheel drive and be controlled via Bluetooth with an ESP32 microcontroller. This MCU will use PWM to control the H-bridges for motor activation and take in user inputs from a computer. # Solution Components ## Control System We plan to use an ESP32 for our MCU, as it has built-in Bluetooth and Wi-Fi capabilities. The battlebot will use Bluetooth to connect and communicate with a computer, and a wired controller can be used with the computer to provide more inputs, such as varying speeds via the joystick. The controller will have a killswitch button for safe shutdown. The ESP32 has a variety of GPIO pins that support PWM, which will be used to control the H-bridges for motor speed and direction. ## Power System For our robot’s power system, we intend to use a 4S LiPo (~14.8 V nominal, ~16.8 V fully charged). We chose LiPo as it is the standard in most combat robotics applications for its high power density and its ability to discharge quickly, which high-power weapons and drive motors demand. Since the ESP32 and other modules we may use do not typically operate at this voltage, we will also need voltage converters and regulators to supply the appropriate power to these sub-modules (typically 3.3V and 5V). 
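The joystick-to-PWM mixing the control system performs can be sketched in Python. The firmware itself would run on the ESP32; this only illustrates the logic, with an assumed 8-bit PWM resolution:

```python
def tank_mix(throttle, turn, max_duty=255):
    """Mix joystick throttle/turn (each in [-1, 1]) into signed 8-bit PWM
    commands for the left and right H-bridges: the sign selects the
    H-bridge direction pins, the magnitude is the PWM duty cycle."""
    left = max(-1.0, min(1.0, throttle + turn))
    right = max(-1.0, min(1.0, throttle - turn))
    return round(left * max_duty), round(right * max_duty)
```

Opposite signs on the two sides produce a turn in place, which is the tank-drive behavior described under the Movement System below.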
## Movement System Our combat robot will utilize four-wheel drive with two brushless or brushed motors, one on either side of the chassis, each driving 2 wheels in a tank-drive configuration. For a brushless configuration, we are considering brushless 1406 motors (https://repeat-robotics.com/buy/repeat-tangent-drive-motors/?attribute_motor-size=1406) that will provide us with plenty of power and torque for a relatively low cost in weight. A 3-phase inverter will be needed to control the BLDC motors. If we choose brushed motors instead, we will use the Repeat Drive Brushed Mk2 (https://repeat-robotics.com/buy/brushed/), which comes with an integrated gearbox and would be simpler to implement electrically than a brushless system, at the cost of being less powerful and fast. The motors would be controlled with H-bridges and GPIO from the ESP32. ## Weapon System The weapon will be a vertically rotating 3D-printed weapon driven by a brushless 2207 Battle Ready Hub Motor (https://repeat-robotics.com/buy/2207-battle-ready-hubmotor/). This motor is known to be reliable and durable for battlebots. As with the drive motors, we will also need a 3-phase inverter to control the BLDC motor phases. # Criterion For Success The project will be considered successful if the robot's movement can be controlled via Bluetooth from a PC and it functions as desired within a match, such as turning to face the opposing robot and ramming into it with the weapon. The weapon should also be controllable and powerful enough to damage 3D-printed material while maintaining its structural stability. |
||||||
10 | 3D Printed Antweight Battlebot |
Brian Pau Don Lazatin Shashank Sangireddy |
Jason Jung | Viktor Gruev | proposal1.pdf |
|
# Antweight Battlebot Competition Request for Approval Team Members: - Don Lazatin (dlazat2) - Shashank Sangireddy (ssangi2) - Brian Pau (bnpau2) # Problem For our project, we plan on competing in Professor Gruev’s Antweight Battlebot Competition. To do so, our Battlebot must adhere to several limitations and requirements, listed below: - Battlebot must be less than 2lbs - Battlebot must be constructed using 3D printed materials - Battlebot must have a PCB controlled via PC through Bluetooth or WiFi - Battlebot must have an attached fighting tool that will be activated by the motors - Battlebot must have a way of easy manual shutdown and automatic shutdown - Battlebot must adhere to in-competition rules Our overall goal for this project is to design, build, and program a battlebot capable of disabling competing battlebots with our fighting tool. # Solution We plan to build a battlebot with a destabilizing wedge as our fighting tool. We’ve decided to 3D print various parts of the battlebot’s body, such as the chassis and wheels. We will incorporate at least two different kinds of thermoplastic, PLA+ and PETG, according to their relative strengths. We’ve decided to go with an ESP32-S3-WROOM-1 for our microcontroller, which is responsible for controlling the connected motors. We chose this microcontroller for its superior flash memory space and built-in antenna for Bluetooth communication. In the event that a Bluetooth-based protocol is not viable for our design, this microcontroller will allow us to pivot over to a WiFi-based protocol. There will be at least 3 separate motors in use: one motor may be a brushless motor used for activating the fighting tool, and the other two may be DC-powered micromotors that will be used to drive the battlebot’s wheels for mobility. This set of wheels will be driven by an H-bridge, implemented using N-channel MOSFETs, which enables precise control of forward-backward movement. 
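The H-bridge switching logic can be sketched as a truth table. The state tuples below describe a generic four-MOSFET bridge, not a specific driver IC:

```python
def h_bridge_gates(command):
    """Switch states (A_high, A_low, B_high, B_low) for an N-channel
    MOSFET H-bridge. Only diagonal pairs conduct, so a leg's high- and
    low-side FETs are never on together (which would short the supply,
    i.e. shoot-through)."""
    states = {
        "forward": (True, False, False, True),   # current: A high -> motor -> B low
        "reverse": (False, True, True, False),
        "coast":   (False, False, False, False),
        "brake":   (False, True, False, True),   # both low-side FETs short the motor
    }
    return states[command]
```

In practice the firmware would also insert dead time when switching between states, so that one FET fully turns off before the opposite one turns on.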
The battlebot will be powered using a rechargeable 9V battery, with step-down circuitry or regulator chips incorporated to protect the rest of the circuit from excess voltage. # Solution Components ## Subsystem: Chassis The chassis of a battlebot should be thought of as the body and structural base for a competing battlebot. Our chassis will be 3D-printed out of PLA+ filament to ensure strong armor and meet the weight guideline for the competition. The chassis will house and protect the main circuit, motors, weaponry, and power source for our device. Our chassis will have a horizontally symmetric structure so that if our battlebot is flipped over in competition, it will have the same operational effectiveness as it would on its original side. We plan on constructing a square-like body structure so that there is no weak side of the chassis for other bots to target. We plan on covering all the electronic components within the body to ensure safety and protection for the internal core of the battlebot. ## Subsystem: Drive System For our battlebot’s drive system, we plan on producing a 2-wheel drive mechanism with the front of the chassis (weapon) dragging across the floor on top of unpowered mini wheels. For the tire treads, we plan on looking into rubber-like materials so that our battlebot’s movement is controllable and smooth during operation. For our battlebot’s drive motor, we plan on using a type of DC-powered geared micro motor, which is ideal for a battlebot competition environment: the gearbox significantly reduces the motor’s rotational speed and significantly increases its torque. This is ideal because we do not need much speed in a competition arena; the increase in torque is more beneficial. 
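As a rough check that a geared micro motor is still fast enough, the bot's ground speed follows directly from the gearbox output RPM and the wheel size. The 200 RPM and 50 mm figures below are hypothetical placeholders for whichever motor and wheels are ultimately chosen:

```python
import math

def ground_speed_m_s(motor_rpm, wheel_diameter_m):
    """Linear speed for a wheel driven directly at the gearmotor's output
    RPM: revolutions per second times wheel circumference."""
    return motor_rpm / 60.0 * math.pi * wheel_diameter_m
```

A hypothetical 200 RPM gearmotor on 50 mm wheels gives about 0.52 m/s, which is plenty of speed for a small arena.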
## Subsystem: Weapon System The fighting tool will be a wedge, designed to lift the opposing battlebot with the intent to destabilize, disorient, and ultimately flip the opponent over. It will be located at the front of the battlebot and will account for at least 30% of the battlebot’s weight. This amount of mass, combined with motor activation for high amounts of power, should result in an effective destabilizing fighting tool, fit to render the opposing battlebot unable to function. The wedge “weapon” will also be fitted with small, free-moving rubber wheels on its outer surface. The purpose of these wheels is to guide the opposing battlebot’s chassis along the surface of the wedge, increasing the efficiency of each motor activation or weapon use. The material of these smaller wheels is soft rubber because of the higher friction coefficient between rubber and the majority of the viable thermoplastics, compared to the friction coefficient between any two of the viable thermoplastics. ## Subsystem: Power System For the power system of our battlebot, we initially plan to use a 9V battery, as it is generally light (0.1-0.2 lbs), familiar, and cheap while providing the necessary power for our battlebot. If we find that our drive and weapon draw too much power, we can pivot to LiPo batteries, which can offer more power through 11.1 V (3S) or 14.8 V (4S) options at the expense of size, weight, and cost. Since some of the electronics, including the microcontroller, run at 3.3 V, step-down circuits or chips will be used to step the voltage down to appropriate values. ## Subsystem: Control System For the main portion of our battlebot’s control system, we are planning to use the ESP32 microcontroller as it has both WiFi and Bluetooth capabilities, plenty of GPIO pins, and available development boards. 
In particular, we are thinking of using the ESP32-S3-WROOM-1 model, as it has a built-in antenna compared to the 1U model and offers more flash space compared to something like the C3 model. We are aiming to use Bluetooth to communicate with our battlebot, but should we run into difficulties, we can pivot to WiFi, as the ESP32 allows that. The control system will be responsible for the operation of our robot, including our drive and our weapon activation. To fulfill the manual shutdown requirement, we aim to have the robot shut down if our Bluetooth link is lost, but if time allows, we may try to implement a physical shutdown using a keyboard or controller. # Criterion For Success This project’s criteria for success are as follows: - Full functionality of the battlebot, including all operational components and subsystems, through remote communication using either a Bluetooth or WiFi protocol. - Successful remote activation and deactivation of the battlebot, without any manual interference. - Successful remote activation of the battlebot’s kill-switch mechanism, which should result in all subsystems of the battlebot being disabled or powered off. The kill-switch mechanism must not otherwise affect the battlebot’s functionality or performance in any tangible way. |
||||||
11 | Antweight Combat Robot |
Ryan Middendorf |
Michael Molter | Viktor Gruev | proposal1.pdf |
|
Antweight Combat Robot Team Members: Ryan Middendorf (ryanjm8), David Kapelyan (davidik2) Problem The constraints for Professor Gruev’s competition are as follows: Must weigh less than 2 lbs Must be 3D printed in PET(G), ABS, or PLA(+) Must have controlled movement Must be controlled over Bluetooth or WiFi by a PC Must have a fighting tool to use against other bots The main challenges involved are creating a custom control solution and designing a combat robot that will not only survive the 2-minute matches but actually win them by immobilizing the other robot. Solution To meet these constraints, we plan to create a custom PCB that contains 3 brushless electronic speed controllers (ESCs) to control the drive and weapon motors and uses a microcontroller to communicate with a PC over Bluetooth and control the robot. For the actual robot design, we plan to build a vertical spinner, which usually performs best in this weight class. The "tool" will be spun by a brushless motor, as will the drive wheels on both sides. Subsystem 1 - Custom PCB and Power Our first subsystem will be the custom PCB. It has to contain 3 brushless ESCs and interface with a Bluetooth-enabled microcontroller such as an ESP32 or STM32 that will receive instructions from a PC and turn them into usable PWM signals for the ESCs. It will also have to be powered by a LiPo battery through an XT30 connector and include an integrated screw switch so the robot can be turned on and off simply and safely. Subsystem 2 - Drive train Our second subsystem will be the drive train. Our robot will be driven by 2 brushless motors, 1 on each side. Each motor will drive 2 wheels that are connected by a belt, so the robot will have a simplified 4-wheel drive in a tank-drive configuration. Subsystem 3 - Weapon/Tool Assembly Our third subsystem will be the weapon/tool assembly. Our tool will be a robust vertical spinner, most likely a drum/eggbeater style. 
This type of tool has a lot of success in combat robotics due to its ability to dissipate the force of hitting an opponent into the floor very efficiently. It will be driven by a substantially larger brushless motor than the drive system so it can deliver much more powerful hits. Subsystem 4 - Chassis Our fourth subsystem will be the chassis. The chassis has to be very robust and able to withstand all the damage that will be dealt to it throughout a match. It also has to contain all the electronics and prevent them from being damaged. The chassis will be 3D printed out of one of the approved materials listed above, most likely PLA+. Subsystem 5 - Controlling from PC Our fifth and final subsystem will be how our robot is controlled by a PC. This will be a program run locally on a PC that takes keyboard inputs and transforms them into instructions that are sent to the microcontroller inside the robot to control it. Criterion for Success We would consider our project a success if we are able to communicate with the robot from our computer and successfully drive it around the arena during a match. The commands sent from the PC need to be processed by the microcontroller, and the motors need to be powered properly and behave correctly during a match. The robot will also have to be able to shut itself off if the Bluetooth connection is lost for some reason. |
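The "usable PWM signals" the microcontroller must produce for the ESCs reduce to a throttle-to-pulse-width mapping. The 1000-2000 microsecond range below is the common hobby-ESC servo-style convention; exact calibration depends on the ESCs chosen:

```python
def esc_pulse_us(throttle, min_us=1000, max_us=2000):
    """Map a throttle fraction in [0, 1] to the 1000-2000 microsecond
    servo-style pulse that typical hobby brushless ESCs expect. The
    microcontroller repeats this pulse roughly every 20 ms (50 Hz)."""
    throttle = max(0.0, min(1.0, throttle))
    return round(min_us + throttle * (max_us - min_us))
```

Clamping the input also gives a natural fail-safe: on Bluetooth loss, the PC-side program stops sending commands and the firmware drops the throttle to 0, i.e. a 1000 us (motor off) pulse.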
||||||
12 | Bench Organizer |
Liangcheng Sun Max Mu |
Maanas Sandeep Agrawal | Arne Fliflet | proposal1.pdf |
|
# Bench Organizer ## Team Members: - Liangcheng Sun (ls25) - Xiaohu Mu (xiaohum2) # Problem Most desk organizers only store items and don’t help users stay organized or productive. People often lose track of small items like pens or keys, which makes working harder. Digital tools exist, but they don’t work well with physical workspaces. We need a better way to help people keep their bench tidy and stay focused. # Solution We aim to create a Bench Organizer that detects and tracks items to help users stay organized. It will use RFID technology to know if items like pens are in the right place and send reminders if something is missing. The system will also include a custom PCB to connect and manage all components. The system will also include extra features like a wireless charging pad and Bluetooth notifications if time allows. # Solution Components ## Item Detection Subsystem This subsystem will use RFID technology to track items in the organizer. It will incorporate NFC stickers attached to items like pens and keys, and an embedded NFC reader (e.g., PN532 module) in the organizer to detect their presence. The microcontroller (e.g., Arduino Uno) will process the data and check if each item is in its correct spot. This subsystem will send the detection results to the notification subsystem. ## Notification Subsystem This subsystem will alert users if any items are missing or misplaced. It will use LED lights to indicate the missing items and a buzzer for sound alerts. Additionally, a Bluetooth module (e.g., HC-05) can send notifications to a smartphone or computer. This subsystem will receive data from the item detection subsystem and trigger the appropriate notifications. ## PCB Subsystem The PCB will be designed and fabricated to integrate all the components of the organizer. It will act as the central hub to connect all other components. This part will ensure proper power distribution to each subsystem, including voltage regulation for different components. 
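The detection check itself reduces to a set comparison between the registered tag IDs and the tags the NFC reader currently sees. The tag IDs below are made-up placeholders:

```python
def missing_items(registered, detected):
    """Report which registered NFC tag IDs the reader does not currently
    see; the result drives the LED/buzzer/Bluetooth notifications."""
    return sorted(set(registered) - set(detected))
```

Running this check on a timer (rather than on every read) avoids false alarms when an item is briefly picked up and put back.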
# Criterion For Success The organizer must detect and track items with at least 90% accuracy in tests. It must notify users when items are missing or misplaced. Additionally, this system should work well under normal indoor lighting. Extra features, if added, should work smoothly without affecting the main functions. |
||||||
13 | Modular and Affordable Digital Accordion |
Guangyang Sun Henry Zhang Zhuoer Zhang |
Jiankun Yang | Michael Oelze | proposal1.pdf |
|
Team Members: - Guangyang Sun (gsun16) - Zhuoer Zhang (zhuoer3) - Hanyu Zhang (hanyu8) # Problem Traditional accordions are expensive, delicate instruments that require regular maintenance. Their sound quality is sensitive to environmental factors such as temperature and humidity, making them less reliable in varying conditions. Additionally, learning to play the accordion presents a steep learning curve, especially for beginners. Currently, digital accordions on the market cost over $7,000, making them inaccessible to most entry-level players and hobbyists. These challenges highlight the need for an affordable, beginner-friendly, and modular digital accordion that can replicate the traditional instrument’s features while addressing its limitations. # Solution We propose to build a low-cost ($150 or less), modular, and beginner-friendly 12-bass digital accordion. Our design will replicate the sound and functionality of a traditional accordion using modern electronics while offering improved durability and ease of maintenance. The solution will include the following subsystems: 1. Key Input Subsystem: Detects user inputs from bass buttons and treble keys. 2. Sound Synthesis Subsystem: Generates high-quality accordion sounds using a microcontroller. 3. Output Subsystem: Delivers audio through wired and optional Bluetooth connectivity. 4. (Optional) User Interface Subsystem: Offers optional features such as LED backlit keys, playback, lazy mode, and sound customization. The system will detect key presses via a matrix scanning technique, process the input in the microcontroller to synthesize accordion sounds using a MIDI sound bank, and output the audio through wired or Bluetooth connections. # Solution Components ## Subsystem 1: Key Input This subsystem is responsible for detecting treble key and bass button presses. A matrix scanning approach will minimize the GPIO usage while ensuring accurate detection. 
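The scanning loop can be sketched as follows, with a callable standing in for the actual GPIO row-strobe and column-read; the 5x8 dimensions are the planned matrix size:

```python
def scan_matrix(key_is_pressed, num_rows=5, num_cols=8):
    """One pass of matrix scanning: strobe each row in turn and sample
    every column; a hit at (row, col) maps to one treble key or bass
    button. `key_is_pressed(row, col)` stands in for the GPIO writes
    and reads the microcontroller would perform."""
    pressed = []
    for row in range(num_rows):
        for col in range(num_cols):
            if key_is_pressed(row, col):
                pressed.append((row, col))
    return pressed
```

Because an accordion is polyphonic, the physical matrix would also need per-key diodes to prevent ghosting when three or more keys are held at once.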
Design: Matrix Configuration: A 5x8 matrix (5 rows and 8 columns) will be used to detect inputs from 26 treble keys and 12 bass buttons. Components: - Tactile push buttons (low-cost option) or capacitive touch sensors (for enhanced user experience). - GPIO pins on the microcontroller for interfacing with the matrix. Key Features: - Accurate key press detection with minimal input lag. - Scalable design for modularity. ## Subsystem 2: Sound Synthesis Subsystem This subsystem synthesizes high-quality accordion sounds in real time based on user inputs. Design: - Use a MIDI sound bank with pre-recorded accordion samples to replicate authentic sounds. - Generate polyphonic sounds by combining waveforms for multiple notes. - Utilize built-in DAC for waveform generation or an external DAC for higher audio quality. Components: - Microcontroller with DSP and DAC capabilities. - External DAC for better audio quality. - Flash memory or SD card to store sound samples and MIDI files. - Optional: Low-pass filter for improved audio output. ## Subsystem 3: Output Subsystem This subsystem delivers audio to external devices through both wired and wireless methods. Design: - Wired Output: A 3.5mm audio jack with an amplifier will support headphones or external speakers. - (Optional) Bluetooth Output: Integrate Bluetooth streaming for wireless audio playback. Components: - Audio amplifier. - 3.5mm audio jack and connectors. - (Optional) Bluetooth module. ## Subsystem 4: (Optional) User Interface This subsystem adds additional functionality to enhance user experience. Features: - LED Backlit Keys: Guide beginners in learning to play. - Playback Mode: Replay user performance or pre-recorded songs. - Lazy Mode: Random button presses play pre-recorded high-quality accordion sounds. - Sound Customization: An LED display and interface allow users to change sound profiles or remap keys. Components: - RGB LEDs for backlit keys. - Small OLED or TFT screen for the user interface. 
- Additional GPIOs for expanded functionality. # Criterion For Success Our project will be considered successful if it meets the following testable criteria: 1. The system can detect treble and bass key presses accurately with no noticeable input lag. 2. The sound synthesis subsystem generates high-quality accordion sounds with minimal distortion. 3. Audio output is clear and functional through the wired connection. 4. The system is modular, with components that can be easily replaced or repaired. 5. The total cost of materials stays below $150. (Optional features such as LED backlit keys, playback, lazy mode, and Bluetooth can be added if time permits.) |
||||||
14 | Audio Augmented Reality Glasses (AARG) |
Evan Chong Nikita Vasilyev Sunny Chen |
Aishee Mondal | Michael Oelze | proposal1.pdf |
|
# Audio Augmented Reality Glasses (AARG) Team Members: - Sunny Chen (sunnyc3) - Nikita Vasilyev (nvasi2) - Evan Chong (eschong2) # Problem Have you ever seen a plant in nature or an animal in the wild that piqued your interest, but you didn’t have an efficient way of researching what it was? Repeatedly searching online to identify the subject can be a lengthy and tedious task, and this is the problem we seek to address. Our solution is meant to inform the user about unknown plants, animals, or objects in whatever setting they are observing. # Solution Our project idea stems from the surge of AR prototype glasses being introduced over the past year. We are planning to create our own glasses, but in contrast to those on the market, ours will focus on the audio experience of the user. These glasses will have the explicit capability of capturing images of objects and relaying this information to an application that will process these images in the backend. The application will then send an explanation of the object back to an audio device on the glasses (either a speaker or bone-conducting device). The glasses will essentially work as a digital tour guide, with the explanation of the object being auditory rather than visual. The use case we have decided to tackle is a botanical tour guide, but the purpose is to create a platform that other applications can utilize for their objectives. The subsystems we have broken down the device into are power, peripheral, communication, physical, and application. They are divided such that each subsystem has a designated purpose working towards the goal of full functionality. # Solution Components ## Power System The power system consists of the battery powering the device and the supporting charging circuit to replenish the battery once out of power. Some candidates for batteries are PCIFR18650-1500 from ZEUS Battery and ASR00011 from TinyCircuits. 
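A rough battery-sizing estimate helps compare the candidate cells. The average current draw and target runtime below are placeholder assumptions; the real combined draw of the ESP32, camera, and speaker must be measured:

```python
def required_capacity_mah(avg_current_ma, target_hours, margin=1.2):
    """Size the glasses' battery: average draw times target runtime,
    with a 20% margin for conversion losses and cell aging."""
    return avg_current_ma * target_hours * margin
```

At a hypothetical 250 mA average draw, a 4-hour outing needs about 1200 mAh, so a nominally 1500 mAh 18650 cell such as the PCIFR18650-1500 candidate above would fit with room to spare.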
## Peripheral System The peripheral system focuses on the aspects of the glasses that interact with the outside world. This includes the camera, microphone, speaker, and interact button. These external components will interface with the microcontroller, provide crucial information to the application, and play audio to the user. For the moment we have the following components for each peripheral: Camera: ESP32-CAM (Comes with development board and camera) Microphone: CMA-4544PF-W Speaker: ADS01008MR-LW100-R Interact Button: B3U-1100P ## Communication System The communication system consists of a microcontroller and Bluetooth Low Energy interface. This subsystem should create an interface that can be used by applications connected through Bluetooth. This interface allows for all the sensor data to be collected, processed, and sent to the application when requested. The component we plan to use for this system is the ESP32-WROOM-DA-N8 which contains an ESP32 microcontroller with a built-in PCB antenna for Bluetooth. ## Physical System The physical system consists of the glass frame design and the mounting system for the PCB and hardware components. The frame design will be 3D printed. The goal would be to use premeasured plastic mounting points and screws to mount all components within the hollow frame. ## Application System The application system consists of image processing, audio transfer, and user interface. The image will be processed, the plant will be identified, and then have audio transferred back to the speaker in the peripheral system. We will develop this application for iOS and interact with the glasses via Bluetooth. # Criterion For Success The following goals are fundamental to the success of our project: - Successful User Flow - The user should be able to look at a plant, press the interact button, and then wait for the system to return the audio of the plant description. 
- Accuracy - The final prototype should be able to correctly identify plants 75% of the time. - Strong Bluetooth Connection - There should be an uninterrupted Bluetooth connection between the glasses and the mobile device. Additionally, the glasses should be fully operational within a 15-foot range of the mobile device. The goals below are considered reach goals, and if not accomplished would not hinder the success of our project: - Bone Conduction Audio - An alternative way of relaying the audio to the user that involves transmitting sound vibrations through the bones. - Adjustable Audio Volume Level - Within the application system the user will be able to adjust the volume. - Voice Activation - In addition to the push button, users have the ability to speak to begin the system process. - Heads-up Display - A display on the glass lenses to aid in relaying the information to the user. |
||||||
15 | Antweight Battle Bot |
Carlos Carretero Dany Rodriguez Troy Edwards |
John Li | Viktor Gruev | proposal1.pdf |
|
3D-PRINTED BATTLE BOT Group members: Daniel Rodriguez (drodr25) Carlos Carretero (ccarr27) Troy Edwards (troyre2) PROBLEM Our project revolves around Professor Gruev’s Battle Bot Competition. This competition has several requirements and limitations which must be adhered to: the robot must be 3D printed from predetermined materials, weigh less than 2 pounds, carry a PCB that is controlled through Bluetooth or wifi, have fighting capabilities, and include safety measures for shutting the robot down. Our goal for this project is a robot that is capable of competing in the competition, meaning that it can be controlled and attack as desired. SOLUTION As the project entails, this robot will be fighting against other robots, which means our design must revolve around disabling the opponent's threats or rendering their robot immobile. To accomplish this we will have a 3D-printed chassis made of PLA+ with an ESP32 microcontroller for motor and movement control. This microcontroller has onboard wifi and Bluetooth, allowing us to decide which is best for controlling our robot. In our design, we will use 3 motors: two for movement and one for controlling our battle element, a lift that tries to flip our opponents over. The motors will be powered by a set of LiPo batteries, as they have a high power output relative to their size and weight, helping with the weight restrictions. The motors used for movement will also have an H-bridge that allows forward and backward motion, letting the robot turn and move smoothly. Voltage control circuits will also be implemented to account for the different voltages required by the microcontroller and the motors. SOLUTION COMPONENTS SUBSYSTEM: CHASSIS The chassis of the battle bot will be 3D printed using PLA+ material to create a strong and lightweight robot. It will house all the components, including the PCB, motors, and power source. 
Our weapon will also be incorporated into the chassis to ensure that the lifting mechanism is sturdy enough to flip over opponents as well as enclosed enough to prevent damage to the robot. The body will be horizontal with a very low center of mass to keep others from flipping it over. The wheels and all electronic components will also be enclosed to prevent any damage there. SUBSYSTEM: COMBAT Our lift system will be integrated into the chassis as a movable ramp powered by a motor for raising and lowering. The ramp will most likely be made of titanium in order to keep the setup lightweight. It is located on the front of the bot, allowing us to drive into our opponents while raising the ramp to try and flip the other bot over. SUBSYSTEM: POWER DISTRIBUTION Since we will be using LiPo batteries, which have higher voltages of either 11.1V or 14.8V, we have to design a circuit to step this power down for our lower-voltage components like the microcontroller and the DC motors for movement. This part of the project will also need a circuit to safely cut power to the motors in case of an emergency, as required. This type of battery is commonly used in battle bot applications, which is why we are using it for our design. The battery will first be connected to a kill switch before anything else to ensure that the robot can be shut down safely. SUBSYSTEM: CONTROL The ESP32 microcontroller is a great option for our project as it has wifi and Bluetooth built in, giving us a way to control our robot. We can either use the BLE protocol to talk to the microcontroller, due to its low power consumption and low latency, and connect an external Xbox controller, or use wifi to control the robot using a PC keyboard. SUBSYSTEM: MOVEMENT There will be 2 brushed DC motors that control the 2 wheels in our robot, and we will be looking to use something like the L298N DC motor driver to control them. This will also require voltage converters as previously mentioned. 
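To sketch how the ESP32 might turn controller input into H-bridge signals, the snippet below mixes a throttle/steer command into two wheel commands and maps each onto hypothetical L298N inputs (IN1/IN2 for direction, a PWM duty for the enable pin). The function names and pin semantics are our own illustration, not a tested driver.

```python
def drive_mix(throttle, steer):
    """Mix throttle (-1..1, forward positive) and steer (-1..1, right positive)
    into left/right wheel commands, clamped to [-1, 1]."""
    left = max(-1.0, min(1.0, throttle + steer))
    right = max(-1.0, min(1.0, throttle - steer))
    return left, right

def l298n_channel(cmd):
    """Translate a signed wheel command into assumed H-bridge inputs:
    (IN1, IN2, PWM duty 0..100). IN1/IN2 set direction; the enable pin
    carries the PWM duty."""
    duty = round(abs(cmd) * 100)
    if cmd >= 0:
        return (1, 0, duty)   # forward
    return (0, 1, duty)       # reverse
```

For example, full throttle with no steer drives both channels forward at 100% duty, while equal throttle and steer leaves one wheel stopped so the robot pivots.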
The wheels will probably be made from a high-friction material like rubber to ensure that the robot does not lose traction. The ESP32 has various GPIO ports that will allow us to control the motor drivers. For the ramp motor we can use a servo, since we need precise control over no more than a 90-degree range of motion. CRITERION FOR SUCCESS Our project would be successful if the robot could move around using inputs given externally by the user, and if the attack mechanism moved through the range that we wanted. We also want to ensure that the chassis has enough rigidity to handle the forces from the motors. It should be safe to power on and off. The robot should also be effective at immobilizing other robots. |
||||||
16 | A Modernized Analog Video Distortion Device |
Adarsh Payyavula Jun Hayakawa Matt Streicher |
Jason Jung | Viktor Gruev | other1.pdf |
|
# A Modernized Analog Video Distortion Device Team Members: - Jun Hayakawa (jundh2) - Matthew Streicher (mps11) - Adarsh Payyavula (adarshp3) # Problem In recent years, the force of nostalgia has made the aesthetic of analog glitches increasingly popular, and they have found wide use in mediums such as music videos, live concert visuals, video editing, and even film. However, authentic analog glitch devices are made only by a small number of artisans who alter (or "bend") the circuitry of vintage video hardware to introduce these visual artifacts, making them inaccessible to general hobbyist visual artists, such as the VJs who make visuals for house shows on campus. The cost of a unit generally ranges from $300 to $700, with resold units sometimes reaching over $1,000. This is due both to the increasing rarity of the hardware they’re built from and the small number of people who hand-make these devices. Even after placing an order, the turnaround time can be upwards of 6 months. Additionally, controls on these devices are abstruse, typically consisting of unlabeled switches and potentiometers. This makes operating them confusing and requires the user to carefully experiment with the controls in order to figure out how to dial in a visually appealing setting. The niche nature of this field makes it ripe for innovation, most importantly in the way of making them more accessible to the general artistic community. # Solution We propose a device which aims to solve these challenges. Using a custom PCB design, we aim to replicate the functionality of both a normal and bent video enhancer. Additionally, this analog circuitry will be controlled digitally using a microprocessor to adjust various amplifier gains and reroute signal flow. This way, the device can be interfaced with a user-friendly controller, making it easier and more fun to play with video distortion. 
Our proposed design separates itself from existing glitch devices in three key ways: 1) *Fully custom circuitry*: Almost all analog glitch devices are constructed from vintage video hardware which has been bent to introduce artistic visual artifacts. This reliance on discontinued hardware contributes to the cost and scarcity of devices, and by designing all hardware from the ground up we sidestep this bottleneck. 2) *Remote control*: Rather than being controlled with knobs and switches attached to the circuitry, our device will be controlled remotely using a microcontroller, which will allow parameters to be modified using something akin to a video game controller. There is no device on the market currently which has this functionality, and its inclusion will make operation of the device feel intuitive and immediate, decreasing the learning curve for new users. 3) *Auxiliary distortion circuitry*: Our analog circuit will be able to act both as a video enhancement circuit **and** video distortion circuit. When video enhancement circuits are bent, they lose their original functionality and become pure distortion circuits. We aim to have both by isolating feedback distortion lines using digital switches, which will preserve the original enhancement functionality as long as the switches are closed. This will allow the user to use the enhancement circuit for subtle adjustments to the image, or engage the distortion elements to make more radical modifications. A block diagram of our high-level design is provided in the album of images at the end of the proposal. The overall goal of this project is to make a robust and easily manufacturable analog glitch device that can be externally controlled, with an aim towards making them more accessible to hobbyists and people who are interested in analog video. 
# Solution Components ## Subsystem 1: Power Management A 5 volt switching regulator circuit which will control the power supplied by a 9V battery to the microprocessor and the enhancement/distortion circuit. Our controller will be low power enough to be powered via the same USB port that it uses to interface with the microprocessor. Components: - LMR33630 (3.8V to 36V, 3A Synchronous Buck Converter With Ultra-Low EMI) - 9V battery ## Subsystem 2: Video Enhancer In the US and Japan, composite analog video signals are encoded using the NTSC format. This encoding scheme divides image information into luminance (a baseband signal, scanned at the 15.7 kHz horizontal line rate, whose amplitude encodes the brightness of a pixel) and chrominance (a roughly 3.58 MHz sine-wave subcarrier whose amplitude and phase encode the color of a pixel). Our enhancement circuit will use wideband voltage controlled amplifiers (VCAs) to boost or attenuate the luma and chroma signals in order to control the image. It will control the following parameters: 1) *Contrast*: Boosts/attenuates the luma signal by a factor of -2 to 2, with negative values resulting in inverted brightness. 2) *Brightness*: Applies a DC shift to the signal to adjust the black/white balance. Ranges from pure black (-768 mV) to pure white (714 mV). 3) *Saturation*: Boosts/attenuates the chroma subcarrier signal by a factor of 0 to 2, with 0 corresponding to a grayscale image and 2 corresponding to blown-out, oversaturated colors. 4) *Hue shift*: Uses all-pass phase-shifting filters to shift the color content of the image along the color wheel (0 to 360 degrees). 
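The four parameters above amount to simple signal math: contrast and brightness act on the luma voltage, while saturation and hue act on the amplitude and phase of the chroma phasor. A hedged numeric model, with the millivolt limits taken from the list above and function names that are ours, not the circuit's:

```python
import cmath
import math

def enhance_luma(y_mv, contrast, brightness_mv):
    """Scale luma by contrast (-2..2; negative inverts) and add a DC
    brightness shift, clamped to the stated -768..714 mV range."""
    out = contrast * y_mv + brightness_mv
    return max(-768.0, min(714.0, out))

def enhance_chroma(amp, phase_deg, saturation, hue_shift_deg):
    """Model chroma as a phasor: amplitude scales with saturation (0..2),
    hue shift rotates the subcarrier phase around the color wheel."""
    c = amp * saturation * cmath.exp(1j * math.radians(phase_deg + hue_shift_deg))
    return abs(c), math.degrees(cmath.phase(c)) % 360
```

For instance, a contrast of -1 inverts brightness, and a 180-degree hue shift rotates each color to its complement while saturation scales its intensity.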
Components: - 2 to 4 LMH6505 ICs (Wideband, Low Power, Linear-in-dB, Variable Gain Amplifier) - Numerous resistors - Numerous capacitors - LTC1562 IC (Very Low Noise, Low Distortion Active RC Quad Universal Filter) for allpass filtering - 2 RCA I/O ports ## Subsystem 3: Video Distortion Circuit benders alter the circuitry of existing video hardware by feeding the outputs of amplifiers back into themselves at various points, creating resonant feedback loops which interfere with the enhancement circuitry and introduce various analog glitches in the image. We aim to accomplish this by building these "bends" into the circuit pcb and isolating them using digital switches, which will protect the main enhancement circuit. By creating these feedback loops in various amplifier stages, we aim to achieve the following visual artifacts: 1) *Ringing feedback*: vertical stripes which follow the contours of edges in the image. 2) *Rainbowing*: Cyclic rainbow patterns which appear on bright portions of the image. 3) *Horizontal tearing*: Misalignment of rows of image data on the display, giving the impression of the image being “torn”. 4) *Ghosting*: High frequency elements of the image persisting for too long on the electron beam, smearing the image to the right. The strength of these effects will also be able to be adjusted using digital VCAs. Examples of each of these types of artifacts are provided in a google doc linked at the end of this proposal, in order to better visualize the effects we aim to achieve. In order to prevent damaging the signal to the point of image dropout, we will use sync separator ICs which protect the horizontal and vertical sync portions of the signal so that the corrupted image is still able to be displayed on a CRT. 
Components: - 2 to 4 LMH6505 ICs (Wideband, Low Power, Linear-in-dB, Variable Gain Amplifier) - Numerous resistors - Numerous capacitors - 1 or 2 DG408 ICs (8-Ch/Dual 4-Ch High-Performance CMOS Analog Multiplexers) - LM1881 Video Sync Separator ## Subsystem 4: Interface / Housing Our circuit will be housed within a compact and durable laser cut plastic casing, which will have the RCA I/O and USB port to program the microcontroller and connect the external controller, such as a video game console controller. The MCU will read inputs from the controller and translate them into analog voltages which drive our circuit components on the analog processing circuit. Components: - Laser-cut plastic housing - USB host shield - STM32 Microcontroller - External controller # Criterion For Success - Our enhancement circuit can control image aspects such as contrast, brightness, saturation, and hue (tested using CRT test pattern). - Our distortion circuit can produce the characteristic analog glitches described in subsystem 3. - Our circuit can be interfaced with digitally using an external controller ## Figures https://docs.google.com/document/d/1jqR8Q9gRmVlFIulvmhQkPYblD1XzxrDst6jpfQKjtfA/edit?usp=sharing |
||||||
17 | Integrated Brushless Motor Exploration Platform |
Alex Roberts Jason Vasko |
Michael Gamota | Yang Zhao | proposal1.pdf |
|
# Integrated Brushless Motor Exploration Platform Note, project changed marginally from initial idea. Original idea post is [Multiple Motor Stimulation Hardware Investigation Tool](https://courses.grainger.illinois.edu/ece445/pace/view-topic.asp?id=76583) # Team Members: - Alex Roberts (asr9) - Jason Vasko (jrvasko2) # Problem Exploring topics in motor control requires at least a moderate knowledge of electronic hardware systems. Even when using commercial off-the-shelf motor drivers, the microcontroller, power regulators, and power supplies still need to be connected to the motor driver, which can cause confusion for people without working electrical engineering knowledge. This makes it difficult for students in disciplines other than ECE, such as mechanical or aerospace engineering, to experimentally learn about motor control. # Solution We propose a single integrated device, usable with minimal electronics experience, that allows the user to test motors with different motor control algorithm parameters at different speeds. The board will act as an educational tool to allow people interested in topics such as field oriented control, or 3-phase power systems in general, to operate brushless motors and explore control algorithms with as few external connections as possible. Our project integrates the microcontroller, sensors, power regulation, and motor drive circuitry required to spin a brushless DC motor into a single board. It will only require the user to connect a computer over USB, the 3 phase wires of the motor, and two simple power connections (one to a 12V wall adapter for logic and sensing power, and the other to a benchtop supply used only for motor bus voltage). On the computer there will be a GUI application that allows the user to control the motor, modify the motor control algorithms, and measure motor performance. 
Ultimately, the system will serve as a single platform for learning about brushless DC motor drivers and control algorithms with as few external tools needed as possible. # Solution Components ## Control Subsystem The control subsystem is responsible for driving and/or monitoring all other subsystems. It will periodically read data from the sensory array, monitor the health of the power subsystem, and generate PWM signals for the motor drive subsystem. It will also communicate with the PC app, updating the GUI periodically and allowing the user to set motor parameters such as speed and PID controller coefficients. This subsystem includes: - System Microcontroller (STM32F446RET6) ## Sensor Array The sensor array is responsible for recording data related to the motor’s operation and the overall health of the board. This subsystem includes: - Current and voltage sensors for the three-phase signals driving the motor and to monitor health of the voltage regulators (INA230AIDGSR) - We will use shunt resistors to use this same IC for both voltage and current monitoring. - Physical encoder to measure motor angle and speed (PEC11R-4220K-S0024) - We will use a 3D printed jig that the user attaches the motor to during operation. The motor shaft and encoder shaft will then be connected using gears attached to each, so the motor shaft position can be measured using the rotary encoder. ## Power Subsystem The power subsystem is responsible for generating the needed voltages for components on the board such as sensors and the microcontroller. A small 12V DC wall adapter will plug into a banana jack on the PCB, which is converted using a buck regulator to our logic voltage of 3.3V. We also require the user to connect a benchtop power supply which will provide motor bus voltage directly. This avoids needing to integrate a complex, multiple hundred watt converter into the board, which would be unrealistic given the timescale of this project. 
This subsystem includes: - Adjustable switching buck converter to convert the 12V supply to 3.3V to power the microcontroller (TPS562201DDCR) ## Motor Drive Subsystem The motor drive subsystem is responsible for generating the AC waveforms supplied to each phase of the motor. To do so, we will use gate drivers and half-bridges connected to the motor bus voltage coming from the benchtop power supply. This subsystem’s primary components are: - MOSFETs for the half-bridges for each phase (IRFI1310N) - Half-bridge gate driver ICs for each phase (DGD05473) # Criterion For Success We consider the project a success if it satisfies the following criteria: - User should be able to control motor speed and/or position through a PC app GUI connected to the board via USB. - User should be able to set and change motor driver parameters such as PID coefficients. - User should be able to see aspects of the motor control algorithm performance on the GUI, such as motor speed and three-phase voltages and currents. - User should only require four external connections to use the device: a wall power connection, a benchtop power supply, a usb connection to the laptop, and the motor phases. |
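The user-tunable PID loop and encoder speed readout in the criteria above could be prototyped host-side before being ported to STM32 firmware. This is a generic discrete PID and count-to-RPM conversion, not the team's actual implementation; the gear_ratio parameter reflects the gear coupling between motor and encoder shafts described in the sensor array section.

```python
class PID:
    """Discrete PID controller with user-settable gains."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

def rpm_from_counts(counts, pulses_per_rev, dt_s, gear_ratio=1.0):
    """Motor speed from encoder counts over one sample window; gear_ratio
    accounts for the gears coupling the motor shaft to the encoder shaft."""
    return (counts / pulses_per_rev) / gear_ratio / dt_s * 60.0
```

The GUI would expose kp/ki/kd directly, with the PID output mapped to PWM duty for the half-bridge gate drivers.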
||||||
18 | Schedulable Autonomous Fish Feeder |
Brandon MacIntosh Colby Steber Jeremy Richardson |
Sanjana Pingali | Michael Oelze | proposal1.pdf |
|
Team Members: - Colby Steber (csteber2) - Jeremy Richardson (jrr13) - Brandon MacIntosh (bm53) # Problem Fish feeders currently on the market are limited in how much convenience they give fish owners who are away from their tank. If you want to feed your fish at a certain time, you usually have to set a timer 12 or 24 hours in advance. There is also no reassurance that your fish is actually being fed and eating; owners just have to assume that the machine is working as intended. This poses a major problem when owners are gone for extended periods of time, such as winter break. # Solution With our fish feeder, the user will not only be able to feed their fish from any location by using a mobile app, but they will also be able to schedule the exact times they want the feeder to dispense food, allowing them to customize their feeding times. In addition, the feeder will have a sensor that will detect when the food container rotates and send a notification to the user so they can ensure that their fish was fed. The feeder will be plugged into the wall to make certain that the feeder will work for extended periods of time. If the power goes out or if the feeder is not being supplied with AC power from the wall, it will switch to battery power. This solution requires a PCB, a microcontroller with a wireless transmitter, a rotating motor, sensors, a mobile app, and a power system. Other components could be added, such as a camera, water quality sensor, and indicator LEDs. # Solution Components ## Subsystem 1: Microcontroller The microcontroller will process the data, trigger the circuit that engages the motor, communicate via WiFi to connect to an app, and take input from sensors such as the feeder engage sensor. There will also be external ports that connect to the microcontroller for additions of other sensors, such as a possible water quality sensor or camera. 
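The scheduling and feed-confirmation behavior described above reduces to a small amount of logic the microcontroller could run. The sketch below uses minutes-since-midnight timestamps and a hall-sensor pulse count; both conventions and the function names are our own, for illustration only.

```python
def due_feedings(schedule, last_fed, now):
    """Return scheduled times (minutes since midnight) that have passed
    since the last feeding, i.e. dispenses the motor still owes."""
    return [t for t in schedule if last_fed < t <= now]

def feed_confirmed(hall_pulses, expected_pulses=1):
    """One magnet pass per container rotation; fewer pulses than expected
    means the food may not have dispensed and the user should be alerted."""
    return hall_pulses >= expected_pulses
```

For example, with feedings scheduled at 8:00 and 18:00, `due_feedings([480, 1080], 400, 500)` reports the 8:00 feeding as owed, and a hall pulse after the motor fires confirms the rotation actually happened.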
Possible Microcontroller: ESP32 ## Subsystem 2: Rotating Motor and Sensor This subsystem will consist of a motor that will be connected to the main PCB via a relay. The relay will take input power from the battery and a signal to switch on from the ESP32. The output shaft will hold the container of food. The container will have a magnet on the part of the food container that rotates so that a sensor can detect when it rotates to ensure that the food actually dispensed. Possible Motor: 5V Motor at 12RPM Possible Sensor: Hall-effect sensor of some variety ## Subsystem 3: Mobile App The mobile app will be programmed with multiple buttons that will communicate with the wireless transmitter on the ESP32. These buttons would manually feed the fish, change the feeding schedule, and turn on/off the feeder. The app will also notify the user when food is being dispensed and when the food level in the feeder is low. The app would also be used for implementation of the camera or water quality add-on. ## Subsystem 4: AC Switching / Charging System This subsystem will consist of an IC that will be used to switch between AC power and battery power and another IC to control the charging of the battery. The battery would be a LiPo battery that is used as a backup to AC wall power. When AC power is restored, the charge controller will calculate how much charge is needed to put 100% charge in the battery. When AC power is available, the unit will use AC power. The battery will solely be for a backup. Possible Implementation: One IC to control the charge and one IC to implement switching different sources, a battery, and an input port such as USB-C. # Criterion For Success - Manual feeding via button on feeder and in app works. - Magnetic sensor detects that the food actually dispensed into the tank. - App successfully notifies the user that the food was dispensed. - When scheduling feeding times using the app, the food is dispensed at the specified times. 
- When no AC power from the wall is detected, the feeder switches to battery power. |
||||||
19 | Electric Water Blaster |
Clark Taylor Jaejin Lee John Lee |
Rui Gong | Michael Oelze | proposal1.pdf |
|
# Electric Water Blaster Team Members: - clarkmt2 - jaejin2 - junhee2 # Problem Common problems with traditional water guns are that they rely on manual pumping and squeezing, leading to inconsistent water pressure, limited range, and user fatigue. They also provide no feedback on the water level and no interface for the user. Our project is to build a fully electric, high-pressure water blaster that aims to fix those issues and add additional features. It will deliver consistent, controlled bursts of water while providing real-time feedback along with improved ergonomics and enhanced water resistance. We will integrate intelligent electronics with a robust mechanical system to provide a more engaging and reliable experience for users. # Solution For our project, we will develop an electric water blaster. At a high level, it will have a 12V DC electric pump. The pump will fill the tank and pressurize the water up to 60 PSI. A solenoid valve will control the release of water, allowing it to fire powerful bursts. Our project will make use of an array of sensors to detect internal leaks and the current water level of the tank. The water blaster will also feature an OLED display which will show the current state (filling, firing, idle, etc.) of the water blaster so the user can understand what is happening at all times. The OLED display will also show the fill level of the water blaster so the user knows how many bursts of water remain in the tank. There will also be buttons below the OLED to allow the user to navigate through the menus and make adjustments to the firing logic of the water blaster. We plan to power the water blaster with an off-the-shelf battery, likely one made for electric tools due to their durability and high output characteristics. 
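The display states mentioned above (filling, firing, idle) imply a small state machine with a leak-sensor override and the 60 PSI pressure target. One hedged way to model it is sketched below; the state and input names are ours, not the team's firmware.

```python
# States: IDLE -> FILLING -> READY -> FIRING -> FILLING ... (SHUTDOWN on leak)
def next_state(state, inputs):
    """inputs: dict with 'pressure_psi', 'trigger', 'leak', 'water_low'."""
    if inputs["leak"]:
        return "SHUTDOWN"          # leak sensor forces a safe stop
    if state == "IDLE" and not inputs["water_low"]:
        return "FILLING"
    if state == "FILLING" and inputs["pressure_psi"] >= 60:
        return "READY"             # pump reached the 60 PSI target
    if state == "READY" and inputs["trigger"]:
        return "FIRING"            # open the solenoid valve for a burst
    if state == "FIRING" and not inputs["trigger"]:
        return "FILLING"           # re-pressurize after the burst
    return state
```

The control board would run this loop each tick, driving the pump and solenoid from the current state and mirroring the state name on the OLED.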
# Solution Components ## Subsystem 1: Control Board The control board will ultimately receive all the sensor data and move through our state machine acting accordingly given the inputs from the sensors and the IO subsystems. It will interface directly with the 12V fast actuation solenoid valves, the 12V DC electric pump, the SPI display, and the IO board. ### Components Used: STM Microcontroller - STM32G070KBT6 ## Subsystem 2: Input & Output This subsystem provides user interaction through a display, buttons, and possibly analog controls. The SPI display will show real-time information like fill level, pressure status, and firing mode, and the buttons and knobs will allow for the firing to be altered to be more powerful or less powerful using a menu. If we find additional ways to tweak the firing characteristics in testing we will add those options for the user in the final menu. This will also control the basic trigger functionality with a push button and some creatively engineered 3D-printed parts. ### Sensors & Components Used: OLED SPI Display – NHD-0420CW-AB3 – Displays system status, fill level, and settings. Physical Buttons & Potentiometer – TS02-66-43-BK-160-LCR-D, PTV09A-4020U-B103-ND – Allow user control over burst length, power level, and menu navigation. Push button – TS02-66-43-BK-160-LCR-D – Trigger for water blaster ## Subsystem 3: Battery The battery subsystem will provide the power for the entire water blaster. We currently would like to use some sort of tool battery due to its characteristics like high current ratings and a relatively low voltage. This battery subsystem would likely be around 18V and would then be stepped down to the 12V we need for the electric pump, solenoid valve, and power input on the control board. 
### Components Used: Battery - Milwaukee® M18™ RedLithium™ Battery XC5.0 12V DCDC – COM-18732 – Converts our 18V battery output to 12V ## Subsystem 4: Frame & Shell The frame and shell house and protect all components from water leakage. We will 3D-print our shell for precision, and use TPU-sealed buttons and NPT fittings to prevent any water leakage. ### Features & Components: TPU-Sealed Buttons & Covers – Prevent water ingress. 3D-Printed or Composite Shell – Ensures lightweight durability, potentially carbon fiber or fiberglass Conformal Coating – MG Chemicals - 419D-55ML 419D Premium Acrylic Conformal Coating – Protects electronics from potential moisture exposure. ## Subsystem 5: Sensor Array The sensor array will monitor various signals throughout the water blaster. Currently we would like to track for water leaks inside so we can safely shut off the device. To do this we will use water sensors. A larger part of the sensor array is the water level tracking. We currently have a few ideas but will need to investigate further before we decide on a final implementation. The current ideas are for an array of ultrasonic sensors to approximate the fill level, an LED and a photoresistor to determine the presence of water based on the measured resistance, and a pressure sensor at the bottom of the tank to estimate the weight of the water and therefore the volume of the water. ### Sensors Used: Water Leak Sensor – 101020018 water sensor – Detect leaks inside the frame of the water blaster Ultrasonic Sensor (Optional) – – Estimate how much water is in the tank Pressure Sensor (Optional) – MHR01305PBMNNEAA01 – Measures tank pressure for optimal pump operation. Flow Sensor (Optional) – 114991172 – Measures water input and output to track fill level ## Subsystem 6: Pump, Solenoid Valve & Tank This subsystem handles water pressurization and delivery. 
A 12V self-priming diaphragm pump will pressurize water up to 60 PSI, and a fast-actuating solenoid valve will control short, high-powered bursts. The tank will store pressurized water and be monitored for safe operation. This subsystem is responsible for filling the reservoir, pressurizing the water, and ultimately firing it from the water blaster. ### Components Used: 12V DC Electric Pump - HIGH FLO 12 Volt DC Electric Sprayer Pump - fills the tank and pressurizes the water 12V Fast actuation solenoid valve – 1528-1280-ND – Opens and closes to release a burst of water Flow Sensor (Optional) – 114991172 – Measures water output for performance tuning. # Criterion For Success The goals we would like to meet for our project to be considered complete are as follows: - Consistent water bursts - Distance coverage of over 20ft - Complete frame of water blaster that houses all the components and is leak resistant - A screen that displays the current state of the blaster based on sensor data and timing as well as buttons to navigate the configuration menus - Accurately detects leaks and shuts down accordingly - Tracks water level and remaining shots |
||||||
20 | Vinyl Record Auto-Flipper |
Alfredo Velasquez Bustamante Mohammed Alkawai Riyaan Jain |
Chi Zhang | Yang Zhao | proposal1.pdf |
|
Team Members: - Alfredo Vasquez (av28) - Riyaan Jain (riyaanj2) - Mohammed Alkawai (alkawai2) # Problem Statement: Vinyl records have experienced a resurgence in popularity due to their rich and warm sound quality, and ability to physically own and view your favourite music and artworks. However, the need to manually flip records disrupts the listening experience, making listening to vinyl records more difficult than it needs to be. To address this, we propose developing an automatic record flipper that detects when one side has finished playing and seamlessly flips the record to continue playback without user intervention. # Solution Overview: Our design will integrate three primary subsystems to automate the playback of both sides of a 7-inch vinyl record: Tonearm Mover: Automates the lifting, positioning, and lowering of the tonearm to start playback and to clear the record during flipping. Record Flipping Mechanism: Automatically flips the record to play the opposite side upon detecting the end of a side. Turntable Rotator: Controls the rotation of the turntable to ensure proper playback speed and synchronization with the other subsystems. We will modify an existing compact record player to incorporate these subsystems, drawing inspiration from the flipping mechanisms used in vintage jukeboxes. Our focus will be on adapting these concepts to a smaller, modern context suitable for 7-inch records. # Solution Components: ## Tonearm Mover: Function: Automates the movement of the tonearm to initiate playback and to lift it away during the record flipping process. Components: - Servo Motor (HS-318): To precisely control the vertical movement (lifting and lowering) of the tonearm. - Stepper Motor (290-028): To manage the horizontal distance to get to the record - Ultrasonic Sensor (HC-SR04): To detect the end of the record by sensing a decrease in distance, indicating the tonearm is below it. 
- ESP Microcontroller (Part # not found on ECE supply): ESP microcontroller to process sensor inputs and control motor actions. ## Record Flipping Mechanism: Function: Automatically flips the record to enable playback of the opposite side. Components: - Roller Actuator (Part #W171DIP-21): To rotate the record from one side to the other. - Side clamps (5075A25): To securely hold the record during the flipping process without causing damage. - Ultrasonic Sensor (HC-SR04): To confirm the presence and correct positioning of the record before and after flipping. - Control Circuitry (W171DIP-21): To manage the timing and sequence of the flipping operation. ## Turntable Rotator: Function: Ensures consistent and accurate rotation of the record at standard playback speeds. Components: - DC Motor with Speed Controller (Part #ROB-10551): To drive the turntable at precise speeds (45 RPM for 7-inch records). - Rotary Encoder (Part #377): To monitor and adjust the rotational speed in real-time. - Power Supply Unit (Part #168605): To provide stable power to the motor and associated electronics. # Criteria for Success: - Automatic Detection: The system accurately detects the end of a record side without user intervention. - Seamless Flipping: The record is flipped automatically and correctly aligned for playback of the opposite side. - Tonearm Precision: The tonearm is precisely controlled to avoid damaging the record or stylus during lifting, positioning, and lowering. - Playback Quality: The system maintains or enhances the audio quality of the original record player, ensuring no degradation due to the automation processes. - User Safety: The automated components operate safely, posing no risk to users during operation. |
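The end-of-side detection above relies on a sustained drop in the ultrasonic sensor's distance reading. A small debounce over the reading stream captures the idea (a hypothetical sketch; the threshold and sample count are illustrative, not specified in the proposal):

```python
def side_finished(readings_cm, threshold_cm=5.0, consecutive=5):
    """Return True once `consecutive` successive distance readings fall
    below `threshold_cm`, indicating the tonearm has moved beneath the
    sensor. The run counter debounces single noisy HC-SR04 readings.
    Threshold and count are illustrative placeholders."""
    run = 0
    for d in readings_cm:
        run = run + 1 if d < threshold_cm else 0
        if run >= consecutive:
            return True
    return False
```

On the microcontroller the same logic would run over live readings; a single spurious low reading does not trigger a flip.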
||||||
21 | ClassroomClarity: Portable Teacher Support Hub |
Jesse Gruber Kaitlin Gowens Maddie Donku |
Aishee Mondal | Michael Oelze | proposal1.pdf |
|
# ClassroomClarity: Portable Teacher Support Hub Team Members: - Maddie Donku (mdonku2) - Kaitlin Gowens (kgowens2) - Jesse Gruber (jgruber5) # Problem In the classroom today, students may be reluctant to raise their hands to ask a question, or the professor may not see them. Questions that are critical to understanding the material go unanswered as a result. Asking questions and getting clarification on class material is fundamental to learning, which is why the classroom needs to be more accommodating of students' questions. While there are tools such as Mentimeter, these platforms require professors to use time outside of class to create slides and also take up screen space on the lectern. Another issue with the variety of sites used for student engagement is that there is no uniformity for the students. Cell phones and laptops can become clogged with numerous bookmarks for these applications across different classes. Lastly, professors may need an easily noticeable, portable, physical alert to remind them to look at questions that students have posted, which cannot be provided by online means. Professors and students can benefit from a tool that will easily show them how the class is handling material and any questions that may arise. A hub that is consistent between classes will simplify the learning experience for both students and professors. # Solution Our solution introduces a clarity hub to the classroom. The hub will sit in the professor's line of sight with indicator lights, relaying both how the students are absorbing the class material and any questions that have been sent to the hub. The hub will have a "raise-hand" feature which will notify a professor when a student wants to vocally ask their question, and it will include a screen that will display questions that students send in using an app. The hub will have a specific passcode that must be entered into the app to access the hub. 
A wearable will vibrate when a question is present to remind the teacher to look at the hub. This could either be worn or sit near the hub for alerts. # Solution Components ## 1) Hub Control System The control system of the hub facilitates communication between each subsystem and allows for user input through tactile means. This system would include the microcontroller as the brain of the device, as well as a series of LEDs, buttons, and dials. The ESP32-PICO-V3 microcontroller was chosen for its built-in Bluetooth, 520 KB of SRAM to store students' questions, ample GPIOs, and Arduino IDE support. The LEDs can be obtained from the supply center and will be in the colors green, yellow, orange, and red (606-4302H5-5V, HLMP3401, 39K995, HLMP3301). The buttons to select and resolve questions will be D6C90 F2 LFS to provide tactile input on press. The dial to scroll through the questions will consist of a knob (EH71-1SB2S) connected to a 10k rotary potentiometer (P0915N-FC15BR10K). ## 2) Hub Power System This subsystem provides power to the microcontroller and its peripherals. To power the microcontroller, we believe a 3.7 V 850 mAh lithium-ion battery (1568-1495-ND) would be the best option. Its voltage and current rating provide enough range to handle the microcontroller in peak active mode (3.3 V, 360 mA) while reducing wires in the workspace and increasing portability. The 3.7 V is also within range to power the different colored LEDs and other peripherals. To meet the requirements of the different components, a voltage regulator (LM317T (NAT)) will be used with corresponding resistors and 1 uF capacitors to step down the voltage. ## 3) Communication System The communication subsystem works as the link between the app, main hub, and the wearable band. As discussed in #1, #4, and #5, we will be using Bluetooth to transmit data from the app to the main hub and to send a signal to the band to initiate the vibration. 
Since we are planning to use ESP32 microcontrollers in the band and main hub, the Bluetooth functionality is already built in. ## 4) App The app is the student interface which will allow students to submit data to the main hub to pose questions to the teacher and indicate their current understanding. We plan to use Android Studio to code our own app that will include sliders to rate understanding on a 1-5 scale, question submission through text or "raise hand" modes, and a way to connect to the hub via Bluetooth. We want to use Android Studio because it works for both Android and iOS app development and we have worked with it in the past. ## 5) Wearable Band The wearable band acts as a tactile notification system for the teacher. It vibrates when a new question is submitted to subtly notify the teacher. For the same reasons as discussed in #1, we are looking at using an ESP32-PICO-V3 microcontroller to control the vibrations. Similarly, to power the microcontroller, we are currently looking at a 3.7 V 850 mAh lithium-ion battery (1568-1495-ND). The band vibration would be produced by a vibration motor such as the ROB-08449-ND, which requires low voltage (3 V), is small enough to be wearable, and operates within the battery's specifications. A voltage regulator like the LM317T (NAT) along with resistors and 1 uF capacitors will be used to step down the voltage. 
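Since both the hub and the band step the 3.7 V battery down with an LM317, it is worth pinning down the regulator's resistor math. The LM317 sets its output as Vout = Vref(1 + R2/R1) + Iadj*R2, with Vref of about 1.25 V; a quick sanity-check sketch (the resistor values shown are illustrative, not values chosen in the proposal):

```python
def lm317_vout(r1_ohms, r2_ohms, vref=1.25, i_adj=50e-6):
    """Output voltage of an LM317 adjustable regulator:
    Vout = Vref*(1 + R2/R1) + Iadj*R2, with Vref ~= 1.25 V and the
    adjust-pin current Iadj typically around 50 uA."""
    return vref * (1 + r2_ohms / r1_ohms) + i_adj * r2_ohms

# With the common R1 = 240 ohm, R2 ~= 390 ohm lands near a 3.3 V rail:
vout = lm317_vout(240, 390)  # ~3.3 V
```

Running this kind of check before ordering resistors confirms the divider pair actually hits the 3.3 V the ESP32 needs.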
# Criterion For Success - Students able to send questions wirelessly to the hub through an app - Students able to submit engagement ratings wirelessly to the hub through an app - Hub uses lights to indicate general class understanding based on incoming data - Hub displays questions asked by students or indicates that a student raised their hand - Hub allows anonymous/non-anonymous posting when submitting a question - Wearable vibrates upon a question being posted, with different modes that allow repeat vibrations (reminders) while a question remains on the hub - Professor able to clear questions on the hub one by one |
||||||
22 | Updating the Spurlock Museum's PTM Dome |
Nick Mitchell Priya Dutta Sam Mencimer |
Eric Tang | Arne Fliflet | proposal1.pdf |
|
# Updating the Spurlock Museum’s PTM Dome # Team Members: - Priya Dutta (dutta15) - Sam Mencimer (sgm8) - Nick Mitchell (nlm4) # Problem The Spurlock Museum on campus has a department dedicated to digital preservation of artifacts. Since 2001, they have been using a PTM dome to produce 3-D digital images of artifacts which allow them to be studied by researchers anywhere in the world without the risk of shipping the artifact somewhere. The original dome is no longer functional, and updates are needed so the museum can resume their documentation of digital artifacts. This project has been worked on by two other ECE 445 groups in the past. # Solution Our solution will involve building on the previous groups’ progress to hopefully come up with a functional PTM dome. As of now, the dome has 32 LED lights which are wired and functional. It also has a GUI which is intended to control each of these 32 lights individually, whether in sequence or manually. We will build on this to create a system which will: - Control the camera’s shutter button via 3.5mm jack - Sequence each of the 32 lights on the dome individually or in order - Interface with a controller - either software or hardware - to allow the sequencing to occur - Be controllable without causing movement to the dome, as this can cause issues with the photos In addition, we will provide detailed instruction manuals and troubleshooting steps for all aspects of the system to ensure longevity. All components will be designed with repairability in mind, and spares can be provided for components which are custom-designed. The goal is to have this dome work for as long as it is needed, independent of technology updates for operating systems, etc. # Solution Components ## Subsystem 1 - LED Controller board This controller board will have a microcontroller which is capable of controlling 32 12V LEDs individually based on its programming. 
It will interface with an external controller, which will either be software-based or hardware-based, and a camera shutter trigger in the form of a 3.5mm jack. The timing of the camera shutter and the LEDs should align to ensure proper functionality of photo capture. This controller board will build upon the design of past groups. Our intention is to design a new board, but with the assistance of course staff as needed to ensure that we resolve issues encountered in Fall 2024, specifically crosstalk between I/O lines and 12V power. Some of the improvements may include: - Different LED driver setup. Previous groups used an LED driver designed for an LED matrix display, which may be unnecessary for simple applications of LEDs. - Microcontroller with C programming (e.g. STM32) rather than the Arduino programming that was used previously. Our group has experience with embedded C programming, and it is more flexible than Arduino. - Better routing of traces on the PCB to reduce crosstalk This will all depend on input from the course staff and the results of our research in the design phase. ## Subsystem 2 - User Interface Research will be conducted before design documentation is produced to make a decision regarding the best way to control the LEDs. The two options we have are to use the GUI developed by a previous group, or to develop a hardware-based control system. The requirements for the UI are that it must be able to control all 32 lights in sequence or individually, and be usable without causing motion in the dome. These requirements can be met with either control type. The main benefit of a hardware-based control system is that it works independently of an operating system which can become obsolete, but it may require some extra work on our end when it comes to designing it, as there are no off-the-shelf options that meet these requirements. 
The software-based control system can be changed easily, so if there is a problem, we can simply change some code, and it might be easier to use for the museum staff. ## Subsystem 3 - LED Lights & Dome This subsystem is largely complete, although we will have to evaluate the state of the wiring to ensure that it is compatible with our design for the LED controller board. ## Subsystem 4 - Longevity We are calling this a “subsystem” because it is an important component of the project, even if it is not a specific component of the design itself. The dome has the potential to last for a very long time, because inherently it is not very complex. Our designs will be centered around longevity - meaning that every part should be easily replaceable by the end user with a repair manual and basic hand tools. We will do our best to provide spare components for things that are not readily available - e.g. custom circuit boards. If we are unable to provide spares, we can provide design documents to allow manufacturing of replacement parts without the assistance of the people who designed the project. All aspects of the design will take into account what will happen 10+ years from now when, for example, Windows 11 is obsolete. For example, we will avoid using obscure or outdated connectors, or connectors which have the potential to be obsolete (e.g. any form of USB). ## Subsystem 5 - Control Enclosure The control enclosure will ensure that components are protected from ESD and external factors. It will most likely be 3D printed and assembled using screws, rather than glue or other permanent adhesives to ensure repairability. # Criterion for Success ## Criteria from Spurlock staff: - Get the 32 lights functional again - Optimize rewiring the current dome - Ensure the new control box and software program is functional from any OS (Mac, PC, etc.) 
- Design the lighting and control box to fit in our dedicated museum workspace for easy plug-in and access - Make sure the apparatus can be triggered without moving the dome, as the photos are sensitive to movement - Sequence the lights from 1-32, synced to the camera's shutter so that it is triggered to fire as each light turns on. - Have the ability to independently turn on/off any lights in that sequence if needed (Currently, the control GUI turns any light on for 10 seconds only). - The expectation is a functioning dome that allows us to proceed with the essential work of digital artifact preservation. ## Our additional criteria: - Provide manuals and documentation for all aspects of the dome |
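The sequencing criterion (lights 1-32 synced to the camera shutter) can be modeled as a pure schedule of timed events before it is committed to firmware. A hypothetical sketch, with placeholder exposure and settle times:

```python
def capture_schedule(num_leds=32, expose_ms=500, settle_ms=100):
    """Build the event list for one PTM capture pass: for each LED in
    order 1..num_leds, turn it on and fire the shutter together, wait
    out the exposure, then turn the LED off before the next cycle.
    Exposure and settle times are illustrative placeholders."""
    events = []
    t = 0
    for led in range(1, num_leds + 1):
        events.append((t, "led_on", led))
        events.append((t, "shutter_fire", led))
        t += expose_ms
        events.append((t, "led_off", led))
        t += settle_ms
    return events
```

Generating the schedule as data makes it easy to verify on a PC that exactly one LED is lit per shutter firing, then replay the same timings on the microcontroller.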
||||||
23 | Smart Snack Dispenser |
Adam Kramer Elinor Simmons Eric Nieto Gonzalez |
Surya Vasanth | Yang Zhao | proposal1.pdf |
|
Team Members: - Eric Nieto Gonzalez - Elinor Simmons - Adam Kramer # PROBLEM One common problem many people face is difficulty in controlling snack portions, which can lead to overeating and unhealthy eating habits. Mindless snacking, especially when working, studying, or watching TV, often results in consuming more than intended. Similarly, there appear to be no machines on the current market handling this issue, leaving individuals to rely on willpower alone or resort to ineffective portioning methods such as manually separating snacks into smaller bags. Without a structured approach, people often struggle to regulate their intake, leading to issues such as weight gain, unhealthy eating patterns, and difficulty in maintaining a balanced diet. # SOLUTION The smart snack dispenser addresses this issue by allowing users to set portion sizes and control snack intake. By offering a structured approach to snacking, it helps users develop healthier eating habits, prevent overindulgence, and manage calorie intake more effectively. This solution is particularly beneficial for individuals trying to maintain a balanced diet and/or track their food intake. The machine will offer a specific set of 10 snacks, including M&M's, Skittles, Goldfish, almonds, cashews, and others. The machine will also plug into a wall outlet. The solution will include the following subsystems: - Motor Subsystem: There will be a motor installed for each snack so that the user can ask for that specific snack, allowing it to be dropped down to the user. - Light Sensor/Computer Vision Subsystem: There will also be a sensor for each snack to detect when the stock of that snack is running low. This will then be relayed back to the LCD screen to inform everyone that a snack is running low. - PIR Subsystem: This will take care of the machine's dropping mechanism, ensuring that the user has provided a tray for their snacks. 
It will be a PIR sensor to detect if there is a tray present. - Touchscreen LCD Display Subsystem: This will be the UI that allows the user to set their goals, access their profiles, and display crucial information. This will show things like date, time, type of snacks, and nutrition for that person to keep a log. - RFID Subsystem: This will scan each person's ID to access their own personal nutritional goals and data. This way the machine can be used by a whole family instead of just one person. Therefore, the machine will also have data on each person's nutrition. - Portion Control Subsystem: This will make sure that the correct portion is being dispensed. - Software Subsystem: This will handle all of the internal features. These will include a "lockout" system that will prevent the user from dispensing any more snacks once a set daily calorie limit is hit, a recommendation system to suggest a more suitable snack if needed, and the display of date, time, type of snacks, and nutrition for that person to keep a log. There will be two modes offered as well. One is a casual mode that will simply allow the user to pick whatever snack they want and choose a portion; this mode will not implement the lockout system. The other is the main mode, which will provide the user a snack after they choose what they need, for example something to give them more energy or something with more protein. # SUBSYSTEM 1 The Motor Subsystem is responsible for dispensing the snacks correctly without issues arising. Design: - Code the motors in the microcontroller and ensure each functions properly with its respective snack. - Create a format where only a certain amount of snacks gets dispensed with no issues. Components: - A motor per snack that will be in the machine. # SUBSYSTEM 2 The Light Sensor Subsystem is responsible for checking the amount of snack present. It will also tell us when each snack is running low. 
Design: - Calibrate the light sensor with the known depth of each snack container, reduced by a small margin. - This will then allow us to detect that a snack is running low when that preset depth has been reached once more. Components: - A light sensor per snack that will be in the machine. # SUBSYSTEM 3 The PIR Subsystem is responsible for checking if a tray is present or not. Design: - Code the PIR sensor in the microcontroller and check whether there is a tray present or not. - This can be done by detecting how quickly the signal comes back; in the presence of a tray, the return should take much less time than with nothing present. Components: - One PIR sensor for the dispensary. - Design the dispensary in a way that drops the snacks without blocking the PIR sensor. # SUBSYSTEM 4 The Touchscreen LCD Display Subsystem is responsible for displaying all the needed information to the respective user. Design: - Code the LCD display to work properly with the rest of the sensors as mentioned. - Create a solid user interface as well, where the user can interact with the display. - Display needed nutritional facts and constraints on the user if needed. Components: - One touchscreen LCD display - Memory to save all the data that will be implemented # SUBSYSTEM 5 The RFID Subsystem is responsible for checking in each user and locating their respective data. Design: - Code the RFID system so that it functions properly with the user tags Components: - RFID reader - One RFID tag per user # SUBSYSTEM 6 The Portion Control System will make sure the correct portion is being dispensed. Design - Create a frame that will hold the weight sensor. - Code so that the weight sensor readings are shown on the LCD with the correct units. Components - Weight sensor # SUBSYSTEM 7 The Software Subsystem will handle all of the internal features. Design: - Code the lockout system, the recommendation system, date, time, type of snacks, and nutrition log. 
Components: - The ESP32, so we can have access to Wi-Fi # CRITERION FOR SUCCESS Our project will be considered successful if it meets the following testable criteria. 1. The weight sensor is accurate within a 5% tolerance. 2. The motor system dispenses the snack with minimal issues (motor doesn't jam and snacks don't get stuck while dispensing). 3. The user interface works properly and the internal software systems work at the appropriate times. 4. Both sensors, for refilling and for checking if a container is present, are working properly. 5. The snack is dispensed at most 3 seconds after the user chooses the amount of snack. |
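The lockout behavior described in Subsystem 7, refusing further dispenses once a user's daily calorie limit is hit, reduces to a small pure check. A hypothetical sketch (the function names and the casual-mode bypass flag are our illustration of the two modes described above):

```python
def can_dispense(consumed_kcal, portion_kcal, daily_limit_kcal, casual_mode=False):
    """Allow the dispense only if the requested portion keeps the user
    within their daily calorie limit. Casual mode, per the proposal's
    two-mode design, skips the lockout entirely."""
    if casual_mode:
        return True
    return consumed_kcal + portion_kcal <= daily_limit_kcal
```

Keeping the rule as a pure function makes the lockout easy to unit-test on a PC before it runs on the ESP32.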
||||||
24 | FastFretTrainer (FFT) |
Eli Hoon Murtaza Saifuddin Omeed Jamali |
Eric Tang | Michael Oelze | proposal1.pdf |
|
FastFretTrainer (FFT) Team Members: - Eli Hoon (ehoon2) - Murtaza Saifuddin (msaifu2) - Omeed Jamali (ojama2) # Problem As a beginner guitarist, one of the most difficult obstacles you face is learning the notes on the fretboard/neck of the guitar. Guitars do not have markings for each note, only some fret numbers. There are some tools online that can show you where the notes are, but this is not an effective way of learning. The best way to learn is by doing, with feedback from a system so that you are able to take corrective action and quickly fix mistakes. Learning each note on the guitar is important because it allows you to solo/improvise on a piece of music if you know the key. The solution to this problem is FastFretTrainer (FFT). With this project, you will be able to learn notes with real-time feedback through the use of an attachment plugged into the jack of your guitar that sends data to a base station, with visual feedback on how well you played the note and ways to improve through a computer-based application. The user would interact with the computer-based application to interpret the feedback from the system. # Solution For this implementation there would be a small wireless fob that connects to the quarter-inch jack on the guitar; this fob will transmit via Bluetooth to the base station. The fob will be responsible for amplifying, converting analog to digital data, and correctly transmitting data to the base station. The computer-based application and base station will ask the guitarist to play a specific note on the guitar and connect to a computer via USB-C so that the application can manipulate the data and use a Fast Fourier Transform (FFT) to check the similarity between the note played and the expected note. The computer-based app would give more detailed visual feedback, such as a scale showing how far off the played note is in the unit of cents. 
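The core comparison the application performs, finding the dominant frequency of the played note and matching it against the expected one, can be prototyped on a synthetic signal before any hardware exists. A sketch using a naive DFT in place of SciPy's FFT so it stays self-contained (the sample rate and window length are illustrative):

```python
import math

def dominant_freq(samples, fs, max_hz=1000):
    """Return the frequency of the strongest DFT bin up to max_hz.
    A naive O(N*K) DFT stands in for SciPy's FFT in this sketch; the
    real system would use scipy.fft for speed."""
    n = len(samples)
    best_k, best_mag = 1, 0.0
    for k in range(1, int(max_hz * n / fs) + 1):
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
        mag = re * re + im * im
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * fs / n

# Synthetic open-A string fundamental (110 Hz) sampled at 4 kHz:
fs, n = 4000, 512
tone = [math.sin(2 * math.pi * 110 * i / fs) for i in range(n)]
freq = dominant_freq(tone, fs)  # within one bin (fs/n ~ 7.8 Hz) of 110 Hz
```

The bin spacing fs/n also shows why window length matters: a 512-sample window at 4 kHz resolves pitch only to about 7.8 Hz, so the real system would need longer windows (or interpolation) for cent-level accuracy.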
If time permits, we plan to add more advanced features such as several practice modes, including practicing with or without accidentals on single or multiple strings, and it could even include a chord trainer that would be able to recognize the appropriate notes in a chord to determine if the correct notes are played. As a backup and for testing purposes, we will also include a quarter-inch jack on the base station in case the Bluetooth fob is out of battery. # Solution Components ## Battery Subsystem The fob that communicates with the base station will be battery-powered. We are designing the fob to have an operating voltage around 3 volts, so we can base the DC power of the board around a CR2032 battery. The base would preferably be powered over USB-C, but if the USB power delivery does not meet our needs we can use power from the wall through an AC-DC conversion stage. ## Amplification Subsystem The main microcontroller we will be using is the ESP32 (ESP32-WROOM-32D). We chose it because it provides many interfaces (ADC, Bluetooth, UART) that we will be using in our project. The standard amplitude for an electric guitar output is about ~100 mV, so we would like to amplify the guitar signal before analog-to-digital conversion in order to get a clear, well-sampled signal to send via Bluetooth. This subsystem would use a low-pass filter to remove high-frequency noise present in the signal before amplification using the TPA4860 at a 3.3-volt operating voltage. After amplification, we will pass the signal into the ADC. ## Analog to Digital Conversion Subsystem The electric guitar will output an analog signal which represents a sound wave created by the vibration of the strings. Since we are using a UART/USB system to transfer data to a computer system to compute the FFT, we will need to convert the analog signal to digital. 
The microcontroller we selected provides an ADC with channels for us to enable in firmware, and from there we can transmit the data via UART/USB to the computer. Before we transmit the data via UART/USB, we will transfer the data to a different ESP microcontroller placed on the base station via Bluetooth. The data received at the ADC will be amplified via the subsystem described above. ## Wireless Subsystem We will be taking advantage of the Bluetooth radio capabilities of a separate ESP32 (ESP32-WROOM-32D) to establish communications between the fob and the base station. The fob and the base station will each have their own separate ESP32s. We will use the Arduino IDE and the ESP32 BLE (Bluetooth Low Energy) library for communication between the two microcontrollers. The Bluetooth transmission will be responsible for sending the sampled signal data from the fob to the base station for further processing or analysis. ## UART/USB Subsystem We will transmit data to the computer system to compute the FFT for our program via the UART/USB protocol. The ESP32-WROOM-32D microcontroller hosted on the base station will receive digital data from the microcontroller hosted on the fob, and from there send it to the computer system via UART/USB. There are TXD/RXD (transmit/receive) pins on the microcontroller for UART communication, and we will be using a UART/USB bridge to convert to the USB protocol and then transmit the data to the computer. We will use the FT232RL chip as a UART/USB bridge, and then we will connect the pins from the bridge (the main pins are the D+/D- pins for USB) to a USB connector, and via a USB cable we will connect to the computer to transfer data. The USB connector we plan to use is the USB4230-03-A. We will also be delivering power to our base station via the USB subsystem. To do so, we just need to wire the power and ground lines to the microcontroller respectively. 
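Before the samples cross the UART/USB bridge they need some byte-level framing. A minimal sketch, assuming 12-bit ADC readings packed as little-endian 16-bit words (the framing itself is our assumption; the proposal does not specify one):

```python
import struct

def pack_samples(samples):
    """Pack 12-bit ADC readings (0-4095) into little-endian uint16
    bytes for transmission over the UART/USB bridge. The 16-bit
    little-endian framing is an assumed convention, not specified
    in the proposal."""
    return struct.pack("<%dH" % len(samples), *samples)

def unpack_samples(payload):
    """Inverse of pack_samples, run on the PC side before the FFT."""
    return list(struct.unpack("<%dH" % (len(payload) // 2), payload))
```

Agreeing on a framing like this early keeps the firmware (C on the ESP32) and the PC-side Python scripts interoperable; the round-trip is trivially testable on the PC alone.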
## LCD Subsystem Some basic system information, such as whether the system is powered on, will be displayed on an LCD display on our base station. We will also display basic feedback (how many cents off the user is from the expected note) that would also be available via the computer-based app. The LCD subsystem would be composed of a simple I2C LCD display, an I2C LCD adapter, and some wires to connect pins. The display would be hooked up to the adapter, and the microcontroller would send data and power the LCD through the adapter via Vcc and GPIO pins. The LCD would be programmed to show metrics delivered through the data pins with the LCD-I2C Arduino library in C++. ## PC Subsystem The PC Subsystem is responsible for acting as both a source of power for the base station and the main system used for data manipulation and visual feedback. The PC will also be responsible for selecting the note that the user must play. The PC will be set up to enable single-string, multiple-string, and other more advanced practice modes. The PC subsystem would receive the signal from the base station via USB-C, and Python scripts would be used to take the FFT via the FFT module of the SciPy library. If we encounter the issue of Python not being fast enough for data to be processed in real time, C++ could also be used with the FFTW library. Once the strongest frequency from the FFT has been found, we can compare the frequency of the played note and that of the expected note (hardcoded) via conversion to cents, a unit in music that measures note intervals. This conversion can be done via a simple formula, 1200 × log₂(f1/f2), where f1 is the played frequency and f2 is the expected one, and a threshold of ±5 cents can be used to measure whether the note played was accurate. The PC subsystem would then send the cents value back to be shown on the LCD display of the base station via the PyUSB library. 
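The cents conversion described above is small enough to pin down exactly (the ±5-cent threshold is the proposal's; the function names are ours):

```python
import math

def cents_off(f_played, f_expected):
    """Interval between played and expected pitch in cents:
    1200 * log2(f_played / f_expected). Positive means sharp."""
    return 1200 * math.log2(f_played / f_expected)

def note_accurate(f_played, f_expected, tolerance_cents=5):
    """Apply the +/- 5 cent accuracy threshold from the design."""
    return abs(cents_off(f_played, f_expected)) <= tolerance_cents
```

For example, a perfectly played A4 (440 Hz) is 0 cents off, and one semitone up is +100 cents by construction, which makes the function easy to sanity-check.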
The PC Subsystem will also have a local application that could be used to see more visual feedback on how close the played note is via a scale (inspired by GuitarTuna's tuning UI) showing the expected note in the center and the played note's value to the left or right, at a distance based on the cents and on whether the note was too high-pitched or too low-pitched. The UI could also be used to switch between more complex modes like chord training. # Criterion For Success One of our main criteria for success is that our fob is battery-powered and able to communicate with our base station and provide signals via Bluetooth from a distance of at least 5 feet. Our base station should be able to display information, like whether the power is on or off, on the LCD display, as well as basic feedback about the played note without the screen of the computer. The computer will be attached via USB-C (which is also the power connection for the base station) and will implement several software features for added complexity. The base station should effectively communicate with the fob and receive digital signals to pass on to the local computer-based application to take the FFT and accurately compare the played note to the expected note. The application should send basic feedback back to the base station to display on the LCD and give more detailed visual feedback on the computer display, such as a scale showing how far off the played note was in cents. The system will support at least a single-string practice mode, where individual notes are requested and then tested after the user plays to ensure accuracy. One of our group members has never played guitar before, so it would be worthwhile to test with them to see how they are performing/improving with this application. |
||||||
25 | Electronic Martial Arts Paddles |
Alexander Lee Liam McBride |
Jason Jung | Yang Zhao | proposal1.pdf |
|
# Title: Electronic Martial Arts Paddles Team Members: - Liam McBride (liamjm2) - Alexander Lee (asl9) # Problem Currently, there is no good way to accurately quantify performance in Taekwondo training for drills such as speed and power drills. There exists electronic gear for automatic scoring that tracks the power and location of the martial artists' kicks, but that gear is only used in competition and is prohibitively expensive. # Solution We are proposing electronic target paddles with pressure sensors at different locations of the paddle and LEDs to measure power and speed for kicking during training. We will also facilitate reaction speed/timing drills via sound or blinking of the LEDs. Example paddle here: https://www.ctitkd.com/product-page/vision-kicking-target We would have our main system (PCB) be a separate box that would handle the inputs from the paddles and connect to a display to show scores and statistics. # Background Both Liam and Alex are executives for the university's RSO Competitive Taekwondo Club, and have practiced Taekwondo for 10+ years, competing at local to international levels. # Solution Components ## Subsystem 1: Control Box and Display Custom PCB Bluetooth receiver for connecting with the paddle and sending/receiving data and instructions: The target paddles would be difficult to maintain if there were wires coming out of them to the PCB, so we will utilize a Bluetooth connection for the LEDs and sensors HDMI out to a regular display or LCD screen: Have an HDMI connection to a monitor or LCD screen directly from the PCB to display our scores using a health-bar mechanism as commonly seen in video games. We will also display statistics for our drills Wall power/power supply: We would need a constant source of power, so we will use a power supply connected to a wall outlet. 
Sound system/speaker (optional): We will use speakers that play a sound when the target paddle is hit, along with the LED. We will also use the speakers to give sound cues for reaction drills, with different sounds for different kicks or for choosing the right or left leg. ## Subsystem 2: Electronic Paddle Pressure/force sensor: three of these sensors, placed at the front, middle, and rear of the paddle to distinguish the location of hits. Each sensor will measure how strong a hit was, and crossing a certain force threshold will indicate a valid hit. Since force sensors that handle high forces can be fairly expensive, we will need a way to dampen the impact or distribute the force, then scale the measurement so we can use cheaper, lower-threshold sensors. We are also considering an accelerometer as a potentially more affordable option. LEDs or LED strip: these LEDs will indicate a valid hit or cue reaction drills; LEDs of different colors can indicate which part of the paddle was hit, which kick to perform, or the choice of right or left leg. Bluetooth transmitter for connecting with the control box. Battery power: since the components on the target paddles will be physically separate from the PCB box, we will need battery power to keep the LEDs and sensors operating without a direct power supply from the wall. # Criterion For Success - System is able to accurately track response times - System is able to accurately measure force of strikes - Bluetooth is working so we don’t resort to using wires - Paddle and auxiliary machinery is able to withstand repeated strong blows without breaking (a minimum of 10 strikes) - LED, speaker, and sensors work in cohesion - Display accurately reflects desired results. |
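Since the plan is to dampen or distribute the impact and then scale the measurement to fit a cheaper, lower-range sensor, the hit-classification logic might look like the following sketch. The ADC resolution, sensor full-scale force, damping ratio, and validity threshold here are all placeholder assumptions to be replaced by calibration, not measured values:

```python
def classify_hit(adc_counts, adc_max=4095, full_scale_n=450.0,
                 damping_ratio=0.25, valid_threshold_n=150.0):
    """Convert a raw ADC reading from a low-range force sensor into an
    estimated strike force and decide whether it counts as a valid hit.

    damping_ratio is the (assumed) fraction of the true impact force that
    reaches the sensor after the pad distributes/dampens the blow, so we
    divide by it to scale the sensed force back up.
    """
    sensed_force = (adc_counts / adc_max) * full_scale_n  # force at the sensor, in N
    estimated_force = sensed_force / damping_ratio        # estimated true impact force
    return estimated_force, estimated_force >= valid_threshold_n

# A mid-scale ADC reading maps to a strong, valid strike under these constants:
force, valid = classify_hit(2048)
```

The same thresholding would run per sensor (front, middle, rear) so the strongest reading also localizes the hit.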
||||||
26 | Solar Panel Cleaner |
Cameron Little Geoffrey Swisher Thomas Cai |
Maanas Sandeep Agrawal | Arne Fliflet | proposal1.pdf |
|
# Solar Panel Cleaner Team Members: Cameron Little (clittle5) Thomas Cai (wcai10) Geoffrey Swisher (swisher5) # Problem Solar panels are highly sensitive to shading and dirt accumulation, which can significantly reduce their energy generation efficiency. Even partial shading or debris on the surface can create hotspots or disrupt the panel's output, leading to substantial energy losses over time. During ECE 469 Power Electronics Laboratory, we explored techniques to extract maximum power from solar panels installed on the roof of ECEB. However, these experiments highlighted how environmental factors, such as dust and shading, limit the panels' ability to consistently deliver optimal power output. # Solution To provide a cheap and effective solution for various types and models of solar panels, we are going to design a rail-based cleaner. The rail can be attached to the top of the solar panels, with wheels to allow horizontal movements. A soft material like felt can be used to prevent damage to the panels. The cleaning module is then attached to the rail through cables, which can be shortened or lengthened through controllable motors to achieve vertical cleaning. # Solution Components ## PCB Controller The controller subsystem includes a front panel with inputs for controlling the cleaner, as well as the MCU which will interface with the panel and the drivetrain. The front panel will have buttons/switches/knobs to enable and control the operation of the cleaner. The microcontroller will be STM32C0. ## Drivetrain The drivetrain subsystem is responsible for moving the cleaning module across the solar panel surface. The design involves two distinct motion components: Vertical movement for the cleaning module to scale up and down the solar panel using attached cables. 
Horizontal movement for the cleaning module along the rail attached at the top of the panel. Motors such as the NEMA 17 stepper motor will be used for accurate control of both vertical and horizontal movement. These motors will be paired with motor drivers (e.g., DRV8825) to interface with the microcontroller. ## Cleaning Mechanism The cleaning module consists of an interchangeable microfiber cloth and a cleaning solution dispenser. The solution can be dispensed with a [Digiten](https://www.digiten.shop/products/digiten-1-2-dc-12v-electric-solenoid-valve-normally-closed-n-c-water-inlet-flow-switch) ½-inch, 12 V solenoid valve. ## Energy Storage Two (2) 12V drill batteries, such as the [Warrior](https://www.harborfreight.com/12v-lithium-ion-battery-with-charger-57763.html?gQT=1) 12V lithium-ion battery; a charger is included with the batteries. DC-DC converters will be used to power the motors and supply 3V for the microcontroller. To provide power to devices on the moving cleaning component, a [coiled cable](https://www.amazon.com/RIIEYOCA-Female-Cable%EF%BC%8CDC-Extension-Stretched/dp/B0BJT9TC5J?th=1) could be used. # Criterion for Success To ensure the solar panel cleaner is effective, the following goals can be tested: The cleaning mechanism must remove at least 80% of visible debris (e.g., dust, dirt, or bird droppings) from the solar panel surface. Cleaning tests will demonstrate a measurable increase in power output of the cleaned panel, with a minimum improvement of 10% compared to an uncleaned panel under identical lighting conditions. Solar panel power-extraction measurements can be done in the Power Lab on the fourth floor. Manual operation via front-panel controls must allow precise movement of the cleaning module in both horizontal and vertical directions. The drivetrain, motors, and other electronics must function correctly after 40 cleaning cycles without significant wear or failure, in environments ranging from 32°F to 100°F. |
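The vertical cable travel maps to DRV8825 step pulses through the geometry of the cable spool. A small sketch of that conversion, where the spool diameter and microstep setting are illustrative assumptions rather than chosen design values:

```python
import math

def cable_steps(distance_mm, spool_diameter_mm=20.0,
                steps_per_rev=200, microstep=8):
    """Number of DRV8825 step pulses needed to reel a given length of
    cable in or out, for a spool driven directly by a NEMA 17
    (200 full steps/rev) at the given microstep setting.

    spool_diameter_mm and microstep are placeholder assumptions; the
    real values depend on the mechanical design and driver jumpers.
    """
    mm_per_rev = math.pi * spool_diameter_mm   # cable paid out per spool revolution
    revs = distance_mm / mm_per_rev
    return round(revs * steps_per_rev * microstep)
```

One full spool revolution (about 62.8 mm of cable at a 20 mm spool) corresponds to 200 × 8 = 1600 pulses, which the MCU would emit on the driver's STEP pin.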
||||||
27 | Desk Learning Aid Device |
Aidan Johnston Conan Pan Ethan Ge |
Kaiwen Cao | Michael Oelze | proposal1.pdf |
|
# Desk Learning Aid Device Team Members: - Conan Pan (cpan23) - Aidan Johnston (aidanyj2) - Ethan Ge (ethange2) # Problem As a result of ongoing technological growth and the COVID-19 pandemic, there has been a shift in recent years to integrate technology into the classroom via computers, devices, and virtual learning. However, this shift has generated further problems, specifically in elementary school classrooms. Problems we have noticed include young children spending more time on screens, spending less time socializing, and being far more disruptive. These trends contribute to a less effective and unhealthy learning environment. In pursuit of a more social, engaging, and nurturing environment for young students, we propose the desk learning aid device. # Solution The desk learning aid device will function through various buttons connected to a customized PCB device. Buttons will correspond to responding to polls/questions, comprehension checks, asking questions, and more. The device will communicate with an application monitored by the teacher, who will receive real-time feedback. The teacher can thus better understand students’ comprehension levels and tailor the lesson to be most effective. The purpose of this device is to provide a cost-effective solution that can be set up at each student’s desk to promote a holistically better learning environment. This differs from other options on the market in its easier setup: other options require the teacher to create a question in order to receive a response, whereas our device allows for many passive inputs, including comprehension checks and other urgent needs. In addition, other portable solutions require students to buy each device individually, costing them hundreds of dollars, whereas our solution only requires the purchase of a reusable RFID keycard that is cheap and easy to use. 
# Solution Components ## Input Subsystem This subsystem will include response buttons (EVQ-P7K01P, Panasonic) for comprehension checks, request-for-assistance buttons, feedback buttons, and mental/emotional health check-in buttons. These buttons will be labeled accordingly so that student interaction with the device is simplified. The advantage of having a variety of buttons is that teachers need as little interaction with the app as possible. In addition, this subsystem will include a scroller (Bourns PTL30 Series, PTL30-15O0F-B103) that will enable students to adjust in real time how they are feeling throughout the day. ## Interface Subsystem (SSD1306 0.96" OLED Display (I²C)) This subsystem will include a basic interface that serves several purposes: It will display the user’s name once the user checks in, thus verifying the check-in. It will display a range of emotions that students can select via their scroller. It will display the answer choice selected by the user for comprehension checks. ## Microcontroller Subsystem The ESP32-S3 microcontroller is programmed with firmware to recognize button inputs, process them according to whether a question has been asked, and transmit student data to the mobile app run by the teacher. ## Mobile Application The mobile application subsystem serves as the teacher’s interface to monitor student responses, track participation, and adjust lesson pacing in real time. The app receives data from student devices via Wi-Fi or Bluetooth, displaying responses in a structured and visual manner. The teacher can view class-wide comprehension trends, see which students need help, and manage classroom activities such as quizzes and polls. It will also receive data from the RFID/NFC keycard subsystem and store each student’s participation, attendance, and comprehension data. Components: Frontend UI: Built using React. Displays real-time responses, feedback, and participation data. 
Backend & Communication: Firebase Realtime Database to handle instant message transmission. Secure BLE/Wi-Fi communication with ESP32 devices. Data Processing & Visualization: Aggregates student responses for charts, graphs, and heatmaps. Uses D3.js or Chart.js for real-time visualization of classroom engagement. Authentication & Security: Teachers log in with Google OAuth or school credentials. ## Power subsystem The 103454 LiPo rechargeable battery system will be used to power the device. Ideally, we’d also like to make the system as efficient as possible to ensure that it doesn’t need frequent recharging. This battery system is preferred over wired power due to the installation and cable management that wired power requires. Furthermore, desks are constantly moving in a classroom, whether for rearranging seats or during seasonal cleans, further highlighting the advantage of the battery system. ## RFID/NFC (Keycard) subsystem The RFID/NFC subsystem allows students to log in quickly and anonymously using keycards without the need for manual name entry or personal devices. This ensures a seamless and low-disruption way to track participation, attendance, and response data. By tapping their RFID or NFC card on their desk device, students authenticate themselves before answering questions or engaging in activities. This enables teachers to monitor individual engagement and performance trends without requiring students to use personal logins. Components RFID/NFC Reader Function: Reads the keycard’s unique ID Part: RC522 NFC Module (SPI-based) RFID Key Cards Function: Unique identifier for each student Part: MIFARE 13.56 MHz RFID Cards # Criterion For Success The PCB device accurately registers button presses and sends the data to the mobile application. The mobile application receives user data from the microcontroller and stores/analyzes trends in the data for classroom comprehension of specific topics. 
The keycard correctly signs the user into the “classroom” that the device belongs to. The entire system remains functional throughout an entire school day. The button design is clear and simple for students to interact with. The desk learning aid device integrates smoothly into the existing classroom format - without adding substantial work for students and especially teachers. |
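As a sketch of the device-to-app protocol, each button press could be serialized into a small JSON message tagged with the student's keycard UID before transmission over BLE/Wi-Fi. The field names, button-to-event mapping, and UID format below are our own illustrative assumptions, not a fixed specification:

```python
import json
import time

# Hypothetical mapping from physical button index to app event name.
BUTTON_EVENTS = {0: "answer_a", 1: "answer_b", 2: "answer_c",
                 3: "answer_d", 4: "help_request", 5: "checkin"}

def build_event(student_uid, button_id, timestamp=None):
    """Serialize one button press into the JSON message a desk device
    might send to the teacher's app. All field names are assumptions."""
    if button_id not in BUTTON_EVENTS:
        raise ValueError("unknown button id: %r" % button_id)
    return json.dumps({
        "student": student_uid,                 # UID read from the RFID keycard
        "event": BUTTON_EVENTS[button_id],      # semantic event for the app
        "ts": timestamp if timestamp is not None else time.time(),
    })

# Example: student taps the help button.
msg = build_event("04A1B2C3", 4, timestamp=100)
```

Keeping the payload a flat JSON object makes it easy to write straight into a Firebase Realtime Database node keyed by student UID.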
||||||
28 | Early Project Approval: GymHive Tracker |
Aryan Shah Kushal Chava |
Aishee Mondal | Arne Fliflet | proposal1.pdf |
|
# **Early Project Proposal: GymHive Tracker** # **Team Members:** Aryan Shah (aryans5) Kushal Chava (kchav5) # **Problem** A common frustration among gym-goers is that equipment tends to be occupied quickly during peak gym times. Many fitness enthusiasts craft workouts designed around tracking their strength across each workout, requiring them to follow a structured routine of the same machines in a consistent order. During these peak times, machines can be occupied and even have lines forming to use them, making it important for a gym-goer to know whether their next machine is occupied and to have an estimated wait time (based on how many other people are waiting to use it, as well as their estimated rep and set counts) until they can use it. A solution to this issue would significantly optimize an individual’s workout by reducing idle time and benefit the gym itself by improving the quality of workouts for its customers. # **Solution** Our solution is the GymHive Tracker, a pressure-sensor-based system that will monitor the utilization of each piece of gym equipment and display real-time availability updates on an external screen, such as a monitor. The system relies on weight sensors attached to common points of usage on each piece of equipment, depending on its type. Most gym equipment will have sensor points placed on pads of contact, where the body is stabilized against the machine. As a rough guideline to cover most equipment, we will place the sensor on points of contact where the user typically sits, stands, or leans. This covers most gym equipment; however, some equipment may inevitably be inapplicable for this preliminary design. A status of occupied vs. not occupied will then be displayed to provide real-time information to users. 
In addition, we plan to implement a “check-in” system, which allows other users waiting in line to join a “queue” to use the equipment, providing info such as their estimated sets and reps so that other gym-goers can be given an estimated “wait time” until they can use the machine. To create a modular design, each machine would have its own PCB that is specifically designed for its intended usage. Each PCB would transmit data through RFID technology. The RFID chip/module will communicate with a microcontroller chip (ESP32), which will handle data transmission to an AWS server for our processing. A user would simply hover their smartphone over the machine and then be presented with an app we design. This app will allow the user to join the queue and occupy the machine when it is their turn. They will input their desired reps and sets, which would be useful information for a gym-goer wanting to use the machine when it is free. Based on the data we gather, we will provide an estimated wait time based on how many individuals, sets, and reps are yet to be completed. In addition, when the next user’s turn is coming up in the queue and the current user is on their last set, the system will send a notification to that next user so that they can start heading over to the equipment. An IMU (Inertial Measurement Unit) sensor will be placed to track the repetitive motion patterns that count as a “rep” for each machine. By collecting this data, we can track when a user has completed their reps/sets and update the app accordingly without any action from the user. 
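The wait-time estimate described above reduces to simple arithmetic over the queue ahead of a user. A rough sketch, where the per-rep duration and inter-set rest time are placeholder assumptions that would really be tuned from collected usage data:

```python
def estimated_wait_minutes(queue, sec_per_rep=4.0, rest_sec=60.0):
    """Rough wait-time estimate for a machine.

    queue is a list of (sets, reps) tuples for the users ahead of you,
    in order. sec_per_rep and rest_sec are illustrative assumptions:
    each rep takes ~4 s and users rest ~60 s between sets.
    """
    total_sec = 0.0
    for sets, reps in queue:
        work = sets * reps * sec_per_rep          # time spent actually lifting
        rest = max(sets - 1, 0) * rest_sec        # rest between (not after) sets
        total_sec += work + rest
    return total_sec / 60.0

# One person ahead doing 3 sets of 10 reps -> 120 s work + 120 s rest = 4 min.
wait = estimated_wait_minutes([(3, 10)])
```

The stated ~20% error-margin goal could then be checked by comparing these estimates against observed machine turnover times.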
# **Solution Components** **Subsystem 0: Microcontroller Chip (part of custom PCB)** Description: The microcontroller chip will be part of the custom PCB we design, which will process the sensor data, handle communication via the RFID chip, and transmit data to an AWS server. Microcontroller: ESP32 chip capable of handling the Wi-Fi communication needed to send data to the AWS server, as well as the RFID data input. Wireless Communication: Wi-Fi communication from the chip to the AWS server. Inputs and Outputs: GPIO pins for connecting the pressure sensor and RFID module. The chip also has serial/UART communication to help debug. Custom PCB: Designed to integrate the ESP32 microcontroller chip, power management circuits, and RFID module. **Subsystem 1: Pressure Sensing Module (part of custom PCB)** Description: This module is the main component for detecting whether the gym equipment is occupied by an individual. It works by measuring the pressure applied to a surface, most likely a bench seat, handle, or other platform used as the main point of contact for the machine. Pressure Sensor: High-precision sensors (e.g., https://www.digikey.com/en/products/detail/uneo-inc/GHF-10/15657152) attached to gym equipment to detect weight changes and determine occupancy. The sensor can measure any weight from 0 to 110 lbs, which is sufficient to detect whether an individual is occupying the machine. Analog output to readable signal: The output of the GHF-10 is an analog signal, so we need to condition it using a voltage divider resistor (roughly 10 kΩ) and a capacitor to reduce noise. The resulting signal will be read by the ESP32’s ADC pins. **Subsystem 2: RFID Communication (part of custom PCB)** Description: This module allows the user to simply hover their smartphone over the gym machine and be presented with the app to join and view the queue. 
RFID Module: The RC522 (https://www.electronicwings.com/esp32/rfid-rc522-interfacing-with-esp32) appears to be one of the most popular RFID modules used alongside the ESP32. Communication with ESP32: The plan is to begin with I2C communication, since there are more resources online and it is sufficient for our needs (although SPI can also be used). **Subsystem 3: Inertial Measurement Unit (part of custom PCB)** Description: This module allows our PCB to detect reps based on motion data. This will allow for real-time updates to an equipment’s availability without any user action. IMU Sensor: The ICM-20948 is a motion sensor IC with the necessary tracking elements (a 3-axis accelerometer and a 3-axis gyroscope, providing the 6 degrees of freedom we need) and is easily integrated with the ESP32 (https://www.digikey.com/en/products/detail/tdk-invensense/ICM-20948/7062698). Communication with ESP32: The plan is to use I2C due to lower pin usage and not needing SPI communication. The ESP32 will analyze the motion data and send it to the app to track workout progress. **Subsystem 4: Mobile App** Description: The app is designed to provide users with real-time equipment status and input/output functionality for them to “check in” to the machine. Functionalities: The app will need to display real-time availability and wait times, allow users to check in or join the queue, accept user input for desired sets and reps, and notify users when their turn is coming up. Backend: Data from the ESP32 is processed and stored in an AWS server. Our app will query that server to gather and process the data. **Subsystem 5: Power Supply (powers the sensors and other modules, part of custom PCB)** Description: This module powers the sensors and other hardware. Battery: Rechargeable 12V lithium-ion battery packs for portability and reliability. DC-DC Converter: Provides stable voltage outputs to the pressure sensor and the RFID and IMU modules. 
Components: Will be responsible for power delivery to the sensors, RFID, and ESP32. We will test with a bench power supply first, as suggested in lecture, and then move on to battery integration. # **Criterion for Success** 1. Sensors must detect equipment occupancy with at least 95% accuracy. 2. The system should be able to differentiate between legitimate equipment usage and random weight placement/fluctuations in sensor data (testing pressure sensor functionality). 3. Reps must be detected with at least 90% accuracy (tested against motion data from the IMU). 4. Sensor data should be transmitted and displayed within 1 second of a change in occupancy status. 5. The system should provide wait time estimates with an error margin of around 20%. 6. All hardware must function correctly after 40 hours of operation in a simulated gym environment. 7. The display system must enable gym-goers to track available equipment efficiently. 8. We will test the system by having multiple users verify its tracking capabilities as well as wait time estimates in a gym environment. |
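Criterion 2 (distinguishing legitimate usage from random weight placement) suggests debouncing the pressure readings with hysteresis rather than comparing against a single threshold. A host-side sketch of that logic; the ADC thresholds and sample counts are uncalibrated assumptions:

```python
class OccupancyDetector:
    """Debounced occupancy detection from pressure-sensor ADC samples.

    A machine is marked occupied only after readings stay above an upper
    threshold for on_samples consecutive samples, and marked free only
    after readings stay below a lower threshold for off_samples samples.
    The gap between thresholds (hysteresis) rejects brief weight
    placement and sensor fluctuations. All constants are illustrative.
    """
    def __init__(self, on_thresh=1500, off_thresh=800,
                 on_samples=5, off_samples=10):
        self.on_thresh, self.off_thresh = on_thresh, off_thresh
        self.on_samples, self.off_samples = on_samples, off_samples
        self.occupied = False
        self._count = 0  # consecutive samples supporting a state change

    def update(self, adc_value):
        """Feed one ADC sample; return the current occupancy state."""
        if not self.occupied:
            self._count = self._count + 1 if adc_value > self.on_thresh else 0
            if self._count >= self.on_samples:
                self.occupied, self._count = True, 0
        else:
            self._count = self._count + 1 if adc_value < self.off_thresh else 0
            if self._count >= self.off_samples:
                self.occupied, self._count = False, 0
        return self.occupied
```

At, say, 10 samples/s, these counts correspond to roughly half a second to register occupancy and one second to release it, comfortably inside the stated 1-second reporting budget.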
||||||
29 | Smart Tripod |
Henry Thomas Kadin Shaheen Miguel Domingo |
Chi Zhang | Viktor Gruev | proposal1.pdf |
|
# Smart Tripod Team Members: - Henry Thomas (henryjt3) - Kadin Shaheen (kadinas2) - Miguel Domingo (jdomi8) 1. Problem Traditional tripods provide stability for cameras and smartphones but lack dynamic adjustability and real-time framing assistance. When setting up a shot, users must manually adjust the tripod’s angle and position, often requiring multiple iterations to get the perfect frame. This is especially inconvenient for solo photographers, vloggers, or group shots where precise positioning is essential. Additionally, while taking personal videos, standard tripods will not adjust their camera angle to ensure you stay in frame and centered. Though motor-controlled tripods do exist, they lack the extra functionality of viewing the camera image in real time, and they do not offer automatic subject tracking. 2. Solution We are creating a smart tripod system that enhances traditional tripods by integrating motorized adjustments and real-time framing assistance. This system will allow users to remotely control their phone’s position and preview the shot through an external display, making it easier to capture well-framed images and videos without manual repositioning. The smart tripod will connect wirelessly to a user’s smartphone and use stepper motors to adjust the phone’s angle and orientation. An external display will provide a live preview of the camera feed and serve as the control interface for adjusting the tripod’s position. The system will also include a tracking feature in which the camera follows a subject, adjusting its orientation to ensure that the subject stays centered in the field of view. 3. Solution Components Subsystem 1 - Motorized Positioning System (MPS) The MPS will utilize 2 stepper motors for zenith and azimuth orientation. The main body will be 3D printed from a non-toxic material, most likely PLA. It will also include a phone mount and clamp made of the same material. 
The MPS will have the following electronic components: a custom PCB; an ESP32 for WebSocket interfacing and motor control; 2 MakerLabs DRV8825 stepper motor controllers; 2 Adafruit 324 12V 350mA stepper motors; and a power system (discussed below). Subsystem 2 - Remote Display and Control Interface The ESP32S3 controls the tripod’s motors via WebSockets over Wi-Fi, with physical buttons for azimuth (horizontal) and zenith (vertical) adjustments. A Raspberry Pi 4, running RPiPlay, wirelessly receives the iPhone’s camera feed via AirPlay and displays it on a Waveshare 2.4-inch SPI LCD. OpenCV on the Raspberry Pi processes the video to track a subject, sending position data via GPIO through a SparkFun BSS138 logic level translator to the ESP32S3, which adjusts the tripod accordingly. A switch toggles between tracking and manual modes. WebSockets over Wi-Fi enable motor control and iPhone camera actions (photo, video, zoom). The ESP32S3 provides a shared Wi-Fi network for seamless communication. The remote control interface will also contain a custom PCB and a power system, the latter of which is discussed below. Subsystem 3 - App Interface A custom app will use WebSockets to receive ESP32S3 commands over Wi-Fi and control iPhone camera functions via AVFoundation, including video start/stop, photo capture, and zoom. Subsystem 4 - MPS Power System This subsystem supplies power to the stepper motors, ESP32, and motor drivers. The power system will include: 1 KBT 12V, 2600mAh Li-ion battery pack and 1 Recom R-78B3.3-1.0 3.3V buck converter. Subsystem 5 - Remote Display and Control Interface Power System The power system of the control interface is designed to supply and maintain onboard power to the Raspberry Pi, ESP32S3, and other onboard circuitry. 
The power system will include: one 3.7V, 2000mAh 2C LiPo battery; a 1S 3.7V 2C (4 A working current) BMS; a Type-C connector for charging; a 3.3V step-down voltage regulator for the ESP32 and logic level translator; and one 5V step-up voltage regulator for the Raspberry Pi, logic level translator, and LCD display. 4. Criterion For Success - Motors must respond to inputs and tracking commands within 250ms with precise movement (±2°). - iPhone camera actions (photo, video, zoom) must trigger within 500ms over Wi-Fi. - iPhone screen must stream to the remote display via AirPlay with <1s latency and ≥24 FPS. - Tracking must detect and follow the subject within 250ms after receiving video, maintaining focus on the first detected subject. - The system must run for at least 30 minutes without overheating, maintaining stable operation. |
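In tracking mode, the position data sent to the ESP32S3 must ultimately become step counts for the azimuth and zenith motors. One way to sketch that mapping uses the motors' 1.8° full-step angle and the ±2° precision target; the camera field-of-view values, frame size, and deadband below are our own placeholder assumptions, not measured parameters:

```python
def tracking_steps(cx, cy, frame_w=640, frame_h=480,
                   hfov_deg=60.0, vfov_deg=45.0, deg_per_step=1.8,
                   deadband_px=15):
    """Convert a tracked subject's centroid (cx, cy) in pixels into
    signed (azimuth, zenith) step counts that recenter the subject.

    A small pixel deadband avoids jittering the motors when the subject
    is already near center, which also helps hold the ±2° spec.
    """
    dx = cx - frame_w / 2   # positive: subject right of center
    dy = cy - frame_h / 2   # positive: subject below center
    az = 0 if abs(dx) < deadband_px else round((dx / frame_w) * hfov_deg / deg_per_step)
    ze = 0 if abs(dy) < deadband_px else round((dy / frame_h) * vfov_deg / deg_per_step)
    return az, ze

# Subject at the right edge of a 640x480 frame -> pan right ~17 full steps.
pan, tilt = tracking_steps(640, 240)
```

On the real system the centroid would come from OpenCV on the Raspberry Pi and the step counts would be issued by the ESP32S3 to the DRV8825 drivers.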
||||||
30 | PawFeast: Food on Demand |
Arash Amiri Kathryn Thompson Omkar Kulkarni |
Aishee Mondal | Yang Zhao | proposal1.pdf |
|
# PawFeast: Food on Demand Team Members: - Omkar Kulkarni (onk2) - Arash Amiri (arasha3) - Kathryn Thompson (kyt3) # Problem All pet owners must remember to feed their pets at set times during the day. There are times when people forget to feed their pets, double-feed them when there are multiple people in the house and communication is poor, or have trouble feeding them on time when they get home late due to prior conflicts. In these cases, pets eat too much, too little, or irregularly. As a result, timer-based pet feeders have been created that release food at set times. However, this introduces a new problem: food that was sealed in an airtight container is now released into an open environment. When this food sits out for extended periods of time, it risks becoming stale or attracting bugs. # Solution We are seeking to solve this problem by combining timers that prevent overeating with pressure sensors that must be triggered to release food. Given both of these conditions, the dispenser will release the food for the pet. This ensures that the food is fresh when the pet goes to eat and that the pet is fed on time. In addition, if the pet does not eat all the food at once and leaves the food dispenser, the pressure sensor will tell the dispenser to cover the food until the pet returns, ensuring freshness and preventing bugs. Our system will also notify owners when the pet has eaten, or when the food dispenser has low levels of food. # Solution Components ## Subsystem 1: Refilling Food Store The subsystem for checking if the food store is empty will include an infrared presence sensor (Vishay TSSP77038) that is able to measure up to 2 meters in front of it to determine whether an object is present; this will serve to detect whether the food inside of the container is low and needs to be refilled. 
When the food store reaches a certain level, a signal will be sent to the brain subsystem (Subsystem 6), which will send an automated message to notify the owner to refill the food store. ## Subsystem 2: Power System The power system will supply energy to all of the various subsystems, including the microcontroller in the brain subsystem, the stepper motor driver in the food dispenser subsystem, and the various sensors throughout the system. The low-voltage buses will be 3.3V, 5V, and 12V, all of which will be powered by a combination of rechargeable lithium-ion batteries and low-dropout regulators. This compact, custom battery pack will allow the pet dispenser to be portable, even for family vacations or road trips. The AMS1117 LDO is available in fixed output voltages of 1.2V, 1.5V, 3.3V, 5V, and more for all of the lower-voltage needs; the 12V supply will come directly from the battery pack or possibly from a DC-DC converter. ## Subsystem 3: Food Dispenser This subsystem will utilize an E-Series NEMA 17 stepper motor that will slide open a door to the food container when needed and allow a set amount of food through before sliding the door closed. The motor will be controlled by a stepper motor controller IC (DRV8825) with a custom control circuit that takes in 12V and outputs 3.4V to the motor. Hard cutoffs will also be coded for the maximum amount of food and the minimum and maximum feeding times during the day. The main objective of this subsystem is to make the system an on-demand food system. We will utilize a large button for the dog to step on when hungry. When pressed, the system will start the motors to release food, on the condition that enough time has elapsed since the pet last ate. The pet would be trained to ask for food using this button. Only when the button is pressed and enough time has passed will food be dispensed. ## Subsystem 4: RFID for pet identification We will have receivers for a minimum of two RFID tags (UHF RFID tag 9662, Long Distance Passive, Alien H3). 
The RFIDs would be used to signal which pet is using the dispenser, so the owner is able to utilize the same dispenser for one or more pets. ## Subsystem 5: User Interface The subsystem for notifications will be a phone application that we will build, able to connect with the pet feed dispenser system. For the application side, we will utilize React Native and Expo to create a user-friendly way to check the pet’s eating habits. This will notify users when the pet has been fed, if the food tank is low and requires refilling, and if the food was covered due to an empty bowl or a partially filled one. ## Subsystem 6: Brain The brain subsystem will take in inputs from all of the other sensor subsystems and output the corresponding signals to the user interface (for notifications) and the food dispenser motor driver. This subsystem will also track the amount of time that has passed and sound a soft chime when enough time has passed for the pet to be able to eat. The ESP32 microcontroller, known for its Wi-Fi connectivity and security measures, has a set of I/O pins for taking in these signals. The MCU programming circuit and other control-level circuitry (firmware or equivalent) would be incorporated onto this board, along with a UART circuit to detect pet RFIDs (done with the UHF RFID reader JRD-4035). # Criterion For Success The mechanical feeder system should drop a bowl-full of food once a valid RFID is nearby, a pet-intended button has been pressed, AND enough time has passed since the pet last ate. An owner should receive a notification once the food store has decreased below 10% and 20%, and when criterion 1 has been satisfied. The power system should be power efficient: the rechargeable battery should last for at least 10 food dispenses. |
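Criterion 1 above is a three-way AND, which the brain's firmware could gate directly. A minimal sketch of that logic; the tag names and the 4-hour minimum interval are illustrative assumptions, and real values would come from the owner's configuration:

```python
def should_dispense(tag_id, button_pressed, last_fed_s, now_s,
                    registered_tags=("PET_A", "PET_B"),
                    min_interval_s=4 * 3600):
    """Gate logic for the feeder: dispense only when a registered RFID
    tag is present, the pet-intended button is pressed, AND enough time
    has passed since that pet last ate.

    registered_tags and min_interval_s are placeholder assumptions;
    last_fed_s would be tracked per tag in the real firmware.
    """
    return (tag_id in registered_tags
            and button_pressed
            and (now_s - last_fed_s) >= min_interval_s)

# Registered pet, button pressed, 5 hours since last feeding -> dispense.
ok = should_dispense("PET_A", True, last_fed_s=0, now_s=5 * 3600)
```

Keeping the decision in one pure function also makes the timing cutoffs easy to unit-test before they ever drive the motor.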
||||||
31 | Exercise Repetition Counter Using Discrete Clip On Device |
Arhan Goyal Prithvi Patel Vikrant Banerjee |
Sanjana Pingali | Yang Zhao | proposal1.pdf |
|
# Title Exercise Repetition Counter Using Discrete Clip On Device Team Members: - Prithvi Patel (prithvi7) - Arhan Goyal (arhang2) - Vikrant Banerjee (vikrant3) # Problem Maintaining proper workout form and accurately tracking repetitions during exercises can be challenging, especially for individuals working out at home or without a trainer. Existing solutions either rely on expensive gym equipment or smartphone applications, which lack precision and real-time feedback. There is a need for a cost-effective, standalone device that can accurately count repetitions and display them in real time without setting up a camera in the middle of a gym (as all existing solutions to this problem require). # Solution We propose a wearable, discrete clip-on device with a custom PCB that uses an MPU6050 accelerometer and gyroscope to detect arm motion during exercises. The system will process motion data to identify and count repetitions, displaying the count on a 7-segment display in real time. Additionally, the device will include a timer to measure the duration of each repetition and provide feedback through a vibration motor when the user completes a repetition. The time per repetition and set completion criteria can be adjusted using a simple dial or potentiometer. # Solution Components ## Subsystem 1: Motion Detection and Processing - **Function**: Captures arm motion using a 6-axis motion sensor and processes the data to detect exercise repetitions. - **Components**: - MPU6050 (6-axis accelerometer and gyroscope) - Microcontroller for data processing and communication ## Subsystem 2: Timer and Feedback Mechanism - **Function**: Measures the duration of each repetition and provides feedback through vibration. 
- **Components**: - Timer functionality implemented in software - Vibration motor (e.g., 310-101) for feedback - Dial or potentiometer for adjusting time settings (e.g., 10K potentiometer) ## Subsystem 3: Custom PCB - **Function**: Provides a compact and efficient platform for integrating the motion sensor, microcontroller, power supply, and display connections. - **Components**: - PCB with integrated traces for components (sensors can be directly connected/soldered onto the PCB without additional breadboards or jumper cables) - Voltage regulator (e.g., LM7805) for stable power supply - Power source (rechargeable battery) ## Subsystem 4: Display and Feedback - **Function**: Displays the real-time repetition count to the user. - **Components**: - 7-segment display - Driver IC (e.g., MAX7219) for efficient control of the display ## Subsystem 5: Power Management - **Function**: Ensures the device operates efficiently and reliably over extended periods. - **Components**: - Battery charging circuit (e.g., TP4056) - On/off switch for user control # Criterion For Success 1. The system accurately detects and counts exercise repetitions with a minimum accuracy of 90%. 2. The 7-segment display updates the repetition count in real-time without noticeable lag. 3. The timer accurately measures and tracks the duration of each repetition and signals set completion through vibration feedback. 4. The time for repetitions and sets can be easily adjusted using the dial or potentiometer. With this design, we aim to provide a practical, affordable, and user-friendly solution for fitness enthusiasts to track their workout reps effectively. |
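The repetition-detection step in Subsystem 1 could be prototyped as a threshold-crossing counter on the acceleration magnitude. The threshold and debounce gap below are placeholder values for illustration, not calibrated numbers from the proposal.

```python
def count_reps(samples, threshold=1.3, min_gap=10):
    """Count repetitions as upward crossings of `threshold` in a list of
    acceleration magnitudes (in g). `min_gap` samples must separate
    successive reps, debouncing jitter around the threshold."""
    reps = 0
    last_rep = -min_gap
    above = False
    for i, a in enumerate(samples):
        if a > threshold and not above and i - last_rep >= min_gap:
            reps += 1
            last_rep = i
            above = True
        elif a <= threshold:
            above = False  # re-arm once the signal falls back below
    return reps
```

On the device this loop would run over a sliding window of MPU6050 readings; here it processes a prerecorded list so the logic can be tested offline.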
||||||
32 | Smart Pulse Oximeter |
Faris Zulhazmi Jason Machaj Sidney Gresham |
Shengyan Liu | Yang Zhao | proposal1.pdf |
|
# Smart Pulse Oximeter Team Members: - Jason Machaj (jmach5) - Faris Zulhazmi (farisaz2) - Sidney Gresham (sidneyg2) # Problem The problem at hand is the inaccuracy of pulse oximeters in individuals with darker skin tones due to the way these devices interpret oxygen saturation levels. Pulse oximeters function by emitting light through the skin and measuring how much is absorbed to determine oxygen levels in the blood. However, higher concentrations of melanin absorb more light, leading to less accurate readings and potential overestimation of oxygen saturation in individuals with darker skin tones. Addressing this problem is essential to improving equitable healthcare outcomes. A more inclusive and reliable pulse oximetry technology is needed—one that accounts for diverse skin tones and ensures accurate readings for all individuals. # Solution This project aims to develop an adaptive pulse oximeter that adjusts the number of wavelengths used based on the user's skin tone. Traditional pulse oximeters often produce inaccurate readings for individuals with darker skin tones due to increased melanin absorption, which interferes with light-based oxygen saturation measurements. Many modern devices attempt to address this by using multiple wavelengths, but this approach increases power consumption. Our solution integrates a camera and computer vision algorithms to determine skin tone and a wavelength-switching mechanism to optimize accuracy while conserving power. The device will also measure heart rate using the same optical components, making it a multifunctional health monitoring tool. All collected data will be displayed digitally for real-time user feedback. 
# Solution Components ## Subsystem 1: Pulse Oximeter Subsystem This subsystem will use infrared and red light to measure blood oxygen levels as well as heart rate. Oxygenated blood absorbs more infrared light and passes more red light; deoxygenated blood does the opposite. Knowing this, we can calculate the blood oxygen level (SpO2) from the ratio of red to infrared light measured by a photodetector, together with a calibration algorithm. In order to properly measure the heart rate, the system will measure the photoplethysmography (PPG) signal. As the heart beats, blood volume increases, causing more light to be absorbed and reducing the intensity recorded by the photodetector. The peaks of this wave-like pattern correspond to heartbeats, and the time difference between successive peaks is used to calculate the heart rate in BPM. We will use the respective emitter LEDs and photodiodes: - Red - Kingbright APT2012SECK - Infrared - Vishay TSAL6100 - Photodetector - Hamamatsu S1223 ## Subsystem 2: Color Recognition via Computer Vision Subsystem This subsystem will utilize the “300K PIXEL USB 2.0 MINI WEBCAM” in conjunction with a flashing light to image the skin tone of the user. Using these images, color recognition will be employed to determine whether multiple wavelengths of light would need to be used to provide higher blood oxygen level measurement accuracy depending on user skin tone. ## Subsystem 3: Digital Display Subsystem To display the contents of our measurements, data will be taken from the microcontroller and will be displayed on an external digital display. This will show the blood oxygen levels and heart rate to the user in real time. ## Subsystem 4: Power Supply Subsystem This system must be able to operate on a rechargeable lithium-ion battery. 
This subsystem will provide appropriate power to each of the other subsystems/components using this battery with DC-DC converters (buck/boost converters). Reasonable operation time must also be available from one charge of the li-ion battery. Power efficiency can be managed via the switching of the oximeter from one to two wavelengths depending on skin tone, leading to longer operation time on one charge and higher efficiency. # Criterion For Success - Read blood oxygen within a 2% range. - Read heart rate within a 2% range. - Camera successfully captures and sends data to the microcontroller. - Ability to change wavelengths depending on skin tone. - Assistance via computer color recognition (to show success, try with and without to see difference in measurement) - Correctly display measured blood oxygen levels and heart rate. |
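The red/infrared ratio method described in Subsystem 1 is commonly implemented as a "ratio of ratios": R = (AC_red/DC_red) / (AC_ir/DC_ir), mapped to SpO2 by a calibration curve. The sketch below uses the textbook linear approximation SpO2 ≈ 110 − 25R as a stand-in; a real device would replace it with per-hardware calibration.

```python
def spo2_from_samples(red, ir):
    """Estimate SpO2 from lists of red and infrared photodetector samples
    using the ratio-of-ratios method. The 110 - 25*R line is a common
    empirical approximation, not a calibrated curve for this hardware."""
    def ac_dc(sig):
        dc = sum(sig) / len(sig)          # mean absorption (DC component)
        ac = (max(sig) - min(sig)) / 2    # pulsatile swing (AC component)
        return ac, dc
    ac_r, dc_r = ac_dc(red)
    ac_i, dc_i = ac_dc(ir)
    r = (ac_r / dc_r) / (ac_i / dc_i)
    return 110 - 25 * r
```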
||||||
33 | Table Cleaning Robot: Autonomous Elevated Surface Cleaner |
Ann Luo Bolin Pan Yening Liu |
Jason Zhang | Yang Zhao | proposal1.pdf |
|
Team Members: - Ann Luo (hluo12) - Bolin Pan (bolinp2) - Yening Liu (yeningl2) # Problem Cleaning tables is a repetitive and time-consuming chore that people usually do by hand. Tables often collect dust, crumbs, and spills, making regular cleaning necessary. While robots for cleaning floors are common, there aren't many options for robots designed to clean tables. The main challenges for such a robot include stopping it from falling off the table, avoiding objects like cups and plates, and making sure the surface is cleaned thoroughly. # Solution The Table Cleaning Robot is a small, self-operating device made to clean flat surfaces like tables. It is designed to handle challenges such as detecting edges to avoid falling, collecting dust and crumbs, and avoiding obstacles, with the advanced functionality of cleaning the space under objects like glasses and plates. Unlike floor-cleaning robots, this robot is built specifically for elevated surfaces, making it safe and effective. It works by using sensors to detect edges and prevent falls, rotating brushes to collect debris, and smart navigation to move around obstacles and clean the table. # Solution Components ## Subsystem 1: Edge Detection and Fall Prevention This part of the robot makes sure it doesn’t fall off the table. It uses sensors, like infrared or ultrasonic sensors, to detect the edge of the table. When the robot approaches the edge, the sensors send a signal to the robot’s microcontroller, which tells the wheels to stop or turn around depending on whether the robot has finished cleaning. This way, the robot stays safe and doesn’t fall. ## Subsystem 2: Debris Collection This part is responsible for cleaning the table by picking up dust, crumbs, and small messes. It may have spinning brushes underneath that sweep the dirt into a small bin, or a small vacuum fan to suck up finer dust. 
The brushes are powered by a small motor, and the bin can be removed to empty out the dirt when it’s full. The robot should clean the table row by row, ensuring coverage of the entire surface of the table. ## Subsystem 3: Obstacle Detection This part helps the robot navigate around obstacles and interact with objects like cups or plates. It uses sensors, like ultrasonic or LiDAR sensors, to detect objects in its way. To clean the surface under an object, we propose two potential methods. The first method is to lift objects up using a robotic arm and clean the area underneath. However, this method may struggle with smooth-surface objects like glass cups, which are hard to grip securely. Also, this method may only work for lightweight objects since we plan to build a portable small-sized robot. Another way is to push the objects aside, clean the exposed area, and then push the objects back to their original positions. However, a concern with this method is that the robot might push things off the table if the objects are at the edge of the table. Therefore, we may consider combining these two methods. ## Subsystem 4: Size and Portability The robot is designed to be compact and portable, making it suitable for a wide range of table sizes and shapes. The robot should weigh no more than 2 kg to ensure easy portability and will be no more than 20 cm x 20 cm x 10 cm (L x W x H). # Criterion For Success 1. Edge Detection and Fall Prevention: The robot must detect and avoid edges with 100% reliability to prevent falls. It should stop or turn around within 2 cm of the edge. 2. Debris Collection: The robot should collect at least 90% of debris in a single cleaning cycle. The collection bin should hold debris from at least three cleaning cycles for a 60 cm x 60 cm table. 3. 
Object Interaction: The robot should avoid obstacles as small as 5 cm in diameter and as large as 20 cm in diameter. It should successfully move objects weighing up to 500 grams without knocking them over. The navigation system should achieve a 95% success rate in avoiding obstacles and completing cleaning tasks. 4. Size and Portability: The robot should operate effectively on tables ranging from 60 cm x 60 cm to 120 cm x 80 cm. It should clean at least 90% of the table surface area, including under objects. |
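The edge-handling rule in Subsystem 1 (stop or turn around at an edge, depending on whether cleaning is finished) reduces to a small decision function. The command names below are illustrative, not firmware identifiers from the proposal.

```python
def drive_command(edge_detected, cleaning_done):
    """Map the edge sensor reading and cleaning state to a wheel command:
    keep driving on open surface, stop at an edge once cleaning is done,
    otherwise turn around to continue the row-by-row pattern."""
    if not edge_detected:
        return "forward"
    return "stop" if cleaning_done else "turn_around"
```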
||||||
34 | Board Buddy |
Alfredo Angel Gabe Valles Louie Conn |
Surya Vasanth | Michael Oelze | proposal1.pdf |
|
# Board Buddy ## Team Members: - Alfredo Angel (alfredo9) - Gabriel Valles (gvall4) - Lewis Conn (lewisc2) # Problem Instructional writing boards, such as chalkboards and whiteboards, are widely used in educational and professional settings, but manually erasing these boards is time-consuming and disrupts workflow. During brainstorming sessions, lectures, or meetings, manually erasing the board can slow down productivity. Additionally, custodians spend hours cleaning boards outside of school hours, making it a labor-intensive task. Current solutions are either completely manual or require expensive, rail-based automated systems that only work on pre-sized boards. There is a need for a cost-effective, portable, and efficient automated eraser that can remotely clean boards of various sizes and shapes with minimal human intervention. # Solution We propose **BoardBuddy**, an autonomous board eraser designed to clean magnetic writing boards efficiently. The device will attach securely using **neodymium magnets** and navigate using **omnidirectional wheels** controlled by an **ESP32 microcontroller**. It will feature **edge detection using microswitch lever arms** and **an accelerometer for stability**. The system will be battery-powered with a **low dropout regulator (LDO)** for stable operation. It will include a **lightweight 3D-printed housing** for structural support and protection. Additionally, a **mobile application** will allow users to remotely activate the device, schedule cleaning sessions, and monitor its status. # Solution Components ## Subsystem 1: Locomotion and Mounting **Function:** Enables the device to move smoothly across the board while maintaining consistent contact for effective erasing. **Components:** - **Omnidirectional Wheels** – Allow unrestricted movement in any direction. - **Neodymium Magnets** – Securely mount the device while allowing mobility. 
- **DC Motors** – High-torque motors for smooth movement (e.g., **Pololu 25D Metal Gearmotor 12V**). - **Motor Drivers** – Dual-channel motor drivers (e.g., **TB6612FNG**). ## Subsystem 2: Erasing Mechanism **Function:** Erases the board as the device moves. **Components:** - **Eraser Pads** – Replaceable pads mounted on the device. - **Spring Mechanism** – Ensures even pressure for effective cleaning. ## Subsystem 3: Navigation and Edge Detection **Function:** Prevents the device from falling off the board by detecting edges and obstacles. **Components:** - **Microswitch Lever Arms** – Detect board edges and trigger direction changes. - **ESP32 Microcontroller** – Processes sensor inputs and controls movement. - **IMU Sensor (Optional, e.g., MPU-6050)** – Provides additional orientation data. ## Subsystem 4: Power Management **Function:** Supplies stable power to all components. **Components:** - **LiPo Battery Pack** – Rechargeable power source. - **Linear Voltage Regulator (e.g., LM7805)** – Steps down battery voltage for the ESP32 and other components. ## Subsystem 5: Enclosure **Function:** Protects components and provides a lightweight, compact structure. **Components:** - **3D-Printed Housing** – Custom enclosure for durability and heat dissipation. ## Subsystem 6: PCB Design **Function:** Integrates motor control, edge detection, and power management into a single, compact PCB. **Components:** - **Custom PCB** with footprints for: - **ESP32** - **Motor Driver ICs (TB6612FNG)** - **Voltage Regulator (LM7805)** - **Edge detection circuitry (microswitch connectors)** - **Standard connectors for battery, motors, and sensors** ## Subsystem 7: Application **Function:** Provides remote control, scheduling, and monitoring features. **Features:** - **Remote Activation** of the eraser. - **Scheduled Cleaning Sessions** (e.g., set to clean at night or after class). - **Manual Control** via app. - **Usage History/Log** for tracking. 
- **Status Monitoring** (Idle, Cleaning, Error). - **Developed using Flutter or React Native** for cross-platform compatibility. # Criterion for Success BoardBuddy will be considered successful if it meets the following criteria: - **Effective Erasing:** Cleans most of the residue in a single pass. - **Secure Mounting:** Neodymium magnets must hold the device firmly on the board without slipping. - **Edge Detection:** The device must detect board edges and avoid falling off. - **Smooth Locomotion:** Omnidirectional wheels must provide **consistent and unrestricted movement**. - **Reliable Power:** The battery must provide **at least 30 minutes of continuous operation**. - **Compact Design:** The device must be lightweight and compact to minimize magnet usage. - **Custom PCB Functionality:** The PCB must integrate motor control, edge detection, and power management without external breadboards or components. - **Application Integration:** The app must allow for **remote control, scheduling, and monitoring**. |
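One plausible shape for the microswitch-driven navigation in Subsystem 3 is a lawnmower sweep: drive horizontally until the front lever arm hits an edge, then drop one lane and reverse direction. This is an assumed control policy sketched for illustration, not the authors' firmware; the state encoding and command strings are invented.

```python
def step(state, front_hit):
    """One control step of a simple lawnmower sweep over the board.
    `state` is (direction, lane); on a front-microswitch hit, reverse
    direction and move down one lane, otherwise keep driving."""
    direction, lane = state
    if front_hit:
        new_dir = "left" if direction == "right" else "right"
        return (new_dir, lane + 1), "shift_down_then_" + new_dir
    return state, "drive_" + direction
```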
||||||
35 | CarGuard: Autonomous Hot Car and CO Poisoning Mitigator |
Cathy Boman Emily Xu Parvati Menon |
Rui Gong | Viktor Gruev | proposal1.pdf |
|
# CarGuard: Autonomous Hot Car and CO Poisoning Mitigator Link to Discussion: https://courses.grainger.illinois.edu/ece445/pace/view-topic.asp?id=77034 Team Members: - Parvati Menon (parvati3) - Cathy Boman (cboman2) - Emily Xu (exu7) # Problem Every year, many children and pets die due to hot cars or carbon monoxide poisoning when they are left in a locked vehicle. Parents often forget or knowingly leave their children and pets in a hot, locked car. Even if the parent leaves for a quick 10-minute errand, there are still concerns about heatstroke, since interior temperatures can rise by as much as 20 degrees in that short duration of time (Seattle Children’s Hospital). In 2024 alone, 39 kids died from heatstroke after being left in a hot car (NSC). Currently on the market, there exist devices to remind users to open the back door or check the backseat (CNN). The volume of these alarms can be reduced, and as a result, parents can forget they are going off. There is no autonomous solution that works to mitigate the situation when the car’s interior temperature is unsafe. There are carbon monoxide detectors on the market today, but these devices simply sound an alarm when a certain threshold is reached. However, if the user is not in the vicinity, they might get notified of the incident too late. Sources: NSC: https://injuryfacts.nsc.org/motor-vehicle/motor-vehicle-safety-issues/hotcars/ Seattle Children’s Hospital: https://www.seattlechildrens.org/health-safety/injury-prevention/dangers-child-alone-car/ CNN: https://www.cnn.com/2023/06/29/business/hot-car-interior-radar/index.html # Solution Our solution is to create an autonomous device that immediately creates ventilation for passengers to prevent deaths in a hot car and alert users of a defective exhaust system. Our device will have a temperature and carbon monoxide sensor. If the vehicle's temperature is too high, the device will open the windows in the car. 
On the driver's side, you attach a device near the buttons (without obstructing the driver) that will push the button to open all windows. If the carbon monoxide levels are too high, the system will alert the user and recommend they get their exhaust checked. The device will also alert the car's owner through an app that the temperature or CO level is too high and the windows have been lowered. Furthermore, the vehicle will have an intermittent buzzer or alarm that sounds until the CO or temperature levels are safe. The alarm will shut off automatically once the levels are safe or once the car is turned on. The device will also have a camera streaming footage to the app. Through the app, the user can monitor the inside of their car and talk to the passenger. # Solution Components The device comprises a carbon monoxide sensor, a temperature sensor, a power control system, a robotic arm mechanism, a communication module, and a microcontroller to monitor surroundings and detect abnormal CO or high-temperature levels. It is also connected to an app that alerts users of possible incidents. We plan to integrate two PCBs: one for the sensors and robotic control modules and the other for the communication and monitoring module. ## Subsystem 1: Sensor Subsystem - Carbon Monoxide Sensor - MQ-7 (20-2000ppm) - Detects the CO concentration in the air and sends the information to the microcontroller. - Temperature sensor - We will make this sensor using resistors, transistors, and a thermistor. - This sensor will measure the car's temperature and send the information to the microcontroller. - Buzzer - CYT1008 - A buzzer that will sound when the temperature exceeds the chosen threshold. The buzzer will intermittently buzz until the temperature in the car decreases to a safe level. ## Subsystem 2: Robotic Arm - Proximity Sensor - SEN-24049 - This sensor will confirm that the window was rolled down completely. 
This sensor uses ultrasonic waves to detect if an object is in front of it. Once this sensor detects that nothing is in front of it (i.e., the window is completely rolled down), it will signal to the microcontroller that the arm mechanism rolled the windows down. - Linear Actuator: - A linear actuator will be used to control the robotic arms to lower the car windows when the temperature inside the car is too high. It can retract, so it doesn’t obstruct the door handle. - 3D printed prongs - Four prongs will push the buttons on the driver’s door to open the windows. ## Subsystem 3: App - We will use Flutter to implement a custom application to communicate with our microcontroller and sensors. This application will contact the user about high temperature or high CO levels. ## Subsystem 4: Power and Voltage Control Subsystem - We will use a 12-volt power battery to power our first PCB with the sensors and robotic arm. However, in real life, the product will be powered by a car battery as the car’s “resting voltage” measures about 12 volts. Source: https://www.jiffylube.com/resource-center/car-battery-voltage. - Our system will only use the 12 volts when the robotic arm is activated. For most of the time when the system is simply checking the temperature and CO levels of the car, roughly 3.3 volts will be used. - Since 3.3 V is used the majority of the time, this system will not drain the car’s battery excessively. - We will create a voltage regulator using a zener diode, capacitor, and resistor (a buck converter could be substituted for higher efficiency). - This subsystem is meant to provide the correct voltage to systems that use a voltage lower than the battery voltage. For example, if we use a 12-volt battery and the microcontroller needs 3.3 volts, this subsystem will convert the 12 volts to 3.3 volts. - For the second PCB, a 5-volt battery will power the camera and speaker subsystem on the second PCB. 
## Subsystem 5: Microcontroller - ESP32 - The microcontroller will handle the processing of signals from the sensors, sending data to the app, and sending signals to the arm mechanism. ## Subsystem 6: Monitoring and Communication (this subsystem will be on a separate PCB) - ESP32-CAM - The camera allows users to monitor the inside of their car in real-time. We will only be using the ESP32-S chip and the OV2640 lens component that will be attached to the PCB via a FPC 24 connector. - Mouser COM-11089 - A speaker system will allow parents or users to talk through the app to comfort kids in the car. The speaker can also deter potential thieves who will use this window of opportunity to steal things from the vehicle. - We plan to use a separate PCB for this subsystem to allow users to place the camera where they see fit. # Criterion For Success ## Testing Procedure We will be in contact with the machine shop to have a demo driver’s door. We will use this door to test our system. We plan to test the temperature sensor by placing a heat source on the sensor. To test the CO sensor, we will put the system by the car’s exhaust and run the car. ## On the Software Side: - The app will also stream camera footage of the interior of the car and allow the user to talk to the passenger in the car. This would allow the user to monitor their car for theft and check on any remaining passengers within the vehicle. - The app successfully sends a test notification to the user ## On the Hardware Side: - When the system detects a temperature above some threshold temperature, it will activate the system. - A sound will play when the temperature threshold is reached - The linear actuator will extend, and the prongs will press the buttons - The window should be rolled down past the proximity sensor - When the temperature returns below the threshold, the alarm should stop ringing, and the linear actuator should retract to its initial position. 
- When the temperature sensor detects a temperature above the threshold, the buzzer sounds an alarm. ## Integration: - When the temperature sensor detects the temperature to be above the threshold, the microcontroller sends an alert to the user via the app. - The user can speak through the app and this is audible through the speaker. - If the CO concentration is too high, the sensor will send a signal to notify the user via the app and recommend fixing the exhaust pipe. |
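The monitoring logic that ties the sensors, buzzer, window actuator, and app alerts together can be sketched as one control step. The temperature and CO thresholds below are placeholders, since the proposal does not fix specific values.

```python
TEMP_LIMIT_F = 90.0   # assumed interior-temperature threshold
CO_LIMIT_PPM = 70.0   # assumed CO threshold (MQ-7 range is 20-2000 ppm)

def control_step(temp_f, co_ppm, window_open):
    """One monitoring pass: returns (open_windows, buzzer_on, alerts).
    Windows are opened only once; the buzzer sounds while either level
    is unsafe; alerts are what the app would push to the owner."""
    alerts = []
    too_hot = temp_f > TEMP_LIMIT_F
    co_high = co_ppm > CO_LIMIT_PPM
    if too_hot and not window_open:
        alerts.append("temperature high: windows lowered")
    if co_high:
        alerts.append("CO high: check exhaust")
    return (too_hot and not window_open, too_hot or co_high, alerts)
```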
||||||
36 | Bike Alert: Bike Lock with Real-Time Security Monitoring |
David Youmaran Diego Herrera Kenny Kim |
Aishee Mondal | Michael Oelze | proposal1.pdf |
|
# Bike Alert: Bike Lock with Real-Time Security Monitoring ## Team Members - Diego Herrera (dherr4) - Kenneth Kim (kk67) - David Youmaran (dcy2) # Problem Bicycle theft remains a major issue, especially on campus. While traditional locks provide physical security, they fail to notify owners when tampering occurs, leaving bikes vulnerable. A security solution is needed—one that not only prevents unauthorized access but also alerts the owner in real time when theft attempts occur. # Solution The Bike Alert system is an advanced security attachment for standard bike locks, integrating multiple tamper-detection mechanisms with real-time notifications. The device will: - Detect lock disengagement and unauthorized tampering using various sensors. - Utilize an ESP32 microcontroller to process sensor data. - Communicate alerts via Wi-Fi to a mobile app, notifying the user in real time. - Feature a secondary locking mechanism (deadbolt) controlled by RFID for enhanced security. - Be battery-powered and rechargeable to ensure long-lasting operation. We acknowledge that previous attempts have been made to develop bike locking systems. However, most existing designs focus primarily on physical security without incorporating real-time alerts or secondary security measures. To our knowledge, no prior project has successfully implemented both mobile app notifications and an RFID-controlled deadbolt lock. Our design aims to bridge this gap by providing a comprehensive security solution that enhances both theft prevention and user awareness. # Solution Components ## Data Collection Subsystem (Tampering & Lock Disengagement Detection) This subsystem monitors the lock and detects unauthorized access. It consists of: - Hall-Effect Sensors for Lock and Case Monitoring - Lock Disengagement Detection: A Hall-effect sensor and magnet will detect when the lock is disengaged. If the magnet moves past a predefined threshold, an alert is triggered. 
- Case Tamper Detection: Inspired by [TI's application](https://www.ti.com/lit/ab/sboa514a/sboa514a.pdf), we will use a Hall-effect sensor positioned inside a 3D-printed enclosure to detect when the outer case is tampered with. A magnet embedded in the case ensures that when closed, the sensor detects a high flux density. If the case is opened/moved far enough, the decreasing flux density will trigger an alert. - Spring-Based Adjustable Vibration Sensor - Detects physical tampering such as cutting or shaking the lock. - The adjustability allows fine-tuning of sensitivity to differentiate between minor disturbances and actual theft attempts. - ESP32 Microcontroller - Collects data from all sensors and sends it to the Wi-Fi-connected mobile app. ## Communication & Mobile App Subsystem This subsystem enables real-time notifications and user interaction. - ESP32-to-App Communication - The ESP32 will transmit sensor data via Wi-Fi, using the campus network for connectivity. - If an alert is triggered (lock disengagement, tampering detected), the app will receive a real-time notification. - Mobile App Features - Display current lock status. - Send push notifications for tampering or disengagement events. - Event log to track past security incidents. - Allow the user to enable/disable monitoring modes manually (e.g., "In Use" vs. "Not In Use" mode). ## Secondary Security Subsystem (RFID Deadbolt Lock) To add an additional layer of security, the system will include an RFID-controlled deadbolt locking mechanism. - Purpose: Even if the main lock is broken, the deadbolt will prevent full disengagement of the bike lock. - How it Works: - The deadbolt is controlled via RFID authentication for convenient unlocking. - A small, high-torque motor will drive the deadbolt mechanism. - Requires a motor driver circuit and relay to switch power efficiently. 
## Power Supply Subsystem The system must support continuous operation, including sensor monitoring, Wi-Fi communication, and motor operation. - Power Source: Rechargeable Lithium-Ion Battery. - Battery Capacity Considerations: - Must sustain ESP32 operation and Wi-Fi connectivity. - Should provide enough power for motor-driven deadbolt activation. - Efficient power management circuit to maximize battery life. # Criterion For Success - Reliable Detection – Sensors must accurately distinguish between normal activity and actual tampering. - Alerts – Wi-Fi-enabled notifications must reach the user in real time. - Secure Secondary Lock – The RFID-controlled deadbolt should prevent theft even if the primary lock is compromised. - Battery Life – The system must operate for at least 48 hours per charge under normal conditions. # Conclusion The Bike Alert system offers an approach to bicycle security by combining tamper detection, real-time notifications, and an RFID-based secondary lock. This project integrates multiple subsystems into a compact, user-friendly solution that enhances traditional bike locks without compromising convenience or being overly expensive. |
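The detection rules in the data collection subsystem reduce to threshold comparisons: a drop in Hall-effect flux density indicates the lock magnet or case magnet has moved, and the spring sensor indicates vibration. The sketch below uses placeholder thresholds and event names, since the proposal leaves calibration to the adjustable hardware.

```python
def classify(flux_lock, flux_case, vib_level,
             lock_thresh=50, case_thresh=50, vib_thresh=0.8):
    """Map raw sensor readings to alert events the ESP32 would push to the
    app. Flux below a threshold means the magnet has moved away (lock
    disengaged / case opened); thresholds here are illustrative."""
    events = []
    if flux_lock < lock_thresh:
        events.append("lock_disengaged")
    if flux_case < case_thresh:
        events.append("case_tampered")
    if vib_level > vib_thresh:
        events.append("vibration")
    return events
```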
||||||
37 | Automatic Card Deck Sorter |
Alfred Hofmann Kyle Mahler Rocky Daehler |
Sanjana Pingali | Michael Oelze | proposal1.pdf |
|
# Automatic Card Deck Sorter Kyle Mahler (kmahler3) Alfred Hofmann (alfredh2) Rocky Daehler (walterd2) # Problem Many different card games require different setups of the deck before you can start playing. Euchre, for example, requires only cards 9 and above; short-deck poker requires only cards 6 and above. # Solution My teammates and I propose an automatic card deck sorter as a solution to this problem. This device would use a camera, a Raspberry Pi, sorting compartments, and a mechanical arm of sorts in order to automatically sort the deck of cards based on a given input. The device could be used to separate the deck for a specific game, or even to sort the cards in a specific order (AKA, to rig the deck). Possible extensions of this idea include a mobile app for user input, or an automatic shuffling feature for when the whole deck is wanted for gameplay. As for the design of this device, we are imagining three different bins for our sorting system. Imagine three bins from left to right. The middle bin is where the cards would originally go, as well as where the camera and mechanical arm are. The left is where we would put our ‘don’t care’ cards, and on the right, we would put all our ‘do care’ cards. For each card, the camera would scan the corner of the card and identify it using the ML software, then the arm would push it in the appropriate direction. # Subsystem 1 : User Interface Function: Input given by the user to pick what range of card values to accept. Components: Buttons or touchscreen # Subsystem 2 : Camera and Card Recognition Function: Captures images of the corner of the cards and recognizes suit and rank. Components: LEDs inside of the device in order to keep the cards well lit. We have a few options for cameras: Raspberry Pi camera, Arducam Mini, etc. # Subsystem 3 : Sorting Arm/System Function: Sorts the cards into one of two bins to start: cards we care about, and cards we do not care about. 
Components: Rubber tip on metal arm, motor # Subsystem 4 : Control System Function: Manages all logic and communication for the different subsystems to work together. Defines the card sorting process and drives the sorting system to execute it. Error handling will also be a big part of this. Components: ESP32, STM32, Arduino, etc., are all options. |
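Once the camera has recognized a card's rank, the two-bin sorting decision (push right to keep, left to discard) is a single comparison against the user-selected minimum rank. The `RANKS` ordering below is an assumption for illustration.

```python
# Low-to-high rank ordering assumed for a standard 52-card deck.
RANKS = ["2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K", "A"]

def bin_for(card_rank, min_rank):
    """Push a recognized card right (keep) if its rank is at or above the
    user-selected minimum, otherwise left (discard). Euchre would use
    min_rank="9"; short-deck poker would use min_rank="6"."""
    return "right" if RANKS.index(card_rank) >= RANKS.index(min_rank) else "left"
```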
||||||
38 | Athletic Tracking Sensor |
Ethan Pizarro J.D. Armedilla Ryan Horstman |
Jiankun Yang | Yang Zhao | proposal1.pdf |
|
# Athletic Tracking Sensor Team Members: - Ryan Horstman (ryanjh4) - Ethan Pizarro (epizar4) - J.D. Armedilla (johndel2) # Problem Currently, the main metric of progress in weightlifting is varying weight and reps, but there is also value in (and workouts designed around) moving weight either quicker or slower, known as Velocity Based Training. However, this type of training is inaccessible as current sensors are very expensive and infeasible for the everyday weightlifter. Additionally, incorrect form in workouts can lead to gradual and immediate injury to users, especially to those new to working out. Current sensors offer some solutions, but lack some key features. Some assist with form tracking but not velocity. Most current sensors offer "real-time" feedback that consists of the lifter doing their exercise and then checking their results on their phones. This results in the user finishing a set, then getting feedback, then going back to another set. For exercises that are not just "move the weight as fast as you can" this is unideal. Additionally, with respect to form, this type of feedback does not inform until bad form is already used and the damage is done. # Solution We propose a compact wearable device that takes and transmits workout data to a phone via Bluetooth. It will utilize a 9-axis sensor (acceleration, gyroscope, and magnetometer). However, in addition to sending data to a phone, it will internally process data taken during the workout and provide immediate feedback to the user through haptic signaling and/or LED feedback. Before starting the workout, the user can indicate on his phone which workout he is doing and any desired constraints. Based on that workout the device will track the user's form and acceleration, alerting him/her if a desired constraint is not being met so that it can be immediately corrected mid-set. It would be small enough that you could strap it to your wrist or neck, around a weight set, or attach it to a desired object. 
If time allows, we could add a plug-in module that would connect a force sensor (likely piezoelectric) for quantification of exercises that are force-based (another feature not currently available with other acceleration sensors). # Solution Components ## Microcontroller Our microcontroller would be an ESP32, which would take data from the sensor and process it based on constraints transmitted to it from the app; for example, it would determine whether velocity exceeds or falls under a certain level, or whether form is incorrect to the point of risk. The ESP32 includes Bluetooth capability that will be used to communicate with the app. ## Sensors Our 9-axis sensor would be an ICM-20948, which includes an accelerometer, magnetometer, and gyroscope. This would be utilized to collect acceleration data, as well as motion-tracking data for form analysis. The data would be sent to our microcontroller. Additionally, our add-on force sensor would be one such as a 7BB-20-6 piezo disc. ## Feedback The immediate feedback to the user would be through vibration with a FIT0774 vibration motor. It would be actuated by the microcontroller. Additionally, we could integrate LED feedback via single-color LEDs. ## App The app would communicate with the device via Bluetooth and send constraints to the microcontroller based on what workout is being done (for example, maximum acceleration in a given direction or a gyroscope orientation that indicates correct form). There would be a library of workouts, or the user could implement their own workout. Throughout the workout, the microcontroller will send data to the app. Once the workout is finished, the app will display the data that has been collected as well as key statistics, such as the maximum and minimum acceleration/force. # Criterion For Success For our device to be effective, we will have to be able to enter constraints into the app, do a workout, and be alerted whenever in that workout we are not meeting our goals, or if our form is posing risk. 
We will first aim to test it with squats (which necessitate good straight-back form) and the bench press. Our app will also have to accurately display workout data. |
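The velocity-constraint check the proposal describes can be sketched in Python (the on-device version would be equivalent C/C++ firmware on the ESP32). The function name, the simple Euler integration, and the threshold scheme are illustrative assumptions, not part of the proposal:

```python
def classify_rep(accel_samples, dt, v_min, v_max):
    """Integrate acceleration samples (m/s^2) into bar velocity and
    classify the rep by its peak velocity against user constraints."""
    v = 0.0
    peak = 0.0
    for a in accel_samples:
        v += a * dt              # simple Euler integration of acceleration
        peak = max(peak, v)      # track peak concentric velocity
    if peak > v_max:
        return "too fast"
    if peak < v_min:
        return "too slow"
    return "ok"
```

A rep classified as anything other than "ok" would trigger the haptic/LED alert mid-set.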
||||||
39 | The Illini Wagon |
Ian Watson Neha Joseph Ramya Reddy |
John Li | Arne Fliflet | proposal1.pdf |
|
Self Driving Wagon Team Members: - Neha Joseph (nehaej2) - Ian Watson (ianjw2) - Ramya Reddy (ramyar3) # Problem College students and urban dwellers often face the challenge of carrying heavy loads while walking across campuses or within walkable cities. Whether heading to a tailgate, a picnic, grocery shopping, or hosting an outdoor event, transporting multiple items can be inconvenient and physically demanding. While existing solutions like rolling carts and backpacks provide some relief, they still require manual effort and become impractical over long distances. With the rise of walkable cities and car-free urban spaces, there is a growing need for a hands-free, autonomous way to carry personal belongings over short distances without relying on traditional vehicles. # Solution We propose a self-driving smart wagon that autonomously follows the user using GPS tracking while carrying their items. # Solution Components ## Subsystem 1 – Robot Controls System The Robot Controls System utilizes an ESP32 microcontroller to receive Bluetooth data, enabling seamless communication with the user. It integrates the Adafruit Ultimate GPS Breakout Board to provide precise GPS coordinates for navigation. Additionally, the MCU interfaces with the motor system to control the vehicle’s motion, ensuring smooth and responsive movement. Components: 1 x ESP32 Microcontroller 2 x Adafruit Ultimate GPS Breakout Board ## Subsystem 2 – Motor Control We will equip the wagon with two 12V DC motors (3420) for propulsion and a servo motor (Tower Pro MG996) for steering, powered by a 12V battery (ML7-12 SLA). The steering system and electronic speed controller (ESC) will be integrated into a custom PCB, with velocity controlled via pulse width modulation (PWM). The wagon's speed, and correspondingly the voltage supplied to the DC motors, will adjust dynamically based on its distance from the user. 
The wagon is designed to handle loads of up to 30 lbs with ease; we may explore smaller, more cost-effective components to enhance efficiency while staying within budget. Components: 2x 3420 DC motors for propulsion 1x Tower Pro MG996 Servo motor for steering 1x ML7-12 SLA Battery ## Subsystem 3 – Human Tracking System This subsystem will include a Bluetooth module and a secondary GPS module. The user will carry this system in their pocket. The GPS module will output coordinate data to the Bluetooth module, which will then transmit this data to the MCU. The MCU will also receive location data from the on-unit GPS module (described in a previous subsystem). These two data streams will enable the MCU to calculate distance and directional information, which will be sent to the motor control subsystem. Components: Bluetooth Module (HC-05/HC-06 or RN-41) – transmits coordinate data to the MCU The Adafruit Ultimate GPS Breakout Board – sends location data to the Bluetooth module # Criterion For Success Robot can follow a human in an open, outdoor space with no obstacles. Robot is able to follow a human around a bend/corner. Robot is able to carry a load between 10-15 lbs. Robot is able to maintain a set distance between itself and the human. Robot can be turned on/off. Robot is able to navigate around a single obstacle placed in its path. A successful project will complete 4 out of 6 of these goals, with the sixth goal being a reach goal. To demonstrate and test the robot, we will run the robot in the main quad with weighted items. |
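The distance and direction calculation the MCU performs from the two GPS streams can be sketched with the standard haversine and initial-bearing formulas. `distance_and_bearing` is a hypothetical helper name; the real version would run on the ESP32:

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (m) and initial bearing (deg, 0 = north)
    from the wagon's GPS fix (point 1) to the user's fix (point 2)."""
    R = 6371000.0                                   # mean Earth radius, m
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    # haversine formula for distance
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))
    # initial bearing toward the user
    y = math.sin(dlmb) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlmb)
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return dist, bearing
```

The distance output would feed the speed/voltage adjustment, and the bearing (relative to the wagon's heading) would feed the steering servo.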
||||||
40 | Smart Medical Pill Dispenser |
Adi Perswal Aryan Gosaliya Aryan Moon |
Jiankun Yang | Yang Zhao | proposal1.pdf |
|
# Smart Medical Pill Dispenser (SMPD) # Team Members: - Aditya Perswal (apersw2) - Aryan Gosaliya (aryanag2) - Aryan Moon (aryanm7) # Problem People often struggle with two major medication challenges. First, they forget to take their medications at the right time or take incorrect amounts. Second, they spend time sorting multiple medications into daily doses, which is both time-consuming and prone to errors. This is especially difficult for elderly people who organize multiple prescriptions each week. # Solution An intelligent device that both sorts and dispenses medications automatically. Instead of manually organizing pills into compartments, users simply load entire medication bottles one at a time into the device. The system then automatically sorts these pills into correct daily doses and dispenses them at scheduled times. Through a website connected to the device's ID, users enter the times, dosages, and days on which medications should be dispensed; the SMPD will buzz at those times and, on clicking dispense, deliver the correct dosages at once. # Solution Components ## Subsystem 1 The RTC module provides precise timekeeping to dispense the medications according to schedule. It maintains accurate time even during power outages through its backup battery system. The module communicates with the microcontroller via the I2C protocol to provide current-time data. ## Subsystem 2 The ESP8266 enables WiFi and remote management of the dispenser. It runs a web server that hosts the UI for medication management and provides real-time status updates. The module allows users to receive notifications. It processes HTTP requests for schedule updates and transmits dispenser status data to the cloud. ## Subsystem 3 The microcontroller drives the stepper motors that rotate the dispensing cylinder. It will also verify that the pills were dispensed at the right time (i.e., that you took your medication). 
The microcontroller activates buzzers or speakers for audible notifications and LEDs for visual alerts when it is time to take medication. Using the ESP8266, the microcontroller connects to a web app to send reminders, allow remote monitoring, and enable users to adjust schedules. ## Subsystem 4 The weight sensor system uses load cells and an HX711 amplifier to measure medication quantities. It monitors the weight of each medication compartment to track pill counts and verify successful dispensing. The sensor data is used to detect when medications are running low and to trigger refill alerts. This system also lets us dispense one pill at a time. ## Subsystem 5 The device housing and mechanical components are fabricated using food-safe PLA or PETG filament. The design includes separate sealed compartments for each medication type, a rotating dispensing mechanism, and channels for pill routing. ## Subsystem 6 The top of the device will consist of a funnel-like structure that enables the user to dispense the pill bottles one at a time. The funnel will drop the pills into a jar to sort them into different placeholders. Once one kind of pill has been emptied, a disk will rotate onto the next jar for the next pill to be dispensed into. To dispense the pills properly, the top of the funnel will open and close when it is ready to take in a new pill; additionally, the mechanism will rotate the pillars so that the active jar is directly below the funnel, so that no pills are ever lost. 
# Criterion For Success The following are all True/False evaluations: - The smart medical pill dispenser correctly buzzes at the right day and time - The smart medical pill dispenser correctly dispenses the right medication with the correct dosage - The smart medical pill dispenser automatically sorts the inputted bottles into separate compartments - The smart medical pill dispenser appropriately alerts users when they need a refill by determining when certain medications are about to run out - The cost of the project is at most $250 for the end product plus $5 in monthly server expenses |
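One way the HX711 weight readings could be turned into pill counts and dispense verification, sketched in Python (the helper name, tolerance, and per-pill mass are illustrative assumptions; the real logic would be MCU firmware):

```python
def count_pills(weight_mg, pill_mass_mg, tolerance=0.25):
    """Estimate pill count in a compartment from its measured weight.
    Returns None when the reading is not close to a whole pill count,
    which would flag a jammed or partial dispense."""
    if pill_mass_mg <= 0:
        raise ValueError("pill mass must be positive")
    n = round(weight_mg / pill_mass_mg)
    if abs(weight_mg - n * pill_mass_mg) > tolerance * pill_mass_mg:
        return None
    return n
```

Comparing the count before and after a dispense cycle verifies that exactly one pill left the compartment, and a count below a threshold triggers the refill alert.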
||||||
41 | Antweight Battle Bot Project Proposal |
Anthony Shen Batu Yesilyurt Praman Rai |
Sanjana Pingali | Viktor Gruev | proposal1.pdf |
|
# Antweight Battlebot Batu Yesilyurt (batuy2) Praman (pramanr2) Anthony (arshen2) # Problem Eight teams will compete with their own battlebots in a tournament. The antweight battlebots have the following constraints: less than 2 lbs, 3D-printed plastics, a custom PCB that connects via Bluetooth to the microcontroller, a motor- or pneumatic-powered fighting tool, and easy manual/automatic shutdown. # Solution Our plan is to control and prevent the opposing robot from moving in order to win by decision. Controlling the opposing robot is an effective yet simple way to earn points. We plan on having arms that extend out and grab the opposing battlebot, preventing it from moving. The biggest challenge we predict we will face is the 2 lb weight constraint. This might prevent the use of additional features such as weapons to damage the opposing battlebot while we have it under control. # Solution Components ## Materials The primary purpose of our robot will be to control the enemy; this means that our robot needs to be resistant to their attacks. Most battlebots will use kinetic weapons, so we plan on using PETG because of its impact resistance. ## Control System The controls will be managed and powered by an STM32 microcontroller, which will direct the 3 DC motors (2 drivetrain and 1 weapon) while also utilizing its embedded wireless communication. The Bluetooth module will interface with an external controller (likely a PC) and will enable low-latency wireless control. The microcontroller will also leverage GPIO and PWM to enable precise speed and directional control for the motors. Furthermore, we will implement an H-bridge for additional control and stabilization. ## Power System We plan on using a 12V LiPo battery because it would provide plenty of power for our weapon system while also being light. ## Movement System We plan on using brushless motors to operate 2 wheels on either side of the battlebot. 
Our winning condition involves pushing and controlling the other team's robot, so high torque is preferred over high speed in order to move the other team's battlebot around. To save weight we will use a high-torque motor with a fixed gear ratio, sacrificing speed for torque. We will also try to distribute the weight of the robot components over the wheels to maximize downforce for grip. ## Weapon System For our weapon, we plan to utilize 2 arms that would wrap around the other robot to control and prevent it from moving. These arms will utilize a big portion of the weight budget in order to make sure they are strong enough to restrain the other robot and also take hits when not deployed. # Criterion For Success For a successful project, the robot should complete 3 goals. First is remote control of the robot through Bluetooth or WiFi from the PC. Second, the robot should automatically disable in the event the remote connection is lost. Third, the robot should drive and operate the weapon to a functional degree. |
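The two-wheel drive described above implies standard differential-drive mixing from the operator's throttle/steering inputs to per-side motor commands. This Python sketch is illustrative (names and scaling scheme are assumptions; the real version would be STM32 firmware driving the H-bridge PWM):

```python
def mix(throttle, steering, max_duty=1.0):
    """Map joystick throttle/steering in [-1, 1] to (left, right) motor
    duty cycles, clamping while preserving the left/right ratio."""
    left = throttle + steering
    right = throttle - steering
    scale = max(1.0, abs(left), abs(right))   # avoid exceeding max duty
    return (max_duty * left / scale, max_duty * right / scale)
```

Full throttle with no steering drives both wheels equally; full steering with no throttle spins the robot in place.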
||||||
42 | FPV Drone Custom Flight Controller |
Hulya Goodwin Jaelynn Abdullah Muhammad Rabbani |
Jason Jung | Viktor Gruev | proposal1.pdf |
|
# Team Members: Muhammad Rabbani (rabbani3) Hulya Goodwin (hyg2) Jaelynn Abdullah (jja8) # Problem: Building a custom drone from scratch requires both hardware and software development, particularly in designing an efficient and reliable flight controller. Most off-the-shelf flight controllers come with proprietary firmware, which limits customizability. For advanced applications such as autonomous navigation, swarm coordination, or precision control, users require deeper access to the flight algorithms and hardware integration. Our goal is to develop a fully functional flight controller to run an FPV drone system. The system is broken down below. # Solution We plan to design a custom flight controller that interfaces with drone hardware to provide real-time flight stability and navigation control. The system will consist of a microcontroller-based flight control unit, sensor fusion for IMU data processing, motor control algorithms, and wireless communication for user input. Our custom firmware will handle: Sensor data processing (gyroscope, accelerometer, magnetometer) PID-based flight stabilization Motor speed control via pulse-width modulation/ESC Wireless communication for remote control Constant streaming of the drone’s live camera feed Manual and autonomous flight modes Additionally, we will construct a drone frame through 3D printing or PCB design and integrate all components, ensuring a robust and modular design for future improvements. # Solution Components ## Subsystem 1 – CPU STM Microcontroller This subsystem processes sensor data, computes control outputs, and interfaces with the drone’s actuators. We will use Betaflight to handle: PID (Proportional-Integral-Derivative) control loops for pitch, roll, and yaw stabilization. Sensor fusion algorithms to accurately estimate the drone’s orientation. Communication protocols (I2C, SPI, UART) for sensor integration. 
The STM32F405 microcontroller is a good candidate due to its real-time processing capabilities. ## Subsystem 2 - Sensors The flight controller must read and process sensor data in real time to maintain stability and control. This subsystem will include: IMU (Inertial Measurement Unit): Includes an accelerometer and gyroscope to determine the drone’s orientation. We will likely use the MPU6050 or ICM-20948 for IMU data. ## Subsystem 3 - Power Within our system, the STM32 requires 3.3V and some components require up to 12V. With this, a 4S battery rated for 14.8V that can provide up to 1400mAh will be used to power the flight controller, motors, and peripherals. Using a voltage regulator will ensure that the components receive the correct voltage, and a simple voltage divider will be added to ensure we can send 3.3V to the STM. Since we are using brushless motors, there will be no need for an H-bridge. The battery specifically would be a BetaFPV 4S 450mAh 75C. ## Subsystem 4 - Telemetry System We will purchase a radio transmitter (LiteRadio 3 SE Radio Transmitter from BETAFPV) in the shape of a game controller to allow us to control the movement of the drone. It uses the Tx protocol ExpressLRS to transmit the user's input to the radio receiver. The radio receiver will be an ELRS Nano Receiver (from BETAFPV) with the receiver protocol CRSF to communicate between the receiver and the FC. The FC then communicates with the KISS 24A ESC, which will use the ESC protocol DShot to control the speed of the motors. ## Subsystem 5 - Physical Drone The physical drone will be made by us. The frame will either be 3D printed and sanded down for aerodynamics or made of PCB material that is insulated and separated from the flight controller PCB in case of a crash. It will be an X-frame quadcopter with a larger center on which to place the FC. If our frame is not flyable, there are cheap drone frames we can purchase as well. 
For the motors, we will probably stick with the same brand and use the BetaFPV 1404 3800KV. ## Subsystem 6 - Camera + Goggles For analog communication, a Caddx Ant Lite or a Runcam Nano 3 would be useful considering we are using a MAX7456 OSD. To broadcast this feed, we would need a plug-in receiver for the phone; however, latency would be an issue. To stay within our budget, an Eachine ROTG02 could be used, though it has a latency near 100ms. We can utilize apps online (such as GoFPV and FPViewer), but if time allows, we can make our own interface. For the phone, we will create a housing similar to Google's Cardboard. ## Criterion For Success We will demonstrate a working flight controller that has full control over our various subsystems: We receive live data from our sensors. We receive live video from our camera. We have complete control over our power subsystem and various motors to achieve synchronous motion. Our microcontroller runs our custom/modified program to completely analyze our sensor data and control our motors in response to our orientation and inputted controls. |
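The per-axis PID stabilization loops mentioned in Subsystem 1 can be sketched as follows. This is a generic single-axis PID with integral clamping written in Python for clarity, not Betaflight's actual implementation; all names and the clamping scheme are illustrative:

```python
class PID:
    """Single-axis PID controller, one instance each for roll, pitch, yaw."""

    def __init__(self, kp, ki, kd, i_limit=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.i_limit = i_limit       # anti-windup clamp on the integral term
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measured, dt):
        """Return the control output for one loop iteration of length dt."""
        error = setpoint - measured
        self.integral = max(-self.i_limit,
                            min(self.i_limit, self.integral + error * dt))
        deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv
```

The setpoint comes from the pilot's stick input, the measurement from the fused IMU orientation, and the output is mixed into the four motor commands.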
||||||
43 | Autonomous Featherweight (30lb) Battlebot |
Jason Mei Michael Ko Qinghuai Yao |
Michael Gamota | Viktor Gruev | proposal1.pdf |
|
# Autonomous Featherweight (30lb) Battlebot Team Members: - Jason Mei (jasonm5) - Qinghuai Yao (qyao6) - Michael Ko (ykko2) # Problem iRobotics, an RSO on campus, has built multiple battlebots that are entered into competitions across the U.S. One of the robots that has been developed is called "CRACK?", a 30lb hammer-axe battlebot. The robot has already been designed and completed; the project is to upgrade this robot from manual control to autonomous control. # Solution For this project, the plan is to use a camera mounted just outside the polycarbonate walls for a live view of the arena, sending information to a computer. The computer can then use image transforms to get an accurate top-down view of the field, which allows it to calculate the next movements, either directly using a pure pursuit algorithm or potentially a machine learning algorithm. Control is then passed to a microcontroller board mounted within the robot, which sends signals to the motors and drives the robot or fires the hammer. # Solution Components ## Camera Subsystem The main computer takes in the data from a camera (ALPCAM 2MP Varifocus USB Camera) mounted on the outside of the arena. The camera streams frames at a standard size (640x480 or 1280x720) to the computer. For every frame, a Python program (utilizing OpenCV) creates a binary image with perspective transforms, color filters, and other information. It will also scan for AprilTags, which will be mounted on specific sides of the robot, allowing the computer to identify both robots’ full poses (position and orientation) within the arena. ## Autonomous Control Subsystem After obtaining both robots’ poses, the computer will identify the next actions for the robot to perform. Initially, we will use a standard pure pursuit algorithm, where the robot will simply minimize the distance between itself and the opponent without regard for orientation. 
Potentially, we may switch to a reinforcement learning algorithm, utilizing machine learning within a custom OpenAI environment. The computer will then use Bluetooth to connect wirelessly to the robot and send over the instructions. ## On-robot Subsystem The motors on the robot itself are typically controlled by a receiver, which uses PWM signals (a 1.5 ms pulse on a 20 ms period is a “neutral” signal). We will insert a microcontroller (ESP32-S3) board between the receiver and the motor ESCs (electronic speed controllers) to analyze the information from both the receiver and the computer. Additionally, to maximize the information available to the user, we will add both a voltage divider to monitor battery voltage and an accelerometer sensor (MPU6050) to display the robot’s movement. # Criterion For Success We would define a successful project with a specific set of goals: The system must identify the robot and track its location and pose live (using the AprilTag). The system must be able to drive the robot to any specific location directly at close to full speed, similarly to how a human would. The system must be able to shut off safely and immediately if there are ever any safety violations. The robot will compete at Robobrawl X, an event held on campus this year (April 4th and 5th, 2025). |
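The pure pursuit step (minimize distance to the opponent without regard for orientation) reduces to computing a heading error and a distance from the two detected poses. This sketch assumes a top-down arena coordinate frame from the perspective transform; `pursuit_command` is a hypothetical helper name:

```python
import math

def pursuit_command(our_x, our_y, our_heading_deg, target_x, target_y):
    """Return (heading_error_deg, distance) from our robot's pose to the
    opponent's position. heading_error is wrapped to [-180, 180) so the
    robot always takes the shorter turn."""
    dx, dy = target_x - our_x, target_y - our_y
    distance = math.hypot(dx, dy)
    desired = math.degrees(math.atan2(dy, dx))
    err = (desired - our_heading_deg + 180.0) % 360.0 - 180.0
    return err, distance
```

The heading error drives the turn rate and the distance drives the throttle; both are then encoded into the PWM values the ESP32-S3 injects between the receiver and the ESCs.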
||||||
44 | Self Temperature and Taste Regulating Tea Cup |
Anirudh Kumar James Li Lahiru Periyannan |
Rui Gong | Yang Zhao | proposal1.pdf |
|
Team Members: - zyli2 - lahirup2 - kumar67 # Problem Current methods of brewing tea lack ways to handle different tea leaves and maintain temperature. For instance, tea is usually brewed by adding boiling water to a cup of tea leaves. This is effective for leaves like black tea; however, for more delicate teas like green tea, it brings out more bitterness as it burns the green leaves. Adding boiling 100°C water is well over green tea's preferred temperature range of 70-80°C. Temperature is important in brewing tea because different tea leaves require different temperatures to effectively bring out their aromatic compounds. The ability to heat water to the optimal temperature for a given tea leaf and maintain its warmth would provide the best possible tea-drinking experience. # Solution We propose a cup that can heat liquid to the temperature optimal for the chosen type of tea leaf and maintain the liquid at a user-specified temperature. Our system provides precise temperature control to combat inconsistencies in conventional tea brewing methods. Our cup integrates multiple subsystems to ensure optimal flavor extraction, temperature retention, and ease of use: 1) Sensors: detect and monitor tea temperature and tea bitterness 2) Heating and stirring: maintain uniform temperature and tea taste consistency 3) Power: power source to support the other subsystems (heating, sensors, etc.) 4) Control and communication: receives commands from the mobile app, collects data from sensors, and transfers it to the mobile app 5) Mobile app/User interface: displays temperature and tea information 6) Cup: holds liquid for drinking # Solution Components ## Subsystem 1: Sensors Sensors are needed for many functionalities in this project. We of course need to monitor the temperature of the liquid within the cup. Given that our project will be working mostly with water and water-based tea, Campbell Scientific’s specialized water sensors such as the TempVue 50 could be useful here. 
We also need a way to determine how strong a liquid, such as tea, is. This can be done with a Total Dissolved Solids (TDS) sensor, such as Seeed Studio’s Grove sensor. ## Subsystem 2: Heating and Stirring These two features of our project are of course very important for people who drink tea. The tea needs to be maintained at the user's desired temperature, as having to reheat it can affect the flavor. To maintain temperature, a thermoelectric Peltier module (PM) can be used; specifically, a TEC1-07103 or similar, depending on the size of the cup and the desired efficiency. As for stirring, we can use an N20 micro gear motor mounted on the top of the cup, with an appendage to stir the liquid inside. ## Subsystem 3: Power To power the Peltier module and stirrer while maintaining portability, the cup will need to be battery-powered. A lithium-ion battery can be used, although a bench power supply can be used initially and during early stages. In order to dynamically drive the PM, whose level of heating depends on input voltage, a DC-DC buck-boost converter/regulator is needed. One potential unit is the TPS63070/XL63070, which has a maximum output of 9V (the PM has a maximum rated voltage of 8.5V, so this minimizes the potential overvoltage issues of other converter models). A switch is also needed to cut power to the device when necessary; the TPS63070 has built-in functionality to shut off output power while connected to the input. ## Subsystem 4: Control and Communication We can use a microcontroller such as the ESP32-S3-WROOM to collect the data from the sensors and communicate it to the mobile app / user interface (Subsystem 5). It will also be able to receive information (user settings) from the mobile application and then control the other subsystems as needed. It should monitor temperature and TDS regularly, perhaps every thirty seconds or even more frequently. Stirring will also be controlled by this subsystem and done at a regular interval as well. 
## Subsystem 5: Mobile App / User Interface We can either have a mobile app or a web app. Either way, it will display temperature and TDS readings to the user, as well as allow them to control both of those values. It should also alert the user when their tea has reached the desired strength or temperature and allow them to provide settings for stirring. This app will communicate with the microcontroller from Subsystem 4 via Bluetooth / Wi-Fi. ## Subsystem 6: Cup In order to best conduct heat from the Peltier module to the liquid, a layer of metal, such as aluminum, is necessary on the bottom of the cup. Other than that point of contact, the rest of the cup may be made from ceramic, double-walled steel, or another similarly insulating material. # Criterion For Success - There should be accurate temperature control, within a couple of degrees Celsius of the desired temperature. - There should be accurate measurement of the strength of the tea via the TDS sensor. We will need to correlate TDS readings to categorical strength values (e.g. ‘weak’ or ‘strong’), as most people will not read a TDS value and know what it means in terms of strength. - The cup needs to be washable, so all the electronics should be waterproof. - The mobile / web application for the user should be easy to use and clearly communicate all the necessary information. |
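The TDS-to-strength correlation mentioned in the success criteria could start as a simple threshold mapping on the microcontroller. The function name and the threshold values below are placeholders that would need calibration against actually brewed samples:

```python
def strength_label(tds_ppm, weak_max=150, strong_min=300):
    """Map a raw TDS reading (ppm) to a user-facing strength category.
    Thresholds are uncalibrated placeholders, not measured values."""
    if tds_ppm < weak_max:
        return "weak"
    if tds_ppm >= strong_min:
        return "strong"
    return "medium"
```

The app would display this label instead of the raw ppm value, and the alert fires when the label first reaches the user's requested strength.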
||||||
45 | AI-based Meeting Transcription Device |
Chang Liu Gao Gao Ziyang Huang |
Jiankun Yang | Arne Fliflet | proposal1.pdf |
|
## Team Members: - **Ziyang Huang** (ziyangh3) - **Gao Gao** (xgao54) - **Chang Liu** (changl21) ## Problem During the pandemic, we found Zoom’s live transcription very useful, as it helped the audience catch up quickly with the lecturer. In many professional and academic settings, real-time transcription of spoken communication is essential for note-taking. Additionally, individuals with hearing impairments face challenges in following spoken conversations, especially in environments where captions are unavailable. Existing solutions, such as Zoom’s live transcription or mobile speech-to-text apps, require an internet connection and are often tied to specific platforms. To address this, we propose a standalone, portable transcription device that can capture, transcribe, and display spoken text in real time. The device will be helpful since it provides a distraction-free way to record and review conversations without relying on a smartphone or laptop. ## Solution Our **Smart Meeting Transcription Device** will be a portable, battery-powered device that records with a microphone, converts speech into real-time text, and displays it on an LCD screen. The system consists of the following key components: 1. **A microphone module** to capture audio input. 2. **A speech processing unit** (Jetson Nano/Raspberry Pi/Arduino) running the Vosk speech-to-text model to transcribe the captured speech. 3. **An STM32 microcontroller**, which serves as the central controller for managing user interactions, processing text display, and storing transcriptions. 4. **An LCD screen** to display transcriptions in real-time. 5. **External memory** (SD card or NOR flash) for saving transcribed conversations. 6. **A power system** (battery with efficient power management) to enable portability. --- ## Solution Components ### **Subsystem 1: Speech Processing Unit** - **Function:** Captures audio and converts speech into text using an embedded speech-to-text model. 
- **Microphone Module:** Adafruit Electret Microphone Amplifier (MAX9814) - **Processing Board:** Jetson Nano / Raspberry Pi 4B - **Speech Recognition Model:** Vosk Speech-to-Text Model - **Memory Expansion (if required):** SD card (SanDisk Ultra 32GB) ### **Subsystem 2: STM32 Central Controller** - **Function:** Manages the user interface, processes the transcribed text, and sends data to the LCD screen. - **Microcontroller:** STM32F4 Series MCU - **Interface Components:** Buttons for navigation and text saving - **Memory Module:** SPI-based NOR Flash (W25Q128JV) ### **Subsystem 3: Display Module** - **Function:** Displays real-time transcriptions and allows users to scroll through previous text. - **LCD Screen:** 2.8-inch TFT Display (ILI9341) - **Controller Interface:** SPI Communication with STM32 ### **Subsystem 4: Power Management System** - **Function:** Provides reliable and portable power for all components. - **Battery:** 3.7V Li-ion Battery (Adafruit 2500mAh) - **Power Regulation:** TP4056 Li-ion Charger + 5V Boost Converter - **Power Optimization:** Sleep mode for STM32 to enhance battery life --- ## **Criterion for Success** 1. The device must accurately transcribe speech to text with reasonable latency. 2. The LCD screen must display real-time transcriptions clearly. 3. The STM32 must successfully manage system operations and communicate with peripheral components. 4. The system should support local storage for saving transcriptions. 5. The battery life should last at least **2-3 hours** under normal usage conditions. |
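One plausible way to move transcribed text from the speech processing unit to the STM32 over its serial link is simple length-prefixed framing, so the MCU can reassemble display lines reliably. The start byte, payload size, and helper name below are assumptions for illustration, not a protocol specified in the proposal:

```python
def frame_text(text, max_payload=32):
    """Split transcribed text into length-prefixed frames for the serial
    link to the STM32: [0x7E start byte][length][payload bytes]."""
    data = text.encode("utf-8")
    frames = []
    for i in range(0, len(data), max_payload):
        chunk = data[i:i + max_payload]
        frames.append(bytes([0x7E, len(chunk)]) + chunk)
    return frames
```

On the STM32 side, the firmware would scan for the start byte, read the length, and append the payload to the scrollback buffer shown on the ILI9341 display.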
||||||
46 | BioSteady |
Alisha Chakraborty Asmita Pramanik Pranav Nagarajan |
Surya Vasanth | Yang Zhao | proposal1.pdf |
|
**Team Members:** Alisha Chakraborty (alishac4) Asmita Pramanik (asmitap2) Pranav Nagarajan (pranavn6) **PROBLEM** The rigor of student life has contributed not only to our rising stress levels but also to our dependence on stimulants like caffeine to rapidly increase overall productivity. Furthermore, the wide availability of coffee shops in our schools and workplaces makes it easy for us to turn to such stimulants without considering the detrimental combined effects of caffeine and stress on our health. Heightened levels of stress cause various physiological changes such as increased heart rate and skin conductance. Current research suggests that caffeine intake produces similar physiological changes, which introduces the problem of not being able to differentiate between the two. The ability to differentiate between them will allow students to make informed decisions about the frequency of their caffeine consumption, consequently contributing to better physical and mental health. **SOLUTION** Our proposed solution is to integrate data collected from heart rate and galvanic skin conductance sensors to estimate and notify the user whether they are most likely experiencing physiological changes due to stress or caffeine. Doing so will make it easier for them to decide whether it is wise to drink coffee in moments of high stress. When we are affected by stress, the adrenaline released in our body immediately triggers a ‘fight or flight’ response, which causes a spike in heart rate. An additional bodily response is sudden spikes and drops in skin conductance with a general decrease. Under the effects of caffeine, the heart rate also increases, but gradually over a couple of minutes. Its effect on skin conductance is that it remains normal to low for about 200 seconds and then exhibits a steep increase. Using these facts, we will determine whether the user should consume coffee based on their general state of mind. 
**SOLUTION COMPONENTS** **1. Subsystem 1: Biomedical Sensing** This subsystem will collect the user's physiological data like heart rate, oxygen levels, and skin conductivity and will transmit it to the MCU for data processing. Heart Rate and Oximeter Sensor Sensor : MAX30102 Datasheet: https://www.analog.com/media/en/technical-documentation/data-sheets/max30102.pdf Functionality : uses PPG (PhotoPlethysmoGraphy) to measure heart rate and oxygen saturation when processed through an MCU Communication : I2C Power Requirements : I2C pull-ups operate on 3.3 V and core operates on 1.8V Galvanic Skin Response Sensor Sensor: Elecbee GSR Skin sensor module Datasheet : https://www.seeedstudio.com/Grove-GSR-sensor-p-1614.html?gad_source=1&gbraid=0AAAAACiAB45royCnyQi5xNgTS40BTYnFL&gclid=CjwKCAiAneK8BhAVEiwAoy2HYbC6TTsLlyUQMAoK6wCHRL13LKu2egu27oheSHQcOb3TPxl8o-h5IxoC6jQQAvD_BwE Functionality : measures skin conductance to process physiological stress levels, higher voltage output = lower skin resistance which means more sweat Communication : Analog output voltage changes based on the skin’s conductance Power Requirements: 3.3V - 5V **2. 
Subsystem 2: MCU & Power Management** This subsystem will analyze the biometric data from the sensors and manage communication with external interfaces. Microcontroller: STM32L432KC Datasheet: https://www.st.com/resource/en/datasheet/stm32l432kc.pdf Interfaces: I2C for the MAX30102 and an ADC channel for the GSR sensor Power Supply: 1.71 to 3.6 V for I/Os and 1.62 to 3.6 V for ADCs Functionality: This MCU will collect the data from the sensors and use the USB-to-UART bridge to feed the frontend web application Voltage Regulators: LM39401-A: 5 V to 3.3 V regulator for the MCU and sensors AMS1117-1.8: 3.3 V to 1.8 V regulator for the MAX30102 core **CRITERION FOR SUCCESS** The system must reliably collect physiological data using the MAX30102 heart sensor and the GSR sensor, ensuring accurate measurement of heart rate, oxygen saturation, and skin conductance. These values should be processed in real time without errors or delays. The microcontroller (STM32) must integrate the sensor data and differentiate between stress-induced changes (characterized by rapid spikes in heart rate and increased skin conductance) and caffeine-induced changes (characterized by gradual increases in heart rate with stable skin conductance). Data transmission from the microcontroller to the web application should be seamless, without data loss, ensuring real-time visualization of physiological states. The web application must display the processed results in a clear, user-friendly format, allowing users to quickly interpret whether their physiological changes are stress- or caffeine-related. The system must work reliably, from collecting data through the sensors to displaying the results on the web application, ensuring it functions effectively in different scenarios. The project should let its users make informed decisions about their caffeine intake based on clear, actionable feedback provided by the system. 
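As a concrete example of the sensor-side data handling, the MAX30102 delivers each FIFO sample as three bytes per channel (RED, then IR), MSB first, with 18 significant bits. A minimal parser for one sample, assuming that datasheet layout, could look like:

```python
def parse_max30102_fifo(raw):
    """Parse one 6-byte MAX30102 FIFO sample into (red, ir).

    Assumes the datasheet layout: each channel occupies 3 bytes, MSB
    first, with 18 significant bits (the top bits are masked off).
    """
    if len(raw) != 6:
        raise ValueError("expected 6 bytes: 3 for RED, 3 for IR")
    red = ((raw[0] << 16) | (raw[1] << 8) | raw[2]) & 0x3FFFF
    ir = ((raw[3] << 16) | (raw[4] << 8) | raw[5]) & 0x3FFFF
    return red, ir
```

On the STM32 the same masking and byte assembly would happen in the I2C read handler before the samples are passed to the heart-rate algorithm.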
**REFERENCES** Villarejo, María Viqueira, Begoña García Zapirain, and Amaia Méndez Zorrilla. 2012. "A Stress Sensor Based on Galvanic Skin Response (GSR) Controlled by ZigBee." Sensors 12 (5): 6075-6101. https://doi.org/10.3390/s120506075. |
||||||
47 | Pitched Project (Professor Manuel Hernandez): Smart Cognitive-Motor Rehabilitation Mat for Remote Exercise Monitoring |
Adithya Balaji Jashan Virdi Scott Lopez |
Michael Gamota | Michael Oelze | proposal1.pdf |
|
Team Members: - Adithya Balaji (abalaji5) - Scott Lopez (slope22) - Jashan Virdi (jvird2) # Problem Many older adults don’t have access to rehabilitation for Multiple Sclerosis compared to people of younger age groups. During the previous semester, a group created a prototype of a square stepping mat that provides useful feedback to aid a user in rehabilitation; however, this prototype has flaws that need to be addressed: (1) voltage from each square interferes with the others, which reduces the accuracy of step detection, and (2) computers need a USB connection for data transfer, which reduces the portability of the mat. # Solution Our project proposes to enhance the existing rehabilitation mat by focusing on two key areas: Optimizing and increasing step detection accuracy through improved sensor integration and signal processing. Developing a wireless, low-power system for operation, using relevant communication protocols and energy-efficient components. # Solution Components ## Sensing Subsystem Pressure-sensitive sensors (e.g., Velostat-based) for detecting step position and timing. The main work here will be to iterate on and develop signal conditioning circuitry for improved step detection accuracy. Additionally, an area of research will be to explore the usage of materials other than copper strips to prevent voltages from each square interfering with other squares. ## Microcontroller subsystem Microcontroller (ESP32-S2-mini-1) to manage sensor data and process step events. This specific microcontroller is used because it has an integrated WiFi antenna for WiFi communications with mobile devices. The microcontroller enables real-time control of visual and auditory feedback for the user. ## Power Management subsystem The power management subsystem will send and regulate power to the microcontroller and sensing subsystems, and the LEDs on the mat. 
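The step-detection improvement in the Sensing Subsystem ultimately reduces to deciding, per square, when a pressure reading constitutes a step. A minimal threshold-plus-debounce sketch (the threshold and refractory window are assumed values that would be tuned against the real Velostat sensors) might look like:

```python
def detect_steps(samples, threshold=2000, refractory=5):
    """Detect step onsets in a stream of pressure readings for one square.

    samples: ADC readings (higher = more pressure). threshold and the
    refractory window (in samples) are illustrative placeholder values.
    Returns the indices where a new step begins.
    """
    steps = []
    cooldown = 0
    pressed = False
    for i, v in enumerate(samples):
        if cooldown > 0:
            cooldown -= 1
        if v >= threshold and not pressed and cooldown == 0:
            steps.append(i)        # rising edge = step onset
            pressed = True
            cooldown = refractory  # ignore contact bounce for a few samples
        elif v < threshold:
            pressed = False
    return steps
```

The refractory window plays the same role as hardware debouncing: brief voltage glitches from adjacent squares would not register as extra steps.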
## Wireless Communication Subsystem Integration of a Wi-Fi or Bluetooth Low Energy (BLE) module for wireless data transmission, with a low-latency data transfer protocol for real-time communication. The current prototype transfers data locally over the LAN but relies on wired connections, which is why we will introduce BLE to reduce wired connections and improve portability. ## Custom PCB Custom PCB integrating the microcontroller, sensor interfaces, and power management circuits to ensure compact and reliable operation. The main focus here will be to accommodate the wireless module that will be implemented for this project. # Criterion For Success Achieve a step detection accuracy of at least 95% (higher than the previous prototype's aim of 90%), taking into account unexpected variances due to variations in step styles and uneven pressure application on the mat. Implement a wireless communication module with low latency for remote operation, eliminating the need for wired data transfer. Demonstrate successful data processing and feedback delivery from the smart mat during cognitive-motor exercise routines. |
||||||
48 | Pitched Project (Prof Manuel Hernandez) Insole for Gait Monitoring and Furthering Research of Fall Risk in Older Adults |
Jess Sun Lily Hyatt Nasym Kushner |
Kaiwen Cao | Michael Oelze | proposal1.pdf |
|
# Insole for Gait Monitoring and Furthering Research of Fall Risk in Older Adults Team Members: - Jessica Sun (jzsun2) - Nasym Kushner (nasymjk2) - Lily Hyatt (lhhyatt2) # Problem A major cause of injury, especially for the elderly population, is falls. 8 million adults over the age of 65 are injured each year, and an estimated 3 million require emergency care for injuries. In the US alone, on average 32,000 deaths a year are due to falls, and worldwide, falls are the second most common cause of unintentional death. Currently, early smart-home fall detection technology for high-risk adults is lacking and fails to incorporate relevant data from monitoring changes in fall risk and frailty. In response to this gap in the market, Dr. Manuel Hernandez’s lab created a TENG sensor designed for the insole. Our goal is to integrate the sensor into our device to monitor gait for data collection, and to improve and characterize the sensor. The device should be portable, allowing the user to walk as they normally would. It should accurately convert the signals from the sensor into a digital format and transmit them via Bluetooth. The challenges we face moving forward are: measuring and dealing with high voltage (up to 40V) and low current (on the order of microamps); addressing the portability and wearability of the current sensor as well as its implementation into our design; and implementing and testing its self-powering nature. # Solution As gait is one of the most important indicators of health, we also plan to improve development of a pressure-sensing insole. This insole will have a custom triboelectric pressure sensor to analyze the timing of the patient’s steps. An added feature of the triboelectric nanogenerator is its self-powering ability. The main feature we plan on improving is usability. This will be accomplished through Bluetooth integration with an easy-to-use mobile application which will store and display the collected data. 
This will make it easier to monitor patient status and enable further research on the effects of fall risk and fragility through data collection, advancing understanding of behavioral mechanisms related to balance and gait dysfunctions in older adults. The triboelectric sensor we will be working with is described as high voltage, low current. It detects load by passing current when the load changes. We aim to test the current custom triboelectric sensor to benchmark “high”, “medium”, and “low” loads based on factors such as weight, age, and gender, and to set thresholds that turn this into interpretable data for measuring step timing. We also need to create hardware that is comfortably wearable and compatible with the sensor, and synchronize the sensors from the left and right feet. As stated, the most important factor we plan to address is ease of usability. We understand that even though technology can unlock great opportunities for patient care, products that are difficult to use or incompatible diminish these effects. As such, we strive to make our interface as user-friendly and intuitive as possible. Through the creation of a robust app, seamless data collection, and durable hardware, we hope to create a system patients and providers will enjoy using. # Solution Components ## Measurement Subsystem This subsystem measures step timing and load and makes the signal suitable for the microcontroller. Pressure sensing insole (this component will be provided by Dr.
Manuel Hernandez) Resistors (step down the high measurement voltage) Diode (protect the microcontroller against voltage spikes) Capacitor (filter noise) ADS8689 (ADC) ## Data Processing Subsystem This subsystem processes the measurements and exports them via Bluetooth ESP-32 Bluetooth Module ## Power Subsystem This subsystem powers the data processing subsystem 3.3V Battery Power switch LED (indicate On/Off and status) ## Housing Subsystem Holds the power and data processing subsystems Compact 3D-printed case with spots for the switch, LED, and an openable battery compartment Velcro strap (for nearby attachment) ## Shoe Subsystem The sensor will be placed inside the sole of a sandal, located at the heel. Orthopedic, podiatric-friendly sole with a cutout to fit the sensor Thin padding over the sensor for comfort and protection of the sensor while not detracting from the load-sensing capabilities ## Mobile Application Subsystem The app will receive Bluetooth data from the insole and display relevant information. Functions: Receive data from the ESP-32 over Bluetooth Display status of device Visualize and export data # Criterion For Success - Calibration of each sensor. Custom-made sensors will have slight variations, so in order to capture the most standardized data sets between the two sensors worn on both feet, calibrations must be made. - Sensor accuracy. Data collected should have consistent readings under repeated identical loading conditions. This should remain true under high step frequency (up to ~5Hz). - Voltage safety implementation. The voltage input into the microcontroller should always be within the rated voltage (3.3V or 5V depending on the pin). - Ease of Use: The whole system (sole and user interface) should be easy and intuitive to use. The user should not have to worry about the internet settings on their device. The device should be easy to set up/install. 
- Durability: The product should be able to work properly and maintain accurate readings through rigorous usage over many cycles with variable loading weight and frequency. |
||||||
49 | Automated Smoothie Machine |
Anay Koorapaty Avyay Koorapaty Max Gendeh |
Jason Zhang | Arne Fliflet | proposal1.pdf |
|
# Automated Smoothie Machine Team Members: - Anay Koorapaty (anayk3) - Avyay Koorapaty (avyayk2) - Max Gendeh (mgendeh2) # Problem Making smoothies often requires measuring different ingredients with different measuring tools. Liquid ingredients must be measured in mL, while solid ingredients are usually measured in cups or tablespoons. We will automate the measuring and dispensing process, enabling smoothies with different recipes to be made efficiently. Our system will be able to make smoothies by following preset smoothie recipes or recipes the user creates. # Solution Our solution will be a compartment mounted to hang just above the top of a blender. It will consist of a circle of ingredient compartments attached to a funnel. Each compartment will have a structure including two motors, a dispenser, a force sensor, a jug, and some strings. This is the Ingredient Compartments subsystem. Complementing this physical structure, we will have software to control the motors incorporating the force sensor measurements. This is the Recipe Execution subsystem. To be able to create different kinds of smoothies, there will be a UI for inputting recipes and selecting preset recipes. This is the Recipe UI subsystem. These three subsystems will work together to automatically dispense the appropriate amounts of ingredients into the blender for different recipes, increasing efficiency in creating smoothies. # Solution Components ## Ingredient Compartments This subsystem has eight compartments, each one for an ingredient. Five will hold set ingredients, and three will hold ingredients of the user's choice. Each compartment will contain a vertically mounted force sensor with a hook, a jug suspended from that hook by a string that attaches at two points to the jug, a dispenser, and two motors. We will probably use cereal dispensers for solid ingredients and liquid dispensers for ingredients such as water or milk. A motor will turn the dispenser handle, dropping the ingredient into the jug. 
The force sensor will measure the weight of the jug and the ingredients in it. When the weight is the required amount for the recipe, a second motor will pull a string attached to the bottom of the jug, overturning the jug and its contents into the funnel through to the blender. The structure will have pillars down to the table, to support the bottom of the funnel hanging just above the top of the blender. There will be a small air gap between the bottom of the funnel and the top of the blender, to facilitate removing and placing the lid of the blender. The motors we plan to use are Adafruit Accessories DC Gearbox Motor - TT Motor - 200RPM - 3 to 6VDC. ## Recipe Execution The instructions for the selected recipe, consisting of a compartment number and quantity, will be read from memory. For each instruction, the motor corresponding to the correct compartment will empty the ingredients into a jug that is pulling down on a force sensor. Once the weight is the desired amount, another motor overturns the jug to empty the ingredients into the funnel, which feeds into the blender. All ingredient quantities will be standardized to grams for ease of interfacing with the weight sensor. When a user selects ingredient amounts for their own recipe, they will be displayed on the LCD converted to cups or tablespoons, common measurement units for smoothie recipes, while being stored in memory in grams. This will be the software aspect of the project: interfacing with the force sensor to find weight, checking whether the value read from memory matches, and overturning the jug to empty ingredients into the container. ## Recipe UI We plan to have 4 buttons that control the entirety of the recipe UI. These buttons feed directly into an ESP32 microcontroller. For pre-set recipes, the user can press button 3, use buttons 1 and 2 to select a recipe, and press button 3 again to confirm. An LCD will display the recipe name the user is currently looking at. 
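The Recipe Execution flow described above — dispense until the force sensor reads the target weight, then overturn the jug — can be sketched as a small control loop. The function and callback names here are stand-ins for the real ESP32 motor and sensor drivers, not actual APIs:

```python
def dispense(target_grams, read_weight, start_motor, stop_motor, dump_jug):
    """Dispense one ingredient until the jug reaches the target weight.

    read_weight() returns the force-sensor reading in grams; the motor
    callbacks are placeholders for the real ESP32 GPIO/PWM driver calls.
    """
    start_motor()                    # motor 1 turns the dispenser handle
    while read_weight() < target_grams:
        pass                         # keep dispensing until target reached
    stop_motor()
    dump_jug()                       # motor 2 overturns the jug into the funnel

# Simulated run: the "sensor" gains 10 g per reading.
events, weight = [], [0]

def fake_sensor():
    weight[0] += 10
    return weight[0]

dispense(30, fake_sensor,
         lambda: events.append("start"),
         lambda: events.append("stop"),
         lambda: events.append("dump"))
# events is now ["start", "stop", "dump"] and weight[0] == 30
```

On real hardware the busy-wait would be replaced by a timer interrupt that polls the force sensor at a fixed rate.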
To create a custom recipe, press button 4. Adjust ingredient quantities (in grams) using buttons 1 and 2, press button 3 to switch ingredients, and press button 4 again to save and finalize the recipe. Users may press and hold buttons 1 and 2 for faster quantity changes. We are considering using a potentiometer to adjust ingredient quantities but will explore both options to see which we like better. The LCD shows the selected quantity in grams and its equivalent in common units (e.g., tbsp for honey, cups for milk). The button we plan to use is a tactile switch (part number PTS645SL43SMTR92 LFS), and a 16x2 LCD will be best (can’t find part number). # Criterion For Success The machine should be able to: accurately measure ingredient amounts; transfer the correct ingredient amount from the dispenser into the jug, then into the funnel through to the blender; let the user blend the smoothie ingredients; and allow the user to input recipes or select preset recipes. |
||||||
50 | Weather-Resilient Camera System for Autonomous Vehicles |
Adam Shore Deyvik Bhan Jacob Camras |
John Li | Arne Fliflet | proposal1.pdf |
|
# Weather-Resilient Camera System for Autonomous Vehicles # Group members - Adam Shore (ajshore2) - Jacob Camras (camras3) - Deyvik Bhan (deyvikb2) # Problem: Snow and freezing temperatures can severely impair the functionality of car cameras used for object detection, such as those in autonomous vehicles like Teslas. When snow or ice obstructs these cameras, the vehicle's object detection system may fail, leading to potential safety risks. Existing solutions are limited and often fail to address the real-time detection and prevention of this issue. # Solution: Our system keeps car cameras functional in adverse weather by integrating real-time detection and response mechanisms. Temperature and moisture sensors monitor conditions and detect when freezing or obstructions threaten visibility. If snow or ice accumulates, a targeted heating element activates to clear the lens, ensuring uninterrupted object detection. To maintain visibility in the rain, an optical rain detection system identifies raindrops in real-time. A pretrained CNN, deployed with TinyML, processes camera images to detect raindrops. When rain appears, the system applies a hydrophobic nanocoating to repel water and prevent droplets from sticking to the lens. In heavier rain, the heating element warms the lens to evaporate moisture. A microcontroller manages the entire system, processing sensor inputs and triggering the necessary responses. It runs the optimized TinyML-based CNN, which operates efficiently in low-power environments using an estimated 5-7MB of memory. A rechargeable Li-ion battery with voltage regulation ensures stable power distribution. By combining these real-time detection and response mechanisms, our system keeps car cameras clear in all weather conditions, improving the reliability and safety of autonomous vehicle object detection systems. # Solution Components ## OpenCV Module with Camera Subsystem: A low-power camera will be used to capture images for processing. 
This system will utilize OpenCV to manage real-time image processing, allowing detection of raindrops and ice obstructions. We will use a pre-trained CNN model to identify raindrops on the lens of the camera, similar to the following: https://github.com/tobybreckon/raindrop-detection-cnn To summarize, the key components here include a Camera Module (OV7670 or OV5642) for real-time image capture and OpenCV-based image processing for additional detection and filtering. ## Microcontroller Subsystem: The primary control system will be powered by an STM32 microcontroller (STM32F746NGH6 Arm® Cortex®). This control unit will manage input from the sensors and trigger the heating and wiping mechanisms as needed; specifically, it will act as a rain sensor to properly trigger the outputs. We plan to use the DHT22 Temperature & Humidity Sensor. Additionally, AI-based algorithms will run to optimize real-time decision making, using TinyML to deploy the CNN and reduce the RAM needed. The system will process images from the camera and run the TinyML-based CNN model. ## Spray-on Hydrophobic Coating Subsystem: A mechanism will apply the hydrophobic coating when rain is detected. This spray solution is linked here: https://www.amazon.com/Nanoskin-NA-HQD16-Express-Hydrophobic-Polymer/dp/B00DOS0PMS?source=ps-sl-shoppingads-lpcontext&ref_=fplfs&psc=1&smid=ATVPDKIKX0DER&gPromoCode=sns_us_en_5_2023Q4&gQT=1 Before applying the hydrophobic coating, we will also need to apply a spray-on cleaning system. We propose a spray-on product such as this one: Precision Optics Cleaning Solution – 2oz Spray Bottle (PLC2S). Once the product is sprayed on, we will need to wipe it off the camera. 
We propose creating our own wiper using this product: https://www.amazon.com/Micro-Helicopter-Airplane-Remote-Control/dp/B072V529YD A cloth will be attached to this wiper to wipe off the cleaning fluid before the hydrophobic coating is applied. ## Heating Subsystem: For our heating component, we will use a KIWIFOTOS USB Lens Dew Heater. This component is linked here: https://www.amazon.com/Temperature-Condensation-Prevention-Telescopes-80mm-110mm/dp/B08B4TJP6M?source=ps-sl-shoppingads-lpcontext&ref_=fplfs&smid=A2VY9ZK1UXR49Y&gQT=1&th=1 This component will be attached to our PCB so that our microcontroller can trigger it when needed, based on the sensor input. ## Power Subsystem The battery subsystem will supply power to all components of the raindrop detection system, ensuring reliable operation in various environments. We plan to use a rechargeable Li-ion battery pack, likely 7.4V or 12V, depending on the power requirements of our microcontroller and sensors. A voltage regulation circuit will be implemented to step down or stabilize power for different components. A buck converter will provide a steady 3.3V or 5V for the microcontroller and sensors, while higher-power components, such as potential heating elements or additional processing units, will receive direct power as needed. ## PCB: Used to connect the following components/systems: Camera, Microcontroller containing the CNN and sensors, Heating Component, Spray-on Hydrophobic Coating, Power Subsystem # Criterion For Success: We will test our solution by simulating rainy and snowy conditions manually. We will pour water onto the lens and verify the hydrophobic nano-coating and computer vision modules work by seeing whether or not the hydrophobic coating is sprayed when rain is detected by the CV module. We will verify that the temperature and moisture sensors work by putting ice on the camera and manually checking whether the ice has melted off and we can see through the camera again. 
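The microcontroller's sensor-fusion logic described above can be summarized as a small decision function. The temperature and rain-probability thresholds below are illustrative assumptions, not calibrated values:

```python
def decide_actions(temp_c, moisture, rain_prob, rain_threshold=0.5):
    """Map sensor readings and the CNN's rain probability to actuator commands.

    temp_c: temperature in Celsius; moisture: bool from the moisture sensor;
    rain_prob: CNN output in [0, 1]. All thresholds are placeholder values.
    """
    actions = []
    if temp_c <= 0 and moisture:
        actions.append("heater_on")       # melt ice/snow on the lens
    if rain_prob >= rain_threshold:
        actions.append("clean_and_coat")  # spray cleaner, wipe, apply coating
        if rain_prob >= 0.8:
            actions.append("heater_on")   # heavy rain: also evaporate moisture
    return sorted(set(actions))
```

Keeping the mapping in one pure function like this makes it easy to unit-test the control policy separately from the camera and actuator drivers.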
The goals we aim to achieve for our project to be considered successful are as follows: The system accurately detects raindrops and ice obstructions using the TinyML-based computer vision model and sensor inputs. The hydrophobic nano-coating is successfully applied when raindrops are detected by the computer vision model. The heating element activates when the temperature and moisture sensors detect freezing conditions, effectively clearing the obstruction. The microcontroller efficiently processes the computer vision model while simultaneously handling sensor inputs and system activations. The camera remains unobstructed and functional in simulated adverse weather conditions, allowing clear vision for object detection. The PCB integrates all components seamlessly, ensuring stable power distribution and communication between sensors, microcontroller, and external systems. |
||||||
51 | Integrated Robotics Battery/BMS |
Adi Nikumbh Rishav Kumar Ritvik Kumar |
Shengyan Liu | Arne Fliflet | proposal1.pdf |
|
## Robotics Low Voltage BMS/Battery Pack Team Members: rishavk2 ritvik3 nikumbh2 ## Problem One issue in the development of robotics systems at small companies is the battery pack. Manufacturing a battery pack can be dangerous and can require expensive development of a custom BMS. There are currently few options for fully developed, integrated, lightweight battery packs that also contain a high-quality BMS capable of cutting off the output voltage in the event of issues. We propose a solution that uses a combination of temperature sensors and voltage sensors to develop a BMS that can detect when our battery is in danger of thermal runaway and take action to prevent it. This system will be lightweight and inexpensive, making it suitable for use in a wide range of drones, robots, and other applications. With the rapidly increasing use of drones and autonomous robots in a wide range of applications, from agriculture to logistics, the need for reliable and safe battery systems is more important than ever. Our solution will help to ensure that these systems are safe and reliable, reducing development risk and making robotics safer for everyone. ## Solution Our solution will be a prototype of a battery pack containing a battery management system (BMS). The system will use temperature sensors to monitor the temperature of the battery, and voltage sensors to monitor the voltage of the battery. These sensors will be hosted on a PCB daughterboard that will directly interface with each cell. The daughterboard will be connected to a mainboard that will be responsible for processing the data from the sensors and faulting the BMS if an improper condition is detected. The fault conditions will include overvoltage, undervoltage, overcurrent, and over- or under-temperature. 
If any of these conditions are detected, the BMS will take action to prevent thermal runaway, such as shutting down the battery output through a contactor, or initiating cooling of the pack through fans. We plan to create a 50V max, 44.4V nominal, 12s1p system that can be used in a wide range of applications, from drones to robotics. ## Solution Components ## Battery Pack This subsystem will be a 12s1p lithium-ion battery pack. We will use high-capacity pouch cells with a nominal voltage of 3.7V. We have chosen pouch cells because we can manufacture our pack without needing to spot weld. The cells will be connected in series to create a 44.4V nominal battery pack with a capacity of 13 Ah. The voltage was chosen to match the 52V system the Tesla Optimus robot runs off of. The battery pack will be housed in a lightweight and durable enclosure, with provisions for mounting the BMS and other components. The cells will be bolted together with low-resistance bolts, and the pack will be designed to be easily disassembled for maintenance and repair. The pack will also include provisions for cooling, such as vents or heat sinks, to help prevent thermal runaway. ## Daughterboard This subsystem will be a PCB that will host the temperature and voltage sensors. The daughterboard will be connected to the mainboard via a two-wire isoSPI interface, which will allow for easy communication between the two boards. The daughterboard will be responsible for monitoring the temperature and voltage of each cell in the battery pack, and sending this data to the mainboard for processing. The daughterboard will use Analog Devices LTC chips to monitor the voltage of each cell, and will use thermistors to monitor the temperature of each cell. The daughterboard will also include provisions for connecting to the mainboard, such as headers or connectors. ## Mainboard The mainboard will be a PCB that will host the microcontroller and other components. 
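The fault conditions listed above (overvoltage, undervoltage, overcurrent, over/under-temperature) map naturally onto a per-cycle check in the mainboard firmware. This sketch shows the logic only; the limits are placeholder values that would come from the actual cell datasheet:

```python
# Illustrative per-cell limits for a Li-ion pouch cell (assumed values).
V_MIN, V_MAX = 3.0, 4.2      # volts
T_MIN, T_MAX = 0.0, 60.0     # degrees Celsius

def check_faults(cell_voltages, cell_temps, pack_current_a, i_max=30.0):
    """Return the set of fault conditions present; empty set = healthy pack."""
    faults = set()
    for v in cell_voltages:
        if v > V_MAX:
            faults.add("overvoltage")
        if v < V_MIN:
            faults.add("undervoltage")
    for t in cell_temps:
        if t > T_MAX:
            faults.add("overtemperature")
        if t < T_MIN:
            faults.add("undertemperature")
    if abs(pack_current_a) > i_max:
        faults.add("overcurrent")
    return faults
```

In the real firmware a non-empty fault set would drive the state machine into a fault state that opens the contactor or starts the fans.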
The mainboard will be responsible for processing the data from the daughterboard, and taking action to fault the BMS if an improper condition is detected. The mainboard will use a STM32H7 microcontroller to process the data from the daughterboard, and will use relays or MOSFETs to control the battery output. The mainboard will also include provisions for connecting to the daughterboard, such as headers or connectors. ## Software The software for the BMS will be developed using the STM32 HAL library, and will be responsible for processing the data from the daughterboard and taking action to fault the BMS if an improper condition is detected. The software will use a state machine to monitor the temperature and voltage of each cell, and will take action to prevent thermal runaway if any of the fault conditions are detected. The software will also include provisions for logging data, such as temperature and voltage readings, to help with debugging and troubleshooting. It will communicate with a ground station via a serial interface, such as UART or CAN, to provide real-time data and status updates. ## Criterion For Success In order to successfully complete this project, we will need to meet the following criteria: - The BMS must be able to monitor the temperature and voltage of each cell in the battery pack, and take action if any of the fault conditions are detected. This action could include cooling the battery pack, shutting down the battery output, or other actions as necessary. - The BMS must be able to communicate with a ground station via a serial interface, such as UART or CAN, to provide real-time data and status updates. - The BMS must be lightweight and inexpensive, making it suitable for use in a wide range of applications. We also have a number of extensions that we would like to pursue if we have time. 
Our project will still be deemed successful without them, but these would allow us to showcase additional technical complexity: - Integration of a shunt resistor or Hall-effect sensor to measure current and pack power onboard - Development of a passive or active cell-balancing algorithm - Development of a laptop-hosted GUI to view the live status of the cells inside the pack - Wireless transmission of the pack data for viewing - (very stretch) Integration of an onboard DC-DC converter to output variable voltage and power |
||||||
52 | Heated Bridge System + Seeking one partner |
Adriel Taparra James Raue Kahmil Hamzat |
Jiankun Yang | Arne Fliflet | proposal1.pdf |
|
# Heated Bridge Safety System **Team Members:** - Kahmil Hamzat (khamza2) - Adriel Taparra (taparra2) - James Raue (jdraue2) ## Problem During winter, bridges freeze faster than regular roads due to their exposure to cold air from all sides, making them hazardous for drivers. Existing solutions rely on passive warnings such as "Bridge Ices Before Road" signs, which do not actively prevent ice formation. Our goal is to create an active heating system that prevents ice and snow buildup on bridges, improving safety and reducing accidents caused by icy road conditions. ## Solution Our project will implement a heated bridge system using an array of nichrome heating wires embedded in a simulated bridge surface. The simulated bridge will be a plywood model, with a metal sheet simulating the road surface and nichrome wires beneath the sheet for heat generation. The system will be controlled by a microcontroller that monitors real-time weather conditions via **temperature, moisture, and precipitation sensors**. If freezing conditions and moisture are detected, the system will activate the heating elements to prevent ice formation. A MOSFET-based power switching circuit will be used to regulate power delivery to the heating wires efficiently. When the microcontroller outputs HIGH, the MOSFET allows current to flow, heating the wire. ## Solution Components ### **Heating Subsystem** - Nichrome wire heating elements embedded in a plywood bridge surface to simulate real-world conditions. - MOSFET switching circuit to control power delivery based on microcontroller input. - 12V/24V DC power source, either from a wall adapter or a rechargeable battery with a DC-DC converter. ### **Sensing and Control Subsystem** - **Temperature sensor** to monitor surface temperatures. - **Moisture sensor** to detect the presence of water on the surface. - **Precipitation sensor** to determine if snow or rain is present. 
- **Microcontroller** to process sensor data and activate the heating system accordingly. ### **Power and PCB Subsystem** - **Custom PCB** designed to integrate the microcontroller, MOSFET power control circuit, and sensor connections. ## Criteria for Success 1. **Accurate sensing** – The system must reliably detect temperature, moisture, and precipitation to determine when heating is necessary. 2. **Effective heating** – The nichrome wire should generate enough heat to prevent ice formation on the bridge surface. 3. **Power efficiency** – The heating system should activate only when necessary to conserve power. 4. **Demonstrable functionality** – The prototype should successfully operate in a simulated environment (e.g., an ice box) and respond appropriately to changing conditions. |
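The sensing-and-control decision described above can be sketched as a simple hysteresis controller. This is a minimal illustration rather than the team's firmware: the 1 °C / 4 °C thresholds and the function names are assumed values chosen for the example.

```python
FREEZE_POINT_C = 1.0    # assumed turn-on threshold, just above freezing
RELEASE_POINT_C = 4.0   # assumed turn-off threshold (hysteresis gap)

def heater_output(temp_c, moisture_detected, precip_detected, heater_on):
    """Return the MOSFET gate state (True = HIGH) for one control cycle.

    The two-threshold hysteresis keeps the heater from chattering when
    the surface temperature hovers near freezing.
    """
    wet = moisture_detected or precip_detected
    if not wet:
        return False                      # dry surface: nothing to freeze
    if heater_on:
        return temp_c < RELEASE_POINT_C   # stay on until the surface warms up
    return temp_c < FREEZE_POINT_C        # turn on only near freezing
```

Calling this once per sensor-polling cycle and latching the result also serves the power-efficiency criterion: the heater is energized only while freezing conditions and moisture coincide.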
||||||
53 | Ultrasound Remote Operated Vehicle |
Gabriel Inojosa Jamil Yeung Ted Josephson |
Kaiwen Cao | Michael Oelze | proposal1.pdf |
|
# Ultrasound Remote Operated Vehicle Team Members: - Gabriel Inojosa (gvi2) - Ted Josephson (tdj4) - Jamil Yeung (jamilyy2) # Problem Submersible remote operated vehicles are often used for the inspection of underwater structures. Wireless communication predominantly relies on electromagnetics. However, electromagnetic waves of the frequencies typically used for communication in air and free space do not propagate well in water. As a result, submersible ROVs have been developed which communicate with the operator acoustically. However, these are very expensive. # Solution We intend to develop a proof of concept for a lower-cost acoustically controlled ROV which operates in air, using cheap ultrasonic transducers designed for range finding. We would like to develop a low-cost method of wireless communication using acoustics for remote control that will fit within the budget of ECE 445. For simplicity of the project, we will use the ECE 110 car as the mechanical basis of our design. # Solution Components ## Subsystem 1: Transmitter Subsystem An STM32H7B3RIT6 microcontroller with an MA40S4S piezoelectric transducer will transmit a modulated control signal using frequency shift keying (FSK) at a carrier frequency of 40 kHz. The packets will be sent as a sequence of data bits that will vary to provide various instructions to the vehicle. The system will be connected to a laptop drawing 5V power over USB-C and will communicate over UART using an FTDI FT231XS-R UART-to-USB converter for debugging purposes. ## Subsystem 2: Receiver Subsystem The receive ADC on the STM32H7B3RIT6 microcontroller demodulates the output of the RX piezo from the carrier frequency. For debugging purposes, it will also use the FT231XS-R UART-to-USB converter so its output can be read. ## Subsystem 3: Actuator system A finite state machine will be used to set the vehicle to move forwards, backwards, and allow it to turn. 
This system will drive an H-bridge to control two DC drive motors, taking a 9V input from the battery. The STM32 will synchronously drive a MOSFET H-bridge using a gate driver and a dead-time circuit. ## Subsystem 4: Sensor system Using Serial Peripheral Interface (SPI), the STM32 will communicate with voltage sensors to provide current and voltage readings from the 9V battery discharging on the moving car. Additional sensors (temperature, etc.) may also be included on the SPI bus if time permits. ## Subsystem 5: Power subsystem For the moving vehicle, a 9-volt battery will provide power. This will be stepped down to 3.3 volts using an R-78E3.3-0.5 non-isolated buck DC-DC converter in order to power the STM32 microcontroller. 5V power from the USB port will be used for the transmitter board, with reverse-current protection. A low-dropout linear voltage regulator will be used in both systems to keep the voltage at around 3.3V. # Criterion For Success - The power system will draw from a 9V battery to power the microcontroller with 3.3V ±0.1%. - With a microphone and a spectrum analyzer, we will verify that the transducers send a modulated FSK signal at a carrier frequency close to 40kHz. Proof of FSK modulation will be provided using oscilloscope screenshots. - Upon receiving the modulated signal, a print statement over UART will provide the demodulated instruction packet. Proof of FSK demodulation will be provided using oscilloscope screenshots. - The actuator system will successfully drive the car motors forward and in reverse, and turn the car. - The sensor system will properly communicate with the microcontroller over SPI. A varying voltage source will be swept up to 9 volts to verify the sensor's measurements. Print statements will be provided over UART. 
- The car with the sensor system will be able to transmit its measurements back to the remotely positioned controller. Print statements will be provided over UART. |
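The FSK link between the transmitter and receiver subsystems can be prototyped offline before committing to STM32 firmware. The sketch below is a minimal host-side model; the 39/41 kHz tone pair, 400 kHz sample rate, and 1 ms bit period are illustrative assumptions, not the team's final parameters.

```python
import math

SAMPLE_RATE = 400_000              # Hz, assumed sampling rate for the model
F_MARK, F_SPACE = 41_000, 39_000   # assumed FSK tone pair around the 40 kHz carrier
SAMPLES_PER_BIT = 400              # 1 ms per bit at this rate

def fsk_modulate(bits):
    """Produce one phase-continuous sample stream for the bit sequence."""
    samples, phase = [], 0.0
    for bit in bits:
        step = 2 * math.pi * (F_MARK if bit else F_SPACE) / SAMPLE_RATE
        for _ in range(SAMPLES_PER_BIT):
            samples.append(math.sin(phase))
            phase += step
    return samples

def tone_power(chunk, freq):
    """Power of `chunk` at `freq` via correlation with a quadrature pair
    (the discrete equivalent of a single-bin DFT)."""
    w = 2 * math.pi * freq / SAMPLE_RATE
    c = sum(x * math.cos(w * i) for i, x in enumerate(chunk))
    s = sum(x * math.sin(w * i) for i, x in enumerate(chunk))
    return c * c + s * s

def fsk_demodulate(samples):
    """Decide each bit by comparing the energy at the two tone frequencies."""
    bits = []
    for i in range(0, len(samples), SAMPLES_PER_BIT):
        chunk = samples[i:i + SAMPLES_PER_BIT]
        bits.append(1 if tone_power(chunk, F_MARK) > tone_power(chunk, F_SPACE) else 0)
    return bits
```

On the MCU the same per-bit energy comparison is usually done with the Goertzel algorithm, which computes one frequency bin incrementally instead of buffering the whole chunk.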
||||||
55 | Waste Segregation System (Team members: syedr3, rutvadp2, konarkd2) |
Ahmed Raza Konark Dhingreja Rutva Pandya |
Maanas Sandeep Agrawal | Michael Oelze | proposal1.pdf |
|
# Problem Inefficient waste segregation is a critical environmental challenge. While recycling facilities exist, their effectiveness is severely limited by improper waste sorting at the source. Manual sorting is prone to errors, time-consuming, and often results in recyclable materials being sent to landfills. There's a clear need for an automated system that can accurately segregate waste at the disposal point. # Solution Our solution is an intelligent waste segregation system that automatically identifies and sorts waste into appropriate categories using computer vision and mechanical automation. The system comprises a main intake chamber with a camera for material identification, connected to four separate collection bins (for glass, plastic, metal, and non-recyclable waste). A pre-trained machine learning model running on an Arduino processes images to identify materials, while a tilting platform drops items into their matching bins. # Solution Components ## Vision and Processing Subsystem - HD camera for waste item imaging - Custom PCB with Arduino for system control and ML model execution - Pre-trained TensorFlow model for material classification - LED indicators for bin status and error conditions ## Mechanical Sorting Subsystem - Routing mechanism with 4-way directional control - Emergency stop mechanism for system blockages - Anti-jamming detection system ## Power and Housing Subsystem - Converts standard outlet power to the levels required by the sensors, microcontroller, and communications module # Criterion for Success Our solution will be considered successful if it achieves: - Material identification accuracy of >70% under various lighting conditions - Sorting speed of at least 1 item every 20 seconds - Ability to handle items up to 500g in weight - Less than 20% system jamming rate during continuous operation |
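The identify-then-route step could look like the following sketch. The label set, bin indices, and the 0.70 confidence floor are assumptions for illustration (the floor mirrors the >70% identification target); low-confidence or unrecognized items fall through to the non-recyclable bin rather than contaminating a recycling stream.

```python
# Assumed class labels from the TensorFlow model and their bin assignments.
BIN_FOR_LABEL = {"glass": 0, "plastic": 1, "metal": 2}
NON_RECYCLABLE_BIN = 3
CONFIDENCE_FLOOR = 0.70   # mirrors the >70% identification criterion

def route_item(label, confidence):
    """Pick a bin index for one classified item; anything uncertain or
    unrecognized defaults to the non-recyclable bin."""
    if confidence < CONFIDENCE_FLOOR:
        return NON_RECYCLABLE_BIN
    return BIN_FOR_LABEL.get(label, NON_RECYCLABLE_BIN)
```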
||||||
57 | Wireless EMG and IMU Sleeve for Hand Gesture Recognition |
Diqing Zuo Harbin Li Jameson Koonce |
Michael Molter | Yang Zhao | proposal1.pdf |
|
# Team Members: - Jameson Koonce (jrk8) - Diqing Zuo (diqingz2) - Harbin Li (hdli2) # Problem As advancements have been made in the Virtual Reality (VR) space, more practical applications of the technology have been found such as in education, engineering, utilities maintenance, and entertainment ([source](https://pmc.ncbi.nlm.nih.gov/articles/PMC9517547/#sec4-ijerph-19-11278)). However, this technology is not yet immersive enough as the majority of users experience some level of cybersickness during use characterized by discomfort ([source](https://pmc.ncbi.nlm.nih.gov/articles/PMC8886867/#Sec1)). Part of this immersion loss can be attributed to how VR consoles track the user’s hands, with some solutions involving controllers, leading to a lack of immersion, and others involving computer vision, which can be inaccurate in many hand/arm positions. There needs to be a more effective way to immerse a VR user’s arm and hands into a virtual environment. # Solution We are looking to create a system which tracks arm movements and recognizes hand gestures for more immersive Virtual Reality (VR) Environments. Specifically, we are going to develop a wireless sleeve lined with Electromyography (EMG) and Inertial Measurement Unit (IMU) sensors in order to detect electrical signals, orientation, and acceleration information from a user's arm and use on-device processing of machine learning algorithms to classify individual finger gestures and track arm movement. This system will be more immersive than existing solutions because the user’s hands will be free in a VR environment, and the arm motion will be tracked even when the arm is out of view. The system will make use of EMG and IMU sensors on a physical sleeve, connected to a wireless module to assure that the information can be used as a controller for external devices and the user is physically unconstrained. 
The data will be processed in our on-sleeve ML framework for classification and tracking, but raw data can be processed off-sleeve for higher computational efficiency, at the cost of increased latency. # Solution Components ## Sensor Array System Description: Array of sensors responsible for collecting and preprocessing the analog signals for use by the processing unit. - Dry sEMG Electrodes: large array of dry electrodes for recognition of movements in the hand. - IMUs (ICM-20948 9-axis IMU): collection of accelerometer, gyroscope, and magnetometer to track the orientation and movement of the arm. - Op-amp Denoising (OPA4277UA): operational amplifiers for signal conditioning. ## ML-Based Gesture Recognition (Software) Description: Processes EMG data collected using ML models to classify hand/finger/arm gestures in real time. Components: - Microcontroller (STM32WB55 Series MCU): responsible for interfacing with the EMG sensors, preprocessing raw signals, and system control - ML Framework: optimized for real-time, low-power inference. - TensorFlow Lite for Microcontrollers (tflite-micro/tensorflow/lite/micro/examples at main · tensorflow/tflite-micro · GitHub) Possible external dataset: Ninapro (Ninapro) - Edge processing module (only needed for strict real-time inference latency requirements): executes the ML model directly on-device for low-latency inference — nRF52840 SoC - Training model on EMG signal data We are training our model solely on our own collected EMG data from a single user, focusing on a limited number of gestures first to demonstrate feasibility. The Ninapro dataset could serve as a reference for understanding gesture patterns but would currently not be used directly in training. The training and optimization of the ML model would be divided into the following parts: 1. Data Collection: Data would be collected from a single user, focusing on a small subset of predefined gestures. 
This would then be labelled and used to train our model. 2. Feature Extraction: Extract relevant features from EMG signals, including amplitude, frequency-domain characteristics, and time-domain patterns. 3. Model Architecture: Uses a lightweight deep learning model. We are considering two primary approaches, CNN and RNN; our primary attempt will focus on a CNN due to its lower processing-power and memory requirements. Based on the above, we train the ML model and then convert the trained model into a TensorFlow Lite model. - Classification of EMG signals (text/command) We start by preprocessing the raw EMG signals, applying filtering techniques to remove noise and enhance signal quality. Extracted features such as signal amplitude, frequency, and patterns are analyzed to identify gesture characteristics. The processed data is then fed into our trained ML model, which classifies the EMG signals into specific gestures that are converted into text-based commands, control signals, etc. ## Wireless module Description: Manages real-time communication between the wearable device and external systems, enabling efficient transmission of classified gesture data for further processing or user interaction. Components: - Wireless Protocol: We will use BLE for efficient, low-power wireless communication - Integrated BLE MCU: The STM32WB55 includes a built-in BLE radio ## Physical component (wearable form) Description: Physical wearable assembly. - Nylon-spandex sleeve with electrode cutouts - PCB and electronic mounts - Li-Po battery and attachment # Criterion For Success - Reliability/consistency in discerning gestures - Show viability by implementing it on one person only. - Achieve 95% accuracy in recognizing a set of 6 gestures - Demonstrate wearability for extended periods (1+ hours) without significant signal degradation (maintaining 90%+ accuracy). - Achieve the same or similar accuracy between wearing sessions, with minimal to no calibration. 
- Wireless capability - Demonstrate wireless capability and clearly show gesture recognition and arm tracking results on external device - Latency - Achieving latency of below 200ms |
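The feature-extraction step named in the training plan (amplitude and time-domain patterns) is commonly built from a few classic windowed EMG statistics. A minimal pure-Python sketch of three such features — RMS amplitude, zero-crossing count, and waveform length — which is illustrative and not the team's chosen feature set:

```python
import math

def emg_features(window):
    """Compute three classic time-domain EMG features for one sample window:
    RMS amplitude, zero-crossing count, and waveform length."""
    n = len(window)
    rms = math.sqrt(sum(x * x for x in window) / n)
    zero_crossings = sum(1 for a, b in zip(window, window[1:]) if a * b < 0)
    waveform_length = sum(abs(b - a) for a, b in zip(window, window[1:]))
    return rms, zero_crossings, waveform_length
```

In a deployed pipeline these per-channel features (or the raw window itself, for a CNN) form the input vector handed to the TensorFlow Lite classifier.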
||||||
58 | Virtual Reality Gloves |
Aditya Nebhrajani Ashton Billings Hamza Lutfi |
Jason Zhang | Viktor Gruev | other1.pdf |
|
# Title Team Members: - Ashton Billings (Ashton6) - Hamza Lutfi (hamzael2) - Aditya Nebhrajani (avn5) # Problem Despite the recent breakthroughs in VR technologies and experiences, it's clear that the technology is still in its infancy. Our project will explore one area lacking in the industry: good and relatively cheap hand tracking. As of now, most flagship VR devices still rely on handheld controllers, with only the more expensive products such as the Valve Index having hand-tracking capabilities. Even so, these flagship devices are lacking in certain areas. # Solution We will develop a hand-tracking system meant for VR development. To test our system, we will be using the Unity game engine combined with its OpenXR SDK, which includes standardized hand-tracking software and the ability to deploy virtual scenes to an Oculus Quest 2 headset, which we will be using for this. We will use a combination of multiple sensors to achieve this result, with the main one for finger tracking being strain gauge sensors over each joint. These sensors change in resistance as they stretch, making them perfect for measuring joint bending. We also need to track where our hand is. We will be using IMUs combined with sensor fusion algorithms for this. Finally, with this data, our firmware will process it and arrange it in compliance with Unity's XRHands package before sending it to Unity over BLE. We will also implement haptic feedback in our gloves, giving the illusion of holding objects in our hands. We will do this by using servo motors attached by string to each finger that will lock once our finger collides with an object. This will be done through Unity colliders, which provide simple true/false information about whether or not an object is colliding with another. We will send this to our hardware when true to lock the finger. 
# Solution Components ## Subsystem 1 - Hardware We will be using five main pieces of hardware for our design: strain gauge sensors for joint tracking, IMUs for hand tracking, servo motors for haptic feedback, rechargeable lithium-ion batteries for power, and of course our nRF52840 MCU. Each will need its own set of complementary hardware/software to make it work as intended. The strain gauge sensors will be embedded in the fabric and then sewn onto a glove, such that when you flex your hand in the glove, it will stretch the fabric and therefore stretch the sensor. These sensors change in resistance as they flex, which will need to be converted to a voltage for measurement. This will be done with a Wheatstone bridge; this voltage will then be digitized with an ADC onboard the nRF52840 chip. The IMU will be embedded on top of the hand. The accelerometers are prone to error due to the double integration of noise, so sensor fusion algorithms will be used. Many open-source algorithms such as the Madgwick filter can be used for this, which combines data from the accelerometer, magnetometer, and gyroscope to reduce error in our position tracking. The servo motors will be used for haptic feedback, stopping the fingers in place when they run into objects. As such we need a servo motor that can lock in place, so we've opted for the MG 996R for this. Since we will need 5 motors per hand, we will need a PWM driver module such as the PCA9685. This module allows for control of all 5 motors through one I2C communication bus and frees our MCU from needing to generate PWM signals. To power our system we need batteries. We will be using lithium-ion batteries for our device, which is typical of such devices. We will be using a battery management system device such as the bq2407x to control and charge these batteries. Finally, we have our MCU, the nRF52840. 
This microcontroller is fast and built for wireless communication as well as being low power, perfect for our desire to use BLE. This MCU will be responsible for talking to our three main sensing/actuation components, packing up this data, and communicating it with Unity. ## Subsystem 2 - Firmware Firmware for the nRF52840 will be written using the nRF5 SDK. The decision between the Connect SDK (Zephyr RTOS based) and “plain” SDK will be made once we are testing on a dev board. The firmware will implement code to read data from the sensors using the nRF’s ADCs, then send it to the computer running a game engine through BLE via Zephyr’s stack or the SoftDevice. Our firmware will send this data via binary serialization for as low-latency a solution as possible. ## Subsystem 3 - Software Our software will both receive data and send data for the proper function of our gloves. It will also provide us with the test scene to test our gloves' functionality. All of this will be done in the Unity game engine, which uses C# for game development. Our software will take in sensor data sent from the firmware and, via a custom XRHands provider script, convert it in such a way as to work with the standardized XRHands Unity package. This package has standardized functions for XRHand operations, making it perfect for our project. An XRHands provider is simply a script used to take our raw sensor data and convert it to standardized XRHands data form. Our software will also need to send interaction data to our glove for haptic feedback. This can be done using Unity colliders and collider callbacks. Functions like OnCollisionEnter will run once two colliders interact, meaning that in this function we can send a halt command to our firmware to tell the servos to stop, allowing for simple haptic feedback. Finally, we have our test scene. Creating a simple XR test scene in Unity is straightforward, with many free ones available online as well. 
These scenes contain interactable objects with Rigidbodies allowing them to act like physical objects, colliders allowing for interaction, and meshes allowing the computer to determine where these objects are for interaction. Our software will have a simple scene with objects to pick up and throw for testing. # Criterion For Success Our first criterion for success is that we can pick up virtual objects with the gloves. The gloves must also lock properly when grabbing objects, providing haptic feedback. Our next criterion for success is that the gloves have a latency of under 1 second. We can test this by moving the glove in a predictable way and timing how long it takes for Unity to react, using a timer in our firmware and software and comparing the two. Our final criterion for success is that the accuracy of the gloves is within a reasonable range. We can test this by predictably moving the glove into a certain position, say bending the index finger relative to the palm by 30 degrees, and seeing how much Unity moves the virtual hand in response. A reasonable range is that the fingers track within ±15 degrees. |
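The gauge-to-voltage conversion described in the hardware subsystem follows the standard quarter-bridge Wheatstone relation. A minimal sketch, in which the 3.3 V excitation and 350 Ω bridge resistance are assumed example values rather than the team's chosen parts:

```python
V_EXCITATION = 3.3   # volts; assumed bridge supply, tied to the logic rail
R_FIXED = 350.0      # ohms; assumed value of the three fixed bridge resistors

def quarter_bridge_vout(r_gauge):
    """Differential output of a quarter Wheatstone bridge with one
    strain-gauge arm: Vout = Vex * (Rg / (Rg + R) - 1/2)."""
    return V_EXCITATION * (r_gauge / (r_gauge + R_FIXED) - 0.5)
```

Because a flexed gauge changes resistance by only a fraction of a percent, the differential output is millivolt-scale, which is why an amplifier stage or the ADC's internal gain sits between the bridge and the nRF52840's ADC.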
||||||
59 | Virtual Synthesizer using MIDI Keyboard |
Connor Barker Dylan Pokorny Patrick Ptasznik |
Eric Tang | Yang Zhao | proposal1.pdf |
|
# Virtual Synthesizer using MIDI Keyboard Team Members: - Connor Barker (cbarker4) - Patrick Ptasznik (pptas2) - Dylan Pokorny (dylangp2) # Problem The high cost of professional-grade virtual studio technology (VSTs) and digital audio workstations (DAWs) presents a significant barrier to entry for aspiring music producers. Many individuals, specifically those just starting out, lack the financial resources to gather the necessary equipment, limiting their ability to explore the world of music. This project aims to address this problem by creating an affordable, standalone hardware synthesizer that replicates VST functionality, making music production more accessible for the average music enthusiast. # Solution This project aims to create a low-cost hardware synthesizer, making music production accessible to a wider audience. The design centers around an ESP32-S3 microcontroller, which acts as the brain, processing input from a MIDI keyboard to generate sounds through speakers. Power is supplied through a wall adapter and a buck converter to ensure proper voltage levels for all components. The generated audio is then output to a speaker for real-time sound production. A user interface consisting of a potentiometer for volume control and buttons for instrument selection, along with an LCD screen for displaying information such as wave type, allows for intuitive interaction. This standalone device bypasses the need for a computer and complex software, significantly reducing the financial cost. # Solution Components ## Subsystem 1: Microcontroller / Software (ESP32) We plan on using the ESP32-S3 microcontroller because it has a few main features that greatly help our project. First, it has USB host support that can turn some of its GPIO pins into pins that support reading directly from our MIDI keyboard's USB port with a USB adapter. 
Additionally, it has multiple I2C ports for us to connect to the LCD screen as well as multiple I2S ports that can output audio data to the speaker. Finally, it is powerful, with more cores than some of its counterparts, allowing multiple processes to run while looking for user input and generating the sound wave (sine, saw, etc.). ## Subsystem 2: Power (wall outlet) We will draw power from an AC 120V 60Hz wall outlet using an adapter to convert it to DC 5V. The DC 5V supply will power the LCD screen and the MIDI keyboard through a USB-A adapter. We will use a buck converter to step down the voltage to 3.3V for use with the microcontroller and speakers. - Wall outlet adapter: 120V 60Hz AC to 5V DC - Buck converter components: transistors, diode, capacitors, inductors ## Subsystem 3: Speakers We plan on using a 4-Ohm speaker with a power range between 3-5 Watts in order to have sufficient sound quality while limiting power demand. We have listed some example options below at different ranges of power. ## Subsystem 4: User Control Volume control will be handled by a potentiometer that connects to a GPIO pin on the ESP that will control the output signal in software. Additionally, we will use simple mechanical buttons connected to GPIO pins so the user can cycle through available instruments. ## Subsystem 5: LCD screen We will display the name of the sound currently selected on an LCD screen controlled by the microcontroller through the I2C protocol. The display will visually assist users in selecting the sound they want to play. ## Subsystem 6: MIDI Keyboard We plan on using the MIDIPLUS AKM320 as our piano input, which outputs MIDI data via USB type A. We will have a USB-A connector that splits into VCC, D+, D-, and GND so that we can use USB host mode to then connect to the ESP32. # Criterion For Success The synthesizer will be capable of switching between several sounds of different waveforms that are clearly distinguishable. 
Volume control potentiometer can raise and lower the volume of speaker output in a continuous manner. Supports multiple notes being played at the same time (Chords). Supports a range of at least 3 octaves of notes to be output on speaker. Sound features must be adjustable in real time, as the synthesizer is in use. # Resources & Citations **Microcontroller / Software (ESP32)**: https://www.amazon.com/Espressif-ESP32-DevKitC-VE-Development-Board/dp/B087TNPQCV?source=ps-sl-shoppingads-lpcontext&ref_=fplfs&smid=A33XZ36WFNH796&gQT=2&th=1 LCD Screen: https://www.amazon.com/GeeekPi-Character-Backlight-Raspberry-Electrical/dp/B07S7PJYM6?crid=3NFE1JY7T1MDW&dib=eyJ2IjoiMSJ9.3LG-rdQyBtOaCRNH2P5W1gbZ0fmHmFZQ9pHUMksSeyRTMIO-_dFWjwM5dELoTud6V_NowIFGdGGkOcVWORnhcPIu2jGzKywg_-0sluGTvejwLetYOb44z6zOB2wjYhh4r2w7umgCugyzyDLOEyJa7JYFfm7lbD0HnLQN4wgbOWSkLDwhAqS8Z-__CkpfdozsjuaDIEInA5Z64L0Wzp20CMMDfx2oz_9hkgdhBOMHWaebiTp2HxdOnCEikWO_XFQDGeQrIvo6K64-ZDbe0OmUf8RzQnFAAFKPXG6WEq2TYUoh3gfP9mySKIdCHB3rw4Zw3ff-yNT244T6Jo4X5fq-mbNkaL08CNzNgrmgK3ZBlu8.Pi6n6hRDZvfI_iccKXpOIpZVY0Q-vsD9BjD9otaEsJk&dib_tag=se&keywords=lcd+screen+i2s&qid=1738196187&s=electronics&sprefix=lcd+screen+i2%2Celectronics%2C142&sr=1-4 Speaker Options: https://www.amazon.com/Gikfun-Speaker-Stereo-Loudspeaker-Arduino/dp/B01CHYIU26/ref=sr_1_5?crid=H3YZHZ7EW1LD&dib=eyJ2IjoiMSJ9.JxfX0DtoMc3EK4kMjWnChI0FreS6wWoy9zEvJmvhcjj-UTOBNjy4oEsL_4rq1b3hge0U0YyboxhnX-h-FQe3nFRhVbOICJDh88talb83w61MyBHqj9GONi-uylmW7PQ71P_gCSX2skcK4eX_s2fvjz5qMBYPI5kpEDOHIjXlPpaxd1TALGcSZdGKOupGIm7FhsglNMLOKX_jMSx3Y_OCDbvstR2fvILpAWEHm5uS7B0.XcpkmIU-GtrD8iRgeiyV2xOXJEMB9xLfhKBddBAjjQs&dib_tag=se&keywords=circuit+loudspeaker&qid=1738196030&sprefix=circuit+loudspeake%2Caps%2C109&sr=8-5 
https://www.amazon.com/Gikfun-Speaker-Loudspeaker-Arduino-Replacement/dp/B081169PC5/ref=sr_1_1?crid=H3YZHZ7EW1LD&dib=eyJ2IjoiMSJ9.JxfX0DtoMc3EK4kMjWnChI0FreS6wWoy9zEvJmvhcjj-UTOBNjy4oEsL_4rq1b3hge0U0YyboxhnX-h-FQe3nFRhVbOICJDh88talb83w61MyBHqj9GONi-uylmW7PQ71P_gCSX2skcK4eX_s2fvjz5qMBYPI5kpEDOHIjXlPpaxd1TALGcSZdGKOupGIm7FhsglNMLOKX_jMSx3Y_OCDbvstR2fvILpAWEHm5uS7B0.XcpkmIU-GtrD8iRgeiyV2xOXJEMB9xLfhKBddBAjjQs&dib_tag=se&keywords=circuit+loudspeaker&qid=1738196030&sprefix=circuit+loudspeake%2Caps%2C109&sr=8-1 Power Adapter, 120V 60 Hz AC to 5V DC 15W: https://www.amazon.com/MTDZKJG-Adapter-100V-240V-Transformer-Security/dp/B0BZP65GRW/ref=sr_1_8?dib=eyJ2IjoiMSJ9.lt4Dgb27bTajkIeDcd8swsiOjzJ1W2QmIfdBQ7_ahaAwoZQW7WZT5-8AAq5eO-U3gPg7JLb7gG5ApYMsSGhn1URvtswbMboxyXNguxbZp9x8vo-XKVhFeYR718fDVvqt5pq8Fm69GqbQcccbft7M2FIN5mx-wSvo81yy8O-vkdiITNwAqmRbwcdA-aLqEeghpkxNBbo6j4YeaQV-XAnYrKYwaAvx15HuXzDKm35MaTMQN0lhteHusMF8TQp_oZvaKlfOphY4AJMI20KQTlm8nyCyNAt7phcz6irY1BdM-99ZCwEv2LpjeK-jcJOBBF26QSp5H0I9qG4lq_Mb6l-NVVxCE_5YrAUNsBm5j_fXqy0.YoKiwCFxh_6txGCrj5XQvP6w7R17ZPkm87osANvsZfw&dib_tag=se&keywords=120v+to+5v&qid=1738197561&s=electronics&sr=1-8 MIDI Keyboard: https://www.amazon.com/midiplus-32-Key-Midi-Controller-AKM320/dp/B00VHKMK64/ref=sr_1_1?crid=19Z5UVHJGE6MO&dib=eyJ2IjoiMSJ9.B3fOhHaP4O1-06iJF-1ObLtOnDzngOUeP1gjPnLKux8F-oCAti98qP5_9hxSh3xXi34fWhRQLeZFHMQQtj_HZiJxdDVbdczE6f6u6-TvAaCz6bvXD1t2vbNnFTN-Nf2NWRaVr5BM8IWNMJDoqouDdxHyRDn9abehbUR-an58-oj5K5mOA1opEmGjvoHeit2b04v9ehE0842C8DKo0yppB4qpp3icjy5IgsC1RDlcbvXs_GCHzerrx2XiPcJwtzhOk5-6MWAZ8YB0vf7lO62AhQQJpIF0Vcm019Jpt_I3D6bAR2DTWmNdikYfCFw4z-5Kb9EcRF49MTHNKLxTwHV0zzqfnjJd2pOaz5LzexPNCbjTz3b32f9KCotyeP5L_s5lHni3peR32R6jAi2IWb24NM304vJ0_cjZLNlbY-uAb_2cYIluJ7ljKLcFs6-q1_P9.2k1JoRB3bVdFtLBBRn1p1PAaxmC4y8WTYLLVdzy9kKA&dib_tag=se&keywords=midiplus%2Bakm320%2Busb%2Bmidi%2Bkeyboard%2Bcontroller%2C%2Bblack%2C%2B32-key&qid=1738199055&sprefix=%2Caps%2C102&sr=8-1&th=1 USB 2.0 Type A Female SMT Connector: 
https://www.mouser.com/ProductDetail/TE-Connectivity/292303-7?qs=e6gk%2FTaAuqWZCg5WWmtijA%3D%3D |
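The note-to-waveform generation the ESP32 will perform can be prototyped host-side first. A minimal sketch using standard equal-temperament MIDI tuning (A4 = MIDI note 69 = 440 Hz); the 44.1 kHz rate is an illustrative assumption, as the actual I2S output rate is the team's choice:

```python
import math

SAMPLE_RATE = 44_100   # Hz, assumed audio output rate

def midi_to_freq(note):
    """Equal-temperament pitch: A4 (MIDI 69) = 440 Hz."""
    return 440.0 * 2 ** ((note - 69) / 12)

def render(note, wave, n_samples):
    """Generate n_samples of the selected waveform for one MIDI note."""
    freq = midi_to_freq(note)
    out = []
    for i in range(n_samples):
        phase = (freq * i / SAMPLE_RATE) % 1.0   # position within the cycle
        if wave == "sine":
            out.append(math.sin(2 * math.pi * phase))
        elif wave == "saw":
            out.append(2 * phase - 1)
        else:  # square
            out.append(1.0 if phase < 0.5 else -1.0)
    return out
```

Chord support (one of the success criteria) follows from summing the per-note sample streams and scaling to avoid clipping before the result is written to the I2S peripheral.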
||||||
60 | Digital Pitch Shifter for Guitar |
Eric Moreno William Chang Zhengjie Fan |
Shengyan Liu | Michael Oelze | proposal1.pdf |
|
# **Digital Pitch Shifter for Guitar** Team Members: - William Chang (wqchang2) - Eric Moreno (emoren40) # **Problem** Guitarists without access to a tremolo system face significant limitations in their ability to create expressive vibrato and pitch-bending effects, which are essential for adding emotional depth and dynamic variation to their playing. Without these techniques, the guitar’s sound can feel static or restrained, especially in genres like rock, blues, and jazz, where pitch manipulation is crucial. Traditional tremolo systems, though effective in addressing this issue, require invasive modifications to the guitar body, such as routing or altering the bridge. These changes not only compromise the guitar’s original design but can also affect its sound and value. Additionally, such systems may not be suitable for all playing styles, or for guitarists who prefer a more minimalist approach. As a result, players seeking greater versatility in their instrument face the difficult choice between sacrificing their guitar’s aesthetics or settling for limited expressive capabilities. This is the gap the proposed project aims to fill. # **Solution** The solution to the aforementioned issue is a compact, attachable digital pitch-shifting device that uses an ultrasonic sensor to detect the proximity of the guitarist’s hand to the bridge of the guitar. As the player moves their hand closer or farther from the sensor, the pitch of the guitar signal is dynamically adjusted, allowing for real-time pitch shifts up or down. This enables the guitarist to perform expressive techniques like vibrato and pitch bending, similar to those provided by traditional tremolo systems, but without the need for invasive body modifications. Additionally, the device includes a switch or button that lets the player toggle between upward or downward pitch shifts, offering greater flexibility in controlling the pitch. 
This lightweight solution enhances the player's creativity while preserving the guitar’s natural design and playability. Furthermore, the additional buttons or switches can enable further effects such as reverb, chorus, or delay, giving the player more creative control over their sound. These augmentations enhance the guitarist’s ability to experiment with a wider range of tones and textures without needing to modify the guitar's body or permanently alter its design. # **Solution Components** **Ultrasonic Sensor** The HC-SR04 ultrasonic sensor will play a crucial role in detecting the proximity of the guitarist’s hand to the sensor, which will then be used to adjust the pitch of the guitar signal. The sensor operates using two primary pins: the Trigger pin and the Echo pin. The Trigger pin receives a pulse signal from the ESP32 to initiate the emission of an ultrasonic wave, while the Echo pin sends back a signal to the ESP32 that is used to calculate the distance based on the time it takes for the wave to return. This distance will dynamically influence the intensity of the pitch-shifting effect. **Guitar Preamp** A guitar preamp pedal will be placed between the guitar and the microcontroller to boost the guitar’s signal, which typically ranges in the hundreds of millivolts. The preamp will increase the signal to a level suitable for the ESP32's ADC, ensuring that the microcontroller can properly process the audio input. **Microcontroller (Audio I/O, Signal Processing, Sensor I/O)** The ESP32 microcontroller will serve as the central unit responsible for managing both the input and output of signals, as well as performing real-time signal processing for the project. One of its primary roles will be handling audio input and output through its ADC (Analog-to-Digital Converter) and DAC (Digital-to-Analog Converter) pins. 
The ESP32 will convert the guitar signal from analog to digital using the ADC, process it with pitch-shifting algorithms, and then convert it back to analog using the DAC for output to a guitar amplifier. In addition to audio processing, the microcontroller will interact with the HC-SR04 ultrasonic sensor by sending a trigger pulse through its GPIO pin to the TRIG pin in order to initiate a reading. It will then read the output of the Echo pin to calculate the distance between the sensor and the player’s hand, which will influence the pitch-shifting parameters. Furthermore, the microcontroller will manage user interactions such as toggling effects or adjusting parameters using additional GPIO pins connected to buttons or switches. **Guitar Amplifier** A 7-watt combo amp will be used to amplify and output the pitch-shifted guitar signal from the ESP32 to an audible level. After the microcontroller processes the audio and applies the pitch shift, the combo amp will boost the signal, making it loud enough for the guitar speaker to produce sound. **Power System** The power management system will use a 5V power supply to ensure stable operation of both the ESP32 microcontroller and the HC-SR04 ultrasonic sensor. Since the ESP32 requires 3.3V, a voltage regulator will step down the 5V supply to provide a stable 3.3V output for the microcontroller. The HC-SR04 sensor, which operates at 5V, will be powered directly from the same 5V supply to ensure proper functionality. A common ground will be shared between all components to maintain reliable communication. Additionally, since the HC-SR04’s Echo pin outputs 5V, a voltage divider can be used to step down the signal to a safe 3.3V for the ESP32’s GPIO. # **Criterion For Success:** - Non-Intrusiveness – The device must attach to the guitar without requiring permanent modifications, preserving the instrument’s original design and functionality. 
- Real-Time Pitch Control – The pitch of the guitar signal should shift dynamically (range of 2 octaves) in response to the player’s hand movements, ensuring smooth performance. - Adjustable Pitch Direction – A switch or button should allow the player to toggle between shifting the pitch up or down, providing flexibility. - Maintain Guitar Signal Integrity – The device must process the guitar’s audio cleanly, maintaining tonal quality without noticeable latency or unwanted distortion. - Compact and Lightweight Design – The attachment should be small and light enough to avoid interfering with playability or altering the guitar’s balance. - Reliable Power Source – The system must have a stable and efficient power supply, ensuring consistent performance without frequent battery replacements or power interruptions. - Expandable Features – The device should support additional effects like reverb, chorus, or delay through buttons or switches to enhance creative possibilities. |
||||||
61 | Keyless Smart Lock (Secured Illini) |
Andrew Ruiz Bowen Cui Sebastian Sovailescu |
Sanjana Pingali | Arne Fliflet | proposal1.pdf |
|
# Title Keyless Smart Lock (Secured Illini) # Team members Sebastian Sovailescu (ss159) Andrew Ruiz (ruiz25) Bowen Cui (tianyuc3) # Problem In the darkest hours of the night, when the moon barely shines, grimy Chambana thieves creep up on bikes and snatch whatever they can: wheels, seats, and often the entire thing. Last semester, my bike was stolen right from in front of my apartment. My case is not isolated: according to data, hundreds of bikes are stolen every year in the CU area. For this reason, we want to design a smart bike lock that 1) deters thieves and 2) offers keyless capabilities. # Solution The proposed smart bike lock would include all the features of a conventional U-lock (bolt-cutter resistance, waterproofing, etc.), but it would also come equipped with a 100+dB siren that is triggered by unwanted tampering. To provide keyless capabilities, the MCU would include a Bluetooth chip that allows the user to enable/disable the lock using an app. # Solution Components ## Subsystem 1: Sensor Subsystem The accelerometer is used to detect tampering by recording unusual spikes in acceleration. Once an anomaly is detected, the alert system is triggered, activating the siren for a set amount of time. This occurs only when the FSM is in the armed state; in the unarmed state, all sensors are deactivated, preventing false alarms. Microcontroller - ESP32-S3-WROOM-1U: interprets the readings from the accelerometer/gyroscope and activates the siren when the readings are out of range. Accelerometer - MPU6050: combines an accelerometer and a gyroscope, so it can detect both sharp and slower movements. Siren - PK-35N29WQ: 12V, 10mA; a relatively high power draw, but in practice it should rarely be active during typical usage. It can output 90dB. ## Subsystem 2: Keyless Entry System The purpose of this system is to allow for keyless entry using a Bluetooth-capable device (phone). 
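The armed/unarmed behavior of the sensor subsystem can be sketched as a tiny state machine; the 1.5 g threshold and the class/method names here are hypothetical placeholders (the real firmware would read the MPU6050 over I2C):

```python
ACCEL_THRESHOLD_G = 1.5  # hypothetical tamper threshold in g

class LockController:
    """Minimal armed/unarmed FSM: tamper readings only matter when armed."""

    def __init__(self):
        self.armed = False
        self.siren_on = False

    def set_armed(self, armed: bool):
        self.armed = armed
        if not armed:
            self.siren_on = False  # disarming also silences the siren

    def on_accel_sample(self, magnitude_g: float):
        # In the unarmed state, sensor readings are ignored (no false alarms)
        if self.armed and magnitude_g > ACCEL_THRESHOLD_G:
            self.siren_on = True
```

Keeping the sensor check gated on the armed flag is what prevents false alarms while the owner is riding.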
It should also allow for logging of past access attempts. The MCU keeps track of an FSM with two states, armed versus unarmed, and will switch the lock between its locked and unlocked states based on messages received over Bluetooth. Components: Bluetooth device - mobile phone with an app to control locking of the bike and to access a log of past unlocks or tampers. Microcontroller - ESP32-S3-WROOM-1U: interfaces with the phone to control the locking and unlocking of the bike, and logs unlocks and tampers in conjunction with the accelerometer. ## Subsystem 3: Power Supply System Our system will need 12V, 3.7V, and V rails; to achieve this, we plan to use a 2-pack of Samsung 40T 21700 4000mAh 35A batteries and step the voltages up and down as needed using asynchronous buck and boost converters, avoiding the need for as many signal amplifiers. Components: Battery - Samsung 21700 cells # Criterion For Success To achieve success for this project, we will have a fully working locking mechanism, an app to access it, and an alert system with BLE on the lock. The lock must also have a siren to deter thieves. We also want to fully flesh out the app attached to our lock, showing battery stats and receiving alerts when the lock is being tampered with. If these core goals are completed, we will then extend the app to include biking statistics such as movement and path traveled, as well as GPS functionality on the lock for recovery if lost. |
||||||
62 | Multi-Game Card Dealer |
Daniel Gutierrez Matthew Tzeng |
Jason Jung | Michael Oelze | proposal1.pdf |
|
# Multi-Game Card Dealer Team Members: - Daniel Gutierrez (danielg9) - Matthew Tzeng (mttzeng2) - Third Member (______) # Problem Dealers are the heart of every card game imaginable. Cards must be dealt out in a specific fashion both at the beginning and throughout the game. Humans are the ones who have been dealing cards for centuries; however, human error has remained a factor. Misdeals slow down setups, mess up gameplay, and ruin the card game experience for players. # Solution To remove errors from the dealing process, we propose an automatic card dealing machine that can act just like a human dealer with knowledge of multiple different games and rules. Our solution aims to achieve three goals: - Eliminate misdeals from the playing card experience - Provide validation for players that the dealt cards are fair and the playing field is level. - Offer “human dealer” actions, such as player identification and responses to player action (such as dealing more cards, or moving on to the next player) # Components ## Dealing subsystem - Nema 17 Stepper Motor [Link](https://www.amazon.com/STEPPERONLINE-Stepper-Bipolar-Connector-compatible/dp/B00PNEQKC0?mcid=e981ddab58e43534b29effba82c3f107&hvocijid=1482600293604330599-B00PNEQKC0-&hvexpln=73&tag=hyprod-20&linkCode=df0&hvadid=721245378154&hvpos=&hvnetw=g&hvrand=1482600293604330599&hvpone=&hvptwo=&hvqmt=&hvdev=c&hvdvcmdl=&hvlocint=&hvlocphy=9022185&hvtargid=pla-2281435179978&psc=1) - 5V Stepper Motor (ECE Supply Shop) ## Swivel subsystem - HITEC STANDARD SERVO (E-shop) ## Player Detection subsystem - TOF10120 Time-of-Flight Distance Laser Distance Measuring Sensor 5-180cm UART I2C Output [Link](https://www.amazon.com/HUABAN-TOF10120-Flight-Distance-Measuring/dp/B089SLWYZ9) - Focus 5MP OV5647 Sensor [Link](https://www.arducam.com/product/arducam-ov5647-standard-raspberry-pi-camera-b0033/) ## Deal Validation/Card Identification subsystem - Raspberry Pi 3 - Focus 5MP OV5647 Sensor 
[Link](https://www.arducam.com/product/arducam-ov5647-standard-raspberry-pi-camera-b0033/) ## Bluetooth/Wi-Fi subsystem - Raspberry Pi 3 - Player’s phones / web browser ## Power subsystem - Spektrum 11.1V 1300mAh 3S 30C Smart G2 LiPo Battery [Link](https://www.spektrumrc.com/product/11.1v-1300mah-3s-30c-smart-g2-lipo-battery-ic3/SPMX133S30.html) - 5V 3A Buck (Step-down) - 3.3V 1A Buck - 6V–12V Adjustable Buck # Criterion For Success - Dealer can “simple” deal (eject one card at a time at a constant speed with perfectly even angles) - Dealer can “real-world” deal (eject one card at a time with variable speeds depending on the distance and variable angles depending on each player's position at the table) - Dealer can rotate 360 degrees around a pivot point and stop at different specified angles with high accuracy - Player detection: The front camera is successfully able to detect when a player has sat down to play or got up to leave - Deck validation: The inside camera can detect when there is a faulty deck (either duplicated cards or missing cards) - Potentially accomplished with cool ejection patterns - Bluetooth/Wi-Fi GUI (App or external GUI) connected to the inside camera to add statistics for a better viewing experience |
||||||
63 | Water Quality Monitoring System |
Haokai Liu Harry Griggs Jackie Fang |
Rui Gong | Viktor Gruev | proposal1.pdf |
|
Water Quality Monitoring System Team members: Haokai Liu haokail2 Jackie Fang jackief3@illinois.edu Harrison Griggs hgriggs2 Problem: Access to clean water is critical for human health, agriculture, and ecosystems. However, water pollution due to industrial waste, agricultural runoff, and inadequate infrastructure poses a global threat. Current methods for monitoring water quality often involve manual sampling and lab testing, which is time-consuming, expensive, and lacks real-time data. Our project addresses these issues by designing a low-cost, scalable IoT system to monitor water quality parameters in real time. Solution We propose an IoT-based water quality monitoring system designed to provide real-time, actionable insights into water safety. Our solution features a custom PCB that integrates the ESP32 microcontroller, sensors for pH, turbidity, temperature, and conductivity, and power/communication circuits, ensuring a compact and reliable design. The system measures critical water parameters in real time and transmits data wirelessly to a cloud dashboard for remote monitoring. Powered by solar energy, it is ideal for remote deployment and operates sustainably in off-grid environments. Additionally, the system will be low-cost, portable, and scalable, making it suitable for diverse applications such as households, farms, and public water sources. By combining affordability, real-time data, and ease of use, our solution empowers communities to monitor water quality proactively and prevent contamination risks. Solution Components (subsystems) Core Requirements: Microcontroller: ESP32 (QFN package, pre-soldered by lab or ordered from E-Shop). The Microcontroller Subsystem is the core processing unit of the water quality monitoring system, responsible for acquiring, processing, and transmitting sensor data. 
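The acquisition path this subsystem performs (raw 12-bit ADC counts to volts, then to a calibrated parameter) can be sketched as follows. The pH calibration constants below are placeholder values that a real two-point calibration would replace:

```python
ADC_MAX = 4095  # ESP32 ADC resolution: 12 bits
V_REF = 3.3     # assumed full-scale voltage for the configured attenuation

def adc_to_volts(raw: int) -> float:
    """Scale a raw ADC count onto the measurable voltage range."""
    return (raw / ADC_MAX) * V_REF

def volts_to_ph(v: float, v_neutral: float = 1.65,
                slope_v_per_ph: float = -0.18) -> float:
    """Linear pH calibration: v_neutral corresponds to pH 7, and the
    slope comes from a two-point calibration. Both values are placeholders."""
    return 7.0 + (v - v_neutral) / slope_v_per_ph
```

The turbidity and TDS channels would follow the same pattern with their own calibration curves.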
It collects analog and digital signals from the pH, turbidity, temperature (Digikey 480-2016-ND), and TDS sensors, converting them into digital values using its ADC. It also optimizes power usage for the battery, ensuring efficient operation with the power subsystem. Sensor Array The Sensor Array Subsystem is responsible for collecting real-time water quality data by measuring key parameters such as pH, turbidity, temperature, and total dissolved solids (TDS). pH Sensor: 5016-SRV-PH-ND Turbidity Sensor: 1738-1185-ND Liquid Temp Sensor: Digikey 480-2016-ND (ECE 445 Parts Inventory) TDS Sensor: DigiKey 1738-1368-ND Communication: The Communication Subsystem enables data transmission, remote access, and cloud integration for the water quality monitoring system. This ensures real-time monitoring and data storage for further analysis. ESP32 Built-in Wi-Fi (QFN package). UART Header for Programming (Through-hole pins). IoT Connectivity: ESP32/ESP8266 for Wi-Fi or LoRa module for long-range communication. Cloud Integration: Data sent to AWS IoT/ThingSpeak for storage and analysis. Power System The Power Subsystem ensures a stable and reliable energy supply for the water quality monitoring system, supporting both solar and battery-powered operation for increased efficiency and sustainability. Solar Panel: external to PCB, connected via a through-hole terminal block, with wide traces for high-current paths. Battery Management: TP4056 Charging Module (through-hole). Voltage Regulator (through-hole for easy soldering). Criterion for Success: Our project will be considered successful if its sensors are accurate within 5% error of the calibrated lab equipment, real-time data transmission updates to the cloud every 30 minutes with less than 5% packet loss, the cost is under $150, and it can last 24 hours on battery/solar power. |
||||||
64 | Secure Food Delivery Dropbox |
Dhruva Dammanna Rohan Samudrala Taniah Napier |
Chi Zhang | Michael Oelze | proposal1.pdf |
|
# Secure Food Delivery Dropbox Team Members: - Rohan Samudrala (rohans11) - Dhruva Dammanna (dhruvad2) - Taniah Napier (tnapier2) # Problem 70% of college students order food from a third-party delivery service like Uber Eats or DoorDash weekly. Unfortunately, many food deliveries get stolen: customers often order ahead of time, and the food simply sits in front of their house for anyone to take. We want to prevent people from stealing food deliveries. # Solution Our solution is to create an insulated box that only the delivery person and customer can open, locking the food in otherwise. The box will generate a one-time-use passcode that the customer can send to the driver through the app. Once the driver drops off the food, a weight sensor will verify the delivery was actually left and lock the box. The weight sensor also ensures that the driver cannot go back and take the food. Afterward, the orderer uses a master passcode to open the box and receive their order. # Solution Components ## Subsystem 1: 9-digit keypad This is used for the driver to enter the code to unlock the box. The driver's code is one-time use, so it will change after the food is placed inside. The user has a master code that always works, so they can unlock the box at any time. The code entered on the keypad will be checked, and if it is correct, the box will unlock. Parts used: ADAFRUIT 1824 (keypad) ## Subsystem 2: Weight sensor The weight sensor is used to check if food is in the box. It is also used to trigger generation of a new passcode for the next driver. The old passcode that the previous driver used will not work after the weight sensor is activated. Parts used: TAL220B-3KG (weight sensor) ## Subsystem 3: Locking mechanism When the box is closed, a servo motor will push a deadbolt, locking the box shut. 
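The one-time-code lifecycle described in the Solution section can be sketched as follows; the class and method names are hypothetical, while the 4-digit format and always-valid master code follow the proposal:

```python
import secrets

class DropboxLock:
    """Sketch of the one-time driver code plus fixed master code."""

    def __init__(self, master_code: str):
        self.master_code = master_code
        self.driver_code = self._new_code()
        self.locked = True

    @staticmethod
    def _new_code() -> str:
        return f"{secrets.randbelow(10_000):04d}"  # random 4-digit code

    def enter_code(self, code: str) -> bool:
        """Unlock on the master code or the current driver code."""
        if code in (self.master_code, self.driver_code):
            self.locked = False
            return True
        return False

    def on_food_detected(self):
        """Weight sensor fired: lock the box and rotate the driver code."""
        self.locked = True
        self.driver_code = self._new_code()
```

Rotating the driver code on weight-sensor activation is what guarantees the previous driver's code cannot be reused.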
Parts used: 900-00005 (servo) ## Subsystem 4: Control and Power System We will program a microcontroller in C or C++ to interact with the sensors and the rest of the project. A random passcode will be generated here for the driver to use. Once generated, the Wi-Fi capability of the ESP32 will send it to the user. The voltage will also be regulated here, and the system is battery powered. Parts used: ESP32-WROOM-32 (microcontroller), AP2112K-3.3 (3.3V voltage regulator), 1N4148 diode (voltage regulation) ## Subsystem 5: Box Insulation We will use foam panels to insulate the food inside so that it stays at the temperature it needs to be. This will make the food more enjoyable. Part: Foam panels # Criterion For Success The food containment unit is initially locked. Once a specified 4-digit code is entered into the keypad of the containment unit, the containment unit will unlock. A 4-digit code is randomly generated upon weight-sensor activation; this randomly generated code is intended for one-time use. Once the weight sensor detects that the food has been placed inside the containment unit, the unit will lock, the randomly generated passcode will no longer work, and another code will be generated for the next use. A specific 4-digit master code can be entered at any time on the keypad and the food containment unit will open. Both the random code and the master code will be sent to the user. |
||||||
65 | Automatic Guitar Tuner |
Ethan Lin Nathan Kim |
Shengyan Liu | Michael Oelze | proposal1.pdf |
|
# Automatic Guitar Tuner Team Members: - Nathan Kim (nrkim2) - Ethan Lin (ethanl7) # Problem For many guitar players, keeping their guitar in tune or wanting to change the tuning of their guitar can be a hassle. Looking at the tuners currently on the market, the most common type of guitar tuner is a clip-on tuner where the player is required to manually tune each string using the attached tuner as a pitch guide. There also exist automatic guitar tuners but these are limited by either the number of strings that can be tuned at once, the price of the tuner, or the amount of work needed to be done by the player (i.e. the player still has to move the tuner around the pegs or strum the strings). # Solution Our solution is to develop an automatic guitar tuner that attaches to all six tuning pegs of the guitar and can tune each string to a pitch that is set by the user. So, the user will intermittently strum all six strings until an LED flashes which indicates that all strings are correctly tuned to whatever has been set by the user - an attached Piezo Disk Transducer will be used to determine the real-time frequencies and vibrations within the guitar. To accomplish this overall task, we will introduce 4 essential subsystems: a power subsystem, motor subsystem, processing subsystem, and a vibration-sensing subsystem. # Additional Notes This project draws inspiration from projects in SP 1999 and SP 2020, both being automated guitar tuners. The 2020 project features a fully automated system, including strumming, for three strings at a time. Our solution differs from this project by tuning all six strings at once, utilizing a Piezo Disk Transducer instead of a microphone system to determine the pitch of the strings. As a result, our solution will be more noise-resistant, especially when tuning in more chaotic environments, as it will be based on vibrational pitch sensing. 
In addition, our solution will be safer, limiting the motor strength so as not to break the string, and will also include a more robust user interface, allowing the user to set the pitch of each string within a set range. # Solution Components ## Subsystem 1: Power System The power system will provide power for the motors and processing system. As the design will be portable, it will be run from a 9V battery (233) and require a step-down voltage regulator (LM317T) to get the power to an acceptable level for our motor and processing systems. ## Subsystem 2: Motor System The motor system will be responsible for turning the tuning pegs based on the processing system output. The motors (ROB-11696) will be driven by H-bridges (4489) on the PCB and will have limited torque and power in order to ensure the system will not damage the guitar. ## Subsystem 3: Processing System The processing system is the heart of the project, as it will take input from the vibration system, distinguish between all six strings, process which direction to tune each string, and finally send out power to the motor system to tune the guitar. ## Subsystem 4: Vibration-Sensing System This system will take input from a piezo disk transducer (DZS Elec 35mm Piezo Disc Transducer Contact Microphone Trigger Sound Sensor with 4 Inches) and amplify it (LM386N-1, Digikey 296-44414-5-ND) to an acceptable level for the processing system to handle. This system may also take input from multiple transducers and combine them for a more accurate and reliable input. # Criterion For Success Our criterion for success is to be able to identify the pitch of each string from a series of strums. From here, our solution should be able to tune all six strings within ±12 cents of the set tone per string (this is said to be the value where people can start to detect when something is out-of-tune). The entire tuning process should finish within a minute. 
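The ±12-cent criterion reduces to a logarithmic comparison between the measured and target frequencies (100 cents = 1 semitone); a minimal sketch, with standard-tuning targets listed for reference:

```python
import math

def cents_off(measured_hz: float, target_hz: float) -> float:
    """Deviation of the measured pitch from the target, in cents."""
    return 1200.0 * math.log2(measured_hz / target_hz)

def in_tune(measured_hz: float, target_hz: float, tol_cents: float = 12.0) -> bool:
    """True when the string is within the +/-12-cent success criterion."""
    return abs(cents_off(measured_hz, target_hz)) <= tol_cents

# Standard-tuning target frequencies (Hz), for reference
STANDARD_TUNING = {"E2": 82.41, "A2": 110.0, "D3": 146.83,
                   "G3": 196.0, "B3": 246.94, "E4": 329.63}
```

For example, a string measured at 110.5 Hz against the 110 Hz A2 target is about 7.9 cents sharp, which this criterion would accept.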
In addition, our solution should perform similarly in both a quiet and noisy environment. |
||||||
66 | A New Approach to an External Ventricular Drain (Capstone Project) |
David Kapelyan Isiah Lashley Ralph Nathan |
Jason Jung | Yang Zhao | proposal1.pdf |
|
Team Members: - Ralph Nathan (ralphn2) - David Kapelyan (davidik2) - Isiah Lashley (ilashl2) # Problem External Ventricular Drains (EVDs) are used to drain cerebrospinal fluid (CSF), but if done incorrectly, they can cause severe damage, including death. To ensure the correct amount of CSF is drained, the pressure transducers on the EVD must be properly zeroed. However, patients often move during sleep or daily activities such as showering, which can lead to incorrect pressure readings and improper CSF drainage. According to Dr. Suguna Pappu, there have been numerous cases where approximately 40 ccs of CSF were drained instead of the intended 10 due to zeroing errors. This, again, can result in significant harm or even death. In summary, a new approach to EVDs is necessary, one that provides stable pressure readings even when the patient is in motion. This capstone project aims to create advancements in EVDs. # Solution We plan to utilize an STM32 microcontroller to process input from a pressure transducer connected to the catheter through which cerebrospinal fluid (CSF) is drained from the brain. Our design will incorporate a pipe tee in series with a two-way solenoid valve. The catheter extending from the skull will be connected to the tee, which will also be fitted with a pressure gauge. This pressure gauge will be linked to the microcontroller, which will control whether the solenoid valve is open or closed. Measuring pressure digitally, rather than using a manometer, will eliminate the issue of set-point shifts caused by patient movement. Additionally, there will be no need to manually set a “zero” point, as this can be calibrated in software. We will use an instrumentation amplifier with a shunt resistor to buffer signals from the pressure transducer, ensuring accurate readings by the microcontroller. 
Digital signal processing (DSP) will then be performed via the microcontroller, including noise filtering, adaptive thresholding for real-time pressure management, and data logging of pressure readings. The system will regulate the flow of CSF to a drain collection bag via a push-connected solenoid valve. The microcontroller will communicate with a display or bedside monitor via Bluetooth, presenting pressure data—including real-time pressure graphs, an alarm system for abnormal pressure readings, and data logs for physician review—through a graphical user interface (GUI). Additionally, we will implement fail-safes to prevent over-drainage or blockage and include a manual override in case of system failure. # Solution Components ## STM32 Microcontroller An STM32 microcontroller with an on-package RF transceiver that supports Bluetooth will be utilized. The ADC of the controller supports a resolution of 12 bits, which will be useful for accurately measuring the output signal of our pressure gauge. The STM32 microcontroller comes with an internal reference voltage that is typically derived from the supply voltage. ## Power System Circuit A high-voltage rail powered by an AC-DC wall adapter will be used to power the board. A linear regulator will be utilized to decrease the voltage such that it can be used to power the microcontroller. ## Push Connect Solenoid Valve For Drainage A switch will be placed between the high-voltage rail and the solenoid input. The switch will be controlled by an output signal from the microcontroller. ## Pressure Transducer The pressure transducer will be connected to the pipe tee. The pressure transducer will need to be a precision pressure transducer, as the standard intracranial pressure is approximately 16 mmHg (0.309 PSI), which is a relatively low pressure. The transducer will have a current output which will be connected to a shunt resistor across which the voltage will be measured using an instrumentation amplifier. 
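The current-loop front end described above (transducer current through a shunt resistor, read by an instrumentation amplifier) maps to pressure with simple linear math. A sketch, assuming a 4-20 mA transducer with a 0-2 psi span (as in the Cole-Parmer part linked below) and a hypothetical 150 Ω shunt:

```python
SHUNT_OHMS = 150.0                # hypothetical shunt value: 4-20 mA -> 0.6-3.0 V
P_MIN_PSI, P_MAX_PSI = 0.0, 2.0   # assumed transducer span (0-2 psi, 4-20 mA)

def shunt_volts_to_psi(v: float) -> float:
    """Recover the loop current from the shunt voltage, then map
    4-20 mA linearly onto the transducer's pressure span."""
    i_ma = (v / SHUNT_OHMS) * 1000.0
    frac = (i_ma - 4.0) / 16.0    # 0.0 at 4 mA, 1.0 at 20 mA
    return P_MIN_PSI + frac * (P_MAX_PSI - P_MIN_PSI)

def psi_to_mmhg(psi: float) -> float:
    """Convert to the mmHg units used clinically for ICP (1 psi ~ 51.715 mmHg)."""
    return psi * 51.715
```

Under these assumptions, a 0.309 PSI reading lands at roughly 16 mmHg, matching the intracranial pressure figure quoted above.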
# Criterion For Success A successful project will result in a device that accurately reads and processes pressure data from a transducer with minimal noise and high precision. The system must effectively regulate cerebrospinal fluid (CSF) drainage by dynamically controlling a solenoid valve to maintain an average outflow of 10cc/hour, preventing over-drainage or blockage. Additionally, the microcontroller must wirelessly transmit real-time pressure readings via Bluetooth to a bedside monitor, where a graphical user interface (GUI) will display real-time pressure graphs, generate alarm notifications for abnormal pressure levels, and log data for physician review. To ensure safety and reliability, the system must incorporate fail-safes to prevent malfunctions and provide a manual override for emergency control. By meeting these criteria, the project will achieve its goal of delivering an automated, accurate, and user-friendly solution for CSF drainage management. # Parts: STM32 PCB Push Connect Solenoid Valve Pipe Tee Pressure Transducer Instrumentation Amplifier Links: ¼” push connect solenoid valve ⅛” npt solenoid valve https://www.omega.com/en-us/pressure-measurement/pressure-transducers/px119/p/PX119-015GI https://www.coleparmer.com/i/cole-parmer-0-25-accuracy-transmitter-0-to-2-psi-4-to-20-ma-output/6807503 https://www.mouser.com/ProductDetail/Analog-Devices/ADR435BRZ?qs=WIvQP4zGanhj7%2FQWeFYslw%3D%3D&utm_id=22030944703&gad_source=1&gclid=CjwKCAiAtYy9BhBcEiwANWQQLyuDFchHNWjCoLscoWoVpM2fdflY2CcCi-fQ9bxPrEm5EPQFvoIeNxoCPqgQAvD_BwE |
||||||
67 | Automatic Water Quality Monitoring using Test Strips |
Abdullah Alsufyani Fahad Alsaab Teodor Tchalakov |
Jiankun Yang | Viktor Gruev | proposal1.pdf |
|
# Automatic Water Quality Monitoring using Test Strips Team Members: - Fahad AlSaab (fahadma) - Abdullah AlSufyani (aaa25) - Teodor Tchalakov (ttcha2) # Problem Using water quality testing strips to identify key characteristics can be time-consuming. Each color strip can have different color scales and varying wait times before the chemical agent provides valid results. While it is true that some tests, such as pH, have digital alternatives, these alternatives tend to be more expensive, often require additional calibration, and sometimes do not exist for certain chemical tests. Consequently, automating water quality testing across a wider range of chemicals and substances continues to rely on test strips. # Solution Our solution is an automated system that applies water to a test strip and records its values. The enclosed system consists of a mechanism to dispense water onto a test strip. It then waits for the chemical reactions to complete and reads the color results using sensors. A mechanism will replace the used test strips with a fresh one from a storage stack, ensuring multiple days to weeks worth of testing before needing user replacement. Water will be dispensed using a solenoid, with water sourced either from a reservoir or a home water inlet. The colors will be measured using either color sensors or a digital camera, with LED illumination for consistency. This system enables automated daily monitoring with fresh water samples compared to other water quality testing designs. It expands the range of testable chemicals by leveraging traditional test strips while maintaining affordability by avoiding expensive digital water sensors. The system will be evaluated based on its ability to reliably execute the testing cycle and the accuracy of its color reading compared to human observations. # Solution Components ## Test Strip Storage Cartridge This subsystem stores and dispenses test strips. 
The strips are stacked vertically and dispensed using a roller mechanism similar to a printer's. The cartridge ensures that a fresh test strip is available for each test cycle. ### Components Motorized roller mechanism Vertical test strip storage compartment Sensor to detect the presence of test strips. ## Feeder System The feeder system transports test strips from the storage cartridge to the testing chamber. It ensures proper alignment and positioning of the strip for water application and water detection. ### Components Stepper motor with precision control Guide rails for strip movement Optical sensor for strip alignment verification Adafruit Motor Shield (https://www.adafruit.com/product/169) ## Water Reservoir and Droplet Dispenser ### Components: Solenoid valve for controlled water dispensing Water reservoir with level sensor Tubing and nozzle for precise droplet application ## Test Strip Color Sensor This module measures the color of each square of the test strip and has illumination via onboard LEDs to make reading the color more accurate. We will use a color-sensing chip to test its accuracy first and switch to a conventional camera if we do not get the accuracy we want. The TCS3472 color light to digital converter chip provides us with color measurements. ### Components: RGB color sensor (e.g., the TCS3472 color light-to-digital converter) LED illumination for consistent lighting ## Displaying Results Print the measured concentrations of chemicals and minerals found in the water over a serial USB connection. ## Power System Powered from a standard wall outlet using an AC to DC converter. # Criterion For Success 1. The cartridge system is reliably able to dispense test strips to the feeder system. Able to do at least 5 water quality testing cycles automatically without jamming. 2. The feeder system can move and position the test strip underneath the water droplet dispenser and color sensor within half a centimeter. 3. 
The water droplet dispensing system can accurately dispense exactly one drop of water at a time onto the square chemical papers such that the test square is fully saturated. 4. The color-sensing system can determine the concentration for each test within 10% accuracy compared to a human’s reading of the same test strip. 5. The system can reliably store used test strips in a removable container for the user to dispose of. |
||||||
68 | Power-Factor-Corrected Musical Tesla Coil |
Ali Albaghdadi Kartik Singh Maisnam |
Shengyan Liu | Arne Fliflet | proposal1.pdf |
|
# Gentle Giant: A Power-Factor-Corrected Musical Tesla Coil Team Members: - Ali Albaghdadi (aalba9) - Kartik Maisnam (maisnam2) # Problem Tesla coils are impressive visual and auditory devices; some can produce a surprising range of sounds using arc discharges, and thus have found uses as display pieces in entertainment and STEM education. A particularly large one is permanently mounted to a ceiling inside the Museum of Science and Industry in Chicago. However, for the majority of their existence, they have been crude instruments. The way they are built and operate typically results in a suboptimal use of AC power, also known as a poor power factor, and even with the advent of "solid-state" Tesla coils (SSTCs) that use power semiconductors, the problem has not improved. Areas with lower-voltage mains like the United States are often at a disadvantage due to details in many of these implementations. Further, when scaling up to large Tesla coils for use in performances, they can have a significant effect on the grid. Solving these problems can improve the efficiency and portability of these novelty constructions. # Solution We aim to build, for a comparatively low cost, a Dual-Resonance Solid State Tesla Coil (DRSSTC) with an active Power Factor Correction (PFC) front end. The combination of these two advancements puts our Tesla coil at the very forefront of Tesla coil hardware technology, and solves many of the technical issues with other modern designs. Some background: Tesla coils are effectively giant transformers, with a secondary winding that has many times more turns than the primary. Conventional SSTCs operate by first rectifying mains AC to a high-voltage DC, then using a half-bridge or full-bridge of power semiconductors to switch the primary of the Tesla coil. This results in a very large voltage being generated in the secondary, which causes it to release arc discharges. 
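Since the primary of such a coil forms an LC tank with its capacitor, the standard resonance formula f = 1/(2π√(LC)) governs the tuning. A quick sanity-check sketch; the 10 µH and 68 nF values below are purely illustrative, not the actual design values:

```python
import math

def lc_resonant_hz(l_henries: float, c_farads: float) -> float:
    """Resonant frequency of an LC tank: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(l_henries * c_farads))

# Illustrative primary tank: a 10 uH winding with a 68 nF capacitor
primary_hz = lc_resonant_hz(10e-6, 68e-9)  # lands near 193 kHz
```

In a dual-resonance design, the secondary's resonance must be tuned close to this primary figure for energy to build up efficiently.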
A major benefit that DRSSTCs like ours bring over SSTCs is that they operate more like resonant converters. In the design phase of the transformer, the primary and secondary must be tuned to have close LC resonant frequencies. During operation, feedback from the primary is used to switch it at its resonant frequency, which results in energy being built up in the system more quickly and more impressive arc discharges. This energy buildup must be stopped intermittently by an external PWM signal called an interrupter (which can simultaneously be used to modulate music into the arc discharges). The primary feedback also enables zero-current switching (ZCS), reducing thermal losses in the power stage to near zero. We choose to improve even further by designing a digitally controlled boost-type active PFC to create the high-voltage DC rail. This brings with it several benefits of its own, like improving system power factor, making the system agnostic to mains voltage and frequency, and allowing for smooth capacitor precharging without the use of a separate precharge circuit. With a high power factor, both of the following are possible: 1. For the same apparent AC power, the generated arcs can be larger 2. Arcs of the same size can be generated for less apparent AC power Thus the whole system consists of the PFC, the feedback controller, the power stage, and the transformer. # Solution Components ## Boost-type PFC Stage This subsystem draws power from the AC mains and creates a 400-volt DC rail. It is digitally controlled using an STM32F103 microcontroller, which allows it to ramp the voltage for precharging and compensate for different mains voltages and frequencies. A boost-type PFC consists of a bridge rectifier, an input inductance, an output capacitance, a FET and an individual diode. We plan to use the Panjit KBJB bridge rectifier, Rohm SCT3120ALHR SiC FET and Wolfspeed C6D04065A SiC diode. Since we only need one of each in the product, their costs are negligible. 
A Texas Instruments UCC5710x gate driver can be used to allow the STM32F103 to drive the FET. The projected frequency of switching is 50kHz. ## Feedback controller This subsystem implements a simple ZCS feedback controller using comparators and digital logic chips, and utilizes a long plastic optical cable to safely and remotely play simple musical notes via PWM (this is the interruptor signal). The optical receiver will be an Industrial Fiber Optics IF-D95T, which is an inexpensive device that has been highly proven in Tesla coil design history. Though in theory the microcontroller could also perform the logic task, we felt that it would not have low enough latency. The feedback itself is provided by a current transformer made of a Fair-Rite #77 ferrite core, which feeds into a burden resistor. Microchip MCP6561 comparators perform the zero crossing detection, and 74HCT logic chips manipulate the signal, combine it with the interruptor signal, and create gate drive waveforms for the power stage. ## Power stage The power stage simply consists of a full bridge of four 60N65 IGBTs, and the primary LC is connected in the middle. The switches are driven by gate drive transformers (GDTs) to save cost and complexity versus developing a solution with isolated gate drive ICs. GDTs have been by far the leading solution to drive SSTC power semiconductors, and there is little incentive to do otherwise. ## Transformer This is the Tesla coil itself. It will stand at around three feet tall once completed. It has no electronic components, but its physical design places some constraints on the electronic components. Preliminary calculations place the resonant frequency of the primary at around 200kHz. # Criterion For Success A PWM generator with an optical transmitter needs to be able to remotely start and operate the Tesla coil, causing it to release arc discharges. 
The arc discharges should be at least 1 foot in length, and the power factor of the whole system needs to be above 0.95 during normal operation. |
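The ~200 kHz primary resonance quoted above follows from the standard series-LC formula f = 1/(2π√(LC)). A minimal sketch, with illustrative component values (not taken from the proposal) that happen to land near that figure:

```python
import math

def lc_resonant_frequency(inductance_h: float, capacitance_f: float) -> float:
    """Series LC tank resonance: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

# Illustrative values only (not from the proposal): a ~63 uH primary
# with a ~10 nF tank capacitor lands near the quoted ~200 kHz.
f = lc_resonant_frequency(63e-6, 10e-9)
print(f"primary resonance = {f / 1e3:.0f} kHz")
```

The same formula also shows why tuning matters: a few percent of drift in either L or C moves the resonance by roughly half that percentage, which the primary-feedback switching then tracks automatically.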
||||||
69 | Shamir Secret Self-Destruct USB |
Alex Clemens Danny Metzger Varun Sivasubramanian |
Michael Gamota | Viktor Gruev | proposal1.pdf |
|
# Team Members - Varun Sivasubramanian (vsiva4) - Alex Clemens (clemens9) - Danny Metzger (djm14) # Problem Traditional USB flash drives pose a security risk if lost or stolen, especially for highly sensitive data such as cryptographic keys, classified documents, or personal information. Even encrypted USBs rely on software-based security, which is vulnerable to forensic recovery or brute-force attacks. Physical destruction, like crushing or snapping, may still leave recoverable data on the drive. Furthermore, USB devices often do not enforce security on the device itself. # Solution A custom USB flash drive with built-in cryptographic security and hardware self-destruction, ensuring that sensitive data cannot be recovered under any circumstances. The system will: - Encrypt and split the drive’s decryption key using Shamir’s Secret Sharing across multiple physical hardware keys, requiring a threshold number of shares (2 of 3) to reconstruct the key and decrypt the data. - Trigger a hardware-based self-destruct mechanism under various circumstances. - Ensure complete destruction by physically rendering the flash memory unreadable. # Solution Components ## Subsystem 1: Shamir Secret & YubiKey Authentication Purpose: Ensures multi-factor authentication and prevents software access by restricting key reconstruction to hardware. Components: - Microcontroller: ESP32, STM32, or similar. Handles reading the YubiKeys, managing key reconstruction, and triggering destruction. - Secure Element: AES-256 capable and tamperproof; handles all cryptographic operations. - 3 USB-C YubiKeys: Each holds one share of the Shamir secret in a 2-of-3 scheme. Upon first connection, the user sets up the Shamir secret by plugging in all YubiKeys and initiating the MCU and SE to create the shares. ## Subsystem 2: Storage System Purpose: The flash drive should still have traditional storage and functionality. 
Conceals encrypted portion unless Shamir is reconstructed. With partitioning, an unencrypted partition should also be allowed. Components: - Flash NAND storage: Any small size (8-16GB) is good. Should support basic partitioning. - USB Mass Storage Controller: Facilitates communication with the computer. - External USB-C ports: Allow YubiKeys to be connected to the PCB - USB-A or USB-C interface: Plugs into the computer. The encryption of the storage will be done by the secure element. ## Subsystem 3: Hardware Self-Destruction Purpose: Ensures that if there is a potential attacker, the storage is permanently destroyed. The exact method of self-destruction is contingent on circuit design, but a voltage overload is most feasible. Components: - Boost Converter: Steps voltage to create destruction. - MOSFET: Switches from normal functioning to destruction voltage. - 2 LiPo or CR2032 batteries: Allows destruction to take place even when unplugged. - Tamper detection circuit: A circuit that detects when two pins are no longer in contact i.e. when the casing has been opened up. Trigger Mechanisms: There are multiple triggers that lead to frying the NAND. Multiple YubiKey fail attempts, opening the physical casing, or attempting to access the Secure Element should trigger the self-destruction. The MOSFET should direct high voltage directly to the NAND, irreversibly damaging memory. # Criterion for Success 1. Shamir Secret: The Shamir key can only be reconstructed via firmware on the physical drive, not on a computer. 2. Irreversible: Destruction of the NAND is irreversible. Data should not be recoverable. 3. Tamper-Resistant: Removing casing or tampering with the SE should lead to destruction. |
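For reference, the 2-of-3 threshold scheme described above can be sketched in a few lines of Python. This is a pure-software illustration of Shamir's Secret Sharing over a prime field; the actual design would perform this on the MCU and secure element, with the shares stored on the YubiKeys rather than in memory:

```python
import random

PRIME = 2**127 - 1  # Mersenne prime, large enough for a 16-byte secret

def split_secret(secret: int, threshold: int, shares: int):
    """Split `secret` into `shares` points; any `threshold` of them
    reconstruct it, fewer reveal nothing."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, shares + 1)]

def reconstruct(points):
    """Lagrange interpolation at x = 0 over GF(PRIME)."""
    secret = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # pow(den, PRIME - 2, PRIME) is the modular inverse (Fermat)
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = split_secret(123456789, threshold=2, shares=3)
assert reconstruct(shares[:2]) == 123456789   # any 2 of the 3 suffice
assert reconstruct(shares[1:]) == 123456789
```

A single share is a point on a random line and so gives no information about the secret, which is exactly the property that makes one stolen YubiKey useless to an attacker.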
||||||
70 | Automatic Drum Tuner |
Joey Bacino Jonathan Fejkiel Max Wojtowicz |
Shengyan Liu | Yang Zhao | proposal1.pdf |
|
Members Joey Bacino - jbacino2 Jonathon Fejkiel - jfejkiel2 Max Wojtowicz - mwojt3 Problem Playing instruments is a pastime enjoyed by millions of people across the world. A task that almost every musician must endure before playing is tuning their instrument. For many this is done easily if they are able-bodied and have a good ear for pitch. However, turning lugs and listening for the right pitch can be difficult for someone weaker, such as a child or an elderly person, or for someone inexperienced at hearing pitch, such as a beginner. Solution The solution we propose is an automatic tuner for instruments that will adjust the instrument until the desired pitch is reached. We will specifically design our tuner for use on drums. The device will strike the drum, listen for the pitch, calculate how much it should either tighten or loosen the drum, and instruct a motor to do so. It will perform extra checks to ensure the drum was adjusted properly. Additionally, if time permits, the mechanism will connect to a mobile app for pitch selection. Subsystem 1 Power Management System: To have enough power for striking the tuning hammer and turning the lugs of the drum, we will utilize a power tool battery such as a Milwaukee M12 battery system. The same battery will power the microcontroller and sensors, so it must be regulated to the correct voltages to ensure the safety of the components and the user. The power management subcircuit will have over-current and over-voltage protection components such as fuses and diodes. A buck converter will step the 12V supply down to the required inputs of the rest of the components. Subsystem 2 Drum Striking Hammer: For the motor that drives the hammer striking the drum, we will use a push-pull solenoid, because it can provide a consistent and quick tap. 
Consistency is important around the entire drum: we need to make sure each strike is the same for every hit on every lug we want to tune. A quick tap also allows the drum to resonate fully rather than dampening the hit by leaving the hammer on the drum head. This matters because we want our pitch detection to hear the purest, most dominant tone around each lug without interference. Minimizing overtones will simplify our pitch detection system, as we want as close to a single tone as possible at any given time. We will also experiment with different hammer materials, such as rubber, wood, and felt, to see which gives the best result. Subsystem 3 Pitch Detection: To detect the current pitch of the drum, a microphone will record audio after the hammer has struck. The raw audio data will be converted to frequency-domain data on the microcontroller using a Fast Fourier Transform algorithm. The dominant frequency will be taken as the pitch of the drum. Based on the desired note, the microcontroller will then decide whether the drum needs to be tightened or loosened and by what amount. Subsystem 4 Tuning Motor Control: For the motor turning the lugs, we want to use a high-torque servo motor. High torque is a requirement because tuning a drum higher requires more and more torque as the drumhead provides increasing resistance against the tuning lugs. Servo motors also offer precise control with feedback, so we can calibrate the motor to each lug and precisely determine how much the pitch changes per amount of rotation. Subsystem 5 Pitch Correctness LEDs: The device will have LEDs that indicate to the user whether the current pitch of the drum is correct, close, or far from the desired pitch. 
It will begin lighting when the drum is first struck. Every time the drum is struck after a pitch adjustment, the LEDs will display a different color so the user knows the progress of the tuning. Green will be displayed and stay lit once the device has finished tuning, indicating that the drum is ready to play. While the device is not actively tuning, the LEDs will stay lit blue to indicate standby mode. Criterion For Success Our first criterion for success is accurately detecting pitch with our pitch detection system, as that is the basis for how the two motors act. The second is repeatability: our system should return consistent pitch readings and tuning results across multiple tests. The third is accurate striking of the drum; the strike cannot be too fast or too slow and must last the correct length of time. The fourth is that our lug-turning motor can accurately turn the lugs to the desired pitch without too many intermediate hammer strikes and adjustments. We also want minimal noise and interference from our motors. |
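The pitch-detection step in Subsystem 3 amounts to finding the largest-magnitude bin of a Fourier transform. A brute-force Python sketch on a synthetic "drum" tone (the real firmware would run an optimized radix-2 FFT on the microcontroller; the 110 Hz test tone and sample rate are arbitrary choices for illustration):

```python
import cmath
import math

def dominant_frequency(samples, sample_rate):
    """Brute-force DFT magnitude peak; firmware would use a radix-2 FFT."""
    n = len(samples)
    best_bin, best_mag = 0, 0.0
    for k in range(1, n // 2):                     # skip the DC bin
        s = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        if abs(s) > best_mag:
            best_bin, best_mag = k, abs(s)
    return best_bin * sample_rate / n

# Synthetic stand-in for a recorded strike: a 110 Hz fundamental plus
# a weaker 220 Hz overtone, 0.5 s of audio at 1 kHz sampling.
fs, n = 1000, 500
tone = [math.sin(2 * math.pi * 110 * t / fs)
        + 0.3 * math.sin(2 * math.pi * 220 * t / fs) for t in range(n)]
print(dominant_frequency(tone, fs))  # 110.0
```

The frequency resolution is sample_rate / n (here 2 Hz), which is why the recording window length matters as much as the sample rate when resolving nearby drum pitches.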
||||||
71 | Automatic Light Switch |
Andrew Kim Ruize Sun Sun Lee |
Eric Tang | Yang Zhao | proposal1.pdf |
|
# Automatic Light Switch Team Members: - Sangsun Lee - Sangsun2 - Andrew Kim - Akim229 - Ruize Sun - Ruize2 # Problem Many buildings and rooms still use traditional, non-smart light switches, requiring individuals to manually turn lights on and off. Upgrading these switches to smart ones typically involves removing the existing switch and installing a smart light switch in its place. However, for people living in rented rooms or apartments, this option may not be feasible, as they do not own the property and are often restricted from making permanent changes to the electrical fixtures. This limitation creates a challenge for renters or those in temporary living arrangements who want the convenience and energy-saving benefits of smart lighting systems without violating lease agreements or incurring high installation costs. Moreover, current solutions for retrofitting smart functionality are either limited in functionality, expensive, or complicated to install, making them inaccessible to the average tenant. As a result, there is a growing need for innovative, non-invasive solutions that enable smart functionality without requiring structural modifications to existing light switches or electrical wiring. # Solution The solution to this problem is to design a smart switch that can be easily mounted over the existing light switch without requiring any modifications to the electrical wiring or permanent changes to the property. This smart switch would fit seamlessly over the traditional switch, allowing users to control their lights both manually and remotely. To enhance convenience, we will also develop a companion mobile app that allows users to control the smart switch wirelessly. This solution ensures that renters can enjoy the benefits of smart lighting without violating lease agreements, while also offering an affordable, non-invasive, and user-friendly experience. 
# Solution Components Subsystem - Voice Control: We will add voice recognition modules to the STM32 microcontroller. The two main modules will be the MP23ABS1 and STEVAL-MIC008A, which include microphones and enable DSP solutions for implementing voice control. We will also use the X-CUBE-AUDIO-KIT software expansion to code the DSP algorithm that recognizes specific sounds in order to control the light switch. With voice control, the user can ask the device to turn the light on or off. Subsystem - Wi-Fi: We will add an ESP32 microcontroller or an Inventek ISM43362-M3G-L44. This will act as a Wi-Fi module connecting the device to the network, so that the app can control the device from anywhere. Alternatively, we may use a Wi-Fi-enabled STM32 board. Subsystem - App: We will use an emulator to develop and test the app. We will code it in Java; it will contain a simple on/off switch that connects to the light switch through Wi-Fi, sending signals through the Wi-Fi subsystem's microcontroller. Subsystem - Power: We will use two AA batteries and a battery case. Subsystem - Mechanical: We will design a box to cover the home switch. Inside the box, we will use a stepper motor (NEMA 17) and a gear transmission to actuate the home switch. The motor can be controlled by the app. We will design a gear and place two rods on it to clamp the switch; as the gear rotates, the rods move up or down, driving the switch up and down. If the motor does not have enough force to flip the switch, we will extend the rod as much as possible and use the principle of leverage to reduce the required torque. # Criterion For Success Our solution can seamlessly turn the light on and off without noticeable delay. Using Wi-Fi, the user can control the box from a distance. The user can also control the box's light-switching functionality by voice. 
Our idea is different from previous solutions because we are implementing voice control, Wi-Fi, and app functionality, so that the user can seamlessly turn the light on and off without any body movement, from any distance, with just the touch of a button in the app. |
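The leverage trick mentioned for the mechanical subsystem is just the torque relation F = τ / r: doubling the rod length halves the force the rods must exert to flip the switch. A quick sketch with a hypothetical switch torque (the 0.06 N·m figure is an assumption for illustration, not a measured value):

```python
def required_force(switch_torque_nm: float, lever_arm_m: float) -> float:
    """Force needed at distance r from the pivot: F = tau / r."""
    return switch_torque_nm / lever_arm_m

# Hypothetical switch needing 0.06 N*m at its pivot: a longer rod
# cuts the force (and hence the motor torque demand) proportionally.
for arm_cm in (1, 2, 4):
    print(f"{arm_cm} cm arm -> {required_force(0.06, arm_cm / 100):.1f} N")
```

Measuring the real switch's breakaway torque with a spring scale would replace the assumed value and confirm whether the chosen stepper has margin.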
||||||
72 | AquaSense: Affordable Water Quality Monitoring for Aquariums |
Anurag Ray Chowdhury |
Michael Molter | Arne Fliflet | proposal1.pdf |
|
AquaSense: Affordable Water Quality Monitoring for Aquariums Team Members: - Arnav Garg (arnavg8) - Michael Yan (myan13) - Anurag Ray Chowdhury (anuragr3) # Problem Maintaining proper water quality in aquariums is essential for the health of aquatic life. However, many hobbyists and small-scale fishkeepers lack access to affordable, real-time water monitoring solutions. Traditional methods involve manual test kits that require frequent intervention, making it difficult to detect rapid fluctuations in pH, temperature, or dissolved oxygen that could be harmful to fish and aquatic plants. Additionally, existing automated water quality monitoring solutions are often expensive and designed for industrial-scale applications, leaving home aquarium owners with few accessible options. To bridge this gap, we need a low-cost, plug-and-play solution that continuously monitors water conditions and provides real-time alerts when the water quality becomes unsuitable for fish. # Solution We propose AquaSense, a low-cost, ESP32-based, plug-and-play PCB that integrates various water quality sensors to provide real-time monitoring for aquarium owners. This system will: - Continuously measure pH levels, temperature, and dissolved oxygen in an aquarium. - Use an ESP32 for wireless connectivity to send alerts and log data on a web dashboard. - Provide automated alerts via mobile notifications (Wi-Fi or Bluetooth) when water parameters fall outside safe ranges. By offering a cost-effective and easy-to-use monitoring system, this project aims to make aquarium care more accessible and stress-free, especially for beginner fishkeepers. # Solution Components ESP32 (Main Controller) - Handles sensor data processing and wireless communication (Wi-Fi/Bluetooth). - Stores water quality thresholds and triggers alerts when needed. - Sends data to a mobile/web dashboard for remote monitoring. Water Quality Sensors pH Sensor (SEN0161) - Measures acidity/alkalinity of the water. 
- Important for maintaining fish health, as extreme pH levels can be deadly. Temperature Sensor (DS18B20) - Tracks water temperature. - Ensures stable conditions for tropical and cold-water species. Dissolved Oxygen Sensor (DFRobot Gravity DO) - Measures oxygen levels in the water. - Prevents hypoxia (low oxygen), which can be fatal for fish. Turbidity Sensor (Analog Turbidity Sensor) (If we have time) - Measures water clarity (detects debris or algae overgrowth). PCB - Provides a plug-and-play interface for all sensors to interface with the ESP32 - Includes voltage regulators for stable sensor operation. Connectivity & Data Logging - Wi-Fi/Bluetooth (ESP32) → Sends alerts to a mobile app. - Web dashboard or mobile app integration for real-time monitoring. # Hardware Background The ESP32 microcontroller is at the core of this system, acting as the central processor to collect, process, and transmit water quality data. Unlike Raspberry Pi or expensive industrial controllers, the ESP32 is: - Cost-effective (~$5-$10 per unit). - Power-efficient (supports battery or direct power operation). - Built for IoT applications (Wi-Fi + Bluetooth connectivity). The pH, temperature, and dissolved oxygen sensors are widely used in aquaculture and environmental monitoring. By integrating them into a single, easy-to-use PCB, we enable affordable real-time monitoring for small aquarium owners. # Future Expansion The AquaSense platform has the potential to grow into a scalable IoT solution for home and commercial aquariums. Future expansions could include: - AI-based predictive analytics → Detect trends in water quality over time. - Automated filtration control → Trigger water changes when quality declines. - Additional sensors → Include ammonia and nitrate detection for fish tanks. - Battery-powered version → For outdoor ponds and fish farms. 
- By making smart water quality monitoring accessible, this project could benefit not just aquarium owners but also educators, researchers, and environmental conservationists. # Criterion for Success Affordability - The system should cost under $50-$80, making it cheaper than industrial solutions while maintaining accuracy. Ease of Use - The PCB should support plug-and-play functionality with clear user documentation. - Should be easy to install in any size aquarium. Functionality - Must successfully monitor pH, temperature, and dissolved oxygen with real-time updates. - Provide instant feedback via mobile notifications. Power Management - The device should run on a single power input (5V USB or battery) with proper voltage regulation. - Should include low-power standby mode for energy efficiency. Scalability and Expandability - The design should allow future integration with additional sensors. - Support remote data logging and historical tracking for advanced users. Performance - Sensor accuracy should be within ±0.2 for pH, ±0.5°C for temperature, and ±0.5 mg/L for dissolved oxygen. - The system should provide real-time updates every 5-10 seconds. |
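The alerting logic on the ESP32 reduces to comparing each reading against a stored safe range. A small Python sketch of that check (the ranges shown are assumptions for a generic tropical freshwater tank; the proposal keeps the actual thresholds configurable on the device):

```python
# Assumed safe ranges for a generic tropical freshwater tank; the
# real thresholds would be stored and configurable on the ESP32.
SAFE_RANGES = {
    "ph":      (6.5, 7.8),
    "temp_c":  (24.0, 27.0),
    "do_mg_l": (5.0, 12.0),
}

def check_reading(metric: str, value: float):
    """Return None when the reading is in range, else an alert string
    suitable for pushing to the mobile app / web dashboard."""
    lo, hi = SAFE_RANGES[metric]
    if lo <= value <= hi:
        return None
    direction = "low" if value < lo else "high"
    return f"ALERT: {metric} is {direction} ({value} outside {lo}-{hi})"

print(check_reading("ph", 7.0))       # in range -> None
print(check_reading("do_mg_l", 3.2))  # low dissolved oxygen -> alert
```

In practice the firmware would also debounce alerts (e.g. require two consecutive out-of-range samples) so a single noisy reading doesn't page the user.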
||||||
73 | Climate Control Grow Box |
Andrea Gardner Gabrielle Wilki Rhea Tiwari |
Surya Vasanth | Arne Fliflet | proposal1.pdf |
|
# Climate Control Grow Box Team Members: - Gabrielle Wilki, gwilk2 - Andrea Gardner, agardn7 - Rhea Tiwari, rtiwari3 # Problem Improper climate is often the cause of death for house plants. In winter the temperature gets too cold, but other factors such as humidity, improper lighting, and water quantity can also play a role in a plant's death. Current options for climate control are limited to larger areas: climate units designed to control a whole room, or house-wide systems that require the humans who own the plants to live in the same environment as their plants. Both of these have the flaw of being unable to isolate a specific desired climate within limited square footage, and neither is suitable for a person trying to grow just a few plants. # Solution We propose a climate control grow box that will be able to regulate humidity, light, and airflow, providing a small climate-controlled area for plants in the home. By controlling these variables within a smaller floor plan, we hope to help gardeners who find themselves in apartments or other low-square-footage housing. We plan to do this by collecting a variety of sensor inputs into an ATmega32U4 or similar board, which will control the required components of each subsystem. # Solution Components ## Humidity Control Subsystem This subsystem will incorporate an AM2302 digital humidity sensor that monitors the active humidity level inside the grow box. Its readings will drive the exhaust fan and the humidifier to keep the humidity within the desired range. If needed, we will add a dehumidifier to this system in order to lower the humidity within the enclosure alongside the exhaust fan. ## Light Control Subsystem After a selection from the user, we will set the grow lights to the desired level of brightness. 
By using photodiodes of different sensitivities, we will approximate the light levels and turn the interior lights on and off to help the plants grow. ## Water Control Subsystem The user will set a desired quantity of water to provide a plant; at certain times of day a small water pump will activate and send water from a reservoir into the main section of the enclosure. Water output will be measured to ensure that the plants are not over- or under-watered. This feature can be turned off. ## Power Subsystem The grow box is meant to be a stationary product, so we intend to power it from wall power. We will use a transformer to step the grid voltage down, then an AC-to-DC converter to power the majority of the system, except for the fan, which may need an AC input. If that is the case, we will set up a DC-to-AC conversion for the fan. # Criterion For Success For us to consider the grow box a success, it must have a measurable degree of control over the interior environment. Specifically, it should keep the humidity level consistently at least 10% higher or lower than the exterior environment, control light luminosity with at minimum dim, normal, and bright settings, and water the plants with a user-designated volume of water (e.g., 1 cup, 2 cups, or a liter). The grow box must also be aesthetically pleasing and fit within the bounds of a piece of furniture or smaller, as the box is intended for interior home use. 
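The humidity subsystem's fan/humidifier decision can be sketched as bang-bang control with a deadband, so the two actuators never run simultaneously or chatter around the setpoint. A Python sketch of the logic the microcontroller would run (the 5 % band is an assumed value):

```python
def humidity_actuators(rh: float, target: float, band: float = 5.0) -> dict:
    """Deadband (bang-bang) control: act only when relative humidity
    strays more than `band` percentage points from the target."""
    if rh < target - band:
        return {"humidifier": True, "exhaust_fan": False}   # too dry
    if rh > target + band:
        return {"humidifier": False, "exhaust_fan": True}   # too humid
    return {"humidifier": False, "exhaust_fan": False}      # in band

print(humidity_actuators(52.0, 65.0))  # too dry -> humidifier on
print(humidity_actuators(64.0, 65.0))  # inside deadband -> both off
```

The deadband is what prevents the humidifier and exhaust fan from toggling on every AM2302 sample as the reading hovers near the setpoint.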
# Component Links ## Humidity system https://www.amazon.com/ATIMOSOS-AM2302-Digital-Temperature-Humidity/dp/B073TW7V1T?source=ps-sl-shoppingads-lpcontext&ref_=fplfs&psc=1&smid=AXI6THZLFAC34&gQT=1 https://www.amazon.com/Absorber-Ventilator-Ventilation-Extractor-Electronics/dp/B0CFDRYQC1/ref=asc_df_B0CFDRYQC1?mcid=2ec5371709a13dad9cc8c2102333736f&hvocijid=4951330560480715156-B0CFDRYQC1-&hvexpln=73&tag=hyprod-20&linkCode=df0&hvadid=721245378154&hvpos=&hvnetw=g&hvrand=4951330560480715156&hvpone=&hvptwo=&hvqmt=&hvdev=c&hvdvcmdl=&hvlocint=&hvlocphy=9022185&hvtargid=pla-2281435178058&psc=1 https://www.amazon.com/Four-Spray-Humidifier-Module-Atomization/dp/B0D83Y9858/ref=asc_df_B0D83Y9858?mcid=fe1a5c27b85a38e5bdb290f5b056f973&hvocijid=1275569093432181403-B0D83Y9858-&hvexpln=73&tag=hyprod-20&linkCode=df0&hvadid=721245378154&hvpos=&hvnetw=g&hvrand=1275569093432181403&hvpone=&hvptwo=&hvqmt=&hvdev=c&hvdvcmdl=&hvlocint=&hvlocphy=9022185&hvtargid=pla-2281435177138&psc=1 ## Water system https://www.amazon.com/PULACO-Submersible-Fountain-Aquarium-Hydroponics/dp/B07Y27SVPP/ref=sr_1_4_sspa?dib=eyJ2IjoiMSJ9.wFzhuUYLdzsMIHyRh7VRepFtbTWMe32hOfuv5qIdWWVApI9D9WOwEJl2VNUzeefX2jo96g42kQ5A66ob5aYsXETwbuvDWaQ-09R2Nu56Mcqin53-2vuYmWtQeoE2dNu1Uxbh3CgVYX9kk7-KGLX3__adx17ZJvreu8wxZX4uTha-Z6d04bA8hxiWqJ7mpBt5XISRfb7rdzXh98z_MS36KvrwZjdIzeFYs6nQxVk9A2fzud5SSUzyP2ByGBYKaSgsNF6ugiBAXEHXzLR_g3NOaEDD0cuD-AGoxXtsYqVbPnQ._7yWlm2FV1WQsNP-QrerlETaOn0psdLFGAeUB291rYM&dib_tag=se&keywords=water+pump+small&qid=1738707885&sr=8-4-spons&sp_csd=d2lkZ2V0TmFtZT1zcF9hdGY&psc=1 ## Light system: https://www.amazon.com/Plant-Light-Spectrum-Indoor-Flower/dp/B07VG1282Q/ref=asc_df_B07VG1282Q?mcid=4c6dab575399331cbea837f16a1490cb&hvocijid=9516987453758452840-B07VG1282Q-&hvexpln=73&tag=hyprod-20&linkCode=df0&hvadid=721245378154&hvpos=&hvnetw=g&hvrand=9516987453758452840&hvpone=&hvptwo=&hvqmt=&hvdev=c&hvdvcmdl=&hvlocint=&hvlocphy=9022185&hvtargid=pla-2281435177378&th=1 |
||||||
74 | Affordable Soil Moisture Monitoring for Home Gardeners |
Arnav Garg Michael Yan |
Sanjana Pingali | Arne Fliflet | proposal1.pdf |
|
# Team Members: - Arnav Garg (arnavg8) - Anurag Ray Chowdhury (anuragr3) - Michael Yan (myan13) # Problem: Maintaining soil moisture is crucial for plant health, especially for home gardeners, farmers, and plant enthusiasts. Many individuals struggle to monitor soil conditions effectively, leading to over- or underwatering, which damages plants. Traditional methods involve manual soil moisture testing, requiring frequent checks and guesswork. Advanced irrigation systems exist but are expensive and designed for large-scale agriculture, making them inaccessible for home gardeners. To bridge this gap, we need a low-cost solution that monitors soil moisture and environmental conditions while providing alerts when needed. # Solution: We propose PlantSense, a low-cost, ESP32-based PCB that integrates environmental sensors to provide real-time monitoring for gardeners. This system will: - Continuously measure soil moisture, temperature, and humidity. - Use an ESP32 for wireless connectivity to send alerts and log data on a web dashboard. - Provide automated alerts via mobile notifications when soil conditions become unsuitable. 
By offering an affordable smart plant care system, PlantSense makes precise plant care accessible to beginners. # Solution Components ESP32 (Main Controller) • Handles sensor data processing and wireless communication • Stores soil moisture thresholds and triggers alerts • Sends data to a mobile/web dashboard for remote monitoring # Soil & Environmental Sensors Soil Moisture Sensor (Capacitive Soil Moisture Sensor v1.2) • Measures water content in the soil • Helps prevent overwatering and underwatering Temperature & Humidity Sensor (DHT11/DHT22) • Monitors air temperature and humidity around plants • Useful for optimizing growing conditions for different plant species Light Sensor (BH1750) • Measures ambient light levels • Helps ensure plants receive enough sunlight PCB (Custom Hardware Board) • Provides plug-and-play connectivity for all sensors • Includes voltage regulators for stable sensor operation Connectivity & Data Logging • Wi-Fi/Bluetooth (ESP32) → Sends real-time data to a mobile app • Web dashboard/mobile app for historical tracking of soil moisture trends # Hardware Background The ESP32 is the core of the system, responsible for collecting, processing, and transmitting soil data. The ESP32 is also: • Affordable (~$5-$10 per unit) • Energy-efficient (supports battery or direct power operation) • Ideal for IoT applications (Wi-Fi + Bluetooth) The soil moisture, temperature, and humidity sensors are widely used in agriculture and indoor gardening. By combining them into a single smart monitoring system, we provide an affordable solution for plant owners. # Future Expansion PlantSense has the potential to evolve into a fully automated plant care solution. Future upgrades could include: • AI-based predictive analytics to detect watering patterns. • Automated irrigation control by connecting to a smart watering system. • Additional sensors for CO₂, nitrogen, and soil pH monitoring. • A battery-powered version for outdoor gardens. 
• Smart home integration with Alexa, Google Home, and Apple HomeKit. This system could benefit home gardeners, greenhouse operators, and small-scale farmers looking for an affordable, data-driven way to care for plants. # Criterion for Success Affordability • The system should cost under $50-$80, making it cheaper than commercial smart gardening systems. Ease of Use • The PCB should be plug-and-play with clear documentation. • Should be easy to install in potted plants and gardens. Functionality • Must successfully monitor soil moisture, temperature, and humidity with real-time updates. • Provide instant alerts via mobile notifications. Scalability and Expandability • The design should support future integration with more sensors. • Must support remote data logging for historical analysis. Performance • The system should provide real-time updates every 10-15 seconds to our web system. # Conclusion PlantSense is an affordable, plug-and-play monitoring solution that makes plant care easier and more precise. By bridging the gap between manual testing and expensive smart irrigation systems, it ensures optimal growing conditions for both beginner and experienced gardeners. |
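One firmware detail worth noting for the Capacitive Soil Moisture Sensor v1.2: its analog output falls as the soil gets wetter, so the raw ADC count must be inverted and calibrated. A hedged sketch of that mapping (the `dry`/`wet` counts are placeholder calibration points, e.g. readings taken in open air and in a glass of water, not datasheet values):

```python
def moisture_percent(adc_raw: int, dry: int = 3000, wet: int = 1200) -> float:
    """Map a capacitive probe's raw ADC count onto 0-100 % moisture.
    Capacitive probes read LOWER in wet soil, so the scale inverts.
    `dry` and `wet` are per-probe calibration counts (placeholders)."""
    adc_raw = max(min(adc_raw, dry), wet)   # clamp to the calibrated range
    return 100.0 * (dry - adc_raw) / (dry - wet)

print(moisture_percent(3000))  # 0.0   -> bone dry
print(moisture_percent(1200))  # 100.0 -> saturated
print(moisture_percent(2100))  # 50.0
```

Calibrating each probe individually matters because the dry/wet endpoints drift between units and with probe insertion depth.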
||||||
75 | Plant Hydration and Weather Integration System |
Aashish Chaubal Iker Uriarte Jaeren Dadivas |
Maanas Sandeep Agrawal | Arne Fliflet | proposal1.pdf |
|
Plant Hydration and Weather Integration System - Iker Uriarte (iuriar2) - Jaeren Dadivas (dadivas2) - Aashish Chaubal (achau7) # Problem Outdoor plant care is challenging due to varying weather conditions and the risk of overwatering or underwatering. Automated watering systems are available, but they don't consider saving water by accounting for upcoming rain. A smarter solution is needed to optimize plant hydration while conserving water. This idea could minimize the risk of overwatering, which some people consider worse than underwatering, and could maximize water efficiency. # Solution Our objective is to design a smart plant-watering system that monitors soil moisture levels and integrates weather forecasting to optimize water usage for outdoor plants. The system will delay watering if rain is forecasted, minimizing water waste. We will use an ESP32 microcontroller capable of querying the API of a weather forecasting service (such as AccuWeather). Our design will include a motorized water pump whose flow will depend on both weather data and moisture sensor readings. Finally, our system will be powered by rechargeable lithium batteries to provide the user with better long-term sustainability and cost efficiency. # Solution Components ## Subsystem 1: Control Board The ESP32 will be the microcontroller for this project. It offers built-in Wi-Fi and Bluetooth functionality, making the design process more efficient and less risky. Additionally, the ESP32 is a highly popular microcontroller with extensive documentation available. The ESP32 will receive soil moisture readings from the sensors. If the moisture level falls below a specific threshold (set manually based on the type of plant we choose), the ESP32 will use an API (e.g., AccuWeather) to get real-time weather data and check the rain precipitation probability. 
If the probability is high, the system will delay watering and wait for the rain to moisten the soil. Otherwise, it will activate a water pump to hydrate the plants. Lastly, the ESP32 will calculate the amount of water saved by comparing moisture levels when watering is delayed due to rain versus manually watering the plant with the water pump. Since the relationship between moisture levels and water usage is not linear, developing this calculation will be a challenging but essential part of the system. These results will be sent to a website that will display a graph showing water savings over time. ### Components Used: - ESP32-WROOM-32 ## Battery Subsystem Since our project targets outdoor plants, we will use rechargeable lithium-ion batteries. Standard lithium-ion cells tend to perform poorly in cold temperatures, so we will choose lithium-ion cells that are rated for cold-weather operation. We will connect our battery to a 5V boost converter and connect the converter to the ESP32 5V input. ### Components Used: - 2-pack 3.7V 18650 rechargeable Li-ion batteries, 3400mAh ## Watering and Sensors Subsystem We will use moisture sensors pinned into the soil to actively measure soil moisture levels. These moisture sensors will be SEN0114 units connected to the ESP32's 3.3V pins. The moisture sensors will work with our water pump, which will water the plants whenever the moisture levels are low and there is no forecasted rain. Our pump will be a JT-180A. ### Components Used: - SEN0114 - JT-180A (5V) ## Data Subsystem Our ESP32 will collect and upload data to a Firebase database, a cloud-based storage solution that will serve as our data repository. We'll build a custom website that connects to this database, allowing us to generate plots from the stored data.
In short, Firebase will be our dependable “storage box” for all the semester's data. ## Rain Detection Subsystem In order for our control board to detect rain, we will use a load cell + HX711 placed under a container (i.e., a cup) that will, in the case of rain, measure the change in weight of the container and the plant. The weight added by accumulated rainwater can then be used by the microcontroller to calculate the amount of water saved by holding back the watering system. This subsystem will also confirm to our control board that it is raining. ## Criterion For Success For our smart plant-watering system to be effective, it must autonomously sustain an outdoor plant by doing the following: it needs to accurately monitor the soil moisture levels (against a set moisture threshold) and the real-time weather forecast, and only activate the water pump accordingly (when the moisture level falls below the threshold and no rain is expected). Furthermore, it must delay watering when the probability of rain is above a set threshold. The system will calculate and display water savings by comparing the system's actual water usage against the baseline amount of water that would be used without weather-forecast information. |
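The control-board behavior described above reduces to a three-way decision. A minimal Python sketch of that decision logic (function name and threshold values are our own placeholders, not part of the proposal):

```python
def watering_decision(moisture, moisture_threshold, rain_prob, rain_threshold=0.6):
    """Decide whether to water now, delay for forecast rain, or do nothing.

    moisture / moisture_threshold: normalized soil readings (plant-specific setpoint).
    rain_prob: precipitation probability from the weather API (0.0-1.0).
    """
    if moisture >= moisture_threshold:
        return "skip"        # soil is wet enough; no action needed
    if rain_prob >= rain_threshold:
        return "delay"       # rain is likely; let the rain water the plant
    return "water"           # dry soil, no rain expected: run the pump

# Example: dry soil (0.15 < 0.30) with an 80% chance of rain -> delay watering
decision = watering_decision(0.15, 0.30, 0.80)   # "delay"
```

On the real device this function would run periodically on the ESP32, with `rain_prob` refreshed from the forecasting API.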
||||||
76 | Driver Fatigue System |
Julio Cornejo Vincent Ng |
Maanas Sandeep Agrawal | Arne Fliflet | proposal1.pdf |
|
Driver Fatigue System Team Members: * Julio Cornejo (jcorne23) * Vincent Ng (vng20) **Problem** When driving for prolonged periods, drowsiness produces key body movements and facial changes. If unmonitored, this drowsiness can pose dangerous conditions for other drivers and for the drivers themselves. Intoxication while driving is also a rampant issue; there is no universal breathalyzer that prohibits driving based on BAC. We propose a device that uses facial-recognition and eye-level-detection sensors and cameras to detect symptoms of fatigue on the road while also prohibiting intoxicated drivers from proceeding to drive. This device can monitor head position, detect yawns, and register long blinks. It can also track the driving duration and aggregate all these symptoms to detect fatigue. Once enough triggers are set, an app interface can assess the driver's tiredness or driving incapability via WiFi transmission. It can suggest and locate the nearest rest stop or call emergency contacts (set by the user). When certain drowsiness scores are reached, the user's BAC and live drowsiness rating will be displayed, accompanied by an onboard buzzer alert. **Solution** The system revolves around an algorithm that makes use of a variety of sensors and cameras. One is a breathalyzer that measures the blood alcohol concentration of the driver. Using the software, we monitor the live value, set triggers to call emergency contacts, and keep monitoring until the concentration is safe enough to resume driving. Drowsiness and tiredness can be detected in various ways, ranging from yawn frequency to long blinks and head tilts, all of which can happen suddenly. Most traditional cars use your position in a lane to track tiredness. Tracking head movement and analyzing the face for more key indicators provides more precise information that can be essential to identify when a driver needs to step away from the wheel and request a ride elsewhere.
The PCB can be housed in a small, compact enclosure, like a cube, that sits on any dashboard, with detachment features to trigger the BAC sensor correctly. **Solution Components:** **PCB Controller:** The device will have buttons to start and clear sensor measurements and to power the components. The ESP32-S3 is the microcontroller intended for this project since the WiFi connection is crucial. **Sensor and Device Subsystem** * OV2640 Camera Module (for eye & head movement detection) * BNO055 IMU (for head tilt & movement tracking) * DFRobot Gravity MQ-3 Sensor (for BAC detection) * OLED display (e.g., SSD1306) showing the live drowsiness score and BAC readings. * Buzzer (if there is a significant change in the drowsiness measure, we ring a buzzer as the first tier of alerts). **Power Subsystem** Power management (Li-ion battery) - a rechargeable lithium-ion battery with a runtime of about 10 hours would suffice for long drives. **Algorithm** The most ethically sensitive aspect of this project is designing a rational and fair approach to assess and set triggers for the alerts. We must set criteria for calculating a drowsiness score from periodic yawns, blink duration, and frequent head movements. We also have to determine, algorithmically, which sensor readings classify as a yawn or a blink, which can be difficult depending on the lighting conditions. The goal is for the algorithm to run in-house on the ESP32-S3, with the possibility of an off-PCB Raspberry Pi for additional computation power (if necessary). To capture blink frequency/length, we will utilize Haar cascades, or a pretrained model if we decide to use the Raspberry Pi for processing.
For drowsiness detection, we will implement an Eye Aspect Ratio (EAR) algorithm to check whether the driver's eyes are closed; blink frequency/length will be tracked based on how many frames fall below the eyes-closed EAR threshold versus above the eyes-open EAR threshold. **Sample Weighted Calculation (idea):** Fatigue Score = (Weight1 * Blink Score) + (Weight2 * Yawn Score) + (Weight3 * Head Movement Score) For example, the Blink Score could be calculated from: * Blink frequency (e.g., 0-5 blinks/min → score 0-50). * Blink length (e.g., >500ms → score 0-50). * Final Drowsiness Level: - Score 0-30%: Alert (buzzer or immediate audio alert) - Score 30-60%: Warning - Score 60-100%: Fatigued (follow user-designated procedure) **User Interface & Connectivity** The PCB would send data from the microcontroller via WiFi through the ESP32-S3. Using HTML and CSS, we can design a mobile application that includes charts and emergency functionality, following procedures such as assigning and calling emergency contacts once a drowsiness or BAC threshold is reached. The focus is on a mobile application, as a driver usually has a phone in hand. **Criterion for Success** 1. Accurate measurement of BAC levels: The device should be able to measure the BAC levels of the driver accurately. Comparing against a professional police-enforced measure can help calibrate. 2. Effective detection of yawns, blinks, and head tilts: The device should be able to classify and count how many yawns and blinks occur in timed driving intervals while actively updating and displaying the tiredness score so the driver is aware. We can test this with varying yawn lengths and closed-eye durations. 3. Fair algorithm: The design is intended to save lives. We want the user to be able to set up a procedure that runs when a threshold is reached, such as calling driving services or an emergency contact once the BAC exceeds it. 4.
Updated interface: The web interface should reflect updates and show graphical analytics to better predict when tiredness occurs and to identify optimal driving times for everyone's safety. |
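The sample weighted calculation above can be written out directly. In this Python sketch the tier boundaries follow the proposal, while the specific weights and function names are illustrative placeholders:

```python
def fatigue_score(blink_score, yawn_score, head_score,
                  w1=0.4, w2=0.3, w3=0.3):
    """Fatigue Score = w1*Blink + w2*Yawn + w3*Head (each sub-score 0-100).
    The weights here are placeholder assumptions; tuning them fairly is
    part of the project's algorithm work."""
    return w1 * blink_score + w2 * yawn_score + w3 * head_score

def drowsiness_level(score):
    """Map a 0-100 fatigue score onto the proposal's three tiers."""
    if score < 30:
        return "Alert"
    if score < 60:
        return "Warning"
    return "Fatigued"

# A driver with long blinks (80), frequent yawns (70), and head nodding (90):
level = drowsiness_level(fatigue_score(80, 70, 90))   # "Fatigued"
```

The same thresholding would decide when the buzzer fires and when the user-designated procedure (e.g., contacting emergency contacts) is triggered.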
||||||
77 | Portable Offline Translator |
Josh Cudia Lorenzo Bujalil Silva |
Chi Zhang | Arne Fliflet | proposal1.pdf |
|
# Portable Offline Translator Team Members: - lorenzo9 - cudia2 # Problem Traveling is an exciting part of life that can bring joy and new experiences. Trips are the most memorable when everything goes according to plan. However, the language barrier can limit communication with others, causing unnecessary stress on an otherwise enjoyable trip. Although most modern phones provide translation applications, these require a reliable internet connection. When the connection is weak or there is no connection at all, translation apps are not a solution. # Solution We want to solve this problem by building a portable translator that can ideally be used anywhere in the world without an internet connection. The idea is a small device that can be programmed to translate between two languages: it listens to what the person says, converts the speech to text, translates the text to the target language, converts the translated text back to speech, and drives a speaker with the translated speech. We want to design our translator around a few subsystems: Main Processing Subsystem (MCU), Secondary Processing Subsystem (Compute Module), Audio Subsystem, User Interface Subsystem, Communication Subsystem, and Power Management Subsystem. With this design, someone should be able to turn on the device, set up the languages, start talking into the device, and after a few seconds hear the translated speech played back. The device can then be programmed the other way to let the other party translate. Ideally, this will facilitate communication between people without a common language and make life easier while traveling.
# Solution Components ## Main Processing Subsystem Components: [STM32F407](https://www.st.com/en/microcontrollers-microprocessors/stm32f407-417.html)([STM32F407IGH7](https://www.st.com/resource/en/datasheet/stm32f405zg.pdf)) (MCU) The main processing subsystem will manage the workflow for the system, control all I/O, and communicate commands/data to the secondary processing subsystem. When the system powers on, a simple interface will prompt the user to select the source and target languages to translate between. The MCU will read user input from push buttons for language selection and will drive the display. When the user starts a translation via a dedicated push button, the MCU will change state to start listening on the port for audio data from the microphone. The INMP441 microphone outputs a digital signal over I2S, which our MCU can interpret. The MCU will also need to buffer the data and normalize it to the appropriate bit range for the STT model. After preprocessing the data, we will need to write code that communicates packets of this data over an SPI or I2C protocol to the compute module. We may also need to define a simple custom protocol to tell the compute module to start listening for a data sequence. The compute module will then take over, performing the translation and conversion to speech and outputting pulse-code-modulation data. This data is transmitted back over SPI or I2C to the listening MCU. The MCU will then move to another state to start writing the data to the MAX98357A, which drives the speaker. Finally, the MCU will return to the user-input state to allow the user to translate again. Beyond managing the entire workflow for the system, the MCU controls the I/O, which includes reading user input from the push buttons and driving an LCD display to show what the user is currently selecting.
With enough time, we may also add status messages on the LCD display to show what is happening in the system. We decided to use the STM32F407 for this project because it supports the high level of communication between the various subsystems along with the numerous I/O our design requires. We also found that it has an LCD parallel interface and a JTAG interface. A long-reach goal is to do some audio manipulation (e.g., filtering, noise reduction) before sending audio off to the compute module. We also expect to support real-time control of the audio and peripheral management. ## Secondary Processing Subsystem Components: - [Raspberry Pi Compute Module 4](https://datasheets.raspberrypi.com/cm4/cm4-datasheet.pdf) (SC0680/4) Models: - Speech to Text: [Whisper](https://github.com/openai/whisper) (Tiny Model) - Translation: [MarianMT](https://marian-nmt.github.io/) - Text to Speech: [Piper](https://github.com/rhasspy/piper) The main purpose of the compute module is to offload computationally heavy tasks, including speech-to-text transcription, translation, and text-to-speech conversion. It would be too computationally complex to host all three models on the STM32 while it is processing I/O data. We specifically chose the RPi Compute Module because it has the computational power to run the AI models, it can interface with the MCU over SPI or I2C, it runs Linux, and it has eMMC flash to store the models on board. For this subsystem, we will have to build infrastructure for querying the models, reading and sending data to and from the MCU, and a data-processing pipeline that moves through the different stages of translation. It will also need some level of state awareness to know what kind of translation is being requested. We hope to design the PCB so that we can simply plug the compute module onto the PCB through pinouts.
## Audio Subsystem Components: - [INMP441](https://invensense.tdk.com/wp-content/uploads/2015/02/INMP441.pdf) (I2S Microphone) - [MAX98357A](https://www.analog.com/media/en/technical-documentation/data-sheets/max98357a-max98357b.pdf) (Amplifier) - [Dayton Audio CE32A-4](https://www.daytonaudio.com/images/resources/285-103-dayton-audio-ce32a-4-spec-sheet.pdf) (Speaker) The audio subsystem encompasses all of the components managing analog signals. Essentially, this includes listening to the user's voice and delivering it to the MCU. It also includes the amplifier and speaker, which convert the digital signal back to analog and play the translated speech. Additionally, the microphone performs analog-to-digital conversion and some amplification of the signal. To avoid the speaker's output interfering with the microphone, we are going to build a strictly synchronous system where only the microphone or the speaker is active at a time. ## User Interface Subsystem Components: - [ST7789](https://cdn-learn.adafruit.com/downloads/pdf/2-0-inch-320-x-240-color-ips-tft-display.pdf) (LCD Display) - Mini Pushbutton Switch We are going to have an LCD display that lets the user decide which languages to translate between, with push buttons for selection. We will also have a push button that starts listening on the microphone and then stops listening, so we can ensure that all of the data has been stored. In case the input runs too long, we may add a feature that automatically stops the speech input so we do not accumulate too much data to translate. This UI subsystem essentially makes this a usable product. ## Communication Subsystem Protocols: - I2S (Audio I/O) - SPI/I2C (MCU-CM Communication) We are going to communicate over I2S between the MCU and both audio devices, and over SPI/I2C with the compute module.
This may also require some circuit level design to add to the I2C or I2S lines (pull-up resistors). ## Power Management Subsystem Voltage Levels: - Power Rails: - 5V (CM) - 3.3V (MCU, LCD, Audio) - [LM317DCYR](https://www.digikey.com/en/products/detail/texas-instruments/LM317DCYR/443739?gclsrc=aw.ds&&utm_adgroup=Integrated%20Circuits&utm_source=google&utm_medium=cpc&utm_campaign=Dynamic%20Search_EN_Product&utm_term=&utm_content=Integrated%20Circuits&utm_id=go_cmp-120565755_adg-9159612915_ad-665604606680_dsa-171171979035_dev-c_ext-_prd-_sig-CjwKCAiAtYy9BhBcEiwANWQQL3rwnMSMyK-ZG3XMXp5nru6UD3FAfss_oPQyDGeG0f-Hh2RLp4BssBoCgKIQAvD_BwE&gad_source=1&gclid=CjwKCAiAtYy9BhBcEiwANWQQL3rwnMSMyK-ZG3XMXp5nru6UD3FAfss_oPQyDGeG0f-Hh2RLp4BssBoCgKIQAvD_BwE&gclsrc=aw.ds) (Adjustable LDO) X2 - [18650 Battery Holder](https://www.digikey.com/en/products/detail/keystone-electronics/1048P/4499389?gclsrc=aw.ds&&utm_adgroup=&utm_source=google&utm_medium=cpc&utm_campaign=PMax%20Shopping_Product_Low%20ROAS%20Categories&utm_term=&utm_content=&utm_id=go_cmp-20243063506_adg-_ad-__dev-c_ext-_prd-4499389_sig-CjwKCAiAtYy9BhBcEiwANWQQL67T2WMFu7kbd5dYcLA8KUBEH5cQLd09vsCRMQqPdleIVlpNC45jqhoCQ38QAvD_BwE&gad_source=1&gclid=CjwKCAiAtYy9BhBcEiwANWQQL67T2WMFu7kbd5dYcLA8KUBEH5cQLd09vsCRMQqPdleIVlpNC45jqhoCQ38QAvD_BwE&gclsrc=aw.ds) - [Samsung 25R 18650 2500mAh 20A Battery](https://www.18650batterystore.com/products/samsung-25r-18650?utm_campaign=859501437&utm_source=g_c&utm_medium=cpc&utm_content=201043132925&utm_term=_&adgroupid=43081474946&gad_source=1&gclid=CjwKCAiAtYy9BhBcEiwANWQQL9KXMQZ0cgicw8SuV3VKk3KPHvVdYIrlZydGXjFnZ7StpWRljqYGjRoCGikQAvD_BwE) This portable power management system will use a Samsung 25R 18650 2500mAh 20A rechargeable Li-Ion battery to supply stable voltage to two power rails. We will use the 5V power rail for the Raspberry Pi Compute Module (CM) and 3.3V power rail for the MCU, LCD, and audio subsystem. This system includes two LM317DCYR adjustable LDO regulators. 
One will be used to step down the battery voltage to 5V and the other to step it down to 3.3V. Lastly, the 18650 battery holder securely holds the battery and enables easy replacement. # Criterion For Success The basic goal we hope to achieve with this design is a system where we can press a button to start and stop listening to our voice, pass this data to the compute module, translate the speech, and drive a speaker so we hear the translated speech, which needs to sound cohesive and be understandable. At the simplest level, we hope to translate between English and one other language. If this works, we can expect to add functionality to support more than two languages. We also expect our system to be very synchronous, as if one thread moves the data through the whole pipeline. A more asynchronous system would be able to listen and start speaking while we are still talking, but that system has caveats that may be too complicated to resolve within one semester. We can test this at a high level by listening to the translated speech, and we can dive deeper by adding stages within our translation pipeline where we dump the status of the translation to see exactly what we received as input and what we produced as output. We also hope to package this as a simple, portable design with an enclosure, making it very easy to use, especially through the UI. |
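Since the MCU-to-compute-module link needs a small custom protocol on top of SPI/I2C, one plausible framing scheme (sync byte, message type, length, checksum) can be prototyped in Python; the sync byte value and the message-type codes below are our own assumptions, not part of the proposal:

```python
import struct

SYNC = 0xA5  # assumed start-of-frame marker

def build_frame(msg_type, payload):
    """Frame layout: SYNC | type | 2-byte big-endian length | payload | checksum."""
    header = struct.pack(">BBH", SYNC, msg_type, len(payload))
    checksum = (sum(header) + sum(payload)) & 0xFF
    return header + bytes(payload) + bytes([checksum])

def parse_frame(frame):
    """Validate and unpack one frame; raises ValueError on corruption."""
    sync, msg_type, length = struct.unpack(">BBH", frame[:4])
    if sync != SYNC:
        raise ValueError("bad sync byte")
    payload = frame[4:4 + length]
    if frame[4 + length] != (sum(frame[:4]) + sum(payload)) & 0xFF:
        raise ValueError("checksum mismatch")
    return msg_type, payload

# Round-trip a chunk of (hypothetical) 8-bit audio samples as message type 0x01:
msg_type, audio = parse_frame(build_frame(0x01, b"\x00\x10\x20\x30"))
```

The same byte layout could be implemented in C on the STM32 side; the length field lets the receiver know how many payload bytes to clock in before checking the checksum.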
||||||
78 | Carpal Tunnel Wrist Glove |
Deepika Batra Li Padilla Rawnie Singh |
John Li | Arne Fliflet | proposal1.pdf |
|
# SMART CARPAL TUNNEL WRIST GLOVE Team Members: * Deepika Batra (dbatra3) * Li Padilla (jpadi4) * Rawnie Singh (rawnies2) # Problem Digital artists often experience fatigue and discomfort in the wrist, knuckles, and fingers after prolonged drawing sessions. This strain typically goes unnoticed until pain develops. Continued stress on the hand muscles can lead to more serious conditions, such as carpal tunnel syndrome, which can cause hand/wrist pain, burning/numbness in the fingers, and overall weakness in the wrist and hand. The repetitive motions of digital art that come with brush strokes, sketching, and rendering can cause significant swelling around the tendons in the carpal tunnel, resulting in pressure on the median nerve. # Solution Many use compression gloves to alleviate symptoms related to carpal tunnel syndrome, but there is an opportunity to introduce a similar product with a technological component. This product would use strain gauge sensors to monitor the user's grip and to monitor joints/muscles that undergo prolonged repetitive motion. Through these sensors, a software application could assess the level of stress on the hand and wrist, provide notifications to prompt breaks, and suggest targeted stretches aimed at alleviating tension in specific regions of the hand and wrist. Since this device aims to promote good wrist/hand health and muscle-strain prevention, it points the user toward healthier stretching and breaks unhealthy habits of prolonged muscle stress. # Solution Components ## Subsystem 1: Sensor Layer **Strain Gauges** Strain gauges measure deformation or mechanical strain within a material through a change in electrical resistance when stretched or compressed, which works well for applications related to structural load analysis. Strain gauges used in tandem with IMUs allow for a fuller picture of mechanical movement within the hand, i.e., wrist angle and flexion detection (which correlates with potential nerve-compression risks).
A strain gauge rosette can measure wrist angles and analyze strains that occur during wrist flexion/extension and radial/ulnar deviation, and may be placed near key ligaments such as the dorsal wrist area. A suitable strain gauge is the Vishay CEA-06-062UR-350, as it can measure multi-axial strains. This approach adds a biomechanical-analysis layer to the glove, which may detect harmful wrist postures even when muscles are not active. **Inertial Measurement Unit** IMUs can track repetitive motion by measuring linear acceleration and angular velocity, which makes them useful for detecting the wrist and hand movements associated with repetitive strain, which can in turn contribute to nerve compression. An option for an IMU would be an ICM-20948, a low-power sensor suitable for wearable applications such as our glove. These IMUs would be placed at specific positions: on the wrist (such as the dorsal side, to capture radial/ulnar deviation), just above the wrist to track forearm rotation, and on the back of the hand (near the metacarpals) to monitor fine motor motion such as finger extension/flexion dynamics. ## Subsystem 2: MCU The microcontroller will take the signals from the sensor layer (the strain gauges and IMUs) as inputs and perform amplification, filtering, and analog-to-digital conversion on them (detailed in Subsystem 3 below). We are considering a microcontroller with a built-in ADC and programmable amplifier/band-pass filters, such as the TI MSP430FR5994. ## Subsystem 3: Amplifying/Filtering Signal Processing (live input) **Strain Gauge Signal Processing** Strain gauges measure strain through changes in their electrical resistance; to convert these tiny resistance changes into measurable voltages, a Wheatstone bridge is used to amplify changes in resistance caused by strain (we aim to use a full bridge with four gauges to maximize sensitivity and thermal stability).
The bridge's output voltage is very small; the signal will first be amplified and filtered (likely with low-pass filtering to remove high-frequency noise, since strain-gauge signals for human motion are typically at most 20 Hz) prior to analog-to-digital conversion. The microcontroller will then calculate wrist angle and stress from the digital signal and trigger any feedback (i.e., a user notification or the display system). **Inertial Measurement Unit Signal Processing** The raw data from the IMUs will likely require filtering to remove noise unrelated to actual motion. Useful data related to motion frequency, range of motion, and angular velocity can be derived with an FFT for orientation tracking (the IMU collects time-series data, which will include periodic signals when repetitive tasks are being performed). Processed IMU data can then be correlated with the strain-gauge outputs to identify patterns of repetitive motion and grip. ## Subsystem 4: Power Subsystem As shown in the block diagram, the main power system transfers the required voltage from the PCB to the battery, communication module, and MCU. The board design is explained in Subsystem 6, but the biggest component to discuss is the rechargeable battery, which will connect to the glove. We will most likely use a lithium-ion battery. Smart Carpal Tunnel Wrist Glove RFA - Block Diagram https://docs.google.com/document/d/1LMrlfA7iYeF-7hu9MXWzzaYylBaH1Q71_ltEKfnskbM/edit ## Subsystem 5: Communication Protocol/Display System This subsystem will receive signals from Subsystem 3 and compare them with threshold values we set to assess whether the user is applying prolonged stress that may lead to harmful muscle activity. The output voltage reading (Vout) across the strain gauge is proportional to the change in electrical resistance.
We can calculate the strain by dividing the ratio Vout/Vin by the gauge factor (the ratio of the relative change in electrical resistance to the relative change in length). Force is then calculated by multiplying the strain, the Young's modulus of the material, and the cross-sectional area of the stressed material. We can then use this force value for our Maximum Voluntary Contraction (MVC) comparison with a threshold value of 20%. The stress readings from the strain gauge, along with their duration, will be compared to the 4% threshold. Wrist flexion/extension will be compared against 30° and radial deviation against 15°. We will have an application that communicates the result of this comparison. If any of the readings exceed their threshold, the system will suggest the user take breaks every 20-30 minutes (through an LED/text display system on the glove). Additionally, strain-gauge readings will provide insight into which joints undergo repetitive motion and, subsequently, the relevant muscles. The external app will also display various stretches to help reduce stress and tension in those muscles and joints. We will explore exporting real-time signal data from the MCU to a PC via UART by implementing a very simple program that interprets the data and intelligently suggests a stretch from a database. ## Subsystem 6: Board Design The PCB will be designed in KiCad and will be the 'brain' of the system. The design will route the microcontroller signals to the notification system, ensuring that power is supplied to both portions of the design (live input & user-facing). The two subsystems will likely have different current limits and required input voltages, which will be handled on the PCB using voltage-regulation and power-conversion techniques - possibly a buck converter or linear regulator. # Criterion For Success - Accurately measures repetitive motion and identifies where it occurs (i.e., which joints) - Accurately measures the angle of wrist flexion and extension - Notifies the user of prolonged muscle strain and proposes stretches that target the muscles that were under prolonged use, with 80% accuracy - System works properly on 2 different group members with different grips, to show that detection generalizes to unique users |
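The strain and force relations described in Subsystem 5 can be checked numerically. The sketch below uses the proposal's formulas (strain = (Vout/Vin) / GF, F = strain · E · A); the specific excitation voltage, gauge factor, modulus, and area values are illustrative numbers, not design parameters from the proposal:

```python
def strain_from_bridge(v_out, v_in, gauge_factor):
    """Strain from the bridge voltage ratio, per strain = (Vout/Vin) / GF."""
    return (v_out / v_in) / gauge_factor

def force_from_strain(strain, youngs_modulus, area):
    """Axial force estimate: F = strain * E * A."""
    return strain * youngs_modulus * area

# Illustrative numbers: 2 mV output on a 1 V excitation with GF = 2.0
# gives a strain of 1e-3; with E = 200 GPa and A = 1 mm^2 this
# corresponds to about 200 N of axial load.
eps = strain_from_bridge(0.002, 1.0, 2.0)      # 0.001
force = force_from_strain(eps, 200e9, 1e-6)    # 200.0 N
```

The MCU firmware would apply the same arithmetic to the ADC readings before the MVC and angle-threshold comparisons.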
||||||
79 | Voice Dosimeter for Voice Therapy |
David Gong Jaden Li Michael Rizk |
Chi Zhang | Michael Oelze | proposal1.pdf |
|
# Voice Dosimeter for Voice Therapy Team Members: - David Gong (dsgong3) - Jaden Li (sizhel2) - Michael Rizk (rizk2) # Problem The US societal costs of voice-related teacher absenteeism and treatment expenses alone have been estimated to be as high as 2.5 billion dollars annually. Such absenteeism could be prevented if teachers were able to measure how much they were using and straining their voices every day. Furthermore, there are currently no commercially available voice dosimeters. Some devices were available in the past, but they cost thousands of dollars. A low-cost and widely available voice dosimeter would enable clinical and research study of voice-related problems and voice-therapy options. This project will be conducted in collaboration with graduate student Charlie Nudelman and Professor Pasquale Bottalico at the College of Applied Health Sciences. Their group previously developed a DIY voice dosimeter using a contact microphone and a portable audio recorder. They are still using this device today, but there are a few improvements they would like to see. # Solution The device that Charlie and Pasquale are using is bulky and requires a wired connection. It is impractical for patients to wear daily and to use for long-term data collection. We aim to create a cheaper and more comfortable voice dosimeter that is capable of recording data for long periods without recharging, and whose data can be uploaded to another device wirelessly. # Solution Components ## Sensor System - Accelerometer: low power, with a bandwidth of at least 3 kHz (BMA580) - Medical tape - Silicone enclosure ## Microcontroller - Microcontroller: SoC with BLE and low power consumption (NRF52832-QFAB-R) ## Data Processing and User Interface The recorded data is collected and uploaded to a device over Bluetooth. This data is then processed to extract the sound pressure level, fundamental frequency, and cepstral peak.
Finally, this data can be viewed either through a website or an app. ## Power - Battery: enough capacity to last 8 hours given the MCU's estimated power consumption - either a small rechargeable battery or a replaceable button-cell battery - PMIC: TPS65720EVM-515 # Criterion For Success - Design a device that costs less than $200 - Accurately measure, within some uncertainty relative to a microphone measurement in a quiet room: sound pressure level within 2 dB, fundamental frequency within 5 Hz, cepstral peak within 2 dB - Reject external noise and voices by at least 20 dB relative to the wearer's voice - At least 8 hours of battery life ## Secondary Objective - Cloud integration |
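As a sanity check on the data-processing targets (SPL within 2 dB, fundamental frequency within 5 Hz), the two simpler metrics can be prototyped offline in a few lines of Python. This sketch uses a plain autocorrelation f0 estimator rather than the cepstral-peak method the project describes, and all function names and parameter choices are our own illustrations:

```python
import math

def spl_db(samples, p_ref=20e-6):
    """Sound pressure level in dB re 20 uPa, from calibrated pressure samples."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20.0 * math.log10(rms / p_ref)

def estimate_f0_autocorr(samples, fs, fmin=60.0, fmax=500.0):
    """Fundamental frequency via the autocorrelation peak within [fmin, fmax].
    A cepstral-peak analysis would search the same quefrency range on the
    inverse transform of the log spectrum instead."""
    lag_min, lag_max = int(fs / fmax), int(fs / fmin)
    best_lag, best_r = lag_min, float("-inf")
    for lag in range(lag_min, lag_max + 1):
        r = sum(samples[i] * samples[i - lag] for i in range(lag, len(samples)))
        if r > best_r:
            best_r, best_lag = r, lag
    return fs / best_lag

# Synthetic 200 Hz "voice" at an 8 kHz sample rate:
fs = 8000
x = [math.sin(2 * math.pi * 200 * n / fs) for n in range(1024)]
f0 = estimate_f0_autocorr(x, fs)   # close to 200 Hz
```

In the real system this processing would run on the phone/PC side after the BLE upload, against the accelerometer signal rather than a microphone signal.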
||||||
80 | MazEscape |
Jatin Tahiliani Jayanto Mukherjee Will Knox |
Aishee Mondal | Viktor Gruev | other1.pdf |
|
Maze Quiz Jayanto Mukherjee (jayanto2) Jatin Tahiliani (jatint2) Will Knox (wk9) Problem Modern-day theme-park immersive games have become stale and predictable, so we want to make them more entertaining by mixing some of them together. We devised a fun idea for a cross between a maze and an escape room, where participants enter a labyrinth and answer questions to move on to the next level or the next room and complete the game. Solution To tackle this challenge, we have decided on a set of four smart-lock systems: two will have an LCD screen along with a keypad through which the user interacts with the system, and the other two will be emergency escape-lock systems. Each smart-lock system will be attached to a door that opens either into the next part of the maze (the next level) or out of the maze and back to the starting point. The questions asked at each smart lock will be small puzzles or general-knowledge questions; all questions are multiple-choice, and the player gets one chance to answer. The players will answer using the keypad by selecting one of four choices: A, B, C, or D. There will be a total of two levels. At the entry (first) level, the player is asked to answer a question; upon answering correctly, the system unlocks the gate and the player moves on to the second and final level. The player is then asked another question, and if they answer correctly, they exit the maze and claim their prize.
If, however, at either level the player selects the wrong answer, the smart lock automatically sends a signal to the escape smart lock system, mounted on an escape gate, to unlock that gate so the player can leave the game and return to the starting point. Each of the two smart locks with an LCD screen will also have a motion sensor, so the lock can automatically detect when a player approaches and then display its question. The question-asking smart locks will also communicate with each other so that the user is not presented with the same question twice. The player will additionally have the option to leave the game by pressing a leave button on the keypad, upon which the smart lock system signals the escape lock system to unlock the gate. Solution Components Subsystem Mechanical subsystem: We will use a 1602 LCD Display Module to display the problems the user will solve and a numeric keypad to input answers. The LCD module is central to the user interface, as all the information the user needs to operate the device properly will appear on it; the user navigates the different functionalities with the keypad. The LCD module and keypad will communicate with the microcontroller via the SPI protocol. Subsystem Microcontroller: The ESP32 microcontroller will store the questions organized into various categories (multiple-choice trivia and general-knowledge questions answered by pressing buttons). Subsystem Wifi/Bluetooth: We will use the ESP32 microcontroller's built-in Wifi/Bluetooth to connect all the LCD screens together. The Bluetooth link will also let a smart lock signal the escape lock system when the player answers incorrectly or wants to leave the game. 
Subsystem Motion sensor: To create a unique and interactive experience, we will implement an HC-SR501 infrared PIR motion sensor module that detects the user; once the user is detected, the system prompts them with a question to unlock the door. Subsystem Mechanical Lock: We will use a motorized sliding-rod lock. When the questions are answered correctly, it unlocks the door, and it locks again after the user closes the door. Criterion For Success High-level goals our project needs to accomplish to be effective: Successfully update and randomize problem sets so that solutions aren’t memorized. Ensure the door is unlocked when problems have been solved and locked when the door is closed. Make sure the different lock systems exchange data about questions and locking/unlocking via Bluetooth. Minimize power consumption of the system. Adjust the difficulty of problems based on user feedback and experience |
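The "never show the same question twice" requirement between the two question locks can be sketched as a shared asked-set that both locks keep in sync over their wireless link. A minimal sketch; the question IDs and the sync mechanism are assumptions:

```python
import random

# Sketch of the repeat-avoidance logic the two question locks would share.
# In hardware, each lock would broadcast its pick so the shared
# `already_asked` set stays in sync; here that set is a plain Python set.

def pick_question(question_ids, already_asked, rng=random):
    """Pick a question not yet shown this game."""
    remaining = [q for q in question_ids if q not in already_asked]
    if not remaining:            # all used: reset the pool for a new game
        already_asked.clear()
        remaining = list(question_ids)
    choice = rng.choice(remaining)
    already_asked.add(choice)
    return choice

asked = set()
bank = ["Q1", "Q2", "Q3"]
first = pick_question(bank, asked)
second = pick_question(bank, asked)
assert first != second   # level 2 never repeats level 1's question
```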
||||||
81 | Fire and Gas Detection with Real-Time LED Navigation |
Abel Garcia Alex Parafinczuk Jainam Shah |
Surya Vasanth | Yang Zhao | proposal1.pdf |
|
Team Members: - Alex Parafinczuk (atp6) - Abel Garcia (abelg3) - Jainam Shah (jshah74) # Problem Commercial smoke detectors currently on the market let users call first responders immediately and sound an alarm when a hazard is present in the home. Some smoke detectors can connect to your phone via messages or mobile apps, alerting the homeowner to potential hazards. The issue with these detectors is that help isn't immediate: responders take time to reach the home, and if there were a way to mitigate the hazard during that window, property and lives could be saved. # Solution With sensors such as gas and temperature sensors, we will know right away when a hazard is detected. If the detected hazard is a gas such as methane or butane, an alarm will sound, indicating that the family should leave and call emergency responders. If the detected hazard is a fire, an app holding the floorplan of the house and the locations of each sensor placed around it comes into play. With this information, an algorithm will compute an exit route the family can take to escape. When a fire breaks out, bright LEDs on the smoke detectors will light up in the direction of the computed route so occupants can exit the house safely. At the same time, we will close the vents around the hazardous areas to help weaken or slow the growth of the fire. In addition, since closing a vent doesn’t guarantee the HVAC unit's power is off, we will also shut off the HVAC unit to prevent damage to it. 
# Solution Components ## Temperature/Gas Sensors Subsystem Outside of the main smoke detection unit, we will place temperature sensors in designated rooms to give our control unit relevant information about where the hazard originated. These units will also contain the LEDs that lead inhabitants to the designated exit and alarms to notify them of any detected hazard. - Temperature Sensors (LM335AH) - MQ-9 Gas Sensors for Carbon Monoxide and Flammable Gasses - Alarm on Board - LEDs ## Vents Subsystem This will be a motorized vent that opens and shuts depending on whether a fire hazard is detected. - Stepper Motor to control the movement of the vent. ## Control The control system receives data from the sensors. When a temperature sensor spikes, indicating a fire, the control unit first runs the routing algorithm and then sends a signal to a set of LEDs marking the optimal route to safety. In the case of a fire, a signal is also sent to shut all of the vents. - ESP-32 ## App Subsystem This is where the user sets the floor plan of their house. They can designate all the rooms in the house, the connections between rooms, and all possible exits. This interface communicates with the control unit, giving it the locations of the sensors around the house. - React Native Frontend - Firebase Backend ## Power Subsystem We will use batteries, housed in the central control unit, as our power source. The batteries, with converters, will power everything: the sensor system, the control system, and the motorized vents. A sensor will also monitor the remaining battery charge and report it to the app so the user knows when the batteries need to be changed. 
- Batteries - Buck converters # Criterion For Success The following goals are fundamental to the success of our project: - The most optimal path to safety is chosen for conditions involving fire. - Detection of carbon monoxide or other flammable gasses sounds an alarm notifying residents to leave and call emergency services. - LEDs light according to the path chosen by the control unit. - All vents close upon detecting a high temperature signaling that a fire has broken out. - The app successfully communicates with the phone and the system to upload the floorplan. The goals below are reach goals we will try to achieve if time allows: - Vents will have more functionality and be able to keep designated exits clear of smoke. - The app will automatically call emergency services in the presence of life-threatening gas hazards. - In the case of all primary exits being blocked, we want the user to be able to designate secondary exits, such as windows, as a last-resort option for the algorithm to give. |
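The exit-routing step described above amounts to a shortest-path search on the room graph the user enters in the app, with fire-flagged rooms removed. A minimal breadth-first-search sketch; the room names and graph shape are illustrative assumptions, not the project's actual floorplan format:

```python
from collections import deque

# Sketch of the exit-routing algorithm: the floorplan is a graph of rooms
# (from the app), and BFS finds the shortest path to any exit that avoids
# rooms whose temperature sensors report fire.

def exit_route(graph, start, exits, on_fire):
    """BFS from `start` to the nearest exit, skipping fire-flagged rooms."""
    if start in on_fire:
        return None
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        room = path[-1]
        if room in exits:
            return path
        for nxt in graph.get(room, []):
            if nxt not in seen and nxt not in on_fire:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no safe route: fall back to alarm-only behavior

floorplan = {
    "bedroom": ["hall"],
    "hall": ["bedroom", "kitchen", "front_door"],
    "kitchen": ["hall", "back_door"],
}
print(exit_route(floorplan, "bedroom", {"front_door", "back_door"},
                 on_fire={"kitchen"}))
# → ['bedroom', 'hall', 'front_door']
```

The control unit would then light the LEDs along the returned path; a `None` result would map to the alarm-only fallback.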
||||||
82 | RFA: EMG Controlled Midi effects controller |
Joseph Schanne Karan Sharma Paul Matthews |
Eric Tang | Yang Zhao | proposal1.pdf |
|
## This is a repost of https://courses.grainger.illinois.edu/ece445/pace/view-topic.asp?id=77612, since it was in the wrong post category. # MyoEffect Team Members: - paulgm2 (netid) - karans4 (netid) - schanne3 (netid) # Problem Most musicians working in digital audio workspaces (DAWs) desire fine control over the effects they use and the intensity at which those effects operate. Most of the time, in order to quickly assign effects to controls and to fine tune them, musicians will make use of MIDI controllers; however, these controllers are often both clunky to use and expensive to obtain. # Solution Our design offers a cheaper and more expressive method of using effects by bypassing the use of a separate controller entirely and controlling effects with the motion of the musician’s own body. Instead of having to turn knobs or press buttons, a musician can simply close their fist, or extend their arm, or twist their leg; our design’s goal is to give complete freedom to the musician as far as what motion they want to control their effects. Our design makes use of electromyography (EMG) sensors to map contractions of users’ muscles to voltage readings; these readings will then be converted into corresponding MIDI parameters. Our device can then be plugged into a DAW and operate in exactly the same way as a normal MIDI controller. # Solution Components # Subsystem 1 EMG (Electromyography) Sensor System This system will either be a custom built sensor setup using the MIT-Licensed OpenEMG project as a base, or will use the MyoWare 2.0 sensor produced by SparkFun. The major advantage of the OpenEMG project is that we can simplify the overall design of the sensor and microcontroller into one board and save money. The overall cost of the parts (excluding the fabrication of a custom PCB) needed for the OpenEMG sensor setup should be about $5, which is much less than the cost of one MyoWare 2.0 sensor, which is $40. 
Either way, this system uses two adhesive EMG pads (or alternatively metal plates) to measure the amplitude of the electrical signals corresponding to muscle contraction. For the specific parts list, check the OpenEMG git repo. https://www.sparkfun.com/myoware-2-0-muscle-sensor.html https://www.mouser.com/pdfDocs/Productoverview-SparkFun-DEV-18977.pdf https://github.com/CGrassin/OpenEMG # Subsystem 2 ## MCUs to process EMG data We’re planning on using ESP32-C3 MCUs to collect, digitize, encode, and send signals from the wearable sensor to the main MIDI controller. Initially we will use devboards to set this up, but eventually we’ll design and manufacture our own boards to fit into a proprietary (watch-like) housing that can be strapped to any given large muscle. The ESP32 offers an on-board analog-to-digital converter (ADC), allowing us to collect data directly from the EMG sensors and convert it to a digital signal. We will encode this digital signal into an acceptable format and transmit it to the receiver (Subsystem 3). To communicate with the receiver, we have decided to use the ESP-NOW protocol developed by Espressif, the company behind the ESP32, as we have found it to offer lower latency than Bluetooth. Low latency is essential in music production, as even a quarter of a second could leave you a note or two behind. The ESP-NOW protocol’s maximum speed of 512 kbps and latency as low as 5 ms are ideal for our use case. Since we are limited to 512 kbps and will need to transmit data to the MIDI controller over 100 times a second, we are limited to a small amount of data to encode and send for each sensor. The bit resolution of our encoding and our sampling rates are design decisions that remain open for now. ESP-NOW protocol: maximum speed is 512 kbps, so we can’t encode a very high-quality signal from every sensor. 
Ex: a 22,050 Hz, 16-bit stream would need 22,050 Hz * 16 bits = 352,800 bps, most of the link for a single sensor. We eventually want our design to handle as many sensors as possible, but we will be limited by our data transmission rate with the ESP32. However, all of our boards already have Bluetooth options; therefore, if we decide to shift our priority away from latency and toward supporting as many sensors as possible, we have the option of switching to Bluetooth instead. # Subsystem 3 ## MIDI controller This part of the design acts as a hub for multiple sensors and modifies certain parameters of the MIDI protocol depending on the input from each sensor. Each sensor would modulate one effect; for example, a leg sensor could drive a “wah” effect, while hand position/grip strength could correspond to distortion, etc. We plan to use an ESP32-C3 MCU as the brains of this subsystem to maintain compatibility with the ESP-NOW protocol. We will receive the digital signals from the transmitter and convert them to a MIDI-compliant USB signal. The end user plugs a USB cable into their computer / digital audio workstation (DAW) and can pick and choose the effects applied to each signal. Since MIDI is a standardized format, this data signal should be compatible with any DAW, with little to no user configuration required. # Criterion For Success To make our criterion more specific, we will consider it a success if we can control pitch on at least one person by extending or contracting their arm. Specific measurements: our user must fully extend their arm as far as it can go while playing a steady note on the guitar, and the pitch of the note will be shifted down by a half step. We will specifically ask the guitarist to play A3 (220.0000 Hz), and consider it a success if, when the arm is fully extended, the note being played is Ab3 (207.6523 Hz) or lower. 
We are choosing pitch because it is the easiest effect to quantify and measure. If we accomplish this, we would move on to adding more body parts and more effects, and making the system work for multiple people, but those are not necessary for the project. Our sole criterion is whether we can change an effect such as pitch by extending or releasing the arm of one person. |
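The 207.6523 Hz threshold in the success criterion follows directly from equal temperament: one semitone down is a frequency ratio of 2^(-1/12). A quick check of those numbers (the `criterion_met` helper name is ours, not the team's):

```python
# Check of the success criterion's arithmetic: a half step below
# A3 (220 Hz) in 12-tone equal temperament is a ratio of 2**(-1/12).

def semitones_down(freq_hz: float, n: int) -> float:
    """Frequency `n` equal-tempered semitones below `freq_hz`."""
    return freq_hz * 2 ** (-n / 12)

a_flat_3 = semitones_down(220.0, 1)
print(round(a_flat_3, 4))  # → 207.6523

def criterion_met(measured_hz: float) -> bool:
    """Success if the detected note is at or below Ab3 (hypothetical helper)."""
    return measured_hz <= a_flat_3 + 1e-9
```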
||||||
83 | Automatic Door Conversion Kits |
Alex Vega Love Patel Romeo Delgado |
Chi Zhang | Viktor Gruev | proposal1.pdf |
|
**Team Members:** - Romeo Delgado (rdelg2) - Love Patel (lovep2) - Alexander Vega (avega40) **PROBLEM:** With accessibility being considered more in modern infrastructure, more and more systems for accommodating people with physical disabilities are being installed every day. Most of these systems are installed in public locations and paid for by the government. However, installing similar accessibility systems in one's home is much more costly and difficult for those with movement limitations, even for accessibility obstacles as common as doors. Cheaper, easier-to-install automatic doors meant for residential homes would lower this cost barrier and installation difficulty for those with movement limitations who struggle with standard manual doors. **SOLUTION:** Our solution to the high barrier to entry for making one's home accessible is a more cost-effective, quicker-to-install automatic door conversion kit for residential homes. Each kit would include a Bluetooth door opener, a door handle, and a remote. The Bluetooth door opener would attach to the door being converted and its frame, allowing the door to be remotely closed/opened with the Bluetooth remote. To allow the door to close/open remotely, its latch would also be replaced by a Bluetooth actuator that closes/opens in sync with the door opener so the door can swing freely. **SOLUTION SUBSYSTEMS:** 1. **Power Subsystem:** - A lithium coin battery slot for the Bluetooth remote, allowing batteries to be replaced. - Converts standard outlet power for the motor and the PCB. - A set of 4 AA batteries to power the door latch actuator. 2. **Processor Subsystem:** - An ESP32 microprocessor for sending/receiving Bluetooth signals between the handle motor, the PCB, and the remote. - Internal A/D conversion and signal processing. 3. 
**Door Close/Open Subsystem:** - A 25 W to 30 W motor capable of producing 30 Nm of torque, closing/opening residential doors through a jointed arm on a guided rail system. 4. **Door Latch Actuator Subsystem:** - A 6 V DC solenoid actuator connected to a Bluetooth receiver and a PCB that opens in sync with the door being opened to allow it to swing freely. - This system will be housed inside a custom 3D-printed door handle. 5. **Remote Control Subsystem:** - A handheld Bluetooth remote with 3 buttons and an 8-segment hex display showing which door is currently targeted for closing/opening. **CRITERION FOR SUCCESS:** - A minimum of two operational kits able to convert 80-inch-tall, 36-inch-wide, 55-pound residential doors into automatic doors. - The Bluetooth remote can switch which door is targeted to close/open while displaying the set number of the currently targeted door and whether it is open. - Each kit is cheaper and easier to install than commercially available kits with similar features. |
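A back-of-envelope check suggests the 30 Nm spec has comfortable margin for the 55 lb, 36 in door in the success criterion. This sketch models the door as a uniform slab rotating about its hinges and ignores hinge friction and any door-closer spring, so the assumed opening profile and time are ours, not the kit's:

```python
import math

# Sanity check (assumed numbers) that 30 N·m can swing a 55 lb, 36 in door.
# Model: uniform slab about its hinge edge; friction/spring load ignored.

def required_torque_nm(mass_kg, width_m, open_angle_rad, open_time_s):
    """Torque for a constant accelerate-then-decelerate profile that
    sweeps `open_angle_rad` in `open_time_s`."""
    inertia = mass_kg * width_m ** 2 / 3          # slab about hinge edge
    half_t = open_time_s / 2
    alpha = (open_angle_rad / 2) / (0.5 * half_t ** 2)
    return inertia * alpha

torque = required_torque_nm(mass_kg=55 * 0.4536, width_m=36 * 0.0254,
                            open_angle_rad=math.pi / 2, open_time_s=3.0)
print(round(torque, 1))  # → 4.9  (N·m, well under the 30 N·m spec)
```

The roughly 6x headroom leaves room for friction, the latch, and faster opening profiles.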
||||||
84 | Mobile stray cat rescue station |
Frank Chen Ming Yi Yilin Tan |
Rui Gong | Michael Oelze | proposal1.pdf |
|
#Group members - Yilin Tan (ytan47) - Ming Yi (myi22) - Frank Chen (sihan6) # Problem: Thanks to kind people from all walks of life and the existence of many adoption agencies, it is now rare to see stray animals on the street with no fixed place to live — but not impossible. In my community I still see many stray cats of all ages and breeds, and cold and food shortages threaten their lives. Every time I want to help them, I cannot always be there to provide food, and even if I could, the stray cat population and its offspring would continue to wander. So my idea is to help them live better: I plan to build a portable stray cat rescue station that can be placed in areas where stray cats are present to provide them with warmth and food. # Solution: My solution is to design a wireless constant-temperature device that continuously monitors the temperature inside the cat shelter to decide when the heating element turns on and off. It is paired with feeding equipment that detects whether there is enough food in the food bin. Through wireless data transmission, maintenance personnel can decide, based on the reported data, whether to refill the device with food. This device includes the following modules: Power supply module: the power source for the entire device and the other subsystems. Heating module: controls the device temperature. Main control module: transmits real-time data from the device. Casing and heating layer: enhances the warmth of the device at the physical level. #Solution components: ## Subsystem 1: Power supply module Overview: This subsystem uses solar panels and lithium batteries to power the device. It avoids the trouble of replacing batteries while keeping the device portable and installable outdoors. 
It is also conducive to the operation of subsystem 2 (heating module) at night. Design: Power generation: solar panels generate power outdoors. Power storage: lithium batteries of suitable capacity store the electricity. ## Subsystem 2: Heating module Overview: This subsystem automatically controls the temperature in the device using heating elements and temperature/humidity sensors. Design: Heating method: heating wires laid inside the device. Temperature control: temperature/humidity sensors detect whether the temperature in the device is below the set value and turn on the heating element; the temperature can be monitored in real time through subsystem 3 (main control module). ## Subsystem 3: Main control module Overview: Collects data from the device and transmits it wirelessly to the receiving station. Design: Transmission: LoRa devices, to ensure data can be retrieved where there is no network coverage. Reserve detection: pressure or infrared sensors detect remaining food. Power monitoring: real-time recording of remaining battery charge. Activity detection (optional): an infrared sensor detects cats entering and leaving (optional: camera monitoring, but it is unsuitable without a network and more expensive). ## Subsystem 4: Shell and heating layer Overview: Uses physical means to assist and reinforce subsystem 2 (heating module) and reduce power consumption. 
Design: Shell: waterproof plastic board, aluminum plate, or 3D-printed material, on a metal frame about 10 cm above the ground to guard against rainy weather. Inner layer: warm, fireproof foam board or other insulation material. # Criterion For Success: This design will be considered successful if: The heater automatically turns on and off to keep the temperature within the set range. When there is not enough food in the feeding bowl, food is automatically dispensed. Various real-time data can be obtained at the LoRa master station. |
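The heater on/off decision described in subsystem 2 is easiest to make robust with hysteresis, so the heating wire doesn't chatter when the reading hovers at the setpoint. A minimal sketch; the threshold temperatures are illustrative assumptions to be chosen for the shelter:

```python
# Sketch of the heater on/off decision with hysteresis. Temperatures are
# in °C; the low/high thresholds are illustrative, not the project's values.

def heater_command(temp_c, heater_on, low=10.0, high=15.0):
    """Turn on below `low`, off above `high`, otherwise keep current state."""
    if temp_c < low:
        return True
    if temp_c > high:
        return False
    return heater_on

state = False
for reading in [12.0, 9.5, 12.0, 14.0, 15.5]:
    state = heater_command(reading, state)
# heater turned on at the 9.5 °C reading and stayed on until 15.5 °C
```

The same state (plus the food-level and battery readings) would be what the main control module reports over LoRa.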
||||||
85 | Poker Buddy: Chipless Poker Companion |
Austin Abraham Lorenzo Dumitrescu Vishal Ramvelu |
Eric Tang | Yang Zhao | proposal1.pdf |
|
# Poker Buddy: Chipless Poker Companion # Team Members: - Austin Abraham (austina5) - Lorenzo Dumitrescu (ldumit4) - Vishal Ramvelu (ramvelu2) # Problem Traditional poker games rely heavily on physical chips for betting, which can be cumbersome, error-prone, and prone to mismanagement or theft. Managing chip counts, handling physical money, and tracking bet amounts often slow down the game and can lead to disputes among players. In addition, determining whose turn it is during fast-paced games can be confusing and cause a lot of frustration between players. With the growing demand for digital integration in gaming, there is an opportunity to streamline the poker experience by eliminating physical chips and automating bet tracking and game flow. This is different from online poker because we want to maintain the in person experience of playing against your friends face to face, but without the inefficiencies of standard chips and markers that represent blinds. # Solution We propose a modular device that removes the need for physical chips while enhancing the poker-playing experience. Each player will use a dedicated device that features LED displays to show both their current balance and the money in the pot, along with a built-in turn indicator light that activates when it is their turn. We will use a force sensitive touch sensor to interpret different gestures—one tap for fold, two taps for check, and a long hold for call—eliminating the need for manual chip handling to signal actions. Additionally, five colored buttons correspond to different chip denominations for quick and easy betting. While we could use some type of sensor for these buttons, we want to maintain the tactile feel and choose to use buttons for our design. These devices will wirelessly connect to a centralized mobile/web application that manages buy-ins, tracks all player balances, and synchronizes game status in real time, ensuring an efficient and error-free gaming experience. 
Although these devices will not track cards, they must handle the real-time logic of betting, maintaining balances, and managing turn order without relying on a computer. The game logic is distributed and managed by the PCBs in each Poker Buddy. This means that each Poker Buddy keeps track of: - Whose turn it is to bet (reflected by the turn signal LED). - The current bet amounts and how they contribute to the pot (reflected by one display). - The players’ individual balances (reflected by another display). - The outcome of each hand (i.e., when a player wins, the entire pot is automatically credited to their balance). These devices communicate wirelessly with each other and can optionally sync with a centralized mobile application for overall game monitoring and account management by the host of the game (this will just be used for buying in chips and determining payouts at the end). The system is designed to be portable and is powered by disposable batteries, ensuring flexibility and ease of use in various settings. # Solution Components ## Sensor Subsystem The Sensor Subsystem captures all user inputs without the need for physical chips: - Force Sensitive Touch Sensor: An FSR (force-sensitive resistor) module will detect user gestures and differentiate between a single tap (fold), double tap (check), and long press (call) based on the force and duration of touch. - Button Array: A set of 5 tactile push buttons, each in a distinct color, will represent specific chip denominations for placing bets. ## Microcontroller and Processing Subsystem This subsystem processes inputs, drives outputs, and manages wireless communication: - ESP32-WROOM-32 Module: Serving as the core microcontroller, the ESP32 provides built-in WiFi/BLE connectivity for real-time data exchange with the mobile/web application as well as handling the game logic. 
- LED Displays: Two displays (7-segment LED displays such as the LTL-307EE) will show the player's balance and the current pot amount. - Turn Indicator LED: A dedicated LED will signal when it is the player's turn, ensuring immediate visual recognition. - Voltage Regulator: A voltage regulator such as the LM2596 DC-DC Buck Converter will ensure a stable power supply to the ESP32 and peripheral components. - Power Supply – Disposable Batteries: The device is designed for portability and can be powered by disposable batteries (AA battery packs) or via a direct power connection. ## User Subsystem The User Subsystem integrates physical device interaction with a digital game management system: - Physical Interface: The combination of the LED displays, turn indicator, force sensitive touch sensor, and colored buttons creates an intuitive interface that replaces traditional chip handling. - Mobile/Web Application: A dedicated application will allow users to buy in, view real-time balances, monitor the pot, and receive instant updates on game status, seamlessly synchronizing data across all devices. - Secure Communication: Robust wireless protocols will ensure that all transactions and game data are transmitted securely and accurately between the Poker Buddy devices and the central application. # Criterion For Success - Real-Time Status Updates: The system must update player balances, pot amounts, and turn indicators on the app within five seconds in at least 90% of cases. - Accurate Gesture Recognition: The force sensitive touch sensor should reliably distinguish between a single tap (fold), double tap (check), and long press (call) with a false detection rate below 2%. - Reliable Wireless Communication: The ESP32 module must maintain stable and consistent connectivity with the mobile/web application, achieving at least a 90% connection success rate during active gameplay. 
- User-Friendly Interface: The physical device should offer clear visual feedback through its LED displays and turn indicator, ensuring that users can operate it intuitively without the need for physical chips. - Game is Mathematically Correct: Since poker involves complex betting logic, the system must correctly update sums, properly rotate blinds around the table, and accurately calculate winnings. The distributed game logic must ensure that all arithmetic and game state transitions are mathematically correct and robust against errors. |
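The tap/double-tap/long-press distinction that the gesture-recognition criterion depends on can be prototyped as a classifier over press durations and inter-press gaps from the FSR. A minimal sketch; the millisecond thresholds are assumptions to be tuned on the real sensor:

```python
# Sketch of mapping FSR press timing to poker actions. The thresholds
# below are assumed starting points, not measured values.

LONG_PRESS_MS = 600      # hold at least this long → call
DOUBLE_TAP_GAP_MS = 300  # two taps within this gap → check

def classify_gesture(press_durations_ms, gaps_ms):
    """Classify one burst of FSR presses.
    press_durations_ms: how long each press lasted.
    gaps_ms: time between consecutive presses (len = presses - 1)."""
    if any(d >= LONG_PRESS_MS for d in press_durations_ms):
        return "call"
    if len(press_durations_ms) >= 2 and gaps_ms and gaps_ms[0] <= DOUBLE_TAP_GAP_MS:
        return "check"
    if len(press_durations_ms) == 1:
        return "fold"
    return None  # ambiguous burst: ignore and wait for a clean gesture

assert classify_gesture([80], []) == "fold"
assert classify_gesture([80, 90], [150]) == "check"
assert classify_gesture([750], []) == "call"
```

Returning `None` on ambiguous input, rather than guessing, is one way to keep the false-detection rate under the 2% target.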
||||||
86 | Smart Backpack + Inventory Tracking System |
Aashish Subramanian Seth Oberholtzer Shreyas Sriram |
Rui Gong | Arne Fliflet | proposal1.pdf |
|
Smart Backpack + Inventory Tracking System Team Members: Shreyas Sriram (ssrir5) Seth Oberholtzer (sethmo2) Aashish Subramanian (asubr2) Problem Many people struggle with tracking their belongings inside their backpacks, often forgetting essential items or falling victim to theft in crowded areas. Traditional backpacks lack intelligent security and organization features, making them inefficient for modern users. There is a need for an innovative backpack that provides smart tracking, theft prevention, and automated security. Solution Overview We propose a Smart Backpack with Inventory Tracking & Security, integrating advanced RFID tracking, theft detection, automated security features, and real-time mobile connectivity. This backpack will help users keep track of their belongings, prevent theft, and provide alerts for missing items, ensuring both convenience and security. Solution Components RFID-Based Item Tracking This backpack integrates an RFID tracking system to help users keep track of their essentials. Small RFID tags are attached to commonly carried items like a laptop, notebook, wallet, and keys. An STM (or any other) microcontroller scans the backpack’s contents and sends real-time alerts to a mobile app if an important item is missing before the user leaves a location. Anti-Theft Security System Designed with theft prevention in mind, the backpack features an accelerometer and gyroscope (IMU) to detect unusual movement, such as someone attempting to grab or open the bag while it's unattended. If unauthorized access is detected, a hidden buzzer or vibration motor activates to alert the user, adding an extra layer of security. Bluetooth & Mobile App Connectivity The backpack connects to a smartphone via Bluetooth Low Energy (BLE), allowing users to check their bag’s contents in real-time through a dedicated app. It also includes geo-fencing alerts, which notify the user if they leave the backpack behind in a public place, helping prevent loss. 
Auto-Zip & Auto-Lock Mechanism For added security and convenience, the backpack features motorized zippers and an electronic or magnetic locking system. It can automatically lock itself based on the user's location—securing in crowded areas and unlocking at home. This feature prevents unauthorized access while making it easy for the user to carry and access their belongings when needed. Criteria for Success Accurate RFID Tracking: The system must reliably detect and track RFID-tagged items in real-time, alerting users when an item is missing. Effective Theft Detection: The IMU sensors should correctly identify unauthorized movements and trigger alerts or alarms. Seamless Mobile App Integration: The app should provide real-time inventory tracking, geofencing alerts, and security notifications. Reliable Auto-Zip & Locking Mechanism: The motorized zippers and locks must function consistently and respond correctly to user-defined security settings. Low Power Consumption: The system should operate efficiently on a portable battery to last for extended periods without frequent recharging. |
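The missing-item alert described above reduces to a set difference: compare the tags seen in the latest RFID scan against the user's expected inventory and report what's absent. A minimal sketch; the tag IDs and item names are illustrative:

```python
# Sketch of the missing-item check: the reader's scan is compared against
# the user's expected set, and anything absent triggers an app alert.
# Tag IDs below are illustrative placeholders.

def missing_items(expected: dict, scanned: set) -> list:
    """Return human-readable names of expected tags not seen in the scan."""
    return sorted(name for tag, name in expected.items() if tag not in scanned)

expected = {"E200A1": "laptop", "E200B2": "wallet", "E200C3": "keys"}
scan = {"E200A1", "E200C3"}
print(missing_items(expected, scan))  # → ['wallet']
```

On hardware, `scanned` would come from a periodic reader sweep, and a non-empty result would be pushed to the BLE-connected app before the user leaves a geofenced location.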