
# Project 62: AI-Nutritious Culinary Assistant

Griffin Kelley, Jackson Brown, Tony Liu
TA: Aniket Chatterjee
Documents: design_document1.pdf, proposal1.pdf
Team Members:
- Jackson Brown (jcb10)
- Kadin Shaheen (kadinas2) **Actively looking for a 3rd Teammate as Kadin dropped out.**
- Tony Liu (zikunl2)


# Problem
Processed foods increasingly rely on chemical flavor additives, which have been associated with a 12%-32% higher cancer risk [1]. At the same time, cooking from scratch often intimidates new chefs, so students and working professionals fall back on convenient but unwholesome meals. Most available ‘smart cooking’ tools do not provide a real-time experience that guides users from raw ingredients to a finished dish, which reduces how much users actually learn. Recipes are also rarely designed around the ingredients a user already has: food goes to waste, and recipe creation is an expert skill that is difficult to customize. A healthy diet is important for productivity and long-term health, yet it remains difficult to achieve.

# Solution
We propose an AI-Nutritious Culinary Assistant that recognizes available ingredients and generates a personalized recipe with interactive, step-by-step guidance. Using the Meta Quest 3 as the user interface and sensor front-end, the system streams video and voice commands to an edge vision processor running an ingredient recognition pipeline. In addition to vision, the device integrates an environmental sensor module that measures ingredient weight (for portion verification) and ambient temperature (for context/safety telemetry). Finally, the appliance includes a circular rotating seasoning carousel driven by a stepper motor for proportional seasoning action and a servo-actuated gate for controlled dispensing, enabling closed-loop “dispense to target grams” assistance during cooking.

# Solution Components


## Subsystem 1 - VR Headset Sensor Platform (Software)
This subsystem uses the Meta Quest 3 as the primary user interface and sensing front-end. The headset’s built-in RGB cameras capture the cooking scene for ingredient recognition, and its microphone captures the user’s wake word and spoken prompts. The headset also serves as the output display, presenting step-by-step recipe instructions as an AR/VR overlay. Captured camera frames and voice transcripts/commands are transmitted to the Vision Processor subsystem for inference and planning, while the returned recipe steps and alerts are rendered back in the headset.
### Parts:
- Meta Quest 3 (VR headset with RGB cameras + microphone + display)
- Wireless link (Wi-Fi) between Quest and Jetson (stream frames + commands, receive instructions)

## Subsystem 2 - Vision Processor (Software)
This subsystem performs the core perception and recipe-planning computation on the edge compute unit. It receives camera frames from the Meta Quest 3 and first detects/segments candidate ingredient regions using a real-time model (YOLO for bounding boxes, or FastSAM for masks). Each candidate region is then cropped (or masked) and passed through a CLIP-style vision encoder to generate an image embedding. In parallel, the system maintains a library of text embeddings for ingredient labels (e.g., “tomato”, “onion”, “spinach”). By comparing image embeddings to text embeddings (cosine similarity), the processor assigns the most likely ingredient label to each detected region, enabling more flexible recognition than closed-set detection alone. The final output is a structured ingredient list (label + confidence + location/mask), which is then provided to the LLM agent to generate step-by-step recipes and instructions that are sent back to the headset.


### Parts:
- NVIDIA Jetson Nano (edge compute unit for model inference + agent logic)
- Candidate region model: YOLOv8/YOLO11 (detection/segmentation, bounding boxes or masks) or FastSAM (mask proposals)
- Vision-language classifier: CLIP / OpenCLIP / MobileCLIP (region embedding + text embedding matching)
- LLM agent (recipe selection + instruction generation using detected ingredients)
- Communication interface: Wi-Fi (Quest ↔ Jetson) for frames/instructions; optional UART/I2C/USB (ESP32 ↔ Jetson) for weight/safety telemetry
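The region-label matching described above reduces to a nearest-neighbor search in embedding space. A minimal sketch, using toy 4-D vectors in place of real CLIP/OpenCLIP embeddings (which are typically 512-D and produced by the image/text encoders):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Cosine similarity between one image embedding and each row of b."""
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return b @ a

def classify_region(region_embedding: np.ndarray,
                    text_embeddings: np.ndarray,
                    labels: list[str]) -> tuple[str, float]:
    """Assign the ingredient label whose text embedding is closest."""
    sims = cosine_similarity(region_embedding, text_embeddings)
    best = int(np.argmax(sims))
    return labels[best], float(sims[best])

# Toy 4-D embeddings stand in for real CLIP vectors.
labels = ["tomato", "onion", "spinach"]
text_embeddings = np.array([
    [1.0, 0.1, 0.0, 0.0],   # "tomato"
    [0.0, 1.0, 0.1, 0.0],   # "onion"
    [0.0, 0.0, 1.0, 0.2],   # "spinach"
])
region = np.array([0.9, 0.2, 0.05, 0.0])  # crop embedding near "tomato"
label, score = classify_region(region, text_embeddings, labels)
print(label)  # → tomato
```

The same loop runs once per detected region, producing the (label + confidence) entries of the structured ingredient list.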

## Subsystem 3 — Environmental Sensor Subsystem (Weight + Room Temperature)
This subsystem measures ingredient mass for closed-loop dispensing, plus ambient temperature for environmental context (room temperature can affect recipe steps and timing) and basic fire-hazard checks. The ESP32 reads both sensors, filters and calibrates the data (tare + scale factor), and forwards the measurements to the main controller.

### Sensors/Components
- Load cell(s): 5 kg single-point load cell (e.g., TAL220B 5 kg) or 4× half-bridge load cells (for a round platform)
- Load cell ADC/amplifier: HX711 24-bit load cell amplifier breakout/module
- Ambient temperature sensor: SHT31-DIS (Sensirion SHT31-DIS-B) or BME280 (Bosch BME280) over I2C
- MCU interface: ESP32 reads the HX711 (GPIO clock/data) and the temperature sensor (I2C)
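The tare + scale-factor step can be sketched as a small calibration helper. The raw counts below are placeholder values; on the real ESP32 the samples would come from the HX711 driver:

```python
class ScaleCalibration:
    """Convert raw HX711 counts to grams via a tare offset and scale factor.

    Assumed procedure (typical for HX711 setups):
    1. tare() with the empty platform to record the zero offset.
    2. calibrate() with a known reference mass to fix counts-per-gram.
    """

    def __init__(self) -> None:
        self.offset = 0.0          # raw counts with empty platform
        self.counts_per_gram = 1.0

    def tare(self, raw_samples: list[int]) -> None:
        self.offset = sum(raw_samples) / len(raw_samples)

    def calibrate(self, raw_samples: list[int], known_grams: float) -> None:
        mean = sum(raw_samples) / len(raw_samples)
        self.counts_per_gram = (mean - self.offset) / known_grams

    def grams(self, raw: int) -> float:
        return (raw - self.offset) / self.counts_per_gram

cal = ScaleCalibration()
cal.tare([8_400, 8_396, 8_404])                        # empty platform
cal.calibrate([108_400, 108_396, 108_404], known_grams=100.0)
print(round(cal.grams(58_400), 1))                     # → 50.0
```

Averaging several samples for tare/calibration doubles as the simple noise filter mentioned above.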

## Subsystem 4 — Battery Subsystem (Rechargeable Power for Portability)
This subsystem powers the device from a rechargeable battery, supports charging via USB, and generates stable rails for logic and actuation. It provides regulated 3.3 V for ESP32/sensors and 5–6 V for servo/actuators while preventing brownouts during motor current spikes.

### Sensors/Components
- Battery pack: 1-cell LiPo, 3.7 V (e.g., 2000–5000 mAh, JST-PH)
- Battery charger: TP4056 1S Li-ion charger module (protection variant preferred) or MCP73831 (Microchip MCP73831T)
- 3.3 V regulator (logic rail): buck converter module (e.g., MP1584EN) set to 3.3 V, or an LDO if the current draw is small
- 5–6 V regulator (servo rail): buck converter module (e.g., MP1584EN) set to 5.0–6.0 V
- Power monitoring (optional but helpful): MAX17048 LiPo fuel gauge (MAX17048G+U) over I2C
- Protection/robustness: power switch + fuse/polyfuse + bulk capacitors near the servo rail
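As a sketch of what the optional fuel-gauge telemetry involves, the helpers below assume the MAX17048's register scaling (VCELL LSB = 78.125 µV, SOC in units of 1/256 %); confirm against the datasheet before relying on them:

```python
# Convert raw MAX17048 register words to physical units. The scaling here
# is an assumption to verify against the datasheet:
# VCELL (reg 0x02) LSB = 78.125 uV; SOC (reg 0x04) in units of 1/256 %.
VCELL_LSB_VOLTS = 78.125e-6

def vcell_volts(raw: int) -> float:
    """16-bit VCELL register word -> cell voltage in volts."""
    return raw * VCELL_LSB_VOLTS

def soc_percent(raw: int) -> float:
    """16-bit SOC register word -> state of charge in percent."""
    return raw / 256.0

# Example words for a 1S LiPo mid-charge (illustrative values only).
print(round(vcell_volts(0xC350), 3))  # 0xC350 = 50000 -> 3.906 V
print(round(soc_percent(0x5080), 2))  # 0x5080 = 20608 -> 80.5 %
```

The ESP32 would read these two words over I2C and use a low-SOC threshold to warn before a brownout.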

## Subsystem 5 — Rotating Carousel Ring + Dispenser Subsystem (One-Piece Circular Device)
This subsystem handles seasoning and provides the “lazy Susan” mechanism: a circular rotating ring that indexes ingredient pods to a fixed dispense station above the scale. A stepper motor rotates the ring to the selected pod; a servo opens a gate to dispense into the center bowl. The ESP32 controls indexing, homing, and dispensing, using the scale feedback to stop at a target mass.

### Sensors/Components
- Rotation motor: NEMA 17 stepper motor (e.g., 42BYGH-class)
- Stepper driver: DRV8825 or A4988 stepper driver module
- Homing/index sensor: A3144 Hall-effect sensor + small neodymium magnet (defines “slot 0”)
- Dispense actuator: MG90S micro servo (metal gear) or SG90 (lighter duty)
- Mechanical drive: GT2 timing belt + pulley set (e.g., GT2 6 mm belt, 20T pulley) or friction wheel drive
- Ring support: lazy-Susan bearing (turntable bearing) or printed rail + small rollers/V-wheels
- Dispense hardware: fixed chute + passive pod gate (flap/valve) engaged by the servo at the station
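The indexing and closed-loop dispensing logic might be sketched as follows. Slot count, step counts, and flow rate are placeholder assumptions, and the simulated scale stands in for the HX711 feedback:

```python
# Minimal closed-loop "dispense to target grams" sketch. Hardware calls
# (stepper, servo, scale) are simulated stubs; on the real ESP32 they would
# drive the DRV8825, the MG90S gate servo, and read the HX711 scale.
STEPS_PER_REV = 200      # NEMA 17 full steps (assumed, no microstepping)
N_SLOTS = 8              # pods on the carousel ring (assumed)

def steps_to_slot(current_slot: int, target_slot: int) -> int:
    """Shortest signed step count to index the ring to the target pod."""
    delta = (target_slot - current_slot) % N_SLOTS
    if delta > N_SLOTS // 2:
        delta -= N_SLOTS          # go the short way around
    return delta * STEPS_PER_REV // N_SLOTS

class SimScale:
    """Stand-in for the scale: gains ~0.5 g per poll while the gate is open."""
    def __init__(self) -> None:
        self.grams = 0.0
        self.gate_open = False
    def read(self) -> float:
        if self.gate_open:
            self.grams += 0.5
        return self.grams

def dispense(scale: SimScale, target_grams: float,
             overshoot_margin: float = 0.5) -> float:
    """Open the gate, poll the scale, close slightly early to limit overshoot."""
    scale.gate_open = True                      # servo opens the pod gate
    while scale.read() < target_grams - overshoot_margin:
        pass                                    # real firmware would sleep here
    scale.gate_open = False                     # servo closes the gate
    return scale.grams

print(steps_to_slot(0, 3))   # → 75  (3 slots forward)
print(steps_to_slot(1, 7))   # → -50 (2 slots backward, short way)
```

Homing against the Hall-effect "slot 0" mark before the first move would zero out any accumulated step error.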

# Criterion For Success

We would like to define the criteria for success with numerical benchmarks:
- The weight sensor should have a resolution of 1 gram across a 0 to 100 gram measurement range.
- A fully charged battery should provide approximately 20 minutes of runtime.
- The rotating carousel dispenser should rotate to the desired slot within an error of 2.
- Ingredient classification and localization accuracy ≥ 85% at ≥ 10 FPS real-time performance.

End to end, the pipeline of feeding input from the VR headset (visual image stream) and hardware sensors to the vision processor, and returning a contextual recipe to the VR headset, should work as a complete loop.


## References:

[1]: Hasenböhler, Anaïs et al. “Intake of food additive preservatives and incidence of cancer: results from the NutriNet-Santé prospective cohort.” BMJ (Clinical research ed.) vol. 392 e084917. 7 Jan. 2026, doi:10.1136/bmj-2025-084917






Iron Man Mouse

Jeff Chang, Yayati Pahuja, Zhiyuan Yang

Featured Project

# Problem:

Being an ECE student means there is a high chance we are going to sit in front of a computer for most of the day, especially during COVID times. This may lead to neck and lower back issues from a prolonged sedentary lifestyle. Therefore, it would be beneficial for us to get up and stretch every now and then. However, taking a break to exercise may distract us from working or studying, and it can take some time to refocus. Controlling the mouse with arm movements or hand gestures would let us get up and keep working at the same time. It is similar to the movie Iron Man when Tony Stark is working, but without the hologram.

# Solution Overview:

The device would have a wristband portion that acts as the tracker of the mouse pointer (implemented with an accelerometer and perhaps optical sensors). A set of 3 finger cots with gyroscopes or accelerometers is attached to the wristband. These sensors as a whole would send data over Bluetooth to a black-box device connected to the computer by USB. The box would contain circuits to convert these translational/rotational data into mouse or trackpad movements, with possible custom operations. Alternatively, we could connect the wristband to the PC directly over Bluetooth; in this case, a device driver on the OS is needed for the project to work.

# Solution Components:

Sensors (finger cots and wrist band):

1. 3-axis accelerometer attached to the wrist band portion of the device to collect translational movement (for mouse cursor tracking)

2. gyroscopes attached to the 3 finger cot portions to collect angular motion when the user bends their fingers at different angles (for different clicking/zoom-in/etc. operations)

3. (optional) optical sensors to help with accuracy if the accelerometer is not accurate enough. We could have infrared emitters set up around the screen and optical sensors on the wristband to help pinpoint cursor location.

4. (optional) flex sensors could also be used for finger cots to perform clicks in case the gyroscope proves to be inaccurate.
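To illustrate one way the accelerometer data could drive the cursor, the sketch below integrates one-axis samples into pixel deltas with a dead band and velocity decay to tame drift. All constants are placeholder tuning values, not measured ones, and real firmware would also need gravity compensation from the gyroscope:

```python
# Illustrative mapping from wrist accelerometer samples to cursor deltas.
# Raw double integration drifts badly, so this sketch adds a dead band and
# per-sample velocity damping. All constants are placeholder tuning values.
DT = 0.01            # sample period, s (100 Hz, assumed)
DEAD_BAND = 0.05     # m/s^2, ignore sensor noise below this
DECAY = 0.9          # per-sample velocity damping against drift
PIXELS_PER_METER = 4000.0

def cursor_deltas(accel_x: list[float]) -> list[int]:
    """One-axis accel samples (m/s^2) -> per-sample cursor pixel deltas."""
    deltas, velocity = [], 0.0
    for a in accel_x:
        if abs(a) < DEAD_BAND:
            a = 0.0                         # dead band: reject noise at rest
        velocity = DECAY * velocity + a * DT
        deltas.append(int(velocity * DT * PIXELS_PER_METER))
    return deltas

# A short push to the right, then stillness: cursor moves, then settles.
samples = [2.0, 2.0, 2.0, 0.0, 0.0, 0.0]
print(cursor_deltas(samples))
```

This is where the optional optical sensors would help: an absolute position fix would let the firmware reset the drifting velocity estimate.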

Power:

Lithium-ion battery with USB charging

Transmitter component:

1. A microcontroller to pre-process the data received from the 4 sensors, integrating and synchronizing the readings before transmission.

2. A Bluetooth chip that transmits the data to either the blackbox or the PC directly.

Receiver component:

1. Plan A: A box plugged into a USB-A port on the PC. It has a Bluetooth chip to receive data from the wristband, and a microcontroller to process the data into USB human interface device (HID) signals.

2. Plan B: the wristband is directly connected to the PC and we develop a device driver on the PC to process the data.
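Either way, the receiver ultimately has to emit standard HID mouse reports. A minimal sketch of packing the 3-byte boot-protocol mouse report (one button-bitmap byte, then signed 8-bit X and Y deltas); the MCU would send these bytes over its USB HID endpoint:

```python
import struct

# Pack a standard 3-byte USB HID boot-protocol mouse report:
# byte 0 = button bitmap, bytes 1-2 = signed 8-bit X and Y deltas.
BUTTON_LEFT, BUTTON_RIGHT, BUTTON_MIDDLE = 0x01, 0x02, 0x04

def mouse_report(buttons: int, dx: int, dy: int) -> bytes:
    dx = max(-127, min(127, dx))   # clamp deltas to the int8 report range
    dy = max(-127, min(127, dy))
    return struct.pack("<Bbb", buttons & 0x07, dx, dy)

print(mouse_report(BUTTON_LEFT, 10, -5).hex())  # → '010afb'
```

Plan A builds these reports on the blackbox MCU; Plan B would instead have the PC driver translate the raw sensor stream into equivalent OS input events.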

# Criterion for Success:

1. Basic functionalities supported (left click, right click, scroll, cursor movement)

2. Advanced functionalities supported (zoom in/out, custom operations, e.g. volume control)

3. Performance (accuracy & response time)

4. Physical qualities (easy to wear, durable, and battery life)