# Snooze-Cruiser
# Team Members

* Jiachen Hu (hu86)

* Jizhen Chen (jizhenc2)

* Alex Wang (zw71)

# Problem

Many people suffer from sleep inertia, the grogginess and reduced alertness that immediately follows waking, which leads individuals to instinctively silence alarms without becoming fully awake. Traditional alarm clocks and smartphone alarms rely solely on audio, which can be easily ignored or dismissed while half asleep. Existing alternatives such as puzzle-based alarms or flying alarms are often ineffective, unsafe, or impractical in confined environments like dorm rooms and bedrooms.

The fundamental issue is that current alarm systems fail to reliably force physical engagement, allowing users to return to sleep without becoming fully alert. A more effective alarm must require the user to physically interact with the system in order to disable it.

# Solution

We propose Snooze-Cruiser, a two-wheeled differential-drive robotic alarm system that physically moves away from the user when the alarm time is reached. Instead of simply producing sound, the robot navigates around the room, forcing the user to get out of bed and chase it in order to silence the alarm.

The robot operates autonomously in a confined indoor space, using onboard sensors for obstacle avoidance and odometry-based localization to remain within a defined area. The alarm is disabled not by pressing a button, but by detecting when the robot has been picked up using inertial sensor data. This interaction ensures that the user must physically wake up and engage with the device.

The system is divided into motion control and navigation, localization and geofencing, alarm timing and audio, caught detection, and power management subsystems.

# Solution Components

## Subsystem 1: Motion Control and Navigation

Function:
This subsystem enables the robot to move autonomously, wander unpredictably, and avoid obstacles while remaining within a confined area.

Components:

* Microcontroller: STM32F446RCT6

* Motor Driver: DRV8833PWP dual H-bridge motor driver

* Motors: N20 micro gear motors with quadrature encoders (x2)

* Inertial Measurement Unit: MPU6050

* Obstacle Sensors: VL53L1X Time-of-Flight distance sensors (multiple)

Description:
The STM32 generates PWM signals to control the motors through the DRV8833 motor driver. Wheel encoders provide feedback for estimating speed and displacement. During alarm operation, the robot drives forward at a base speed and periodically introduces random heading changes. Obstacle avoidance is triggered when distance sensors detect nearby obstacles, causing the robot to turn away and resume wandering motion. Encoder and IMU data are fused to estimate the robot’s position relative to its starting point.
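
As a rough sketch of this control loop, the pure-logic step below maps the forward time-of-flight ranges and the current time to a wheel command. The speed and distance constants are illustrative assumptions, not tuned values, and the real firmware would call this from a periodic timer and feed the result to the PWM/encoder control layer.

```c
#include <stdint.h>
#include <stdlib.h>

typedef struct { float left, right; } WheelCmd;   /* wheel speeds, m/s */

#define BASE_SPEED  0.25f    /* forward cruise speed (assumed value)  */
#define TURN_SPEED  0.20f    /* wheel speed while pivoting (assumed)  */
#define OBSTACLE_MM 200      /* avoidance threshold in mm (assumed)   */

/* One control-loop step: takes the forward ToF ranges and the current
 * time, returns the wheel command.  *next_turn_ms is caller-owned
 * state holding the time of the next random heading change. */
WheelCmd wander_step(const uint16_t tof_mm[3], uint32_t now_ms,
                     uint32_t *next_turn_ms)
{
    /* 1. Obstacle avoidance has priority: pivot away in place. */
    for (int s = 0; s < 3; s++)
        if (tof_mm[s] < OBSTACLE_MM)
            return (WheelCmd){ TURN_SPEED, -TURN_SPEED };

    /* 2. Periodic random heading change keeps the path unpredictable. */
    if (now_ms >= *next_turn_ms) {
        *next_turn_ms = now_ms + 2000u + (uint32_t)(rand() % 3000);
        return (rand() & 1) ? (WheelCmd){  TURN_SPEED, -TURN_SPEED }
                            : (WheelCmd){ -TURN_SPEED,  TURN_SPEED };
    }

    /* 3. Otherwise cruise straight at the base speed. */
    return (WheelCmd){ BASE_SPEED, BASE_SPEED };
}
```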

## Subsystem 2: Localization and Soft Geofencing

Function:
This subsystem prevents the robot from leaving the intended operating area (e.g., a bedroom).

Components:

* Wheel Encoders (from Subsystem 1)

* IMU: MPU6050

Description:
Wheel encoder data and IMU measurements are fused using a Kalman Filter (or equivalent sensor fusion approach) to estimate the robot’s displacement from its starting location. A soft geofence is defined as a radius around this starting point. If the robot exceeds this radius, it enters a return-to-center behavior by rotating toward the estimated origin and driving inward until it re-enters the allowed area.
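
A minimal sketch of the dead-reckoning and geofence math follows. The wheel radius, encoder resolution, and fence radius are assumed values for illustration, and the full Kalman filter is reduced here to taking the fused IMU yaw as the heading.

```c
#include <stdint.h>
#include <math.h>

typedef struct { float x, y, theta; } Pose;   /* metres, radians */

#define PI_F           3.14159265f
#define WHEEL_RADIUS_M 0.015f   /* N20 wheel radius (assumed)             */
#define TICKS_PER_REV  1400.0f  /* encoder counts per wheel rev (assumed) */
#define GEOFENCE_R_M   2.0f     /* soft geofence radius (assumed)         */

/* Dead-reckoning update: advance the pose using encoder tick deltas
 * for distance and the fused IMU yaw for heading. */
void odom_update(Pose *p, int32_t dticks_l, int32_t dticks_r, float imu_yaw)
{
    float dl = 2.0f * PI_F * WHEEL_RADIUS_M * (float)dticks_l / TICKS_PER_REV;
    float dr = 2.0f * PI_F * WHEEL_RADIUS_M * (float)dticks_r / TICKS_PER_REV;
    float d  = 0.5f * (dl + dr);   /* forward distance this step */

    p->theta = imu_yaw;            /* trust the fused heading    */
    p->x += d * cosf(p->theta);
    p->y += d * sinf(p->theta);
}

/* Geofence check: returns 1 and the heading back to the origin when
 * the robot is outside the allowed radius, so the caller can rotate
 * toward it and drive inward until the breach clears. */
int geofence_breached(const Pose *p, float *return_heading)
{
    if (hypotf(p->x, p->y) <= GEOFENCE_R_M)
        return 0;
    *return_heading = atan2f(-p->y, -p->x);   /* points at (0, 0) */
    return 1;
}
```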

## Subsystem 3: Alarm Timing and Audio Output

Function:
This subsystem handles timekeeping and audible alarm generation.

Components:

* Microcontroller: STM32F446RCT6

* Audio Amplifier: PAM8301AAF

* Speaker

Description:
The STM32 maintains a real-time counter for alarm scheduling. When the preset alarm time is reached, the microcontroller simultaneously enables the audio amplifier and activates the motion subsystem. The alarm sound continues until a valid caught event is detected.
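
The firing logic itself is small. The sketch below assumes a one-second RTC tick and uses hypothetical enable_audio()/start_motion() callbacks as stand-ins for the real amplifier-enable line and motion-subsystem entry point; both fire in the same tick so sound and movement start together.

```c
#include <stdint.h>
#include <stdbool.h>

/* Alarm state shared by the timing, audio, and motion code. */
typedef struct {
    uint32_t alarm_s;   /* alarm time, seconds since midnight */
    bool     armed;
    bool     ringing;
} Alarm;

/* Called once per second from the RTC tick.  When the preset time is
 * reached, the audio amplifier and the motion subsystem are enabled
 * in the same pass; the alarm then rings until a caught event. */
void alarm_tick(Alarm *a, uint32_t now_s,
                void (*enable_audio)(bool), void (*start_motion)(bool))
{
    if (a->armed && !a->ringing && now_s >= a->alarm_s) {
        a->ringing = true;
        enable_audio(true);
        start_motion(true);
    }
}
```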

## Subsystem 4: Caught Detection (User Interaction)

Function:
This subsystem detects when the robot has been picked up by the user and disables the alarm.

Components:

* IMU: MPU6050

* Wheel Encoders

Description:
Caught detection is performed by analyzing IMU acceleration and vibration data in combination with wheel encoder feedback. A caught event is identified by sudden changes in acceleration magnitude, high-frequency vibrations from human handling, and inconsistencies between wheel motion and measured acceleration (indicating loss of ground contact). Once confirmed, the system immediately stops motor output and silences the alarm.
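
One possible shape for this detector is sketched below, with assumed thresholds and persistence window; the real classifier would likely need tuning on recorded handling data.

```c
#include <stdint.h>
#include <stdbool.h>
#include <math.h>

#define GRAVITY    9.81f    /* m/s^2 */
#define ACCEL_DEV  3.0f     /* deviation from 1 g that counts (assumed) */
#define SLIP_MPS   0.10f    /* wheel/body speed mismatch (assumed)      */
#define HOLD_MS    300u     /* event must persist this long (assumed)   */

/* Flags a caught event when the accelerometer magnitude stays far
 * from 1 g, or the wheels report motion the body does not follow
 * (loss of ground contact).  *hold_ms is caller-owned persistence
 * state, accumulated across control-loop iterations. */
bool caught_detect(float ax, float ay, float az,
                   float wheel_speed, float body_speed,
                   uint32_t dt_ms, uint32_t *hold_ms)
{
    float dev  = fabsf(sqrtf(ax * ax + ay * ay + az * az) - GRAVITY);
    bool  slip = fabsf(wheel_speed - body_speed) > SLIP_MPS;

    if (dev > ACCEL_DEV || slip)
        *hold_ms += dt_ms;   /* candidate event: accumulate time */
    else
        *hold_ms = 0;        /* back to normal: reset            */

    return *hold_ms >= HOLD_MS;  /* caller stops motors, mutes alarm */
}
```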

## Subsystem 5: Power Management

Function:
This subsystem supplies and regulates power for the robot.

Components:

* Battery Charger IC: MCP73844

* Rechargeable Battery

* Voltage Regulation Circuitry

Description:
The battery supplies power to the MCU, sensors, motor driver, and audio system. The MCP73844 manages battery charging. Voltage regulation ensures stable operation during high-current events such as motor startup.

# Criterion for Success

The project will be considered successful if the following objective criteria are met:

Timed Activation:
The alarm triggers within ±X seconds of the programmed time.

Synchronized Operation:
Robot motion and alarm audio start simultaneously upon alarm activation.

Autonomous Motion:
The robot moves continuously without user intervention during alarm operation.

Obstacle Avoidance:
The robot avoids obstacles placed in its path without repeated collisions.

Confined Operation:
The robot remains within a predefined operating radius and returns toward the starting location when the boundary is exceeded.

Caught Detection:
When picked up by a user, the robot reliably stops motion and audio within a short time window.

# Decentralized Systems for Ground & Aerial Vehicles (DSGAV)


# Team Members

* Yixiao Sun (yixiaos3)

* Mingda Ma (mingdam2)

* Jialiang Zhang (jz23)

# Problem Statement

Autonomous delivery over drone networks has become a new trend that can save a tremendous amount of labor. However, it is very difficult to scale up because multi-rotor collaboration is inefficient, especially when the drones are carrying payloads. To actually deploy such a system in big cities, we could take advantage of the large ground vehicle networks that already exist with rideshare companies like Uber and Lyft. The roof of an automobile has plenty of space to hold regular-size packages with magnets, and the drone network can then optimize for flight time and efficiency while factoring in ground vehicle plans. While dramatically increasing delivery coverage and efficiency, such a strategy raises the challenging problem of docking drones onto moving ground vehicles.

# Solution

We aim to tackle a particular component of this project given the scope and time limitations. We will implement a decentralized multi-agent control system that synchronizes a ground vehicle and a drone when they are in close proximity. Assumptions such as knowledge of vehicle states will be made, since this project aims at a proof of concept of one core challenge; as we progress, we aim to lift as many of those assumptions as possible. The infrastructure of the lab, drone, and ground vehicle will be provided by our kind sponsor Professor Naira Hovakimyan. When the drone approaches the target and gains visual contact with the ground vehicle, it will automatically send a docking request through an RF module. The RF receiver on the vehicle will then automatically turn on its assistant devices, such as specific LED light patterns, which aid motion synchronization between the ground and aerial vehicles. The ground vehicle will also periodically send its locally planned paths to the drone so that the drone can predict the ground vehicle's trajectory a couple of seconds into the future. This prediction helps the drone stay within close proximity of the ground vehicle by optimizing against a reference trajectory.
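
One simple way to realize this prediction is to interpolate the broadcast plan and fall back to constant-velocity extrapolation beyond its horizon. The sketch below assumes a minimal timestamped-waypoint message, which is not specified in this proposal, and a path of at least two waypoints.

```c
#include <stddef.h>

typedef struct { float t, x, y; } Waypoint;   /* seconds, metres */

/* Predicts the ground vehicle's position at time t from its broadcast
 * local plan: linear interpolation inside the planned horizon, then
 * constant-velocity extrapolation past the last waypoint. */
Waypoint predict_gv(const Waypoint *path, size_t n, float t)
{
    /* Inside the planned horizon: interpolate between waypoints. */
    for (size_t i = 0; i + 1 < n; i++) {
        if (t <= path[i + 1].t) {
            float a = (t - path[i].t) / (path[i + 1].t - path[i].t);
            return (Waypoint){ t,
                               path[i].x + a * (path[i + 1].x - path[i].x),
                               path[i].y + a * (path[i + 1].y - path[i].y) };
        }
    }

    /* Past the horizon: extrapolate from the last segment's velocity. */
    const Waypoint *p = &path[n - 2], *q = &path[n - 1];
    float vx = (q->x - p->x) / (q->t - p->t);
    float vy = (q->y - p->y) / (q->t - p->t);
    return (Waypoint){ t, q->x + vx * (t - q->t), q->y + vy * (t - q->t) };
}
```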

### The hardware components include:

Provided by Research Platforms

* A drone

* A ground vehicle

* A camera

Developed by our team

* An LED-based docking indicator

* RF communication modules (XBee)

* Onboard compute and communication microprocessor (STM32F4)

* Standalone power source for RF module and processor

# Required Circuit Design

We will integrate the power source, the RF communication module, and the LED tracking assistant together with our microcontroller on our PCB. The circuit will also automatically trigger the tracking assistant when a docking request is received. This circuit is designed specifically to demonstrate the drone's ability to precisely track and dock onto the ground vehicle.

# Criterion for Success -- Stages

1. When the ground vehicle is moving slowly in a straight line, the drone can autonomously take off from an arbitrary location and end up following it in close proximity.

2. The drone remains in close proximity while the ground vehicle turns slowly (or navigates arbitrarily at low speed).

3. The drone can dock autonomously onto the ground vehicle while it moves slowly in a straight line.

4. The drone can dock autonomously onto the ground vehicle while it turns slowly.

5. Increase the speed of the ground vehicle and successfully perform tracking and/or docking.

6. The drone can pick up packages while flying synchronously with the ground vehicle.

We consider the project complete at stage 3. The stages after that are advanced features, depending on actual progress.
