# 21: MULTI-SENSOR MOTION DETECTOR FOR RELIABLE LIGHTING CONTROL

TA: Shiyuan Duan
Team Members:
- Joseph Paxhia (jpaxhia2)
- Siddarth Boinpally (sb72)
- Lukas Ping (lukasp2)

**PROBLEM:**

In offices, classrooms, and lecture halls worldwide, motion sensors are commonly used to automate lighting control. While convenient, these systems share a critical flaw: lights often switch off when people remain in the room but are relatively still—such as when typing, reading, or watching a presentation. This leads to frustration, disrupts productivity, and creates an inefficient work environment. The root of the issue lies in the reliance on Passive Infrared (PIR) sensors, which detect the infrared radiation emitted by warm bodies. Although effective for detecting large movements, PIR sensors struggle with micromotions, are prone to false triggers, and rely on fixed timeout settings. As a result, they fail to consistently recognize human presence.

**SOLUTION:**

Our approach introduces a multi-stage verification system to improve reliability while preserving the strengths of current technology. PIR sensors remain useful for their fast response to initial entry and larger movements, so we retain them for triggering lights when someone walks into a room. To overcome their limitations, we integrate a millimeter-wave (mmWave) radar sensor, which excels at detecting fine micromotions such as breathing or subtle hand movements.
This introduces the following subsystems:
- Control and Processor
- Sensing System
- Lighting Interface
- Power

**Subsystem #1: Control and Processor**

Primary Responsibilities:

- Take in the sensor data from the PIR and the mmWave sensors.
- Process this data and decide whether the lights should stay on, gradually turn on, dim, or stay off.
- Send this decision out.

The control and processor subsystem will take in the PIR and mmWave sensor data, determine whether the lights should be off, on, gradually illuminating, or dimming, and output this decision as a PWM signal (setting the brightness of the lights) from the microprocessor, which the lighting subsystem uses to drive the lights accurately. Combining the two sensors with decision logic reduces false positives and false negatives caused by the surrounding environment. An STM32 microprocessor will be used, as it can process both sensor streams and is well suited to the filtering and dimming tasks.
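As a rough illustration of the PWM output stage, the brightness decision can be mapped to an STM32 timer compare value. The resolution (1000 steps) and register naming below are placeholders, not final design values:

```c
#include <stdint.h>

/* Map a brightness percentage (0-100) to a timer capture/compare value.
 * An auto-reload value of 999 gives 1000 PWM steps; on STM32 timers a
 * compare value greater than the auto-reload value yields 100% duty. */
#define PWM_ARR 999u

static uint32_t brightness_to_ccr(uint32_t percent)
{
    if (percent > 100u) percent = 100u;        /* clamp out-of-range input */
    return (percent * (PWM_ARR + 1u)) / 100u;  /* 0 -> 0 (off), 100 -> full on */
}
```

In firmware, the returned value would be written to the timer's compare register each time the decision logic updates the target brightness.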

**Subsystem #2: Sensing System**

Primary responsibilities:

PIR: instant “walk-in” detection, coarse motion, low power standby.

mmWave: micromotion detection (breathing, typing), presence confirmation, and false-trigger suppression.

We will use the PIR for fast wake and coarse motion and the mmWave for verification/hold and micromotion detection. Using both avoids PIR false-offs while still providing near-instant illumination.

Basic state machine / functionality:
1. Idle / Vacant: PIR = low, mmWave = no-presence → lights off, system in low-power monitoring.
2. Wake / Entrance: PIR triggers → gradual illumination, start hold-timer and mmWave high-sensitivity window.
3. Occupied (confirmed): mmWave confirms presence (micro-motion or persistent reflection pattern) OR PIR continues to detect motion → remain ON; reset hold timers on detections.
4. Low-activity (PIR no longer seeing motion): PIR goes quiet → enter mmWave verification window: if mmWave detects micro-motion within the window, remain in Occupied. If mmWave sees nothing for N_verify seconds → move to Vacant.
5. mmWave & PIR quiet → lights off, enter low-power scans at low duty.
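The state machine above can be sketched in C as follows. The tick-based verification window and the state names are illustrative assumptions; the final firmware would also drive the hold timers and the gradual-illumination ramp:

```c
#include <stdbool.h>
#include <stdint.h>

typedef enum { ST_VACANT, ST_WAKE, ST_OCCUPIED, ST_VERIFY } state_t;

/* Length of the mmWave verification window, in scheduler ticks
 * (placeholder: 30 ticks stands in for N_verify seconds). */
#define N_VERIFY_TICKS 30u

typedef struct {
    state_t  state;
    uint32_t verify_ticks;  /* ticks spent in the verification window */
} presence_fsm_t;

/* One tick of the presence state machine.
 * pir / mmwave are the debounced sensor outputs for this tick. */
static state_t fsm_step(presence_fsm_t *f, bool pir, bool mmwave)
{
    switch (f->state) {
    case ST_VACANT:
        if (pir) f->state = ST_WAKE;           /* walk-in: begin gradual illumination */
        break;
    case ST_WAKE:
        f->state = ST_OCCUPIED;                /* hold lights on, await confirmation */
        break;
    case ST_OCCUPIED:
        if (!pir && !mmwave) {                 /* both quiet: open verification window */
            f->state = ST_VERIFY;
            f->verify_ticks = 0;
        }
        break;
    case ST_VERIFY:
        if (pir || mmwave) {
            f->state = ST_OCCUPIED;            /* micromotion seen: still occupied */
        } else if (++f->verify_ticks >= N_VERIFY_TICKS) {
            f->state = ST_VACANT;              /* window elapsed: dim and shut off */
        }
        break;
    }
    return f->state;
}
```

The lighting interface would then map ST_WAKE to the brightening ramp, ST_OCCUPIED/ST_VERIFY to full brightness, and the transition to ST_VACANT to the dimming sequence.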

**Subsystem #3: Lighting Interface**

Primary responsibilities:

- Gradually turn lights on and off
- Keep lights on while the room is occupied

Our gradual illumination will employ a 0-10 V analog dimmer, implemented as a subcircuit block. This is a widely used and accepted lighting control interface in which the LED driver reads a DC voltage to set brightness. The driver itself still runs on AC mains power.

The subcircuit comprises these components:

- Microcontroller - to generate a high-frequency PWM (pulse-width modulation) signal proportional to the desired brightness
- Filter - to transform the PWM into a DC voltage
- Op-amp buffer/amplifier - since our STM32 microcontroller outputs up to 3.3 V and we need to generate up to 10 V DC
- Protection components - resistors and diodes used as needed
- Output to the LED driver

This 0-10 V analog dimmer can also keep the lights on by having the microcontroller generate a constant voltage above ~1.0 V. Once people leave the room and the controller no longer detects anyone, the inverse can be done to gradually turn the lights off (10 V down to 0 V).
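The PWM-filter-amplifier chain above amounts to a simple linear mapping from duty cycle to dimmer voltage. The sketch below assumes 3.3 V logic, an RC filter that fully averages the PWM, and an op-amp gain chosen so that 100% duty gives 10 V; the component values are illustrative, not a final design:

```c
/* Expected 0-10 V dimmer output for a given PWM duty cycle in [0,1].
 * Assumptions: 3.3 V logic, ideal averaging filter, and a non-inverting
 * op-amp gain of 10/3.3 (~3.03, e.g. set by 1 + Rf/Rg). */
#define VLOGIC     3.3
#define OPAMP_GAIN (10.0 / VLOGIC)

static double dimmer_volts(double duty)
{
    if (duty < 0.0) duty = 0.0;           /* clamp duty to valid range */
    if (duty > 1.0) duty = 1.0;
    return duty * VLOGIC * OPAMP_GAIN;    /* filtered DC level, then amplified */
}
```

With this mapping, the ~1.0 V "keep on" threshold corresponds to roughly 10% duty cycle.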

Note: We will need to work out a suitable slew rate for the brightening and dimming. We are considering three different rates:

1. Gradual brightening - should take around 0.5-1 second for the lights to go from off to the desired brightness
2. First dimming - takes around 10 seconds when the sensor first detects no people in the room.
3. Final shutoff - takes around 2 seconds to fade fully off. This occurs after the first dimming completes and the sensor still detects no activity in the room.
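The slew-rate math reduces to dividing the ramp time by the PWM update period. A minimal sketch, assuming a 10 ms update period (a placeholder) and a full 0-100% sweep:

```c
#include <stdint.h>

/* Period between brightness updates, in milliseconds (placeholder). */
#define UPDATE_MS 10u

/* Number of brightness updates that fit in a ramp of ramp_ms. */
static uint32_t updates_for_ramp(uint32_t ramp_ms)
{
    return ramp_ms / UPDATE_MS;
}

/* Per-update brightness step (in percent) to sweep 0-100% in ramp_ms.
 * The partial "first dimming" ramp would scale this by its smaller range. */
static double step_percent(uint32_t ramp_ms)
{
    return 100.0 / (double)updates_for_ramp(ramp_ms);
}
```

For example, the 1-second brightening ramp would need 100 updates of 1% each, while the 10-second first dimming would use much finer 0.1% steps over its range.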

**Subsystem #4: Power**

1. Take power from the fixture’s AC mains (120/230 VAC).
2. Use a dedicated isolated SMPS / LED-driver tap or internal LED driver rails to create regulated DC rails for electronics (3.3 V, 5 V, and optionally 1.8 V).
3. Keep the LED power path (the high-power LED driver) electrically separate from the low-voltage sensing electronics; provide good isolation and filtering between them.

The system is powered from AC mains, which feeds the LED driver to provide constant-current illumination and also supports mains sensing and surge protection components such as fuses and MOVs. All low-voltage electronics—including the MCU, mmWave radar module, PIR sensor, and any communications modules (Wi-Fi/BLE)—operate on DC, typically 3.3 V, with some modules optionally requiring 1.8 V or 5 V. The MCU manages these peripherals and interfaces with sensors using logic-level signals, ensuring safe and reliable operation of the sensing and control system.

**Criterion for Success**

The light should gradually turn on when somebody enters a room, and the turn-on process should begin with minimal delay. While the light is on and people are still present in the room, it should not start to dim. When the room becomes empty, the light should start to dim (after a sufficient wait time) and then turn off. In addition, the system should be able to detect motion within 10-15 m of the sensor.

Decentralized Systems for Ground & Aerial Vehicles (DSGAV)

Mingda Ma, Alvin Sun, Jialiang Zhang


# Team Members

* Yixiao Sun (yixiaos3)

* Mingda Ma (mingdam2)

* Jialiang Zhang (jz23)

# Problem Statement

Autonomous delivery over drone networks has become one of the new trends that can save a tremendous amount of labor. However, it is very difficult to scale up due to the inefficiency of multi-rotor collaboration, especially when the drones are carrying payloads. In order to actually deploy this in big cities, we could take advantage of the large ground-vehicle networks that already exist with rideshare companies like Uber and Lyft. The roof of an automobile has plenty of space to hold regular-size packages with magnets, and the drone network can then optimize for flight time and efficiency while factoring in ground-vehicle plans. While dramatically increasing delivery coverage and efficiency, such a strategy raises a challenging problem: docking drones onto moving ground vehicles.

# Solution

Given the scope and time limitations, we aim to tackle a particular component of this project. We will implement a decentralized multi-agent control system that synchronizes a ground vehicle and a drone when they are in close proximity. Assumptions such as knowledge of vehicle states will be made, as this project aims to be a proof of concept for a core challenge of the larger system. However, as we progress, we aim to lift as many of those assumptions as possible. The lab infrastructure, drone, and ground vehicle will be provided by our kind sponsor Professor Naira Hovakimyan. When the drone approaches the target and gains visual contact with the ground vehicle, it will automatically send a docking request through an RF module. The RF receiver on the vehicle will then automatically turn on its assistant devices, such as specific LED light patterns, which aid motion synchronization between the ground and aerial vehicles. The ground vehicle will also periodically send its locally planned paths to the drone so it can predict the ground vehicle's trajectory a couple of seconds into the future. This prediction helps the drone stay within close proximity of the ground vehicle by optimizing against a reference trajectory.

### The hardware components include:

Provided by Research Platforms

* A drone

* A ground vehicle

* A camera

Developed by our team

* An LED based docking indicator

* RF communication modules (xbee)

* Onboard compute and communication microprocessor (STM32F4)

* Standalone power source for RF module and processor

# Required Circuit Design

We will integrate the power source, RF communication module and the LED tracking assistant together with our microcontroller within our PCB. The circuit will also automatically trigger the tracking assistant to facilitate its further operations. This special circuit is designed particularly to demonstrate the ability for the drone to precisely track and dock onto the ground vehicle.

# Criterion for Success -- Stages

1. When the ground vehicle is moving slowly in a straight line, the drone can autonomously take off from an arbitrary location and end up following it within close proximity.

2. The drone remains in close proximity while the ground vehicle is slowly turning (or navigating arbitrarily at slow speed).

3. The drone can dock autonomously onto the ground vehicle while it is moving slowly in a straight line.

4. The drone can dock autonomously onto the ground vehicle while it is slowly turning.

5. Increase the speed of the ground vehicle and successfully perform tracking and/or docking.

6. The drone can pick up packages while flying synchronously with the ground vehicle.

We consider the project complete at stage 3. The later stages are advanced features, depending on actual progress.
