# Building Interior Reconnaissance Drone (BIRD) Proposal

Team Members:
- Mark Viz (markjv2)
- Jack Lavin (jlavin4)
- Jacob Witek (witek5)

# Problem

There are many situations in which law enforcement or emergency medical professionals need quick, real-time, useful information about a location they cannot see, without sending in a person to gather it because of the risks involved. One of the most important things to know in these situations is whether there are people in a room or area and, if so, where they are located. While these professionals do have promising solutions available today, those solutions can rarely be operated by one person and take time and manpower away from situations that usually demand both. Our solution attempts to address these issues while providing an easy-to-use interface that surfaces the critical information.

# Solution

Our solution is a reconnaissance drone equipped with a camera, other sensing components, and simple autonomous behaviors, with the video feed processed on a separate laptop to determine an accurate location for every person in view of the drone, relative to the location of a nearby phone or viewing device. This phone or viewing device would run an augmented-reality application that uses position information from the drone system to overlay the positions of people near the drone onto first-person-perspective video. The end result would allow someone to slide or toss the drone into a room and, after a second or two, "see through the wall" to where anyone in the room is.

# Solution Components

## Drone and Sensors

The drone itself will be a basic lightweight quadcopter design. The frame will be cut as a 2D design from a sheet of carbon fiber and assembled with aluminum hardware and thread locker. The total volume, including the rotor blades, should not exceed 4" H by 8" W by 8" L (ideally much less). This simple frame will consist of a rectangular section that mounts the PCB and a 2S (7.4 V) LiPo pack of about 2" x 2" or less, plus four identical limbs mounted at the corners. Each limb will carry a brushless DC motor (EMAX XA2212, 2-3S) driven by an electronic speed controller on the PCB (assuming the ESCs cannot be pre-purchased).

The PCB will have a two-pin DuPont/JST connector for the battery leads, a TP4056 LiPo charging circuit, and buck converters for the necessary voltage(s), all on the underside. On top, the PCB will house an ESP32-S3 microcontroller, an IMU with reasonable accuracy, a set of mmWave 24 GHz human presence sensors (such as the LD2410), and ultrasonic transducers arranged as a phased-array sensor with a narrow, accurate beam for scanning for human presence with range.

These components will allow the drone to be programmed with very simple, limited autonomous flight behaviors (fly up five feet, spin 360 degrees, land) and to control itself properly and safely. The ultrasonic transducers and human-presence radars will be the primary method of determining human presence, with most of the computation done on the ESP32-S3; additional computation will be needed on the AR end with the received data. If time and budget allow, we may also include a small 2 MP or 5 MP camera for a Wi-Fi video stream, or a composite camera for an analog video stream, as a backup/failsafe to the other sensors.
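The narrow, steerable beam comes from firing each transducer with a small relative time offset. As a rough illustration only (the element count, spacing, and steering angles below are assumptions, not the final array geometry), the standard delay formula for a uniform linear array looks like this:

```cpp
// Sketch: per-element transmit delays for steering a uniform linear
// ultrasonic array. Element count, spacing, and angles are placeholders.
#include <cmath>
#include <cstdio>

int main() {
    const double kPi = 3.14159265358979323846;
    const double c = 343.0;          // speed of sound in air, m/s
    const double f = 40000.0;        // 40 kHz transducer cans
    const double lambda = c / f;     // wavelength, ~8.6 mm
    const double d = lambda / 2.0;   // half-wavelength spacing avoids grating lobes
    const int elements = 10;         // assumed element count

    // Element n fires t_n = n * d * sin(theta) / c relative to element 0;
    // negative values mean "fires earlier" (firmware would offset so the
    // earliest element starts at t = 0).
    for (double thetaDeg = -45.0; thetaDeg <= 45.0; thetaDeg += 45.0) {
        double theta = thetaDeg * kPi / 180.0;
        std::printf("steer %+5.1f deg:", thetaDeg);
        for (int n = 0; n < elements; ++n)
            std::printf(" %+7.2f us", n * d * std::sin(theta) / c * 1e6);
        std::printf("\n");
    }
    return 0;
}
```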

A rough working breakdown of the expected component masses is as follows:

- 4 hobby motors: ~50 grams (based on consumer measurements)
- Carbon fiber frame: ~40 grams (estimate based on similarly styled and sized frames)
- 2S 500 mAh battery: ~30 grams (based on common commercial LiPo product info)
- PCB with MCU and peripherals: ~50 grams (based on measurements of similar boards)
- 10-20 ultrasonic transducers: ~50 grams (based on commercial component info)
- Metal hardware/fasteners and miscellaneous: ~25 grams (also accounting for error)
- Total mass: ~245 grams
- Total thrust (at 7.6 V, 7.3 A): ~2000 grams (from manufacturer ratings)
- The thrust-to-weight ratio (~2000 g / ~245 g, roughly 8:1) is well over 2.0, which should allow quick movement and considerable stability along with the improved frame, and leaves extra room for more weight if needed.

## AR Viewer or Headset

To create a useful augmented-reality display of the collected position data, the simplest approach is to write an app that uses a smartphone's digital camera and gyroscope/IMU APIs to overlay highlighted human-position data on a live camera view. We would use Android Studio to create this custom app, which would interface with the data coming from the drone; building on the Android APIs, we would overlay that data onto the phone's camera view. If we have more time, a headset or AR glasses could make the experience more useful (hands-free) and immersive. We may also use a laptop at this stage to run a server alongside the app for better processing.
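As a rough illustration of the overlay step, the snippet below projects a person's estimated position, expressed in the phone camera's reference frame, into pixel coordinates with a simple pinhole model. The field of view, resolution, and frame convention are assumptions for the sketch; the real app would obtain the camera pose and intrinsics from the Android camera and sensor APIs.

```cpp
// Sketch: pinhole projection of a person's estimated 3D position into
// phone-camera pixel coordinates. FOV, resolution, and the camera-frame
// convention (x right, y down, z forward) are assumptions for illustration.
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

// Returns false if the point is behind the camera.
bool projectToPixel(const Vec3& pCam, double hfovDeg, int width, int height,
                    double& u, double& v) {
    if (pCam.z <= 0.0) return false;                 // behind the camera
    const double kPi = 3.14159265358979323846;
    const double fx = (width / 2.0) / std::tan(hfovDeg * kPi / 360.0); // focal length in pixels
    const double fy = fx;                            // assume square pixels
    u = (width  / 2.0) + fx * (pCam.x / pCam.z);
    v = (height / 2.0) + fy * (pCam.y / pCam.z);
    return true;
}

int main() {
    // Example: a person 4 m ahead and 1 m to the right of the phone camera.
    Vec3 person{1.0, 0.0, 4.0};
    double u, v;
    if (projectToPixel(person, 70.0, 1920, 1080, u, v))
        std::printf("overlay highlight at pixel (%.0f, %.0f)\n", u, v);
    return 0;
}
```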

# Working Supply List

*Some components can be found in student self-service; others will need to be ordered.
- Carbon fiber sheet (appropriately sized, 2-3 mm thick)
- Aluminum machine screws with Loctite, or bolts/nuts with locking washers
- 4 EMAX brushless DC motors and mounting hardware
- 4 quadcopter rotor blades
- 2S (7.4 V) 500 mAh LiPo battery
- Custom PCB
- ESP32-S3 chip with PCB antenna
- 20 ultrasonic (40 kHz) transducer cans
- 4 mmWave 24 GHz human presence radar sensors
- TP4056 LiPo charging IC (plus the other necessary SMD components)
- DuPont two-pin connector for LiPo charging/discharging (depending on whether a removable-battery design is chosen)
- Various SMD LEDs to indicate functionality or state on the PCB
- Voltage buck converter circuit components
- ESC circuit components
- Adafruit accelerometer/IMU breakout

# Criterion For Success

The best criterion for success of this project is whether our handheld device or headset can effectively communicate the positions of people in a visually obstructed location to a nearby user, within an accuracy of 1-2 meters, while still allowing the user to carry out their own tasks. The video feed should be stable with minimal latency so that it remains effective and usable, estimated human positions should be updated only when a person is positively in view, and the recency of the data should be apparent (for example, a red highlight for a newly detected person, yellow for a stale location, and green for a newly updated position).
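A minimal sketch of that recency-based highlighting, with placeholder age thresholds:

```cpp
// Sketch: choose a highlight color from the age of a person's last
// confirmed detection. The thresholds are placeholder values.
#include <cstdio>

enum class Highlight { Red, Green, Yellow };

// ageSec: seconds since this person was last positively detected;
// firstSeen: true if this is a brand-new detection.
Highlight highlightFor(double ageSec, bool firstSeen) {
    if (firstSeen) return Highlight::Red;       // newly detected person
    if (ageSec < 2.0) return Highlight::Green;  // freshly updated position
    return Highlight::Yellow;                   // stale position
}

int main() {
    std::printf("%d %d %d\n",
                static_cast<int>(highlightFor(0.0, true)),
                static_cast<int>(highlightFor(0.5, false)),
                static_cast<int>(highlightFor(5.0, false)));
    return 0;
}
```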

# Healthy Chair (Featured Project)

Team Members:

- Wang Qiuyu (qiuyuw2)
- Ryan Chen (ryanc6)
- Alan Tokarsky (alanmt2)

## Problem

The majority of the population sits for most of the day, whether it is students doing homework or employees working at a desk. In particular, during the Covid era, when many people are either working from home or quarantining for long periods of time, they tend to exercise less and sit longer, making them more likely to develop obesity, hemorrhoids, and even heart disease. In addition, sitting too long is detrimental to one's bottom and urinary tract and can result in urinary urgency, while poor sitting posture can lead to reduced blood circulation, joint and muscle pain, and other health issues.

## Solution

Our team is proposing a project to develop a healthy chair that addresses the problems mentioned above by reminding people when they have been sitting for too long, using a fan to cool off the chair, and making people aware of unhealthy leaning posture.

1. It uses thin-film pressure sensors under the chair's seat to detect the presence of a user, and pressure sensors on the chair's back to detect the user's leaning posture.
2. It uses a temperature sensor under the chair's seat; if the seat's temperature goes beyond a set threshold, a fan below is turned on by the microcontroller.
3. It utilizes an LCD display with a programmable user interface, through which the user can input the sitting duration after which the chair will alert them.
4. It uses a voice module to remind the user when he or she has been sitting for too long. The sitting time is input by the user and tracked by the microcontroller.
5. It uses only a voice chip, rather than an existing off-the-shelf speech module, to construct our own voice module.
6. The "smart" chair is able to detect when the chair's surface temperature exceeds a set temperature within 24 hours and warns the user about it.

## Solution Components

## Signal Acquisition Subsystem

The signal acquisition subsystem is composed of multiple pressure sensors and a temperature sensor. This subsystem provides all the input signals (pressure exerted on the bottom and the back of the chair, as well as the chair's temperature) that go into the microcontroller. We will be using RP-C18.3-ST thin-film pressure sensors and an MLX90614-DCC non-contact IR temperature sensor.
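As a rough Arduino-style acquisition sketch (assuming the thin-film sensor is read through a voltage divider into an ADC pin and the MLX90614 through Adafruit's driver; the pin choice is a placeholder, not the final design):

```cpp
// Sketch: reading one RP-C18.3-ST thin-film pressure sensor (via a voltage
// divider into an ADC pin) and the MLX90614 IR temperature sensor over I2C.
// Pin assignment and divider details are placeholders.
#include <Wire.h>
#include <Adafruit_MLX90614.h>

const int kSeatPressurePin = A0;   // ADC pin for the seat pressure divider
Adafruit_MLX90614 mlx;

void setup() {
  Serial.begin(9600);
  mlx.begin();                     // join the I2C bus and init the MLX90614
}

void loop() {
  int raw = analogRead(kSeatPressurePin);    // 0..1023; rises with applied force
  float seatTempC = mlx.readObjectTempC();   // non-contact seat temperature
  Serial.print("pressure_raw="); Serial.print(raw);
  Serial.print(" seat_temp_C="); Serial.println(seatTempC);
  delay(500);
}
```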

## Microcontroller Subsystem

In order to achieve seamless data transfer and have enough I/O for all the sensors, we will use two ATMEGA88A-PU microcontrollers. One microcontroller takes the inputs and serves as the master, and the second controls the outputs and acts as the slave. We will use I2C communication to let the two microcontrollers talk to each other. The microcontrollers will be programmed with a CH340G USB-to-TTL converter; they will be programmed off-board and then placed onto the PCB to avoid cluttering it with extra programming circuitry.
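A minimal sketch of that master/slave split using the Arduino Wire library; the 0x08 slave address and the packed one-byte status format are assumptions for illustration, not the final protocol:

```cpp
// Sketch: master/slave link between the two ATMEGA88A controllers over I2C.
// The slave address (0x08) and one-byte status format are assumptions.
#include <Wire.h>

const uint8_t kOutputCtrlAddr = 0x08;   // address of the output-side controller

// --- Input-side controller (master): push a packed status byte ---
// bit0 = occupied, bit1 = bad posture, bit2 = over-temperature
void sendStatus(bool occupied, bool badPosture, bool overTemp) {
  uint8_t status = (occupied ? 1 : 0) | (badPosture ? 2 : 0) | (overTemp ? 4 : 0);
  Wire.beginTransmission(kOutputCtrlAddr);
  Wire.write(status);
  Wire.endTransmission();
}

// --- Output-side controller (slave): receive and act on the status byte ---
volatile uint8_t lastStatus = 0;

void onStatusReceived(int numBytes) {
  while (Wire.available()) lastStatus = Wire.read();
}

void setup() {
  // The master side would call Wire.begin(); the slave registers its address:
  Wire.begin(kOutputCtrlAddr);
  Wire.onReceive(onStatusReceived);
}

void loop() {
  // Drive the fan, LEDs, LCD, and speech module from lastStatus here.
}
```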

The microcontroller will be in charge of processing the data it receives from all input sensors: pressure and temperature. Once it determines that a person is sitting on the chair, the internal clock is used to begin tracking how long they have been sitting; the clock is also used to determine whether the person has stood up for a break. The microcontroller will also use the readings from the temperature sensor to determine whether the chair has been overheating and turn on the fan if necessary. When the user has been sitting for too long, a speaker will tell them to get up and stretch for a while; we will use the speech module to generate speech through the speaker to inform the user of their lengthy sitting duration.
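A condensed sketch of that control flow (the time limits, temperature threshold, pin numbers, and stubbed sensor reads are placeholder assumptions):

```cpp
// Sketch: sitting-time tracking and temperature-threshold fan control.
// Thresholds, pin numbers, and the stubbed sensor reads are placeholders.
#include <Arduino.h>

const unsigned long kSitLimitMs = 45UL * 60UL * 1000UL; // user-settable sitting limit
const unsigned long kBreakMs    = 2UL * 60UL * 1000UL;  // standing time that counts as a break
const float kFanOnTempC = 30.0;                          // seat temperature threshold
const int kFanPin = 7;                                   // placeholder fan driver pin

bool seated = false, standing = false;
unsigned long sitStartMs = 0, standStartMs = 0;

bool seatOccupied()      { return analogRead(A0) > 300; } // placeholder pressure threshold
float seatTemperatureC() { return 25.0; }                  // placeholder; real code reads the MLX90614
void announceBreak()     { /* placeholder: trigger speech-module playback (rate-limited) */ }

void setup() { pinMode(kFanPin, OUTPUT); }

void loop() {
  unsigned long now = millis();

  if (seatOccupied()) {
    if (!seated) { seated = true; sitStartMs = now; }       // person just sat down
    standing = false;
    if (now - sitStartMs >= kSitLimitMs) announceBreak();    // sitting past the user-set limit
  } else if (seated) {
    if (!standing) { standing = true; standStartMs = now; }  // person just stood up
    if (now - standStartMs >= kBreakMs) seated = standing = false;  // long enough to count as a break
  }

  digitalWrite(kFanPin, seatTemperatureC() > kFanOnTempC ? HIGH : LOW);  // fan follows temperature
  delay(200);
}
```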

The microcontroller will also relay posture data to the LCD screen for the user. When the thin-film pressure sensors on the chair back detect that the user has been leaning against the chair improperly for too long, we will flash the corresponding LEDs to notify the user of their unhealthy sitting posture.

## Implementation Subsystem

The implementation subsystem can be further broken down into three modules: the fan module, the speech module, and the LCD module. This subsystem includes all the outputs controlled by the microcontroller. We will be using an MF40100V2-1000U-A99 fan for the fan module, an ISD4002-240PY voice record chip for the speech module, and an Adafruit 1.54" 240x240 Wide Angle TFT LCD Display with MicroSD (ST7789) for the LCD module.

## Power Subsystem

The power subsystem converts 120 V AC to a lower DC voltage. Since most of the input and output devices, as well as the ATMEGA88A-PU microcontrollers, operate at a DC voltage of about 5 V or less, we will implement a power subsystem that can switch between a battery and normal wall power.

## Criteria for Success

- The thin-film pressure sensors on the bottom of the chair are able to detect the pressure of a person sitting on the chair.
- The temperature sensor is able to detect an increase in temperature and turn the fan on as the temperature goes beyond our set threshold; after the temperature decreases below the threshold, the fan is turned off by the microcontroller.
- The thin-film pressure sensors on the back of the chair are able to detect unhealthy sitting posture.
- The outputs of the implementation subsystem, including the speech, fan, and LCD modules, are able to function as described above and inform the user correctly.

## Envision of Final Demo

Our final demo of the Healthy Chair project is an office chair fitted with sensor grids. The chair's back holds several pressure sensors to detect the person's leaning posture, and the seat pressure and temperature sensors are located under the chair. After receiving an input time from the user, the Healthy Chair is able to warn the user when they have been sitting for too long by alerting them through the speech module. The fan below the chair's seat turns on after the seat's temperature goes beyond the set threshold. The LCD displays which sensors are activated and also receives the user's time input.
