# Non-Intrusive Smart Unlocking Mechanism for College Dormitory Rooms

Team Members:
Raghav Pramod Murthy (raghavp4)\
Arnav Mehta (arnavm7)\
Yuhao Cheng (yuhaoc7)

# Problem
Many college students living in dorms frequently face the problem of forgetting their keys. For many, it is their first time managing keys to their own room, and with busy schedules it is easy to forget or misplace them, which creates a real hassle. Some systems, such as facial-recognition smart locks, can replace the standard key lock, but they are not practical for college dorm doors: they must be drilled into the interior of the door, which is costly and intrusive. Other forms of authentication, such as voice recognition, are not easy to retrofit either. This points to a more practical, non-intrusive solution: a locking/unlocking mechanism that does not modify the door's internal locking hardware. Almost all door locks can be opened by rotating some exterior component of the door, such as the lock or the handle, which naturally leads us to a flexible rotation system that can integrate easily with existing door locks.

# Solution
We propose a portable system that turns the lock on the door, much as a person inside the room would turn it to let someone in. This non-intrusive unlocking mechanism will be portable and transferable: it can easily be removed from one door and mounted on another. A user attempting to access a room scans their face in an app and records roughly five seconds of speech (picked up by the cellphone's microphone) for voice authentication. Authentication happens in the backend. If the face and voice match a face and voice previously registered in the app, the backend sends a signal to the microcontroller to start the unlocking process. The user can also register additional faces and voices (for example, a roommate's) so that multiple people can use the system. Importantly, this unlocking system does not interfere with manual unlocking with a key.



# Solution Components

## Subsystem 1: Turning Mechanism
This will be the component that physically turns the lock to unlock the door once it receives a signal from the authentication backend; an illustrative firmware sketch follows the parts list below.

ESP32-S3 microcontroller chip\
DRV8825 Stepper Motor Driver\
Stepper Motor: STEPPERONLINE Nema 17 Stepper Motor Bipolar 2A\
Custom PCB\
LM1117-2.5 Voltage Regulator\
12 V Battery\
Flexible Steel Cable to turn the handle
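
To make the unlock sequence concrete, the sketch below shows, in MicroPython, how the ESP32-S3 could pulse the DRV8825's STEP/DIR pins once the backend's unlock signal arrives. This is only an illustration: the pin assignments, step counts, and timing are assumptions, and the final firmware may well be written in C (ESP-IDF or Arduino) instead.

```python
# Minimal sketch (MicroPython on the ESP32-S3, assumed pin mapping) of the
# DRV8825 step/direction interface used to rotate the lock when the backend
# signals an unlock. Pin numbers and step counts are placeholders.
from machine import Pin
import time

STEP = Pin(4, Pin.OUT)    # DRV8825 STEP input (assumed GPIO4)
DIR = Pin(5, Pin.OUT)     # DRV8825 DIR input (assumed GPIO5)
ENABLE = Pin(6, Pin.OUT)  # DRV8825 nENABLE, active low (assumed GPIO6)

STEPS_PER_REV = 200       # NEMA 17, 1.8 degrees per full step

def rotate_lock(steps, clockwise=True, step_delay_us=2000):
    """Pulse the DRV8825 to turn the lock, then re-disable the driver."""
    ENABLE.value(0)                 # enable driver outputs
    DIR.value(1 if clockwise else 0)
    for _ in range(steps):
        STEP.value(1)
        time.sleep_us(step_delay_us)
        STEP.value(0)
        time.sleep_us(step_delay_us)
    ENABLE.value(1)                 # disable outputs to save battery

# Called when the authentication backend signals an unlock: a quarter turn
# of the lock, assuming a 1:1 coupling through the flexible steel cable.
def on_unlock_signal():
    rotate_lock(STEPS_PER_REV // 4, clockwise=True)
    time.sleep(3)                   # hold briefly, then relock
    rotate_lock(STEPS_PER_REV // 4, clockwise=False)
```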

## Subsystem 2: Facial Recognition + Voice Recognition App / User Interface for Authentication

Function: Authenticate the user by scanning their face and analyzing their voice; a sketch of the backend flow follows the component list below.

Components:
Android app\
Flask backend hosted in GCP\
Google Cloud speech-to-text + recognition API\
DeepFace open source model to compare faces\
MongoDB instance to store face data / voice data
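
To sketch the authentication flow, the snippet below shows a minimal Flask endpoint that compares an uploaded photo against the user's registered photo with DeepFace and, on a match, notifies the microcontroller. The voice check, the MongoDB lookup, and the exact signal sent to the ESP32 are stubbed out with placeholders; this is an assumption-laden sketch, not the final design.

```python
# Minimal sketch of the backend authentication flow. File paths, the voice
# check, and the notification to the ESP32 are placeholders, not final design.
from flask import Flask, request, jsonify
from deepface import DeepFace
import requests

app = Flask(__name__)

ESP32_UNLOCK_URL = "http://esp32.local/unlock"   # hypothetical endpoint

def voice_matches(audio_path, user_id):
    # Placeholder for the speech-to-text + voice check described above.
    return True

@app.route("/unlock", methods=["POST"])
def unlock():
    user_id = request.form["user_id"]
    request.files["face"].save("/tmp/attempt.jpg")
    request.files["voice"].save("/tmp/attempt.wav")

    # Registered reference image; in the real system this comes from MongoDB.
    registered = f"/tmp/registered_{user_id}.jpg"

    result = DeepFace.verify(img1_path="/tmp/attempt.jpg", img2_path=registered)
    if result["verified"] and voice_matches("/tmp/attempt.wav", user_id):
        requests.post(ESP32_UNLOCK_URL, timeout=5)   # signal the turning mechanism
        return jsonify({"unlocked": True})
    return jsonify({"unlocked": False}), 401

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```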


# Criterion For Success

Unit Test Goals:
1. Desired accuracy of the facial recognition model: 95% (on a large online dataset and around 20 of our own pairs of cellphone images)
2. Desired accuracy of the speech-to-text + recognition API model: 90%
3. Processing time (from when the user submits their voice and face to when the signal is sent to the PCB) under 5 seconds

Functionality Goals:
Portability/Transferability of Unlocking System:
1. We will achieve this goal if we can mount our contraption onto a door in under ten minutes.

Facial Recognition + Voice Recognition:
1. We will achieve this goal if users who have registered their face and voice through the app can unlock the door without a key by taking a picture of themselves and submitting a voice sample.
2. We will achieve this goal if an unauthorized user (a user who has not authenticated themselves with face and voice through the app) is unable to open the door.

# WHEELED-LEGGED BALANCING ROBOT

## Team Members:

- Gabriel Gao (ngao4)

- Zehao Yuan (zehaoy2)

- Jerry Wang (runxuan6)

# Problem

The motivation for this project arises from the limitations inherent in conventional wheeled delivery robots, which predominantly feature a four-wheel chassis. This design restricts their ability to navigate terrains with obstacles, bumps, and stairs—common features in urban environments. A wheel-legged balancing robot, on the other hand, can effortlessly overcome such challenges, making it a particularly promising solution for delivery services.

# Solution

The primary objective of this phase of the project is to demonstrate that a single leg of the robot can successfully bear weight and function as an electronic suspension system. Achieving this will lay the foundation for the subsequent development of the full robot.

# Solution Components

## Subsystem 1. Hybrid Mobility Module:

Actuated Legs: Four actuator motors (DM-J4310-2EC) power the legged system, enabling the robot to navigate uneven surfaces, obstacles, and stairs. The legs also function as an advanced electromagnetic suspension system, quickly adjusting damping and stiffness to ensure a stable and level platform.

Wheeled Drive: Two direct drive BLDC (M3508) motors propel the wheels, enabling efficient travel on flat terrains.

**Note: 4x DM4310 and 2x M3508 motors can be borrowed from the RSO Illini Robomaster** - [Image of Motors on campus](https://github.com/ngao4/Wheel_Legged_Robot/blob/main/image/motors.jpg)

The DM4310 has a built-in ESC with CAN bus and a dual absolute encoder, and can provide 4 N·m of continuous torque. This torque allows the robot, or the leg system alone, to act as a suspension and carry enough weight for further applications. The M3508 also has an ESC available in the lab; it is an FOC ESC with CAN bus communication, so this project does not focus on the motor-driver electronics. The motors communicate with the STM32 over CAN bus at roughly 1 kHz.
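
The control loop itself will be C firmware on the STM32, but purely to illustrate what a 1 kHz CAN command stream looks like, here is a host-side Python sketch using the python-can library. The CAN channel, arbitration ID, and payload layout are placeholders, not the motors' documented protocol.

```python
# Illustrative only: a ~1 kHz CAN command loop using python-can on a PC with a
# SocketCAN adapter. The real control loop runs in C on the STM32; the
# arbitration ID and payload layout here are placeholders.
import struct
import time

import can  # pip install python-can

bus = can.interface.Bus(channel="can0", interface="socketcan")

def send_wheel_currents(left: int, right: int) -> None:
    # Placeholder layout: two signed 16-bit current setpoints, big-endian,
    # padded to the 8-byte CAN data field.
    data = struct.pack(">hh", left, right) + bytes(4)
    bus.send(can.Message(arbitration_id=0x200, data=data, is_extended_id=False))

while True:
    send_wheel_currents(1000, 1000)   # constant setpoint just for illustration
    time.sleep(0.001)                 # roughly the 1 kHz rate described above
```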

## Subsystem 2. Central Control Unit and PCB:

An STM32F103 microcontroller acts as the brain of the robot, processing input from the IMU over SPI and directing the motors over CAN bus. The PCB includes the STM32F103 chip, the BMI088 IMU, power-supply circuitry, and an SBUS remote-control signal inverter.

We may upgrade to an STM32F407 if more processing power is needed.

Attitude Sensing: A 6-axis IMU (BMI088) continuously monitors the robot's orientation and motion, facilitating real-time adjustments to ensure stability and correct navigation. The BMI088 will be part of the PCB.
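
As an illustration of how the IMU data feeds the balance controller, the generic complementary-filter sketch below blends the integrated gyro rate with the accelerometer's tilt estimate. It is a Python sketch for clarity only; the actual filter will run in C on the STM32.

```python
# Generic complementary-filter sketch for pitch estimation from a 6-axis IMU
# such as the BMI088. Illustrative only; the real filter runs in C on the STM32.
import math

ALPHA = 0.98   # trust the gyro for fast changes, the accelerometer for drift

def update_pitch(pitch, gyro_rate, accel_x, accel_z, dt):
    """Blend the integrated gyro rate with the accelerometer tilt estimate (radians)."""
    pitch_gyro = pitch + gyro_rate * dt
    pitch_accel = math.atan2(accel_x, accel_z)
    return ALPHA * pitch_gyro + (1.0 - ALPHA) * pitch_accel

# Example: 1 kHz updates with a 0.1 rad/s pitch rate and a level accelerometer.
pitch = 0.0
for _ in range(1000):
    pitch = update_pitch(pitch, gyro_rate=0.1, accel_x=0.0, accel_z=9.81, dt=0.001)
```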

## Subsystem 3. Testing Platform

The leg will be connected to a harness as shown in this [sketch](https://github.com/ngao4/Wheel_Legged_Robot/blob/main/image/sketch.jpg). The harness simplifies the model by restricting the robot’s motion in the Y-axis, while retaining the freedom for the robot to move on the X-axis and jump in the Z-axis. The harness also guarantees safety as it prevents the robot from moving outside its limit.

## Subsystem 4. Payload Compartment (3D-printed):

A designated section to securely hold and transport items, ensuring that they are protected from disturbances during transit. We will add weights to test the maximum payload of the robot.

## Subsystem 5. Remote Controller:

A 2.4 GHz RC SBUS remote controller will be used to control the robot. This hand-held device provides real-time control, making it simple for us to operate the robot at various distances. Safety is ensured because one switch can be assigned as a kill switch to shut down the robot in an emergency.

**Note: Remote controller model: DJI DT7, can be borrowed from the RSO Illini Robomaster**

The remote-controller set comes with a receiver whose output is an SBUS signal, a format commonly used in RC control. We will add an inverter circuit on the PCB so the SBUS signal can be read by the STM32.
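
For reference, the commonly documented SBUS frame is 25 bytes: a 0x0F header, 22 bytes packing sixteen 11-bit channels LSB-first, a flags byte, and an end byte. The Python sketch below unpacks such a frame; the real decoding will be done in C on the STM32 after the inverter.

```python
# Host-side sketch of SBUS frame decoding (25-byte frame, sixteen 11-bit
# channels packed LSB-first). The STM32 firmware will do this in C.
def parse_sbus_frame(frame: bytes):
    if len(frame) != 25 or frame[0] != 0x0F:
        raise ValueError("not a valid SBUS frame")
    channels = []
    bit_buffer = 0
    bit_count = 0
    for byte in frame[1:23]:          # 22 data bytes = 16 channels x 11 bits
        bit_buffer |= byte << bit_count
        bit_count += 8
        while bit_count >= 11:
            channels.append(bit_buffer & 0x7FF)
            bit_buffer >>= 11
            bit_count -= 11
    flags = frame[23]                 # failsafe / frame-lost bits live here
    return channels, flags
```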

Note: When only demoing the leg function, the RC controller may not be used.

## Subsystem 6. Power System

We are considering a 6S (24 V) lithium battery to power the robot. An alternative is to power the robot through a power supply using a pair of long wires.

# Criterion For Success

**Stable Balancing:** The robot (leg) should maintain its balance in a variety of situations, both static (when stationary) and dynamic (when moving).

**Cargo Carriage:** The robot (leg) should be able to carry a specified weight (e.g., 1 lb) without compromising its balance or ability to move.

_________________________________________________________________________

**If we are able to test the leg and it functions normally before the midterm, we will try to build out the whole wheeled-legged balancing robot. It would be able to complete the following:**

**Directional Movement:** Via remote control, the robot should move precisely in the desired direction (up and down), showcasing smooth accelerations, decelerations, and turns.

**Platform Leveling:** Even when navigating slopes or uneven terrains, the robot should consistently ensure that its platform remains flat, preserving the integrity of the cargo it carries. Any tilt should be minimized, ideally maintaining a platform angle variation within a range of 10 degrees or less from the horizontal.

**Position Retention:** In the event of disruptions like pushes or kicks, the robot should make efforts to return to its original location or at least resist being moved too far off its original position.

**Safety:** During its operations, the robot should not pose a danger to its surroundings, ensuring controlled movements, especially when correcting its balance or position. The robot should be able to shut down (safety mode) by remote control.
