# Autonomous Car for WiFi Mapping

TA: Jason Jung

Documents: design_document1.pdf, final_paper2.pdf, presentation1.pptx, proposal1.pdf, video
# Title
Autonomous Car for WiFi Mapping

# Team Members:
- Ben Maydan (bmaydan2)
- Josh Powers (jtp6)
- Avi Winick (awinick2)

# Problem
When moving into a new apartment, house, or office, people often place their wifi modem or wifi extender in a convenient spot without much thought. Having gone through this just last week, we wondered whether there was a better way to maximize wifi strength across the whole space. The way most people test their wifi is to walk into a room, open a speed-test website, run it, and then repeat that room by room. This takes a lot of time, isn't very accurate, and doesn't reveal the optimal location. We are solving the problem of automating wifi strength and speed testing in a given space: gather wifi data from an autonomous vehicle driving around a room, create a heat map of the signal strength to display on a computer, analyze it, and finally show the weak spots, deadzones, etc.
Some motivation for why this is a good project idea: it lets you find the best spot to place a wifi extender for optimal coverage, so your Zoom meetings never cut out and your Instagram Reels/YouTube Shorts/TikTok/Angry Birds keep playing with no issue (potentially at the same time).

# Solution

The basic idea is to do a scan of the room. The car will carry a lidar sensor and an RSSI sensor on top, continuously scanning the room for the strongest wifi signal. The lidar sensor ensures that the car will not run into anything and provides the input for the SLAM algorithm used for obstacle avoidance. We use a two-scan approach so the bluetooth connection does not interfere with the wifi signal measurements. In the first scan, we manually drive the car for a few minutes, gathering lidar data and sending it to an onboard raspberry pi, which maps out the path the car should take to explore the entire room. This is significantly less driving than the autonomous part (which splits the room into strips to drive back and forth). The onboard raspberry pi then streams the path to the car as commands sent to the motor controllers; in the second scan, the car drives that path over the room and records the wifi signal strength at every point. Once the car is done scanning, all of the data is sent via bluetooth back to the computer, where we visualize it and return the location of the optimal wifi signal to the user.
This naturally introduces a few subsystems to achieve this goal.
- Location tracking subsystem that maps a location (x, y) to a wifi strength (done using SLAM), plus path generation: converting the SLAM output into a path the car can drive to scan the entire floor for the best wifi signal.
- Lidar sensing + bluetooth streaming subsystem.
- Wifi signal sensing subsystem that records the wifi signal strength at the car's current location.
- Car chassis with omnidirectional (mecanum) wheels and motor controllers.

# Solution Components

## SLAM subsystem for location tracking + generating a path for the car to drive
As we gather the lidar data, the ESP32-S3 module streams it to the SLAM pipeline. This module also has wifi and bluetooth capabilities to easily transfer the data to the computer so the user can view the heatmap. To run the SLAM algorithm, we will offload the computation to a raspberry pi 5, since it has a much more capable processor for 2D SLAM. The ESP32 in this design is responsible for cleaning the lidar data and sending commands to the motor controller to move the car.
A separate algorithm, running on the raspberry pi (since it is very compute heavy), computes the optimal path for the car to drive while gathering wifi data.
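As a rough illustration of the path-generation step, here is a minimal sketch of a strip-by-strip (boustrophedon) sweep over a SLAM occupancy grid. All names and the grid encoding (0 = free, 1 = obstacle) are assumptions for illustration, not the actual firmware.

```python
def coverage_path(grid):
    """Return (row, col) waypoints sweeping free cells strip by strip.

    grid: 2D list where 0 = free cell, 1 = obstacle (as SLAM might report).
    Even rows are swept left-to-right, odd rows right-to-left, so the car
    drives back and forth across the room like mowing a lawn.
    """
    path = []
    for r, row in enumerate(grid):
        cols = range(len(row)) if r % 2 == 0 else range(len(row) - 1, -1, -1)
        for c in cols:
            if row[c] == 0:  # skip cells SLAM marked as obstacles
                path.append((r, c))
    return path

grid = [
    [0, 0, 0],
    [0, 1, 0],
    [0, 0, 0],
]
print(coverage_path(grid))
```

A real planner would also have to connect disjoint free regions around obstacles, but the sweep above captures the "split the room into strips" idea from the proposal.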

## Lidar Sensor
The lidar subsystem allows the car to navigate through the room. It scans the room to detect walls and furniture, and helps avoid collisions by measuring the distance to obstacles. It also supports the SLAM subsystem in mapping the environment, so the map can then be overlaid with the wifi signal strength in all areas.
We will use the RPLIDAR A1M8, connected to the ESP32, with the lidar point cloud data stored in flash for the SLAM algorithm to consume.

## Wifi Sensor (RSSI)
RSSI (Received Signal Strength Indicator) is a measurement provided by the ESP32's wifi radio: with the module placed on top of our car, it reports the wifi signal strength at the car's location in dBm (decibel-milliwatts). The ESP32 grabs this data and transmits it through its bluetooth module to the computer, which displays the data and runs a very simple linear scan to find the (x, y) location with the strongest wifi signal. The algorithm is simple, but we want the data transmitted to the computer so the user can visualize it.
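The linear scan mentioned above amounts to taking the maximum over the collected samples. A minimal sketch (names and sample values are illustrative), keeping in mind that RSSI is in dBm, so values closer to 0 (e.g. -40) are stronger than more negative ones (e.g. -80):

```python
def strongest_location(samples):
    """Linear scan over ((x, y), rssi_dbm) pairs; return the strongest (x, y)."""
    return max(samples, key=lambda s: s[1])[0]

# Hypothetical samples collected by the car during its sweep.
samples = [((0.0, 0.0), -70), ((1.0, 2.0), -45), ((2.0, 1.0), -60)]
print(strongest_location(samples))  # (1.0, 2.0)
```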

## Car + mecanum wheels
We would 3D print the car and its omnidirectional wheels; the top of the car would hold the lidar sensor as well as the RSSI and SLAM modules. The ESP32 module, placed on top, would send driving commands to the motor controllers: which direction to turn, and whether to move forward or backward. The car would have different levels for the components: the chassis holding the wheels, the next level up holding the PCB and motor controllers, and the topmost level holding the ESP32 and the lidar sensor. The ESP32 provides the RSSI measurements, and the lidar sensor gathers range data for SLAM.
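For reference, the mecanum drive commands can be derived from standard inverse kinematics, mapping a desired body velocity to four wheel speeds. This sketch uses one common sign convention and placeholder geometry values, not our actual chassis dimensions:

```python
def mecanum_wheel_speeds(vx, vy, wz, half_length=0.1, half_width=0.1):
    """Inverse kinematics for a mecanum-wheel base (one common convention).

    vx: forward velocity (m/s), vy: strafe-left velocity (m/s),
    wz: rotation rate (rad/s). half_length/half_width are assumed values
    for the distances from the chassis center to the wheel axles (m).
    Returns (front_left, front_right, rear_left, rear_right) wheel speeds.
    """
    k = half_length + half_width
    fl = vx - vy - k * wz
    fr = vx + vy + k * wz
    rl = vx + vy - k * wz
    rr = vx - vy + k * wz
    return fl, fr, rl, rr

print(mecanum_wheel_speeds(1.0, 0.0, 0.0))  # pure forward: all wheels equal
```

In firmware these four values would be scaled into PWM duty cycles for the motor controllers.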

# Criterion For Success

Before the car can drive, each wheel must be independently controllable in software. Each mecanum wheel needs to function properly to enable multidirectional movement. This requires connecting the ESP32 module to the PCB, which connects to the motor controllers, which in turn drive the motor of each wheel.

Our 3D printed car with omnidirectional wheels must be controllable manually from a computer by sending driving commands (using the arrow keys) over bluetooth. We need to verify it can move in all directions around a room.

After testing the driving functionality, the car must be able to follow a hard-coded path in order to map out the wifi of the room.

Once the car can be driven autonomously, we need to use the lidar sensor to map out a path for the car to follow through the room without a predetermined path. This uses the lidar sensor and offloads the heavy SLAM computation to a raspberry pi. The pipeline is: gather lidar data -> run SLAM on the raspberry pi -> use the location and grid-based map returned by SLAM to run a path planning algorithm for the car -> execute the path. The criterion for success here is the algorithm we create to generate a path from the grid-based map returned by SLAM.

Use the ESP32 chip's RSSI readings to detect wifi strength and send them over bluetooth to a computer, which processes the data and generates a heat map of the given space. Optionally, give the user advice on where to move the modem to maximize either strength in a certain area or best general coverage.

Write an algorithm that takes all the wifi heatmap data and returns an optimal spot to place the wifi extender. This is a complicated algorithm, since wifi data can be interpolated (for example, if the strength at x = 1.0 is strong and the strength at x = 0.0 is weak, then the strength at x = 0.5 is roughly medium). Given a 2D grid of wifi signal strength data, there are potentially infinite candidate spots for a wifi extender, since you can interpolate anywhere.
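The interpolation step described above can be done with standard bilinear interpolation between the four surrounding grid samples. A minimal sketch, assuming samples taken at integer (x, y) coordinates (the helper name and grid values are illustrative):

```python
def bilinear(grid, x, y):
    """Estimate wifi strength (dBm) at fractional (x, y) from a 2D grid
    of samples taken at integer coordinates."""
    x0, y0 = int(x), int(y)
    x1 = min(x0 + 1, len(grid) - 1)
    y1 = min(y0 + 1, len(grid[0]) - 1)
    fx, fy = x - x0, y - y0
    # Interpolate along y on both x rows, then blend the rows along x.
    top = grid[x0][y0] * (1 - fy) + grid[x0][y1] * fy
    bot = grid[x1][y0] * (1 - fy) + grid[x1][y1] * fy
    return top * (1 - fx) + bot * fx

grid = [[-80, -60],
        [-60, -40]]
print(bilinear(grid, 0.5, 0.5))  # -60.0, halfway between weak and strong corners
```

Evaluating this on a fine sub-grid turns the discrete measurements into the continuous search space for extender placement.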

Write an algorithm to find local deadspots using the heatmap. This can run on the computer to visualize results for the user, and on the ESP32 or the raspberry pi on the car to avoid mapping out sections of the room that are clearly deadspots. Running it onboard could save a lot of time by skipping large sections of the room, but it is computationally intensive, since it is essentially gradient ascent.
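A first-pass version of deadspot detection can simply threshold the heatmap before any gradient-based refinement. This sketch uses an assumed -75 dBm cutoff purely for illustration:

```python
def find_deadspots(grid, threshold=-75):
    """Return (row, col) cells whose RSSI (dBm) falls below the deadspot
    threshold. The -75 dBm default is an assumed placeholder cutoff."""
    return [(r, c)
            for r, row in enumerate(grid)
            for c, v in enumerate(row)
            if v < threshold]

heatmap = [[-50, -60],
           [-80, -55]]
print(find_deadspots(heatmap))  # [(1, 0)]
```

Onboard, the car could skip sweeping regions whose border samples already fall below the threshold.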

Extra credit video
https://youtu.be/XOs7rTOghVs

Featured Project

# Remotely Controlled Self-balancing Mini Bike

Team Members:

- Will Chen hongyuc5

- Jiaming Xu jx30

- Eric Tang leweit2

# Problem

Bike-share and scooter-share services have become popular all over the world in recent years, and this mode of travel is gradually gaining recognition and support. Champaign also has a company providing this service, called Veo. Short-distance travel with shared bikes between school buildings and bus stops is convenient. However, since the bikes end up randomly parked around the entire city, when we need one we often have to search for where it is parked and walk to its location. The obvious fixes are not ideal: collecting and redistributing all of the bikes periodically would be costly and inefficient, and deploying enough bikes to saturate the region is also very cost-inefficient.

# Solution

We think the best way to solve this problem is to create a self-balancing, self-moving bike that users can summon to drive itself to their location. To make this possible, we first need to design a bike that can self-balance. After that, we will add a remote control feature to control the bike's movement. Since demonstrating on a full-size bike would be complicated, we will design a scaled-down mini bicycle to apply our self-balancing and remote control functions.

# Solution Components

## Subsystem 1: Self-balancing part

The self-balancing subsystem is the most important component of this project: it will use one reaction wheel with a Brushless DC motor to balance the bike based on reading from the accelerometer.

MPU-6050 accelerometer/gyroscope sensor: it measures the acceleration and angular velocity of the object it is attached to, from which we can estimate the bike's orientation. With this information, we can implement the corresponding control algorithm on the reaction wheel to balance the bike.

Brushless DC motor: it will be used to rotate the reaction wheel. BLDC motors tend to have better efficiency and speed control than other motors.

Reaction wheel: we will design the reaction wheel by ourselves in Solidworks, and ask the ECE machine shop to help us machine the metal part.

Battery: it will be used to power the BLDC motor for the reaction wheel, the stepper motor for steering, and another BLDC motor for movement. We are considering using an 11.1 Volt LiPo battery.

Processor: we will use STM32F103C8T6 as the brain for this project to complete the application of control algorithms and the coordination between various subsystems.
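The control algorithm the STM32 would run is typically a PID loop: the tilt angle estimated from the MPU-6050 is the error, and the output is a torque command for the reaction-wheel motor. A minimal discrete-time sketch with placeholder (untuned) gains:

```python
class PID:
    """Discrete PID controller, a minimal sketch of the balance loop.
    Input: tilt-angle error (rad). Output: motor torque command.
    Gains and timestep are placeholder values, not tuned for any hardware."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev = 0.0

    def update(self, error):
        # Accumulate the integral term and approximate the derivative
        # with a backward difference over one timestep.
        self.integral += error * self.dt
        deriv = (error - self.prev) / self.dt
        self.prev = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

pid = PID(kp=12.0, ki=0.5, kd=0.8, dt=0.01)
print(pid.update(0.05))  # torque command for a 0.05 rad tilt
```

On the real bike this loop would run at a fixed rate in firmware (C on the STM32), with the output mapped to the BLDC driver's current command.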

## Subsystem 2: Bike movement, steering, and remote control

This subsystem will accomplish bike movement and steering with remote control.

Servo motor for movement: it will be used to rotate one of the wheels to achieve bike movement. Servo motors tend to have better efficiency and speed control than other motors.

Stepper motor for steering: in general, stepper motors have better precision and provide higher torque at low speeds than other motors, which makes them perfect for steering the handlebar.

ESP32 2.4GHz Dual-Core WiFi Bluetooth Processor: it has both WiFi and Bluetooth connectivity so it could be used for receiving messages from remote controllers such as Xbox controllers or mobile phones.

## Subsystem 3: Bike structure design

We plan to design the bike frame structure with Solidworks and have it printed out with a 3D printer. At least one of our team members has previous experience in Solidworks and 3D printing, and we have access to a 3D printer.

3D Printed parts: we plan to use PETG material to print all the bike structure parts. PETG is known to be stronger, more durable, and more heat resistant than PLA.

PCB: The PCB will contain several parts mentioned above such as ESP32, MPU6050, STM32, motor driver chips, and other electronic components

## Bonus Subsystem 4: Collision check and obstacle avoidance

To detect obstacles, we are considering using HC-SR04 ultrasonic sensors, or cameras such as the OV7725 working with the STM32 and an obstacle detection algorithm. Based on the readings from these sensors, the bicycle could turn left or right to avoid obstacles.
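For the ultrasonic option, the HC-SR04 reports distance via the width of its echo pulse; the conversion is just round-trip time times the speed of sound, halved. A small sketch of that arithmetic (the function name is ours, and 343 m/s assumes room-temperature air):

```python
def hcsr04_distance_m(echo_pulse_s, speed_of_sound=343.0):
    """Convert an HC-SR04 echo pulse width (seconds, round trip)
    to an obstacle distance in metres."""
    return echo_pulse_s * speed_of_sound / 2

print(hcsr04_distance_m(0.001))  # a 1 ms echo is roughly 0.17 m away
```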

# Criterion For Success

The bike could be self-balanced.

The bike could recover from small external disturbances and maintain self-balancing.

The bike movement and steering could be remotely controlled by the user.
