# Mesh Network Positioning System

- **TA:** Michael Gamota
- **Documents:** design_document1.pdf, final_paper1.pdf, grading_sheet1.pdf, presentation1.pdf, proposal1.pdf
### Team Members
- Peter Giannetos (PG19)
- Noah Breit (NHBREIT2)

# Abstract
Create a wireless positioning system of meshed stationary nodes that can track moving nodes over a predefined area at long ranges in excess of 1 km. The inspiration for this project comes from high-altitude amateur rocketry, where GNSS-only tracking systems are unable to maintain a lock at high velocities. However, the system is not limited to rocketry and can be expanded to drone swarms or other general asset tracking.

_([Initial Idea Post](https://courses.grainger.illinois.edu/ece445/pace/view-topic.asp?id=76218))_

# Background
Our engineering team, Spaceshot, from the [Illinois Space Society](https://www.illinoisspacesociety.org/) is working towards being one of the first collegiate teams to build and launch a completely student-designed vehicle 100 km to the edge of space, also known as the Kármán line.

A big challenge in achieving this goal is reliably validating altitude, because many commercial GNSS systems are unable to operate under those extreme conditions; this is where the inspiration for this project comes from.

_(Spaceshot recently broke the University's 7-year-standing altitude record in June of 2024 and is looking to do so again in the summer of 2025. [Kairos II Launch](https://www.youtube.com/watch?v=6WY3OQx-jNs))_

# Objective
The goal of this system is to lay the foundation for an alternative, redundant positioning system that may one day be used to help verify vehicle altitude over long ranges. The scope of this project will not include achieving those long ranges, although the link budget for them appears feasible.

Instead, this project will focus on creating a proof of concept for lower-altitude vehicles and a general tracking system that can be used for many other applications beyond vehicle tracking.

## Other Potential Usages
- Drone tracking
- Warehouse asset & robotics tracking
- Car tracking

## Novelty Compared to GNSS
A large part of the novelty is that this system is not entirely reliant on GNSS satellites, meaning it can serve as a redundant backup solution for assets that require extra reliability. Some other potential advantages compared to GNSS:

- Indoor and/or outdoor usage
- Different frequency band than GPS (2.4 GHz vs. 1.57 GHz)*
- Faster update rate than typical consumer GPS (10+ Hz)
- High-velocity tracking**

*_This helps de-conflict with Iridium usage, which has been shown to sometimes interfere with GPS signals._

**_The system is most likely able to track higher velocities than consumer-grade GNSS, but for the purposes of an in-class demo we won't be able to test that. However, we may be able to test-fly this system with our RSO in ~April._


# System Overview

## Key Points
- Anchor nodes have stationary, predetermined locations known via GNSS or other methods.
- Rover nodes have unknown, moving locations and are the subjects of the tracking.
- Anchor nodes with synchronized time use time division to sequentially ping rover nodes.
- Distances derived from the ToF data at each Anchor node are used to calculate position (see the sketch after this list).
- 2.4 GHz LoRa modulation is used as the carrier signal, and the radio measures the "time of flight" between messages.
- 915 MHz LoRa is used for command and control of the Anchor nodes and to relay information across their mesh network.
- Each node may also have WiFi/Bluetooth connectivity for relaying data to the user.
- Each node has a battery-charger circuit for charging LiPo batteries.
- Anchor nodes have a DC input that can be used for solar panels or other power sources in extended operation modes.
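
As a rough illustration of the positioning math (not code from the actual design), the Python sketch below converts a round-trip time of flight into a range and then recovers a 3D position from four or more anchors via linear least squares; the anchor coordinates, test point, and function names are all placeholders.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s


def tof_to_range(round_trip_s: float, turnaround_s: float = 0.0) -> float:
    """Convert a measured round-trip time of flight into a one-way distance,
    subtracting any known responder turnaround delay."""
    return C * (round_trip_s - turnaround_s) / 2.0


def trilaterate(anchors: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    """Estimate the rover position from >= 4 anchors via linear least squares.

    Subtracting the first anchor's sphere equation from the others removes the
    quadratic terms and leaves a linear system A x = b.
    """
    p0, r0 = anchors[0], ranges[0]
    A = 2.0 * (anchors[1:] - p0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(p0**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos


# Illustrative anchor layout (meters) and noiseless ranges to a test point.
anchors = np.array([[0.0, 0.0, 0.0],
                    [50.0, 0.0, 0.0],
                    [0.0, 50.0, 0.0],
                    [0.0, 0.0, 30.0]])
truth = np.array([12.0, 20.0, 5.0])
ranges = np.linalg.norm(anchors - truth, axis=1)
print(trilaterate(anchors, ranges))  # ~[12, 20, 5]
```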

### Diagrams
- [Rover Node](https://drive.google.com/file/d/1s-36r-JjqxyTw7y8X974gufuxaq0_UOH/view?usp=sharing)
- [Anchor Node](https://drive.google.com/file/d/1r33J0ESABEdzihPbGCJ6SNQ7NYmtnM9_/view?usp=sharing)

### Schematics
- [Rover Node](https://drive.google.com/file/d/18-mt-91eqGyq5F5amYsRW0zK7k6PrgOG/view?usp=sharing)
- [Anchor Node](https://drive.google.com/file/d/1wgfl9TMk5EhXdlnkxCsD5ZM0D6RoE0kX/view?usp=sharing)

### Layouts
- [Rover Node Front](https://drive.google.com/file/d/177LjG0lPpOCkkIHr40mCEJd_GgLVCnwK/view?usp=sharing)
- [Rover Node Back](https://drive.google.com/file/d/1zS2saVFtNqWhWLpZFaUtUVLBXwYDhPUr/view?usp=sharing)
- [Anchor Node Front](https://drive.google.com/file/d/1umoRkIO3XNXMvPkvjQ44fK_oXh8rU63C/view?usp=sharing)
- [Anchor Node Back](https://drive.google.com/file/d/1g0U6Ht-ASRbU-8QZHsjqA0HMhdr1DFMh/view?usp=sharing)

_(All files have been shared with @illinois.edu emails)_

# High Level Success Requirements

- Perform 3D trilateration of a rover node
- Read and stream barometer, GNSS, and other data to another node for data logging
- Publish live data to a local WiFi network
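
For the last requirement, here is a minimal sketch of one way position fixes could be published onto the local WiFi network, assuming a simple JSON-over-UDP broadcast; the address, port, and message fields are placeholders, and the real system might use MQTT, WebSockets, or the node's own access point instead.

```python
import json
import socket
import time

# Placeholder destination on the local network.
DEST = ("192.168.4.255", 5005)  # broadcast address and port are illustrative


def publish_position(sock: socket.socket, x: float, y: float, z: float) -> None:
    """Broadcast one position fix as a small JSON datagram."""
    msg = json.dumps({"t": time.time(), "x": x, "y": y, "z": z}).encode()
    sock.sendto(msg, DEST)


if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    publish_position(sock, 12.0, 20.0, 5.0)  # dummy fix
```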

# Final Demo Idea
Playing catch with a tennis ball alongside a 3D plot for position visualization, or walking around a field with the rover node, which then displays its position on a phone/computer.

## Intermediary Objectives:
- PCB bring-up (validate that all subsystems work separately)
- Perform 1D trilateration of a rover node
- Perform 2D trilateration of a rover node
- Calibrate ranging radios

## Side Objectives:
_(For fun & only if we have time)_
- Create an antenna tracker connected to the mesh network to track the moving object. (A good proof of concept for using high-gain antennas to reach 100 km, or for using cameras to record an asset.) ([Adafruit Pan/Tilt Kit](https://www.adafruit.com/product/1967))

# Featured Project RFA: Any-Screen to Touch-Screen Device

Team Members:

- Sakhi Yunalfian (sfy2)
- Muthu Arunachalam (muthuga2)
- Zhengjie Fan (zfan11)

# Problem

While touchscreens are becoming increasingly popular, not all screens come equipped with touch capabilities. Upgrading or replacing non-touch displays with touch-enabled ones can be costly and impractical. Users need an affordable and portable solution that can turn any screen into a fully functional touchscreen.

# Solution

The any-screen-to-touch-screen device uses four ultra-wideband (UWB) sensors attached to the four corners of a screen to detect the position of a specially designed pen or hand wearable. UWB is a positioning technology that is lower-cost than lidar/cameras yet more accurate than Bluetooth/Wi-Fi/RFID. Since UWB is highly accurate, we will use these sensors to track the location of a UWB antenna placed in the pen. In addition to the UWB tag, the pen will also feature a touch-sensitive tip to detect contact with the screen (along with a redundant button to simulate screen contact if the user prefers not to constantly make contact with the screen). The pen will also have a gyroscope and low-profile buttons to track tilt data and offer customizable hotkeys/shortcuts. The pen and sensors communicate wirelessly with the microcontroller, which converts the pen's input data, along with its location on the screen, into touchscreen-like interactions.

# Solution Components

## Location Sensing Subsystem (Hardware)

This subsystem will employ the Spark Microsystems SR1010 digitally programmable ultra-wideband wireless transceiver. The transceiver will be housed in an enclosure that can be attached to the corners of a screen or monitor. Each sensor unit will also need a Bluetooth module in order to communicate with the microcontroller.

## Signal Processing Subsystem (Hardware and Software)

A microcontroller, specifically one from the STM32F4 series (STM32F407 or STM32F429). Real-time sensor data processing requires a considerable amount of computing power, and the STM32F4 series contains DSP instructions that make raw data processing and noise reduction smoother. This subsystem will allow us to perform triangulation to accurately estimate the location on the screen, along with smooth real-time data processing, latency minimization, sensitivity tuning, and noise reduction.
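
As a hedged sketch of the kind of processing described above (not firmware from this RFA), the Python snippet below estimates the pen's 2D position on the screen plane from ranges to the four corner sensors via least squares and applies a simple exponential smoothing filter as a stand-in for the DSP noise reduction; the screen dimensions and filter constant are assumed values.

```python
import numpy as np

# Hypothetical screen: corner sensor positions in meters (origin at bottom-left).
SCREEN_W, SCREEN_H = 0.60, 0.34
CORNERS = np.array([[0.0, 0.0], [SCREEN_W, 0.0],
                    [0.0, SCREEN_H], [SCREEN_W, SCREEN_H]])


def locate_pen(ranges: np.ndarray) -> np.ndarray:
    """2D multilateration: linearize the four range equations against the
    first corner and solve the resulting least-squares problem."""
    p0, r0 = CORNERS[0], ranges[0]
    A = 2.0 * (CORNERS[1:] - p0)
    b = r0**2 - ranges[1:]**2 + np.sum(CORNERS[1:]**2, axis=1) - np.sum(p0**2)
    xy, *_ = np.linalg.lstsq(A, b, rcond=None)
    return xy


def smooth(prev: np.ndarray, new: np.ndarray, alpha: float = 0.3) -> np.ndarray:
    """Exponential moving average as a simple stand-in for noise reduction."""
    return alpha * new + (1.0 - alpha) * prev
```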

A Bluetooth module, so that each sensor can send its raw data to the microcontroller. We plan to make the communication from the sensors and the pen to the microcontroller wireless. One module we are considering is the HC-05 Bluetooth module.

The microcontroller itself will be wired to the relevant computer system via USB 2.0 for data transfer of touchscreen interactions.

## Pen/Hand Wearable Subsystem (Hardware)

The pen subsystem will employ a simple spring switch as the pen tip to detect pen-to-screen contact. We will also use a SparkFun DEV-08776 LilyPad button to simulate a press/pen-to-screen contact, both for redundancy and for when the user wishes to control the pen without contacting the screen. The pen will also contain several low-profile buttons and an STMicroelectronics LSM6DSO32TR gyroscope/accelerometer sensor to provide further customizable pen functionality and potentially aid in motion tracking calculations. The pen will contain a Taoglas UWC.01 ultra-wideband tag to allow detection by the location sensing subsystem and a Bluetooth module to allow communication with the microcontroller. The unit will need to be enclosed within a plastic or 3D-printed housing.

## Touch Screen Emulation Subsystem (Software)

A microcontroller with embedded HID device functionality in order to control the mouse cursor of the device connected to it. We plan to use the STM32F4 series microcontroller with its built-in USB HID libraries to help emulate the touchscreen effects. We will also include a simple GUI that lets the user customize the shortcuts mapped to the pen buttons and specify optional parameters such as screen resolution, screen curvature, etc.
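
To show roughly how a position estimate could become a HID event (the actual report layout will be defined by the STM32's USB report descriptor, so everything below is an assumed format), this sketch scales a screen-plane position into the 0-32767 logical range commonly used by absolute-pointer descriptors and packs it with tip/button flags:

```python
import struct

# Hypothetical 5-byte report: 1 byte of tip/button flags followed by 16-bit
# absolute X and Y.  The real layout depends on the firmware's descriptor.
HID_MAX = 32767


def to_hid_report(x_m, y_m, screen_w_m, screen_h_m, tip_down, button=0):
    """Scale a screen-plane position (meters) to HID logical units and pack
    an absolute-pointer report."""
    x = max(0, min(HID_MAX, round(x_m / screen_w_m * HID_MAX)))
    y = max(0, min(HID_MAX, round(y_m / screen_h_m * HID_MAX)))
    flags = (1 if tip_down else 0) | (button << 1)
    return struct.pack("<BHH", flags, x, y)


# Example: pen touching the centre of a 0.60 m x 0.34 m screen.
print(to_hid_report(0.30, 0.17, 0.60, 0.34, tip_down=True).hex())
```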

## Power Subsystem (Hardware)

The power subsystem is not localized in one area, since our solution consists of multiple wireless devices; however, we specify all power requirements and solutions here for organizational purposes.

For the wireless sensors in our location sensing subsystem, we plan on using battery power. Given that the UWB transceiver has ultra-low power consumption and an internal DC-DC converter, it makes sense to power each sensor unit with a small 3.3 V 650 mAh rechargeable battery (potential option: [https://a.co/d/acFLsSu](https://a.co/d/acFLsSu)). We will include recharging capability and a micro-USB charging port.

For our pen, we plan on using battery power as well. The gyroscope module, UWB antenna, and Bluetooth module all have low power consumption, so we plan on using the same rechargeable battery system as specified above.

The microcontroller will be wired via USB 2.0 directly to the computer subsystem in order to transmit mouse data/touchscreen interactions, and will receive a 5 V 0.9 A power supply through this connection.
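
A rough battery-life sanity check for the battery-powered units above, using an assumed average current draw rather than measured numbers, looks like this:

```python
# Hypothetical average draw; the real figure comes from the chosen parts'
# datasheets and the duty cycling of the UWB, Bluetooth, and sensor ICs.
BATTERY_MAH = 650.0   # 3.3 V pack proposed above
AVG_DRAW_MA = 40.0    # placeholder average current

runtime_h = BATTERY_MAH / AVG_DRAW_MA
print(f"~{runtime_h:.1f} h per charge at {AVG_DRAW_MA} mA average")  # ~16 h
```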

# Criterion For Success

## Hardware

The UWB sensor system is able to track the pen's location on the screen.

The pen is able to detect clicks, screen contact, and tilt.

The microcontroller is able to take input from the wireless pen and the wireless sensors.

Each battery-powered unit is successfully powered and able to be charged.

## Software

The pen’s input and sensor location data can be converted to mouse clicks and presses.

The pen’s buttons can be mapped to customizable shortcuts/hotkeys.

## Accuracy and Responsiveness

Touch detection and location accuracy are the most crucial criteria for our project’s success. We expect our device to have 95% touch-detection precision. In order to correctly drive the embedded HID protocols of a device, the data sent and processed by the microcontroller must have a low error threshold when comparing cursor movements with the wearable’s location.

Touch recognition and responsiveness are the next most important. We want our system to detect the device within a certain distance threshold with a relatively low margin of error of about 1% or less. More specifically, this criterion for success tests whether our communication protocol between the sensors, USB HID peripherals, and the microcontroller can transfer data efficiently in real time for the device to interpret as cursor location updates, scrolls, clicks, and more.

Latency and lag should be less than 60 milliseconds. This will be judged based on the DSP pipeline implemented on the STM32F4 microcontroller.

## Reliability and Simplicity

We want our device to be easily usable for the users. It should be intuitive and straightforward to start the device and utilize its functionalities.

We want our device to also be durable, with low chances of battery failure, mechanical failure, and systemic degradation.

## Integration and Compatibility

We want our device to integrate with any type of screen, across different physical dimensions and operating systems.
