| # | Title | Team Members | TA | Documents | Sponsor |
| --- | --- | --- | --- | --- | --- |
| 14 | Audio Augmented Reality Glasses (AARG) | Evan Chong, Nikita Vasilyev, Sunny Chen | Aishee Mondal | design_document1.pdf, final_paper1.pdf, grading_sheet1.pdf, proposal1.pdf, video | |

# Audio Augmented Reality Glasses (AARG)

Team Members:
- Sunny Chen (sunnyc3)
- Nikita Vasilyev (nvasi2)
- Evan Chong (eschong2)

# Problem
Have you ever seen a plant in nature or an animal in the wild that piqued your interest, but had no efficient way of researching what it was? Repeatedly searching online to identify the subject is a lengthy and tedious task, and this is the problem we seek to address. Our solution informs the user about unknown plants, animals, or objects in whatever setting they are observing.

# Solution
Our project idea stems from the surge of AR prototype glasses introduced over the past year. We plan to create our own glasses, but in contrast to those on the market, ours will focus on the user's audio experience. The glasses will capture images of objects and relay them to an application that processes the images in the backend. The application will then send an explanation of the object back to an audio device on the glasses (either a speaker or a bone-conducting device). The glasses will essentially work as a digital tour guide, with the explanation of the object being auditory rather than visual. The use case we have decided to tackle is a botanical tour guide, but the purpose is to create a platform that other applications can utilize for their own objectives.

We have broken the device down into five subsystems: power, peripheral, communication, physical, and application. Each subsystem has a designated purpose working toward the goal of full functionality.

# Solution Components

## Power System
The power system consists of the battery powering the device and a supporting charging circuit to recharge the battery once it is depleted. Candidate batteries include the PCIFR18650-1500 from ZEUS Battery and the ASR00011 from TinyCircuits.

## Peripheral System
The peripheral system focuses on the aspects of the glasses that interact with the outside world: the camera, microphone, speaker, and interact button. These external components will interface with the microcontroller, provide crucial information to the application, and play audio to the user. For the moment we have the following components for each peripheral (a minimal capture sketch follows the list):

- Camera: ESP32-CAM (comes with a development board and camera)
- Microphone: CMA-4544PF-W
- Speaker: ADS01008MR-LW100-R
- Interact button: B3U-1100P
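
To illustrate how these peripherals might tie together, here is a minimal Arduino-style sketch that captures a JPEG frame when the interact button is pressed. The GPIO choice and camera settings are our assumptions, and the ESP32-CAM pin map is left as a placeholder since it depends on the board revision:

```cpp
#include <Arduino.h>
#include "esp_camera.h"

constexpr int kButtonPin = 13;  // hypothetical GPIO for the interact button

void setup() {
  Serial.begin(115200);
  pinMode(kButtonPin, INPUT_PULLUP);  // B3U-1100P pulls the line low when pressed

  camera_config_t config = {};  // pin map omitted: fill in for the actual board revision
  config.pixel_format = PIXFORMAT_JPEG;
  config.frame_size = FRAMESIZE_SVGA;
  config.jpeg_quality = 12;     // 0-63, lower means higher quality
  config.fb_count = 1;
  if (esp_camera_init(&config) != ESP_OK) {
    Serial.println("Camera init failed");
  }
}

void loop() {
  if (digitalRead(kButtonPin) == LOW) {      // button pressed
    camera_fb_t *fb = esp_camera_fb_get();   // grab one JPEG frame
    if (fb) {
      Serial.printf("Captured %u bytes\n", (unsigned)fb->len);
      // Hand fb->buf / fb->len to the communication system for BLE transfer.
      esp_camera_fb_return(fb);              // release the frame buffer
    }
    delay(300);                              // crude debounce
  }
}
```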

## Communication System
The communication system consists of a microcontroller and a Bluetooth Low Energy (BLE) interface. This subsystem exposes an interface that applications connected over Bluetooth can use: all sensor data is collected, processed, and sent to the application when requested. The component we plan to use for this system is the ESP32-WROOM-DA-N8, which contains an ESP32 microcontroller with a built-in PCB antenna for Bluetooth.
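
As a rough sketch of what this interface could look like, the snippet below sets up a GATT service with one notifiable characteristic using the BLE library bundled with the ESP32 Arduino core. The UUIDs, device name, and characteristic layout are placeholders, not a settled design:

```cpp
#include <Arduino.h>
#include <BLEDevice.h>
#include <BLEServer.h>
#include <BLE2902.h>

// Placeholder UUIDs: the real interface would define one service with
// characteristics for image data (glasses -> app) and audio data (app -> glasses).
#define SERVICE_UUID    "4fafc201-1fb5-459e-8fcc-c5c9c331914b"
#define IMAGE_CHAR_UUID "beb5483e-36e1-4688-b7f5-ea07361b26a8"

BLECharacteristic *imageChar;

void setup() {
  BLEDevice::init("AARG-Glasses");  // advertised device name (placeholder)
  BLEServer *server = BLEDevice::createServer();
  BLEService *service = server->createService(SERVICE_UUID);

  imageChar = service->createCharacteristic(
      IMAGE_CHAR_UUID,
      BLECharacteristic::PROPERTY_READ | BLECharacteristic::PROPERTY_NOTIFY);
  imageChar->addDescriptor(new BLE2902());  // lets the app subscribe to notifications

  service->start();
  BLEDevice::startAdvertising();  // the app can now discover and connect
}

void loop() {
  // When the peripheral system hands over a captured frame, push it out in
  // MTU-sized chunks via imageChar->setValue(...) followed by imageChar->notify().
  delay(1000);
}
```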

## Physical System
The physical system consists of the glass frame design and the mounting system for the PCB and hardware components. The frame design will be 3D printed. The goal would be to use premeasured plastic mounting points and screws to mount all components within the hollow frame.

## Application System
The application system consists of image processing, audio transfer, and the user interface. The app will process the image, identify the plant, and transfer the resulting audio description back to the speaker in the peripheral system. We will develop this application for iOS and interact with the glasses via Bluetooth.
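
The message format between the glasses and the app is still open. One possibility (our assumption, not a settled design) is a small framing header so that either side can reassemble image uploads and audio downloads sent as BLE notification chunks; the C++ below shows the firmware-side layout, which the iOS app would mirror:

```cpp
#include <cstdint>

// Hypothetical framing for BLE transfers in both directions.
// Each notification carries one header-prefixed chunk; the receiver
// reassembles chunks by index until the "last chunk" flag is seen.
enum class PayloadType : uint8_t {
  kImageJpeg = 0,  // glasses -> app: captured frame for identification
  kAudioPcm  = 1,  // app -> glasses: spoken description to play back
};

struct ChunkHeader {
  PayloadType type;
  uint8_t  flags;         // bit 0: last chunk of this payload
  uint16_t chunkIndex;    // position of this chunk within the payload
  uint32_t payloadBytes;  // total payload size, so the receiver can preallocate
} __attribute__((packed));

static_assert(sizeof(ChunkHeader) == 8, "header must fit in any BLE MTU");
```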

# Criterion For Success

The following goals are fundamental to the success of our project:

- Successful User Flow - The user should be able to look at a plant, press the interact button, and then wait for the system to return audio describing the plant.
- Accuracy - The final prototype should be able to correctly identify plants 75% of the time.
- Strong Bluetooth Connection - There should be an uninterrupted Bluetooth connection between the glasses and the mobile device. Additionally, the glasses should be fully operational within a 15-foot range of the mobile device.

The goals below are considered reach goals, and if not accomplished would not hinder the success of our project:

- Bone Conduction Audio - An alternative way of relaying the audio to the user that involves transmitting sound vibrations through the bones.
- Adjustable Audio Volume Level - Within the application system the user will be able to adjust the volume.
- Voice Activation - In addition to the push button, users have the ability to speak to begin the system process.
- Heads-up Display - A display on the glass lenses to aid in relaying the information to the user.

Featured Project

# Any-Screen to Touch-Screen Device

Team Members:

- Sakhi Yunalfian (sfy2)
- Muthu Arunachalam (muthuga2)
- Zhengjie Fan (zfan11)

# Problem

While touchscreens are becoming increasingly popular, not all screens come equipped with touch capabilities. Upgrading or replacing non-touch displays with touch-enabled ones can be costly and impractical. Users need an affordable and portable solution that can turn any screen into a fully functional touchscreen.

# Solution

The any-screen-to-touch-screen device uses four ultra-wideband sensors attached to the four corners of a screen to detect the position of a specially designed pen or hand wearable. Ultra-wideband (UWB) is a positioning technology that is lower-cost than LiDAR/camera systems yet more accurate than Bluetooth/Wi-Fi/RFID. Since UWB is highly accurate, we will use these sensors to track the location of a UWB antenna placed in the pen. In addition to the UWB tag, the pen will feature a touch-sensitive tip to detect contact with the screen (along with a redundant button to simulate screen contact if the user prefers not to constantly touch the screen). The pen will also have a gyroscope and low-profile buttons to track tilt data and offer customizable hotkeys/shortcuts. The pen and sensors communicate wirelessly with the microcontroller, which converts the pen's input data, along with its location on the screen, into touchscreen-like interactions.
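
With UWB two-way ranging, recovering the pen's position amounts to trilateration from range measurements. Below is a minimal 2D least-squares sketch, assuming the four anchors sit at known screen-corner coordinates; the screen dimensions and ranges in the example are made up:

```cpp
#include <array>
#include <cstdio>

struct Point { double x, y; };

// Estimate the pen's (x, y) on the screen plane from ranges to four corner
// anchors. Linearize by subtracting the first range equation from the others,
// then solve the resulting 2x2 normal equations.
Point trilaterate(const std::array<Point, 4>& anchor,
                  const std::array<double, 4>& range) {
  double A11 = 0, A12 = 0, A22 = 0, b1 = 0, b2 = 0;
  for (int i = 1; i < 4; ++i) {
    double ax = 2.0 * (anchor[i].x - anchor[0].x);
    double ay = 2.0 * (anchor[i].y - anchor[0].y);
    double rhs = range[0] * range[0] - range[i] * range[i]
               + anchor[i].x * anchor[i].x - anchor[0].x * anchor[0].x
               + anchor[i].y * anchor[i].y - anchor[0].y * anchor[0].y;
    A11 += ax * ax;  A12 += ax * ay;  A22 += ay * ay;   // accumulate A^T A
    b1  += ax * rhs; b2  += ay * rhs;                   // accumulate A^T b
  }
  double det = A11 * A22 - A12 * A12;
  return { (A22 * b1 - A12 * b2) / det, (A11 * b2 - A12 * b1) / det };
}

int main() {
  // Anchors at the corners of a 600 x 340 mm screen (made-up dimensions).
  std::array<Point, 4> anchors{{{0, 0}, {600, 0}, {0, 340}, {600, 340}}};
  std::array<double, 4> ranges{250.0, 427.2, 275.9, 442.8};  // mm, example values
  Point p = trilaterate(anchors, ranges);
  std::printf("pen at (%.1f, %.1f) mm\n", p.x, p.y);  // ~ (200.0, 150.0)
}
```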

# Solution Components

## Location Sensing Subsystem (Hardware)

This subsystem will employ Spark Microsystems' SR1010, a digitally programmable ultra-wideband wireless transceiver. The transceiver will be housed in an enclosure that can be attached to the corners of a screen or monitor. Each sensor unit will also need a Bluetooth module in order to communicate with the microcontroller.

## Signal Processing Subsystem (Hardware and Software)

We will use a microcontroller from the STM32F4 series (STM32F407 or STM32F429). Real-time sensor data processing consumes a considerable amount of computing power, and the STM32F4 series contains DSP instructions that streamline raw data processing and noise reduction. This subsystem will perform triangulation to accurately estimate the pen's location on the screen, along with smooth real-time data processing, latency minimization, sensitivity, and noise reduction.
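
As one concrete example of such a smoothing stage, here is a simple exponential moving average; the actual DSP pipeline may well replace this with something more sophisticated (e.g., a Kalman filter):

```cpp
// Exponential moving average: one cheap smoothing stage that the STM32F4's
// FPU handles easily at sensor rates. alpha trades responsiveness (high)
// against noise rejection (low).
struct EmaFilter {
  float alpha;
  float state = 0.0f;
  bool primed = false;

  float update(float sample) {
    if (!primed) { state = sample; primed = true; }
    else         { state += alpha * (sample - state); }
    return state;
  }
};

// Usage: filter each coordinate independently at the ranging rate.
// EmaFilter fx{0.3f}, fy{0.3f};
// screenX = fx.update(rawX); screenY = fy.update(rawY);
```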

A Bluetooth module allows each sensor to send its raw data to the microcontroller. We plan to make the communication from the sensors and the pen to the microcontroller wireless. One Bluetooth module we are considering is the HC-05.

The microcontroller itself will be wired to the relevant computer system via USB 2.0 for data transfer of touchscreen interactions.

## Pen/Hand Wearable Subsystem (Hardware)

The pen subsystem will employ a simple spring switch as a pen tip to detect pen-to-screen contact. We will also use a SparkFun DEV-08776 LilyPad button to simulate a press/pen-to-screen contact for redundancy and for users who wish to control the pen without contacting the screen. The pen will also contain several low-profile buttons and an STMicroelectronics LSM6DSO32TR gyroscope/accelerometer to provide further customizable pen functionality and potentially aid in motion tracking calculations. The pen will contain a Taoglas UWC.01 ultra-wideband tag to allow detection by the location sensing subsystem and a Bluetooth module to allow communication with the microcontroller. The unit will be enclosed within a plastic or 3D-printed housing.
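
For the tilt tracking, pitch and roll can be derived from the accelerometer's gravity vector. A short sketch, assuming readings already calibrated to units of g and a roughly static pen:

```cpp
#include <cmath>

// Derive pitch/roll (degrees) from the LSM6DSO32's accelerometer readings,
// assuming they are scaled to g and the pen is roughly static (so gravity
// dominates the measured acceleration).
struct Tilt { float pitchDeg, rollDeg; };

Tilt tiltFromAccel(float ax, float ay, float az) {
  constexpr float kRad2Deg = 57.2957795f;
  float pitch = std::atan2(-ax, std::sqrt(ay * ay + az * az)) * kRad2Deg;
  float roll  = std::atan2(ay, az) * kRad2Deg;
  return {pitch, roll};
}
```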

## Touch Screen Emulation Subsystem (Software)

A microcontroller with embedded HID device functionality will control the mouse cursor of the device connected to it. We plan to utilize the STM32F4 series microcontroller with its built-in USB HID libraries to help emulate the touch-screen effects. We will also include a simple GUI that allows the user to customize the shortcuts mapped to the pen buttons and to specify optional parameters like screen resolution, screen curve, etc.
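
The HID report descriptor is still to be designed. As a sketch, a single-touch digitizer report in absolute coordinates might look like the struct below; the field layout is our assumption, and the send call in the comment comes from ST's USB HID middleware:

```cpp
#include <cstdint>

// Hypothetical single-touch digitizer report sent over USB HID.
// Absolute coordinates are scaled to a 0..32767 logical range so the host
// maps them to whatever screen resolution the user configures in the GUI.
struct __attribute__((packed)) TouchReport {
  uint8_t  reportId;   // matches the ID declared in the HID report descriptor
  uint8_t  tipSwitch;  // bit 0: pen tip in contact with the screen
  uint16_t x;          // 0..32767, left to right
  uint16_t y;          // 0..32767, top to bottom
};

// Sending would go through the vendor USB stack, e.g. on STM32:
//   USBD_HID_SendReport(&hUsbDeviceFS, (uint8_t*)&report, sizeof(report));
// (the exact call depends on the generated USB device configuration)
```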

## Power Subsystem (Hardware)

The power subsystem is not localized in one area, since our solution consists of multiple wireless devices; however, we specify all power requirements and solutions here for organizational purposes.

For the wireless sensors in our location sensing subsystem, we plan on using battery power. Given that the UWB transceiver has ultra-low power consumption and an internal DC-DC converter, it makes sense to power each sensor unit with a small 3.3V 650mAh rechargeable battery (potential option: [https://a.co/d/acFLsSu](https://a.co/d/acFLsSu)). We will include recharging capability via a micro-USB charging port.

For our pen, we plan on using battery power as well. The gyroscope module, UWB antenna, and Bluetooth module all have low power consumption, so we plan on using the same rechargeable battery system specified above.

The microcontroller will be wired via USB 2.0 directly to the host computer in order to transmit mouse data/touchscreen interactions, and it will receive a 5V 0.9A power supply through this connection.

# Criterion For Success

## Hardware

The UWB sensor system is able to track the pen's location on the screen.

The pen is able to detect clicks, screen contact, and tilt.

The microcontroller is able to take input from the wireless pen and the wireless sensors.

Each battery-powered unit is successfully powered and able to be charged.

## Software

The pen’s input and sensor location data can be converted to mouse clicks and presses.

The pen’s buttons can be mapped to customizable shortcuts/hotkeys.

## Accuracy and Responsiveness

Touch detection and location accuracy is the most crucial criterion for our project's success. We expect our device to achieve 95% touch detection precision. In order to correctly drive a device's embedded HID protocols, the data sent and processed by the microcontroller must stay within a low error threshold when comparing cursor movements against the wearable's location.

Touch recognition and responsiveness is the next most important criterion. Within a given distance threshold, the system should detect the device with a relatively low margin of error of about 1% or less. More specifically, this criterion tests whether the communication protocol between the sensors, the USB HID peripherals, and the microcontroller can transfer data efficiently in real time, so the device can interpret the data as cursor location updates, scrolls, clicks, and more.

Latency and lag should stay below a 60 millisecond interval. This will be judged based on the DSP pipeline formed in the STM32F4 microcontroller.

## Reliability and Simplicity

We want our device to be easy to use: starting it and utilizing its functionality should be intuitive and straightforward.

We also want our device to be durable, with low chances of battery failure, mechanical failure, and gradual system degradation.

## Integration and Compatibility

We want our device to integrate with screens of different sizes and geometries and with any operating system.
