# Virtual Reality Gloves

Team Members:
- Ashton Billings (Ashton6)
- Hamza Lutfi (hamzael2)
- Aditya N (avn5)

# Problem

Despite recent breakthroughs in VR technology and experiences, it's clear that the field is still in its infancy. Our project will explore one area where the industry is lacking: good, relatively cheap hand tracking. As of now, most flagship VR devices still rely on handheld controllers, with only the more expensive products such as the Valve Index offering hand-tracking capabilities. Even these flagship devices fall short in certain areas.

# Solution

We will develop a hand-tracking system meant for VR development. To test our system, we will use the Unity game engine with its OpenXR SDK, which includes standardized hand-tracking support and the ability to deploy virtual scenes to an Oculus Quest 2 headset, the headset we will be using. We will combine multiple sensors to achieve this. The main sensors for finger tracking will be strain gauge sensors over each joint; these change resistance as they stretch, making them well suited to measuring joint bending. We also need to track where the hand is in space, which we will do with IMUs combined with sensor fusion algorithms. Our firmware will process this data, arrange it in compliance with Unity's XRHands package, and send it to Unity over BLE.

We will also implement haptic feedback in our gloves, giving the illusion of holding objects in our hands. We will do this with servo motors attached by string to each finger that lock once the finger collides with a virtual object. Collisions will be detected through Unity colliders, which provide simple true/false information about whether one object is colliding with another; when a collision occurs, we will send a command to the hardware to lock the finger.

# Solution Components

## Subsystem 1 - Hardware

We will be using five main pieces of hardware in our design: strain gauge sensors for joint tracking, IMUs for hand tracking, servo motors for haptic feedback, rechargeable lithium-ion batteries for power, and of course our nRF52840 MCU. Each of these will need its own set of complementary hardware/software to work as intended.

The strain gauge sensors will be embedded in fabric and sewn onto a glove, so that flexing a finger stretches the fabric and therefore the sensor. These sensors change in resistance as they flex, and that resistance change must be converted to a voltage for measurement. This will be done with a Wheatstone bridge, and the resulting voltage will be digitized with the ADC on board the nRF52840 chip.
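
To make the conversion chain concrete, here is a minimal sketch of turning a raw ADC count into an estimated joint angle using the quarter-bridge approximation. It is written in C# only to match the Unity-side examples later in this document (the real conversion would live in the firmware), and the excitation voltage, gauge resistance, ADC resolution, and calibration slope are all placeholder values.

```csharp
// Illustrative sketch only: raw ADC count -> bridge output voltage -> resistance change
// -> joint angle. All constants below are assumed placeholders, not measured values.
public static class StrainGaugeMath
{
    const double Vexcitation   = 3.3;    // bridge excitation voltage [V] (assumed)
    const double Rnominal      = 350.0;  // unstrained gauge resistance [ohms] (assumed)
    const double AdcFullScale  = 4095.0; // assuming a 12-bit ADC reading
    const double DegreesPerOhm = 2.0;    // calibration slope found experimentally (assumed)

    public static double AdcToJointAngle(int adcCount)
    {
        // ADC counts -> measured bridge output voltage
        double vOut = (adcCount / AdcFullScale) * Vexcitation;

        // Quarter-bridge approximation: Vout ~ (Vex / 4) * (dR / Rnominal)
        double deltaR = 4.0 * Rnominal * vOut / Vexcitation;

        // Map the resistance change to a bend angle with a linear calibration
        return deltaR * DegreesPerOhm;
    }
}
```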

The IMU will be mounted on top of the hand. Accelerometer-based tracking is prone to error because noise is double-integrated, so sensor fusion algorithms will be used. Many open-source algorithms, such as the Madgwick filter, combine data from the accelerometer, magnetometer, and gyroscope to reduce error in our position and orientation tracking.
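
To show the fusion idea concretely, below is a one-axis complementary-filter sketch. It is a deliberately simplified stand-in for the Madgwick filter mentioned above, intended only to show how gyro integration and accelerometer corrections are blended; it is in C# to match the other examples here, and the blend constant is an assumption.

```csharp
// Simplified stand-in for a full fusion filter: estimates pitch on one axis by blending
// gyro integration (good short-term) with the accelerometer's gravity vector (good long-term).
public class TiltEstimator
{
    double angleDeg;              // current pitch estimate, in degrees
    const double Alpha = 0.98;    // how much to trust the gyro each step (assumed)

    public double Update(double gyroRateDegPerSec, double accelX, double accelZ, double dt)
    {
        // Integrate the gyro rate: accurate for fast motion but drifts over time
        double gyroAngle = angleDeg + gyroRateDegPerSec * dt;

        // Compute an absolute (but noisy) pitch from the measured gravity direction
        double accelAngle = System.Math.Atan2(accelX, accelZ) * 180.0 / System.Math.PI;

        // Blend the two: the accelerometer slowly corrects the gyro's drift
        angleDeg = Alpha * gyroAngle + (1.0 - Alpha) * accelAngle;
        return angleDeg;
    }
}
```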

The servo motors will be used for haptic feedback, stopping the fingers in place when they run into virtual objects. We therefore need a servo that can hold its position, so we've opted for the MG996R. Since we will need 5 motors per hand, we will use a PWM driver module such as the PCA9685. This module controls all 5 motors through one I2C bus and frees our MCU from having to generate the PWM signals itself.

To power our system we need batteries. We will be using lithium-ion batteries, which are typical for devices like this, together with a battery management IC such as the bq2407x to charge and manage them.

Finally, we have our MCU, the nRF52840. This microcontroller is fast, low power, and built for wireless communication, making it a good fit for our plan to use BLE. It will be responsible for talking to our three main sensor/actuator components, packaging their data, and communicating it to Unity.


## Subsystem 2 - Firmware

Firmware for the nRF52840 will be written with Nordic's SDKs; the decision between the nRF Connect SDK (Zephyr RTOS based) and the “plain” nRF5 SDK will be made once we are testing on a dev board. The firmware will read data from the sensors using the nRF's ADCs and send it over BLE to the computer running the game engine, via either Zephyr's Bluetooth stack or the SoftDevice depending on the SDK chosen. The data will be sent as a compact binary serialization to keep latency as low as possible.
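
As an illustration of what this binary stream could look like on the receiving end, the C# sketch below parses one packet in Unity. The packet layout (ten 16-bit joint values followed by a four-float orientation quaternion) is an assumption made up for this example, not a finalized protocol.

```csharp
using System;

// Hypothetical packet decoder for the Unity side. The layout is a placeholder:
// 10 x uint16 joint readings, then 4 x float32 for the IMU orientation quaternion.
public struct GlovePacket
{
    public ushort[] JointAnglesRaw;   // one raw reading per tracked joint
    public float Qw, Qx, Qy, Qz;      // fused hand orientation from the IMU

    public static GlovePacket Parse(byte[] data)
    {
        var packet = new GlovePacket { JointAnglesRaw = new ushort[10] };
        int offset = 0;

        for (int i = 0; i < 10; i++, offset += 2)
            packet.JointAnglesRaw[i] = BitConverter.ToUInt16(data, offset);

        packet.Qw = BitConverter.ToSingle(data, offset); offset += 4;
        packet.Qx = BitConverter.ToSingle(data, offset); offset += 4;
        packet.Qy = BitConverter.ToSingle(data, offset); offset += 4;
        packet.Qz = BitConverter.ToSingle(data, offset);
        return packet;
    }
}
```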

## Subsystem 3 - Software

Our software will both receive data and send data for the proper functioning of our gloves. It will also provide the test scene used to verify the gloves' functionality. All of this will be done in the Unity game engine, which uses C# for game development.

Our software will take in the sensor data sent from the firmware and, via a custom XRHands provider script, convert it to work with the standardized XRHands Unity package. This package has standardized functions for XR hand operations, making it a good fit for our project. An XRHands provider is simply a script that takes our raw sensor data and converts it into the standardized XRHands data format.
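
The sketch below is a simplified illustration of that conversion step. Rather than implementing the actual XRHands provider interface, it applies decoded joint angles directly to finger bone Transforms so the data flow is easy to follow; the GlovePacket type and the angle scaling are assumptions carried over from the earlier packet sketch.

```csharp
using UnityEngine;

// Simplified illustration only: the real implementation would feed Unity's XRHands
// subsystem through a provider, not drive bone Transforms directly.
public class GloveHandDriver : MonoBehaviour
{
    public Transform[] fingerJoints;   // bone transforms of the virtual hand, in joint order
    public Transform handRoot;         // root transform oriented by the IMU data

    public void ApplyPacket(GlovePacket packet)
    {
        for (int i = 0; i < fingerJoints.Length && i < packet.JointAnglesRaw.Length; i++)
        {
            // Scale the raw 16-bit reading to a 0-90 degree bend (scaling is assumed)
            float angleDeg = packet.JointAnglesRaw[i] * (90f / 65535f);
            fingerJoints[i].localRotation = Quaternion.Euler(angleDeg, 0f, 0f);
        }

        // Orient the whole hand from the fused IMU quaternion
        handRoot.localRotation = new Quaternion(packet.Qx, packet.Qy, packet.Qz, packet.Qw);
    }
}
```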

Our software will also need to send interaction data to our glove for haptic feedback. This can be done using Unity colliders and collider callbacks. Functions like OnCollisionEnter run once two colliders first interact, so in that callback we can send a halt command to our firmware to tell the servos to stop, allowing for simple haptic feedback.
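
A minimal version of that callback might look like the following. GloveLink.SendLockCommand is a hypothetical stand-in for whatever BLE write we end up using; only OnCollisionEnter/OnCollisionExit are standard Unity callbacks.

```csharp
using UnityEngine;

// Haptic trigger sketch: lock a finger's servo while its collider touches an object.
public class FingerHaptics : MonoBehaviour
{
    public int fingerIndex;   // which finger this collider belongs to

    void OnCollisionEnter(Collision collision)
    {
        GloveLink.SendLockCommand(fingerIndex, locked: true);    // object touched: lock finger
    }

    void OnCollisionExit(Collision collision)
    {
        GloveLink.SendLockCommand(fingerIndex, locked: false);   // object released: unlock finger
    }
}

// Placeholder for our eventual BLE layer; the real version would write a GATT characteristic.
public static class GloveLink
{
    public static void SendLockCommand(int finger, bool locked)
    {
        Debug.Log($"Finger {finger} lock = {locked}");
    }
}
```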

Finally, we have our test scene. Making a simple XR test scene in Unity is straightforward, and many free examples are available online. These scenes contain interactable objects with Rigidbodies so they act like physical objects, colliders that allow for interaction, and meshes that let the computer determine where objects are. Our software will have a simple scene with objects to pick up and throw for testing.

# Criterion For Success

Our first criterion for success is that we can pick up virtual objects with the gloves, and that the gloves lock properly when grabbing objects to provide the haptic feedback.

Our next criterion for success is that the gloves have a latency of under 1 second. We can test this by moving the glove in a predictable way and timing how long it takes for Unity to react, using timestamps recorded in both the firmware and the software and comparing the two.
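
One way to sidestep synchronizing the firmware and software clocks is to measure a round trip from Unity instead: send a cheap command to the glove, have the firmware echo an acknowledgement, and take roughly half the round-trip time as the one-way latency. The sketch below assumes such an echo exists; the BLE send itself is omitted.

```csharp
using System.Diagnostics;
using UnityEngine;

// Rough latency-test sketch based on a round-trip ping; the echo mechanism is hypothetical.
public class LatencyTester : MonoBehaviour
{
    readonly Stopwatch stopwatch = new Stopwatch();

    public void SendPing()
    {
        stopwatch.Restart();
        // ... send any cheap command to the glove over BLE here (omitted) ...
    }

    // Call this from the BLE receive handler when the firmware's acknowledgement arrives
    public void OnEchoReceived()
    {
        stopwatch.Stop();
        UnityEngine.Debug.Log($"Round trip: {stopwatch.ElapsedMilliseconds} ms " +
                              $"(~{stopwatch.ElapsedMilliseconds / 2} ms one-way)");
    }
}
```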

Our final criterion for success is that the accuracy of the gloves is within a reasonable range. We can test this by moving the glove into a known position, say bending the index finger in by 30 degrees relative to the palm, and seeing how far Unity moves the virtual hand in response. We consider a reasonable range to be finger tracking within +/- 15 degrees.

# Oxygen Delivery Robot

Team Members:

- Rutvik Sayankar (rutviks2)

- Aidan Dunican (dunican2)

- Nazar Kalyniouk (nazark2)

# Problem

Children's interstitial and diffuse lung disease (ChILD) is a collection of diseases or disorders. These diseases cause a thickening of the interstitium (the tissue that extends throughout the lungs) due to scarring, inflammation, or fluid buildup. This eventually affects a patient’s ability to breathe and distribute enough oxygen to the blood.

Numerous children experience the impact of these conditions, requiring supplemental oxygen for their daily activities. This hampers the mobility and freedom of young children, diminishing their growth and confidence. Moreover, parents face an increased burden: not only caring for their child, but also having to be directly involved in managing the oxygen tank as their child moves around.

# Solution

Given the absence of relevant solutions in the current market, our project aims to ease the challenges faced by parents and provide the freedom for young children to explore their surroundings. As a proof of concept for an affordable solution, we propose a three-wheeled omnidirectional mobile robot capable of supporting filled oxygen tanks in the size range of M-2 to M-9, weighing 1 - 6kg (2.2 - 13.2 lbs) respectively (when full). Due to time constraints in the class and the objective to demonstrate the feasibility of a low-cost device, we plan to construct a robot at a ~50% scale of the proposed solution. Consequently, our robot will handle simulated weights/tanks with weights ranging from 0.5 - 3 kg (1.1 - 6.6 lbs).

The robot will have a three-wheeled omni-wheel drive train, incorporating two localization subsystems to ensure redundancy and enhance child safety. The first subsystem focuses on the drivetrain and chassis of the robot, while the second subsystem utilizes ultra-wideband (UWB) transceivers for triangulating the child's location relative to the robot in indoor environments. As for the final subsystem, we intend to use a camera connected to a Raspberry Pi and leverage OpenCV to improve directional accuracy in tracking the child.

As part of the design, we intend to create a PCB in the form of a Raspberry Pi hat, facilitating convenient access to information generated by our computer vision system. The PCB will incorporate essential components for motor control, with an STM microcontroller serving as the project's central processing unit. This microcontroller will manage the drivetrain, analyze UWB localization data, and execute corresponding actions based on the information obtained.

# Solution Components

## Subsystem 1: Drivetrain and Chassis

This subsystem encompasses the drivetrain for the 3-omni-wheel robot. It features 3 H-bridges (L298N; each IC contains two H-bridges, so we plan to lay out the hardware such that we can switch to a 4-omni-wheel drivetrain if need be) and 3 AndyMark 245 RPM 12 V gearmotors equipped with 2-channel encoders. The microcontroller will control the H-bridges. The 3-omni-wheel drive system allows zero-radius turning while simplifying the robot's design and reducing cost by minimizing the number of wheels. An omni wheel has outer rollers that spin freely about axes in the plane of the wheel, enabling sideways sliding while the wheel propels forward or backward without slip. Alongside the drivetrain, the chassis will incorporate 3 HC-SR04 ultrasonic sensors (or three bumper-style limit switches, like a Roomba), providing a redundant system to detect obstacles in the robot's path.
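
To make the omni-wheel behavior concrete, the sketch below shows the standard inverse kinematics for a three-omni-wheel holonomic drive: a desired chassis velocity and rotation rate in, individual wheel surface speeds out. It is written in C# purely for consistency with the other sketches in this document (the actual control loop would run on the STM microcontroller), and the wheel mounting angles and chassis radius are placeholders.

```csharp
using System;

// Inverse kinematics for a three-omni-wheel ("kiwi") drive. Wheel angles and chassis
// radius below are illustrative placeholders, not measurements of our robot.
public static class OmniDrive
{
    static readonly double[] WheelAnglesDeg = { 90, 210, 330 }; // mounting angles from +x (assumed)
    const double ChassisRadius = 0.15;                          // wheel distance from center [m] (assumed)

    public static double[] WheelSpeeds(double vx, double vy, double omega)
    {
        var speeds = new double[WheelAnglesDeg.Length];
        for (int i = 0; i < WheelAnglesDeg.Length; i++)
        {
            double theta = WheelAnglesDeg[i] * Math.PI / 180.0;
            // Each wheel rolls along the tangent to its mounting radius, so its speed is the
            // chassis velocity projected onto that tangent plus the rotation contribution.
            speeds[i] = -vx * Math.Sin(theta) + vy * Math.Cos(theta) + ChassisRadius * omega;
        }
        return speeds;
    }
}
```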

## Subsystem 2: UWB Localization

This subsystem implements a module based on the DW1000 Ultra-Wideband (UWB) transceiver IC, similar to the technology found in Apple AirTags. We opt for UWB over Bluetooth because of its significantly better accuracy, which comes from UWB's precise time-of-flight (ToF) distance measurement rather than mere signal strength as in Bluetooth.

This project will require three transceiver ICs, with two acting as "anchors" fixed on the robot. The distance to the third transceiver (the "tag") will always be calculated relative to the anchors. With the transceivers we are currently considering, at full transmit power the devices must be at least 18 inches apart to report a range; at minimum power they work down to a separation of about 10 inches. For the "tag," we plan to create a compact PCB containing the transceiver, a small coin cell battery, and the other components needed for proper transceiver operation. This device can be attached to a child's shirt with Velcro.
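
As a sketch of the math involved, the snippet below computes the tag's position in the robot's 2D frame from the two anchor ranges. With only two anchors there is an inherent front/back ambiguity, so the sketch simply assumes the tag is in front of the robot; names and the coordinate convention are illustrative, and it is written in C# only to match the other sketches in this write-up.

```csharp
using System;

// Two-anchor trilateration sketch: anchors sit at (-d/2, 0) and (+d/2, 0) on the robot,
// and (x, y) is the tag position implied by the two measured ranges r1 and r2.
public static class UwbLocalizer
{
    public static (double x, double y)? LocateTag(double anchorSeparation, double r1, double r2)
    {
        double d = anchorSeparation;
        double x = (r1 * r1 - r2 * r2) / (2.0 * d);
        double ySquared = r1 * r1 - (x + d / 2.0) * (x + d / 2.0);

        if (ySquared < 0)
            return null;                  // ranges are inconsistent (measurement noise)

        return (x, Math.Sqrt(ySquared));  // take the solution in front of the robot
    }
}
```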

## Subsystem 3: Computer Vision

This subsystem involves running the OpenCV library on a Raspberry Pi equipped with a camera. By employing pre-trained models, we aim to improve the reliability and directional accuracy of tracking a young child. The plan is to perform all camera-related processing on the Raspberry Pi and then translate the result into a directional command for the robot when necessary. Given that most common STM32 chips feature I2C buses, we plan to communicate between the Raspberry Pi and our microcontroller over I2C.
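
The snippet below illustrates only the decision logic at the end of that pipeline: turning the horizontal position of the detected child (the bounding-box center reported by the vision model) into a one-byte directional command for the microcontroller. The actual code would run in Python with OpenCV on the Raspberry Pi; the command values and dead-band width here are assumptions, and C# is used only to match the other sketches in this write-up.

```csharp
// Vision-to-steering decision logic sketch; command byte values and dead band are assumed.
public static class SteeringLogic
{
    public const byte CmdStraight = 0x00;
    public const byte CmdLeft     = 0x01;
    public const byte CmdRight    = 0x02;

    // boxCenterX: horizontal center of the detection in pixels; frameWidth: image width in pixels
    public static byte DirectionCommand(int boxCenterX, int frameWidth)
    {
        int offset = boxCenterX - frameWidth / 2;   // how far the child is from the image center
        int deadBand = frameWidth / 10;             // ignore small offsets to avoid oscillation

        if (offset < -deadBand) return CmdLeft;
        if (offset > deadBand)  return CmdRight;
        return CmdStraight;
    }
}
```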

## Division of Work:

We already have a 3-omni-wheel robot. It is a little smaller than our 50% scale target, but it allows us to begin work on UWB localization and computer vision immediately until a new iteration can be built. Simultaneously, we'll reconfigure the drivetrain to ensure compatibility with the additional systems we plan to implement and the ability to move the desired weight. To streamline the process, we'll allocate specific tasks to individual group members: one focusing on UWB, another on computer vision, and the third on the drivetrain. This division of work will allow parallel progress on the different aspects of the project.

# Criterion For Success

- Omni-wheel drivetrain that can drive in a specified direction.

- Close-range object detection system working (can detect objects in the path of travel).

- UWB localization down to an accuracy of < 1 m.

## Current considerations

We are currently in discussion with Greg at the machine shop about switching to a four-wheeled omni-wheel drivetrain due to the increased weight capacity and integrity of the chassis. To address the safety concerns of this particular project, we are planning to implement the following safety measures:

- Limit robot max speed to <5 MPH

- Using Empty Tanks/ simulated weights. At NO point ever will we be working with compressed oxygen. Our goal is just to prove that we can build a robot that can follow a small human.

- We are planning to work extensively to design the base of the robot to be bottom-heavy & wide to prevent the tipping hazard.