Project

| # | Title | Team Members | TA | Documents | Sponsor |
| --- | --- | --- | --- | --- | --- |
| 58 | Virtual Reality Gloves | Aditya Nebhrajani, Ashton Billings, Hamza Lutfi | Jason Zhang | other1.pdf | |
# Virtual Reality Gloves

Team Members:
- Ashton Billings (Ashton6)
- Hamza Lutfi (hamzael2)
- Aditya Nebhrajani (avn5)

# Problem

Despite recent breakthroughs in VR technology and experiences, it's clear the field is still in its infancy. Our project explores one area where the industry is lacking: good, relatively cheap hand tracking. Most flagship VR devices still rely on handheld controllers, with only the more expensive products such as the Valve Index offering hand-tracking capabilities, and even those devices fall short in certain areas.

# Solution

We will develop a hand-tracking system meant for VR development. To test it, we will use the Unity game engine combined with its OpenXR SDK, which includes standardized hand-tracking support and the ability to deploy virtual scenes to the Oculus Quest 2 headset we will be using. We will combine multiple sensors to achieve this result. The main sensors for finger tracking will be strain gauges placed over each joint; these change resistance as they stretch, making them well suited to measuring joint bending. We also need to track where the hand is in space, for which we will use IMUs combined with sensor fusion algorithms. Finally, our firmware will process this data, arrange it in compliance with Unity's XRHands package, and send it to Unity over BLE.

We will also implement haptic feedback in our gloves, giving the illusion of holding objects. Servo motors attached by string to each finger will lock once that finger collides with a virtual object. Collision detection will be done through Unity colliders, which provide simple true/false information about whether one object is touching another; when a collision occurs, we send a lock command to the hardware to stop the finger.

# Solution Components

## Subsystem 1 - Hardware

We will use five main pieces of hardware in our design: strain gauges for joint tracking, IMUs for hand tracking, servo motors for haptic feedback, rechargeable lithium-ion batteries for power, and of course our nRF52840 MCU. Each will need its own set of complementary hardware/software to work as intended.

The strain gauges will be embedded in fabric and then sewn onto a glove, so that flexing the hand inside the glove stretches the fabric and therefore the sensor. These sensors change resistance as they flex, which must be converted to a voltage for measurement. This will be done with a Wheatstone bridge, and the bridge voltage will then be digitized with an ADC on board the nRF52840 (a sketch of this conversion appears at the end of this subsection).

The IMU will be mounted on top of the hand. Accelerometers are prone to drift because position is obtained by double-integrating noisy acceleration, so sensor fusion algorithms will be used. Many open-source algorithms, such as the Madgwick filter, combine data from the accelerometer, magnetometer, and gyroscope to reduce error in our position tracking.

The servo motors will provide haptic feedback by stopping the fingers in place when they run into virtual objects. We therefore need a servo that can hold its position, so we've opted for the MG 996R. Since we will need five motors per hand, we will use a PWM driver module such as the PCA9685, which allows control of all five motors over a single I2C bus and frees our MCU from generating the PWM signals itself (see the register-level sketch below).

To power our system we will use lithium-ion batteries, as is typical of such devices, with a battery management IC such as the bq2407x to control and charge them. Finally, we have our MCU, the nRF52840. This microcontroller is fast, low power, and built for wireless communication, making it a good fit for our plan to use BLE. It will be responsible for talking to our three main sensor/actuator components, packing up the data, and communicating with Unity.
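To make the strain-gauge signal chain concrete, below is a minimal sketch of the per-sample math, assuming a quarter-bridge configuration with three fixed resistors, a 12-bit ADC, and a linear angle calibration. The `adc_read_raw()` helper and all constants are hypothetical placeholders rather than part of any real SDK, and the sketch ignores the instrumentation amplifier a real bridge front end would need.

```c
#include <stdint.h>

/* --- placeholder constants; real values come from calibration --- */
#define V_EXCITE    3.3f      /* bridge excitation voltage (V)           */
#define R_NOMINAL   10000.0f  /* unflexed strain-gauge resistance (ohms) */
#define ADC_MAX     4095.0f   /* 12-bit ADC full scale                   */
#define V_REF       3.3f      /* ADC reference voltage (V)               */
#define DEG_PER_OHM 0.09f     /* joint angle per ohm of change, measured */

/* Hypothetical driver call: returns one raw 12-bit sample. */
extern uint16_t adc_read_raw(uint8_t channel);

/*
 * Quarter bridge with three fixed R_NOMINAL resistors and the gauge Rg:
 *   Vout = Vex * ( Rg/(Rg + R) - 1/2 )
 * Solving for the gauge resistance from the measured Vout:
 *   Rg = R * (Vex + 2*Vout) / (Vex - 2*Vout)
 */
static float gauge_resistance(float v_out)
{
    return R_NOMINAL * (V_EXCITE + 2.0f * v_out)
                     / (V_EXCITE - 2.0f * v_out);
}

/* Convert one ADC sample into an approximate joint angle in degrees. */
float read_joint_angle(uint8_t channel)
{
    float v_out   = (adc_read_raw(channel) / ADC_MAX) * V_REF;
    float delta_r = gauge_resistance(v_out) - R_NOMINAL;
    return delta_r * DEG_PER_OHM;  /* linear fit from calibration */
}
```

In practice we would replace the single linear `DEG_PER_OHM` factor with a per-finger calibration table, since each sewn-in gauge will stretch slightly differently.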
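Driving the five servos through the PCA9685 comes down to setting one prescale register for the 50 Hz servo frame and four registers per channel for the pulse. The register addresses below come from the PCA9685 datasheet; the `i2c_write_reg()` helper is a hypothetical stand-in for whatever I2C driver we end up using, and the init sequence is simplified (it omits the post-wake settling delay).

```c
#include <stdint.h>

/* PCA9685 register addresses (from the NXP datasheet). */
#define PCA9685_MODE1     0x00
#define PCA9685_PRESCALE  0xFE
#define PCA9685_LED0_ON_L 0x06
#define MODE1_SLEEP       0x10
#define MODE1_AI          0x20   /* register auto-increment */

#define PCA9685_ADDR      0x40   /* default 7-bit I2C address */

/* Hypothetical platform I2C helper -- replace with the real driver. */
extern void i2c_write_reg(uint8_t addr, uint8_t reg, uint8_t val);

/* 50 Hz servo frame: prescale = round(25 MHz / (4096 * 50)) - 1 = 121 */
void pca9685_init(void)
{
    i2c_write_reg(PCA9685_ADDR, PCA9685_MODE1, MODE1_SLEEP); /* sleep to set prescale */
    i2c_write_reg(PCA9685_ADDR, PCA9685_PRESCALE, 121);
    i2c_write_reg(PCA9685_ADDR, PCA9685_MODE1, MODE1_AI);    /* wake, auto-increment on */
}

/* Set one finger's servo pulse width in microseconds (e.g. 1000-2000 us). */
void servo_set_pulse(uint8_t finger, uint16_t pulse_us)
{
    /* At 50 Hz the 4096-tick frame spans 20000 us. */
    uint16_t off  = (uint16_t)((uint32_t)pulse_us * 4096u / 20000u);
    uint8_t  base = PCA9685_LED0_ON_L + 4u * finger;

    i2c_write_reg(PCA9685_ADDR, base + 0, 0x00);             /* ON  = tick 0   */
    i2c_write_reg(PCA9685_ADDR, base + 1, 0x00);
    i2c_write_reg(PCA9685_ADDR, base + 2, off & 0xFF);       /* OFF = pulse end */
    i2c_write_reg(PCA9685_ADDR, base + 3, (off >> 8) & 0x0F);
}
```

"Locking" a finger then reduces to holding the servo at its last commanded pulse width when the halt command arrives from Unity.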
## Subsystem 2 - Firmware

Firmware for the nRF52840 will be written using the nRF5 SDK. The decision between the nRF Connect SDK (Zephyr RTOS based) and the "plain" nRF5 SDK will be made once we are testing on a dev board. The firmware will read data from the sensors using the nRF's ADCs, then send it over BLE, via either Zephyr's BLE stack or the SoftDevice, to the computer running the game engine. To keep latency as low as possible, the data will be sent as compact binary-serialized packets (a sketch of one possible packet layout appears at the end of this post).

## Subsystem 3 - Software

Our software will both receive and send data for the proper functioning of our gloves, and will also provide the test scene used to evaluate them. All of this will be done in the Unity game engine, which uses C# for game development. The software will take in the sensor data sent by the firmware and, via a custom XRHands provider script, convert it to work with the standardized XRHands Unity package. A provider is simply a script that translates our raw sensor data into the standardized XRHands data format; the package then supplies standardized functions for XR hand operations, making it a good fit for our project.

The software will also send interaction data back to the glove for haptic feedback. This can be done using Unity colliders and collider callbacks: functions like OnCollisionEnter run once two colliders touch, so inside this callback we can send a halt command to our firmware telling the servos to stop, giving simple haptic feedback.

Finally, we have our test scene. Making a simple XR test scene in Unity is straightforward, and many free ones are available online. These scenes contain interactable objects with Rigidbodies so they behave like physical objects, colliders so they can be interacted with, and meshes defining their visible geometry. Our scene will contain simple objects to pick up and throw for testing.

# Criterion For Success

Our first criterion for success is that we can pick up virtual objects with the gloves, and that the gloves lock properly when grabbing objects so the haptic feedback works.

Our second criterion is that the gloves have a latency of under 1 second. We can test this by moving the glove in a predictable way and timing how long Unity takes to react, using a timer in both the firmware and the software and comparing the two.

Our final criterion is that the gloves track within a reasonable accuracy range. We can test this by moving the glove into a known position, say bending the index finger in toward the palm by 30 degrees, and measuring how far Unity moves the virtual hand in response. We consider tracking within +/-15 degrees of the true joint angle a reasonable range.
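As referenced in the firmware subsection, below is a minimal sketch of what one binary-serialized sensor packet might look like. The field layout, names, and sizes are our own assumptions for illustration, not a finalized protocol: five joint angles from the strain gauges, an orientation quaternion from the IMU fusion, and a firmware timestamp to support the latency measurement described above.

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/*
 * Hypothetical glove-to-Unity packet, packed little-endian
 * (GCC/Clang attribute). Sizes: 1 + 1 + 5*2 + 4*4 + 4 = 32 bytes,
 * comfortably under a typical BLE notification payload once the
 * MTU is negotiated upward.
 */
typedef struct __attribute__((packed)) {
    uint8_t  version;        /* protocol version, bump on layout change */
    uint8_t  flags;          /* bit 0: servos currently locked          */
    int16_t  joint_cdeg[5];  /* per-finger bend, centidegrees           */
    float    quat[4];        /* IMU orientation (w, x, y, z)            */
    uint32_t timestamp_ms;   /* firmware clock, for latency testing     */
} glove_packet_t;

/* Serialize one packet into a byte buffer for the BLE stack. */
size_t glove_packet_serialize(const glove_packet_t *pkt,
                              uint8_t *buf, size_t buf_len)
{
    if (buf_len < sizeof(*pkt)) {
        return 0;  /* caller's buffer too small */
    }
    memcpy(buf, pkt, sizeof(*pkt));  /* struct is already packed */
    return sizeof(*pkt);
}
```

A fixed packed layout like this lets the C# side decode fields with simple byte offsets instead of a parsing library, which supports the low-latency goal.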