# Project 50: Weather-Resilient Camera System for Autonomous Vehicles

- Team Members: Adam Shore, Deyvik Bhan, Jacob Camras
- TA: John Li
- Documents: design_document1.pdf, final_paper2.pdf, grading_sheet1.pdf, presentation1.pdf, proposal1.pdf, video
# Weather-Resilient Camera System for Autonomous Vehicles


# Group members
- Adam Shore (ajshore2)
- Jacob Camras (camras3)
- Deyvik Bhan (deyvikb2)


# Problem:
Snow and freezing temperatures can severely impair the functionality of car cameras used for object detection, such as those in autonomous vehicles like Teslas. When snow or ice obstructs these cameras, the vehicle's object detection system may fail, leading to potential safety risks. Existing solutions are limited and often fail to address the real-time detection and prevention of this issue.


# Solution:
Our system keeps car cameras functional in adverse weather by integrating real-time detection and response mechanisms. Temperature and moisture sensors monitor conditions and detect when freezing or obstructions threaten visibility. If snow or ice accumulates, a targeted heating element activates to clear the lens, ensuring uninterrupted object detection.
To maintain visibility in the rain, an optical rain detection system identifies raindrops in real-time. A pretrained CNN, deployed with TinyML, processes camera images to detect raindrops. When rain appears, the system applies a hydrophobic nanocoating to repel water and prevent droplets from sticking to the lens. In heavier rain, the heating element warms the lens to evaporate moisture.
A microcontroller manages the entire system, processing sensor inputs and triggering the necessary responses. It runs the optimized TinyML-based CNN, which operates efficiently in low-power environments using an estimated 5–7 MB of memory. A rechargeable Li-ion battery with voltage regulation ensures stable power distribution.
By combining these real-time detection and response mechanisms, our system keeps car cameras clear in adverse weather conditions, improving the reliability and safety of autonomous vehicle object detection systems.

# Solution Components

## OpenCV Module with Camera Subsystem:
A low-power camera will capture images for processing. The system will use OpenCV to manage real-time image processing, enabling detection of raindrops and ice obstructions. We will use a pretrained CNN model to identify raindrops on the camera lens, similar to the following: https://github.com/tobybreckon/raindrop-detection-cnn
To summarize, the key components here are a camera module (OV7670 or OV5642) for real-time image capture and an OpenCV-based image-processing pipeline for additional detection and filtering.



## Microcontroller Subsystem:
The primary control system will be powered by an STM32 microcontroller (STM32F746NGH6, Arm® Cortex®-M7). This control unit will manage input from the sensors and trigger the heating and wiping mechanisms as needed; in effect, it acts as a rain sensor that triggers the appropriate outputs. We plan to use the DHT22 temperature and humidity sensor. Additionally, AI-based algorithms will run on the microcontroller to optimize real-time decision making, using TinyML to deploy the CNN and reduce the RAM required. The system will process images from the camera and run the TinyML-based CNN model.
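The firmware's per-cycle decision logic could look like the sketch below, which maps DHT22 readings and the CNN's rain confidence to one of the responses described above. The thresholds (0 °C for freezing, 0.5 for rain, 0.8 for heavy rain) are placeholder assumptions that would be calibrated during testing.

```python
def decide_action(temp_c, moisture_high, rain_score):
    """Map sensor inputs and CNN rain confidence to one actuator response."""
    if temp_c <= 0.0 and moisture_high:
        return "HEATER_ON"      # freezing + moisture: melt ice/snow off the lens
    if rain_score >= 0.8:
        return "HEATER_ON"      # heavy rain: warm the lens to evaporate moisture
    if rain_score >= 0.5:
        return "SPRAY_COATING"  # light rain: apply hydrophobic coating
    return "IDLE"

print(decide_action(-3.0, True, 0.1))   # -> HEATER_ON
print(decide_action(20.0, False, 0.6))  # -> SPRAY_COATING
```

On the STM32 this logic would run in C after each sensor poll and inference pass; the Python version just makes the priority ordering (ice before rain, heavy rain before light) explicit.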



## Spray-On Hydrophobic Coating Subsystem:
A mechanism will apply the hydrophobic coating when rain is detected. The spray solution is linked here: https://www.amazon.com/Nanoskin-NA-HQD16-Express-Hydrophobic-Polymer/dp/B00DOS0PMS?source=ps-sl-shoppingads-lpcontext&ref_=fplfs&psc=1&smid=ATVPDKIKX0DER&gPromoCode=sns_us_en_5_2023Q4&gQT=1
Before applying the hydrophobic coating, we will also need a spray-on cleaning step. We propose a product such as the Precision Optics Cleaning Solution – 2 oz spray bottle (PLC2S).
Once the cleaning solution is sprayed on, it must be wiped off the camera. We propose building our own wiper using this product:
https://www.amazon.com/Micro-Helicopter-Airplane-Remote-Control/dp/B072V529YD
A cloth will be attached to this wiper to wipe off the cleaning fluid before the hydrophobic coating is applied.
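The clean–wipe–coat sequence above can be sketched as a fixed actuator cycle. The step durations are guesses for illustration; `actuate` stands in for whatever GPIO/PWM calls the firmware would actually make.

```python
def run_coating_cycle(actuate):
    """Run the three-step cycle, calling actuate(step_name, seconds) for each step."""
    steps = [
        ("SPRAY_CLEANER", 1.0),     # mist the lens with cleaning solution
        ("WIPE", 2.0),              # cloth wiper removes fluid and debris
        ("SPRAY_HYDROPHOBIC", 1.0), # apply the hydrophobic coating last
    ]
    for name, duration in steps:
        actuate(name, duration)
    return [name for name, _ in steps]

log = []
run_coating_cycle(lambda name, secs: log.append(name))
print(log)  # -> ['SPRAY_CLEANER', 'WIPE', 'SPRAY_HYDROPHOBIC']
```

Encoding the order in one table makes it hard to accidentally coat a dirty lens, since the cleaner and wipe steps always precede the hydrophobic spray.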



## Heating Subsystem:
For our heating component, we will use a KIWIFOTOS USB Lens Dew Heater. This component is linked here: https://www.amazon.com/Temperature-Condensation-Prevention-Telescopes-80mm-110mm/dp/B08B4TJP6M?source=ps-sl-shoppingads-lpcontext&ref_=fplfs&smid=A2VY9ZK1UXR49Y&gQT=1&th=1
This component will be attached to our PCB so that our microcontroller can trigger it when needed, based on the sensor input.


## Power Subsystem
The battery subsystem will supply power to all components of the raindrop detection system, ensuring reliable operation in various environments. We plan to use a rechargeable Li-ion battery pack, likely 7.4 V or 12 V, depending on the power requirements of our microcontroller and sensors. A voltage regulation circuit will be implemented to step down or stabilize power for different components. A buck converter will provide a steady 3.3 V or 5 V for the microcontroller and sensors, while higher-power components, such as the heating element or additional processing units, will receive power directly as needed.
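A back-of-the-envelope budget helps size the pack. All current draws, the 2.2 Ah capacity, and the 90% buck-converter efficiency below are rough assumptions for illustration only, not measured values.

```python
# Assumed per-component loads (volts * milliamps = milliwatts).
loads_mw = {
    "stm32 + camera (3.3 V)": 3.3 * 150,  # ~150 mA through the buck converter
    "dht22 sensor (3.3 V)":   3.3 * 2,    # ~2 mA through the buck converter
    "lens heater (7.4 V)":    7.4 * 500,  # ~500 mA, powered directly when active
}
buck_efficiency = 0.9  # regulated rails pay a conversion penalty

total_mw = (loads_mw["stm32 + camera (3.3 V)"]
            + loads_mw["dht22 sensor (3.3 V)"]) / buck_efficiency \
           + loads_mw["lens heater (7.4 V)"]

battery_mwh = 7.4 * 2.2 * 1000  # assumed 7.4 V, 2.2 Ah pack
hours = battery_mwh / total_mw
print(f"total draw ~{total_mw:.0f} mW, worst-case runtime ~{hours:.1f} h")
```

Under these assumptions the heater dominates the budget, so runtime with the heater off would be several times longer; that asymmetry is why the heater is gated behind the sensor triggers rather than left on continuously.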





## PCB:
The PCB will connect the following components/subsystems:
- Camera
- Microcontroller (running the CNN) and sensors
- Heating component
- Spray-on hydrophobic coating
- Power subsystem





# Criteria for Success:

We will test our solution by manually simulating rainy and snowy conditions. We will pour water onto the lens and verify that the hydrophobic nano-coating and computer vision modules work by checking whether the coating is sprayed when the CV module detects rain. We will verify that the temperature and moisture sensors work by placing ice on the camera and confirming that the ice melts off and the camera's view is clear again.

The goals we aim to achieve for our project to be considered successful are as follows:

- The system accurately detects raindrops and ice obstructions using the TinyML-based computer vision model and sensor inputs.
- The hydrophobic nano-coating is successfully applied when raindrops are detected by the computer vision model.
- The heating element activates when the temperature and moisture sensors detect freezing conditions, effectively clearing the obstruction.
- The microcontroller efficiently processes the computer vision model while simultaneously handling sensor inputs and system activations.
- The camera remains unobstructed and functional in simulated adverse weather conditions, allowing clear vision for object detection.
- The PCB integrates all components seamlessly, ensuring stable power distribution and communication between sensors, microcontroller, and external systems.
