# 61: Stick On Car Proximity Sensor

Team Members:
Shrijan Sathish (shrijan2)
Aryan Damani (aryansd2)
Raunak Bathwal (raunakb2)

TA: Angquan Yu

Documents: final_paper1.pdf, other1.pdf, photo1.jpg, presentation1.pptx, proposal1.pdf

# Problem

Many older cars lack proximity sensors that tell the driver how close the car is to obstacles such as garage walls, parking-spot barriers, or curbs. Drivers can work around this with tricks, such as knowing where to look in the rearview or side mirrors to judge where the front, sides, or back of the car sit relative to an obstacle, but it is always better to be sure. We aim to remove this inconvenience from older-model cars.

# Solution

Our solution uses four proximity sensor modules, one placed at each corner of the car, plus a receiver placed inside the car. The modules are linked to the receiver wirelessly, and the receiver has four lights, one at each of its corners, corresponding to the four sensors. When the car approaches an obstacle, the matching light turns on and the receiver produces an auditory cue (most likely short beeps) so the driver knows how close the obstacle is and where it is. The closer the car gets to an obstacle, the faster the beeps repeat.



# Solution Components

## Subsystem 1: Proximity Sensor
The first and main subsystem is the set of sensor modules placed around the car. Each module is identical regardless of where on the car it is mounted. A module consists of 1-3 ultrasonic sensors (HC-SR04), depending on its predicted placement on the vehicle, our custom PCB, a small watch battery, and a wireless RF transceiver (WRL-10534). The module continuously transmits distance data to the receiver module located inside the vehicle so the driver is always aware of how close they are to any potential obstacle.
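
To make the sensor module's job concrete, here is a minimal sketch (Arduino-style C++ on the ATmega328P) of how one HC-SR04 could be polled. The pin assignments, the 30 ms timeout, and the use of Serial as a stand-in for the WRL-10534 link are illustrative assumptions, not design decisions.

```cpp
#include <Arduino.h>

const uint8_t TRIG_PIN = 9;   // assumed wiring
const uint8_t ECHO_PIN = 8;   // assumed wiring

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(9600);         // stand-in for the RF transceiver link
}

// Returns the measured distance in centimeters, or -1 on timeout.
long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);   // 10 us trigger pulse starts a ping
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  unsigned long echoUs = pulseIn(ECHO_PIN, HIGH, 30000UL);  // 30 ms timeout
  if (echoUs == 0) return -1;
  // Sound travels ~0.0343 cm/us; the echo covers the distance twice.
  return (long)(echoUs * 0.0343 / 2.0);
}

void loop() {
  long cm = readDistanceCm();
  Serial.println(cm);   // in the real module this would go out over the WRL-10534
  delay(100);
}
```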

## Subsystem 2: Receiver

The receiver subsystem will be located inside the vehicle. It consists of an RF receiver (WRL-10534) to communicate with the proximity sensor modules above, a power adapter that draws power from the car's USB/accessory outlet, and a microcontroller (ATmega328P) that reads the incoming distance data and, whenever the vehicle is too close to an object, outputs signals to drive the lights and speaker, using a Bluetooth module (CC2541F256TRHATQ1) if a wireless link to them is necessary.
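
Below is a rough sketch of the receiver-side decision logic, assuming the RF link delivers one "corner, distance" reading at a time; Serial stands in for the RF input, and the 30/60/120 cm thresholds are placeholders rather than design values.

```cpp
#include <Arduino.h>

enum Alert { NONE, SLOW_BEEP, FAST_BEEP, CONSTANT_BEEP };

// One foot is roughly 30 cm; the other thresholds are assumptions.
Alert alertFor(long distanceCm) {
  if (distanceCm < 0)   return NONE;           // no echo / out of range
  if (distanceCm < 30)  return CONSTANT_BEEP;  // under ~1 ft: solid tone
  if (distanceCm < 60)  return FAST_BEEP;
  if (distanceCm < 120) return SLOW_BEEP;
  return NONE;
}

long cornerDistanceCm[4] = {-1, -1, -1, -1};  // latest reading per sensor module

void setup() {
  Serial.begin(9600);   // stand-in for the RF receiver input
}

void loop() {
  // Assumed packet format: "<corner>,<distance_cm>\n", e.g. "2,45"
  if (Serial.available()) {
    int corner = Serial.parseInt();
    long cm    = Serial.parseInt();
    Serial.read();      // consume the trailing newline
    if (corner >= 0 && corner < 4) cornerDistanceCm[corner] = cm;
  }

  for (int i = 0; i < 4; i++) {
    Alert a = alertFor(cornerDistanceCm[i]);
    // Here the real firmware would drive the matching LED and the speaker
    // (directly, or over the Bluetooth module if needed).
    (void)a;
  }
}
```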

## Subsystem 3: Lights + Speaker
The light and speaker subsystem will consist of a small speaker we already have, whose beep rate changes with how close the object is, combined with a set of red LEDs that indicate which sensor is being triggered, so the driver knows which direction to avoid.
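
As an illustrative sketch of the intended behavior, the snippet below shortens the gap between beeps as the obstacle gets closer and switches to a constant tone under roughly one foot; the pin numbers, tone frequency, and timing constants are assumptions for illustration only.

```cpp
#include <Arduino.h>

const uint8_t SPEAKER_PIN = 6;
const uint8_t LED_PINS[4] = {2, 3, 4, 5};  // one LED per corner

void setup() {
  pinMode(SPEAKER_PIN, OUTPUT);
  for (uint8_t i = 0; i < 4; i++) pinMode(LED_PINS[i], OUTPUT);
}

// Map distance to the gap between beeps: closer -> shorter gap.
unsigned long beepIntervalMs(long distanceCm) {
  if (distanceCm < 30) return 0;                   // under ~1 ft: constant beep
  return constrain(distanceCm * 5L, 100L, 1000L);  // e.g. 60 cm -> 300 ms gap
}

void alertCorner(uint8_t corner, long distanceCm) {
  bool active = (distanceCm >= 0 && distanceCm < 120);  // assumed 1.2 m range
  digitalWrite(LED_PINS[corner], active ? HIGH : LOW);

  if (!active) { noTone(SPEAKER_PIN); return; }

  unsigned long interval = beepIntervalMs(distanceCm);
  if (interval == 0) {
    tone(SPEAKER_PIN, 2000);                       // solid 2 kHz tone
  } else if ((millis() / interval) % 2 == 0) {
    tone(SPEAKER_PIN, 2000);                       // beep phase
  } else {
    noTone(SPEAKER_PIN);                           // silent phase
  }
}

void loop() {
  // Demo only: pretend corner 0 reports 45 cm; real readings come from the receiver.
  alertCorner(0, 45);
}
```

The constant-beep threshold at roughly 30 cm in this sketch lines up with the one-foot target in the criterion for success below.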

# Criterion For Success

Our first criterion for success is testing with an actual car: the receiver should produce a constant beep when the car is less than one foot from an obstacle, which confirms that the sensors work. Our second criterion is to have someone use the system and determine whether they can stop before, or avoid, obstacles with a reasonably safe margin of error.



