# HydroFlora (A Context-Aware Watering Can)

Project 55

Team Members: Charis Wang, Delilah Dzulkafli, Idris Ispandi, Mingrui Liu

Documents: design_document1.pdf, photo1.png, proposal1.pdf
# Team Members:

* Idris Ispandi (mm120)

* Delilah Dzulkafli (delilah5)

# Problem:

Many people care for multiple houseplants with different watering needs, but watering is typically done by intuition and inconsistent habits. Because plant type, pot size, soil type, and moisture all affect how much water a plant actually needs, manual watering often results in overwatering or underwatering. Overwatering can lead to root rot, fungus gnats, and wasted water, while underwatering causes plant stress, slowed growth, and wilting. Existing reminders or generic schedules don’t adapt to real-time soil conditions, and fully automated irrigation systems can be too expensive, complex, or impractical for small indoor plant collections. There is a need for a simple, low-effort tool that helps users deliver the correct amount of water per plant based on measured soil dryness and plant/pot-specific requirements, without requiring a permanent installed system.

# Solution:

In order to maintain optimal conditions for plants, we propose a smart watering can. The system has two working parts: the MCU connected to a water pump (on the watering can), and a modular sensing unit (in each plant's pot). When you get a new plant, you enter its type into the MCU, and the recommended amount of water for that plant is stored. Each sensor unit periodically broadcasts its readings, so when you pick up the watering can, it tells you which plant needs water based on the current readings and previous watering logs. You select the plant, go to the corresponding pot, and press dispense; the MCU then tells the pump to dispense the needed amount of water, computed as the recommended moisture level minus the current moisture level. This way, we can ensure that each plant receives the optimal amount of water to grow.
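The dispensing rule above can be sketched as a small function. The linear moisture-to-volume mapping and the calibration factor are illustrative assumptions, not final values:

```python
# Sketch of the per-plant dispensing rule. The mL-per-percent
# calibration factor is a placeholder; real values would come from
# calibrating each pot size and soil type.

def dispense_volume_ml(recommended_pct, current_pct, ml_per_pct):
    """Water needed = (recommended moisture - current moisture),
    scaled by a pot-specific calibration factor (mL per % moisture)."""
    deficit = recommended_pct - current_pct
    if deficit <= 0:
        return 0.0  # already at or above target: dispense nothing
    return deficit * ml_per_pct

# Example: a plant targeting 60% moisture, currently at 45%,
# in a pot calibrated at 8 mL per percentage point.
print(dispense_volume_ml(60, 45, 8.0))  # 120.0
```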

# Solution Components:

- ## Subsystem 1 (Water Dispensing Unit):
Components: Peristaltic Liquid Pump with Silicone Tubing

Driven by the MCU, this unit is responsible for dispensing the required amount of water. This will be placed in the watering can.
https://www.digikey.com/en/products/detail/adafruit-industries-llc/1150/5638299
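Since a peristaltic pump is open-loop, the MCU would convert a target volume into a pump run time. A minimal sketch, assuming the pump's nominal ~100 mL/min rating (real flow rate would be measured during calibration):

```python
# Rough open-loop timing for the peristaltic pump: run time equals
# target volume divided by flow rate. The 100 mL/min default is the
# pump's nominal rating; an actual build would calibrate this value.

def pump_on_seconds(target_ml, flow_ml_per_min=100.0):
    """Seconds to keep the pump energized for a target volume."""
    return target_ml * 60.0 / flow_ml_per_min

print(pump_on_seconds(120))  # 72.0 seconds at 100 mL/min
```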


- ## Subsystem 2 (Sensor Node):
Components: Capacitive Soil Moisture Sensor SKU:SEN0193, ESP32-C3-WROOM-02, battery and regulator

This unit has a sensor attached to the plant's pot to measure soil moisture, and the readings will be transmitted to the main control unit periodically via WiFi or Bluetooth (the tradeoffs are still being weighed).
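The capacitive sensor outputs a raw analog value, so the node needs a calibration step to convert it to a moisture percentage. A sketch of a linear two-point calibration; the endpoint ADC values below are made-up placeholders (each sensor would be calibrated in dry air and in water):

```python
# Illustrative two-point calibration for a capacitive moisture sensor:
# raw ADC values are mapped linearly between a "dry air" endpoint and a
# "submerged in water" endpoint. Endpoint values are placeholders.

AIR_RAW = 2600    # assumed ADC reading in dry air (0% moisture)
WATER_RAW = 1200  # assumed ADC reading in water (100% moisture)

def moisture_percent(raw, air=AIR_RAW, water=WATER_RAW):
    """Map a raw ADC reading to 0-100% moisture, clamped to range."""
    pct = (air - raw) / (air - water) * 100.0
    return max(0.0, min(100.0, pct))

print(moisture_percent(1900))  # 50.0
```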
https://www.digikey.com/en/products/detail/dfrobot/SEN0193/6588605


- ## Subsystem 3 (Main Control Unit):
Components: ESP32-C3-WROOM-02, LCD display, buttons

This acts as the device's main control unit. When the user selects a plant with the buttons (pre-defined for the prototype), the LCD displays which plant is selected. The unit is then responsible for determining the amount of water to be pumped based on the readings received from that plant's moisture sensor.
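The selection logic on the control unit can be sketched as picking the plant with the largest moisture deficit relative to its stored target. The plant names and target values here are made-up examples:

```python
# Sketch of the control unit's plant-selection logic: compare each
# sensor node's latest reading against that plant's stored target and
# pick the largest deficit. All names and numbers are illustrative.

def plant_to_water(targets, readings):
    """targets/readings: dicts of plant -> moisture %. Returns the
    plant with the biggest deficit, or None if none is below target."""
    deficits = {p: targets[p] - readings[p]
                for p in targets if readings[p] < targets[p]}
    if not deficits:
        return None
    return max(deficits, key=deficits.get)

targets = {"fern": 60, "cactus": 20, "pothos": 50}
readings = {"fern": 55, "cactus": 18, "pothos": 30}
print(plant_to_water(targets, readings))  # pothos (20-point deficit)
```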

- ## Subsystem 4 (Physical Build):

Components: A watering can

The MCU will be attached at the top of the watering can in a waterproof enclosure. The mounting will be discussed with the machine shop for further input.

- ## Subsystem 5 (Power Management):
Components: Rechargeable Battery for MCU and LiPo battery for sensor unit

This subsystem provides rechargeable power and a stable 3.3 V supply for our electronics. The pump, the sensor node, and the control unit will each have separate power systems.


# Criterion For Success:
This project will be considered successful if the system can reliably receive soil moisture data from multiple sensor nodes (sensor readings are stable under fixed conditions), accurately determine which plant needs watering, and dispense water within 10% of the target volume while maintaining stable operation:


- Sensor nodes produce stable, repeatable moisture values, with readings that increase after watering and decrease over time.

- Sensor nodes can successfully broadcast soil moisture readings to the main control unit.

- The system accurately determines which plant needs watering based on moisture level.

- The pump dispenses water within 10% of the target volume.

- Different plants result in different dispense volumes.

- A sensor node operates continuously for >24 hours on battery without recharging.

- The electronics remain functional after watering.
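The 10% dispensing criterion above lends itself to a simple acceptance check per trial; the example volumes below are illustrative:

```python
# Acceptance check for the dispensing criterion: a trial passes if the
# measured volume is within 10% of the target, as stated above.

def within_tolerance(target_ml, measured_ml, tol=0.10):
    """True if the measured volume is within tol (fraction) of target."""
    return abs(measured_ml - target_ml) <= tol * target_ml

print(within_tolerance(100, 108))  # True  (8% error)
print(within_tolerance(100, 115))  # False (15% error)
```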

Decentralized Systems for Ground & Aerial Vehicles (DSGAV)

Mingda Ma, Alvin Sun, Jialiang Zhang

Featured Project

# Team Members

* Yixiao Sun (yixiaos3)

* Mingda Ma (mingdam2)

* Jialiang Zhang (jz23)

# Problem Statement

Autonomous delivery over drone networks has become one of the new trends that can save a tremendous amount of labor. However, it is very difficult to scale up due to the inefficiency of multi-rotor collaboration, especially when the drones are carrying payloads. In order to actually deploy in big cities, we could take advantage of the large ground vehicle networks that already exist with rideshare companies like Uber and Lyft. The roof of an automobile has plenty of space to hold regular-size packages with magnets, and the drone network can then optimize for flight time and efficiency while factoring in ground vehicle plans. While dramatically increasing delivery coverage and efficiency, such a strategy raises the challenging problem of docking drones onto moving ground vehicles.

# Solution

We aim to tackle a particular component of this project given the scope and time limitations. We will implement a decentralized multi-agent control system that synchronizes a ground vehicle and a drone when they are in close proximity. Assumptions such as knowledge of the vehicle states will be made, as this project aims to be a proof of concept of a core challenge; as we progress, we aim to lift as many of those assumptions as possible. The infrastructure of the lab, the drone, and the ground vehicle will be provided by our kind sponsor, Professor Naira Hovakimyan. When the drone approaches the target and gains visual contact with the ground vehicle, it will automatically send a docking request through an RF module. The RF receiver on the vehicle will then automatically turn on its assistant devices, such as specific LED light patterns, which aid motion synchronization between the ground and aerial vehicles. The ground vehicle will also periodically send its locally planned path to the drone so the drone can predict the ground vehicle's trajectory a couple of seconds into the future. This prediction helps the drone stay within close proximity to the ground vehicle by optimizing against a reference trajectory.
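The prediction step described above can be sketched as interpolation along the vehicle's planned path. The timestamped-waypoint format and the piecewise-linear motion assumption are ours for illustration, not the lab's actual planner interface:

```python
# Sketch of the drone-side prediction: given the ground vehicle's
# locally planned path as timestamped (t, x, y) waypoints, linearly
# interpolate its position a few seconds into the future. Waypoint
# format and linear-segment assumption are illustrative.

def predict_position(path, t):
    """path: list of (time, x, y) waypoints sorted by time.
    Interpolates within the path; holds the endpoints outside it."""
    if t <= path[0][0]:
        return path[0][1], path[0][2]
    for (t0, x0, y0), (t1, x1, y1) in zip(path, path[1:]):
        if t <= t1:
            a = (t - t0) / (t1 - t0)
            return x0 + a * (x1 - x0), y0 + a * (y1 - y0)
    return path[-1][1], path[-1][2]  # past the plan: hold last point

path = [(0.0, 0.0, 0.0), (2.0, 4.0, 0.0), (4.0, 4.0, 2.0)]
print(predict_position(path, 3.0))  # (4.0, 1.0)
```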

### The hardware components include:

Provided by Research Platforms

* A drone

* A ground vehicle

* A camera

Developed by our team

* An LED based docking indicator

* RF communication modules (xbee)

* Onboard compute and communication microprocessor (STM32F4)

* Standalone power source for RF module and processor

# Required Circuit Design

We will integrate the power source, the RF communication module, and the LED tracking assistant together with our microcontroller on our PCB. The circuit will also automatically trigger the tracking assistant to facilitate docking. This circuit is designed specifically to demonstrate the drone's ability to precisely track and dock onto the ground vehicle.

# Criterion for Success -- Stages

1. When the ground vehicle is moving slowly in a straight line, the drone can autonomously take off from an arbitrary location and end up following it within close proximity.

2. The drone remains in close proximity when the ground vehicle is slowly turning (or navigating arbitrarily at slow speed).

3. The drone can dock autonomously onto the ground vehicle while it is moving slowly in a straight line.

4. The drone can dock autonomously onto the ground vehicle while it is slowly turning.

5. Increase the speed of the ground vehicle and successfully perform tracking and/or docking.

6. The drone can pick up packages while flying synchronously with the ground vehicle.

We consider the project complete at stage 3. The later stages are considered advanced features, depending on actual progress.
