Design Project: Gesture-Based Device Controller

C++, Assembly | Microprocessor System Design & Digital Logic


A Top-Down View of the Project with all I/O sensors and devices connected to the STM32F103RB microcontroller

For this design project, I set myself an individual challenge: build a gesture-based device controller that integrates what I learned about microcontrollers, logic devices, and communication interfaces. The result is a hands-free control system that uses a Grove PAJ7620U2 gesture sensor (I²C) and a SparkFun AT42QT101 capacitive touch sensor to activate and deactivate appliances without the need to press buttons. For robustness, manual override push-buttons keep the system usable even if gesture detection falters, while an LED indicator clearly communicates system status. Powered by the STM32F103RB Nucleo-64 board, the project pulls together analog signals, I²C, USART, SPI, GPIO control, and power-management considerations into one cohesive, user-friendly design, demonstrating both technical skill and real-world applicability in home automation.
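To give a flavour of how the gesture path works, here is a minimal sketch of reading the PAJ7620U2 over I²C with the STM32 HAL. It assumes the sensor's default 7-bit address (0x73) and the gesture flag register from its datasheet, and it assumes the relay sits on PB0; the actual register map, flag bits, and pin assignment should be checked against your wiring and library rather than taken from this sketch.

```cpp
// Minimal sketch: reading PAJ7620U2 gesture flags over I2C with the STM32 HAL.
// Assumes hi2c1 is already configured by the HAL/CubeMX init code.
#include "stm32f1xx_hal.h"

extern I2C_HandleTypeDef hi2c1;          // I2C peripheral wired to the Grove sensor

#define PAJ7620_ADDR      (0x73 << 1)    // 7-bit address 0x73, shifted for the HAL
#define PAJ7620_GES_REG   0x43           // gesture detection flag register (bank 0)

// Gesture flag bits as listed in the PAJ7620U2 datasheet
#define GES_RIGHT   0x01
#define GES_LEFT    0x02
#define GES_UP      0x04
#define GES_DOWN    0x08

// Returns the raw gesture flags, or 0 if nothing was detected or the read failed.
static uint8_t readGestureFlags(void)
{
    uint8_t flags = 0;
    if (HAL_I2C_Mem_Read(&hi2c1, PAJ7620_ADDR, PAJ7620_GES_REG,
                         I2C_MEMADD_SIZE_8BIT, &flags, 1, 10) != HAL_OK)
    {
        return 0;                        // treat bus errors as "no gesture"
    }
    return flags;
}

// Example use: an up swipe switches the appliance on, a down swipe switches it off.
void handleGestures(void)
{
    uint8_t g = readGestureFlags();
    if (g & GES_UP)   HAL_GPIO_WritePin(GPIOB, GPIO_PIN_0, GPIO_PIN_SET);   // relay on  (PB0 assumed)
    if (g & GES_DOWN) HAL_GPIO_WritePin(GPIOB, GPIO_PIN_0, GPIO_PIN_RESET); // relay off
}
```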

Tools Used

Behind the Build: A Visual Story

Key Questions I Tried to Answer

  1. Can the PAJ7620U2 distinguish gestures from ambient noise?
  2. Polling or interrupts: which approach best merges gesture and touch inputs? (See the sketch after this list.)
  3. What’s the latency from gesture detection to relay activation?
  4. How can manual override buttons be added so the system fails gracefully?
  5. Is gesture‑only control user‑friendly, and does LED feedback suffice?
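For question 2, the interrupt-driven alternative to polling looks roughly like the sketch below. The pin choices (PC7 for the gesture sensor's INT line, PB5 for the touch sensor's output) are placeholders for illustration, not the exact wiring from the build.

```cpp
// Interrupt-driven variant: the gesture sensor's INT line and the touch
// sensor's OUT line each raise a flag in an EXTI callback instead of being polled.
#include "stm32f1xx_hal.h"

volatile uint8_t gestureEvent = 0;   // set in the ISR, consumed in the main loop
volatile uint8_t touchEvent   = 0;

void HAL_GPIO_EXTI_Callback(uint16_t GPIO_Pin)
{
    if (GPIO_Pin == GPIO_PIN_7)      // PAJ7620U2 INT asserted: a gesture is pending
        gestureEvent = 1;
    else if (GPIO_Pin == GPIO_PIN_5) // capacitive touch pad pressed
        touchEvent = 1;
}

// The main loop stays short: the I2C read happens only when the sensor signals,
// which keeps bus traffic (and latency jitter) lower than continuous polling.
void controlLoop(void)
{
    while (1)
    {
        if (gestureEvent)
        {
            gestureEvent = 0;
            handleGestures();        // I2C read + relay logic from the earlier sketch
        }
        if (touchEvent)
        {
            touchEvent = 0;
            HAL_GPIO_TogglePin(GPIOB, GPIO_PIN_0);   // touch toggles the relay
        }
        // manual override push-buttons can still be polled here as a fallback
    }
}
```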

Steps Taken for Analysis

  1. Reviewed microcontroller and peripheral interface labs (I²C, GPIO, USART).
  2. Bench‑tested gesture and touch sensors under different conditions.
  3. Compared polling vs interrupt-driven sensor reads and measured latency (see the timing sketch after this list).
  4. Implemented firmware on board, integrating relay and manual overrides.
  5. Evaluated response times, LED feedback clarity, and power consumption.
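For step 3, gesture-to-relay latency can be timed on-chip. The snippet below sketches one way to do that with the Cortex-M3 DWT cycle counter available on the STM32F103, assuming a 72 MHz system clock; it reuses the hypothetical handleGestures() routine from the earlier sketch.

```cpp
// Timing sketch: measure gesture-to-relay latency with the Cortex-M3 DWT
// cycle counter. Assumes a 72 MHz SYSCLK on the STM32F103RB.
#include "stm32f1xx_hal.h"

static void dwtInit(void)
{
    CoreDebug->DEMCR |= CoreDebug_DEMCR_TRCENA_Msk;   // enable the trace block
    DWT->CYCCNT = 0;
    DWT->CTRL  |= DWT_CTRL_CYCCNTENA_Msk;             // start the cycle counter
}

// Called when a gesture event arrives; returns the elapsed time in microseconds
// once the relay pin has actually been driven.
static uint32_t timedRelayActivation(void)
{
    uint32_t start = DWT->CYCCNT;

    handleGestures();                                  // I2C read + relay write

    uint32_t cycles = DWT->CYCCNT - start;
    return cycles / 72;                                // 72 cycles per microsecond at 72 MHz
}
```

Readings like these could be logged over the board's USART link (already part of the design) to compare the polling and interrupt-driven builds side by side.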

Live Demo

Thank you for taking the time to view my project!