Wearable Gesture Detection Prototype

Client: AI | Published: 05.01.2026

I have a working concept for a small wearable that must recognize hand-based gestures in real time and transmit them as standard Bluetooth HID events. The core of the job is to build and refine the OpenCV pipeline that does the vision-side gesture detection, then link the recognized gestures to the device's Bluetooth stack so they appear to the host (phone, tablet, PC, or headset) exactly like native keyboard, mouse, or media-control inputs. The prototype board already exposes a camera interface, a basic MCU, and touch-sensing pads for additional context cues.

Your task is to:

• implement and tune the OpenCV routines for robust gesture detection under varied lighting
• integrate the recognition results into the existing Bluetooth HID profile, mapping each gesture to predefined keycodes or consumer-control codes
• wire up any supplementary touch or proximity sensors so the system can intelligently combine vision and touch data before sending the final HID event
• supply concise build instructions and a short demo video showing the system driving a host device from the wearable itself

(To make these points concrete, rough illustrative sketches of the vision, HID-mapping, and sensor-fusion pieces are included at the end of this brief.)

Current toolchain: C/C++ on an ARM-based MCU, OpenCV 4.x, GCC, and a Nordic-style Bluetooth LE stack (though I'm open to an ESP32 or similar if you can justify the switch). The form factor is roughly smartwatch-sized, so memory and power budgets matter.

Acceptance criteria:

– >95 % gesture classification accuracy across my four reference gestures in indoor daylight and office lighting
– HID latency under 100 ms from gesture completion to host event
– complete source code and documented pin / peripheral assignments

Previous interactive hardware, AR headset, or miniature projector experience is a bonus, because the roadmap includes spatial overlays later. Let's keep initial communication focused on your relevant OpenCV, Bluetooth HID, and low-power hardware work so we can move straight into sprint planning.
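
To anchor the vision discussion, here is a minimal sketch of one plausible detection stage: skin segmentation in YCrCb followed by convexity-defect finger counting. The gesture names, thresholds, and segmentation bounds are all placeholder assumptions, not a required design; I expect the chosen approach to be tuned (or replaced) against the four reference gestures and the lighting conditions in the acceptance criteria.

```cpp
// Sketch only: YCrCb skin segmentation + convexity-defect finger counting.
// All thresholds and gesture names below are placeholders to be tuned.
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>
#include <algorithm>
#include <vector>

enum Gesture { GESTURE_NONE, GESTURE_FIST, GESTURE_POINT, GESTURE_TWO_FINGERS, GESTURE_OPEN_PALM };

Gesture classifyFrame(const cv::Mat& bgr) {
    cv::Mat ycrcb, mask;
    cv::cvtColor(bgr, ycrcb, cv::COLOR_BGR2YCrCb);
    // Cr/Cb skin bounds are more stable across lighting than raw RGB;
    // these are common starting values, not final ones.
    cv::inRange(ycrcb, cv::Scalar(0, 133, 77), cv::Scalar(255, 173, 127), mask);
    cv::morphologyEx(mask, mask, cv::MORPH_OPEN,
                     cv::getStructuringElement(cv::MORPH_ELLIPSE, cv::Size(5, 5)));

    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
    if (contours.empty()) return GESTURE_NONE;

    // Take the largest blob as the hand candidate; reject small blobs as noise.
    auto hand = std::max_element(contours.begin(), contours.end(),
        [](const std::vector<cv::Point>& a, const std::vector<cv::Point>& b) {
            return cv::contourArea(a) < cv::contourArea(b);
        });
    if (cv::contourArea(*hand) < 2000.0) return GESTURE_NONE;

    // Deep convexity defects approximate the valleys between extended fingers,
    // so the deep-defect count is roughly (extended fingers - 1).
    std::vector<int> hullIdx;
    cv::convexHull(*hand, hullIdx, false, false);
    std::vector<cv::Vec4i> defects;
    if (hullIdx.size() > 3) cv::convexityDefects(*hand, hullIdx, defects);

    int valleys = 0;
    for (const cv::Vec4i& d : defects)
        if (d[3] / 256.0 > 15.0) ++valleys;   // fixed-point depth, in pixels

    if (valleys >= 4) return GESTURE_OPEN_PALM;
    if (valleys == 1) return GESTURE_TWO_FINGERS;
    if (valleys == 0) {
        // Fist and a single pointed finger both yield zero valleys; one cheap
        // disambiguating cue is the bounding-box aspect ratio of the contour.
        cv::Rect box = cv::boundingRect(*hand);
        double aspect = static_cast<double>(std::max(box.width, box.height)) /
                        std::max(1, std::min(box.width, box.height));
        return aspect > 1.6 ? GESTURE_POINT : GESTURE_FIST;
    }
    return GESTURE_NONE;  // 2-3 valleys: not one of the four reference gestures
}
```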
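On the HID side, the sketch below shows the kind of gesture-to-consumer-control mapping I have in mind. The usage IDs are standard HID Consumer Page (0x0C) codes; the report layout (a single 16-bit usage slot) and the ble_hid_send_report() function are placeholders for whatever input-report call the chosen BLE stack exposes (for example, ble_hids_inp_rep_send in Nordic's SDK), and the actual gesture-to-code assignments will be agreed during sprint planning.

```cpp
// Sketch only: map a classified gesture to a HID consumer-control report.
// ble_hid_send_report() stands in for the stack-specific input-report call.
#include <cstdint>
#include <cstring>

enum Gesture { GESTURE_NONE, GESTURE_FIST, GESTURE_POINT, GESTURE_TWO_FINGERS, GESTURE_OPEN_PALM };

// Standard USB HID Consumer Page (0x0C) usage IDs.
constexpr uint16_t USAGE_PLAY_PAUSE = 0x00CD;
constexpr uint16_t USAGE_SCAN_NEXT  = 0x00B5;
constexpr uint16_t USAGE_VOL_UP     = 0x00E9;
constexpr uint16_t USAGE_VOL_DOWN   = 0x00EA;

// Placeholder for the stack's input-report send; returns 0 on success.
extern int ble_hid_send_report(uint8_t report_id, const uint8_t* data, uint16_t len);

int sendGestureAsHid(Gesture g) {
    uint16_t usage = 0;
    switch (g) {  // example assignments; the final mapping is per the spec
        case GESTURE_OPEN_PALM:   usage = USAGE_PLAY_PAUSE; break;
        case GESTURE_TWO_FINGERS: usage = USAGE_SCAN_NEXT;  break;
        case GESTURE_POINT:       usage = USAGE_VOL_UP;     break;
        case GESTURE_FIST:        usage = USAGE_VOL_DOWN;   break;
        default: return 0;  // no gesture, nothing to send
    }
    uint8_t report[2];
    std::memcpy(report, &usage, sizeof(usage));  // assumes little-endian MCU (typical ARM)
    int err = ble_hid_send_report(/*report_id=*/1, report, sizeof(report));
    if (err == 0) {
        report[0] = report[1] = 0;               // follow up with an empty (release) report
        err = ble_hid_send_report(1, report, sizeof(report));
    }
    return err;
}
```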
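Finally, a rough sketch of how vision and touch might be combined before anything is sent over the air. It reuses classifyFrame() and sendGestureAsHid() from the sketches above; readTouchPad() and millis() are placeholder names for whatever the board support package provides. The idea is simply to require a few consecutive agreeing frames plus a touch-pad confirmation, which suppresses lighting-induced flicker, while timestamping the first detection so the 100 ms latency budget can be verified.

```cpp
// Sketch only: gate HID output on stable vision results plus touch confirmation.
#include <opencv2/core.hpp>
#include <cstdint>

// From the earlier sketches:
enum Gesture { GESTURE_NONE, GESTURE_FIST, GESTURE_POINT, GESTURE_TWO_FINGERS, GESTURE_OPEN_PALM };
Gesture classifyFrame(const cv::Mat& bgr);
int sendGestureAsHid(Gesture g);

extern bool     readTouchPad();  // placeholder: true while the context pad is touched
extern uint32_t millis();        // placeholder: milliseconds since boot

constexpr int      STABLE_FRAMES     = 3;    // consecutive agreeing frames required
constexpr uint32_t LATENCY_BUDGET_MS = 100;  // acceptance criterion from this brief

void processFrame(const cv::Mat& frame) {
    static Gesture  pending     = GESTURE_NONE;
    static int      stableCount = 0;
    static uint32_t firstSeenMs = 0;

    Gesture g = classifyFrame(frame);
    if (g == pending && g != GESTURE_NONE) {
        ++stableCount;
    } else {
        pending     = g;
        stableCount = (g == GESTURE_NONE) ? 0 : 1;
        firstSeenMs = millis();
    }

    // Commit only when vision is stable AND the touch pad confirms intent.
    if (stableCount >= STABLE_FRAMES && readTouchPad()) {
        uint32_t elapsed = millis() - firstSeenMs;
        (void)elapsed;  // log or assert against LATENCY_BUDGET_MS in practice
        sendGestureAsHid(pending);
        pending     = GESTURE_NONE;  // re-arm for the next gesture
        stableCount = 0;
    }
}
```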