Real-Time Music Vibe Sync

Client: AI | Published: 19.03.2026
Budget: $250

I am wrapping up a cross-language project that drives Lovense toys in perfect sync with the beat of any playing track. The core logic is in place; what I need now is a sharp pair of eyes and hands to solidify testing, polish the deployment flow, and be certain the audio-to-toy pipeline keeps pace under real-world conditions.

Here is where you step in:

• Focus first on code testing and deployment. A complete battery of unit and integration tests is required so every beat detected by BeatNet reliably translates into the expected intensity command over WebSockets and the Lovense API.
• The real-time audio analysis already detects beats accurately, yet the hand-off to toy control still needs work. I want the analysis, FFMPEG stream handling, and Lovense command queue wired together with zero missed pulses.
• Deployment must be containerised. Extend the existing Dockerfile, add any Bash helpers, and make sure a one-command build brings up the Python back-end, the HTML/JS/CSS front-end, and the WebSocket bridge in a reproducible environment.

Tech stack already in the repo: Python 3.11, Java components that broker low-latency WebSocket traffic, vanilla HTML/CSS/JS for the UI, BeatNet for beat extraction, FFMPEG for stream decoding, and buttplug.io as a secondary control path. The source lives in a private GitHub repository; I will grant access the moment we start.

Acceptance criteria:
1. Complete automated test suite covering beat detection through Lovense actuation at multiple BPMs.
2. Docker image builds and runs with a single command on Linux x86-64.
3. End-to-end latency from audio input to toy response ≤ 80 ms in local tests.
4. Clean, documented commits pushed back to the GitHub repository.

If you are comfortable juggling Python and Java side by side, fluent with shell scripting, and have prior experience putting hardware APIs like Lovense into production, I would love your help getting this across the finish line.
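To give a feel for the beat-to-intensity hand-off being described, here is a minimal sketch. The linear BPM-to-level mapping, the function names, and the exact JSON command shape are my assumptions for illustration, not the repo's actual protocol; the real Lovense command format should be taken from the project code.

```python
import json


def bpm_to_intensity(bpm: float, min_bpm: float = 60.0, max_bpm: float = 180.0) -> int:
    """Map a detected tempo to a vibration level in Lovense's 0-20 range.

    BPM values outside [min_bpm, max_bpm] are clamped so the toy never
    receives an out-of-range intensity.
    """
    clamped = max(min_bpm, min(bpm, max_bpm))
    return round((clamped - min_bpm) / (max_bpm - min_bpm) * 20)


def build_vibrate_command(intensity: int) -> str:
    """Serialise a hypothetical Lovense-style vibrate command as JSON.

    The field names here are placeholders; match them to whatever the
    existing WebSocket bridge actually expects.
    """
    if not 0 <= intensity <= 20:
        raise ValueError("Lovense vibration level must be 0-20")
    return json.dumps({"command": "Function", "action": f"Vibrate:{intensity}"})
```

A test at each boundary (60, 120, 180 BPM) plus a few mid-range values would be a natural starting point for the unit-test battery.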
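For the one-command bring-up, a Bash helper along these lines is one possible shape; the image name and port numbers are placeholders to be replaced with the project's real values.

```shell
#!/usr/bin/env bash
# Hypothetical one-command bring-up: build the image, then start the
# back-end, front-end, and WebSocket bridge from a single container.
set -euo pipefail

IMAGE_NAME="vibe-sync"   # placeholder image name

docker build -t "$IMAGE_NAME" .
docker run --rm \
    -p 8000:8000 \       # placeholder: Python back-end / static UI
    -p 8765:8765 \       # placeholder: WebSocket bridge
    "$IMAGE_NAME"
```

If the services end up in separate containers instead, the same one-command goal can be met with a `docker compose up --build`.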