Details
Build Your Own Jarvis: Real-Time Hand Gestures & Voice Integration with Animations
We are hosting a building-first workshop to create a multi-modal AI interface. This isn't a sit-and-watch session; we are going to implement real-time hand tracking, voice command loops, and custom UI animations from scratch. The goal is to walk out with a working Jarvis-style assistant built entirely in Python.
The Workflow:
- Hand Tracking: Using MediaPipe to detect 21 hand landmarks per frame and translate their movements into system actions.
- Voice Loop: Integrating speech recognition to process spoken commands and trigger Python logic.
- Animated HUD: Coding real-time visual overlays and HUD elements that track your gestures.
- Expert Sessions: Industry speakers will discuss building tech products and navigating AI careers.
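To give a flavor of the hand-tracking step: MediaPipe's hand model numbers its 21 landmarks 0-20 (4 is the thumb tip, 8 is the index-finger tip), and a gesture like a "pinch" falls out of a simple distance check. The sketch below uses plain (x, y) tuples in place of a live camera feed, so it is illustrative only; the `is_pinch` helper and its threshold are our own invention, not a MediaPipe API.

```python
import math

# MediaPipe's hand model numbers its 21 landmarks 0-20;
# 4 is the thumb tip and 8 is the index-finger tip.
THUMB_TIP, INDEX_TIP = 4, 8

def is_pinch(landmarks, threshold=0.05):
    """Return True when the thumb and index tips are close enough
    to count as a 'pinch' gesture.

    `landmarks` is a list of 21 (x, y) pairs in normalized image
    coordinates (0.0-1.0), the convention MediaPipe reports in.
    """
    tx, ty = landmarks[THUMB_TIP]
    ix, iy = landmarks[INDEX_TIP]
    return math.hypot(tx - ix, ty - iy) < threshold

# Toy frame: landmarks spread out, then thumb and index tips brought together.
frame = [(i * 0.04, 0.5) for i in range(21)]
frame[INDEX_TIP] = (frame[THUMB_TIP][0] + 0.01, 0.5)
print(is_pinch(frame))  # True
```

In the workshop the landmark list would come from MediaPipe's per-frame detection results instead of a hard-coded toy frame, but the classification logic stays this small.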
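For the voice loop: once a speech-recognition library hands back a transcript string, the remaining work is dispatching it to Python functions. Here is a minimal sketch of that dispatch step; the command phrases and actions (`open browser`, `screenshot`) are made-up examples, and the transcription itself is stubbed out.

```python
# Demo actions -- stand-ins for real system calls.
def open_browser():
    return "opening browser"

def take_screenshot():
    return "screenshot saved"

# Registry mapping spoken phrases to Python callables.
COMMANDS = {
    "open browser": open_browser,
    "screenshot": take_screenshot,
}

def dispatch(transcript):
    """Run the first registered command whose phrase appears in the
    (lowercased) transcript; ignore anything unrecognized."""
    text = transcript.lower()
    for phrase, action in COMMANDS.items():
        if phrase in text:
            return action()
    return None

print(dispatch("Jarvis, open browser please"))  # opening browser
print(dispatch("what's the weather"))           # None
```

Wrapping `dispatch` in a `while True:` loop that feeds it each new transcript is what turns this into the always-listening assistant loop.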
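And for the animated HUD: an overlay that tracks your hand feels "alive" when it glides toward the fingertip instead of snapping to it. A common way to get that is per-frame linear interpolation (exponential ease-out). This class is a renderer-agnostic sketch of our own devising; in the workshop the smoothed position would feed whatever drawing calls render the overlay.

```python
class HudElement:
    """A HUD element that glides toward a target point each frame
    instead of snapping to it, giving the overlay an animated feel."""

    def __init__(self, x=0.0, y=0.0, smoothing=0.3):
        self.x, self.y = x, y
        self.smoothing = smoothing  # 0 = frozen, 1 = instant snap

    def update(self, target_x, target_y):
        # Move a fixed fraction of the remaining distance per frame:
        # classic lerp, which converges exponentially on the target.
        self.x += (target_x - self.x) * self.smoothing
        self.y += (target_y - self.y) * self.smoothing
        return self.x, self.y

ring = HudElement()
for _ in range(20):              # simulate 20 frames chasing a fingertip
    ring.update(100.0, 50.0)
print(round(ring.x), round(ring.y))  # has converged near (100, 50)
```

Calling `update` with a fresh fingertip position every frame makes the element trail the gesture smoothly even when the raw landmark data jitters.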
Prerequisites:
- Python: A basic understanding (loops, functions, and logic) is a must.
- Experience: Don't be scared if you've never built a real project before; that's exactly why we're doing this, to bridge that gap.
- Hardware: A laptop with a working webcam and microphone.
- Mindset: The courage to build, whether you want to follow along with vibe coding or try to figure it out on your own.
📅 Date: Sunday, May 3rd, 2026
🕙 Time: 10:00 AM – 4:00 PM
📍 Venue: [To Be Announced Soon]



