Virtual Mouse

Gesture-Controlled Computing via Computer Vision

AI Gesture Tracking Visualizer

01_The Vision

The keyboard and mouse have defined human-computer interaction for decades. Virtual Mouse seeks to break this boundary by bringing the interface into the physical world.

Using just a webcam and Python, this project transforms your hand gestures into precise cursor movements and click actions. It's a step towards a simplified, touch-free future inspired by sci-fi interfaces like Minority Report.

02_Core Tech

Real-time image processing pipeline optimized for low-latency feedback.

MediaPipe Hands

Google's high-fidelity hand tracking solution detects 21 skeletal landmarks per hand in real time. This lets us track the fingertips (index, thumb) with high precision.
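MediaPipe reports each of the 21 landmarks as coordinates normalized to the frame. A minimal sketch of recovering pixel positions from them (the landmark indices follow MediaPipe's published hand model; the helper name is our own):

```python
# Two fingertip indices from MediaPipe's 21-landmark hand model.
THUMB_TIP = 4
INDEX_TIP = 8

def landmark_to_px(lm, frame_w, frame_h):
    """Convert one normalized (x, y) landmark, as MediaPipe returns it,
    to pixel coordinates in the captured frame."""
    x, y = lm
    return int(x * frame_w), int(y * frame_h)

# e.g. an index fingertip reported at (0.5, 0.25) in a 640x480 frame
# lands at pixel (320, 120).
```

In the full pipeline, the input to this helper comes from `mp.solutions.hands.Hands(...).process(rgb_frame)`, which yields `multi_hand_landmarks` when a hand is visible.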

OpenCV Processing

The industry-standard computer vision library handles video stream capture, frame smoothing, and coordinate mapping. We map the camera's resolution onto the screen's resolution to give full desktop control.

Gesture Logic

The system recognizes specific gestures:

  • Index Up: Moving Mode (Cursor follows finger)
  • Index + Middle Up: Selection / Dragging preparation
  • Pinch (Index + Thumb): Left Click

03_Use Cases

Touch-Free Kiosks

Ideal for public displays where hygiene is a concern. Navigate menus without touching a screen.

Accessibility

An alternative input method for users with limited mobility who may find traditional mice difficult to grasp.