Gesture-Based Touchless Control System Using Computer Vision

12 Feb

Authors: Shweta Barhate, Samiksha Patil, Vaibhavi Ghodke, Aniket Chavan, Harshwardhan Dahatonde

Abstract: Gesture-based interaction is quickly becoming a viable replacement for touchscreens, particularly where hygiene or accessibility is a concern. This study details a touchless control system that uses computer vision to read hand gestures in real time. By merging image processing with a deep-learning tracking model, our approach identifies both static poses and motion-based gestures. The pipeline comprises five stages: capture, preprocessing, feature extraction, classification, and execution. We designed the system to handle varied lighting environments and backgrounds so that it works reliably in everyday scenarios. Performance tests show high accuracy and low latency, allowing smooth interaction without wearables or physical touch. This makes the system useful for applications ranging from healthcare and public kiosks to smart homes. The findings demonstrate that computer vision is an effective tool for building safer, user-friendly touchless interfaces.
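The five-stage pipeline named in the abstract can be sketched in miniature. The stage names follow the paper, but everything else here is an assumption for illustration: the 21-point hand-landmark layout (common to popular hand trackers), the rule-based finger-count classifier, and the action table are hypothetical stand-ins for the authors' deep-learning model.

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Frame:
    """A captured frame reduced to 2D hand-landmark coordinates (x, y);
    the 21-point layout is an assumption, not the paper's format."""
    landmarks: List[Tuple[float, float]]


def preprocess(frame: Frame) -> Frame:
    """Translate landmarks so the wrist (landmark 0) is the origin, making
    the gesture position-invariant; a real system would also compensate
    for lighting and background here."""
    wx, wy = frame.landmarks[0]
    return Frame([(x - wx, y - wy) for x, y in frame.landmarks])


def extract_features(frame: Frame) -> List[bool]:
    """One boolean per finger: is the fingertip above (smaller y than) its
    lower joint? Tip/joint index pairs assume the 21-point convention."""
    pairs = [(8, 6), (12, 10), (16, 14), (20, 18)]  # index .. pinky
    return [frame.landmarks[tip][1] < frame.landmarks[joint][1]
            for tip, joint in pairs]


def classify(features: List[bool]) -> str:
    """Map the extended-finger count to a gesture label (toy rule set,
    standing in for the learned classifier)."""
    count = sum(features)
    return {0: "fist", 1: "point", 2: "peace"}.get(count, "open_palm")


def execute(gesture: str) -> str:
    """Dispatch the recognized gesture to a hypothetical action table."""
    actions = {"fist": "pause", "point": "select",
               "peace": "screenshot", "open_palm": "play"}
    return actions[gesture]


# Capture stage stubbed with synthetic landmarks: all points level except
# a raised index fingertip, i.e. a "point" gesture.
landmarks = [(0.5, 0.5)] * 21
landmarks[8] = (0.5, 0.2)  # index fingertip lifted above its joint
gesture = classify(extract_features(preprocess(Frame(landmarks))))
action = execute(gesture)
```

In a deployed system, the capture stage would read webcam frames (e.g. via OpenCV) and a tracking model would supply the landmarks; the point of the sketch is only how the five stages compose into one pass per frame.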

DOI: http://doi.org/10.5281/zenodo.18619257