Tanuja Joshi|Sep 8, 2023
NMIMS Indore BTech student develops programme for reading hand gestures
Fascinated by the Apple Vision Pro, Sam Varghese developed a hand-gesture recognition machine learning (ML) model that works on Windows.
NEW DELHI: Nineteen-year-old Sam Varghese has developed a hand-gesture recognition machine learning (ML) model "out of curiosity". The third-year BTech student at SVKM's Narsee Monjee Institute of Management Studies (NMIMS) Indore saw tech giant Apple's promotional video for the Apple Vision Pro and wanted to see what he could do with it.
Apple Vision Pro is a "spatial computer". Using a virtual reality (VR) headset, the user can access an operating system that merges with the user's own physical environment, allowing for a 3D experience. In place of keys, the "spatial computer" can 'read' the gestures humans naturally use to communicate – through hands, eyes and voice – and this is what intrigued Varghese.
"After I saw the promotional video, I noticed the Artificial Intelligence (AI) and ML technology Apple has used. The video intrigued me so much that I took up the challenge of developing something similar for the Windows 10 operating system."
According to Varghese, Apple Vision Pro will cost 10 times the amount his Windows laptop costs. “I wanted to develop something for basic Windows users available at a cheaper cost,” he added. “I am not a machine learning expert or an Apple engineer but have always been curious about new AI/ML models,” stated Varghese.
AI/ML models
Varghese's model allows the user to control the computer using hand gestures. "For instance, touching the index finger and thumb together will change the active window on your laptop. Furthermore, by touching the middle finger and thumb together, all open windows will get minimised," explained Varghese. The model goes much beyond just changing tabs and opening new ones. "It also enables the user to lock the computer, shut it down, and undo or redo tasks, along with a wide range of other tasks – all through intuitive hand movements," he added.
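The article does not share Varghese's code, but the gesture-to-action control he describes can be sketched as a simple dispatch table. The gesture names and actions below are illustrative assumptions; a real implementation would trigger OS shortcuts (for example via a library such as pyautogui) instead of returning strings.

```python
# Hypothetical sketch of mapping recognised gestures to window-management
# actions, as described in the article. Gesture names and actions are
# assumptions, not Varghese's actual code.
from typing import Callable, Dict

def switch_window() -> str:
    # A real version might send Alt+Tab to the OS; here we just report it.
    return "switched active window"

def minimise_all() -> str:
    return "minimised all windows"

def lock_computer() -> str:
    return "locked computer"

# Index finger + thumb pinch -> switch window; middle finger + thumb -> minimise
GESTURE_ACTIONS: Dict[str, Callable[[], str]] = {
    "index_thumb_pinch": switch_window,
    "middle_thumb_pinch": minimise_all,
    "fist": lock_computer,
}

def dispatch(gesture: str) -> str:
    """Run the action bound to a recognised gesture, or do nothing."""
    action = GESTURE_ACTIONS.get(gesture)
    return action() if action else "no action"

print(dispatch("index_thumb_pinch"))  # switched active window
print(dispatch("wave"))               # no action
```

Keeping the mapping in a table like this makes it easy to add new gestures without touching the recognition code.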
Varghese's model also does not require any headset; the laptop camera is used to read gestures. "Apple Vision Pro will lead the generation of laptops into a new dawn; users won't need any keyboard or mouse to control their systems. This is the future of augmented and virtual reality," stated Varghese.
Varghese mainly used the Python programming language along with machine learning techniques to develop the model. "I trained a machine learning model with 30,000 data points through Python's open-source computer vision library," he explained.
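A common approach with computer-vision libraries (for instance OpenCV paired with a hand-landmark model such as MediaPipe Hands) is to receive normalised fingertip coordinates per frame and flag a "pinch" when two fingertips come close enough. The coordinates and threshold below are illustrative assumptions, not details from the article.

```python
# Minimal sketch of pinch detection from 2-D fingertip landmarks.
# Hand-tracking models typically output normalised (x, y) positions
# per finger joint; the values here are made up for illustration.
import math

PINCH_THRESHOLD = 0.05  # normalised distance below which fingers "touch"

def is_pinch(tip_a, tip_b, threshold=PINCH_THRESHOLD):
    """Return True when two fingertip landmarks are close enough to count as a pinch."""
    return math.dist(tip_a, tip_b) < threshold

# Example: thumb tip and index fingertip nearly touching
thumb_tip = (0.42, 0.55)
index_tip = (0.44, 0.56)
print(is_pinch(thumb_tip, index_tip))  # True (distance is about 0.022)
```

In practice the per-frame result would feed the kind of gesture-to-action mapping described above, with some smoothing across frames to avoid accidental triggers.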
Currently the programme is only available on laptops, but a mobile version will also be released soon, stated Varghese. He estimates that once it is fully available, the programme would cost around Rs 2,000-3,000, much cheaper than Apple's Vision Pro.
Innovation at IIT, BITS
Varghese had support from his professors. "Our professor Munendra Jain, who teaches quantum physics at NMIMS Indore and previously worked as a researcher at BITS Pilani, always emphasised students' innovations at BITS and the IITs. This motivated me to develop and build innovative products," stated Varghese.
He also added that Aaquil Bunglowala, associate dean at NMIMS Indore, has consistently encouraged students to join all kinds of hackathons to build innovative solutions.
Varghese and his team were also invited to the Indian Institute of Technology (IIT) Kanpur's TechInnovation challenge, where they secured the sixth rank at the all-India level.