It’s arguably one of the most striking scenes from Steven Spielberg’s 2002 sci-fi flick Minority Report: a futuristic Tom Cruise uses his light-tipped, gloved fingers to flick through media files on computer screens before him.
A decade on, that fictional glimpse into the year 2054 is looking more like reality inside a Ryerson University technology design lab, where a team of developers say they’re months away from completing the world’s first comprehensive database of human gestures.
“It’s like the next generation of multi-touch. It’s a complete game-changer,” said Adrian Bulzacki, a 29-year-old electrical and computer engineering PhD student behind the software — a universal user interface that allows people to control technology with body gestures.
With no touching, typing or sound needed, the software revolutionizes the way humans interact with technology, said Bulzacki, who founded Toronto-based tech company ARB Labs Inc. and also developed a semi-transparent, frameless touch screen.
“Gesture recognition is the next frontier, making computing more intuitive and immersive than ever before,” said Ryerson’s Digital Media Zone director Valerie Fox. “(It’s) the future of computing.”
Though Bulzacki dubs his database the first of its kind in the world, the concept of gesture recognition is not new.
In 2010, Minority Report science adviser and inventor John Underkoffler demoed a real-life version of the movie’s gesture recognition system at a TED Conference in California.
That same year, Microsoft launched Xbox Kinect, a novel gaming system that turns the player into the controller. Toronto’s Sunnybrook Health Sciences Centre has since introduced Kinect into operating rooms to allow surgeons to control medical images on a computer screen without having to touch the screen.
But Bulzacki, who began developing a Kinect game called Charades more than a year ago (to be launched by Microsoft later this year), said he realized at the time that no one had compiled a comprehensive, technical database of human gestures.
To create the database, his team of ARB Labs developers, based at Ryerson’s Digital Media Zone, began extracting gesture data from the Charades game and feeding it to an artificial-intelligence server, effectively training the server to recognize a wide variety of full-body human gestures in real time.
“We came up with universal truths about the gestures to make the recognition of those occurrences really easy,” he said.
He gave the example of a punch. The recognition system tracks how the body moves while preparing and carrying through with the punch, from shoulder, arm and fist movements to the way a person’s ankle snaps as they ready the follow-through.
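To make the idea concrete, a gesture recognizer of this general kind can be sketched as matching joint-angle features against stored gesture templates. The sketch below is purely illustrative and is not ARB Labs’ actual system; the joint positions, template angles and gesture names are all invented for the example.

```python
# Illustrative sketch (not ARB Labs' actual system): classify a motion by
# comparing a sequence of elbow angles against stored gesture templates.
import math

def joint_angle(a, b, c):
    """Angle at joint b (in degrees) formed by points a-b-c, e.g. the elbow."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

# Hypothetical templates: expected elbow angles across a gesture's
# preparation and follow-through phases.
TEMPLATES = {
    "punch": [90.0, 135.0, 180.0],   # arm cocked, then fully extended
    "wave":  [160.0, 120.0, 160.0],  # forearm swings back and forth
}

def classify(elbow_angles):
    """Return the template whose angle sequence is closest (L1 distance)."""
    def dist(name):
        return sum(abs(a - t) for a, t in zip(elbow_angles, TEMPLATES[name]))
    return min(TEMPLATES, key=dist)
```

Here `classify([85.0, 140.0, 175.0])` returns `"punch"`, since that angle sequence sits closest to the punch template. A production system would use many joints, many frames and a trained model rather than hand-written templates.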
Bulzacki said the gesture recognition database compiled by ARB Labs can now accurately identify human gestures faster than other existing gesture-based systems — and even faster than the human eye.
With a completed database on the horizon, Bulzacki said he has already licensed the system and plans to market it to a variety of industries.
ARB Labs has already licensed the software in the oil industry, where companies use the technology to interact more easily with oil samples, and in the real estate market, where developers plan to integrate gesture-controlled technology into next-generation living spaces.
He also said the company hopes to sell “aggressive gesture packs” to security companies at airports, casinos and banks: software that could rapidly identify aggressive gestures in public places and “protect people faster,” he said.
“The things we see in movies, futuristic movies, we’re very close to having that in the short term,” Bulzacki said. “In fact, I think we’re going to go past that very quickly.”
This article, by Niamh Scallan, originally appeared in the Toronto Star.