Principal Investigator Andruid Kerne, Entrepreneurial Lead Jon Moeller, and Business Mentor Jamie Rhodes have been awarded an Innovation Corps grant by the National Science Foundation to sustain development of “ZeroTouch: High-Performance Sensing for Multi-Touch and Free-Air Interaction.” I-Corps awards take the most promising research projects in American university laboratories and turn them into start-ups.
ZeroTouch is a virtual in-the-air touchpad computer interface that allows users to literally draw pictures in midair. The technology can also convert any conventional display, such as a big-screen TV, into an affordable touchscreen panel, opening a world of possibilities for precision gaming, design and other experiences that rely on touch technology.
New venture teams are the heart of I-Corps. Kerne is an associate professor in the Texas A&M Department of Computer Science and Engineering and director of the Interface Ecology Lab. Moeller is a graduate student who, with ZeroTouch, starred in a Best Buy future-innovators commercial that vaulted A&M into the national spotlight for leadership in STEM education. Rhodes is the director of new ventures for the A&M System’s Office of Technology Commercialization.
I-Corps awards go to basic research, initially developed through regular NSF grants, that shows immediate potential for broader applicability and impact in the commercial world. The ZeroTouch team is participating in an NSF course on forming a start-up venture. Through focused activities and long hours, team members are learning to practice entrepreneurship, developing a business model canvas for ZeroTouch, and bonding. PI Kerne said, “I am tired of reading about start-ups in Silicon Valley. We are ready to change the world with technology from Aggieland!”
It appears to be an empty frame, but it is filled with futuristic possibilities for the computer user with just the touch of a finger.
Created by researchers at the Interface Ecology Lab (IEL) in the Department of Computer Science and Engineering at Texas A&M University, ZeroTouch is a virtual in-the-air touchpad computer interface that lets users literally draw pictures in midair or turn any conventional display into an affordable touchscreen, opening a world of possibilities for precision gaming, design and other experiences that rely on touch technology.
“ZeroTouch is an input device, a multifinger sensor,” says Jon Moeller, a graduate student and ZeroTouch co-designer. “It is basically a thin plane of interaction built with a linear array of infrared-sensing components. Any object within that plane can be detected and recorded. It can transform any computer monitor or even free space into a multitouch interface.”
Moeller is a member of the IEL under the direction of Associate Professor Andruid Kerne. Under Kerne’s guidance, IEL researchers develop human-centered computing by taking an interface-ecosystems approach.
“The ecosystems approach encounters and develops the interface as a multidimensional border zone between people, technologies, languages, analog, digital, cyber, organic, culture and other representational systems as a basis for engaging in human-centered computing,” Kerne says. “We must incorporate the diverse methodologies that inhabit and characterize the interface border zone in order to discover new natural and meaningful forms of human–computer interaction.”
The development of integrated hardware–software systems for embodied interaction spans diverse fields, including algorithms, electronics, embedded systems, physics, art and cognition. In his research, Kerne draws from his experience developing embedded computer systems for NASA’s Mars Pathfinder, as well as his background in human–computer interaction, music composition, multimedia performance and culture.
Touch-sensitive frames have enabled interactive surfaces for years, but the size and responsiveness tend to be limited. Enter ZeroTouch, with precise sensing within a specific plane of interaction.
“We constructed a frame, and around the periphery we embedded low-cost infrared emitters and sensors to quickly determine intersections within the frame,” Moeller says. “ZeroTouch enables real-time sensing of fingers and hands, even in the presence of strong ambient light. Our technology allows for many interactions to be detected, many more than typical multitouch techniques. Our use of wide-angle optoelectronics allows for excellent touch resolution, even in the corners of the sensor.”
A 27-inch ZeroTouch frame has “smart” edges embedded with 256 infrared sensors and 32 LEDs that each blink about 2,400 times per second, detecting whatever moves around inside it. Fingertips, hands, arms and even inanimate objects pass through an invisible two-dimensional optical web that tracks them.
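The intersection idea behind this optical web can be sketched in a few lines of Python. This is an illustrative reconstruction, not the lab’s actual code: it assumes each blocked emitter-to-sensor beam is already known as a line segment in frame coordinates, and estimates a single touch point from where the blocked beams cross.

```python
# Hypothetical sketch of occlusion-based touch sensing (not the IEL's code).
# Beams run from emitters to sensors around the frame's periphery; a touch
# blocks every beam whose segment it crosses. Intersecting the blocked
# beams approximates the touch position.

def segment_intersection(p1, p2, p3, p4):
    """Return the intersection point of segments p1-p2 and p3-p4, or None."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if denom == 0:
        return None  # parallel beams never cross
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
    u = ((x1 - x3) * (y1 - y2) - (y1 - y3) * (x1 - x2)) / denom
    if 0 <= t <= 1 and 0 <= u <= 1:
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    return None

def estimate_touch(blocked_beams):
    """Average the pairwise intersections of blocked beams (single-touch case)."""
    points = []
    for i in range(len(blocked_beams)):
        for j in range(i + 1, len(blocked_beams)):
            p = segment_intersection(*blocked_beams[i], *blocked_beams[j])
            if p:
                points.append(p)
    if not points:
        return None
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)

# A finger near (5, 5) in a 10-by-10 frame blocks one horizontal
# and one vertical beam:
touch = estimate_touch([((0, 5), (10, 5)), ((5, 0), (5, 10))])  # → (5.0, 5.0)
```

Real multitouch tracking must also disambiguate multiple simultaneous touches, which is where the sensor’s many beam angles and high blink rate come in.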
This ZeroTouch sensor has been integrated not only with 1080p displays but also with higher-resolution 1440p displays, which pack nearly twice as many pixels. This configuration turns a traditional monitor into a low-cost multitouch surface that supports direct interaction with multiple fingers and hands at one time, and is much larger and more responsive than an iPad.
ZeroTouch can also be integrated with high-resolution stylus-based tablet computing displays to enable pen-plus-hand interaction. The researchers’ initial applications address real-time strategy games and art-exhibit curation.
The frame connects to a computer via USB, which provides power and collects the data. Capable of recognizing up to 20 independent touch points at a time, the sensor not only detects that an object has entered the plane but also registers its size.
Look, Ma, No Hands!
The ZeroTouch sensor can be suspended in free air, enabling precise gestural interaction similar to that seen in the movie Minority Report, in which characters viewed computer screens midair and scrolled through content on the monitor with the touch of a finger.
With the addition of the intangibleCanvas application, users can paint on a virtual canvas simply by moving a hand through the ZeroTouch plane. The colors are controlled using an iPhone, and the thickness of the brush is controlled by how much of the body enters the frame: a single finger produces a narrow brush, while an entire arm makes a wider brushstroke.
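The finger-versus-arm behavior suggests a simple mapping from occlusion size to stroke width. The following Python sketch is purely illustrative; the function name, ranges and units are assumptions, not details of intangibleCanvas.

```python
# Hypothetical mapping from occlusion size to brush width (illustrative only).

def brush_width(occlusion_cm, min_width=2, max_width=60, max_occlusion_cm=30):
    """Map the size of the object crossing the sensing plane to stroke width.

    A fingertip (~1 cm) yields a narrow line; a whole forearm (~30 cm,
    the assumed cap) yields the widest brushstroke.
    """
    frac = min(occlusion_cm / max_occlusion_cm, 1.0)
    return min_width + frac * (max_width - min_width)

finger = brush_width(1.0)    # narrow stroke
forearm = brush_width(30.0)  # widest stroke
```

A linear ramp is the simplest choice; a real painting application might use a nonlinear curve or smooth the width over successive frames to avoid jitter.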
IEL collaborator Ziegelbaum-Coelho recently developed a free-air interactive kiosk, which the Cartier brand used to create an innovative interface for browsing promotional videos about its luxury watches at a Fast Company event honoring the 100 most creative people in business.
One big advantage to ZeroTouch, the researchers say, is its affordability. The research prototype was made using commercially available sensors usually found in TV remote controls. Moeller says that the frame, which wasn’t designed for mass production, cost about $450 to construct.
ZeroTouch has many potential applications, such as a training guide for surgeons that can track their fine hand movements, as well as for interactive instructions on how to construct and repair complicated machinery.
Moeller says the technology creates more possibilities for interaction than those interfaces that rely on capacitive sensors, such as touchscreens on smartphones and laptops. The ZeroTouch technology simply requires the user to break the light beams; activating the sensor doesn’t require any force.
“You can use it with gloves on,” Moeller says. “So it can be used in hazardous environments where capacitive interfaces would be unsuitable.”
Next, the team plans to scale the technology up to larger 2-D arrays and into three dimensions, Kerne says. The researchers will experiment with stacking layers of frames to explore the transformative potential for interactive experiences.
“One of the cool things about ZeroTouch is you can stack layers together to achieve depth sensing,” Kerne says.
Education Through Research
ZeroTouch’s creators demonstrated the technology at the 2011 Association for Computing Machinery Conference on Human Factors in Computing Systems (CHI), held in Canada, and generated significant media attention, including coverage by the Today Show, Time, Discovery News, Popular Science, PC World and New Scientist.
Students in Kerne’s graduate seminar in human–computer interaction developed the CHI ZeroTouch exhibit. Kerne says he was excited and gratified that the lab received the accolades, and he was thrilled that students could contribute to the project and be recognized for their hard work.
“The recognition we garnered for ZeroTouch resulted from a kismet combining the inherent qualities of the high-resolution multifinger sensing and the applications we exhibited to seed people’s imaginations,” he says. “These resulted directly from the great work that students performed in class prior to the conference.
“I was extremely glad that the students got the gratification of being recognized by famous researchers at the conference and in the media for their ingenuity, dedication and perseverance. It’s a clear demonstration of how transformative synergy can result from innovative combinations of research and education.”
Editor’s note: This story originally appeared in Texas A&M Engineer magazine.
Media contact: Tony Okonski, Communications Coordinator, Department of Computer Science and Engineering at (979) 458-4412