In an open-layout room filled with natural light, Carnegie Mellon University Human-Computer Interaction Institute PhD students Yang Zhang and Gierad Laput fiddle with wires and flip switches. The tables in front of them are covered in interfaces, sensors and tools, and several nearby glass walls display multi-colored formulas and troubleshooting lists. This, Laput says, is the Future Interfaces Group research lab.
Zhang and Laput demonstrate one of their newest devices, SkinTrack. The technology supplements a smartwatch by letting users treat their forearm as a trackpad.
“Right now smartwatches are really tiny screens, but your fingers are fat, so when you actually scroll on it you’re covering basically 30% of the screen,” says Laput, a third-year PhD student. “We make the smartwatch bigger without actually making it bigger.”
The SkinTrack rig is one of the many prototypes on the long table by the back window of the makerspace. It consists of a ring that sends a high-frequency alternating current (AC) signal, imperceptible to the wearer, through the body, where it is picked up by a wristband fitted with electrodes.
Zhang demonstrates how the technology works, turning the skin on the back of his hand into a trackpad. Not only can the technology tell when the user is actually swiping and touching, but it can also detect hovering. Distinguishing touch from hover gives applications a richer set of input states to work with.
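The idea behind this kind of localization can be illustrated with a small sketch: the ring's AC signal arrives at different wristband electrodes with slightly different phases depending on where the finger touches, and comparing those phases yields a position. Everything below is illustrative, not the actual SkinTrack implementation: the carrier frequency, sample rate, and calibration constant are made-up values, and the "electrode signals" are simulated.

```python
import numpy as np

# Hypothetical sketch of SkinTrack-style finger localization. The ring
# injects a high-frequency AC tone into the skin; each wristband electrode
# receives that tone with a phase that depends on the finger's position.
# Comparing phases across electrodes gives a coordinate along the arm.

FREQ_HZ = 80_000          # assumed carrier frequency (illustrative)
SAMPLE_RATE = 1_000_000   # assumed ADC sample rate (illustrative)

def measured_phase(signal):
    """Estimate the carrier phase of a sampled electrode signal, in radians,
    by projecting onto in-phase and quadrature reference tones."""
    t = np.arange(len(signal)) / SAMPLE_RATE
    ref_cos = np.cos(2 * np.pi * FREQ_HZ * t)
    ref_sin = np.sin(2 * np.pi * FREQ_HZ * t)
    return np.arctan2(signal @ ref_sin, signal @ ref_cos)

def position_from_phases(sig_a, sig_b, scale_mm_per_rad=5.0):
    """Map the phase difference between two electrodes to a 1-D position.
    scale_mm_per_rad stands in for a real calibration step."""
    dphi = measured_phase(sig_a) - measured_phase(sig_b)
    # Wrap the difference to [-pi, pi) so it is unambiguous.
    dphi = (dphi + np.pi) % (2 * np.pi) - np.pi
    return dphi * scale_mm_per_rad

# Simulate two electrodes seeing the same tone with a small phase offset.
t = np.arange(2000) / SAMPLE_RATE
sig_a = np.cos(2 * np.pi * FREQ_HZ * t)
sig_b = np.cos(2 * np.pi * FREQ_HZ * t - 0.4)  # 0.4 rad lag at electrode B
print(round(position_from_phases(sig_a, sig_b), 1))
```

A real system would also have to reject noise and resolve phase ambiguity across 2-D touch positions; this sketch only shows why a phase difference carries position information at all.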
Most of the projects done here, Laput explains, are meant to build on existing interfaces like tablets and smartphones to further improve user experience.
EM-Sense is another smartwatch-based project developed at FIG. Laput attaches another wearable prototype to his wrist and begins touching different objects. On a nearby laptop, the displayed electromagnetic frequencies change depending on where Laput places his hand.
A practical use of this technology would be attaching reminders to particular objects. For example, when your watch senses the unique electromagnetic frequency of your office door, it would send you a list of your upcoming appointments. Zhang and Laput agree it’s all about making the technology work for the people.
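The object-recognition step described above can be sketched as a nearest-neighbor lookup: each object's leaked electromagnetic spectrum is a fingerprint, and the sensed spectrum is matched against stored ones. The signatures, object names, and reminder text below are all invented for illustration; they are not EM-Sense's actual data or method.

```python
import numpy as np

# Hypothetical sketch of EM-Sense-style object recognition. Powered objects
# leak characteristic electromagnetic spectra; a wrist-worn sensor samples
# the spectrum on contact and matches it to a stored fingerprint.

# Made-up fingerprints: each vector stands in for energy in a few EM bands.
SIGNATURES = {
    "office door":  np.array([0.9, 0.1, 0.3, 0.0]),
    "laptop":       np.array([0.2, 0.8, 0.7, 0.1]),
    "refrigerator": np.array([0.1, 0.2, 0.1, 0.9]),
}

# Reminders attached to specific objects, as in the office-door example.
REMINDERS = {"office door": "Upcoming appointments: 2pm design review"}

def identify(spectrum):
    """Return the stored object whose EM fingerprint is closest (L2 distance)."""
    return min(SIGNATURES, key=lambda name: np.linalg.norm(SIGNATURES[name] - spectrum))

def on_touch(spectrum):
    """Fire the reminder for the recognized object, if one is attached."""
    obj = identify(spectrum)
    return REMINDERS.get(obj, f"Touched: {obj}")

print(on_touch(np.array([0.85, 0.15, 0.25, 0.05])))  # close to the door signature
```

A production system would need many more spectral features and a trained classifier rather than raw nearest-neighbor matching, but the lookup structure is the same.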
But not all technology being developed by FIG relies on input-based interface interaction or smartwatch technology.
“It’s also about output or feedback to the user,” says Zhang of the other end of their research spectrum. That’s what informs their project TeslaTouch, which is all about making a touchscreen a more dynamic surface.
“We make a touchscreen feel like real-world materials with tactile feedback,” says Zhang.
They do this by utilizing electrostatic vibrations.
“It’s the same thing that makes balloons stick to our hair after we rub it, but here we change the force periodically so we can actually feel it,” explains Zhang.
This is the scientific basis for a technology that can make a touchscreen replicate the sensation of rubbing your finger along sandpaper.
Both researchers say they could see this technology adapted for online shopping sites and children’s learning apps. Just imagine shopping for curtains on Amazon and being able to feel the texture of the curtains just by touching the screen of your iPad.
FIG publishes its research periodically throughout the year so folks who develop and market technology to the masses are aware of possibilities for their devices.
Both Laput and Zhang are enthusiastic about the prospect of making technology that can improve the lives of the people who utilize it.
“I like the idea of optimizing for humans as opposed to a database,” says Laput.