May 30, 2014

Touch-emitted heat enables interactions

Metaio is trying to make every physical object touchable by recognizing (in the future via HMDs) the heat traces left by human fingers. That way, any object could be linked to digital information or services. Unfortunately, the idea still needs a lot of work.

It would be interesting to see whether the concept works as expected on warm or hot objects, say when I want to play chess on a sun-exposed table in front of a coffee shop.
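
Just to make the idea concrete, here's a minimal sketch of how such a detector could work, assuming the thermal camera delivers a 2D grid of temperatures: cells noticeably warmer than the ambient surface are treated as touch points. All values and names here are my own hypothetical assumptions, not Metaio's actual method.

```python
# Minimal sketch: locate a touch from residual fingertip heat in a
# thermal image (a 2D grid of temperatures in degrees Celsius).
# Hypothetical values; a real system would read an HMD thermal camera.

def find_touch_points(thermal_image, ambient, delta=2.0):
    """Return (row, col) cells at least `delta` degrees above ambient."""
    points = []
    for r, row in enumerate(thermal_image):
        for c, temp in enumerate(row):
            if temp - ambient >= delta:
                points.append((r, c))
    return points

# A 4x4 surface at ~22 C with one warm spot left by a fingertip.
image = [
    [22.0, 22.1, 21.9, 22.0],
    [22.0, 22.2, 22.1, 22.0],
    [22.1, 22.0, 27.5, 22.0],  # fingertip trace at (2, 2)
    [22.0, 21.9, 22.0, 22.0],
]
print(find_touch_points(image, ambient=22.0))  # [(2, 2)]
```

Note how the margin the detector relies on shrinks once the ambient temperature creeps toward skin temperature, which is exactly the sun-exposed-table problem.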

via TechCrunch, see also Metaio.com – Press Release

October 27, 2013

Feel what you see

It would be nice to test this technology, which lets you feel the surfaces of displayed images. I would like to know how realistic it can feel.

via GIZMODO.de


February 17, 2013

You just think you feel what you’re seeing

It is possible to get tactile feedback from a visual stimulus.

via Mohamed Ikbel Boulabiar at Google+

June 12, 2012

Double-sided touch

That looks interesting. It could improve usability, e.g. when reading and therefore scrolling a lot on smartphones.

May 29, 2012

Haptic experience in 3D

For room-sized interaction we may need very big magnets, which seems counterproductive.

May 24, 2012

New touch input options

First, some advances in touch input devices:

July 31, 2011

Tactile walker navigation

Researchers at the Viterbi School of Engineering (University of Southern California) built and evaluated a tactile vest combined with a stereo head-mounted camera. The camera measures the distance to objects in sight, and the vest gives tactile signals to the wearer. Their main use case is helping blind people orient themselves, so a white cane would no longer be needed.

I’m sure there are other use cases as well, such as helping people navigate while they have to look at something else (construction workers, perhaps).

But first they have to smarten this technology up ;) and improve a few things. Navigation for blind people could run into problems especially with small ledges and tripping hazards: how do you ‘visualize’ those with a tactile vest? Furthermore, it would be a big innovation if navigation hints were given in 3D, so the user could easily move forward and/or sideways. And so on.
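
To picture the core idea, here's a minimal sketch of how a distance reading might be mapped to a vibration intensity: the closer the obstacle, the stronger the buzz. The range and intensity scale are my own assumptions, not the USC team's actual parameters.

```python
# Sketch: map an obstacle distance (from the stereo camera) to a
# tactile motor intensity. Ranges and scale are illustrative guesses.

def vibration_level(distance_m, max_range_m=4.0, levels=255):
    """Map a distance in meters to a motor intensity (0 = off, 255 = max)."""
    if distance_m >= max_range_m:
        return 0                          # out of range: no cue
    closeness = 1.0 - distance_m / max_range_m
    return round(closeness * levels)

print(vibration_level(4.0))  # 0   (nothing nearby)
print(vibration_level(2.0))  # 128 (half range: medium buzz)
print(vibration_level(0.0))  # 255 (imminent obstacle: strongest cue)
```

A linear ramp like this is the simplest choice; a real vest might prefer discrete steps, since skin is not great at telling fine intensity levels apart.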

[viterbi.usc.edu: “Guide Vests”]

April 17, 2011

Tangible widgets for touch screens

A team of researchers from RWTH Aachen University and the University of California, San Diego created widgets to extend multi-touch tables with tactile feedback.

In addition, the combination of hardware and software enables new forms of interaction, so it’s a nice new user interface technology. On the other hand, you can, or sometimes must, handle several devices/widgets at once. Where do you place them when you need them, and when you don’t? These devices can probably get lost or dirty quickly. Still, they are a good way to close the gap between software and hardware user interfaces, or rather to fuse both so they are perceived as one consistent interface.

[hci.rwth-aachen.de: SLAP]

April 8, 2011

Digital writing only with a glove

Jeff Rowberg developed a glove that aims to replace the standard PC keyboard.

Here’s a video that explains the functionality: www.kickstarter.com/…/video.html And here’s more material:

Hm, seems cool to me. Combined with accelerometers, gestures could be used for further interaction: not only zooming gestures, but also gaming moves like punching. And all that with one glove. What if you wear two of them, maybe each with different functionality?! Used in combination (i.e. registering the distance between parts of both gloves), there is even more interaction potential.
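
To sketch the punch idea: a glove-mounted accelerometer delivers (x, y, z) samples, and a spike in the acceleration magnitude could be flagged as a punch. The threshold and sample values below are purely illustrative, not anything from the Keyglove itself.

```python
# Sketch: flag a "punch" gesture when the acceleration magnitude from
# a (hypothetical) glove-mounted accelerometer spikes above a threshold.
import math

def detect_punches(samples, threshold=30.0):
    """samples: list of (ax, ay, az) in m/s^2. Return indices of spikes."""
    hits = []
    for i, (ax, ay, az) in enumerate(samples):
        magnitude = math.sqrt(ax * ax + ay * ay + az * az)
        if magnitude > threshold:
            hits.append(i)
    return hits

# Mostly resting (~gravity, 9.8 m/s^2) with one sharp forward jab.
readings = [(0.1, 0.0, 9.8), (0.2, 0.1, 9.8), (35.0, 2.0, 9.8), (0.1, 0.0, 9.8)]
print(detect_punches(readings))  # [2]
```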

And the best part: you can build it yourself because it’s open-source hardware. So we can try to improve the glove or support Jeff.

[kickstarter.com: Keyglove]

[www.keyglove.net: The Keyglove Project]

March 4, 2011

Tactile feedback shows the way

Last week, in my last post, we saw how the brain can control a means of transportation. But how do you know where to drive? Besides the speech output of a GPS device, you have to verify the route by looking at the screen. With the tactile feedback called “Shear Feedback” by researchers at the University of Utah, the driver can keep his eyes on the road.

This option is really nice because it allows many use cases (bike riding directions, guiding blind people, …) and is private: no one else notices the information (a security benefit) or gets distracted by the cues (a privacy benefit).

But I’m asking myself whether this is really applicable to bike riding, where rough ground causes vibrations?! Under these circumstances you can’t keep your fingers in one spot, and the vibrations may interfere with the tactile cueing. Interesting work remains to be done…
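
To picture how a directional cue could be derived at all, here's a tiny sketch that turns the difference between the current and the target heading into a left/right shear direction. The angle convention and the 10-degree dead zone are my own assumptions, not details of the Utah system.

```python
# Sketch: derive a shear direction from the signed heading error
# between where the vehicle points and where the route goes.
# Headings in degrees, 0-360, increasing clockwise (an assumption).

def shear_cue(current_heading_deg, target_heading_deg):
    """Return 'left', 'right', or 'straight' from the heading error."""
    # Wrap the error into [-180, 180) so 350 vs 0 reads as a small turn.
    error = (target_heading_deg - current_heading_deg + 180) % 360 - 180
    if error > 10:
        return "right"   # target lies clockwise: shear the pad right
    if error < -10:
        return "left"
    return "straight"    # within the dead zone: no cue

print(shear_cue(0, 90))    # right
print(shear_cue(0, 300))   # left
print(shear_cue(45, 50))   # straight
```

The dead zone matters for exactly the bike problem above: without it, road vibration and small steering corrections would make the cue flicker constantly.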

[heml.eng.utah.edu: Shear Feedback]