May 30, 2014

Touch-emitted heat enables interactions

Metaio is trying to make all physical objects touchable by recognizing the heat that human fingers leave behind (in the future via HMDs). This would let us link every object to digital information or services. Unfortunately, the idea still needs a lot of work.

It would be interesting to see whether the concept works as expected with warm or hot objects, say when I want to play chess on a sun-exposed table in front of a coffee shop.
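
Just to make the idea concrete for myself: a minimal sketch (Python) of how a finger's residual heat could be turned into touch points, assuming a thermal camera frame is available as a 2D temperature array. The parameter values are my guesses, not Metaio's. The comparison against a surface baseline is exactly what would get shaky on a sun-heated table.

import numpy as np
from scipy import ndimage

def detect_touches(frame, baseline, delta=1.5, min_pixels=20):
    # frame and baseline are 2D temperature arrays (deg C) from a hypothetical
    # thermal camera; baseline is the scene captured before the touch.
    warm = (frame - baseline) > delta          # pixels carrying the finger's residual heat
    labels, n = ndimage.label(warm)            # group warm pixels into connected blobs
    touches = []
    for i in range(1, n + 1):
        blob = labels == i
        if blob.sum() >= min_pixels:           # skip sensor-noise specks
            touches.append(ndimage.center_of_mass(blob))
    return touches                             # list of (row, col) touch centres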

via TechCrunch, see also Metaio.com – Press Release

June 23, 2013

Smart highways

Roads that light themselves at night and also show the driving conditions, a lane that charges electric cars, and so on. This seems to be very interesting research, which is now starting in a Dutch province.

via forschungs-blog.de

May 22, 2013

Mapping virtual to real movement

A person walks through different virtual rooms while in reality, without noticing it, he only moves within a nine by nine metre space. To the person, the virtual space seems larger.
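
As far as I understand it, this works like redirected walking: the virtual scene is rotated very slightly against the user's real heading while he walks, so the physical path curves back into the tracked area while the virtual path feels straight. A minimal sketch (Python; the gain value and speed threshold are my assumptions, not numbers from the TU Wien work):

import math

# Assumed values: inject a few degrees of scene rotation per second while the
# user walks, which stays below what people consciously notice.
CURVATURE_GAIN = math.radians(2.5)        # rad/s of injected scene rotation

def redirected_yaw(virtual_yaw, walking_speed, dt):
    # Injecting a small rotation while the user walks makes the real path bend
    # into a circle, so an apparently straight virtual corridor fits inside the
    # nine-by-nine-metre tracked space.
    if walking_speed > 0.1:               # m/s; only redirect while actually walking
        virtual_yaw += CURVATURE_GAIN * dt
    return virtual_yaw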

via heise.de/iX, see also tuwien.ac.at

June 25, 2012

Augmented vision

It seems TV is going to be extended to stimulate our peripheral vision.

via GIZMODO.de

December 7, 2011

Future mobile interaction

Samsung is showing its version of future mobile interaction with live (video) chat, augmented reality, holograms and live translation for better international communication.

So, how long will it take until it isn't the future any more? Can't wait.

But at some points it doesn't seem really useful, and there are improvements possible. Because the display is transparent, other people can see your content. And do you really want others to see, e.g., your chat partners? Where does the sound come from? How is it charged, or will it never need an external power source? Where is all the internal (hardware) technology that enables this user experience? How do I get my photos and so on onto my private computer at home?

In my opinion this is only a vision that shows the way from today's tablets and smartphones to the next technology step. I think that some years ahead all of us will wear a little thing like a wrist watch. It will contain all the hardware that connects us with the real and the virtual world. It will mainly use holograms for our visual sense, and a camera and projector could be included to enable augmented reality experiences. This little thing could monitor our vital functions permanently and maybe even draw its power from our own body (temperature?). To ensure data privacy we would have to place it near the eyes. Maybe that's why we will look like cyborgs in the future (but with all hands free).

July 31, 2011

Tactile walker navigation

Researchers at the Viterbi School of Engineering (University of Southern California) built and evaluated a tactile vest combined with a stereo head-mounted camera. The camera measures the distance to objects in sight, and the vest gives tactile signals to the person wearing it. Their main use case is to help blind people orient themselves, so they would no longer need a white cane.
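
To sketch the basic idea for myself (Python; the depth-map source, grid layout and distance range are my assumptions, not details from the USC work): divide the stereo depth image into a grid matching the vest's vibration motors and drive each motor harder the closer the nearest obstacle in its part of the view is.

import numpy as np

def depth_to_motor_intensities(depth, rows=3, cols=3, near=0.5, far=4.0):
    # depth: 2D array of distances in metres from the stereo camera.
    # Each grid cell corresponds to one vibration motor on the vest; the closer
    # the nearest obstacle in that cell, the stronger the vibration (0..1).
    h, w = depth.shape
    intensities = np.zeros((rows, cols))
    for r in range(rows):
        for c in range(cols):
            cell = depth[r * h // rows:(r + 1) * h // rows,
                         c * w // cols:(c + 1) * w // cols]
            nearest = np.nanmin(cell)                     # closest obstacle in this cell
            # clamp to [near, far] and invert: near obstacle -> 1, far away -> 0
            intensities[r, c] = 1.0 - (np.clip(nearest, near, far) - near) / (far - near)
    return intensities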

I'm sure there are a few other use cases, such as helping people navigate while they have to look at something else (maybe construction workers).

But first they have to smarten this technology up ;) and improve a few things. There may be problems for blind-people navigation, especially with small ledges / tripping hazards. How do you 'visualize' those with a tactile vest? Furthermore, it would be a big innovation if the navigation hints were in 3D, so the user could easily move forward and/or sideways. And so on.

[viterbi.usc.edu: “Guide Vests”]

March 4, 2011

Tactile feedback shows the way

Last week / in my last post we saw how the brain is able to control a means of transportation. But how does one know where to drive? Apart from the speech output of a GPS device, one has to check the route by looking at the GPS. With the tactile feedback called "Shear Feedback" by researchers at the University of Utah, the driver can keep his eyes on the road.
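
As I understand the idea, the device under the fingertip stretches the skin towards the direction of the next turn. Roughly, such a cue could be derived from the heading error towards the next waypoint, something like this minimal sketch (Python; the scaling and limits are my guesses, not the Utah implementation):

def shear_cue(current_heading, bearing_to_waypoint, max_displacement=1.0):
    # current_heading, bearing_to_waypoint: compass angles in degrees.
    # Returns a displacement in [-max_displacement, max_displacement];
    # negative means stretch the skin to the left, positive to the right.
    error = (bearing_to_waypoint - current_heading + 180) % 360 - 180   # signed error in (-180, 180]
    return max(-max_displacement, min(max_displacement, error / 90 * max_displacement))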

This option is really nice because it allows many use cases (cycling directions, guiding blind people, …) and it is private: no one else notices the information (a security benefit) or gets distracted by the cues (a privacy benefit).

But I'm asking myself whether this is really applicable to cycling, where rough ground causes vibrations. Under those circumstances you can't keep your fingers on one spot, and the vibrations may interfere with the tactile cueing. Interesting work remains to be done…

[heml.eng.utah.edu: Shear Feedback]

October 10, 2010

Navigation: head orientation & LED

The following video shows glasses extended with LEDs (connected to a mobile navigation system) that guide pedestrians with LED lights only.

That's a good idea to free pedestrians from the constraint of permanently looking at the navigation system's display. But I guess these lights are annoying (primarily at night).
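
For my own understanding, the mapping is probably as simple as lighting the LED on the side one should turn towards. A tiny sketch (Python; the three-LED layout and the angle threshold are my assumptions, not from the video):

def led_for_direction(current_heading, bearing_to_waypoint, threshold=20):
    # current_heading, bearing_to_waypoint: compass angles in degrees.
    # threshold: heading error (degrees) below which we just signal 'ahead'.
    error = (bearing_to_waypoint - current_heading + 180) % 360 - 180
    if error < -threshold:
        return "left"
    if error > threshold:
        return "right"
    return "ahead"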

[www.uec.ac.jp: The University of Electro-Communications]

August 28, 2010

Mobile realtime face tracking

Here someone shows how a smartphone can track a user's face in real time:

If this isn't a fake, it is both somewhat pleasing and frightening to me.

The comfort of user interfaces is increasing a lot – e.g. tracking head motions and doing automatic things with that, like turning off the display.

But, e.g., global corporations could log your gestures online while you watch films and use that data to infer your intimate preferences, and much more…
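
The display example above is already doable with off-the-shelf tools: a minimal sketch with OpenCV's standard Haar cascade face detector (Python; the camera index and the "turn the display off" action are placeholders I made up):

import cv2

cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)                     # front camera (index is a placeholder)

def user_is_watching():
    # No face in view -> the caller could dim or turn off the display.
    ok, frame = cap.read()
    if not ok:
        return False
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    return len(faces) > 0

# e.g. poll user_is_watching() once a second and switch the screen accordingly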

June 19, 2010

Provide audio output by environmental interaction

This is an old concept, but it seems to have (funny) potential for future interaction: