The firm Elliptic Labs has developed a prototype for touchless user interfaces that uses ultrasound and is easier to implement and cheaper than today's camera-based technology. I like this “hand gesture – in / out (play / pause)”.
This could be a cool technology. Imagine sitting on the sofa watching a film and just using the “out / pause” gesture to pause it. It would be something like “Minority Report” (again).
But what about animals (especially pets)? Dogs, for example, can hear these ultrasonic frequencies, and I suspect it could hurt them. Or will it attract bats?
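As a rough back-of-the-envelope check: the exact carrier frequency of the prototype isn't public, so the ~40 kHz below is an assumption (a common frequency for ultrasonic sensing), and the hearing limits are approximate textbook values, not measurements of this product.

```python
# Compare an assumed ~40 kHz ultrasonic carrier against approximate
# upper hearing limits of different species (rough textbook values).
ULTRASOUND_HZ = 40_000  # assumed carrier frequency, not confirmed

upper_hearing_limit_hz = {
    "human": 20_000,
    "dog": 45_000,
    "cat": 64_000,
    "bat": 200_000,
}

def can_hear(species):
    """True if the species' approximate upper hearing limit
    reaches the assumed ultrasonic carrier frequency."""
    return upper_hearing_limit_hz[species] >= ULTRASOUND_HZ

for species in upper_hearing_limit_hz:
    print(species, can_hear(species))
```

Under these assumptions, humans would not hear the carrier, but dogs, cats, and bats plausibly would, which is exactly why the pet question matters.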
As I wrote some time ago, human-machine interaction is moving toward the interesting trends we saw in futuristic films years ago. Here is another example of a human-machine interface that is particularly reminiscent of “Minority Report”: a man gesticulates with both hands free in space, and on a screen a 3D scene performs the corresponding actions.
That’s impressive!
—
But since I see some disadvantages with this technology, it isn't quite as impressive to me anymore. The disadvantages, or let's say challenges, are:
You have to hold your arms up in free space continuously, which is certainly exhausting.
What if there are more than two hands / more than one user? Can the algorithms match the hands to the right person, or will there be falsely recognized (inter)actions?
In my opinion, the gestures were adapted to the technology, not to real user needs and perceptions.
The actions are not accurate enough yet.
In the case of deleting files, I think some other gesture would be much better than moving the files to the edges, because that action suggests to me that there is a backup of the files and that I'm able to restore them.
And beyond these details, a main goal for future work has to be removing the need for the user to visually recheck what he is doing with his hands (the red cross and borders in the video, and so on). The user wants to, say, move a picture, commands that with his hands, and that's it. The extra visualization breaks the immersion, or at least mine.
But the technology (free-hand gestures and 3D scenes) is a really great step into the future!
What a cute demonstration of a nice technology – acting as if you had a computer mouse, without the mouse:
This could give us the opportunity to interact with a laptop almost anywhere (in bed, at the kitchen table, etc.) in the traditional style, without carrying around a (hardware) mouse.
You can feel the ‘click’, so you have tactile feedback when triggering an interaction. But what if you move your hand out of the camera's sight? Will there be false inputs, or none at all? And how do you inform the user of the correct hand position?
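One simple way to handle the out-of-sight problem would be to warn the user as soon as the tracked hand drifts toward the frame border, before tracking is lost entirely. A minimal sketch of that idea (the coordinate convention and the margin value are made-up example choices; the real system's API is unknown):

```python
def hand_position_status(x, y, frame_w, frame_h, margin=0.1):
    """Classify a tracked hand position within the camera frame.

    x, y are pixel coordinates; margin is the fraction of the frame
    near each border that triggers a warning.
    Returns 'lost', 'near_edge', or 'ok'.
    """
    if not (0 <= x <= frame_w and 0 <= y <= frame_h):
        return "lost"       # hand left the camera's field of view
    if (x < frame_w * margin or x > frame_w * (1 - margin)
            or y < frame_h * margin or y > frame_h * (1 - margin)):
        return "near_edge"  # show a hint before tracking is lost
    return "ok"

# Example positions in a 640x480 frame
print(hand_position_status(320, 240, 640, 480))  # centered
print(hand_position_status(10, 240, 640, 480))   # close to the left edge
print(hand_position_status(-5, 240, 640, 480))   # outside the frame
```

The 'near_edge' state could drive an on-screen indicator, so the user gets a gentle nudge instead of silent input loss.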
Besides the hardware and software products in the video, have a look at the gyroscope functionality in today's smartphones (especially its precision). It's amazing! Really cool things can be done with this feature, I think.
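The basic idea behind most gyroscope features is simple: the sensor reports angular velocity, and integrating those samples over time gives an estimate of how far the device has rotated. A minimal sketch of that integration (the sample values and rate are made-up; real devices additionally have to compensate for sensor drift):

```python
def integrate_gyro(samples, dt):
    """Estimate total rotation (in degrees) around one axis by
    integrating angular-velocity samples (deg/s) taken at a fixed
    sampling interval dt (seconds)."""
    angle = 0.0
    for omega in samples:
        angle += omega * dt
    return angle

# Example: a steady 90 deg/s rotation sampled at 100 Hz for 1 second
samples = [90.0] * 100
print(integrate_gyro(samples, 0.01))  # ~90 degrees of rotation
```

This is why precision matters so much: any small bias in the angular-velocity readings is integrated too, and the estimated orientation slowly drifts away from reality.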