January 28, 2011

Digitized turntable

Greg Kaufman tried something new for DJs so that they no longer have to carry their mixer console and vinyl records to the clubs and set up the console there in the dark. As Greg says in his video, we are now in the digital age, and that’s why he was able to digitize the console. So if all clubs had such turntables, DJs would only have to carry a USB drive with their music on it.

That’s a great idea for using so-called natural user interfaces, especially because the consoles are easily customizable. But that’s also the critical point of this device: different DJs have different visions of the perfect multitouch turntable, I think. One wants to modify the display, another wants to change the controls. And these controls seem a bit tricky to me. You have to use one finger for one function, two fingers for another, three fingers for a third, four for yet another. And that with both hands, and you have to hit roughly the right spots on the table, all without haptic feedback. Looks great, but needs closer examination.
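Just to illustrate what such a finger-count mapping could look like in code (a toy sketch of my own; the actual functions and mapping on the real turntable are surely different):

```python
# Hypothetical sketch: dispatching multitouch turntable controls by how many
# fingers touch the surface. Function names here are invented, not the real ones.

def make_dispatcher(bindings):
    """Return a handler that picks an action based on the number of touch points."""
    def handle(touch_points):
        action = bindings.get(len(touch_points))
        if action is None:
            return "ignored"  # unmapped finger count
        return action(touch_points)
    return handle

def scratch(points):      # one finger: e.g. scratch the virtual record
    return "scratch"

def pitch_bend(points):   # two fingers: e.g. adjust the pitch
    return "pitch"

dispatcher = make_dispatcher({1: scratch, 2: pitch_bend})
print(dispatcher([(10, 20)]))             # one touch point -> "scratch"
print(dispatcher([(10, 20), (30, 40)]))   # two touch points -> "pitch"
```

The tricky part the post complains about shows up immediately: without haptic feedback, nothing tells you whether all your fingers actually landed, so a three-finger gesture can silently become a two-finger one.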

[kcai.edu: Kansas City Art Institute]

January 21, 2011

Human-car interface

Ford (in cooperation with Microsoft) provides an interesting human-machine interface for their cars called Ford SYNC. The driver controls this interface mostly by voice (apparently except for the volume level) and can thereby call people from the connected smartphone’s address book, step through radio news, be guided to an unknown business location, and so on.

Ford doesn’t offer much visual output of navigation data, nor a tactile interface. I think that’s because they don’t want to distract the driver that way. But as a result, a lot of the communication has to happen auditorily, and the driver has to remember many things (menu options, etc.). Thinking about these options is mentally demanding, which, from my point of view, also distracts the driver from the traffic. The same applies when the driver has the ‘car’ read out the options, because he has to concentrate on that.
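The memory load is easy to make concrete. Here is a small sketch with an invented menu tree (not Ford SYNC’s real command set), counting how many spoken options a driver would have to hold in working memory when there is no screen to fall back on:

```python
# Hypothetical auditory menu tree; the entries are made up for illustration.
MENU = {
    "phone": {"call": None, "redial": None},
    "radio": {"news": None, "presets": None},
    "navigation": {"destination": None, "cancel route": None},
}

def options_to_remember(menu):
    """Count every spoken option across all levels of the menu tree."""
    total = len(menu)
    for sub in menu.values():
        if isinstance(sub, dict):
            total += options_to_remember(sub)
    return total

print(options_to_remember(MENU))  # 9 options across just two levels
```

Even this toy menu already has nine options over two levels; a real system has far more, and the driver hears them one at a time instead of scanning them at a glance.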

I don’t like this interface very much. In my view, the perfect interface is multimodal and therefore balanced with more visual and also tactile elements.

[ford.com: Ford SYNC]

PS: Why are most auditory machine voices female?

November 15, 2010

Touchless UI with ultrasound

The firm Elliptic Labs has developed a prototype for touchless user interfaces that utilizes ultrasound and is easier to implement and cheaper than today’s camera-based technology. I like this “hand gesture – in / out (play / pause)”.

This could be a cool technology. Imagine sitting on the sofa watching a film and just making this “out / pause” gesture to pause it. It would be something like “Minority Report” (again).
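The basic principle is simple enough to sketch. This is my own rough version, not Elliptic Labs’ algorithm: estimate the hand’s distance from the ultrasonic echo delay, then map a hand moving away (“out”) to pause and a hand approaching (“in”) to play:

```python
# Toy time-of-flight sketch; thresholds and gesture names are my assumptions.
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def echo_to_distance(round_trip_s):
    """Distance in metres from the ultrasonic echo's round-trip time."""
    return SPEED_OF_SOUND * round_trip_s / 2

def classify_gesture(distances, threshold=0.05):
    """'in' if the hand approached, 'out' if it moved away, else 'none'."""
    delta = distances[-1] - distances[0]
    if delta > threshold:
        return "out"   # hand moved away -> pause
    if delta < -threshold:
        return "in"    # hand approached -> play
    return "none"

samples = [echo_to_distance(t) for t in (0.0023, 0.0020, 0.0012)]
print(classify_gesture(samples))  # hand came closer -> "in"
```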

But what about animals (especially pets)? Dogs, for example, can hear these (ultra)sounds, and I think it will hurt them! Or will it attract bats to come around?

[ellipticlabs.com: Elliptic Labs]

November 8, 2010

3D dynamic hologram

Nearly three years ago the University of Arizona (UA) announced a system that was able to print a three-dimensional hologram in a couple of minutes. This 3D picture can be seen without special glasses.

Today the UA College of Optical Sciences is one step further. 3D, no glasses, one color – all still the same, but now they can produce one picture in about two seconds!

That’s really nice. In the future, project conferences could be held virtually in real time, so that, for example, exhaust fumes can be reduced. OK, one has to weigh that against the CO2 emissions needed to produce the 3D hologram systems in the first place. But in my opinion that’s the right way, and many applications are waiting for this technology.

[optics.arizona.edu: UA College of Optical Sciences]

October 10, 2010

Navigation: head orientation & LED

The following video shows glasses extended with LEDs (connected to a mobile navigation system) that navigate pedestrians by LED lights alone.

It’s a good idea to free pedestrians from having to look at the navigation system’s display all the time. But I guess these lights are annoying, primarily at night.
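The mapping itself is trivial, which is part of the appeal. Here is a toy sketch (LED positions, brightness values, and the night-mode idea are my own assumptions, not taken from the research prototype):

```python
# Hypothetical sketch: turning a navigation instruction into LED signals.
def instruction_to_leds(turn, night_mode=False):
    """Map a turn instruction to (left_led, right_led) brightness, 0-255."""
    brightness = 40 if night_mode else 255  # dimmed at night to avoid glare
    if turn == "left":
        return (brightness, 0)
    if turn == "right":
        return (0, brightness)
    return (0, 0)  # go straight: no light

print(instruction_to_leds("left"))                    # (255, 0)
print(instruction_to_leds("right", night_mode=True))  # (0, 40)
```

A night mode like the one sketched here would be one obvious way to address the glare complaint above.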

[www.uec.ac.jp: The University of Electro-Communications]

October 3, 2010

Transparent displays

More and larger transparent displays are coming, like these:

That’s an interesting technology. It enables us to transfer more (visual) information, and in different ways than today. For instance, we can augment reality and thereby enhance team communication (e.g. in offices: visualize important information about the colleague sitting in front of you). Or many people around a display can see (and in the future: interact with!?) the same data. Maybe soon one will interact with the data from the back of the display and see one’s fingers interacting without covering the visualization with one’s hand. And maybe … or …

September 26, 2010

Accessibility of programming software for blind people

It’s great to see disabled persons being integrated into normal life. The video shows how a human-machine interface for blind people can be built so that they are able to program software. (The video also shows how they are taught to do so.)

[www.siue.edu – A.M. Stefik: Dissertation (2008, PDF)]

September 19, 2010

Automatically catch thoughts

This video shows today’s circumstances of using a BCI (brain–computer interface). You need a lot of equipment and time to compose simple text messages.
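A common laboratory technique for this kind of text entry (possibly what’s shown in the video, though I can’t be sure) is the P300 speller: letters sit in a grid, rows and columns flash, and the flashes containing the intended letter evoke a measurably stronger brain response. The selection logic, stripped of all the real EEG processing and with fake response scores, is just this:

```python
# Toy P300-speller selection: pick the letter at the row/column whose
# averaged (simulated) brain responses are strongest.
GRID = ["ABC", "DEF", "GHI"]  # toy 3x3 letter grid

def pick_letter(row_scores, col_scores):
    """Choose the letter at the strongest-responding row and column."""
    row = max(range(len(row_scores)), key=lambda i: row_scores[i])
    col = max(range(len(col_scores)), key=lambda j: col_scores[j])
    return GRID[row][col]

# Simulated averaged responses: row 1 and column 2 stand out -> letter "F".
print(pick_letter([0.1, 0.9, 0.2], [0.2, 0.1, 0.8]))  # "F"
```

The slowness the post mentions comes from everything this sketch leaves out: each row and column has to flash many times so the tiny brain response can be averaged out of the noise, letter by letter.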

But it’s impressive, and I’m sure something like this is the human-machine interface of the future! Maybe I’ve seen too many films like “Matrix” and so on, but seriously: the brain is the one body part we all have! So, for example, deaf people will in fact be able to speak. Maybe we can make an advance in communicating with sick people – think of paraplegic people or people in a coma. Perhaps we will even be able to communicate in some way with animals!

[engr.wisc.edu…: Department of Biomedical Engineering]

September 12, 2010

Prostheses: realistic and better

The US Defense Advanced Research Projects Agency (DARPA) promotes research efforts to make prostheses nearly human-like or, in some ways, even better. They try to connect them to the human cerebral cortex to deliver information from the brain to the prosthesis, and back again! So people can control prostheses by thought alone and will get sense impressions like the texture of a touched surface. The video (especially the first half) was impressive to me:

Unfortunately there can be negative uses, but let’s think about all the good things that can be done with this technology! A lot of people could live a better life just because they would be able to do normal things.

[exhibitions.cooperhewitt.org: Modular Prosthetic-limb System]

September 5, 2010

Human body as interaction medium

Research efforts look at the human body as a medium for tactile input “technology” (and therefore they also need it as a visual output surface).

That’s a really nice approach, so that one doesn’t have to hold, e.g., an MP3 player or a smartphone. But you have to have some bare skin available, and users wouldn’t accept that in winter.
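The idea behind it, as I understand it, is that taps on different skin locations produce different bio-acoustic signatures, and a classifier maps a measured signature to a “button”. A heavily simplified sketch (this is not Harrison’s actual pipeline; the feature values are fabricated toy numbers):

```python
# Nearest-centroid tap classification in the spirit of Skinput (simplified).
import math

CENTROIDS = {            # invented average feature vectors per tap location
    "palm":    (0.8, 0.1),
    "forearm": (0.2, 0.7),
}

def classify_tap(features):
    """Assign a tap's feature vector to the closest known location."""
    return min(CENTROIDS, key=lambda loc: math.dist(features, CENTROIDS[loc]))

print(classify_tap((0.75, 0.2)))  # closest to the "palm" signature
```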

In the video you can see Chris Harrison dialing on his hand – and then? Does he hold his hand to his ear, and does he have a built-in body microphone? ;D

[chrisharrison.net: Skinput]