New Computer Interfaces
Desktop computers still use WIMP interfaces, but mobile phones have shown that other concepts are possible. These new interfaces will change the way computers are used. The revolution has already started.
Windows, macOS and GNOME are WIMP interfaces: Windows, Icons, Menus, Pointers (mouse or arrow keys).
Other concepts are set against it:
- NUI: Natural User Interface.
The interface relies on natural gestures: objects are addressed and moved with the hands.
- OCGM: Objects, Containers, Gestures, Manipulations.
In this new approach, containers are no longer windows but objects that can contain other objects. User actions are recognized by various computer peripherals, and manipulations are all the effects the user produces on objects.
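The OCGM idea can be illustrated with a toy model: an object may contain other objects (replacing windows), and a manipulation is any effect a gesture applies to an object, propagated to its contents. The class and names below are purely illustrative, not part of any OCGM specification.

```python
# Minimal, hypothetical sketch of the OCGM idea: containers are objects
# that may hold other objects, and a manipulation is any effect the user
# applies to an object (names are illustrative, not a real API).

class Obj:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []   # an object can contain other objects

    def manipulate(self, effect):
        """Apply a user gesture's effect to this object and its contents."""
        effect(self)
        for child in self.children:
            child.manipulate(effect)

# A photo album is a container of photos -- no window involved.
album = Obj("album", [Obj("photo1"), Obj("photo2")])

moved = []
album.manipulate(lambda o: moved.append(o.name))  # e.g. a drag gesture
print(moved)   # the container and every contained object receive the manipulation
```

The point of the sketch is that the container hierarchy, not a window manager, decides what a gesture affects.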
The OCGM concept is expected to be the basis of Windows 8 and promises a radical change. In the meantime, various techniques that are already usable are emerging.
TrueMotion turns a PC into a Wii, and even goes further...
With the Wii, whose commercial success has been immense, a first step was taken into virtual reality. To bring a similar technology to PCs, the company Sixense has developed a comparable, and even more elaborate, controller.
Nintendo's Wiimote knows in which direction your hand is pointing.
The TrueMotion controller, thanks to a base station that produces an electromagnetic field, is much more accurate: it knows not only what you are pointing at, but also at what angle!
According to Sixense, TrueMotion should work with current PC games, and even better with games designed to take advantage of all its features...
A prototype was presented at CES (Consumer Electronics Show) and will lead to a commercial product whose price should be around 100 euros, with a game included.
Mgestyk is a futuristic alternative to the mouse that is already functional: control a computer with gestures!
The site naturally recalls the movie Minority Report, in which Tom Cruise operates a screen by moving his arms...
To achieve this, you need a camera and the Mgestyk software, which connects to it and interprets your movements.
There are many applications, as demonstrated by the series of videos on the site: simulated driving or flying, selecting photos in a gallery, web browsing, Google Earth, and all sorts of games.
This system reminds us of the Nintendo Wiimote or the Logitech 3D mouse, but with the advantage that it works empty-handed... See the video:
More about Mgestyk
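Mgestyk's internals are not public, but the simplest camera-based gesture techniques start from frame differencing: compare successive frames and locate where pixels changed. A toy, stdlib-only sketch of that starting point, with plain 2D lists standing in for grayscale frames:

```python
# Toy frame-differencing sketch: locate motion between two grayscale
# frames by averaging the positions of pixels that changed. Real
# gesture recognizers are far more sophisticated than this.

def motion_center(prev, curr, threshold=10):
    """Return the (row, col) average position of changed pixels, or None."""
    changed = [(r, c)
               for r, row in enumerate(curr)
               for c, v in enumerate(row)
               if abs(v - prev[r][c]) > threshold]
    if not changed:
        return None
    n = len(changed)
    return (sum(r for r, _ in changed) / n, sum(c for _, c in changed) / n)

# Two 3x4 "frames": a bright blob moves from the left to the right side.
frame1 = [[0,   0, 0, 0],
          [200, 0, 0, 0],
          [0,   0, 0, 0]]
frame2 = [[0, 0, 0, 0],
          [0, 0, 0, 200],
          [0, 0, 0, 0]]

print(motion_center(frame1, frame2))  # midway: old and new blob both count as change
```

Tracking that center across many frames gives a crude pointer; recognizing an actual gesture means classifying the trajectory over time.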
Blobo replaces a pad or a mouse with a new kind of video game interface consisting of a single ball and a Bluetooth connection. It works with a PC or a mobile phone.
The Finnish company Ball-it invented this new form of interaction, similar to the Wiimote but simpler.
With it you can play ball games, run, or aim, ball in hand… Sensors capture your movements and transmit them to the characters in the virtual world of the game.
The games have a cartoonish style, as on Nintendo's Wii.
Ball-it sells a package that includes a Blobo and six games on a CD-ROM for PC. The whole costs 55 euros ($80).
- Bloboshop. The Blobo website. The pack includes the ball and six games.
- Get the SDK. Develop for this interface.
Initially designed for the Xbox 360, Microsoft's Kinect comes closer to virtual reality than the Nintendo Wii does: the whole body interacts with the console.
No more mouse or joystick: just stand in front of the device to merge with the character on the screen, which reproduces all your movements...
More in the article on Kinect.
Virtual multi-touch interface
In the wake of Kinect, a projector makes it possible to turn any surface into a touchpad.
This technology was developed by Microsoft with Carnegie Mellon University.
What applications? A keyboard can be brought up on a desk or a wall, or icons on an object, giving it a virtual interface. There is no limit to what can be invented with this technology.
This can be combined with virtual reality: in addition to giving objects in the environment a new look through projected images, they can be made interactive by assigning functions to the things represented virtually!
Holodesk and NUI
A Natural User Interface (NUI) allows objects to be manipulated directly in a holographic 3D space.
For example, you can reach into the space with its virtual office accessories, take a sheet of paper and move it across the desktop.
Microsoft has developed a system that implements this principle: Holodesk.
The name Text 2.0 was given to an interface technology for e-book readers that follows the reader's eye and adjusts the display to highlight what is being looked at, and only that.
- Plugin for e-book. PEEP (Processing Easy Eye tracker plugin). For Windows and Java.
MIT's idea is a glove whose parts are colored so that a camera can identify hand movements more easily. These are then reproduced exactly on screen by the gesture recognition software. The video below is inconclusive...
It is an interactive table or whiteboard, with a computer and cameras that interpret the user's gestures and display content accordingly.
Sony's version, attracTable, goes even further. It can recognize the user's age, sex and facial expressions, such as joy or disappointment, and react to these parameters.
The table is equipped with two cameras connected to the computer, which controls a cursor or an avatar. Objects may be placed on it; if one of them is a mobile phone, the table can serve as its giant screen.
- Video demonstration. It should be marketed in June 2010.
The interface developed by Toshiba is another gesture recognition UI, consisting of a webcam and software. It uses little processor time, around 3%.
The principle is to superimpose a semi-transparent image of the user on the screen, which helps him interact with the content.
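This kind of semi-transparent overlay amounts to alpha blending the camera image over the screen content. A minimal per-pixel sketch, with grayscale values 0-255; the 50% opacity is an illustrative assumption, not Toshiba's actual setting:

```python
# Minimal alpha-blending sketch: superimpose a semi-transparent user
# image over the screen content, pixel by pixel (grayscale 0-255).
# The 0.5 opacity is an illustrative choice, not Toshiba's value.

def blend(screen, user, alpha=0.5):
    """out = (1 - alpha) * screen + alpha * user, per pixel."""
    return [[round((1 - alpha) * s + alpha * u)
             for s, u in zip(srow, urow)]
            for srow, urow in zip(screen, user)]

screen = [[255, 255], [0, 0]]    # white top half, black bottom half
user   = [[0, 0], [255, 255]]    # the camera image of the user
print(blend(screen, user))       # every pixel lands at mid-gray at 50% opacity
```

With a low alpha the user's silhouette stays faint enough not to hide the content while still showing where their hands are relative to it.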
Mimesign, touchless gestures
Driving a device by gestures without touching the screen is a technology proposed by Elliptic Labs. The Norwegian company, which specializes in ultrasound, uses a set of pre-programmed actions to control the device.
Microsoft Research Cambridge gave researchers the task of writing a report on what the human-computer interface of the future could be, given technological innovations that are already appearing.
The report first describes an evolution of computers themselves: becoming more intelligent, capable of learning, able to anticipate in part what is expected of them and to react without waiting for commands.
The interface will change from the GUI (Graphical User Interface), such as Windows, KDE or GNOME, to something that responds to speech, gestures and eye movements; interaction by thought, something that is technically possible today, is even considered.
In addition, it will bring GPS, cameras and radio waves (RFID, low-cost Radio Frequency Identification) into the interaction, creating an interconnection between people, computers, and the personal or work environment.
All this is planned for 2020.
- Manual Dexterity. By combining pen and gesture recognition, we get a unique interface even closer to the real world.