The Future is Here: The Thumbles Robot Touch Screen

Smartphones and tablets, with their high-resolution touchscreens and ever-increasing number of apps, are all very impressive. And though some apps can even jump off the screen in 3D, the vast majority are still confined to two dimensions and limited in terms of interaction. More and more, interface designers are attempting to break this fourth wall and make information something you can really feel and move with your own two hands.

Take the Thumbles, an interactive screen created by James Patten of Patten Studio. Rather than a conventional 2D touchscreen that responds to the electrical properties of your fingertips, this desktop interface combines touch screens with tiny robots that act as physical controls. Whenever a new button would normally pop up on the screen, a robot drives up instead, parking precisely so the user can grab it, turn it, or rearrange it. And the idea is surprisingly versatile.

As the video below demonstrates, the robots serve all sorts of functions. In various applications, they appear as grabbable hooks at the ends of molecules, twistable knobs in a sound and video editor, trackable police cars on traffic maps, and swappable spaceships in a video game. If you move or twist one robot, another robot can mirror the movement perfectly. And thanks to their omnidirectional wheels, the robots can drive in any direction without turning first.
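Patten Studio hasn't published how the Thumbles' drive actually works, but holonomic motion like this is commonly achieved with three omnidirectional wheels mounted at fixed angles: the controller simply projects the desired body velocity onto each wheel's rolling direction. A minimal sketch of that idea (the wheel angles and base radius below are illustrative assumptions, not the robots' real specs):

```python
import math

# Generic three-omniwheel holonomic drive: convert a desired body velocity
# (vx, vy) and spin rate (omega) into individual wheel speeds.
WHEEL_ANGLES = [math.radians(a) for a in (90, 210, 330)]  # assumed mount angles
R = 0.02  # assumed distance (meters) from robot center to each wheel

def wheel_speeds(vx, vy, omega):
    """Project the body velocity onto each wheel's rolling direction."""
    return [
        -math.sin(a) * vx + math.cos(a) * vy + R * omega
        for a in WHEEL_ANGLES
    ]

# Pure sideways motion: the robot slides without rotating first.
print(wheel_speeds(0.1, 0.0, 0.0))
```

Because each wheel can also roll freely sideways, any combination of translation and rotation is reachable at once, which is why the robots never have to turn before moving.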

Naturally, there are questions about the practicality of this technology where size is concerned. It makes sense in settings where space isn't at a premium, but it doesn't exactly work for a smartphone or tablet touchscreen; the means simply don't exist yet to build robots small enough to wander around such a tiny screen and act as interfaces. But in police stations, architecture firms, industrial design studios, and military command centers, the Thumbles and systems like it are sure to be all the rage.

Consider another example shown in the video, where a dispatcher picks up a police car and moves it to a new location to dispatch it. Whereas a dispatcher currently has to listen for news of a disturbance, check the list of available vehicles, see who is close to the scene, and then call that officer to respond, this tactile interface streamlines such tasks into quick movements and manipulations.

The same holds true for architects who want to move design features around on a CAD model; corporate officers who need to visualize their business model; landscapers who want to see what a stretch of earth will look like once they've raised a section of land, changed the drainage, or planted trees and bushes; and military planners who need to direct different units on a battlefield (or during a natural disaster) in real time, responding to changing circumstances more quickly and effectively, and with far less confusion.

Be sure to check out the demo video below, showing the Thumbles in action. And for more, visit Patten Studio's website.


Sources: fastcodesign.com, pattenstudio.com

The Future is Here: The VR Cave!

It's called the CAVE2, a next-generation virtual reality platform that is currently the most advanced visualization environment on Earth. Whereas other VR platforms are either two-dimensional or limited in their interactive capability, the CAVE2 is about the closest thing there is to a real-life holodeck. This is accomplished through a series of panoramic, floor-to-ceiling LCD displays and an optical tracking interface capable of rendering remarkably realistic 3D environments.

Developed by the Electronic Visualization Laboratory (EVL) at the University of Illinois at Chicago, CAVE2 is a direct follow-up to the VR platform the university created back in 1992. Like the original, the name stands for "Cave Automatic Virtual Environment"; but whereas its predecessor was set in a cube-shaped room, the new environment is a cylindrical, 320-degree immersive space. In addition, the screens, sound, and resolution have all been vastly upgraded.

For example, the 7.5 by 2.5 meter space (24 feet x 8 feet) is covered floor-to-ceiling with 72 3D LCD screens, which together output a 37-megapixel stereoscopic image (roughly 7,360 x 4,912 pixels; in 2D mode the pixel count doubles to about 74 megapixels). This allows for a pixel density on par with the human eye's own angular resolution at 20/20 vision. Headgear is needed to get the full 3D effect, and the entire apparatus is controlled by a hand-held wand.
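As a back-of-the-envelope check on those figures, the quoted resolution multiplies out to roughly the 37-megapixel number, and the acuity comparison follows from the common rule of thumb that 20/20 vision resolves about one arcminute. The snippet below works both out, assuming for simplicity a viewer standing at the center of the cylinder with pixels spread evenly along the 320-degree arc (the real panel geometry is more complicated):

```python
# Rough arithmetic behind the quoted CAVE2 display specs.
# Assumptions for illustration only: viewer at the center of the cylinder,
# 20/20 acuity taken as 1 arcminute per resolvable point.

width_px, height_px = 7360, 4912           # quoted combined 3D resolution
megapixels = width_px * height_px / 1e6
print(f"{megapixels:.1f} megapixels")      # close to the quoted ~37 MP figure

arc_arcmin = 320 * 60                      # the 320-degree arc in arcminutes
print(f"{arc_arcmin} pixel columns would give ~1 arcmin/pixel at 20/20")
```

The same one-arcminute rule is what display makers mean when they call a screen "retina-grade" at a given viewing distance.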

Yes, in addition to the holodeck, some other science fiction parallels come to mind. There's the glove-controlled holographic interface from Minority Report, the high-tech nursery in Ray Bradbury's short story "The Veldt," and the parlor walls he envisioned in Fahrenheit 451. And apparently, this is no accident, since Jason Leigh, the lab's director and head of the project, is a major sci-fi geek!

But of course, all this technology was designed with real-life, practical applications in mind. These range from the exploration of outer space to the exploration of inner space, particularly the human body. As Ali Alaraj, a neurosurgeon who has used the CAVE2, put it:

“You can walk between the blood vessels. You can look at the arteries from below. You can look at the arteries from the side. …That was science fiction for me. It’s fantastic to come to work. Every day is like getting to live a science fiction dream. To do science in this kind of environment is absolutely amazing.”

All of this bodes well for NASA's plans for space exploration involving space probes, holographics, and avatars. It would also be incredibly awesome as far as hospitals are concerned: one day, they could perform diagnostics using nanoprobes that map a patient's body, inch by inch, from the inside out.

And of course, the EVL has provided a cool video of the CAVE2 platform in action. Check it out:

Sources: IO9, evl.uic.edu