The Future is Here: Augmented Reality Storybooks

Disney has always been at the forefront of technological innovation wherever their animation is concerned. Augmented reality has been a part of their operations for quite some time, usually in the form of displays put on at Epcot Center or the Haunted Mansion. But now, they are bringing their efforts in AR to the kind of standard storybook you would read to your children before bedtime.

Thanks to innovations like the Nintendo DS, the PSP, tablets and smartphones, books have come alive and become interactive in ways that were simply not possible ten or twenty years ago. However, one cannot deny that ebooks simply do not have the same kind of old-world charm and magic that paperbacks do. Call it nostalgic appeal or tradition, but reading to a child from a bound tome just seems somehow more meaningful to most people.

And that's where Disney's HideOut project comes into play: a handheld mobile projector is used to create an augmented reality storybook. How it works is simple enough and, in a way, involves merging the best of electronic and paper media. Within the book, certain parts are printed using a special infrared-absorbing ink, so that sentences and images can be tracked.

The mobile projector, in turn, uses a built-in camera to sense the ink, then projects digital images onto the page’s surface that are animated to interact with the markers. In this way, it knows to show certain images when parts of the book call for them to be displayed, and can turn normal pictures into 3D animated segments.
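To give a rough sense of the pipeline being described, here is a minimal sketch in Python using OpenCV. This is not Disney's code, and the details (threshold values, marker shape, file and window names) are illustrative assumptions: the camera looks for a dark quadrilateral where the infrared-absorbing ink sits, works out how the page is oriented, and warps an animation frame so the projected image lines up with it.

```python
# Hypothetical sketch of HideOut-style marker tracking and projection mapping.
# Assumes an IR-sensitive camera in which the special ink shows up as a dark quad.
import cv2
import numpy as np

overlay = cv2.imread("animation_frame.png")   # placeholder animation frame to project
cap = cv2.VideoCapture(0)                     # stand-in for the projector's built-in camera

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # The IR-absorbing ink appears dark, so an inverted threshold isolates it.
    _, mask = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    for cnt in contours:
        approx = cv2.approxPolyDP(cnt, 0.02 * cv2.arcLength(cnt, True), True)
        if len(approx) == 4 and cv2.contourArea(approx) > 1000:
            # Map the overlay's corners onto the detected marker's corners so the
            # projected animation stays registered to the page (corner ordering
            # is glossed over here for brevity).
            h, w = overlay.shape[:2]
            src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
            dst = approx.reshape(4, 2).astype(np.float32)
            H = cv2.getPerspectiveTransform(src, dst)
            warped = cv2.warpPerspective(overlay, H, (frame.shape[1], frame.shape[0]))
            frame = cv2.addWeighted(frame, 1.0, warped, 1.0, 0)

    cv2.imshow("projected output", frame)      # stand-in for the projector output
    if cv2.waitKey(1) == 27:                   # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```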

And storybooks aren't the only application being investigated by Disney. In addition, they have been experimenting with game concepts, where a user moves the mobile projector around a board, causing a character to avoid enemies. In another scenario, a character projected onto a surface interacts with tangible objects placed around it. This would not only be entertaining to a child, but could be educational as well.

The applications also extend to the world of work, as the demo below shows. In this case, HideOut projects a file system onto the top of a desk, allowing the user to choose folders by aiming the projector, not unlike how a person selects channels or options with a Wii remote by pointing it at a sensor bar. And the technology could even be used on smartphones and mobile devices, allowing people to interact with their phone, FaceTime, or Skype on larger surfaces.
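The selection idea itself is straightforward: once the system knows where on the desk the projector is aimed, it simply maps that point to whichever projected folder occupies that region. The snippet below is a toy illustration of that mapping; the grid layout, desk dimensions, and folder names are invented for the example and are not based on any published HideOut code.

```python
# Toy illustration: map an "aim point" on the desk to a folder in a projected grid.
# Grid layout, desk size, and folder names are invented for the example.

FOLDERS = [
    ["Documents", "Pictures", "Music"],
    ["Videos",    "Projects", "Trash"],
]
DESK_WIDTH, DESK_HEIGHT = 1.2, 0.8   # projected area in metres (assumed)

def folder_at(aim_x: float, aim_y: float) -> str:
    """Return the folder under the aim point, given coordinates in metres."""
    cols = len(FOLDERS[0])
    rows = len(FOLDERS)
    col = min(int(aim_x / DESK_WIDTH * cols), cols - 1)
    row = min(int(aim_y / DESK_HEIGHT * rows), rows - 1)
    return FOLDERS[row][col]

# Example: aiming near the top-right of the projected area selects "Music".
print(folder_at(1.1, 0.1))
```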

And of course, Disney is not the only company developing this kind of AR interactive technology, nor are they the first. Products like ColAR, an app that brings your coloring book images to life, and Eye of Judgment, an early PS3 game that scanned physical CCG cards and animated the characters on-screen, are already on the market. And while there does not appear to be a release date for Disney's HideOut device just yet, it's likely to be making the rounds within a few years tops.

For anyone familiar with the world of augmented reality and computing, this is likely to call to mind what Pranav Mistry demonstrated with his SixthSense technology, which is being adopted by numerous developers for mobile computing. Since he first unveiled the concept back in 2009, the technology has been improving and the potential for commercial applications has been keeping pace.

In just a few years time, every storybook is likely to come equipped with its own projector. And I wouldn’t be surprised if it quickly becomes the norm to see people out on the streets interacting with images and worlds that only they can see. And those of us who are old enough will think back to a time when only crazy people did this!

In the meantime, check out this demo of Disney's HideOut device in action:


Source: extremetech.com

IBM Creates First Photonic Microchip

For many years, optical computing has been a subject of great interest to engineers and researchers. As opposed to the current crop of computers, which rely on the movement of electrons in and out of transistors to perform logic, an optical computer relies on the movement of photons. Such a computer would confer obvious advantages, mainly in the realm of computing speed, since photons travel much faster than electrical current.

While the concept and technology are relatively straightforward, no one had been able to develop photonic components that were commercially viable. All that changed this past December, as IBM became the first company to integrate electrical and optical components on the same chip. As expected, when tested, this new chip was able to transmit data significantly faster than current state-of-the-art copper and optical networks.

But what was surprising was just how big the difference really was. Whereas current interconnects are generally measured in gigabits per second, IBM's new chip is already capable of shuttling data around at terabits per second. In other words, over a thousand times faster than what we're currently used to. And since it will be no big task or expense to replace the current generation of electrical components with photonic ones, we could be seeing this chip taking the place of our standard CPUs really soon!
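To put that jump in perspective, here is a back-of-the-envelope comparison; the link speeds are illustrative round numbers, not IBM's published figures. Moving a 1 TB dataset over a 10 Gb/s link versus a 1 Tb/s link:

```python
# Rough throughput comparison; link speeds are illustrative, not IBM's published figures.
DATASET_BITS = 1e12 * 8          # 1 terabyte expressed in bits

for label, bits_per_second in [("10 Gb/s copper link", 10e9),
                               ("1 Tb/s photonic link", 1e12)]:
    seconds = DATASET_BITS / bits_per_second
    print(f"{label}: {seconds:.0f} s to move 1 TB")

# Output:
#   10 Gb/s copper link: 800 s to move 1 TB
#   1 Tb/s photonic link: 8 s to move 1 TB
```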

This comes after a decade of research and an announcement made back in 2010, specifically that IBM Research was tackling the concept of silicon nanophotonics. And since they’ve proven they can create the chips commercially, they could be on the market within just a couple of years. This is certainly big news for supercomputing and the cloud, where limited bandwidth between servers is a major bottleneck for those with a need for speed!

Cool as this is, there are actually two key breakthroughs to boast about here. First, IBM has managed to build a monolithic silicon chip that integrates both electrical (transistors, capacitors, resistors) and optical (modulators, photodetectors, waveguides) components. Monolithic means that the entire chip is fabricated from a single crystal of silicon on a single production line, and the optical and electrical components are mixed up together to form an integrated circuit.

Second, and perhaps more importantly, IBM was able to manufacture these chips using the same process they use to produce the CPUs for the Xbox 360, PS3, and Wii. This was not easy, according to internal sources, but in so doing, they can produce this new chip using their standard manufacturing process, which will not only save them money in the long run, but make the conversion process that much cheaper and easier. From all outward indications, it seems that IBM spent most of the last two years trying to ensure that this aspect of the process would work.

Excited yet? Or perhaps concerned that this boost in speed will mean even more competition and the need to constantly upgrade? Well, given the history of computing and technological progress, both of these sentiments would be right on the money. On the one hand, this development may herald all kinds of changes and possibilities for research and development, with breakthroughs coming within days and weeks instead of years.

At the same time, it could mean that the rest of us will be even more hard pressed to keep our software and hardware current, which can be frustrating as hell. As it stands, Moore's Law holds that computing power roughly doubles every 18 months to two years. Now imagine that dwindling to just a few weeks, and you've got a whole new ballgame!
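Just to put numbers to that thought experiment (the "few weeks" cadence is this article's speculation, not a prediction), compound doubling over the same two-year horizon looks wildly different at the two rates:

```python
# Compound doubling over a fixed horizon; the ~6-week cadence is the
# article's hypothetical, not a real projection.
HORIZON_MONTHS = 24

for label, doubling_period_months in [("Moore's Law (~18 months)", 18),
                                      ("Hypothetical (~6 weeks)", 1.5)]:
    doublings = HORIZON_MONTHS / doubling_period_months
    growth = 2 ** doublings
    print(f"{label}: ~{growth:,.0f}x in {HORIZON_MONTHS} months")

# Output:
#   Moore's Law (~18 months): ~3x in 24 months
#   Hypothetical (~6 weeks): ~65,536x in 24 months
```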

Source: Extremetech.com