The Future is Here: The Li-Fi Network

Scientists have been looking at optics for some time as a way of enhancing conventional data processing. In terms of computing, this means that using optical components – which use photons rather than electrons to transmit information – could lead to computers that run far faster than those built from traditional electronics. But a group of German scientists has taken that a step further, proposing an internet that runs on the same principles.

Using conventional LED bulbs in a laboratory setting, researchers at the Fraunhofer Heinrich Hertz Institute (HHI) in Germany successfully transmitted data at 3 Gbps. In a real-world setting, the same system was capable of transmitting data at a rate of 500 Mbps, roughly a dozen to several hundred times what a conventional WiFi network can manage.
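To put those numbers in perspective, here is a quick back-of-the-envelope comparison in Python. The WiFi throughput figures below are rough illustrative assumptions, not values reported by the researchers.

```python
# Rough comparison of the reported VLC rates with typical real-world WiFi
# throughput. The WiFi numbers below are illustrative assumptions.
vlc_lab_mbps = 3000    # 3 Gbps achieved in the lab
vlc_field_mbps = 500   # 500 Mbps achieved in a real-world setting

wifi_examples_mbps = {"older 802.11g router": 20, "typical 802.11n router": 50}

for name, wifi_mbps in wifi_examples_mbps.items():
    print(f"vs {name}: field rate ~{vlc_field_mbps / wifi_mbps:.0f}x faster, "
          f"lab rate ~{vlc_lab_mbps / wifi_mbps:.0f}x faster")
```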

The concept of visible light communications (VLC), or LiFi as it is sometimes known, has received a lot of attention in recent years, mostly due to the growing prevalence of LED technology. Much like other solid-state electronics, LEDs can be switched and modulated just like any other electronic component. By extension, a VLC network can be built along the same lines as a WiFi one, using terahertz radiation (visible light) instead of microwaves, an LED bulb instead of a WiFi transmitter, and photodetectors instead of antennas.
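The article does not spell out a modulation scheme, but the simplest way to picture this LED-and-photodetector link is on-off keying: the bulb flickers on and off far faster than the eye can see, and a photodetector turns the brightness back into bits. Here is a minimal Python sketch of that idea; the noise level, threshold, and helper names are illustrative assumptions, not details of the Fraunhofer system.

```python
import random

def led_transmit(bits, brightness=1.0):
    """LED side: each bit becomes a light intensity (on/off keying)."""
    return [brightness if b else 0.0 for b in bits]

def photodetector_receive(intensities, noise=0.1, threshold=0.5):
    """Photodetector side: add ambient/sensor noise, then threshold back to bits."""
    return [1 if (i + random.gauss(0, noise)) > threshold else 0 for i in intensities]

# Encode a short message as bits, "shine" it across the link, and decode it.
message = "LiFi"
bits = [int(b) for ch in message for b in format(ord(ch), "08b")]
received = photodetector_receive(led_transmit(bits))
decoded = "".join(
    chr(int("".join(map(str, received[i:i + 8])), 2))
    for i in range(0, len(received), 8)
)
print(decoded)  # prints "LiFi" (barring the occasional noise-induced bit flip)
```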

Compared to WiFi, the LiFi concept comes with a slew of advantages. First of all, it can turn any LED lamp into a network connection, and since it operates at such high frequencies, it falls well outside the current regulatory licensing framework. For the same reason, LiFi can be used in areas where RF (radio-frequency) interference is a concern, such as on airplanes and in airports and hospitals. The Fraunhofer researchers even claim that VLC improves privacy, since the signal travels directly from one device to another rather than as radio waves that can easily be picked up by a third party.

But of course, there is still much research and development to be done. As it stands, the Fraunhofer system is limited both in how much information it can send and in how far the signal can travel. To compete with conventional WiFi, an optical system will have to demonstrate that it can pack significant bandwidth into a signal that reaches well in excess of 100 m.

Nevertheless, there are numerous startups that are making headway, and many more researchers who are adapting optical components for computers as we speak. As a result, it shouldn’t be long before signs like this are appearing everywhere in your neighborhood…

[Image: LiFi internet sign]

Source: Extremetech.com

IBM Creates First Photonic Microchip

For many years, optical computing has been a subject of great interest for engineers and researchers. As opposed to the current crop of computers, which rely on the movement of electrons in and out of transistors to do logic, an optical computer relies on the movement of photons. Such a computer would confer obvious advantages, mainly in the realm of computing speed, since photons can move information much faster than electrical current.

While the concept and the technology are relatively straightforward, no one had been able to develop photonic components that were commercially viable. All that changed this past December, when IBM became the first company to integrate electrical and optical components on the same chip. As expected, when tested, the new chip was able to transmit data significantly faster than current state-of-the-art copper and optical networks.

But what was surprising was just how big the difference really was. Whereas current interconnects are generally measured in gigabits per second, IBM's new chip is already capable of shuttling data around at terabits per second. In other words, over a thousand times faster than what we're currently used to. And since it will be no big task or expense to replace the current generation of electrical components with photonic ones, we could be seeing this chip taking the place of our standard CPUs really soon!

This comes after a decade of research and an announcement made back in 2010, specifically that IBM Research was tackling the concept of silicon nanophotonics. And since they’ve proven they can create the chips commercially, they could be on the market within just a couple of years. This is certainly big news for supercomputing and the cloud, where limited bandwidth between servers is a major bottleneck for those with a need for speed!

Cool as this is, there are actually two key breakthroughs to boast about here. First, IBM has managed to build a monolithic silicon chip that integrates both electrical (transistors, capacitors, resistors) and optical (modulators, photodetectors, waveguides) components. Monolithic means that the entire chip is fabricated from a single crystal of silicon on a single production line, with the optical and electrical components mixed together to form a single integrated circuit.

Second, and perhaps more importantly, IBM was able to manufacture these chips on the same production line it uses for the CPUs in the Xbox 360, PS3, and Wii. This was not easy, according to internal sources, but it means the new chip can be made with IBM's standard manufacturing process, which will not only save money in the long run but also make the conversion that much cheaper and easier. From all outward indications, IBM spent most of the last two years making sure this aspect of the process would work.

Excited yet? Or perhaps concerned that this boost in speed will mean even more competition and the need to constantly upgrade? Well, given the history of computing and technological progress, both of these sentiments would be right on the money. On the one hand, this development may herald all kinds of changes and possibilities for research and development, with breakthroughs coming within days and weeks instead of years.

At the same time, it could mean that the rest of us will be even more hard pressed to keep our software and hardware current, which can be frustrating as hell. As it stands, Moore's Law holds that processing power roughly doubles every 18 months to two years. Now imagine that dwindling to just a few weeks, and you've got a whole new ballgame!
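For a sense of scale, here is a small Python sketch of that thought experiment. The three-year window and the hypothetical six-week doubling period are illustrative assumptions, not predictions from the article.

```python
# Compound growth under different doubling periods (illustrative assumptions).
def speedup(months_elapsed, doubling_period_months):
    """How many times faster things get after a given span of time."""
    return 2 ** (months_elapsed / doubling_period_months)

years = 3
months = years * 12
print(f"After {years} years at an 18-month doubling: {speedup(months, 18):.0f}x")
print(f"After {years} years at a ~6-week (1.5-month) doubling: {speedup(months, 1.5):,.0f}x")
```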

Source: Extremetech.com