The Future of Physics: Entanglements and Wormholes

Quantum entanglement is one of the most bizarre aspects of quantum physics, so much so that Albert Einstein himself referred to it as “spooky action at a distance.” Basically, the concept involves two particles, each occupying multiple states at once. Until one of them is measured, neither has a definite state; the moment one is measured, the other instantly assumes a corresponding state, even if the two reside on opposite ends of the universe.
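For readers who like to see the math, the standard two-particle example is a Bell state — a textbook construction, not something specific to the papers discussed below:

```latex
% A maximally entangled pair of spins (the singlet Bell state):
\[
  |\Psi\rangle \;=\; \frac{1}{\sqrt{2}}\Bigl( |\!\uparrow\rangle_A |\!\downarrow\rangle_B \;-\; |\!\downarrow\rangle_A |\!\uparrow\rangle_B \Bigr)
\]
% Before measurement, neither particle A nor B has a definite spin on its own;
% measuring A as "up" leaves B in the definite state "down" (and vice versa),
% no matter how far apart the two particles are.
```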

But what enables particles to communicate instantaneously – and seemingly faster than the speed of light – over such vast distances? Earlier this year, physicists proposed an answer in the form of “wormholes,” or gravitational tunnels. The group showed that by creating two entangled black holes, then pulling them apart, they formed a wormhole connecting the distant black holes.

Now an MIT physicist has found that, looked at through the lens of string theory, the creation of two entangled quarks — the very building blocks of matter — simultaneously gives rise to a wormhole connecting the pair. The theoretical results bolster the relatively new and exciting idea that the laws of gravity holding the universe together may not be fundamental, but may themselves arise from quantum entanglement.

Julian Sonner, a senior postdoc at MIT’s Laboratory for Nuclear Science and Center for Theoretical Physics, published the results of his study in the journal Physical Review Letters, where it appears together with a related paper by Kristan Jensen of the University of Victoria and Andreas Karch of the University of Washington. Already, the theory is causing quite a buzz among scientists and sci-fi fans who would like to believe FTL travel is still possible.

This is certainly good news for scientists looking to resolve the fundamental nature of the universe by seeing how its discernible laws fit together. Ever since quantum mechanics was first proposed more than a century ago, the main challenge for physicists has been to explain how it can be reconciled with gravity. While quantum mechanics works extremely well at describing how things work on the microscopic level, it remains incompatible with general relativity.

For years, physicists have tried to come up with a theory that can marry the two fields. This has ranged from proposing the existence of a subatomic particle known as the “graviton” or “dilaton”, to various Grand Unified Theories – aka. the Theory of Everything (TOE) – such as Superstring Theory, Loop Quantum Gravity, and other theoretical models to explain the interaction. But so far, none have proven successful.

A theory of quantum gravity would suggest that classical gravity is not a fundamental concept, as Einstein first proposed, but rather emerges from a more basic, quantum-based phenomenon. In a macroscopic context, this would mean that the universe is shaped by something more fundamental than the forces of gravity. This is where quantum entanglement could play a role.

Naturally, there is a problem with this idea. Two entangled particles, “communicating” across vast distances, would have to do so at speeds faster than that of light — a violation of the laws of physics, according to Einstein. In July, physicists Juan Maldacena of the Institute for Advanced Study and Leonard Susskind of Stanford University proposed a theoretical solution in the form of two entangled black holes.

When the black holes were entangled, then pulled apart, the theorists found that what emerged was a wormhole – a tunnel through space-time that is thought to be held together by gravity. The idea seemed to suggest that, in the case of wormholes, gravity emerges from the more fundamental phenomenon of entangled black holes. Following up on work by Jensen and Karch, Sonner has sought to tackle this idea at the level of quarks.

To see what emerges from two entangled quarks, he first generated entangled quarks using the Schwinger effect — a concept in quantum theory that enables one to create particles out of nothing. Sonner then mapped the entangled quarks onto a four-dimensional space, considered a representation of space-time. In contrast, gravity is thought to exist in the fifth dimension. According to Einstein’s laws, it acts to “bend” and shape space-time.
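As a rough guide to what the Schwinger effect says (this is the standard result for electron–positron pairs in a constant electric field; the quark-level version Sonner uses is more involved), the pair-production rate is exponentially suppressed by the field strength:

```latex
% Schwinger pair-production rate in a constant electric field E,
% for particles of mass m and charge e:
\[
  \Gamma \;\propto\; \exp\!\left( -\frac{\pi m^2 c^3}{e E \hbar} \right)
\]
% The rate is tiny unless the field is enormous, which is why "creating
% particles out of nothing" is a theorist's tool more than a lab technique.
```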

To see what geometry may emerge in the fifth dimension from entangled quarks in the fourth, Sonner employed holographic duality, a concept in string theory. While a hologram is a two-dimensional object, it contains all the information necessary to represent a three-dimensional view. Essentially, holographic duality is a way of deriving what happens in a higher dimension from the information contained in the next lowest one.
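Schematically, the duality equates a gravitational calculation in the higher-dimensional “bulk” with a field-theory calculation on its lower-dimensional boundary. This is the generic textbook statement of the correspondence, not Sonner’s specific computation:

```latex
% The schematic holographic dictionary: the gravitational partition function
% in the bulk, with boundary condition phi -> phi_0, equals a CFT expectation
% value on the boundary (the field phi_0 acts as a source for the operator O).
\[
  Z_{\text{gravity}}\Bigl[\phi\big|_{\partial M} = \phi_0\Bigr]
  \;=\;
  \Bigl\langle \exp\!\Bigl( \int_{\partial M} \phi_0\, \mathcal{O} \Bigr) \Bigr\rangle_{\text{CFT}}
\]
```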

Using holographic duality, Sonner worked out the geometry associated with the entangled quarks, and found that what emerged was a wormhole connecting the two, implying that the creation of quarks simultaneously creates a wormhole between them. More fundamentally, the results suggest that gravity itself may emerge from quantum entanglement. On top of all that, the geometry, or bending, of the universe as described by classical gravity may also be a consequence of entanglement.

As Sonner put it in his report, the results offer a theoretical inroad into a problem that has dogged scientists for quite some time:

There are some hard questions of quantum gravity we still don’t understand, and we’ve been banging our heads against these problems for a long time. We need to find the right inroads to understanding these questions… It’s the most basic representation yet that we have where entanglement gives rise to some sort of geometry. What happens if some of this entanglement is lost, and what happens to the geometry? There are many roads that can be pursued, and in that sense, this work can turn out to be very helpful.

Granted, the idea of riding wormholes so that we, as humans, can travel from one location in space to another is still very much science fiction. But knowing that there may very well be a sound scientific basis for their existence is good news for anyone who believes we will be able to “jump” around the universe in the near to distant future. I used to be one of them; now… I think I might just be a believer again!

Sources: web.mit.edu, extremetech.com

The Future of Fusion: 1-MW Cold Fusion Plant Now Available!

It’s actually here: the world’s first fusion power plant capable of generating a full megawatt of power, and it’s available for pre-order. It’s known as the E-Cat 1MW Plant, which comes in a standard shipping container and uses low-energy nuclear reactions (LENR) – a process, often known as cold fusion, that fuses nickel and hydrogen into copper – to produce energy 100,000 times more efficiently than combustion.

E-Cat, or Energy Catalyzer, is a technology (and company of the same name) developed by Andrea Rossi – an Italian scientist who claims he’s finally harnessed cold fusion. For just $1.5 million, people can pre-order an E-Cat and expect delivery by early 2014. With this news, many are wondering if the age of cold fusion, where clean, abundant energy is readily available, is finally upon us.

Cold fusion, as the name implies, is like normal fusion, but instead of producing fast neutrons and ionizing radiation that decimates everything in its path, cold fusion’s Low-Energy Nuclear Reactions (LENR) produce very slow, safe neutrons. Where normal fusion requires massive, expensive containment systems, it sounds like E-Cat’s cold fusion can be safely contained inside a simple, pressurized vessel.

And while normal fusion power is generated by fusing hydrogen atoms, cold fusion fuses nickel and hydrogen into copper, by way of some kind of special catalyst. Despite the rudimentary setup, though, cold fusion still has the massive power and energy density intrinsic to atomic fusion. In short, it produces far more energy than conventional chemical reactions – such as burning fossil fuels. The only challenge is the massive amount of power usually required to initiate the reaction.

According to E-Cat, each of its cold fusion reactors measures 20x20x1 centimeters (7.8×7.8×0.39 inches), and you stack these individual reactors together in parallel to create a thermal plant. The E-Cat 1MW Plant consists of 106 of these units rammed into a standard shipping container. Based on the specs provided by Rossi, the fuel cost works out to about $1 per megawatt-hour, which is utterly insane. Coal power is around $100 per megawatt-hour.

But before anyone gets too excited about the commercialization of cold fusion, it should be noted that Rossi is still being incredibly opaque about how his cold fusion tech actually works. The data sheet for the 1MW Plant shares one interesting tidbit: Despite producing 1MW of power, the plant requires a constant 200 kilowatts of input power — presumably to sustain the reaction.
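Taking the data-sheet figures quoted in this article at face value (1 MW thermal output, 200 kW of constant input power, 106 reactor units, a claimed fuel cost of roughly $1 per MWh against about $100 per MWh for coal), a quick back-of-envelope sketch shows why the claims sound so dramatic. None of these numbers are independently verified:

```python
# Back-of-envelope check of the E-Cat 1MW Plant figures quoted above.
# All inputs are the article's claimed numbers, taken at face value.

output_kw = 1000        # claimed thermal output (1 MW)
input_kw = 200          # claimed constant electrical input
reactors = 106          # individual reactor units in the shipping container

net_kw = output_kw - input_kw          # net thermal power
cop = output_kw / input_kw             # heat out per unit of electricity in
per_reactor_kw = output_kw / reactors  # output per 20x20x1 cm unit

fuel_cost_ecat = 1.0    # claimed $/MWh
fuel_cost_coal = 100.0  # rough $/MWh figure quoted for coal

print(f"Net output: {net_kw} kW ({cop:.0f}x more heat out than electricity in)")
print(f"Per reactor: {per_reactor_kw:.1f} kW")
print(f"Claimed fuel-cost advantage over coal: {fuel_cost_coal / fuel_cost_ecat:.0f}x")
```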

The spec sheet also says that the fuel (specially treated nickel and hydrogen gas) needs to be recharged every two years. One of the scientific community’s biggest sticking points about Rossi’s cold fusion devices is that he hasn’t proven that his LENR is self-sustaining. Despite a huge amount of output energy, the device still needs to be connected to the mains.

What’s more, due to a lack of published papers, and thus peer review, and a dearth of protective patents, the scientific community in general remains very wary of Rossi’s claims. And of course, we should all remember that this is not the first time that researchers have proclaimed victory in the race to make cold fusion happen. Whenever the words “cold fusion” are raised, the case of the Fleischmann–Pons experiment immediately springs to mind.

For those who remember, this case involved an experiment conducted in 1989 in which two researchers claimed to have achieved cold fusion using palladium rods and heavy water. Initially, the scientific community treated the news with excitement and interest, but after numerous labs were unable to reproduce the experiment, and a number of false positives were reported, their claims were officially debunked and they relocated their lab to avoid any further controversy.

At the same time, however, one must remember that some significant changes have happened in the past three decades. For one, NASA’s LENR facility has been working on producing cold fusion reactions for some time using an oscillating nickel lattice and hydrogen atoms. Then there was the recent milestone at the National Ignition Facility in California, which achieved the first laser-driven fusion reaction that produced more energy than it consumed.

Who’s to say if this is the real deal? All that is known is that between this most recent claim and the ongoing experiments by NASA and other research organizations to make LENR cold fusion work, a revolution in clean energy may well be coming, and most likely within our lifetimes.

Addendum: Just been informed by WordPress that this is my 1400th post! Woot-woot!

Sources: extremetech.com, ecat.com

Evidence for the Big Bang

The Big Bang Theory has been the dominant cosmological model for over half a century. According to the theory, the universe was created approximately 14 billion years ago from an extremely hot, dense state and then began expanding rapidly. After the initial expansion, the Universe cooled and began to form various subatomic particles and basic elements. Giant clouds of these primordial elements later coalesced through gravity to form stars, galaxies, and eventually planets.

And while it has its detractors, most of whom subscribe to the alternate Steady State Theory – which claims that new matter is continuously created as the universe expands – it has come to represent the scientific consensus as to how the universe came to be. And as usual, my ol’ pal and mentor in all things digital, Fraser Cain, recently released a video with the help of Universe Today discussing the particulars of it.

Addressing the particulars of the Big Bang Theory, Cain lists the many contributions made over the past century that have led this theory to become the scientific consensus. They are, in a nutshell:

  1. Cosmic Expansion: In 1912, astronomer Vesto Slipher calculated the speed and distance of “spiral nebulae” (galaxies) by measuring the light coming from them. He determined most were moving away. In 1924, Edwin Hubble determined that these galaxies were outside the Milky Way. He postulated that the motion of galaxies away from our own indicates a common point of origin (a short worked example of Hubble’s law follows this list).
  2. Abundance of Elements: Immediately after the Big Bang, only hydrogen existed, compressed into a tiny area of space under incredible heat and pressure. As in a star, this fused hydrogen into helium and other basic elements. Looking out into the universe (and hence back in time), scientists have found that at great distances, the ratio of hydrogen to these basic elements is consistent with what is found in the interiors of stars.
  3. Cosmic Microwave Background (CMB) Radiation: In the 1960s, using a radio telescope, Arno Penzias and Robert Wilson discovered a background radio emission coming from every direction in the sky, day or night. This was consistent with the Big Bang Theory, which predicted that after the Big Bang there would have been a release of radiation that expanded across billions of light-years in all directions and cooled to the point that it shifted into invisible microwave radiation.
  4. Large Scale Structure: The formation of galaxies and the large-scale structure of the cosmos are very similar. This is consistent with the belief that after the initial Big Bang, the matter created would have cooled and begun to coalesce into large collections, which is what galaxies, local galactic groups, and super-clusters are.
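To make the first pillar concrete, here is a quick sketch of Hubble’s law, v = H0 × d: the farther away a galaxy is, the faster it recedes, and the reciprocal of H0 gives a rough age for the universe (the ~14 billion years quoted above). The H0 value below is the commonly cited figure of roughly 70 km/s per megaparsec, used purely for illustration:

```python
# Hubble's law: recession velocity grows linearly with distance, v = H0 * d.
# Inverting H0 gives a crude "Hubble time", roughly the age of the universe.

H0 = 70.0                      # Hubble constant, km/s per megaparsec (approximate)
KM_PER_MPC = 3.086e19          # kilometres in one megaparsec
SECONDS_PER_YEAR = 3.156e7

def recession_velocity_km_s(distance_mpc: float) -> float:
    """Velocity at which a galaxy at `distance_mpc` recedes from us."""
    return H0 * distance_mpc

hubble_time_s = KM_PER_MPC / H0                      # 1/H0 in consistent units
hubble_time_gyr = hubble_time_s / SECONDS_PER_YEAR / 1e9

print(recession_velocity_km_s(100))                  # a galaxy 100 Mpc away: ~7,000 km/s
print(f"Hubble time ~ {hubble_time_gyr:.1f} billion years")   # ~14 billion years
```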

These are the four pillars of the Big Bang Theory, but they are by no means the only points in its favor. In addition, there are numerous observational clues, such as the fact that we have yet to observe any stars in the universe older than about 13 billion years, and fluctuations in the CMB that indicate a lack of perfect uniformity. On top of that, there is the ongoing research into the existence of Dark Matter and Dark Energy, which is sure to bear fruit in the near future if all goes well.

In short, scientists have a pretty good idea of how the universe came to be, and the evidence all seems to confirm it. And while some mysteries remain, we can be relatively confident that ongoing experimentation and research will come up with new and creative ways to shed light on the final unknowns. Little wonder, then, that the Big Bang Theory enjoys such widespread support, much like Evolution, Gravity, and General Relativity.

Be sure to check out the full video, and subscribe to Universe Today for additional informative videos, podcasts, and articles. As someone who used to write for them, I can tell you that it’s a pretty good time, and very enlightening!

News from Space: The Search for Life on Europa

Jupiter’s moon Europa is one of the best and most intriguing candidates for extra-terrestrial life in our Solar System. For many decades, scientists have known that beneath its icy outer shell, a warm, liquid ocean resides. Due largely to Europa’s tidal interaction with Jupiter – which generates heat in the moon’s interior – these warm waters may host life.

And now, new models suggest that its ice-covered waters are turbulent near the lower latitudes. This is what gives rise to its chaotic equatorial landscapes, but intriguingly, it may also make it easier for life to make it to the surface. This contradicts the previously held belief that any Europan life would be confined beneath its outer shell, and it means that any missions mounted to Europa may have an easier time spotting it.

Thanks to ongoing observation of the moon’s surface – especially by the Galileo and New Horizons space probes, which provided comprehensive and detailed images – it has been known that Europa’s surface features are not consistent. The landscape is marked by regions of disrupted ice known as chaos terrains, geological features characterized by huge chunks of ice that have broken away and then refrozen into chaotic patterns.

These models were produced by University of Texas geophysicist Krista Soderlund and her colleagues. Based on computer simulations, Soderlund and her team have theorized that turbulent global ocean currents move Europa’s internal heat to the surface most efficiently in regions closest to the moon’s equator. This is likely what causes the melting and upwelling at the surface, and why regions farther north and south appear smoother.

In addition, the models indicate that given Europa’s spin, heat flow, and other factors, the ocean water likely percolates upward at about 1 meter per second or so — which is remarkably fast. This would explain why the equatorial regions appear to be so fragmented. But it also means that these areas are likely to be relatively fragile and soft, which means that upward currents could bring nutrients and even living organisms to the surface.

Hence why any potential search for signs of life on this moon would now appear to be considerably easier. If missions are indeed mounted to Europa in the not-too-distant future, either involving probes or manned missions (most likely in that order), their best bet for finding life would be to land at the equator. Then, with some drilling, they could obtain core samples that would determine whether or not life-sustaining nutrients and organic particles exist beneath the ice.

Hopefully, these missions won’t run afoul of any life that doesn’t take too well to their presence. We don’t want a re-enactment of Europa Report on our hands now do we?

Source: IO9.com

Climate Crisis: Illustrative Video of Impending Disaster

Recently, the United Nations’ Intergovernmental Panel on Climate Change released its 2012 report, which contained some rather stark observations and conclusions. In addition to reconfirming what the 2007 report said about the anthropogenic effects of CO2 emissions, the report also tackled speculation about the role of Solar Forcing and Cosmic Rays in Global Warming, as well as why warming has been proceeding slower than previously expected.

In the end, the report concluded that certain natural factors, such as the influence of the Sun and the role of Cosmic Rays in “seeding clouds”, were diminishing, and thus, if anything, exert a slight cooling influence. In spite of that, global temperatures continue to increase, due to the fact that humanity’s output of greenhouse gases (particularly CO2) has not slowed down one bit in recent years.

The report also goes on to lay out detailed scenarios of what we can expect in the coming decades, in extreme and extensive detail. However, for those who have neither the time, the patience, nor the technical knowledge to wade through the report, a helpful video has been provided. Courtesy of Globaia, this four-minute video sums up the facts about Climate Change and how it is likely to impact Earth’s many inhabitants, human and otherwise.

Needless to say, the facts are grim. By 2050, if humans remain on their current path, global temperatures will rise more than two degrees Celsius above what they have been for most of human history. By 2100, they might even climb four degrees. The IPCC report, and this video, confirm what we’ve been hearing everywhere: Arctic sea ice is disappearing, sea levels are rising, storms are getting more destructive, and the full extent of the coming change is not yet fully known.

Globaia, the organization that put together this data visualization along with other scientists, says that it created the video as a call to action for policymakers. Felix Pharand-Deschenes, who founded the Canadian nonprofit and animated the video, explains:

If we are convinced of the seriousness of the situation, then political actions and technological fixes will result. But we have to change our minds first. This is the reason why we try to translate our terrestrial presence and impacts into images – along with the physical limits of our collective actions.

But of course, there’s still hope. As Pharand-Deschenes went on to say, if we can summon up a “war effort,” and work together the way World War II-era citizens did, we could still manage to transform the social systems that are largely responsible for the problem – everything from transportation and energy to how we grow our food – enough to stay below a two-degree rise.

Of course, this is no small task. But as I love to remind all my readers, research and efforts are underway every day that are making this a reality. Not only are solar, wind, and tidal power moving along by leaps and bounds, becoming profitable as well as affordable, but we are also making great strides in Carbon Capture technology, alternative fuels, and eco-friendly living, all of which are expected to play a huge role in the coming decades.

And though it is often not considered, the progress being made in spaceflight and exploration also plays a role in saving the planet. By making the process of sending ships and satellites into space cheaper, concepts like Space-Based Solar Power (SBSP) can become a reality, one which will meet humanity’s immense power demands in a way that is never marred by weather or locality.

Combined with sintering and 3-D printing, asteroid prospecting and mining could become a reality too in a few decades time. Currently, it is estimated that just a few of the larger rocks beyond the orbit of Mars would be enough to meet Earth’s mineral needs indefinitely. By shifting our manufacturing and mining efforts offworld with the help of automated robot spacecraft and factories, we would be generating far less in the way of a carbon footprint here on Earth.

But of course, the question of “will it be enough?” is a burning one. Some scientists say that an increase of even two degrees Celsius is more than Earth’s creatures can actually handle. But most agree that we need to act immediately to prepare for the future, and that one of the things standing in the way of action is the fact that the problem seems so abstract. Luckily, informational videos like this one present the problem in clear and concise terms.

The IPCC reports that we only have 125 billion tons of CO2 left to burn before reaching the tipping point, and at current rates, that could happen in just over two decades. Will we have a fully renewable-powered, zero-carbon world by then? Who knows? The point is, if we can get such a task underway by then, things may get worse before they get better, but they will improve in the end. Compared to the prospect of extinction, that seems like a bargain!

In the meantime, check out the video – courtesy of Globaia and the International Geosphere-Biosphere Programme (IGBP) – and try to enjoy it despite its gloomy predictions. I assure you, it is well worth it!


Source: fastcoexist.com

 

Alien Spotting by 2020?

With recent observations made possible by the Kepler space telescope, numerous planets have been discovered orbiting distant stars. Whereas previous observations and techniques could detect exoplanets, scientists are now able to observe and classify them, with the ultimate aim of determining how Earth-like they are and whether or not they can support life.

Combined with advanced astronomical techniques, the latest estimates claim that there may be up to 50 sextillion potentially habitable planets in the universe. With their eyes on the next step, the scientific community is now preparing to launch a bevy of new space telescopes that can peer across the universe and tell us how many of those planets actually harbor life.

One such telescope is NASA’s Transiting Exoplanet Survey Satellite (TESS), which will launch in 2017. While Kepler was focused on a single patch of sky containing around 145,000 stars, TESS will be equipped with four telescopes that keep track of around 500,000 stars, including the 1,000 nearest red dwarfs. TESS is expected to find thousands of planets, Earth-sized or larger, orbiting these stars.

But to find out whether or not any of those planets actually house life, another sophisticated telescope needs to be employed – the James Webb Space Telescope. Whereas TESS is Kepler’s successor, the James Webb Space Telescope – a joint NASA/ESA/CSA venture – is the planned successor to the Hubble Telescope and is due to launch in 2018.

The JWST has a primary mirror about five times larger than Hubble’s, which means it can resolve much fainter signals, locating stars and other objects that have never been seen before. Because it primarily operates in the infrared band (whereas Hubble is tuned toward visible light), the JWST will also be able to see through dust clouds into hidden areas of space.

The JWST’s scientific payload includes a spectrometer sensitive enough to analyze the atmospheres of distant planets. By measuring light from the parent stars, and how it is filtered through and reflected by the planets’ atmospheres, it will be able to determine if there are life-supporting elements and evidence of biological activity – such as oxygen and methane.
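As a rough illustration of why this is hard (a toy calculation, not the JWST team’s actual pipeline): the extra starlight absorbed by a planet’s atmosphere during a transit scales with the thin atmospheric annulus relative to the star’s disc, which is a tiny number even in favourable cases. The planet, star, and scale-height values below are illustrative round numbers, not a real target:

```python
# Toy transit-spectroscopy signal: how much extra starlight an atmosphere blocks.

R_SUN = 6.96e8      # metres
R_EARTH = 6.37e6    # metres

def transit_depth(r_planet: float, r_star: float) -> float:
    """Fraction of starlight blocked by the planet's opaque disc."""
    return (r_planet / r_star) ** 2

def atmosphere_signal(r_planet: float, r_star: float, scale_height: float) -> float:
    """Extra depth from an atmospheric annulus roughly one scale height thick."""
    return 2 * r_planet * scale_height / r_star ** 2

# An Earth-sized planet around a small red dwarf (0.2 solar radii),
# with an assumed ~10 km atmospheric scale height:
r_star = 0.2 * R_SUN
print(f"Transit depth: {transit_depth(R_EARTH, r_star):.2e}")                 # ~2e-3
print(f"Atmospheric signal: {atmosphere_signal(R_EARTH, r_star, 1e4):.2e}")   # ~7e-6
```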

Because these planets are light years away, and because the reflected light is incredibly dim, the James Webb Space Telescope will only be able to do this for large planets that orbit red and white dwarfs. Still, that leaves thousands or even millions of candidates that it will be able to observe, and determine whether or not they are already inhabited by extra-terrestrial life.

And last, but not least, there’s the New Worlds Mission, which aims to put a Starshade – essentially a big flying space umbrella – into space. This disc would fly between the James Webb Space Telescope and the star it’s observing, blocking out large amounts of light and the resulting “noise pollution” from nearby bright stars that the JWST isn’t observing.

With the Starshade in place, the JWST would be able to probe thousands of nearby planets for signs of life and return data to Earth of far greater accuracy. The New Worlds Mission is currently in the prototyping stage, but NASA hopes to procure the necessary funding by 2015 and launch it within the JWST’s own lifetime.

Because of all this, it is now believed that by 2020 (give or take a few years) we will have the ability to directly image a distant planet and analyze its atmosphere. And if we find methane or another biological marker on just one planet, it will completely redefine our understanding of the universe and the lifeforms that inhabit it.

The answer to the question – “are we alone in the universe?” – may finally be answered, and within our own lifetime. And in the meantime, be sure to enjoy this video of the Starshade space umbrella, courtesy of New Scientist.


Sources: extremetech.com, wired.co.uk, newscientist.com

News From Space: MAVEN Launched

Yesterday, NASA’s Mars Atmosphere and Volatile Evolution (MAVEN) space probe was finally launched into space. The flawless launch took place from Cape Canaveral Air Force Station’s Space Launch Complex 41 at 1:28 p.m. EST atop a powerful Atlas V rocket. This historic event, the culmination of years’ worth of research, was made all the more significant by the fact that it was nearly scrapped.

Back in late September, during the government shutdown, NASA saw its funding curtailed and put on hold. As a result, there were fears that MAVEN would miss its crucial launch window this November. Luckily, after two days of complete work stoppage, technicians working on the orbiter were granted an exemption and went back to prepping the probe for launch.

Thanks to their efforts, the launch went off without a hitch. Fifty-two minutes later, the $671 million MAVEN probe separated from the Atlas Centaur upper stage, unfurled its wing-like solar panels, and began the 10-month interplanetary voyage that will take it to Mars. Once it arrives, it will begin conducting atmospheric tests that will answer key questions about the evolution of Mars and its potential for supporting life.

Originally described as a “time-machine for Mars”, MAVEN was designed to orbit Mars and examine what its atmosphere was once like, whether it could have supported life, and what led to its destruction. The mission was largely inspired by recent discoveries made by the Opportunity and Curiosity rovers, whose surface studies revealed that Mars boasted a much thicker atmosphere billions of years ago.

During a post-launch briefing for reporters, Bruce Jakosky – MAVEN’s Principal Investigator – described MAVEN’s mission as follows:

We want to determine what were the drivers of that change? What is the history of Martian habitability, climate change and the potential for life?

Once the probe arrives in orbit around Mars, scheduled for September 22nd, 2014, MAVEN will study Mars’ upper atmosphere to explore how the Red Planet may have lost its atmosphere over the course of billions of years. This will be done by measuring the current rates of atmospheric loss to determine how and when Mars lost its atmosphere and water.
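The logic of that measurement can be sketched with a toy model (the numbers and the constant-rate assumption below are purely illustrative, not MAVEN data): if you know roughly how much atmosphere remains and can measure how fast it is escaping today, you can estimate the timescale over which the rest was stripped away.

```python
# Toy illustration of "measure today's escape rate, infer a loss timescale".
# All figures are rough or hypothetical; MAVEN's real analysis is far subtler,
# because the escape rate itself changed over billions of years.

current_atmosphere_kg = 2.5e16      # rough present-day mass of Mars' atmosphere
assumed_loss_rate_kg_s = 100.0      # hypothetical escape rate, kg per second

SECONDS_PER_YEAR = 3.156e7

def loss_timescale_years(mass_kg: float, rate_kg_s: float) -> float:
    """Years needed to strip `mass_kg` of atmosphere at a constant escape rate."""
    return mass_kg / rate_kg_s / SECONDS_PER_YEAR

print(f"{loss_timescale_years(current_atmosphere_kg, assumed_loss_rate_kg_s):.2e} years")
```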

For the sake of this research, MAVEN was equipped with nine sensors that come in three instrument suites. The first is the Particles and Fields Package – which contains six instruments to characterize the solar wind and the ionosphere of Mars – provided by the University of California at Berkeley with support from CU/LASP and NASA’s Goddard Space Flight Center.

The second suite is the Remote Sensing Package, which will determine global characteristics of the upper atmosphere and ionosphere, and was built by CU/LASP. And last, but not least, is the Neutral Gas and Ion Mass Spectrometer, built by Goddard, which will measure the composition of Mars’ upper atmosphere.

As for the long term benefits of the mission and what it could mean for humanity, I’d say that Dr. Jim Green – NASA’s Director of Planetary Science at NASA HQ in Washington, DC – said it best:

We need to know everything we can before we can send people to Mars. MAVEN is a key step along the way. And the team did it under budget! It is so exciting!

Source: universetoday.com

Judgement Day Update: Bionic Computing!

IBM has always been at the forefront of cutting-edge technology. Whether it was the development of computers that could guide ICBMs and rockets into space during the Cold War, or its contributions to the networks that became the Internet in the early 90s, the company has managed to stay on the vanguard by constantly looking ahead. So it comes as no surprise that it had plenty to say last month on the subject of the next big leap.

During a media tour of its Zurich lab in late October, IBM presented some of the company’s latest concepts. According to the company, the key to creating supermachines that are 10,000 times faster and more efficient is to build bionic computers cooled and powered by electronic blood. The end result of this plan is what is known as “Big Blue”, a proposed biocomputer that the company anticipates will take 10 years to develop.

Intrinsic to the design is the merger of computing and biological forms, specifically the human brain. In terms of computing, IBM is relying on the human brain as its template. Through this, the company hopes to enable processing power that’s densely packed into 3D volumes rather than spread out across flat 2D circuit boards with slow communication links.

On the biological side of things, IBM is supplying computing equipment to the Human Brain Project (HBP) – a $1.3 billion European effort that uses computers to simulate the actual workings of an entire brain. Beginning with mice, but then working their way up to human beings, their simulations examine the inner workings of the mind all the way down to the biochemical level of the neuron.

It’s all part of what IBM calls “the cognitive systems era”, a future where computers aren’t just programmed, but also perceive what’s going on, make judgments, communicate with natural language, and learn from experience. As the description would suggest, it is closely related to artificial intelligence, and may very well prove to be the curtain raiser of the AI era.

One of the key challenges behind this work is matching the brain’s power consumption. The ability to process the subtleties of human language helped IBM’s Watson supercomputer win at “Jeopardy.” That was a high-profile step on the road to cognitive computing, but from a practical perspective, it also showed how much farther computing has to go. Whereas Watson uses 85 kilowatts of power, the human brain uses only 20 watts.

Already, a shift has been occurring in computing, which is evident in the way engineers and technicians are now measuring computer progress. For the past few decades, the method of choice for gauging performance was operations per second, or the rate at which a machine could perform mathematical calculations.

But as computers began to require prohibitive amounts of power to perform various functions and generated far too much waste heat, a new measurement was called for. The measurement that emerged as a result was expressed in operations per joule of energy consumed. In short, progress has come to be measured in terms of a computer’s energy efficiency.

But now, IBM is contemplating another method for measuring progress that is known as “operations per liter”. In accordance with this new paradigm, the success of a computer will be judged by how much data-processing can be squeezed into a given volume of space. This is where the brain really serves as a source of inspiration, being the most efficient computer in terms of performance per cubic centimeter.
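The shift in metrics described above is easy to illustrate. The Watson and brain power figures are the ones quoted in this article; the operation count, power draw, and rack volume of the example machine are invented purely for illustration:

```python
# Comparing computers by operations per joule and operations per litre,
# rather than raw operations per second.

def ops_per_joule(ops_per_second: float, watts: float) -> float:
    """Energy efficiency: work done per joule consumed (joules = watts * seconds)."""
    return ops_per_second / watts

def ops_per_litre(ops_per_second: float, volume_litres: float) -> float:
    """Volumetric efficiency: work done per litre of machine volume."""
    return ops_per_second / volume_litres

# Figures quoted in the article: Watson ~85 kW versus the brain's ~20 W.
watson_watts, brain_watts = 85_000.0, 20.0
print(f"Watson draws {watson_watts / brain_watts:.0f}x more power than a brain")

# Hypothetical machine for illustration: 1e15 ops/s, 50 kW, in a 1000-litre rack.
machine_ops = 1e15
print(f"{ops_per_joule(machine_ops, 50_000):.2e} ops per joule")
print(f"{ops_per_litre(machine_ops, 1000):.2e} ops per litre")
```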

As it stands, today’s computers consist of transistors and circuits laid out on flat boards that ensure plenty of contact with air that cools the chips. But as Bruno Michel – a biophysics professor and researcher in advanced thermal packaging for IBM Research – explains, this is a terribly inefficient use of space:

In a computer, processors occupy one-millionth of the volume. In a brain, it’s 40 percent. Our brain is a volumetric, dense, object.

In short, communication links between processing elements can’t keep up with data-transfer demands, and they consume too much power as well. The proposed solution is to stack and link chips into dense 3D configurations, a process which is impossible today because stacking even two chips means crippling overheating problems. That’s where the “liquid blood” comes in, at least as far as cooling is concerned.

This process is demonstrated with the company’s prototype system, called Aquasar. By threading the chips with a network of liquid cooling channels that funnel fluid into ever-smaller tubes, the chips can be stacked together in large configurations without overheating. The liquid passes not next to the chip, but through it, drawing away heat in the thousandth of a second it takes to make the trip.
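The basic thermal budget behind a design like this can be sketched with the standard heat-transfer relation Q = ṁ·c·ΔT. This is a generic textbook calculation with made-up flow numbers, not IBM’s published Aquasar figures, but it shows why a surprisingly small water flow can carry away a lot of heat:

```python
# How much heat a given water flow can remove: Q = mass_flow * c_p * delta_T.
# Numbers below are generic illustrations, not IBM's Aquasar specifications.

WATER_CP = 4186.0          # specific heat of water, J/(kg*K)

def heat_removed_watts(mass_flow_kg_s: float, delta_t_kelvin: float) -> float:
    """Heat carried away by water warming by delta_t as it flows through a chip stack."""
    return mass_flow_kg_s * WATER_CP * delta_t_kelvin

# 10 grams of water per second warming by 12 K on its way through the stack:
print(f"{heat_removed_watts(0.010, 12):.0f} W")   # ~500 W of heat carried away
```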

In addition, IBM is also developing a system called a redox flow battery that uses liquid to distribute power instead of wires. Two types of electrolyte fluid, each with oppositely charged electrical ions, circulate through the system to distribute power, much in the same way that the human body provides oxygen, nutrients, and cooling to the brain through the blood.

The electrolytes travel through ever-smaller tubes that are about 100 microns wide at their smallest – the width of a human hair – before handing off their power to conventional electrical wires. Flow batteries can produce between 0.5 and 3 volts, and that in turn means IBM can use the technology today to supply 1 watt of power for every square centimeter of a computer’s circuit board.

Already, the IBM Blue Gene supercomputer has been used for brain research by the Blue Brain Project at the Ecole Polytechnique Federale de Lausanne (EPFL) in Lausanne, Switzerland. Working with the HBP, the next step will be to augment a Blue Gene/Q with additional flash memory at the Swiss National Supercomputing Center.

After that, they will begin simulating the inner workings of the mouse brain, which consists of 70 million neurons. By the time they are conducting human brain simulations, they plan to be using an “exascale” machine – one that performs 1 exaflops, or a quintillion floating-point operations per second. This will take place at the Juelich Supercomputing Center in northern Germany.

This is no easy challenge, mainly because the brain is so complex. In addition to 100 billion neurons and 100 trillion synapses, there are 55 different varieties of neuron, and 3,000 ways they can interconnect. That complexity is multiplied by differences that appear with 600 different diseases, genetic variation from one person to the next, and changes that go along with the age and sex of humans.

As Henry Markram, an EPFL professor who has worked on the Blue Brain project for years, put it:

If you can’t experimentally map the brain, you have to predict it — the numbers of neurons, the types, where the proteins are located, how they’ll interact. We have to develop an entirely new science where we predict most of the stuff that cannot be measured.

With the Human Brain Project, researchers will use supercomputers to reproduce how brains form in a virtual vat. Then, they will see how they respond to input signals from simulated senses and a simulated nervous system. If it works, actual brain behavior should emerge from the fundamental framework inside the computer, and where it doesn’t work, scientists will know where their knowledge falls short.

The end result of all this will also be computers that are “neuromorphic” – capable of imitating human brains, thereby ushering in an age when machines will be able to truly think, reason, and make autonomous decisions. No more supercomputers that are tall on knowledge but short on understanding. The age of artificial intelligence will be upon us. And I think we all know what will follow, don’t we?

Yep, that’s what! And may God help us all!

Sources: news.cnet.com, extremetech.com

The Future is Creepy: Reading Consumers’ Brainwaves

Product marketing has always been a high-stakes game, where companies rely on psychology, competitive strategies, and well-honed ad campaigns to appeal to consumers’ instincts. This has never been an exact science, but it may soon be possible for advertisers to simply read your brainwaves to determine what you’re thinking and how much you’re willing to pay.

This past October, the German news site Spiegel Online profiled the provocative work of a Swiss neuroscientist and former sales consultant who is working on a method of measuring brain waves to determine how much a person would be willing to pay for a good or service. Known as “feel-good pricing” to marketing critics, the idea is already inspiring horror and intrigue.

The neuroscientist in question is Kai-Markus Müller, the head of Neuromarketing Labs who has over 10 years of experience in neuroscience research. According to his test, Starbucks is not actually charging enough for its expensive coffee. In fact, it’s probably leaving profits on the table because people would probably still buy it if they charged more.

To conduct this test, Müller targeted an area of the brain that lights up when things don’t really make sense. When test subjects were presented with the idea of paying 10 cents for a coffee, their brains reacted unconsciously because the price seemed too cheap. A coffee for $8, on the other hand, produced a similar reaction because the price seemed too high.

One would think that this method would help to determine optimum pricing. However, Müller then set up a coffee vending machine where people were allowed to set their own price. The two methods then matched up and revealed that people were willing to pay a higher price than what Starbucks actually charges. Somehow, paying less made people think they were selecting an inferior grade of product.
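The logic of the experiment can be captured in a toy sketch. The prices, “surprise” scores, and threshold below are entirely invented for illustration and have nothing to do with Müller’s actual measurements; the idea is simply to flag prices that trigger a “that can’t be right” reaction at either end and treat whatever remains as the acceptable band:

```python
# Toy "feel-good pricing" sketch: find the price band that triggers neither a
# "too cheap to be real" nor a "too expensive" surprise response.
# The surprise scores below are invented for illustration.

candidate_prices = [0.10, 1.00, 2.50, 4.00, 6.00, 8.00]
surprise_score = {0.10: 0.9, 1.00: 0.4, 2.50: 0.1, 4.00: 0.2, 6.00: 0.5, 8.00: 0.9}

SURPRISE_THRESHOLD = 0.3   # hypothetical cut-off for an "implausible price" reaction

acceptable = [p for p in candidate_prices if surprise_score[p] < SURPRISE_THRESHOLD]
print(f"Acceptable price band: ${min(acceptable):.2f} to ${max(acceptable):.2f}")
# -> Acceptable price band: $2.50 to $4.00
```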

Naturally, there are those who would be horrified by this idea, feeling that it represents the worst combination of Big Brother surveillance and invasive marketing. This is to be expected when any talk of “reading brainwaves” is concerned, dredging up images of a rampant consumer society where absolutely no privacy exists, even within the space of your own head.

On the other hand, Müller himself takes issue with the notion of the “transparent consumer”, claiming that “everyone wins with this method”. As proof, he cited the numerous flops in the consumer economy in the Spiegel Online article. Apparently, roughly 80 percent of all new products disappear from shelves after a short time, mainly because the producers have misjudged the market’s desire for them or what people are willing to pay.

It’s all part of a nascent concept known as Neuromarketing, and it is set to take to the market in the coming years. One can expect that consumers will have things to say about it, and no doubt those feelings will come through whenever and wherever producers try to sell you something. Personally, I am reminded of what Orwell wrote in 1984:

“Always the eyes watching you and the voice enveloping you. Asleep or awake, working or eating, indoors or out of doors, in the bath or in bed — no escape. Nothing was your own except the few cubic centimetres inside your skull.”

And perhaps more appropriately, I’m also reminded of what Fry said about advertising in the Season 1 episode of Futurama entitled “A Fishful of Dollars”:

“Leela: Didn’t you have ads in the 21st century?

Fry: Well sure, but not in our dreams. Only on TV and radio, and in magazines, and movies, and at ball games… and on buses and milk cartons and t-shirts, and bananas and written on the sky. But not in dreams, no siree.”

Somehow, truth is always stranger than fiction!

Sources: fastcoexist.com, spiegel.de, neuromarketing-labs.com

TBBT’s “Friendship Algorithm”

Recall that hilarious episode of The Big Bang Theory where Sheldon designed the friendship algorithm? Well, like much of what they do, the hilarity comes with its share of educational value. In fact, half of what makes the show so funny is the way they weave scientific fact into the story and nerd out on it! For those who actually get it, it’s doubly entertaining.

In this case, Sheldon’s characteristic appraisal of his situation reflected something very real and relatable about algorithms. Essentially, they are step-by-step procedures designed to solve problems. While they pertain to calculation, data processing, and automated reasoning, the concept is something we are already intimately familiar with.

Literally everyone uses algorithms in everyday decision making, thinking things out in advance and taking things into consideration to come up with alternate plans and reach the desired outcome. Treating it like a computer program, as Sheldon does below, is just an excessively nerdy way of going about it! Enjoy the video recap:
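And for anyone who wants to keep nerding out after watching, here is what a stripped-down version of that flowchart might look like as actual code. This is a loose, playful paraphrase of the episode’s gag, not a transcription of Sheldon’s whiteboard:

```python
from typing import Optional

# A playful, simplified take on the "friendship algorithm" as a procedure:
# place a call, propose an activity, and fall back to a default if nothing lands.

def friendship_algorithm(answers_phone: bool, accepted_activity: Optional[str]) -> str:
    """Return the outcome of one pass through a stripped-down flowchart."""
    if not answers_phone:
        return "Leave a message and try again later."
    if accepted_activity is None:
        # The episode's fix for the infinite "suggest another activity" loop:
        # escalate to the least objectionable default and move on.
        return "Default to the least objectionable activity: a hot beverage."
    return f"Begin friendship: meet for {accepted_activity}."

print(friendship_algorithm(True, "coffee"))   # Begin friendship: meet for coffee.
print(friendship_algorithm(True, None))       # Default to the least objectionable activity...
print(friendship_algorithm(False, None))      # Leave a message and try again later.
```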