Is the Universe One Big Hologram?

“You know how I can tell we’re not in the Matrix? If we were, the food would be better.” Thus spoke Sheldon Cooper, the socially-challenged nerd from The Big Bang Theory. And yet, there is actually a scientific theory which posits that the universe itself could be a 2D hologram, painted on some kind of cosmological horizon, that only pops into 3D whenever we observe it (aka. always).

And in what may be the most mind-boggling experiment ever, the US Department of Energy’s Fermi National Accelerator Laboratory (Fermilab) seeks to test this theory for the first time. Their tool for this is the Holometer, a device which has been under construction for a couple of years. It is now operating at full power and will gather data for the next year or so, seeking to uncover whether the universe is a hologram and, if so, what it is composed of.

The current prevailing theories about how the universe came to be are the Big Bang, the Standard Model of particle physics, quantum mechanics, and classical physics. These hypotheses and models don’t fully answer every question about how the universe came to be or why it continues to persist – which is why scientists are always investigating other ideas, such as supersymmetry or string theory.

The holographic universe principle is part of string theory – or at least not inconsistent with it – and goes something like this: From our zoomed out vantage point, the universe seems to be a perfectly formed enclave of 4D spacetime. But what happens if you keep zooming in, past the atomic and subatomic, until you get down to the smallest possible unit that can exist in the universe?

In explaining their theory, the scientists involved make much of the analogy of moving closer to an old-style TV until you can see the individual pixels. The holographic principle suggests that, if we zoom in far enough, we will eventually see the pixels of the universe. It’s theorized that these universal pixels are about 10 trillion trillion times smaller than an atom (at the scale where things are measured in Planck units).
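That “ten trillion trillion” figure is easy to sanity-check. Here is a quick back-of-envelope calculation in Python, using rough textbook values for an atom’s diameter and the Planck length (both assumptions for illustration, not figures from Fermilab):

```python
# Rough sanity check of the "10 trillion trillion times smaller than an
# atom" figure. Values are approximate, for illustration only.
ATOM_DIAMETER_M = 1e-10      # ~1 angstrom, typical atomic diameter
PLANCK_LENGTH_M = 1.616e-35  # CODATA value, rounded

ratio = ATOM_DIAMETER_M / PLANCK_LENGTH_M
print(f"Planck lengths per atom diameter: {ratio:.1e}")

# A "trillion trillion" is 1e24, so the claim is that the ratio is ~1e25:
print(f"In 'trillion trillions': {ratio / 1e24:.1f}")
```

The result comes out to a few trillion trillion, so the article’s figure is right to within an order of magnitude.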

The Holometer at Fermilab, which is on the hunt for these pixels of the universe, is essentially an incredibly accurate clock. It consists of a twin-laser interferometer, which – as the name suggests – extracts information from the universe by measuring interference to the laser beams. Each interferometer directs a one-kilowatt laser beam at a beam splitter and then down two 40-m (130-ft) arms located at right-angles to one another.
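The basic idea of measuring path-length differences through interference can be illustrated with a toy model. Below is a minimal Python sketch of an idealized two-arm interferometer; the 1064-nm wavelength and the `detector_intensity` helper are illustrative assumptions, not the Holometer’s actual optics:

```python
import math

def detector_intensity(delta_L_m, wavelength_m=1064e-9, I0=1.0):
    """Idealized two-arm interferometer: combined beam intensity as a
    function of the path-length difference between the arms.
    The wavelength is a placeholder, not Fermilab's laser spec."""
    phase = 2 * math.pi * delta_L_m / wavelength_m
    return I0 * math.cos(phase / 2) ** 2

# Equal arm lengths -> constructive interference (full brightness)
print(detector_intensity(0.0))          # 1.0
# Half-wavelength difference -> destructive interference (dark fringe)
print(detector_intensity(1064e-9 / 2))  # ~0.0
```

In the real instrument, it is tiny fluctuations around fringe patterns like this – after all known noise sources are subtracted – that would hint at holographic jitter.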

These beams are then reflected back towards the source, where they are combined and analyzed for any traces of interference. As Craig Hogan, the developer of the holographic noise theory and a director at Fermilab, explained:

We want to find out whether space-time is a quantum system just like matter is. If we see something, it will completely change ideas about space we’ve used for thousands of years.

After any outside influences are removed, any remaining fluctuations – measured by slightly different frequencies or arrival times – could be caused by the ever-so-slight quantum jitter of these universal pixels. If these universal pixels exist, then everything we see, feel, and experience in the universe is actually encoded in these 2D pixels. One major difficulty in such a test will be noise – aka. “Holographic noise” – which they expect to be present at all frequencies.

To mitigate this, the Holometer is testing at frequencies of many megahertz, at which, the team claims, motions of normal matter should not be a problem. The dominant background noise of radio wave interference will be the most difficult to filter out, according to the team. As Holometer lead scientist Aaron Chou explained:

If we find a noise we can’t get rid of, we might be detecting something fundamental about nature – a noise that is intrinsic to space-time.

This would have some serious repercussions. For a start, it would mean that spacetime itself is a quantum system, just like matter. The theory that the universe consists of matter and energy would be annulled, replaced with the concept that the universe is made of information encoded into these universal pixels, which in turn create the classical concepts of matter and energy.

And of course, if the universe is just a 3D projection from a 2D cosmological horizon, where exactly is that cosmological horizon? And does this mean that everything we know and love is just a collection of quantum information carrying 2D bits? And perhaps most importantly (from our point of view at least), what does that make us? Is all life just a collection of pixels designed to entertain some capricious audience?

All good and, if you think about it, incredibly time-honored questions. For has it not been suggested by many renowned philosophies that life is a deception, and death an escape? And do not the Hindu, Buddhist and Abrahamic religions tell us that our material existence is basically a facade that conceals our true reality? And were the ancient religions not all based on the idea that man was turned loose in a hostile world for the entertainment of the gods?

Well, it could be that the illusion is being broadcast in ultra-high definition! And getting back to The Big Bang Theory, here’s Leonard explaining the hologram principle to Penny, complete with holograms:


Sources:
extremetech.com, gizmag.com

The Future of Solar: The Space-Based Solar Farm

The nation of Japan has long been regarded as being at the forefront of emerging technology. And when it comes to solar energy, they are nothing if not far-sighted and innovative. Whereas most nations are looking at building ground-based solar farms in the next few years, the Japanese are looking at the construction of vast Lunar and space-based solar projects that would take place over the course of the next few decades.

The latest proposal comes from the Japan Aerospace Exploration Agency (JAXA), which recently unveiled a series of pilot projects that, if successful, should culminate in a 1-gigawatt space-based solar power generator within just 25 years. The design relies on two massive orbital mirrors articulated to dynamically bounce sunlight onto a solar panel-studded satellite; the energy harvested would then be beamed wirelessly to Earth using microwaves, collected Earth-side by rectifying antennas at sea, and then passed on to land.

JAXA has long been the world’s biggest booster of space-based solar power technology, making significant investments in research and rallying international support for early test projects. And in this respect, they are joined by private industries such as the Shimizu Corporation, a Japanese construction firm that recently proposed building a massive array of solar cells on the moon – aka. the “Lunar Ring” – that could beam up to 13,000 terawatts of power to Earth around the clock.

Considering that Japan has over 120 million residents packed onto an island that is roughly the size of Montana, this far-sighted tendency should not come as a surprise.  And even before the Fukushima disaster took place, Japan knew it needed to look to alternative sources of electricity if it was going to meet future demands. And considering the possibilities offered by space-based solar power, it should also come as no surprise that Japan – which has very few natural resources – would look skyward for the answer.

Beyond Japan, solar power is considered the front-runner of alternative energy, at least until fusion power comes of age. But until such time as a fusion reaction can be triggered that produces substantially more energy than is required to initiate it, solar will remain the only green technology that could even theoretically provide for our global power demands. And in this respect, going into space is seen as the only way of circumventing the problems associated with it.

Despite solar power being in incredible abundance – the Earth’s deserts absorb more energy in a day than the human race uses in an entire year – the issues of harnessing that power and getting it to where it is needed remain stumbling blocks. Setting up vast arrays in the Earth’s deserts would certainly deal with the former, but transmitting the power to the urban centers of the world (which are far removed from its deserts) would be both expensive and impractical.

Luckily, putting arrays into orbit solves both of these issues. Above the Earth’s atmosphere, they would avoid most forms of wear, the ground-based day/night cycle, and all occluding weather formations. And assuming the arrays themselves are able to reorient to be perpetually aimed at the sun (or have mirrors to reflect the light onto them), the more optimistic estimates say that a well-designed space array could bring in more than 40 times the energy of a conventional one.

The only remaining issue lies in beaming all that energy back to Earth. Though space-based arrays can easily collect more power above the atmosphere than below it, that fact becomes meaningless if the gain is immediately lost to inefficiency during transmission. For some time, lasers were assumed to be the best solution, but more recent studies point to microwaves as the most viable solution. While lasers can be effectively aimed, they quickly lose focus when traveling through atmosphere.

However, this and other plans involving space-based solar arrays (and a Space Elevator, for that matter) assume that certain advances will be made over the next 20 years or so – ranging from light-weight materials to increased solar efficiency. By far the biggest challenge though, or the one that looks to be giving the least ground to researchers, is power transmission. With an estimated final mass of 10,000 tonnes, a gigawatt space solar array will require significant work from other scientists to improve things like the cost-per-kilogram of launch to orbit.

It currently costs around $20,000 to place a kilogram (2.2lbs) into geostationary orbit (GSO), and about half that for low-Earth orbit (LEO). Luckily, a number of recent developments have been encouraging, such as SpaceX’s most recent tests of their Falcon 9R reusable rocket system or NASA’s proposed Reusable Launch Vehicle (RLV). These and similar proposals are due to bring the costs of sending materials into orbit down significantly – Elon Musk hopes to bring it down to $1100 per kilogram.
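Those per-kilogram figures make it clear why launch costs dominate the discussion. A rough Python calculation for the 10,000-tonne array mentioned above (illustrative arithmetic only):

```python
# Rough cost comparison for lofting a 10,000-tonne array to orbit, using
# the per-kilogram figures quoted in the article.
ARRAY_MASS_KG = 10_000 * 1_000   # 10,000 tonnes in kilograms
COST_GSO_TODAY = 20_000          # $/kg, current geostationary orbit price
COST_TARGET = 1_100              # $/kg, Musk's hoped-for figure

today = ARRAY_MASS_KG * COST_GSO_TODAY
target = ARRAY_MASS_KG * COST_TARGET
print(f"At today's prices: ${today / 1e9:.0f} billion")   # $200 billion
print(f"At $1,100/kg:      ${target / 1e9:.0f} billion")  # $11 billion
```

A nearly twenty-fold drop in launch cost is the difference between an impossible megaproject and an expensive-but-plausible one.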

So while much still needs to happen to make SBSP and other major undertakings a reality, the trends are encouraging, and few of their estimates for research timelines seem all that pie-eyed or optimistic anymore.

Sources: extremetech.com, (2)

The Future is Fusion: Surpassing the “Break-Even” Point

For decades, scientists have dreamed of the day when nuclear fusion – and the clean, nearly limitless energy it promises – could be made possible. And in recent years, many positive strides have been taken in that direction, to the point where scientists are now closing in on “break-even” – the point at which a fusion reaction produces as much energy as it takes to trigger that reaction in the first place.

And now, the world’s best fusion reactor – located in Oxfordshire, England – will become the first fusion power experiment to attempt to surpass that point. This experiment, known as the Joint European Torus (JET), has held the world record for fusion reactor efficiency since 1997. If JET can reach the break-even point, there’s a very good chance that the massive International Thermonuclear Experimental Reactor (ITER) currently being built in France will be able to finally achieve the dream of self-sustaining fusion.

Originally built in 1983, the JET project was conceived by the European Community (precursor to the EU) as a means of making fusion power a reality. After being unveiled the following year at a former Royal Navy airfield near Culham in Oxfordshire, with Queen Elizabeth II herself in attendance, experiments began on triggering a fusion reaction. By 1997, 16 megawatts of fusion power were produced from an input power of 24 megawatts, for a fusion energy gain factor of around 0.7.
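The gain factor quoted above is simply the ratio of fusion power out to heating power in. A trivial Python check, using the 1997 figures:

```python
# The fusion energy gain factor Q is power out divided by power in.
def gain_factor(p_out_mw, p_in_mw):
    return p_out_mw / p_in_mw

# JET's 1997 record run: 16 MW out from 24 MW in.
q_jet = gain_factor(16, 24)
print(f"JET 1997 gain factor Q = {q_jet:.2f}")  # ~0.67

# Break-even is Q = 1; a self-sustaining burn needs Q well above that.
print("Break-even reached:", q_jet >= 1.0)      # False
```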

Since that time, no one else has come close. The National Ignition Facility – the only other “large gain” fusion experiment on the planet, located in California – recently claimed to have surpassed the break-even point with their laser-powered process. However, these claims are apparently mitigated by the fact that their 500-terawatt process (that’s 500 trillion watts!) is highly inefficient when compared to what is being used in Europe.

Currently, there are two competing approaches for the artificial creation of nuclear fusion. Whereas the NIF uses “inertial confinement” – which uses lasers to create enough heat and pressure to trigger nuclear fusion – the JET project uses a process known as “magnetic confinement”. In this process, deuterium and tritium fuel are fused within a doughnut-shaped device (a tokamak), and the resulting thermal and electrical energy that is released provides power.

Of the two, magnetic confinement is usually considered a better prospect for the limitless production of clean energy, and it is the process the 500-megawatt ITER fusion reactor will use once it’s up and running. And while JET itself is a fairly low-power experiment (38 megawatts), it’s still very exciting because it’s essentially a small-scale prototype of the larger ITER. For instance, JET has been upgraded in the past few years with features that are part of the ITER design.

These include a wall of solid beryllium that can withstand being bombarded by ultra-high-energy neutrons and temperatures in excess of 200 million degrees. This is a key part of achieving a sustained fusion reaction, which requires that a wall is in place to bounce all the hot neutrons created by the fusion of deuterium and tritium back into the reaction, rather than letting them escape. With this new wall in place, the scientists at JET are preparing to pump up the reaction and pray that more energy is created.

Here’s hoping they are successful! As it stands, there are still many who feel that fusion is a pipe-dream, and not just because previous experiments that claimed success turned out to be hoaxes. With so much riding on humanity’s ability to find a clean, alternative energy source, the prospects of a breakthrough do seem like the stuff of dreams. I sincerely hope those dreams become a reality within my own lifetime…

Sources: extremetech.com, (2)

News from Space: Space Elevator by 2035!

Imagine if you will a long tether made of super-tensile materials, running 100,000 km from the Earth and reaching into geostationary orbit. Now imagine that this tether is a means of shipping people and supplies into orbit, forever removing the need for rockets and shuttles going into space. For decades, scientists and futurists have been dreaming about the day when a “Space Elevator” would be possible; and according to a recent study, it could become a reality by 2035.

The report, released by the International Academy of Astronautics (IAA), is a 350-page document that lays out a detailed case for a space elevator. At the center of the design is a tether that will reach beyond geostationary orbit, held taut by an anchor weighing roughly two million kilograms (2,204 tons). Sending payloads up this backbone could fundamentally change the human relationship with space, with the equivalent of a space launch happening almost daily.

The central argument of the paper — that we should build a space elevator as soon as possible — is supported by a detailed accounting of the challenges associated with doing so. The possible pay-off is simple: a space elevator could bring the cost-per-kilogram of launch to geostationary orbit from $20,000 down to as little as $500. Not only would it be useful for deploying satellites, it would also reach far enough up Earth’s gravity well to be used for long-range missions.

This could include the long-awaited mission to Mars, where a shuttle would push off from the top and then make multiple loops around the Earth before setting off for the Red Planet. This would cut huge fractions off the fuel budget, and would also make setting up a base on the Moon (or Mars) a relatively trivial affair. Currently, governments and corporations spend billions putting satellites into space, but a space elevator could pay for itself and ensure cheaper access down the line.

The report lays out a number of technological impediments to a space elevator, but by far the most important is the tether itself. Current materials science has yet to provide a material with the strength, flexibility, and density needed for its construction. Tethers from the EU and Japan are beginning to push the 100-kilometer mark, but that is still a long way off orbital altitude, and the materials used for existing tethers will not allow much additional length.

Projecting current research in carbon nanotubes and similar technologies, the IAA estimates that a pilot project could plausibly deliver packages to an altitude of 1000 kilometers (621 miles) as soon as 2025. With continued research and the help of a successful LEO (low Earth orbit, i.e. between 100 and 1200 miles) elevator, they predict a 100,000-kilometer (62,137-mile) successor will stretch well past geosynchronous orbit just a decade after that.

The proposed design is really quite simple, with a sea platform (or super-ship) anchoring the tether to the Earth while a counterweight sits at the other end, keeping the system taut through centrifugal force. For that anchor, the report argues that a nascent space elevator should be stabilized first with a big ball of garbage – one composed of retired satellites, space debris, and the cast-off machinery used to build the elevator’s own earliest stages.

To keep weight down for the climbers (the elevator cars), this report imagines them as metal skeletons strung with meshes of carbon nanotubes. Each car would use a two-stage power structure to ascend, likely beginning with power from ground- or satellite-based lasers, and then the climber’s own solar array. The IAA hopes for a seven-day climb from the base to GEO — slow, but still superior and far cheaper than the rockets that are used today.
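The seven-day figure implies a surprisingly modest speed. Here is a quick Python estimate, assuming a climb to geostationary altitude of roughly 35,786 km (a standard value, not one taken from the report):

```python
# Average climb speed implied by a seven-day trip to geostationary
# altitude -- back-of-envelope arithmetic only.
GEO_ALTITUDE_KM = 35_786  # standard GEO altitude above Earth's surface
TRIP_DAYS = 7

speed_kmh = GEO_ALTITUDE_KM / (TRIP_DAYS * 24)
print(f"Average climber speed: {speed_kmh:.0f} km/h")  # ~213 km/h
```

Around 213 km/h is highway-overtaking speed, which puts the engineering problem in perspective: the hard part is not velocity but sustaining the climb for a week straight.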

Space Elevator by gryphart-d42c7sp

One thing that is an absolute must, according to the report, is international cooperation. This is crucial not only for the sake of financing the elevator’s construction, but maintaining its neutrality. In terms of placement, IAA staunchly maintains that a space elevator would be too precious a resource to be built within the territory of any particular nation-state. Though every government would certainly love a space elevator of their very own, cost considerations will likely make that impossible in the near-term.

By virtue of its physical size, a space elevator will stretch through multiple conflicting legal zones, from the high seas to the “territorial sky” to the “international sky” to outer space itself, presenting numerous legal and political challenges. Attacks by terrorists or enemies in war are also a major concern, requiring that it be defended and monitored at all levels. And despite being a stateless project, it would require a state’s assets to maintain, likely by the UN or some new autonomous body.

In 2003, Arthur C. Clarke famously said that we will build a space elevator 10 years after they stop laughing. Though his timeline may have been off, as is often the case – for example, we didn’t have deep space missions or AIs by 2001 – his sentiments were bang on. The concept of a space elevator is taken seriously at NASA these days, as it eyes the concept as a potential solution for both shrinking budgets and growing public expectations.

Space is quickly becoming a bottleneck in the timeline of human technological advancement. From mega-telescopes and surveillance nets to space mining operations and global high-speed internet coverage, most of our biggest upcoming projects will require better access to space than our current methods can provide. And in addition to providing for that access, this plan highlights exactly how much further progress in space depends on global cooperation.

Source: extremetech.com

The Future of Medicine: New Blood-Monitoring Devices

Non-invasive medicine is currently one of the fastest growing industries in the world. Thanks to ongoing developments in the fields of nanofabrication, wireless communications, embedded electronics and microsensors, new means of monitoring our health are being created all the time that are both painless and hassle-free.

Consider diabetes, an epidemic that currently affects 8% of the population in the US and is growing worldwide. In October of 2013, some 347 million cases were identified by the World Health Organization, which also claims that diabetes will become the 7th leading cause of death by 2030. To make matters worse, the condition requires constant blood-monitoring, which is difficult in developing nations and a pain where the means exist.

Hence why medical researchers and companies are looking to create simpler, non-invasive means. Google is one such company, which back in January announced that they are working on a “smart” contact lens that can measure the amount of glucose in tears. By merging a mini glucose sensor and a small wireless chip into a set of regular soft contact lenses, they are looking to take all the pin-pricks out of blood monitoring.

In a recent post on Google’s official blog, project collaborators Brian Otis and Babak Parviz described the technology:

We’re testing prototypes that can generate a reading once per second. We’re also investigating the potential for this to serve as an early warning for the wearer, so we’re exploring integrating tiny LED lights that could light up to indicate that glucose levels have crossed above or below certain thresholds.
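The threshold-alert idea described in that quote can be sketched in a few lines. The following Python function is purely illustrative; the 70 and 180 mg/dL cut-offs are common clinical guideline values, not thresholds from Google’s project:

```python
def glucose_alert(reading_mg_dl, low=70, high=180):
    """Toy version of the threshold-alert idea: return which LED (if any)
    should light for a given glucose reading. Thresholds are common
    clinical guideline values, not figures from Google's lens."""
    if reading_mg_dl < low:
        return "low-glucose LED"
    if reading_mg_dl > high:
        return "high-glucose LED"
    return None  # in range: no alert

print(glucose_alert(55))   # low-glucose LED
print(glucose_alert(110))  # None
print(glucose_alert(240))  # high-glucose LED
```

With one reading per second, even this trivial logic would give a wearer near-real-time warning of a dangerous swing.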

And Google is hardly alone in this respect. Due to growing concern and the advancements being made, others are also looking at alternatives to the finger prick, including glucose measures from breath and saliva. A company called Freedom Meditech, for example, is working on a small device that can measure glucose levels with an eye scan.

Their invention is known as the I-SugarX, a handheld device that scans the aqueous humor of the eye; it yielded accurate results in clinical studies in less than four minutes. John F. Burd, Ph.D., Chief Science Officer of Freedom Meditech, described the process and its benefits in the following way:

The eye can be thought of as an optical window into the body for the painless measurement of glucose in the ocular fluid as opposed to the blood, and is well suited for our proprietary optical polarimetric based measurements. Based on the results of this, and other studies, we plan to begin human clinical studies as we continue our product development.

Between these and other developments, a major trend towards “smart monitoring” is developing and likely to make life easier and cut down on the associated costs of medicine. A smart contact lens or saliva monitor would make it significantly easier to watch out for uncontrolled blood sugar levels, which ultimately lead to serious health complications.

But of course, new techniques for blood-monitoring go far beyond addressing chronic conditions like diabetes. Diagnosing and controlling the spread of debilitating, potentially fatal diseases is another major area of focus. Much like with diabetes, doing regular bloodwork can be a bit difficult, especially when working in developing areas of the world where proper facilities can be hard to find.

But thanks to researchers at Rice University in Houston, Texas, a new test that requires no blood draws is in the works. Relying on laser pulse technology to create a vapor nanobubble in a malaria-infected cell, this test is able to quickly and non-invasively diagnose the disease. While it does not bring medical science closer to curing this increasingly drug-resistant disease, it could dramatically improve early diagnosis and outcomes.

The scanner was invented by Dmitro Lapotko, a physicist, astronomer, biochemist, and cellular biologist who studied laser weapons in Belarus before moving to Houston. Here, he and his colleagues began work on a device that used the same kind of laser and acoustic sensing technology employed on sub-hunting destroyers, only on a far smaller scale and for medical purposes.

Dubbed “vapor nanobubble technology,” the device combines a laser scanner and a fiber-optic probe that detect malaria by heating up hemozoin – the iron crystal byproduct of hemoglobin that is found in malaria-infected cells, but not in normal blood cells. Because the hemozoin crystals absorb the energy from the laser pulse, they heat up enough to create transient vapor nanobubbles that pop.

This, in turn, produces a ten-millionth-of-a-second acoustic signature that is then picked up by the device’s fiber-optic acoustic sensor and indicates the presence of the malaria parasite in the blood cells scanned. And because the vapor bubbles are only generated by hemozoin, which is only present in infected cells, the approach is virtually fool-proof.
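At its core, the detection logic reduces to “pulse the laser, listen for a pop.” Here is a deliberately simplified Python sketch of that idea; the signal values, noise floor, and `is_infected` helper are all invented for illustration and bear no relation to the real device’s signal processing:

```python
# Toy illustration of the detection logic: flag a cell as infected if the
# acoustic sensor picks up a nanobubble "pop" above the noise floor after
# the laser pulse. All values are invented for illustration.
NOISE_FLOOR = 0.2  # arbitrary units

def is_infected(acoustic_trace):
    """Return True if any sample in the trace exceeds the noise floor."""
    return max(acoustic_trace) > NOISE_FLOOR

healthy_cell = [0.01, 0.03, 0.02, 0.01]   # no hemozoin -> no pop
infected_cell = [0.02, 0.01, 0.95, 0.04]  # transient spike = pop

print(is_infected(healthy_cell))   # False
print(is_infected(infected_cell))  # True
```

Since only hemozoin produces the pop, even a crude threshold like this yields very few false positives in principle, which is what the mouse trial reported.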

In a recent issue of Proceedings of the National Academy of Sciences, Lapotko and his research team claimed that the device detected malaria in a preclinical trial on mice where only one red blood cell in a million was infected, with zero false positives. In a related school news release, the study’s co-author David Sullivan – a malaria clinician at Johns Hopkins University – had this to say about the new method:

The vapor nanobubble technology for malaria detection is distinct from all previous diagnostic approaches. The vapor nanobubble transdermal detection method adds a new dimension to malaria diagnostics, and it has the potential to support rapid, high-throughput and highly sensitive diagnosis and screening by nonmedical personnel under field conditions.

At present, malaria is one of the world’s deadliest diseases, infecting hundreds of millions of people a year and claiming the lives of more than 600,000. To make matters worse, most of the victims are children. All of this combines to make malaria one of the most devastating illnesses affecting the developing world, comparable only to HIV/AIDS.

With blood tests that could detect the parasite using nothing more than a portable device that makes the determination quickly – and needs only a portable car battery to power it – medical services could penetrate the once-thought impenetrable barriers imposed by geography and development. And this in turn would be a major step towards bringing some of the world’s most infectious diseases to heel.

Ultimately, the aim of non-invasive technology is to remove testing and diagnostic procedures from the laboratory and make them portable, cheaper, and more user-friendly. In so doing, they also ensure that early detection, which is often the difference between life and death, is far easier to achieve. It also helps to narrow the access gap between rich and poor, not to mention developed and developing nations.

Sources: fastcoexist.com, news.cnet.com, businesswire.com, googleblogspot.ca, who.int

Powered by the Sun: Efficiency Records and Future Trends

There have been many new developments in the field of solar technology lately, thanks to new waves of innovation and the ongoing drive to make the technology cheaper and more efficient. At the current rate of growth, solar power is predicted to become cheaper than natural gas by 2025. And with that, so many opportunities for clean energy and clean living will become available.

Though there are many contributing factors to this trend, much of the progress made of late is thanks to the discovery of graphene. This miracle material – which is ultra-thin, strong and light – has the ability to act as a supercapacitor, a battery, and an amazing conductor. And its use in the manufacture of solar panels is leading to record-breaking efficiency.

Back in 2012, researchers from the University of Florida reported a record efficiency of 8.6 percent for a prototype solar cell consisting of a wafer of silicon coated with a layer of graphene doped with trifluoromethanesulfonyl-amide (TFSA). And now, another team is claiming a new record efficiency of 15.6 percent for a graphene-based solar cell by ditching the silicon altogether.

And while 15.6 percent efficiency might still lag behind certain designs of conventional solar cells (for instance, the Boeing Spectrolabs mass-production design of 2010 achieved upwards of 40 percent), this represents a near-doubling for graphene cells. The reason graphene is favored in the production of cells is that, compared to silicon, it is far cheaper to produce.
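For the record, the jump from 8.6 to 15.6 percent works out as follows (simple arithmetic on the figures quoted above):

```python
# Relative improvement from the 2012 graphene-on-silicon cell to the new
# perovskite/graphene cell, using the efficiencies quoted in the article.
old_eff = 8.6   # percent, 2012 University of Florida prototype
new_eff = 15.6  # percent, new DFO/Oxford prototype

improvement = (new_eff - old_eff) / old_eff * 100
print(f"Relative improvement: {improvement:.0f}%")  # roughly an 81% jump
```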

Despite the improvements made in manufacturing and installation, silicon is still expensive to process into cells. This new prototype, created by researchers from the Group of Photovoltaic and Optoelectronic Devices (DFO) – located at Spain’s Universitat Jaume I Castelló and the University of Oxford – uses a combination of titanium oxide and graphene as a charge collector and perovskite to absorb sunlight.

As well as the impressive solar efficiency, the team says the device is manufactured at low temperatures, with the several layers that go into making it being processed at under 150° C (302° F) using a solution-based deposition technique. This not only means lower potential production costs, but also makes it possible for the technology to be used on flexible plastics.

What this means is a drop in costs all around, from production to installation, and the means to adapt the panel design to more surfaces. And considering the rate at which efficiency is being increased, it would not be rash to anticipate a range of graphene-based solar panels hitting the market in the near future – ones that can give conventional cells a run for their money!

However, another major stumbling block with solar power is weather, since it requires clear skies to be effective. For some time, the idea of getting the arrays into space has been proposed as a solution, which may finally be possible thanks to recent drops in the associated costs. In most cases, this consists of orbital arrays, but as noted late last year, there are more ambitious plans as well.

Take the Japanese company Shimizu and its proposed “Luna Ring” as an example. As noted earlier this month, Shimizu has proposed creating a solar array some 400 km (250 miles) wide and 11,000 km (6,800 miles) long that would beam solar energy directly to Earth. Being located on the Moon and wrapped around its entirety, this array would be able to take advantage of perennial exposure to sunlight.
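It is worth a back-of-envelope estimate of how much raw sunlight such a belt could intercept at any one moment. Only the strip facing the Sun collects light, so the projected collecting area is roughly the belt width times the lunar diameter. The Python sketch below uses rough illustrative values; actual delivered power would be far lower once cell efficiency and transmission losses are factored in:

```python
# Back-of-envelope: raw sunlight intercepted by a 400-km-wide belt
# wrapped around the Moon. Only the Sun-facing strip counts, so the
# projected area is ~ belt width x lunar diameter. Rough values only.
SOLAR_CONSTANT_W_M2 = 1361   # sunlight intensity at Earth/Moon distance
BELT_WIDTH_M = 400e3         # 400 km
LUNAR_DIAMETER_M = 3474e3    # ~3,474 km

projected_area_m2 = BELT_WIDTH_M * LUNAR_DIAMETER_M   # ~1.4e12 m^2
raw_power_tw = projected_area_m2 * SOLAR_CONSTANT_W_M2 / 1e12
print(f"Raw intercepted sunlight: ~{raw_power_tw:,.0f} TW")
```

Even this idealized figure, on the order of a couple of thousand terawatts of raw sunlight, dwarfs humanity’s total power consumption, which is why the concept keeps attracting serious attention despite its scale.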

Cables underneath the ring would gather power and transfer it to stations facing Earth, which would then beam the energy our way using microwaves and lasers. Shimizu believes the scheme, which it showed off at a recent exhibition in Japan, would virtually solve our energy crisis, so we never have to think about fossil fuels again.

They predict that the entire array could be built and operational by 2035. Is that too soon to hope for planetary energy independence? Given the progress being made by companies like SpaceX and NASA in bringing down the costs of getting into space, and the way the Moon is factoring into multiple space agencies’ plans for the coming decades, I would anticipate that such a project is truly feasible, if still speculative.

Combined with increases being made in the fields of wind turbines, tidal harnesses, and other renewable energy sources – i.e. geothermal and piezoelectric – the future of clean energy, clear skies and clean living can’t get here soon enough! And be sure to check out this video of the Luna Ring, courtesy of the Shimizu corporation:


Sources:
gizmodo.com, fastcoexist.com

The Phosforce: A Real-Life Lightsaber?

Sure, it may not be able to cut your hand off or deflect blaster shots, but this invention has geeks and engineers everywhere experiencing a collective fangasm! In honor of Star Wars Day this year, the company known as Wicked Lasers debuted the Phosforce, a laser-turned-flashlight with the power to both illuminate and incinerate. Now tell me that doesn’t get your adrenaline pumping and make you wonder if it comes in designer shades and in both the single and double-bladed form!

As the video below shows, the Phosforce is the company’s most powerful handheld laser married to a special lens. Attached, the lens turns the laser into a flashlight that produces some 500 lumens of power, making it the most powerful light on the market. Once removed, the device is back to being its usual, single-watt Spyder 3 Arctic laser, which is capable of projecting a beam up to a distance of 10 km and incinerating at close range – just balloons, in case you were worried.

And in case that’s not enough, the company also designs handles like the SABER, an attachment that turns their Arctic or Krypton laser handhelds into the most stunning approximation of a lightsaber available. Already, BMW is in talks with Wicked Lasers to use the technology to fashion laser headlamps for their cars. The laser goes for a hefty $299.95, while the lens is available for a comparatively modest $79.85.

Yeah, not the cheapest lightsaber replica on the market, but at least it comes in Arctic blue. Tell me that doesn’t bring the Jedi’s weapon to mind! And be sure to check out the video, it is sure to pop your eyes!