News from Space: New Map of the Universe Confirms The Big Bang!

After 15 months of observing deep space, scientists with the European Space Agency’s Planck mission have generated a massive heat map of the entire universe. The “heat map”, as it’s called, looks at the oldest light in the universe and then uses that data to extrapolate the universe’s age, the amount of matter held within, and the rate of its expansion. And as usual, what they’ve found was simultaneously reassuring and startling.

When we look at the universe through a thermal imaging system, what we see is a mottled light show caused by cosmic background radiation. This radiation is essentially the afterglow of the Universe’s birth, and is generally seen to be smooth and uniform. This new map, however, provides a glimpse of the tiny temperature fluctuations that were imprinted on the sky when the Universe was just 370,000 years old.

Since it takes light so long to travel from one end of the universe to the other, scientists can tell – using red shift and other methods – how old the light is, and hence get a glimpse at what the universe looked like when the light was first emitted. For example, if a galaxy several billion light years away appears to be dwarfish and misshapen by our standards, it’s an indication that this is what galaxies looked like several billion years ago, when they were in the process of formation.

Hence, like archaeologists sifting through sand to find fossil records of what happened in the past, scientists believe this map reveals a sort of fossil imprint left by the state of the universe just 10⁻³⁵ seconds (10 nano-nano-nano-nano seconds) after the Big Bang. The splotches in the Planck map represent the seeds from which the stars and galaxies formed. As is heat-map tradition, the reds and oranges signify warmer temperatures of the universe, while light and dark blues signify cooler temperatures.

The cooler temperatures came about because those were spots where matter was once concentrated, but with the help of gravity, collapsed to form galaxies and stars. Using the map, astronomers discovered that there is more matter clogging up the universe than we previously thought, at around 31.7%, while there’s less dark energy floating around, at around 68.3%. This shift in the matter-to-energy ratio also indicates that the universe is expanding more slowly than previously thought, which requires an update to its estimated age.

All told, the universe is now believed to be a healthy 13.82 billion years old. That wrinkles my brain! And also of interest is the fact that this would appear to confirm the Big Bang Theory. Though widely considered to be scientific canon, there are those who dispute this creation model of the universe and argue for more complex ideas, such as the “Steady State Theory” (otherwise known as the “Theory of Continuous Creation”).
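As a rough sanity check on that 13.82-billion-year figure, you can invert the Hubble constant to get the so-called “Hubble time”, a crude first estimate of the universe’s age. This is just a sketch, assuming a Hubble constant of about 67.8 km/s/Mpc (roughly the value the Planck team reported); the real age calculation also folds in the matter and dark-energy fractions.

```python
# Back-of-the-envelope "Hubble time" estimate of the universe's age.
# Assumes H0 ~ 67.8 km/s/Mpc (roughly the Planck value); the precise
# age also depends on the matter/dark-energy mix, so this is only a
# first approximation.

KM_PER_MPC = 3.0857e19       # kilometers in one megaparsec
SECONDS_PER_YEAR = 3.156e7   # seconds in one year

H0 = 67.8                         # Hubble constant, km/s/Mpc
H0_per_second = H0 / KM_PER_MPC   # convert to units of 1/s
hubble_time_years = 1 / H0_per_second / SECONDS_PER_YEAR

print(f"Hubble time: {hubble_time_years / 1e9:.1f} billion years")
```

That comes out in the same ballpark as the measured age, which is reassuring: the expansion rate and the age of the universe really are two sides of the same coin.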

In this scenario, the majority of matter in the universe was not created in a single event, but gradually by several smaller ones. What’s more, the universe will not inevitably contract back in on itself, leading to a “Big Crunch”, but will instead continue to expand until all the stars have either died out or become black holes. As Krzysztof Gorski, a member of the Planck team with JPL, put it:

This is a treasury of scientific data. We are very excited with the results. We find an early universe that is considerably less rigged and more random than other, more complex models. We think they’ll be facing a dead-end.

Martin White, a Planck project scientist with the University of California, Berkeley and the Lawrence Berkeley National Laboratory, explained further. According to White, the map shows how matter scattered throughout the universe with its associated gravity subtly bends and absorbs light, “making it wiggle to and fro.” As he went on to say:

The Planck map shows the impact of all matter back to the edge of the Universe. It’s not just a pretty picture. Our theories on how matter forms and how the Universe formed match spectacularly to this new data.

The Planck space probe, which launched in 2009 from the Guiana Space Center in French Guiana, is a European Space Agency mission with significant contribution from NASA. The two-ton spacecraft gathers the ancient glow of the Universe’s beginning from a vantage more than a million and a half kilometers from Earth. This is not the first map produced by Planck; in 2010, it created an all-sky radiation map from which scientists, using supercomputers, removed all interfering background light to get a clear view of the deep background of the stars.

However, this is the first time any satellite has been able to picture the background radiation of the universe at such high resolution. The variation in light captured by Planck’s instruments was less than 1/100 millionth of a degree, requiring extraordinarily sensitive equipment to detect the contrast. So whereas cosmic radiation has appeared uniform or with only slight variations in the past, scientists are now able to see even the slightest changes, which is crucial to their work.

So in summary, we have learned that the universe is a little older than previously expected, and that it most certainly was created in a single, chaotic event known as the Big Bang. Far from dispelling the greater mysteries, confirming these theories is really just the tip of the iceberg. There’s still the grandiose mystery of how all the fundamental laws such as gravity, nuclear forces and electromagnetism work together.

Ah, and let’s not forget the question of what transpires beneath the veil of an event horizon (aka. a Black Hole), and whether or not there is such a thing as a gateway in space and time. Finally, there’s the age-old question of whether or not intelligent life exists somewhere out there, or life of any kind. But given the infinite number of stars, planets and possibilities that the universe provides, it almost surely does!

And I suppose there’s also that persistent nagging question we all wonder when we look up at the stars. Will we ever be able to get out there and take a closer look? I for one like to think so, and that it’s just a matter of time!

To boldly go!

Sources: universetoday.com, (2), extremetech.com, bbc.co.uk

The Future of Space Exploration

Back in January, National Geographic Magazine celebrated its 125th anniversary. In honor of this occasion, they released a special issue which commemorated the past 125 years of human exploration and looked ahead at what the future might hold. As I sat in the doctor’s office, waiting on a prescription for antibiotics to combat my awful cold, I found myself terribly inspired by the article.

So naturally, once I got home, I looked up the article and its source material and got to work. The issue of exploration, especially the future thereof, is not something I can ever pass up! So for the next few minutes (or hours, depending on how much you like to nurse a read), I present you with some possible scenarios about the coming age of deep space exploration.

Suffice it to say, National Geographic’s appraisal of the future of space travel was informative and hit on all the right subjects for me. When one considers the sheer distances involved, not to mention the amount of time, energy, and resources it would take to get people there, reaching into the next great frontier poses a great many questions and challenges.

Already, NASA, Earth’s various space agencies and even private companies have several ideas in the works for returning to the Moon, going to Mars, and reaching the Asteroid Belt. These include the SLS (Space Launch System), a heavy-lift successor to the Saturn V rocket which took the Apollo astronauts to the Moon. Years from now, it may even be taking crews to Mars, a mission slated for the 2030s.

And when it comes to settling the Moon and Mars, and turning the Asteroid Belt into our primary source of mineral extraction and manufacturing, these same agencies and a number of private corporations are all invested in getting it done. SpaceX is busy testing its reusable launch rocket, known as the Grasshopper, in the hopes of making space flight more affordable. And NASA and the ESA are perfecting a process known as “sintering” to turn Moon regolith into bases and asteroids into manufactured goods.

Meanwhile, Virgin Galactic, Reaction Engines and Golden Spike are planning to make commercial trips into space and to the Moon possible within a few years’ time. And with companies like Deep Space Industries and Google-backed Planetary Resources prospecting asteroids and planning expeditions, it’s only a matter of time before everything from Earth to the Jovian system is being explored and claimed for human use.

Space Colony by Stephan Martiniere

But when it comes to deep-space exploration, the stuff that would take us to the outer reaches of the Solar System and beyond, that’s where things get tricky and pretty speculative. Ideas have been on the table for some time, since the last great Space Race forced scientists to consider the long-term and come up with proposed ways of closing the gap between Earth and the stars. But to this day, they remain a scholarly footnote, conceptual and not yet realizable.

But as we embark on a renewed era of space exploration, where the stuff of science fiction is quickly becoming the stuff of science fact, these old ideas are being dusted off, paired up with newer concepts, and seriously considered. While they might not be feasible at the moment, who knows what tomorrow holds? From the issues of propulsion, to housing, to cost and time expenditures, the human race is once again taking a serious look at extra-Solar exploration.

And here are some of the top contenders for the “Final Frontier”:

Nuclear Propulsion:
The concept of using nuclear bombs (no joke) to propel a spacecraft was first proposed in 1946 by Stanislaw Ulam, a Polish-American mathematician who participated in the Manhattan Project. Preliminary calculations were then made by F. Reines and Ulam in 1947, and the actual project – known as Project Orion – was initiated in 1958 and led by Ted Taylor at General Atomics and physicist Freeman Dyson from the Institute for Advanced Study in Princeton.

In short, the Orion design involves a large spacecraft carrying a large supply of thermonuclear warheads, achieving propulsion by releasing a bomb behind it and then riding the detonation wave with the help of a rear-mounted pad called a “pusher”. After each blast, the explosive force is absorbed by this pusher pad, which then translates the thrust into forward momentum.

Though hardly elegant by modern standards, the proposed design offered a way of delivering the explosive (literally!) force necessary to propel a rocket over extreme distances, and solved the issue of how to utilize that force without containing it within the rocket itself. However, the drawbacks of this design are numerous and noticeable.

For starters, the ship itself is rather staggering in size, weighing in anywhere from 2,000 to 8,000,000 tonnes, and the propulsion design releases a dangerous amount of radiation, and not just for the crew! If we are to rely on ships that utilize nuclear bombs to achieve thrust, we had better find a course that will take them away from any inhabited or habitable areas. What’s more, the cost of producing a behemoth of this size (even the modest 2,000-tonne version) is also staggering.

Antimatter Engine:
Most science fiction authors who write about deep space exploration (at least those who want to be taken seriously) rely on anti-matter to power ships in their stories. This is no accident, since antimatter is the most potent fuel known to humanity right now. While tons of chemical fuel would be needed to propel a human mission to Mars, just tens of milligrams of antimatter, if properly harnessed, would be able to supply the requisite energy.

Fission and fusion reactions convert just a fraction of 1 percent of their mass into energy. But by combining matter with antimatter, its mirror twin, a reaction of 100 percent efficiency is achieved. For years, physicists at the CERN Laboratory in Geneva have been creating tiny quantities of antimatter by smashing subatomic particles together at near-light speeds. Given time and considerable investment, it is entirely possible this could be turned into a form of advanced propulsion.

In an antimatter rocket, a dose of antihydrogen would be mixed with an equal amount of hydrogen in a combustion chamber. The mutual annihilation of a half pound of each, for instance, would unleash more energy than a 10-megaton hydrogen bomb, along with a shower of subatomic particles called pions and muons. These particles, confined within a magnetic nozzle similar to the type necessary for a fission rocket, would fly out the back at one-third the speed of light.
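That half-pound comparison can be checked with a quick E = mc² calculation. This is just a sketch using the figures quoted above (half a pound of each reactant, a 10-megaton benchmark); the unit conversions are standard physical constants.

```python
# E = mc^2 check: annihilating half a pound of antihydrogen with half a
# pound of hydrogen converts both masses entirely to energy.

C = 2.998e8                       # speed of light, m/s
HALF_POUND_KG = 0.2268            # half a pound in kilograms
TNT_JOULES_PER_MEGATON = 4.184e15 # energy in one megaton of TNT

total_mass = 2 * HALF_POUND_KG    # matter plus antimatter
energy_joules = total_mass * C**2
megatons = energy_joules / TNT_JOULES_PER_MEGATON

print(f"Energy released: {energy_joules:.2e} J (~{megatons:.1f} megatons)")
```

The answer lands right around ten megatons, so the comparison above is in the right ballpark, if slightly generous.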

However, there are natural drawbacks to this design as well. While a top speed of one-third the speed of light is very impressive, there’s the question of how much fuel would be needed. For example, while it would be nice to be able to reach Alpha Centauri – a mere 4.5 light years away – in 13.5 years instead of the 130 it would take using a nuclear rocket, the amount of antimatter needed would be immense.
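That 13.5-year figure is just distance over speed, taking the 4.5-light-year distance and one-third-light-speed cruise quoted above at face value (and ignoring the time spent accelerating and decelerating):

```python
# Travel time = distance / speed, using the figures quoted in the text.
# Distances in light years and speeds in fractions of c make the
# arithmetic trivial: a light year per year IS the speed of light.

DISTANCE_LY = 4.5            # Alpha Centauri, as quoted above
SPEED_FRACTION_OF_C = 1 / 3  # antimatter rocket's cruising speed

years = DISTANCE_LY / SPEED_FRACTION_OF_C
print(f"Trip time at c/3: {years:.1f} years")
```

A real mission profile would be somewhat longer, since the ship can’t be at cruising speed for the whole trip.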

No means exist to produce antimatter in such quantities right now, and the cost of building the kind of rocket required would be equally immense. Considerable refinements would therefore be needed and a sharp drop in the cost associated with building such a vessel before any of its kind could be deployed.

Laser Sail:
Thinking beyond rockets and engines, there are some concepts which would allow a spaceship to go into deep space without the need for fuel at all. Robert Forward put a twist on the ancient technique of sailing – capturing wind in a fabric sail – to propose a new form of space travel. Much like how our world is permeated by wind currents, space is filled with cosmic radiation – largely in the form of photons and energy associated with stars – that can push a cosmic sail in the same way.

This was followed up in the 1970s, when Forward proposed beam-powered propulsion schemes using either lasers or masers (microwave lasers) to push giant sails to a significant fraction of the speed of light. When photons in the laser beam strike the sail, they would transfer their momentum and push the sail onward. The spaceship would then steadily build up speed while the laser that propels it stays put in our solar system.
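The momentum transfer involved is tiny, which is why the lasers must run continuously for years. For a perfectly reflective sail, the thrust is twice the beam power divided by the speed of light; the 1-gigawatt beam below is purely an illustrative figure of my own, not a number from Forward’s proposals.

```python
# Radiation-pressure thrust on a perfectly reflective sail: F = 2P / c.
# The factor of 2 comes from photons bouncing back off the sail rather
# than being absorbed. Beam power here is an assumed, illustrative value.

C = 2.998e8          # speed of light, m/s
BEAM_POWER_W = 1e9   # an illustrative 1-gigawatt laser

force_newtons = 2 * BEAM_POWER_W / C
print(f"Thrust from a 1 GW beam: {force_newtons:.1f} N")
```

A few newtons of thrust – roughly the weight of an apple – from a full gigawatt of laser power: useful acceleration only accumulates because it acts on a light sail for years on end.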

Much the same process would be used to slow the sail down as it neared its destination. This would be done by having the outer portion of the sail detach, which would then refocus and reflect the lasers back onto a smaller, inner sail. This would provide braking thrust to slow the ship down as it reached the target star system, eventually bringing it to a slow enough speed that it could achieve orbit around one of its planets.

Once more, there are challenges, foremost of which is cost. While the sail itself, which could be built around a central, crew-carrying vessel, would be fuel-free, there’s the little matter of the lasers needed to propel it. Not only would these need to operate continuously for years at gigawatt strength, the cost of building such a monster would be astronomical, no pun intended!

A solution proposed by Forward was to use a series of enormous solar panel arrays on or near the planet Mercury. However, this just replaced one financial burden with another, as the mirror or Fresnel lens needed to keep the lasers focused on the sail would have to be planet-sized in scope. What’s more, a giant braking sail would have to be mounted on the ship as well, and it would have to focus the deceleration beam very precisely.

So while solar sails do present a highly feasible means of sending people to Mars or the Inner Solar System, they are not the best concept for interstellar space travel. While the design accomplishes certain cost-saving measures with its ability to reach high speeds without fuel, these are more than recouped by the power demands and apparatus needed to keep it moving.

Generation/Cryo-Ship:
Here we have a concept which has been explored extensively in fiction. Known as an Interstellar Ark, an O’Neill Cylinder, a Bernal Sphere, or a Stanford Torus, the basic philosophy is to create a ship that would be a self-contained world, which would travel the cosmos at a slow pace and keep the crew housed, fed, and sustained until they finally reached their destination. And one of the main reasons that this concept appears so much in science fiction literature is that many of the writers who made use of it were themselves scientists.

The first known written examples include Robert H. Goddard’s “The Last Migration” in 1918, where he describes an “interstellar ark” containing cryogenically frozen people that set out for another star system after the sun died. Konstantin E. Tsiolkovsky later wrote of “Noah’s Ark” in his essay “The Future of Earth and Mankind” in 1928. Here, the crews were kept in wakeful conditions until they reached their destination thousands of years later.

By the latter half of the 20th century, with works like Robert A. Heinlein’s Orphans of the Sky, Arthur C. Clarke’s Rendezvous with Rama and Ursula K. Le Guin’s Paradises Lost, the concept began to be explored as a distant possibility for interstellar space travel. And in 1964, Dr. Robert Enzmann proposed a concept for an interstellar spacecraft known as the Enzmann Starship that included detailed notes on how it would be constructed.

Powered by deuterium engines similar to those called for with the Orion spacecraft, Enzmann’s ship would measure some 600 meters (2,000 feet) long and would support an initial crew of 200 people with room for expansion. An entirely serious proposal, with a detailed assessment of how it would be constructed, the Enzmann concept began appearing in a number of science fiction and fact magazines by the 1970s.

Despite the fact that this sort of ship frees its makers from the burden of coming up with a sufficiently fast or fuel-efficient engine design, it comes with its own share of problems. First and foremost, there’s the cost of building such a behemoth. Slow boat or no, the financial and resource burden of building a mobile, self-contained world is beyond most countries’ annual GDP. Only through sheer desperation and global cooperation could anyone conceive of building such a thing.

Second, there’s the issue of the crew’s needs, which would require self-sustaining systems to ensure food, water, energy, and sanitation over a very long haul. This would almost certainly require that the crew remain aware of all its technical needs and continue to maintain it, generation after generation. And given that the people aboard the ship would be stuck in a comparatively confined space for so long, there’s the extreme likelihood of breakdown and degenerating conditions aboard.

Third, there’s the fact that the radiation environment of deep space is very different from that on the Earth’s surface or in low earth orbit. The presence of high-energy cosmic rays would pose all kinds of health risks to a crew traveling through deep space, so the effects and preventative measures would be difficult to anticipate. And last, there’s the possibility that while the slow boat is taking centuries to get through space, another, better means of space travel will be invented.

Faster-Than-Light (FTL) Travel:
Last, we have the most popular concept to come out of science fiction, but which has received very little support from the scientific community. Whether it was the warp drive, the hyperdrive, the jump drive, or the subspace drive, science fiction has sought to exploit the holes in our knowledge of the universe and its physical laws in order to speculate that one day, it might be possible to bridge the vast distances between star systems.

However, there are numerous science-based challenges to this notion that make an FTL enthusiast want to give up before they even get started. For one, there’s Einstein’s Theory of Special Relativity, which establishes the speed of light (c) as the uppermost speed at which anything can travel. For massless particles like photons, which do not experience time, the speed of light is a given. But for stable matter, which has mass and is affected by time, the speed of light is a physical impossibility.

For one, the amount of energy needed to accelerate an object to such speeds is unfathomable, and the effects of time dilation – time slowing down as one approaches the speed of light – would be extreme. What’s more, achieving the speed of light would most likely cause our stable matter (i.e. our ships and bodies) to fly apart and become pure energy. In essence, we’d die!
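Both the time-dilation and energy problems trace back to the Lorentz factor, which blows up as speed approaches c. Here is a minimal sketch of the standard formula, with a few sample speeds chosen for illustration:

```python
# Lorentz factor: gamma = 1 / sqrt(1 - (v/c)^2).
# Time aboard a ship runs slower than Earth time by this factor, and
# relativistic kinetic energy scales with (gamma - 1), so the energy
# needed to keep accelerating diverges as v approaches c.
import math

def lorentz_factor(beta):
    """beta is the speed expressed as a fraction of the speed of light."""
    return 1 / math.sqrt(1 - beta**2)

for beta in (0.5, 0.9, 0.99, 0.999):
    print(f"v = {beta:.3f}c -> gamma = {lorentz_factor(beta):.2f}")
```

The divergence is steep: each extra fraction of a percent of light speed costs disproportionately more energy, which is why “just accelerate harder” never gets you to c.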

Naturally, there have been those who have tried to use the basis of General Relativity, which allows for the existence of wormholes, to postulate that it would be possible to instantaneously move from one point in the universe to another. These theories for “folding space”, or “jumping” through space-time, suffer from the same problem. Not only are they purely speculative, but they raise all kinds of questions about temporal mechanics and causality. If these wormholes are portals, why just portals in space and not time?

And then there’s the concept of a quantum singularity, which is often featured in talk of FTL. The belief here is that an artificial singularity could be generated, thus opening a corridor in space-time which could then be traversed. The main problem here is that such an idea is likely suicide. A quantum singularity, aka. a black hole, is a point in space where the laws of nature break down – hence the term singularity.

Also, they are created by a gravitational force so strong that it tears a hole in space-time, and the resulting hole absorbs all things, including light itself, into its maw. It is therefore impossible to know what resides on the other side of one, and astronomers routinely observe black holes (most notably Sagittarius A* at the center of our galaxy) swallowing matter and belching out X-rays, evidence of its destruction. How anyone could think these were a means of safe space travel is beyond me! But then again, they are a plot device, not a serious idea…

But before you go thinking that I’m dismissing FTL in its entirety, there is one possibility which has the scientific community buzzing and even looking into it. It’s known as the Alcubierre Drive, a concept which was proposed by physicist Miguel Alcubierre in his 1994 paper, “The Warp Drive: Hyper-Fast Travel Within General Relativity.”

The equations and theory behind his concept postulate that since space-time can be contracted and expanded, empty space behind a starship could be made to expand rapidly, pushing the craft in a forward direction. Passengers would perceive it as movement despite the complete lack of local acceleration, and vast distances (i.e. light years) could be crossed in a matter of days and weeks instead of decades. What’s more, this “warp drive” would allow for FTL while remaining consistent with Einstein’s theory of Relativity.

In October 2011, physicist Harold White attempted to rework the equations while in Florida, where he was helping to kick off NASA and DARPA’s joint 100 Year Starship project. While putting together his presentation on warp drives, he began toying with Alcubierre’s field equations and came to the conclusion that something truly workable was there. In October of 2012, he announced that he and his NASA team would be working towards its realization.

But while White himself claims it’s feasible, and has the support of NASA behind him, the mechanics behind it all are still theoretical, and White himself admits that the energy required to pull off this kind of “warping” of space-time is beyond our means at the current time. Clearly, more time and development are needed before anything of this nature can be realized. Fingers crossed that the field equations hold, because that will mean it is at least theoretically possible!


Summary:
In case it hasn’t been made manifestly obvious by now, there’s no simple solution. In fact, just about all possibilities currently under scrutiny suffer from the exact same problem: the means just don’t exist yet to make them happen. But even if we can’t reach for the stars, that shouldn’t deter us from reaching for objects that are significantly closer. In the many decades it will take us to settle the Moon, Mars, the Asteroid Belt, and Jupiter’s Moons, we are likely to revisit this problem many times over.

And I’m sure that in the course of creating off-world colonies, reducing the burden on planet Earth, developing solar power and other alternative fuels, and basically working towards this thing known as the Technological Singularity, we’re likely to find that we are capable of far more than we ever thought before. After all, what are money, resources, or energy requirements when you can harness quantum energy, mine asteroids, and turn AIs and augmented minds onto the problem of solving field equations?

Yeah, take it from me, the odds are pretty much even that we will be making it to the stars in the not-too-distant future, one way or another. As far as probabilities go, there’s virtually no chance that we will be confined to this rock forever. Either we will branch out to colonize new planets and new star systems, or go extinct before we ever get the chance. I for one find that encouraging… and deeply disturbing!

Sources: ngm.nationalgeographic.com, nasa.gov, discoverymagazine.com, en.wikipedia.org, 100yss.org

Candidates for De-Extinction

It’s no secret that humanity’s success on this planet we call Earth has come at a high cost. Since our ancestors began migrating out of Africa some 70,000 years ago, their passage and settlement have left marks on the natural environment and its species. In short, our ability to grow has always meant extinction for other species, be they other forms of high-order primates (such as Neanderthals) or animals hunted for their pelts and meat (such as woolly mammoths).

In fact, the Neolithic Revolution, which began some 15,000 years ago with the adoption of farming, was believed to have been motivated by the mass extinction of animals that were once hunted by our ancestors. And since that time, countless more species have been pushed to the brink or killed off entirely by our ever-expanding, consuming, and polluting ways. However, recent innovations in biology and bio-medicine might just be able to reverse this trend.

Last Friday, at a National Geographic-sponsored TEDx conference, scientists met in Washington, D.C. to discuss which animals we should bring back from extinction, as well as the means and ethics involved in doing so. They called it “de-extinction”, and considered which species they would restore to existence. The conference resulted in a list of 24 species that were selected for restoration, as well as some guidelines for the selection process.

Those chosen were based on the following criteria and future selections will be determined the same way:

  1. Are the species desirable — do they hold an important ecological function or are they beloved by humans?
  2. Are the species practical choices — do we have access to tissue that could give us good quality DNA samples or germ cells to reproduce the species?
  3. And are they able to be reintroduced to the wild — are the habitats in which they live available and do we know why they went extinct in the first place?

As you might imagine, dinosaurs didn’t make the cut. In addition to no longer serving an important ecological function, the habitats they once had access to are long gone (Earth’s climate and ecology have changed drastically since the Cretaceous Period), and most importantly, we no longer have access to their DNA.

Yes, despite what Michael Crichton told us, the DNA of dinosaur fossils is so far degraded that something like Jurassic Park would never be possible. And of course, despite being beloved by humans, they aren’t exactly safe customers to have around! But rest assured, the list of candidates is still very long.

Of the 24 species selected, the majority were families of birds which were pushed to extinction by hunting, deforestation, urban sprawl, pollution, and loss of habitat. In addition, there’s the famous Aurochs, a species of cattle that is commemorated in myth but which actually existed until 1627. And then there’s the equally famous Dodo bird, the fearless bird which was rendered extinct by Dutch settlers in its native Mauritius.

And then there’s the venerable Woolly Mammoth, the great shaggy member of the Elephantidae family which went extinct some 4,000 years ago. Not only is this animal’s demise directly associated with humanity’s ascendance to the top of the food chain, it is something which may now be entirely reversible. Thanks to frozen, preserved carcasses of Mammoths, which are still found in the north to this day, scientists have access to well-preserved strands of their DNA.

And as already noted, the issue of cost, ethics and desirability featured pretty prominently in the conference. For starters, those present had to consider whether or not it would be a good idea to bring animals back from the brink seeing as how it was human agency that led to their extinction in the first place. Is the world any better off than it was hundreds or even thousands of years ago? Would these animals find new purchase, or just end up dying off again?

Second, there was the question of housing them and reintroducing them into the wild. Not only is it a question of them being able to find habitats again, it’s a question of whether or not we can ensure the kind of transition that would be needed. Sure, we’d all love to see Sabre-Tooth tigers alive and well again, but it’s not like we can just clone them and send them back out into the wild. Who’s to say how their reintroduction would impact species that are currently roaming about in the wild?

And of course, there was the consideration of what all this tampering amounts to. Given that human agency is responsible for all this loss of life, would resurrecting them simply be more of the same? Would we be, in effect, playing God and tampering with forces best left to nature? All good questions, and they force us to consider an alternative proposition.

Perhaps what would be best for the natural world and its remaining species would be for us to stop behaving so irresponsibly. Perhaps we should focus on sustainable living, cleaning up pollution, ending climate change, and getting our own population under control before we start trying to repopulate other species. Still, it is an intriguing possibility, and provides some reassurance that no matter how much damage we end up doing, we might be able to undo some of it after the fact. Perhaps we just need to wait…

Too bad about Jurassic Park though. In the course of everything else discussed at this TED conference, I’m sure that the announcement that dinosaurs were as good as gone shattered the dreams of many an eccentric billionaire!


Sources: businessinsider.com, nationalgeographic.com

Manned Mars Mission Update!

Millionaire and space enthusiast Dennis Tito surprised the world with his announcement that he plans to fund a couple's expedition to Mars. The trip is planned to take place in 2018, during a favorable alignment of Earth and Mars; it will take 501 days and will involve sending a married couple in a capsule roughly the size of a Winnebago. But as time goes on, more news is trickling out of the "Inspiration Mars" program, and some of it is raising eyebrows.

For example, there's the news that the Mars capsule will involve a rather interesting form of radiation shielding… made of feces. You read that right: the capsule will contain shielding composed of human feces (among other things) that will protect the couple inside from harmful cosmic radiation. But before anyone begins visualizing some ugly, creepy concoction, let me assure you that this concept is not as unusual as it sounds.

When it comes right down to it, cosmic radiation is the greatest health threat the people who go will face, followed shortly thereafter by muscle atrophy, boredom and cramped conditions. And rather than line the capsule with expensive and heavy metals such as lead, the engineers designing the Inspiration Mars capsule thought they might kill two birds with one stone.

Taber MacCallum, co-founder and CEO of the Paragon Space Development Corporation and a member of the Inspiration Mars team, explained that the idea had to do with waste recycling and storage. Since the couple will be eating, drinking and defecating within the capsule for a full 501 days, the waste has to go somewhere.

The proposed solution? Put it in the walls, along with food and liquid waste, and then desiccate it all to recycle the water. Or, as MacCallum put it:

It’s a little queasy sounding, but there’s no place for that material to go, and it makes great radiation shielding… Dehydrate them as much as possible, because we need to get the water back. Those solid waste products get put into a bag, put right back against the wall.

But to be fair, this proposal is not exactly new. In fact, the idea was mentioned back in 2011 by Michael Flynn, a life support engineer at NASA Ames Research Center, who proposed using urine and feces to shield space stations. Packing for Mars author Mary Roach also mentioned it in a 2011 edition of The Geek's Guide to the Galaxy. NASA's Innovative Advanced Concepts program is also working out the nuts and bolts of this concept under the name of "Water Walls Architecture".

Water, MacCallum explained, is the key ingredient here, since it serves as a better radiation shield than metal. It's the nuclei of atoms that block the radiation, and water contains more atoms (and therefore more nuclei) per volume than metal does. Food and waste also provide good radiation shielding, and because the food blocks rather than absorbs the radiation, it will remain safe to eat.
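You can sanity-check the "more nuclei per volume" claim with a back-of-the-envelope calculation. This is just a sketch of the counting argument (it only tallies atoms per cubic centimeter; real shielding effectiveness also depends on things like hydrogen content and particle energy):

```python
# Atoms per cm^3 = (density / molar mass) * Avogadro * atoms per formula unit
AVOGADRO = 6.022e23

def atoms_per_cm3(density_g_cm3, molar_mass_g_mol, atoms_per_unit):
    """Rough count of atomic nuclei per cubic centimeter of a material."""
    return density_g_cm3 / molar_mass_g_mol * AVOGADRO * atoms_per_unit

water = atoms_per_cm3(1.00, 18.02, 3)    # H2O: three atoms per molecule
lead  = atoms_per_cm3(11.34, 207.2, 1)   # Pb: one atom per formula unit

print(f"water: {water:.2e} atoms/cm^3")  # ~1.0e23
print(f"lead:  {lead:.2e} atoms/cm^3")   # ~3.3e22
```

Despite lead being eleven times denser, water packs roughly three times as many nuclei into the same volume, which is the crux of MacCallum's point.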

Naturally, MacCallum was sure to note that they are still working out some of the logistical problems. For one, they still need to figure out how best to keep the Mars-bound couple from experiencing too many nasty sights and smells on their journey.

Gotta admit, this isn't something you think about when you hear the words "space travel", is it? But then again, you have to account for things like this. Until people can survive without consuming food and water and expelling waste, long-term space missions will have to figure out what to do about all the dirty, ugly business people get into!

Sources: newscientist.com, IO9

Powered by the Sun: Microbead Solar Cells

Despite how far solar cells have come in recent years, issues like production and installation costs have remained an ongoing obstacle to their full-scale adoption. But as they say, obstacles are meant to be overcome, and they often produce very interesting solutions. For example, peel-and-stick solar panels that can be manufactured by a 3D printer are one option. Another is the recent creation of a solar cell as thin as a strand of hair. And as it happens, a third has just been unveiled.

This latest one comes to us from the University of Oslo, where researchers have come up with a way to produce silicon solar cells that are twenty times thinner than commercial ones. Typically, solar cells are fashioned out of 200-micrometer-thick (0.2 mm) wafers of silicon, which, given their average rate of power generation, works out to about five grams of silicon per watt of solar power. Combined with all the silicon wasted in the production process, this makes for a very inefficient use of material.

One way around this is to reduce the thickness of solar wafers, but this presents its own problems. As the wafer gets thinner, more light passes straight through the silicon, dramatically reducing the amount of electricity produced by the photovoltaic effect. Blue light, which has a short wavelength, can be absorbed by a very thin solar cell; but red light, which has longer wavelengths, can only be captured by thicker wafers.

Enter the breakthrough from the Oslo researchers: a technique involving microbeads – tiny plastic spheres that create an almost perfect periodic pattern on the silicon. Apparently, these beads force the sunlight to "move sideways," ensuring a more uniform and powerful rate of absorption. Another trick is to dot the backs of each cell with asymmetric microindentations, which can trap even more solar energy.

Using these techniques, silicon wafers can be created that measure a mere 10 micrometers in thickness but can do the job of a 200-micrometer cell. By using 95% less silicon, the cost of production drops considerably, which will reduce the cost of solar power installations and – more importantly – increase profits. With current production methods and costs, the profit margin associated with solar power is pretty negligible.
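The numbers above hang together nicely. Assuming the silicon used scales linearly with wafer thickness (a simplification, since it ignores sawing losses), the article's figures work out like this:

```python
# Rough silicon-per-watt estimate from the article's figures,
# assuming silicon use scales linearly with wafer thickness.
grams_per_watt_200um = 5.0       # conventional 200-micrometer wafer
thickness_ratio = 10 / 200       # new 10-micrometer wafer, same power output
grams_per_watt_10um = grams_per_watt_200um * thickness_ratio
savings = 1 - thickness_ratio

print(f"{grams_per_watt_10um:.2f} g of silicon per watt")  # 0.25 g/W
print(f"{savings:.0%} less silicon")                       # 95% less
```

A drop from five grams to a quarter of a gram of silicon per watt is exactly the 95% reduction the researchers claim.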

This latter aspect is especially important when it comes to commercial production. If we are to expect industries to adopt solar power for their energy needs, it has to be worth their while. At the moment, the Oslo researchers are in talks with industrial partners to investigate whether these methods can be scaled up to industrial production. But given the nature of their work, they seem quite confident that their technology could come to market within five to seven years.

Stay tuned for more installments in the PBTS series!

Source: extremetech.com

The Future is Here: Signal and Camera Bike Helmets

Today's bikers have a wide assortment of gear to choose from: everything from sport bikes with front shocks to streamlined, ergonomic body suits. And yet, bikers are still reliant on hand gestures to let traffic know what they are doing. Well, thanks to Hungarian designer Balázs Filczer's new concept, hand signals may soon be a thing of the past.

Known as the Dora, this bike helmet incorporates turn signals and a brake light into a futuristic design. The helmet's signals are activated via Bluetooth through a series of controls attached to the handlebars. The design concept was first pitched at the International Bicycle Design Competition in October 2012, where it took home the award in its category of clothing and accessories.

Though still in the conceptual phase, this product shows a great deal of promise because of the way it would allow bikers to speak the language of car drivers. In many cases, bicyclists are injured because their signalling is not interpreted correctly by the drivers around them. But should this helmet fail, perhaps because the driver in question is a total jerk, there's always this next concept to fall back on:

bike_helmet2It’s called the Helmet of Justice, a concept created by John Poindexter and Texas-based mobile studio Chaotic Moon. After being himself involved in a hit-and-run accident, he committed himself to creating a bike helmet that would allow bikers to even the odds against inconsiderate drivers who commit accidents and then flee the scene.

Admittedly, bike helmets are a little outside Chaotic Moon’s repertoire. Ordinarily, the company is known for creating mobile products for big name clients like Fox, Microsoft, and Disney. Still, the company was dedicated to a design that incorporated seven recording devices along with a software solution to easily let people upload data.

The seven mini-cameras, secured in the helmet's air vents with a layer of foam, record video at 30 frames per second with a resolution of 720×480, providing a 360-degree view of an accident as it happens. All of the footage is saved to a detachable USB drive integrated into the helmet, which the rider can upload to their computer after a near-brush with vehicular homicide.
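Those specs imply a surprising amount of data. The article doesn't say what video codec the helmet uses, but a worst-case estimate with uncompressed 24-bit color (my assumption, purely for scale) shows why some form of on-board compression is a must before anything hits that USB drive:

```python
# Worst-case uncompressed data rate for the helmet's cameras.
# Resolution, frame rate and camera count are from the article;
# 24-bit RGB with no compression is an illustrative assumption.
width, height, fps, cameras = 720, 480, 30, 7

bytes_per_frame = width * height * 3   # 3 bytes per pixel (24-bit RGB)
per_camera = bytes_per_frame * fps     # bytes per second, one camera
total = per_camera * cameras           # all seven cameras combined

print(f"per camera: {per_camera / 1e6:.1f} MB/s")  # ~31.1 MB/s
print(f"all seven:  {total / 1e6:.1f} MB/s")       # ~217.7 MB/s
```

At well over 200 MB/s raw, even a short ride would swamp any portable drive, so the real product would have to encode aggressively, buffer only the last few minutes, or both.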

Chaotic Moon is now in talks with major helmet manufacturers about licensing the product, and the price for an individual helmet is estimated to be around $300. Not cheap, but considering that it could lead to a successful lawsuit, you might have to offset the cost by the average out of court settlement 😉

Sources: fastcoexist.com, (2)

The Future is Here: Batteries for Stretchable Implants

One of the newest and greatest developments in medical technology of late has been the creation of electronics that can stretch and flex. Increasingly, scientists are developing flexible electronics like video displays and solar panels that could make their way into clothing or even bodies. But of course, some challenges remain, specifically in how to power these devices.

Thus far, researchers have been able to develop batteries that are thin and bendable, but stretchability has proven more of a challenge. In addition, no stretchable battery has thus far offered rechargeability with the kind of high storage capacity that one might expect from the lithium-ion technology now powering many smartphones, tablets, laptops and other mobile devices.

However, that may be changing thanks to two research scientists – Yonggang Huang from Northwestern University and John A. Rogers from the University of Illinois. Together, they have unveiled a rechargeable lithium-ion battery that can be stretched, twisted and bent, and is still capable of powering electronics. What's more, the power and voltage of this battery are similar to a conventional lithium-ion battery's, and it can be used anywhere, including inside the human body.

Whereas previous batteries of its type had a hard time stretching up to 100 percent of their original size, this new design is capable of stretching up to 300 percent. Huang and Rogers have indicated that this will make it ideal for powering implantable electronics that are designed for monitoring brain waves or heart activity. What’s more, it can be recharged wirelessly and has been tested up to 20 cycles of recharging with little loss in capacity.

For their stretchable electronic circuits, the two developed an array of tiny circuit elements connected by metal wire "pop-up bridges." Typically, this approach works for circuits but not for a stretchable battery, where components must be packed tightly to produce a powerful enough current. Huang's design solution is to use metal wire interconnects that are long, wavy lines, filling the small space between battery components.

In a paper published on Feb. 26, 2013 in the online journal Nature Communications, Huang described the process of creating their new design:

“We start with a lot of battery components side by side in a very small space, and we connect them with tightly packed, long wavy lines. These wires provide the flexibility. When we stretch the battery, the wavy interconnecting lines unfurl, much like yarn unspooling. And we can stretch the device a great deal and still have a working battery.”

No telling when the first stretchable electronic implant will be available for commercial use, but now that the battery issue is worked out, it's only a matter of time before hospitals and patient care services are placing them in patients to monitor their health and vitals. Combined with the latest in personal computing and wireless technology, I also imagine everyone will be able to keep a database of their health which they can share with their doctor's office.

And be sure to check out the video of the new battery in action:

Source: neurogadget.com

Reconstructing the Earliest Languages

Remember that scene in Prometheus when David, the ship's AI, was studying ancient languages in the hopes of being able to speak to the Engineers? The logic was that since the Engineers were believed to have visited Earth many millennia ago to tamper with human evolution, they were also responsible for our earliest known languages. In David's case, this meant reconstructing the ancient tongue known as Proto-Indo-European.

Given the fact that my wife is a linguistics major, and that I love all things ancient and historical, I found the concept pretty intriguing – even if it was a little Ancient Astronauts-y. To think that we could trace words and meanings back through endless iterations to determine what the earliest language recognized by linguists sounded like. Given how many tongues it has "parented", it would be cool to meet the common ancestor.

And now there is a piece of software that can do just that. Thanks to a group of linguists and computer scientists in the US and Canada, this program has shown the ability to analyze enormous groups of languages to reconstruct the earliest human languages, long before there was writing. By using this program and others like it, linguists may one day know how people sounded when they talked 20,000 years ago.

Alexandre Bouchard-Côté, a University of British Columbia statistician, began working on the program when he was a graduate student at UC Berkeley. By using algorithms to compare sounds and cognates across hundreds of different modern languages, he found he could predict which language groups were most related to each other. Basically, a sound that remained the same across distantly-related languages most likely existed early in our linguistic evolutionary tree.
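To get a feel for the comparative idea, here is a deliberately tiny sketch. The real system uses probabilistic models of sound change over a phylogenetic tree; this toy version just takes a majority vote at each aligned position of a cognate set, and the example words are illustrative rather than taken from the paper:

```python
from collections import Counter

def reconstruct(cognates):
    """Crude proto-form guess: for each aligned sound position, take the
    majority sound across the daughter languages. A stand-in for the
    probabilistic sound-change models the actual software uses."""
    proto = []
    for sounds in zip(*cognates):               # iterate position by position
        proto.append(Counter(sounds).most_common(1)[0][0])
    return "".join(proto)

# Hypothetical aligned cognates for one word in four daughter languages;
# the odd one out has undergone a t -> k sound change.
cognates = ["mata", "mata", "maka", "mata"]
print(reconstruct(cognates))  # -> "mata"
```

The intuition matches Bouchard-Côté's observation: a sound preserved across most of the distantly related languages is the best candidate for the ancestral one, while the minority variant is read as a later innovation.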

Modern linguists speculate that the earliest languages that led to today's tongues include Proto-Indo-European, Proto-Afroasiatic and Proto-Austronesian. These are the ancestral language families that gave rise to languages like Celtic, Germanic, Italic and Slavic; Arabic, Hebrew, Cushitic and Somali; and Samoan, Tahitian and Maori. Though by no means the only language family trees (they do not account for Sub-Saharan Africa or the pre-Columbian Americas, for example), they do encompass the majority of languages spoken today.

For their purposes, Bouchard-Côté and his colleagues focused on Proto-Austronesian, the family which led to today's Polynesian languages as well as languages in Southeast Asia and parts of continental Asia. Using the software they developed, they were able to reconstruct ancient protoforms from data on over 600 Austronesian languages, and published their findings in the December issue of Proceedings of the National Academy of Sciences.

In their paper, Bouchard-Côté and his colleagues said this of their new program:

“The analysis of the properties of hundreds of ancient languages performed by this system goes far beyond the capabilities of any previous automated system and would require significant amounts of manual effort by linguists.”

Ultimately, this program could allow linguists to hear languages that haven’t been spoken in millennia, reconstructing a lost world where those languages spread across the world, evolving as they went. In addition, it could be used for linguistic futurism, anticipating how languages may evolve over time and surmising what people will speak and sound like hundreds or even thousands of years from now.

Personally, I think the ability to look back and know what our ancestors sounded like is the real prize, but I'd be a poor sci-fi nerd if I didn't at least fantasize about what our language patterns will sound like down the road. Lord knows it's been speculated about plenty of times thus far, with thoughts ranging from Galach (a Slavic-English hybrid from Dune) to the Chinese-English smattering used in Firefly and the City Speak of Blade Runner.

Hey, remember this little gem? Bonus points to anyone who can translate it for me (without consulting Google Translate!):

Monsieur, azonnal kövessen engem, bitte! Lófaszt! Nehogy már! Te vagy a Blade, Blade Runner! Captain Bryant toka. Meni-o mae-yo.

Sources: IO9, pnas.org

Reducing Energy Use Through AI

Interesting fact: household energy consumption accounts for about a third of an individual's carbon footprint. You know, the energy that powers your water heater, lighting, thermostat, stove, refrigerator, A/C, television, personal devices, computer… Yes, all that. As long as our current methods of generating energy cause carbon emissions, environmental problems will persist.

But of course, there are plenty of things we could be doing to curb our use of power at the same time. Turning off the lights, shutting down unused devices, turning down the heat; all good energy-saving habits. And if we forget, perhaps a kindly voice could remind us. Say… an artificial intelligence with an eerily polite voice that monitors our energy usage and tells us how to do better.

AI'sThat’s the idea being explored by Nigel Goddard, a professor at the University of Edinburgh’s School of Informatics who is trying to solve consumption problems by using cutting-edge AI techniques. In the multi-year IDEAL project that will be launching in 2013, Goddard and his colleagues will outfit hundreds of British homes with sensors that monitor temperature, humidity, and light levels, as well as gas and electricity use, and wirelessly report their readings.

The concept used here is known as “machine learning”, a branch of AI that involves the development of systems that can learn from data and anticipate behaviors. Once Goddard and his team have used this technique to process all the data returned by their sensors, they will rely on another cutting-edge technology – known as natural language synthesis – to generate automatic text messages that give people feedback about their energy use.
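The feedback step is easy to picture even without the machine-learning half. The IDEAL project's actual pipeline isn't public in detail, so this is just a toy rule-based sketch (the thresholds, function name, and message wording are all invented for illustration):

```python
def energy_feedback(readings):
    """Toy feedback generator: turn a day's sensor readings into a short
    advisory text. A stand-in for IDEAL's machine-learning plus
    natural-language-synthesis pipeline; thresholds are invented."""
    tips = []
    if readings.get("heating_kwh", 0) > 15:
        tips.append("Your heating ran above your average today; turning the "
                    "thermostat down one degree could cut that noticeably.")
    if readings.get("standby_kwh", 0) > 1:
        tips.append("Devices on standby drew over 1 kWh; switching them off "
                    "at the wall would save most of it.")
    return " ".join(tips) or "Nice work - usage was within your normal range."

print(energy_feedback({"heating_kwh": 18, "standby_kwh": 0.4}))
```

The real system would learn each household's "normal" from the sensor data rather than using fixed thresholds, but the output end looks much the same: specific, actionable text tied to actual readings.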

The goal is not just to reduce people's carbon footprint, but to save them money as well. At least that's the approach Goddard and his team are taking when it comes to their automated texts. Naturally, the amount of money saved will vary with household size and income, among other factors. But Goddard and his team anticipate that the inclusion of these sensors in people's homes will save them 20% on their utility costs across the board.

Taken in conjunction with numerous developments in the fields of clean energy, touchscreen displays and solar power, a utility-monitoring computer program could be just what the doctor ordered for every futuristic home. Provided, of course, you don't mind taking instruction from a friendly AI…

Maybe now would be a good time to institute the Three Laws of Robotics!

Source: fastcoexist.com

Hacker Wars: The Invasion Continues!

State-sponsored hacking has been a major concern lately. From Russia's "Red October" virus, which spied on embassies and diplomats in multiple countries, to China's ongoing intrusion into government and corporate databases in the US, it seems as though private hackers are no longer the only ones we need to worry about.

The latest incident in this invasion of privacy and airing of personal information comes again from Russia, where a mysterious website has been posting personal information about some rather high-profile American figures. These include First Lady Michelle Obama, Vice-President Joe Biden, Jay-Z, Britney Spears, U.S. Attorney General Eric Holder, Sarah Palin, Arnold Schwarzenegger, and the head of the FBI.

In addition to taunting messages and unflattering pictures, the site includes Social Security numbers, credit reports, addresses and phone numbers. No reasons are listed on the site as to why these particular people were selected, but it seems clear at this point that they were chosen due to their high-profile nature and/or positions of importance within the US government. As of last Tuesday, both the FBI and Secret Service announced that they were investigating the website.

Though it is not definitively clear where the hackers are operating from, all indications point to Russia. The first clue came when it was revealed that the site bore the internet suffix originally assigned to the Soviet Union (.su), a practice which is not uncommon among Russian hackers these days. In addition, it is also connected to a Twitter account, which carried an anti-police message posted in Russian.

At the moment, neither the White House nor the Secret Service is offering assessments or comments on the matter. But some thoughts have been offered by Los Angeles Police Commander Andrew Smith, who spoke on behalf of Chief Charlie Beck, whose information was also posted. According to Beck, this is not the first time that top police officials have had their private information posted online:

“People get mad at us, go on the Internet and try to find information about us, and post it all on one site. The best word I can use to describe it is creepy. It’s a creepy thing to do.”

Frank Preciado, assistant officer in charge of the LAPD's online division, added that the information on the police chief was likely taken from what is supposed to be a secure database of city employees. And that might just offer some insight into this latest, sweeping act of info-piracy. When all is said and done, it appears that this may simply be a case of a small but qualified group of misfits engaging in public mischief.

However, of greater concern is the fact that with this latest act of high-profile hacking, a trend that citizens were forewarned about may be coming true. In December of 2012, internet security company McAfee warned of an impending attack by Russian hackers against American banks. Dubbed "Project Blitzkrieg", the threatened attack had surfaced on a Russian hacking forum the previous September, and McAfee was quick to advise that it was a credible one.

As of December 2012, Russian hackers had effectively infected 500 databases in the US with the promise of more to come. The cybercriminal known as vorVzakone – whose name means ‘thief in law’ – was identified as the head of the operation, whose plans called for the release of a Trojan horse virus that would allow him and his accomplices to seize control of banks’ computers to steal information and money.


Clearly, all of these incidents amount to a major public concern. But of greater concern to me is the fact the lines being drawn in this new era of cyber-warfare are eerily familiar. Not long ago, China and Russia were locked in an ongoing feud with the US and its allies, a war fueled by ideology but based on the cultivation of technology and espionage networks.

Granted, only China's campaign of cyberwarfare against the US appears to be government-backed. But between the "Red October" virus, "Project Blitzkrieg", and the fact that Russian hackers are in the habit of using a Soviet-era suffix to designate their activities, it seems that Russia is fertile ground for a renewed standoff with the West as well. And given that the targets have been western governments and financial institutions, would it be so farfetched to assume the government might be marginally involved?

The means may have changed, but the overall purpose remains the same. Infiltrate, destabilize, and steal information from the enemy. Are we looking at a renewed Cold War, or just the last gasps of an ideological confrontation that was supposed to have died years ago? Only time will tell…

Sources: cbc.ca, dailymail.co.uk