The Future is Here… and Badass: The Electric Harley

Harley-Davidson enjoys a rather unique place in American culture, one characterized by images of the open road, a sense of freedom, and the sound of a deep, earthy growl. Which is why, when the company released a teaser video earlier this month showcasing Project LiveWire, many people were understandably nervous. After all, electric vehicles seem just fine when it comes to the Toyota Prius, the Nissan Leaf, or anything in the Tesla catalog. But this is Harley-Davidson, right?

In this case, the challenge arises from the fact that electric engines are usually silent. For a Harley Hog or a Chopper – or any other classic bike that screams Hell's Angels, leather jackets and anarchy – the engine note is an iconic part of the brand. The engineers also didn't want to fake the roar of a combustion engine. Instead, they carefully tweaked the arrangement of the motor and the gearbox until it created a sound that's a little like a jet flying by.

Jeff Richlen, the chief engineer for the new prototype bike, explained:

When we went into this, we had to consider all of our products are grounded in three things–look, sound, and feel. The sound is the most important, and we didn’t want to lose that. We didn’t want a silent product… The first time we spun up the gears and ran the motorcycle we knew we had something special. It really was defining another sound of Harley Davidson. We’re certainly not forgetting our past and what is our product legacy, it’s just something brand new. And it kind of sounds like the future.

When addressing the reason for the project, Richlen admitted that the company's main motivation wasn't improving the sustainability of its bikes, even though motorcycles produce more tailpipe emissions than cars. Rather, he claims that the company is looking to the possibilities of the future, and electric engines are at the forefront of that. And while cars are well represented in that space, the potential for motorbikes remains largely unexplored. Going green was merely a byproduct.

In the teaser video, things open up on historic Route 66. A Harley drives by, only it doesn't sound like a Harley. It's quieter, more like the jet engine of a very small plane. Over the summer, Harley-Davidson will take the new LiveWire bike on a 30-city tour of the U.S. to get customer feedback. Richlen has extended an invitation to anyone who doubts the power of the bike to come on out and try it for themselves. The real test, he says, is in the twist of the throttle:

There are some limitations of the EV space right now, and we understand that, and that’s why we’re looking for feedback–what do customers expect out of the product, what would their tradeoff points be? There may be people who get on this thinking ‘golf cart’ and get off it thinking rocket ship.

So if you happen to live in a city where the Harley tour is stopping, and have a love of bikes that borders on the erotic, go check it out. And be sure to check out the teaser video below:


Source:
fastcoexist.com, cnet.com

News From Space: Rosetta Starts, Orion in the Wings

Quick Note: This is my 1700th post! Yaaaaaay, happy dance!

Space exploration is a booming industry these days. Between NASA, the ESA, Roscosmos, the CSA, and the federal space agencies of India and China, there's no shortage of exciting missions aimed at improving our understanding of our Solar System or the universe at large. In recent months, two such missions have been making the news; one (led by the ESA) is now underway, while the other (belonging to NASA) is fast approaching.

In the first instance, we have the ESA's Rosetta spacecraft, which is currently on its way to rendezvous with comet 67P/Churyumov-Gerasimenko. After awaking from a 957-day hibernation back in January, it has just conducted its first instrument observations. Among these instruments are three NASA science packages, all of which have started sending science data back to Earth.

Since leaving Earth in March 2004, the Rosetta spacecraft has traveled more than 6 billion km (3.7 billion miles) in an attempt to become the first spacecraft to successfully rendezvous with a comet. It is presently nearing the main asteroid belt between Mars and Jupiter – some 500,000 km (300,000 miles) from its destination. Between now and August, it will execute a series of 10 orbit-correction maneuvers to line itself up for the rendezvous with 67P, which will take place on August 6th.

Rosetta will then continue to follow the comet around the Sun as it moves back out toward the orbit of Jupiter. By November of 2014, Rosetta will launch its Philae lander to the comet, which will provide the first analysis of a comet's composition by drilling directly into the surface. This will give scientists the first-ever interior view of a comet, and a window into what the early Solar System looked like.

The three NASA instruments are MIRO, Alice, and IES. MIRO (the Microwave Instrument for Rosetta Orbiter) comes in two parts. The microwave section is designed to measure the comet's surface temperatures, providing information on the mechanisms that cause gas and dust to pull away from the surface and form the coma and tail. The other part, a spectrometer, will measure the gaseous coma for water, carbon monoxide, ammonia, and methanol.

Alice (not an acronym, just a nickname) is a UV spectrometer designed to determine the gases present in the comet and gauge its history. It will also be used to measure the rate at which the comet releases water, CO and CO2, which will provide details of the composition of the comet’s nucleus. IES (or Ion and Electron Sensor) is one of five plasma analyzing instruments that make up the Rosetta Plasma Consortium (RPC) suite. This instrument will measure the charged particles as the comet draws nearer to the sun and the solar wind increases.

Named in honor of the Rosetta Stone – the basalt slab that helped linguists crack ancient Egyptian – Rosetta is expected to provide the most detailed information yet about what comets look like up close (as well as inside and out). Similarly, the lander, Philae, is named after the island in the Nile where the stone was discovered. Together, they will help scientists shed light on the early history of our Solar System by examining one of its oldest inhabitants.

Next up, there’s the next-generation Orion spacecraft, which NASA plans to use to send astronauts to Mars (and beyond) in the not too distant future. And with its launch date (Dec. 4th, 2014) approaching fast, NASA scientists have set out what they hope to learn from its maiden launch. The test flight, dubbed EFT-1 is the first of three proving missions set to trial many of the in-flight systems essential to the success of any manned mission to Mars, or the outer Solar System.

EFT-1 will take the form of an unmanned test flight, with the Orion spacecraft controlled entirely by a flight control team at NASA's Kennedy Space Center in Florida. One vital component to be tested is the Launch Abort System (LAS), in essence a fail-safe required to protect astronauts should anything go wrong during the initial launch phase. Designed to encapsulate the crew module in the event of a failure on the launch pad, the LAS thrusters will fire and carry Orion away from danger.

Orion’s computer systems – which are 400 times faster than those used aboard the space shuttle and can process 480 million instructions per second – will also be tested throughout the flight. They must likewise demonstrate the ability to survive the radiation and extreme cold of deep space, followed by the fiery conditions of re-entry – conditions of particular concern given the prolonged radiation exposure a crew would face on a deep-space mission.
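Those two figures can be cross-checked with a bit of arithmetic. Taken together, they imply the shuttle's computers ran at roughly 1.2 million instructions per second (a quick sketch, using only the numbers quoted above):

```python
# Back-of-the-envelope check of the quoted Orion computing figures
# (480 million instructions/sec and the 400x speedup, as quoted).
orion_ips = 480_000_000   # Orion's quoted processing rate
speedup = 400             # quoted speedup over the shuttle's computers

shuttle_ips = orion_ips / speedup
print(f"Implied shuttle rate: {shuttle_ips / 1e6:.1f} million instructions/sec")
# → Implied shuttle rate: 1.2 million instructions/sec
```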

While all systems aboard Orion will be put through extreme conditions during EFT-1, none will be tested as stringently as those required for re-entry. The entire proving mission is designed around duplicating the kinds of pressures that a manned mission to Mars would have to endure on its return to Earth, so the performance of these systems will naturally be the result most eagerly anticipated by the NASA scientists waiting at the Kennedy Space Center.

Hence Orion’s heat shield, a new design composed of a 41 mm (1.6-inch) thick slab of Avcoat ablator, the same material that protected the crews of Apollo-era missions. As re-entry speeds are expected to exceed 32,187 km/h (20,000 mph), this shield must protect the crew from temperatures of around 2,204 ºC (4,000 ºF). Upon contact with the atmosphere, the heat shield is designed to slowly ablate, drawing the intense heat of re-entry away from the crew module in the process.
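For readers who want to verify the unit conversions quoted above, a short sketch (only figures from the text are used):

```python
# Verify the speed and temperature conversions quoted for Orion's re-entry.
KM_PER_MILE = 1.609344  # exact international mile

speed_mph = 20_000
speed_kmh = speed_mph * KM_PER_MILE    # ≈ 32,187 km/h, as quoted

temp_f = 4_000
temp_c = (temp_f - 32) * 5 / 9         # ≈ 2,204 °C, as quoted

print(f"{round(speed_kmh)} km/h, {round(temp_c)} °C")
# → 32187 km/h, 2204 °C
```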

The final aspect of EFT-1 will be the observation of the parachute deployment system. Assuming the LAS has successfully jettisoned from the crew module following launch, the majority of Orion's stopping power will be provided by the deployment of two drogue parachutes, followed shortly thereafter by three enormous primary parachutes, with the combined effect of slowing the spacecraft to roughly 1/1000th of its initial re-entry speed.
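Applied to the re-entry speed quoted earlier, that 1/1000th figure works out to a splashdown speed of around 32 km/h (20 mph) – a sketch using only the numbers in the text:

```python
# The parachutes slow Orion to roughly 1/1000th of its re-entry speed
# (re-entry speed as quoted in the text).
reentry_speed_kmh = 32_187
slowdown_factor = 1_000

splashdown_kmh = reentry_speed_kmh / slowdown_factor
splashdown_mph = splashdown_kmh / 1.609344
print(f"≈ {splashdown_kmh:.1f} km/h ({splashdown_mph:.0f} mph) at splashdown")
# → ≈ 32.2 km/h (20 mph) at splashdown
```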

Previous testing of the parachute deployment system has shown that the Orion spacecraft could land safely under only one parachute. However, those tests could not simulate the extremes that the system will have to endure during EFT-1 prior to deployment. The Orion capsule, once recovered from the Pacific Ocean, is set to be used for further testing of the ascent abort system in 2018. Data collected from EFT-1 will be invaluable in informing future testing, moving toward a crewed Orion mission some time in 2021.

NASA staff on the ground will be nervously monitoring several key aspects of the proving mission, with the help of 1,200 additional sensors geared toward detecting vibration and temperature stress while taking detailed measurements of event timing. Furthermore, cameras are set to be mounted aboard Orion to capture the action at key separation points, as well as views out of the capsule's windows and a live shot of the parachutes as they (hopefully) deploy.

The launch promises to be a historic occasion, representing a significant milestone on mankind's journey to Mars. Orion, the product of more than 50 years of spaceflight experience, will be the first human-rated spacecraft to be constructed in over 30 years. The flight is expected to last four hours and 25 minutes, during which time a Delta IV Heavy rocket will bring it to an altitude of 5,794 km (3,600 miles) with the objective of recreating the intense re-entry pressures caused by a return from a deep-space mission.

And be sure to check out this animation of the Orion Exploration Flight Test-1:

Sources: gizmag.com, (2)

News from Mars: Opportunity Still at Work

After ten years in service (when it wasn't supposed to last longer than nine months), one would think there would be little left for the Opportunity rover to do. And yet, Opportunity is still hard at work, thanks in no small part to its solar panels being their cleanest in years. In its latest research stint, NASA's decade-old Mars Exploration Rover is inspecting a section of crater-rim ridgeline chosen as a priority target due to evidence of a water-related mineral.

Orbital observations of the site by another NASA spacecraft – the Mars Reconnaissance Orbiter (MRO) – found a spectrum with the signature of aluminum bound to oxygen and hydrogen. Researchers regard that signature as a marker for the mineral montmorillonite, part of a class of clay minerals (called smectites) that forms when basalt is altered under wet and slightly acidic conditions. The exposure extends about 240 meters (800 feet) north to south on the western rim of Endeavour Crater.

The detection was made possible by the MRO's Compact Reconnaissance Imaging Spectrometer for Mars (CRISM), combined with rover observations some 3 km (2 miles) north on the crater's western rim. Rocks exposed there contain evidence for an iron-bearing smectite – called nontronite – as well as for montmorillonite. That site yielded evidence of an ancient environment with water that would have been well-suited for use by microbes, evidence that could boost our understanding of what Mars looked like billions of years ago.

Opportunity reached the northern end of the montmorillonite-bearing exposure last month – a high point known as “Pillinger Point.” Opportunity’s international science team chose that informal name in honor of Colin Pillinger (1943-2014), the British principal investigator for the Beagle 2 project, which attempted to set a research lander on Mars a few weeks before Opportunity landed there in January of 2004.

Opportunity Principal Investigator Steve Squyres, of Cornell University, had this to say about Pillinger:

Colin and his team were trying to get to Mars at the same time that we were, and in some ways they faced even greater challenges than we did. Our team has always had enormous respect for the energy and enthusiasm with which Colin Pillinger undertook the Beagle 2 mission. He will be missed.

Though selected as a science destination, Pillinger Point also offers a scenic vista from atop the western rim of Endeavour Crater, which is about 22 km (14 miles) in diameter. The picture below shows a section of a color shot taken by Opportunity's panoramic camera (Pancam) upon arrival. A full-size view of this picture can be seen by going to NASA's Jet Propulsion Laboratory Mars Exploration Rovers webpage.

Initial measurements at this site with the element-identifying alpha particle X-ray spectrometer at the end of Opportunity's arm indicate that bright-toned veins in the rock contain calcium sulfate. Scientists deduce this mineral was deposited as water moved through fractures on Endeavour's rim. The rover found similar veins of calcium sulfate farther north along the rim while investigating there earlier last month.

As Opportunity investigated this site and other sites farther south along the rim, the rover had more energy than usual. This was due to the solar cells being in rare form, says Opportunity Project Manager John Callas of NASA’s Jet Propulsion Laboratory:

The solar panels have not been this clean since the first year of the mission. It’s amazing, when you consider that accumulation of dust on the solar panels was originally expected to cause the end of the mission in less than a year. Now it’s as if we’d been a ship out at sea for 10 years and just picked up new provisions at a port of call, topping off our supplies.

Both Opportunity and its rover twin, Spirit, benefited from sporadic dust-cleaning events in past years. However, on the ridge that Opportunity has been navigating since late 2013, winds have removed dust more steadily, day by day, than either rover has experienced elsewhere. The rover’s signs of aging – including a stiff shoulder joint and occasional losses of data – have not grown more troublesome in the past year, and no new symptoms have appeared.

JPL's Jennifer Herman, power-subsystem engineer, added:

It’s easy to forget that Opportunity is in the middle of a Martian winter right now. Because of the clean solar arrays, clear skies and favorable tilt, there is more energy for operations now than there was any time during the previous three Martian summers. Opportunity is now able to pull scientific all-nighters for three nights in a row — something she hasn’t had the energy to do in years.

During Opportunity's first decade on Mars and the 2004-2010 career of Spirit, NASA's Mars Exploration Rover Project yielded a range of findings about wet environmental conditions on ancient Mars – some very acidic, others milder and more conducive to supporting life. These findings have since been supplemented and confirmed by the Curiosity rover, which hopes to find plenty of clues as to the nature of possible life on Mars when it reaches Mount Sharp later this summer.

Source: sciencedaily.com, marsrovers.jpl.nasa.gov

The Future is Here: First Android Newscasters in Japan

This past week, Japanese scientists unveiled what they claim are the world's first news-reading androids. The adolescent-looking "Kodomoroid" – an amalgamation of the Japanese word "kodomo" (child) and "android" – and "Otonaroid" ("otona" meaning adult) introduced themselves at an exhibit entitled Android: What is a Human?, which is being presented at Tokyo's National Museum of Emerging Science and Innovation (Miraikan).

The androids were flanked by robotics professor Hiroshi Ishiguro and Miraikan director Mamoru Mori as Kodomoroid delivered news of an earthquake and an FBI raid to amazed reporters in Tokyo. She even poked fun at her creator, telling Ishiguro: "You're starting to look like a robot!" Otonaroid, meanwhile, fluffed her lines when asked to introduce herself, then excused herself by saying, "I'm a little bit nervous."

Both androids will be working at Miraikan and interacting with visitors as part of Ishiguro's studies into human reactions to the machines. Ishiguro is well known for his work with "geminoids", robots that bear a striking resemblance to their creator. As part of his lecture process, Ishiguro takes his geminoid with him when he travels and even lets it deliver his lectures for him. During an interview with AFP, he explained the reasoning behind this latest exhibit:

This will give us important feedback as we explore the question of what is human. We want robots to become increasingly clever. We will have more and more robots in our lives in the future.

Granted, the unveiling did have its share of bugs. For her part, Otonaroid looked as if she could use some rewiring before beginning her new role as the museum's science communicator, with her lips out of sync and her neck movements symptomatic of a bad night's sleep. But Ishiguro insisted both would prove invaluable to his continued research, as museum visitors get to have conversations with the 'droids and operate them as extensions of their own bodies.

And this is just one of many forays into a world where the line between robots and humans is becoming blurred. After a successful debut earlier this month, a chatty humanoid called Pepper is set to go on sale as a household companion in Japan starting next year. Designed by SoftBank using technology acquired from French robotics company Aldebaran, each robot will cost around $2,000 – about the same as a laptop.

Pepper can communicate through emotion, speech or body language, and it's equipped with both mics and proximity sensors. It will be possible to install apps and upgrade the unit's functionality, the plan being to make Pepper far smarter than when you first bought it. It already understands 4,500 Japanese words, but perhaps more impressively, Pepper can apparently read its owner's tone of voice to gauge their disposition.

Aldebaran CEO Bruno Maisonnier claims that robots that can recognize human emotion will change the way we live and communicate. And this is certainly a big step towards getting robots into our daily lives, at least if you live in Japan (the only place Pepper will be available for the time being). He also believes this is the start of a "robotic revolution" where robotic household companions that can understand and interact with their human owners will become the norm.

Hmm, a world where robots are increasingly indistinguishable from humans, can do human jobs, and are capable of understanding and mimicking our emotions. Oh, and they live in our houses too? Yeah, I'm just going to ignore the warning bells going off in my head now! In the meantime, be sure to check out these videos of Kodomoroid, Otonaroid, and Pepper being unveiled for the first time:

World’s First Android Newscasters:


Aldebaran’s Pepper:


Sources:
cnet.com, gizmodo.com, engadget.com, nydailynews.com

Looking Forward: 10 Breakthroughs by 2025

World-changing scientific discoveries are emerging all the time, from drugs and vaccines that are making incurable diseases curable, to inventions that are making renewable energies cheaper and more efficient. But how will these developments truly shape the world of tomorrow? How will the combined advancements being made in the fields of medical, digital and industrial technology come together to change things by 2025?

Well, the Thomson Reuters IP & Science unit – a leading intellectual property and collaboration platform – has made a list of the top 10 breakthroughs likely to change the world. To make these predictions, they looked at two sorts of data: current scientific journal literature and patent applications. Counting citations and other measures of buzz, they identified 10 major fields of development, then made specific forecasts for each.

As Basil Moftah, president of the IP & Science business (which sells scientific database products) said:

A powerful outcome of studying scientific literature and patent data is that it gives you a window into the future–insight that isn’t always found in the public domain. We estimate that these will be in effect in another 11 years.

In short, they predict that people living in 2025 will have access to far more in the way of medical treatments and cures, food will be more plentiful (surprisingly enough), renewable energy sources and applications will be more available, the internet of things will become a reality, and quantum and medical science will be doing some very interesting things.

1. Dementia Declines:
Prevailing opinion says dementia could be one of our most serious future health challenges, thanks in no small part to increased life expectancy. In fact, the World Health Organization expects the number of cases to triple by 2050. The Thomson Reuters report is far more optimistic, though, claiming that a focus on the pathogenic chromosomes involved in neurodegenerative disease will result in more timely diagnosis, and earlier, more effective treatment:

In 2025, the studies of genetic mutations causing dementia, coupled with improved detection and onset-prevention methods, will result in far fewer people suffering from this disease.

2. Solar Power Everywhere:
With the conjunction of increased efficiencies, dropping prices and improved storage methods, solar power will be the world's largest single source of energy by 2025. And while issues such as weather-dependence will not yet be fully resolved, the expansion in panel use and the incorporation of thin photovoltaic cells into just about every surface imaginable (from buildings to roadways to clothing) will mean that solar finally outstrips fossil fuels such as coal as the predominant means of generating power.

As the authors of the report write:

Solar thermal and solar photovoltaic energy (from new dye-sensitized and thin-film materials) will heat buildings, water, and provide energy for devices in the home and office, as well as in retail buildings and manufacturing facilities.

3. Type 1 Diabetes Prevention:
Type 1 diabetes strikes at an early age and isn't as prevalent as Type 2 diabetes, which comes on in middle age. But cases have been rising fast nonetheless, with proposed explanations ranging from nutritional causes to contaminants and fungi. The report gives hope that kids of the future won't have to give themselves daily insulin shots, thanks to the "genomic-editing-and-repairing" it expects will fix the problem before it sets in. As it specifies:

The human genome engineering platform will pave the way for the modification of disease-causing genes in humans, leading to the prevention of type I diabetes, among other ailments.

4. No More Food Shortages:
Contrary to what many speculative reports and futurists anticipate, the report indicates that by the year 2025, there will be no more food shortages in the world. Thanks to a combination of lighting and genetically-modified crops, it will be possible to grow food quickly and easily in a plethora of different environments. As it says in the report:

In 2025, genetically modified crops will be grown rapidly and safely indoors, with round-the-clock light, using low energy LEDs that emit specific wavelengths to enhance growth by matching the crop to growth receptors added to the food’s DNA. Crops will also be bred to be disease resistant. And, they will be bred for high yield at specified wavelengths.

5. Simple Electric Flight:
The explosion in the use of electric aircraft (be they solar-powered or hydrogen-fueled) in the past few decades has led to predictions that by 2025, small electric aircraft will begin to offset commercial flight using gas-powered, heavy jets. The report says advances in lithium-ion batteries and hydrogen storage will make electric transport a reality:

These aircraft will also utilize new materials that bring down the weight of the vehicle and have motors with superconducting technology. Micro-commercial aircraft will fly the skies for short-hop journeys.

6. The Internet of Things:
By 2025, the internet is likely to expand into every corner of life, with growing wifi networks connecting more people all across the world. At the same time, more and more devices and personal possessions are likely to become "smart" – meaning they can be accessed digitally and networked to other things. In short, the internet of things will become a reality. And the speed at which things move will vastly increase, thanks to proposed solutions to the computing bottleneck.

Here’s how the report puts it:

Thanks to the prevalence of improved semiconductors, graphene-carbon nanotube capacitors, cell-free networks of service antennas, and 5G technology, wireless communications will dominate everything, everywhere.

7. No More Plastic Garbage:
Ever heard of the Great Pacific Garbage Patch (aka. the Pacific Trash Vortex), the mass of plastic debris in the Pacific Ocean that measures somewhere between 700,000 and 15,000,000 square kilometres (270,000 – 5,800,000 sq mi)? Well, according to the report, such masses will become relics of the past. By 2025, it claims, the "glucose economy" will lead to the predominance of packaging made from plant-derived cellulose (aka. bioplastics).

Because of this influx of biodegradable plastics, there will be no more permanent deposits of plastic garbage filling our oceans, landfills, and streets. As it says:

Toxic plastic-petroleum packaging that litters cities, fields, beaches, and oceans, and which isn’t biodegradable, will be nearing extinction in another decade. Thanks to advancements in the technology related to and use of these bio-nano materials, petroleum-based packaging products will be history.

8. More Precise Drugs:
By 2025, we'll have sophisticated, personalized medicine, thanks to improved production methods, biomedical research, and the growth of up-to-the-minute health data being provided by wearable medical sensors and patches. The report also offers specific examples:

Drugs in development are becoming so targeted that they can bind to specific proteins and use antibodies to give precise mechanisms of action. Knowledge of specific gene mutations will be so much more advanced that scientists and physicians can treat those specific mutations. Examples of this include HER2 (breast cancer), BRAF V600 (melanoma), and ROS1 (lung cancer), among many others.

9. DNA Mapping Formalized:
Recent explosions in genetic research – which include the Human Genome Project and ENCODE – are leading to a world where personal genetic information will become the norm. As a result, kids born in 2025 will be tested at the DNA level, not just once or twice but continually, using nano-probes inserted in the body. The result will be a boon for anticipating genetic diseases, but could also raise various privacy-related issues. As the report states:

In 2025, humans will have their DNA mapped at birth and checked annually to identify any changes that could point to the onset of autoimmune diseases.

10. Teleportation Tested:
Last, but certainly not least, the report says research into teleportation will be underway. Between the confirmation of the Higgs boson (and by extension, the Standard Model of particle physics), recent revelations about quantum entanglement and wormholes, and the discovery of the amplituhedron, the field of teleportation is likely to produce some serious breakthroughs. There's no telling what these will be – the ability to teleport simple photons, or something larger – but that the research will be happening seems a foregone conclusion:

We are on the precipice of this field’s explosion; it is truly an emerging research front. Early indicators point to a rapid acceleration of research leading to the testing of quantum teleportation in 2025.

Summary:
Will all of these changes come to pass? Who knows? If history has taught us anything, it’s that predictions are often wrong and much in the way of exciting research doesn’t always make it to the market. And as always, various factors – such as politics, money, public resistance, private interests – have a way of complicating things. However, there is reason to believe that the aforementioned 10 things will become a viable reality. And Moftah believes we should be positive about the future:

[The predictions] are positive in nature because they are solutions researchers and scientists are working on to address challenges we face in the world today. There will always be obstacles and issues to overcome, but science and innovation give us hope for how we will address them.

I, for one, am happy and intrigued to see certain items making this list. The explosion in solar usage, bioplastics, and the elimination of food scarcity are all very encouraging. If there was one thing I was anticipating by 2025, it was increased drought and food shortages. But as the saying goes, "necessity is the mother of invention". And as someone with two grandmothers who lived into their nineties and both suffered from the scourges of dementia, it is good to know that this disease will be on the wane for future generations.

It is also encouraging to know that there will be better treatments for diseases like cancer, HIV, and diabetes. While the idea of a world in which all diseases are preventable and/or treatable worries some (on account of how it might stoke overpopulation), no one who has ever lived with such a disease, or known someone who has, would think twice if presented with a cure. Besides, hardship, hunger, and a lack of education, resources and health services are some of the main reasons for population explosions.

And, let's face it, it's good to live in an age where the future looks bright for a change. After a good century of total war, totalitarianism, atomic diplomacy, terrorism, and oh so much existential angst and dystopian fiction, it's nice to think that the coming age will turn out alright after all.

Sources: fastcoexist.com, ip-science.thomsonreuters.com

Cyberwars: ACLU and NSA ex-Director to Debate Tomorrow!

In what is sure to be a barn-burner of a debate, the former head of the National Security Agency – General Keith Alexander – will participate tomorrow in a televised, surveillance-themed debate with ACLU Executive Director Anthony Romero. The debate airs tomorrow – June 30th, 10:30am Eastern Time – on MSNBC. The subject: whether or not the NSA's vast surveillance and data-mining programs are making Americans safer.

While many would prefer that the current head of the NSA be involved in the debate, General Alexander is a far better spokesperson for these controversial programs. After all, “Emperor Alexander” – as his subordinates called him – is the man most directly responsible for the current disposition of the NSA’s cyber surveillance and warfare program. Who better to debate their merit with the head of the ACLU – an organization dedicated to the preservation of personal freedom?

And according to classified documents leaked by Edward Snowden, General Alexander’s influence and power within the halls of government knew no bounds during his tenure. A four-star Army general with active units under his command, he was also the head of the National Security Agency, chief of the Central Security Service, and the commander of the US Cyber Command. It is this last position and the power it wields that has raised the greatest consternation amongst civil-libertarians and privacy advocates.

Alexander was responsible for building this apparatus up between 2005 and 2013, insisting that the US’s inherent vulnerability to digital attacks required that he and those like him assume more authority over the data zipping around the globe. According to Alexander, this threat is so paramount that it only makes sense for all power to control the flow of information to be concentrated in as few hands as possible – namely, his.

At a recent security conference held in Canada before the Canadian Security Intelligence Service (CSIS), Alexander expressed the threat in the following, cryptic way:

What we see is an increasing level of activity on the networks. I am concerned that this is going to break a threshold where the private sector can no longer handle it and the government is going to have to step in.

If this alone were not reason enough to put people on edge, there are also voices within the NSA who view Alexander as a quintessential larger-than-life personality. One former senior CIA official who agreed to speak on condition of anonymity, claimed:

We jokingly referred to him as Emperor Alexander—with good cause, because whatever Keith wants, Keith gets. We would sit back literally in awe of what he was able to get from Congress, from the White House, and at the expense of everybody else.

And it is because of such freedom to monitor people’s daily activities that campaigns like the February 11th “The Day We Fight Back” – an international effort that embraced 360 organizations in 70 countries dedicated to ending mass surveillance – have been mounted, demanding reform.

In addition, a series of recent rulings from the US Supreme Court have begun to put the kibosh on the surveillance programs that Alexander spent eight years building up. With everything from cell phone tracking to cell phone taps, a precedent is being set that is likely to outlaw the NSA’s domestic surveillance altogether. But no matter what, the role of Snowden’s revelations in securing this landmark shift cannot be overstated.

In fact, in a recent interview, the ACLU’s Anthony Romero acknowledged a great debt to Snowden and claimed that the debate would not be happening without him. As he put it:

I think Edward Snowden has done this country a service… regardless of whether or not what he did was legal or illegal, whether or not we think the sedition laws or the espionage laws that are being used to possibly prosecute Snowden are too broad, the fact is that he has kick-started a debate that we did not have. This debate was anemic. Everyone was asleep at the switch.

One can only imagine what outcome this debate will have. But we can rest assured that some of the more predictable talking points will include the necessities emerging out of the War on Terror, the rise of the information revolution, and the dangers of Big Brother Government, as well as the NSA’s failure to prevent such attacks as the Boston Marathon Bombing, the attack on the US consulate in Benghazi, and a slew of other terrorist incidents that took place during Alexander’s tenure.

Do I sound biased? Well, perhaps that’s because I am. Go ACLU, stick it to Emperor Alexander!

Sources: engadget.com, democracynow.org

New from Space: Simulations and X-Rays Point to Dark Matter

The cosmic hunt for dark matter has been turning up some interesting clues of late. And during the month of June, two key hints came along that might provide answers: simulations that trace the “local Universe” from the Big Bang to the present day, and recent X-ray studies of galaxy clusters. In both cases, the observations point towards the existence of Dark Matter – the mysterious substance believed to make up some 85 per cent of the matter in the Universe.

In the former case, the clues are the result of new supercomputer simulations that show the evolution of our “local Universe” from the Big Bang to the present day. Physicists at Durham University, who are leading the research, say their simulations could improve understanding of dark matter due to the fact that they believe that clumps of the mysterious substance – or halos – emerged from the early Universe, trapping intergalactic gas and thereby becoming the birthplaces of galaxies.

Cosmological theory predicts that our own cosmic neighborhood should be teeming with millions of small halos, but only a few dozen small galaxies have been observed around the Milky Way. Professor Carlos Frenk, Director of Durham University’s Institute for Computational Cosmology, said:

I’ve been losing sleep over this for the last 30 years… Dark matter is the key to everything we know about galaxies, but we still don’t know its exact nature. Understanding how galaxies formed holds the key to the dark matter mystery… We know there can’t be a galaxy in every halo. The question is: ‘Why not?’.

The Durham researchers believe their simulations answer this question, showing how and why millions of halos around our galaxy and neighboring Andromeda failed to produce galaxies. They say the gas that would have made the galaxy was sterilized by the heat from the first stars that formed in the Universe and was prevented from cooling and turning into stars. However, a few halos managed to bypass this cosmic furnace by growing early and fast enough to hold on to their gas and eventually form galaxies.

The findings were presented at the Royal Astronomical Society’s National Astronomy Meeting in Portsmouth on Thursday, June 26. The work was funded by the UK’s Science and Technology Facilities Council (STFC) and the European Research Council. Professor Frenk, who received the Royal Astronomical Society’s top award, the Gold Medal for Astronomy, added:

We have learned that most dark matter halos are quite different from the ‘chosen few’ that are lit up by starlight. Thanks to our simulations we know that if our theories of dark matter are correct then the Universe around us should be full of halos that failed to make a galaxy. Perhaps astronomers will one day figure out a way to find them.

Lead researcher Dr Till Sawala, in the Institute for Computational Cosmology, at Durham University, said the research was the first to simulate the evolution of our “Local Group” of galaxies, including the Milky Way, Andromeda, their satellites and several isolated small galaxies, in its entirety. Dr Sawala said:

What we’ve seen in our simulations is a cosmic own goal. We already knew that the first generation of stars emitted intense radiation, heating intergalactic gas to temperatures hotter than the surface of the sun. After that, the gas is so hot that further star formation gets a lot more difficult, leaving halos with little chance to form galaxies. We were able to show that the cosmic heating was not simply a lottery with a few lucky winners. Instead, it was a rigorous selection process and only halos that grew fast enough were fit for galaxy formation.

The close-up look at the Local Group is part of the larger EAGLE project currently being undertaken by cosmologists at Durham University and the University of Leiden in the Netherlands. EAGLE is one of the first attempts to simulate from the beginning the formation of galaxies in a representative volume of the Universe. By peering into the virtual Universe, the researchers find galaxies that look remarkably like our own, surrounded by countless dark matter halos, only a small fraction of which contain galaxies.

The research is part of a program being conducted by the Virgo Consortium for supercomputer simulations, an international collaboration led by Durham University with partners in the UK, Germany, Holland, China and Canada. The new results on the Local Group involve, in addition to Durham University researchers, collaborators in the Universities of Victoria (Canada), Leiden (Holland), Antwerp (Belgium) and the Max Planck Institute for Astrophysics (Germany).

In the latter case, astronomers using ESA and NASA high-energy observatories have discovered another possible hint by studying galaxy clusters, the largest cosmic assemblies of matter bound together by gravity. Galaxy clusters not only contain hundreds of galaxies, but also a huge amount of hot gas filling the space between them. The gas is mainly hydrogen and, at over 10 million degrees Celsius, is hot enough to emit X-rays. Traces of other elements contribute additional X-ray ‘lines’ at specific wavelengths.
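Why does gas at such temperatures glow in X-rays rather than visible light? A rough back-of-the-envelope check – not from the article; the constant and temperatures below are standard textbook values – using Wien’s displacement law puts the peak of thermal emission squarely in the X-ray band:

```python
# Wien's displacement law: the peak emission wavelength of a thermal
# emitter scales inversely with its temperature.
WIEN_B = 2.898e-3  # Wien's displacement constant, m*K

def peak_wavelength_nm(temperature_k):
    """Wavelength (in nanometres) at which blackbody emission peaks."""
    return WIEN_B / temperature_k * 1e9

# Intracluster gas at ~10 million K (roughly 10 million deg C)
print(f"Cluster gas peak: {peak_wavelength_nm(1e7):.2f} nm")  # ~0.29 nm: X-rays

# For comparison, the Sun's ~5800 K surface peaks in visible light
print(f"Solar surface peak: {peak_wavelength_nm(5800):.0f} nm")
```

Anything peaking near a fraction of a nanometre falls well inside the X-ray band, which is why X-ray telescopes like XMM-Newton and Chandra are the instruments of choice for cluster gas.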

Examining observations by ESA’s XMM-Newton and NASA’s Chandra spaceborne telescopes of these characteristic lines in 73 galaxy clusters, astronomers stumbled on an intriguing faint line at a wavelength where none had been seen before. The astronomers suggest that the emission may be created by the decay of an exotic type of subatomic particle known as a ‘sterile neutrino’, which is predicted but not yet detected.

Ordinary neutrinos are very low-mass particles that interact only rarely with matter via the so-called weak nuclear force as well as via gravity. Sterile neutrinos are thought to interact with ordinary matter through gravity alone, making them a possible candidate as dark matter. As Dr Esra Bulbul – from the Harvard-Smithsonian Center for Astrophysics in Cambridge, Massachusetts, USA, and lead author of the paper discussing the results – put it:

If this strange signal had been caused by a known element present in the gas, it should have left other signals in the X-ray light at other well-known wavelengths, but none of these were recorded. So we had to look for an explanation beyond the realm of known, ordinary matter… If the interpretation of our new observations is correct, at least part of the dark matter in galaxy clusters could consist of sterile neutrinos.

The surveyed galaxy clusters lie at a wide range of distances, from more than a hundred million light-years to a few billion light-years away. The mysterious, faint signal was found by combining multiple observations of the clusters, as well as in an individual image of the Perseus cluster, a massive structure in our cosmic neighborhood.

The implications of this discovery may be far-reaching, but the researchers are being cautious. Further observations with XMM-Newton, Chandra and other high-energy telescopes of more clusters are needed before the connection to dark matter can be confirmed. Norbert Schartel, ESA’s XMM-Newton Project Scientist, commented:

The discovery of these curious X-rays was possible thanks to the large XMM-Newton archive, and to the observatory’s ability to collect lots of X-rays at different wavelengths, leading to this previously undiscovered line. It would be extremely exciting to confirm that XMM-Newton helped us find the first direct sign of dark matter. We aren’t quite there yet, but we’re certainly going to learn a lot about the content of our bizarre Universe while getting there.

Much like the Higgs Boson, the existence of Dark Matter was first theorized as a way of explaining how the universe appears to have mass that we cannot see. But by looking at indirect evidence, such as the gravitational influence it has on the movements and appearance of other objects in the Universe, scientists hope to one day confirm its existence. Beyond that, there is the mystery of “Dark Energy”, the hypothetical form of energy that permeates all of space and is believed to be behind the accelerating expansion of the universe.
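To illustrate that indirect, gravitational line of evidence – the mass and radii below are invented round numbers for a toy spiral galaxy, not measurements from the studies above – Newtonian orbits around the visible matter alone predict speeds that fall off with distance, whereas real spiral galaxies show roughly flat rotation curves, implying mass we cannot see:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
KPC = 3.086e19     # one kiloparsec, m

def circular_velocity_km_s(enclosed_mass_kg, radius_m):
    """Orbital speed for a circular orbit around the enclosed mass."""
    return math.sqrt(G * enclosed_mass_kg / radius_m) / 1000.0

# Toy galaxy: ~1.4e11 solar masses of *visible* matter,
# nearly all of it well inside a 15 kpc radius.
visible_mass = 1.4e11 * M_SUN

# If visible matter were all there is, orbital speeds beyond the
# luminous disk should fall off as 1/sqrt(r)...
for r_kpc in (15, 30, 60):
    v = circular_velocity_km_s(visible_mass, r_kpc * KPC)
    print(f"r = {r_kpc:2d} kpc -> predicted v = {v:.0f} km/s")

# ...but observed rotation curves stay roughly flat (~200 km/s in many
# spirals), the classic indirect hint of a surrounding dark matter halo.
```

Doubling the radius should cut the speed by a factor of sqrt(2); the fact that measured speeds refuse to drop is precisely the kind of gravitational fingerprint the article alludes to.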

As with the discovery of the Higgs Boson and the Standard Model of particle physics, detecting these two invisible phenomena will at last confirm that the Big Bang and Cosmological theory are scientific fact – and not just working theories. When that happens, the dream of humanity finally being able to understand the universe (at both the atomic and macro level) may finally become a reality!

Source: sciencedaily.com, (2)

News from Space: “Earth-Sized Diamond” In Space

As our knowledge of the universe beyond our Solar System expands, the true wonder and complexity of it is slowly revealed. At one time, scientists believed that other systems would be very much like our own, with planets taking on either a rocky or gaseous form, and stars conforming to basic classifications that determined their size, mass, and radiation output. However, several discoveries of late have confounded these assumptions, and led us to believe that just about anything could exist out there.

For example, a team of astronomers at the University of Wisconsin-Milwaukee recently identified the coldest, faintest white dwarf star ever detected, some 900 light years from Earth. Hovering near a much larger pulsar, this ancient stellar remnant has a temperature of less than 3,000 K, or about 2,700 degrees Celsius, which made it extremely difficult to detect. But what is especially impressive about this ancient stellar remnant is the fact that it is so cool that its carbon has crystallized.

This means, in effect, that this star has formed itself into an Earth-size diamond in space. The discovery was made by Prof. David Kaplan and his team from the UofW-M using the National Radio Astronomy Observatory’s (NRAO) Green Bank Telescope (GBT) and Very Long Baseline Array (VLBA), as well as other observatories. All of these instruments were needed to spot this star because its low energy output means that it is essentially “a diamond in the rough”, the rough being the endless vacuum of space, that is.

White dwarfs like this one are what happens after a star about the size of our Sun spends all of its nuclear fuel and throws its outer layers off, leaving behind a tiny, super-dense core of elements (like carbon and oxygen). They cool at an excruciatingly slow pace, taking billions and billions of years to finally go dark. Even newly transformed white dwarfs are incredibly hard to spot compared to active stars, and this one was only discovered because it happens to be nestled right up next to a pulsar.

By definition, a pulsar is the spinning remnant left over when a slightly larger sun runs its course and collapses into a neutron star. Spinning neutron stars are given the name “pulsar” because their magnetic fields force radio waves out in tight beams that give the illusion of pulsations as they whir around, effectively strobing the universe like a lighthouse. The pulsar that sits next to the diamond-encrusted white dwarf is known as PSR J2222-0137, and is 1.2 times the mass of our sun, but even smaller than the white dwarf.

Astronomers were tipped off to the presence of something near the pulsar by distortions in its radio waves, and an old-fashioned space hunt was then mounted for the culprit. The low mass made a white dwarf the most likely cause, but astronomers couldn’t see it because of its incredibly low luminosity. Because of this, the UofW-M team estimated the age of this object had to be upward of 11 billion years, the same age as the Milky Way Galaxy.

This meant that the object was already old when our galaxy was just beginning to coalesce. After all those eons to cool off, the star has likely collapsed into a crystallized chunk of carbon mixed with oxygen and some other elements. It could actually be possible, though extremely difficult, to land a spacecraft on an object like this. There may be many more stars in the sky with diamonds, perhaps some even older than this one.

Spotting this white dwarf was a bit of a fluke, though. Until more powerful instruments are devised that can see an incredibly dim, burnt out star, they’ll remain shrouded in the vast darkness of space. However, this is not the first time that an object composed of diamond was found in space by sheer stroke of luck. Remember the diamond planet, a body located some 40 light years from Earth that orbits the binary star 55 Cancri?

Yep, that one! Like I said, such discoveries are demonstrating that the universe is a much more interesting, awesome, and complex place than previously thought. Between diamond stars, diamond planets, lakes of methane and atmospheres of plastic, it seems that just about anything is possible. Good to know, seeing as how so much of our plans for the future depend upon getting out there!

Sources: cnet.com, extremetech.com

Powered by the Sun: Boosting Solar Efficiency

Improving the efficiency of solar power – which is currently the most promising alternative energy source – is central to ensuring that it becomes an economically viable replacement for fossil fuels and other “dirty” sources. And while many solutions have emerged in recent years that have led to improvements in solar panel efficiency, many developments are also aimed at the other end of things – i.e. improving the storage capacity of solar batteries.

In the former case, a group of scientists working with the University of Utah believe they’ve discovered a method of substantially boosting solar cell efficiencies. By adding a polychromat layer that separates and sorts incoming light, redirecting it to strike particular layers in a multijunction cell, they hope to create a commercial cell that can absorb more wavelengths of light, and therefore generate more energy per volume than conventional cells.

Traditionally, solar cell technology has struggled to overcome a significant efficiency problem. The type of substrate used dictates how much energy can be absorbed from sunlight — but each type of substrate (silicon, gallium arsenide, indium gallium arsenide, and many others) corresponds to capturing a particular wavelength of energy. Cheap solar cells built on inexpensive silicon have a maximum theoretical efficiency of 34% and a practical (real-world) efficiency of around 22%.
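The link between substrate and wavelength comes down to photon energy. As a sketch – the bandgap values below are approximate textbook figures, not numbers from the article – a material can only absorb photons energetic enough to lift an electron across its bandgap, which sets a cutoff wavelength for each substrate:

```python
# A photon can only excite an electron across a semiconductor's bandgap
# if its energy (E = h*c / wavelength) meets or exceeds that gap, so each
# substrate has a cutoff wavelength beyond which sunlight is wasted.
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
EV = 1.602e-19  # joules per electron-volt

def cutoff_wavelength_nm(bandgap_ev):
    """Longest wavelength a material with this bandgap can absorb."""
    return H * C / (bandgap_ev * EV) * 1e9

# Approximate room-temperature bandgaps (textbook values):
for name, gap_ev in [("silicon", 1.12),
                     ("gallium arsenide", 1.42),
                     ("indium gallium phosphide", 1.9)]:
    print(f"{name}: absorbs up to ~{cutoff_wavelength_nm(gap_ev):.0f} nm")
```

This is why, in the Utah cell described below, indium gallium phosphide (cutoff around 650 nm) handles visible light while gallium arsenide (cutoff around 870 nm) catches the near-infrared the first layer lets through.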

At the other end of things, there are multijunction cells. These use multiple layers of substrates to capture a larger section of the sun’s spectrum and can reach up to 87% efficiency in theory – but are currently limited to 43% in practice. What’s more, these types of multijunction cells are extremely expensive and have intricate wiring and precise structures, all of which leads to increased production and installation costs.

In contrast, the cell created by the University of Utah used two layers — indium gallium phosphide (for visible light) and gallium arsenide (for infrared light). According to the research team, when their polychromat was added, the power efficiency increased by 16 percent. The team also ran simulations of a polychromat layer with up to eight different absorption layers and claim that it could potentially yield an efficiency increase of up to 50%.

However, there were some footnotes to their report which temper the good news. For one, the potential gain has not been tested yet, so any major increases in solar efficiency remain theoretical at this time. Second, the report states that the reported gain was a percentage of a percentage, meaning that if the original cell efficiency was 30%, then a gain of 16 percent means that the new efficiency is 34.8%. That’s still a huge gain for a polychromat layer that is easily produced, but not as impressive as it originally sounded.
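The relative-versus-absolute distinction is worth spelling out. A quick check of the arithmetic above (the 30% baseline is the article’s own illustrative figure):

```python
def apply_relative_gain(base_efficiency, relative_gain):
    """A 'percentage of a percentage' improvement: the gain multiplies
    the starting efficiency rather than adding to it."""
    return base_efficiency * (1 + relative_gain)

# The 16% figure reported for the polychromat is relative:
print(apply_relative_gain(0.30, 0.16))   # 0.348, i.e. 34.8%

# An *absolute* 16-point gain would instead have meant:
print(0.30 + 0.16)                       # 0.46, i.e. 46%
```

Hence the deflation: a headline “16% improvement” moves a 30%-efficient cell to 34.8%, not to 46%.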

However, given that the biggest barrier to multi-junction solar cell technology is manufacturing complexity and associated cost, anything that boosts cell efficiency on the front end without requiring any major changes to the manufacturing process is going to help with the long-term commercialization of the technology. Advances like this could help make technologies cost effective for personal deployment and allow them to scale in a similar fashion to cheaper devices.

In the latter case, where energy storage is concerned, a California-based startup called Enervault recently unveiled battery technology that could increase the amount of renewable energy utilities can use. The technology is based on inexpensive materials that researchers had largely given up on because batteries made from them didn’t last long enough to be practical. But the company says it has figured out how to make the batteries last for decades.

The technology is being demonstrated in a large battery at a facility in the California desert near Modesto, one that stores one megawatt-hour of electricity, enough to run 10,000 100-watt light bulbs for an hour. The company has been testing a similar, though much smaller, version of the technology for about two years with good results. It has also raised $30 million in funding, including a $5 million grant from the U.S. Department of Energy.
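The light-bulb comparison checks out as straightforward arithmetic:

```python
# One megawatt-hour spread across 10,000 bulbs of 100 W each
# lasts exactly one hour.
capacity_wh = 1_000_000   # 1 MWh expressed in watt-hours
bulb_power_w = 100
bulb_count = 10_000

total_load_w = bulb_power_w * bulb_count   # 1,000,000 W of total load
hours = capacity_wh / total_load_w
print(f"Runtime: {hours:.1f} hour(s)")
```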

The technology is a type of flow battery, so called because the energy storage materials are in liquid form. They are stored in big tanks until they’re needed and then pumped through a relatively small device (called a stack) where they interact to generate electricity. Building bigger tanks is relatively cheap, so the more energy storage is needed, the better the economics become. That means the batteries are best suited for storing hours’ or days’ worth of electricity, and not delivering quick bursts.
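The tank-versus-stack split explains the economics. As a toy cost model – every dollar figure below is a hypothetical placeholder, not an Enervault number – power capacity is bought once in the stack, while energy capacity comes from cheap tanks, so the cost per stored kilowatt-hour falls as storage duration grows:

```python
# Toy flow-battery cost model (all dollar figures hypothetical):
# power capacity lives in the stack, energy capacity in the tanks.
STACK_COST_PER_KW = 1500.0   # assumed one-time cost per kW of stack
TANK_COST_PER_KWH = 50.0     # assumed cost per kWh of tanks + electrolyte

def cost_per_kwh(power_kw, duration_h):
    """Total installed cost divided by energy capacity."""
    energy_kwh = power_kw * duration_h
    total = power_kw * STACK_COST_PER_KW + energy_kwh * TANK_COST_PER_KWH
    return total / energy_kwh

# Lengthening the duration spreads the fixed stack cost over more kWh:
for hours in (2, 4, 8, 24):
    print(f"{hours:2d} h of storage -> ${cost_per_kwh(1000, hours):.1f}/kWh")
```

Under these assumed prices a 1 MW system costs $800/kWh at two hours of storage but only $112.50/kWh at twenty-four, which is why flow batteries suit hours- or days-long storage rather than quick bursts.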

This is especially good news for solar and wind companies, which have remained plagued by problems of energy storage despite improvements in both yield and efficiency. Enervault says that when the batteries are produced commercially at even larger sizes, they will cost just a fifth as much as vanadium redox flow batteries, which have been demonstrated at large scales and are probably the type of flow battery closest to market right now.

And the idea is not confined to startups. Researchers at Harvard recently made a flow battery that could prove cheaper than Enervault’s, but the prototype is small and could take many years to turn into a marketable version. An MIT spinoff, Sun Catalytix, is also developing an advanced flow battery, but its prototype is also small. And other types of inexpensive, long-duration batteries are being developed, using materials such as molten metals.

One significant drawback to the technology is that it’s less than 70 percent efficient, which falls short of the 90 percent efficiency of many batteries. The company says the economics still work out, but such a wasteful battery might not be ideal for large-scale renewable energy. More solar panels would have to be installed to make up for the waste. What’s more, the market for batteries designed to store hours of electricity is still uncertain.

A combination of advanced weather forecasts, responsive fossil-fuel power plants, better transmission networks, and smart controls for wind and solar power could delay the need for them. California is requiring its utilities to invest in energy storage but hasn’t specified what kind, and it’s not clear what types of batteries will prove most valuable in the near term, slow-charging ones like Enervault’s or those that deliver quicker bursts of power to make up for short-term variations in energy supply.

Tesla Motors, one company developing the latter type, hopes to make them affordable by producing them at a huge factory. And new materials and developments (e.g. graphene) are being considered all the time that improve both the efficiency and storage capacity of batteries. And with solar panels and wind becoming increasingly cost-effective, the likelihood of storage methods catching up is all but inevitable.

Sources: extremetech.com, technologyreview.com

 

The Large Hadron Collider: We’ve Definitely Found the Higgs Boson

In July 2012, the CERN laboratory in Geneva, Switzerland made history when it discovered an elementary particle that behaved in a way that was consistent with the proposed Higgs boson – otherwise known as the “God Particle”. Now, some two years later, the people working the Large Hadron Collider have confirmed that what they observed was definitely the Higgs boson, the one predicted by the Standard Model of particle physics.

In the new study, published in Nature Physics, the CERN researchers indicated that the particle observed in 2012 does indeed decay into fermions – as predicted by the Standard Model of particle physics. It sits in the mass-energy region of 125 GeV, has no spin, and it can decay into a variety of lighter particles. This means that we can say with some certainty that the Higgs boson is the particle that gives other particles their mass – which is also predicted by the Standard Model.

This model, which is explained through quantum field theory – itself an amalgam of quantum mechanics and Einstein’s special theory of relativity – claims that deep mathematical symmetries rule the interactions among all elementary particles. Until now, the decay modes discovered at CERN have been of a Higgs particle giving rise to two high-energy photons, or a Higgs going into two Z bosons or two W bosons.

But with the discovery of fermions, the researchers are now sure they have found the last holdout to the full and complete confirmation that the Standard Model is the correct one. As Marcus Klute of the CMS Collaboration said in a statement:

Our findings confirm the presence of the Standard Model Boson. Establishing a property of the Standard Model is big news itself.

It certainly is big news for scientists, who can say with absolute certainty that our current conception of how particles interact and behave is not merely theoretical. But on the flip side, it also means we’re no closer to pushing beyond the Standard Model and into the realm of the unknown. One of the big shortfalls of the Standard Model is that it doesn’t account for gravity, dark energy and dark matter, and some other quirks that are essential to our understanding of the universe.

At present, one of the most popular theories for how these forces interact with the known aspects of our universe – i.e. electromagnetism and the strong and weak nuclear forces – is supersymmetry. This theory postulates that every Standard Model particle also has a superpartner that is incredibly heavy – thus accounting for the 23% of the universe that is apparently made up of dark matter. It is hoped that when the LHC turns back on in 2015 (pending upgrades) it will be able to discover these partners.

If that doesn’t work, supersymmetry will probably have to wait for the LHC’s planned successor. Known as the “Very Large Hadron Collider” (VLHC), this particle accelerator will measure some 96 km (60 miles) in length – four times as long as its predecessor. And with its proposed ability to smash protons together with a collision energy of 100 teraelectronvolts – 14 times the LHC’s current energy – it will hopefully have the power needed to answer the questions the discovery of the Higgs Boson has raised.

These will hopefully include whether or not supersymmetry holds up and how gravity interacts with the three other fundamental forces of the universe – a discovery which will finally resolve the seemingly irreconcilable theories of general relativity and quantum mechanics. At which point (and speaking entirely in metaphors) we will have gone from discovering the “God Particle” to potentially understanding the mind of God Himself.

I don’t think I’ve being melodramatic!

Sources: extremetech.com, blogs.discovermagazine.com