The Future is Here: The Walking Bio-Robot

Given that the fields of robotics and electronics are making inroads into the field of biology – in the form of biorobotics and bionics – it was only a matter of time before applications began moving in the other direction. For example, muscles have been considered in recent years as a potential replacement for electric actuators, in part because they can run in a nutrient-rich fluid without the need for any other power source.

The latest example of this biological-technological crossover comes from Illinois, where bio-robotics experts have demonstrated a bio-bot built from 3-D printed hydrogel and spinal muscle tissue that can “walk” in response to an electrical signal. Less than a centimeter in length, the “bio-bot” responds to electrical impulses that cause the muscle to contract.

According to study leader Professor Rashid Bashir, biological tissue has several advantages over other robotic actuators:

[Muscle] is biodegradable, it can run in fluid with just some nutrients and hence doesn’t need external batteries and power sources – and it could eventually be controlled by neurons in our future work.

Previous versions, using heart muscle tissue, were also able to “walk” but were not controllable, as heart tissue contracts constantly of its own accord. Spinal muscle, by contrast, responds to external electrical stimuli and provides a range of potential uses. These include bio-robots being able to operate inside the body in medical applications, or being used outdoors in environmental services.

And though this design is very simple, it serves as a proof of concept that demonstrates that the technology works. Bashir and his team are now looking to start extending toward more complex machines – incorporating neurons that can get the bot walking in different directions when faced with different stimuli. Initially, they’ll look at designing a more complex hydrogel backbone that gives the robot the ability to move in more than one direction.

They’re also looking at integrating neurons to steer the tiny bots around, either using light or chemical gradients as a trigger. This would be a key step toward being able to design bots for a specific purpose. As Bashir said:

The idea of doing forward engineering with these cell-based structures is very exciting. Our goal is for these devices to be used as autonomous sensors. We want it to sense a specific chemical and move towards it, then release agents to neutralize the toxin, for example. Being in control of the actuation is a big step forward toward that goal.

This development is significant for a number of reasons. Not only is it a step on the road towards bionics and biorobotics, it also demonstrates that the merging of technology and biology works both ways. Machines are being designed to improve our biology, and our biology is in turn inspiring machinery – even being used, for its unique and superior properties, to make machines run better.

And be sure to watch this video of the muscle-powered bio-robot being explained:


Sources: gizmag.com, news.illinois.edu

Frontiers of Neuroscience: Neurohacking and Neuromorphics

It is one of the hallmarks of our rapidly accelerating times: looking at the state of technology, how it is increasingly being merged with our biology, and contemplating the ultimate leap of merging mind and machinery. The concept has been popular for many decades now, and with experimental procedures showing promise, neuroscience being used to inspire the next great leap in computing, and the advance of biomedicine and bionics, it seems like just a matter of time before people can “hack” their neurology too.

Take Kevin Tracey, a researcher working for the Feinstein Institute for Medical Research in Manhasset, N.Y., as an example. Back in 1998, he began conducting experiments to show that an interface existed between the immune and nervous systems. Building on ten years’ worth of research, he was able to show how inflammation – which is associated with rheumatoid arthritis and Crohn’s disease – can be fought by administering electrical stimuli, in the right doses, to the vagus nerve cluster.

In so doing, he demonstrated that the nervous system was like a computer terminal through which you could deliver commands to stop a problem, like acute inflammation, before it starts, or repair a body after it gets sick. His work also seemed to indicate that electricity delivered to the vagus nerve in just the right intensity and at precise intervals could reproduce a drug’s therapeutic reaction, but with greater effectiveness, minimal health risks, and at a fraction of the cost of “biologic” pharmaceuticals.

Paul Frenette, a stem-cell researcher at the Albert Einstein College of Medicine in the Bronx, is another example. After discovering the link between the nervous system and prostate tumors, he and his colleagues created SetPoint –  a startup dedicated to finding ways to manipulate neural input to delay the growth of tumors. These and other efforts are part of the growing field of bioelectronics, where researchers are creating implants that can communicate directly with the nervous system in order to try to fight everything from cancer to the common cold.

Impressive as this may seem, bioelectronics are just part of the growing discussion about neurohacking. In addition to the leaps and bounds being made in the field of brain-to-computer interfacing (and brain-to-brain interfacing), which would allow people to control machinery and share thoughts across vast distances, there is also a field of neurosurgery that is seeking to use the miracle material of graphene to solve some of the most challenging issues in their field.

Given graphene’s rather amazing properties, this should not come as much of a surprise. In addition to being incredibly thin, lightweight, and light-sensitive (it’s able to absorb light in both the UV and IR range), graphene also has a very high surface area (2,630 square meters per gram), which leads to remarkable conductivity. It also has the ability to bind or bioconjugate with various modifier molecules, and hence transform its behavior.

Already, it is being considered as a possible alternative to copper wires to break the energy-efficiency barrier in computing, and it may even be useful in quantum computing. In the field of neurosurgery, meanwhile, researchers are looking to develop materials that can bridge and even stimulate nerves. And in a story featured in the latest issue of Neurosurgery, the authors suggest that graphene may be ideal as an electroactive scaffold when configured as a three-dimensional porous structure.

That might be a preferable solution when compared with other currently fashionable ideas, like using liquid metal alloys as bridges. Thanks to Samsung’s recent research into using graphene in their portable devices, it has also been shown to make an ideal E-field stimulator. And recent experiments on mice in Korea showed that a flexible, transparent graphene skin could be used as an electrical field stimulator to treat cerebral hypoperfusion by stimulating blood flow through the brain.

And what look at the frontiers of neuroscience would be complete without mentioning neuromorphic engineering? Whereas neurohacking and neurosurgery are looking for ways to merge technology with the human brain to combat disease and improve its health, NE is looking to the human brain to create computational technology with improved functionality. The result thus far has been a wide range of neuromorphic chips and components, such as memristors and neuristors.

However, as a whole, the field has yet to define for itself a clear path forward. That may be about to change thanks to Jennifer Hasler and a team of researchers at Georgia Tech, who recently published a roadmap to the future of neuromorphic engineering with the end goal of creating the human-brain equivalent of processing. This consisted of Hasler sorting through the many different approaches for the ultimate embodiment of neurons in silico and coming up with the technology that she thinks is the way forward.

Her answer is not digital simulation, but rather the lesser-known technology of FPAAs (Field-Programmable Analog Arrays). FPAAs are similar to digital FPGAs (Field-Programmable Gate Arrays), but also include reconfigurable analog elements. They have been around on the sidelines for a few years, but have been used primarily as so-called “analog glue logic” in system integration. In short, they would handle a variety of analog functions that don’t fit on a traditional integrated circuit.

Hasler outlines an approach where desktop neuromorphic systems will use System on a Chip (SoC) approaches to emulate billions of low-power neuron-like elements that compute using learning synapses. Each synapse has an adjustable strength associated with it and is modeled using just a single transistor. Her own design for an FPAA board houses hundreds of thousands of programmable parameters which enable systems-level computing on a scale that dwarfs other FPAA designs.
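To make that computing model concrete, here is a minimal software sketch of the idea (purely illustrative – the real hardware computes in analog, and the weight values here are invented): a layer of leaky, neuron-like elements whose synapses are each reduced to a single adjustable strength, in the spirit of the one-transistor synapse described above.

```python
import numpy as np

def step(v, weights, spikes_in, leak=0.9, threshold=1.0):
    """Advance a layer of leaky integrate-and-fire neurons one time step.

    v         : membrane potentials, shape (n_neurons,)
    weights   : synapse strengths, shape (n_neurons, n_inputs) --
                each entry is the single adjustable parameter per synapse
    spikes_in : 0/1 input spike vector, shape (n_inputs,)
    """
    v = leak * v + weights @ spikes_in   # accumulate weighted input, with decay
    fired = v >= threshold               # neurons over threshold emit a spike
    v = np.where(fired, 0.0, v)          # reset the neurons that fired
    return v, fired.astype(int)

# Two neurons, three inputs; the synapse strengths are freely programmable,
# which is what the FPAA's hundreds of thousands of parameters correspond to.
w = np.array([[0.6, 0.5, 0.0],
              [0.0, 0.2, 0.1]])
v = np.zeros(2)
v, out = step(v, w, np.array([1, 1, 0]))   # neuron 0 crosses threshold and fires
```

In an FPAA these weighted sums and thresholds are physical currents and voltages rather than array operations, which is where the power savings come from.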

At the moment, she predicts that human brain-equivalent systems will require a reduction in power usage to the point where they are consuming just one-eighth of what the digital supercomputers currently used to simulate neuromorphic systems require. Her own design can account for a four-fold reduction in power usage, but the rest is going to have to come from somewhere else – possibly through the use of better materials (i.e. graphene or one of its derivatives).
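The gap is simple arithmetic: if the goal is one-eighth of today's power draw and the FPAA design supplies a four-fold reduction, a further two-fold saving still has to be found elsewhere.

```python
target_reduction = 8.0   # goal: 1/8 the power of digital simulation
fpaa_reduction = 4.0     # what Hasler's FPAA design accounts for

# the factor left over for better materials or other improvements
remaining = target_reduction / fpaa_reduction
print(remaining)   # → 2.0
```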

Hasler also forecasts that, using soon-to-be-available 10nm processes, a desktop system with human-like processing power that consumes just 50 watts of electricity may eventually be a reality. These will likely take the form of chips with millions of neuron-like elements connected by billions of synapses firing to push each other over the edge – and who’s to say what they will be capable of accomplishing, or what other breakthroughs they will make possible?

In the end, neuromorphic chips and technology are merely one half of the equation. In the grand scheme of things, the aim of all this research is not only to produce technology that can ensure better biology, but also technology inspired by biology to create better machinery. The end result, according to some, is a world in which biology and technology increasingly resemble each other, to the point that there is barely a distinction to be made and they can be merged.

Charles Darwin would roll over in his grave!

Sources: nytimes.com, extremetech.com (2), journal.frontiersin.org, pubs.acs.org

The Future is Here: Bionic Eye Approved by FDA!

After more than 20 years in the making, the Argus II bionic eye was finally approved this past February by the Food and Drug Administration for commercial sale in the US. For people suffering from the rare genetic condition known as retinitis pigmentosa – an inherited, degenerative eye disease that causes severe vision impairment and often blindness – this is certainly good news indeed.

Developed by Second Sight, the Argus II is what is known as a “Retinal Prosthesis System” (RPS) that corrects the main effect of retinitis pigmentosa, which is the diminished ability to distinguish light from dark. While it doesn’t actually restore vision to people who suffer from this condition, it can improve their perceptions of light and dark, and thus identify the movement or location of objects.

The Argus II works by using a series of electrodes implanted onto the retina that are wirelessly connected to a video camera mounted on the eyeglasses. The eye-electrodes use electrical impulses transmitted from the camera to stimulate the part of the retina that allows for image perception. By circumventing the parts of the eye affected by the disease, the bionic device is a prosthetic in every sense of the word.

According to Suber S. Huang, director of the University Hospital Eye Institute’s Center for Retina and Macular Disease, the breakthrough treatment is:

 [R]emarkable. The system offers a profound benefit for people who are blind from RP and who currently have no therapy available to them. Argus II allows patients to reclaim their independence and improve their lives.

Argus II boasts 20-plus years of research, three clinical trials, and more than $200 million in private and public investment behind it. Still, the system has been categorized by the FDA as a humanitarian use device, meaning there is a “reasonable assurance” that the device is safe and its “probable benefit outweighs the risk of illness or injury.”

Good news for people with vision impairment, and a big step in the direction of restoring sight. And of course, a possible step on the road to human enhancement and augmentation. As always, every development that is made in the direction of correcting human impairment offers the future possibility of augmenting otherwise unimpaired human beings.

As such, it might not be long before there are devices that can give the average human the ability to see in the invisible spectrum, such as IR and ultra-violet frequencies. Perhaps also something that can detect x-rays, gamma ray radiation, and other harmful particles. Given that the very definition of cyborg is “a being with both organic and cybernetic parts”, the integration of this device arguably marks the birth of the cybernetic age.

And be sure to check out this promotional video by Second Sight showing how the device works:

Source: news.cnet.com

The Singularity: The End of Sci-Fi?

The coming Singularity… the threshold where we will essentially surpass all our current restrictions and embark on an uncertain future. For many, it’s something to be feared, while for others, it’s something regularly fantasized about. On the one hand, it could mean a future where things like shortages, scarcity, disease, hunger and even death are obsolete. But on the other, it could also mean the end of humanity as we know it.

As a friend of mine recently said, in reference to some of the recent technological breakthroughs: “Cell phones, prosthetics, artificial tissue…you sci-fi writers are going to run out of things to write about soon.” I had to admit he had a point. If and when we reach an age where all scientific breakthroughs that were once the province of speculative writing exist, what will be left to speculate about?

To break it down, simply because I love to do so whenever possible, the concept borrows from physics, where the heart of a black hole is described as a “singularity”. It is at this point that all known physical laws, including time and space themselves, coalesce and become a state of oneness, turning all matter and energy into some kind of quantum soup. Nothing beyond the black hole’s veil (the boundary known as the Event Horizon) can be seen, for no means exist to detect anything.

The same principle holds true in this case, at least in theory. Originally coined by mathematician John von Neumann in the mid-1950s, the term served as a description for a phenomenon of technological acceleration causing an eventual unpredictable outcome in society. In describing it, he spoke of the “ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.”

The term was then popularized by science fiction writer Vernor Vinge (A Fire Upon the Deep, A Deepness in the Sky, Rainbows End), who argued that artificial intelligence, human biological enhancement, or brain-computer interfaces could be possible causes of the singularity. In more recent times, the same theme has been picked up by futurist Ray Kurzweil, the man who points to the accelerating rate of change throughout history, with special emphasis on the latter half of the 20th century.

In what Kurzweil described as the “Law of Accelerating Returns”, every major technological breakthrough was preceded by a period of exponential growth. In his writings, he claimed that whenever technology approaches a barrier, new technologies come along to surmount it. He also predicted paradigm shifts will become increasingly common, leading to “technological change so rapid and profound it represents a rupture in the fabric of human history”.
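The kind of growth Kurzweil points to can be illustrated with a toy compounding calculation (the numbers here are purely illustrative, not his data):

```python
def capability(start, growth_per_period, periods):
    """Compound a capability metric at a fixed rate per period."""
    return start * growth_per_period ** periods

# A quantity that merely doubles each period grows roughly a
# thousand-fold in ten periods -- the heart of the "accelerating
# returns" argument: steady exponential growth feels slow at first,
# then overwhelms every linear expectation.
print(capability(1, 2, 10))   # → 1024
```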

Looking into the deep past, one can see indications of what Kurzweil and others mean. Beginning in the Paleolithic Era, some 70,000 years ago, humanity began to spread out from a small pocket in Africa and adopt the conventions we now associate with modern Homo sapiens – including language, music, tools, myths and rituals.

By the time of the “Paleolithic Revolution” – circa 50,000 – 40,000 years ago – we had spread to all corners of the Old World and left evidence of continuous habitation through tools, cave paintings and burials. In addition, all other existing forms of hominids – such as Homo neanderthalensis and Denisovans – became extinct around the same time, leading many anthropologists to wonder if the presence of Homo sapiens wasn’t the deciding factor in their disappearance.

And then came another revolution, this one known as the “Neolithic”, which occurred roughly 12,000 years ago. By this time, humanity had hunted countless species to extinction, had spread to the New World, and had begun turning to agriculture to maintain its population levels. Thanks to the cultivation of grains and the domestication of animals, civilization emerged in three parts of the world – the Fertile Crescent, China and the Andes – independently and simultaneously.

All of this gave rise to more habits we take for granted in our modern world, namely written language, metal working, philosophy, astronomy, fine art, architecture, science, mining, slavery, conquest and warfare. Empires that spanned entire continents rose, epics were written, inventions and ideas forged that have stood the test of time. Henceforth, humanity would continue to grow, albeit with some minor setbacks along the way.

And then by the 1500s, something truly immense happened. The hemispheres collided as Europeans, first in small droves, but then en masse, began to cross the ocean and made it home to tell others what they found. What followed was an unprecedented period of expansion, conquest, genocide and slavery. But out of that, a global age was also born, with empires and trade networks spanning the entire planet.

Hold onto your hats, because this is where things really start to pick up. Thanks to the collision of hemispheres, all the corn, tomatoes, avocados, beans, potatoes, gold, silver, chocolate, and vanilla led to a period of unprecedented growth in Europe, leading to the Renaissance, Scientific Revolution, and the Enlightenment. And of course, these revolutions in thought and culture were followed by political revolutions shortly thereafter.

By the 1700s, another revolution began, this one involving industry and the creation of a capitalist economy. Much like the two that preceded it, it was to have a profound and permanent effect on human history. Coal and steam technology gave rise to modern transportation, cities grew, international travel became as extensive as international trade, and every aspect of society became “rationalized”.

By the 20th century, the shape of the future really began to emerge, and many were scared. Humanity, that once tiny speck of organic matter in Africa, now covered the entire Earth and numbered over one and a half billion. And as the century rolled on, the unprecedented growth continued to accelerate. Within 100 years, humanity went from coal and diesel fuel to electrical power and nuclear reactors. We went from crossing the sea in steam ships to going to the moon in rockets.

And then, by the end of the 20th century, humanity once again experienced a revolution in the form of digital technology. By the time the “Information Revolution” had arrived, humanity had reached 6 billion people, was building hand-held devices that were faster than computers that once occupied entire rooms, and was exchanging more information in a single day than most societies did in an entire century.

And now, we’ve reached an age where all the things we once fantasized about – colonizing the Solar System and beyond, telepathy, implants, nanomachines, quantum computing, cybernetics, artificial intelligence, and bionics – seem to be becoming more true every day. As such, futurists’ predictions, like how humans will one day merge their intelligence with machines or live forever in bionic bodies, don’t seem so farfetched. If anything, they seem kind of scary!

There’s no telling where it will go, and it seems like even the near future has become completely unpredictable. The Singularity looms! So really, if the future has become so opaque that accurate predictions are pretty much impossible to make, why bother? What’s more, will predictions come true even as the writer is writing about them? Won’t that remove all incentive to write about it?

And really, if the future is to become so unbelievably weird and/or awesome that fact will take the place of fiction, will fantasy become effectively obsolete? Perhaps. So again, why bother? Well, I can think of one reason: because it’s fun! And because as long as I can, I will continue to! I can’t predict what course the future will take, but knowing that it’s uncertain and impending makes it extremely cool to think about. And since I’m never happy keeping my thoughts to myself, I shall try to write about it!

So here’s to the future! It’s always there, like the horizon. No one can tell what it will bring, but we do know that it will always be there. So let’s embrace it and enter into it together! We knew what we were in for the moment we first woke up and embraced this thing known as humanity.

And for a lovely and detailed breakdown of the Singularity, as well as when and how it will come in the future, go to futuretimeline.net. And be prepared for a little light reading 😉

Biotech News: Artificial Ears and Bionic Eyes!

Last week was quite the exciting time for the field of biotechnology! Thanks to improvements in 3D printing and cybernetics – the one seeking to use living cells to print organic tissues and the other seeking to merge the synthetic with the organic – the line between artificial and real is becoming blurrier all the time. And as it turns out, two more major developments were announced just last week which have blurred it even further.

The first came from Cornell University, where a team of biotech researchers demonstrated that it was possible to print a replacement ear using a 3D printer and an injection of living cells. Using a process the team refers to as “high-fidelity tissue engineering”, they used the cartilage from a cow for the ear’s interior and overlaid it with artificially generated skin cells to produce a fully-organic replacement.

This process builds on a number of breakthroughs in recent years involving 3D printers, stem cells, and the ability to create living tissue by arranging these cells in prearranged fashions. Naturally, the process is still in its infancy; but once refined, it will allow biomedical engineers to print customized ears for children born with malformed ones, or people who have lost theirs to accident or disease.

What’s more, the Cornell research team also envisions a day in the near future when it’ll be possible to cultivate enough of a person’s own tissue so that the growth and implantation can happen all within the lab. And given the recent breakthrough at the Wake Forest Institute of Regenerative Medicine – where researchers were able to create printed cartilage – it won’t be long before all the bio-materials can be created on-site as well.

The second breakthrough, which also occurred during this past week, took place in Germany, where researchers unveiled the world’s first high-resolution, user-configurable bionic eye. Known officially as the “Alpha IMS retinal prosthesis”, the device comes to us from the University of Tübingen, where scientists have been working for some time to build and improve upon existing retinal prosthetics, such as Argus II – a retinal prosthesis developed by California-based company Second Sight.

Much like its predecessor, the Alpha IMS helps to restore vision by imitating the functions of a normal eye, where light is converted into electrical signals by your retina and then transmitted to the brain via the optic nerve. In an eye that’s been afflicted by macular degeneration or diabetic retinopathy, these signals aren’t generated. Thus, the prosthetic works by essentially replacing the damaged piece of your retina with a computer chip that generates electrical signals that can be understood by your brain.

But of course, the Alpha IMS improves upon previous prosthetics in a number of ways. First, it is connected to your brain via 1,500 electrodes (as opposed to the Argus II’s 60 electrodes) providing unparalleled visual acuity and resolution. Second, whereas the Argus II relies on an external camera to relay data to the implant embedded in your retina, the Alpha IMS is completely self-contained. This allows users to swivel the eye around as they would a normal eye, whereas the Argus II and others like it require the user to turn their head to change their angle of sight.
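The resolution gain follows directly from the electrode counts quoted above:

```python
alpha_ims_electrodes = 1500   # stimulation points in the Alpha IMS
argus_ii_electrodes = 60      # stimulation points in the Argus II

# twenty-five times as many points at which the retina can be stimulated
ratio = alpha_ims_electrodes / argus_ii_electrodes
print(ratio)   # → 25.0
```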

Here too the technology is still in its infancy and has a long way to go before it can outdo the real thing. For the most part, bionic eyes still rely heavily on the user’s brain to make sense of the alien signals being pumped into it. However, thanks to the addition of configurable settings, patients have a degree of control over their perceived environment that most cannot begin to enjoy. So really, it’s not likely to be too long before these bionic implants improve upon the fleshy ones we come equipped with.

Wow, what a week! It seems that these days, one barely has to wait at all to find that the next big thing is happening right under their very nose. I can foresee a future where people no longer fear getting into accidents, suffering burns, or losing their right eye (or left, I don’t discriminate). With the ability to regrow flesh and cartilage, and replace organic tissues with bionic ones, there may yet come a time when a human can have a close shave with death and be entirely rebuilt.

I foresee death sports becoming a hell of a lot more popular in this future… Well, crap on me! And while we’re waiting for this future to occur, feel free to check out this animated video of the Alpha IMS being installed and how it works:


Sources: IO9.com, Extremetech.com

DIY Prosthetics on Demand

The field of prosthetics has seen some rather stark and amazing developments in recent years. And considering the rise in DIY cybernetics, biohacking and 3D printing, it was just a matter of time before a bunch of hobbyists found a way to create their own. And that’s precisely what Ivan Owen and Richard Van As, a special effects artist and a woodworker, have managed to do.

Despite living hundreds of kilometers from each other, these two men managed to collaborate on the creation of an artificial limb. And in an especially heartwarming twist, they did it on demand for a South African boy named Liam, who was born without fingers on his right hand. For some time, they had been working together to create prosthetics relying only on their general know-how and technology that is available to the general public, all the while keeping tabs on their progress and sharing it with the general public through their blog, comingupshorthanded.com.

After stumbling onto this website, Liam’s mother contacted Ivan and Richard and asked if they could create an artificial hand for her son. They obliged and, using a 3D printer, bits of cable, bungee cord returns and rubber thimbles, the two men collaborated over the internet to make it happen. And not only have they changed the life of young Liam, who is capable of doing things he never thought possible, they now hope to do the same for others looking for low-cost prosthetic alternatives.

For years, these two had been working on a “Robohand” together, in part due to the fact that Van As lost his right hand fingers in a woodworking accident. But until now, they had not considered the wider implications of their work. And after talking to Liam’s mom and seeing the difference it made in Liam’s life, they have set up a fundraising page and are taking requests from people looking for devices or who are interested in offering help. Thanks to the open-source nature of the project, a number of improvements have already been made to their designs, with more sure to follow.

In addition to showcasing the trend of DIY device-making and open-source development, this is also good news for anyone in the market for an artificial hand or limb who does not have $10,000 kicking around. That’s the standard price for a prosthetic these days – devices which, despite incredible leaps in sophistication, have not gotten any cheaper! But with the right know-how, and some technical assistance, a person can find their way to a cheap, printed alternative and see similar results.

Overall, prosthetics offer people the opportunity to restore mobility and retain their independence. And now, thanks to the internet and 3D printing capabilities, they can manufacture these devices independently. The power to restore your own mobility is in your own hands… Interesting, and one might even say cosmically convergent!

Rock on Liam! You’ve got a great mom and some talented friends. As for the rest of you, be sure to check out this video of the five-year-old boy in action with his new prosthetic hand.

Sources: IO9.com, comingupshort.com, fundly.com

Criminalizing Transhuman Soldiers

It seems to be the trend these days. You take a prediction that was once the domain of science fiction and treat it as impending science fact. Then you recommend that before it comes to pass, we pre-emptively create some kind of legal framework or organization to deal with it once it does. Thus far, technologies which are being realized have been addressed – such as autonomous drones – but more and more, concepts and technologies which could be real any day now are making the cut.

It all began last year when the organization known as Human Rights Watch and Harvard University teamed up to release a report calling for the ban of “killer robots”. It was soon followed when the University of Cambridge announced the creation of the Centre for the Study of Existential Risk (CSER) to investigate developments in AI, biotechnology, and nanotechnology and determine if they posed a risk.

And most recently, just as the new year began, a report funded by the Greenwall Foundation examined the legal and ethical implications of using biologically enhanced humans on the battlefield. This report was filed in part due to advances being made in biotechnology and cybernetics, but also because of the ongoing and acknowledged efforts by the Pentagon and DARPA to develop super-soldiers.

The report, entitled “Enhanced Warfighters: Risks, Ethics, and Policy”, was written by Keith Abney, Patrick Lin and Maxwell Mehlman of California Polytechnic State University.  The group, which investigates ethical and legal issues as they pertain to the military’s effort to enhance human warfighters, received funding from the Greenwall Foundation, an organization that specializes in biomedicine and bioethics.

In a recent interview, Abney expressed the purpose of the report, emphasizing how pre-emptive measures are necessary before a trend gets out of hand:

“Too often, our society falls prey to a ‘first generation’ problem — we wait until something terrible has happened, and then hastily draw up some ill-conceived plan to fix things after the fact, often with noxious unintended consequences. As an educator, my primary role here is not to agitate for any particular political solution, but to help people think through the difficult ethical and policy issues this emerging technology will bring, preferably before something horrible happens.”

What’s more, he illustrated how measures are necessary now, since projects to develop super-soldiers are already well underway. These include powered exoskeletons designed to increase human strength and endurance, such as Lockheed Martin’s HULC, Raytheon’s XOS, UC Berkeley’s BLEEX, and other projects.

In addition, DARPA has numerous projects on the books designed to enhance a soldier’s abilities with cybernetics and biotech. These include VR contact lenses that enhance normal vision by allowing the wearer to view virtual and augmented reality images without a headset or glasses. There’s also the Cognitive Technology Threat Warning System (CT2WS), a computer-assisted visual aid that instantly identifies threats by augmenting a soldier’s visual faculties.

And in the cognitive realm, there are such programs as Human Assisted Neural Devices (HAND), which seeks to strengthen and restore memories, and the Peak Soldier Performance (PSP) program, which will boost human endurance, both physical and cognitive. But of course, since post-traumatic stress disorder is a major problem, DARPA is also busy at work creating drugs and treatments that can erase memories, something it hopes will give mentally scarred soldiers a new lease on life (and military service!).

And of course, the US is hardly alone in this regard. Every industrialized nation in the world, from the EU to East Asia, is involved in some form of Future Soldier or enhanced-soldier program. And with nations like China and Russia catching up in several key areas – e.g. stealth, unmanned aerial vehicles and aeronautics – the race is on to create a soldier program that will ensure one nation has the edge.

But of course, as Abney himself points out, “enhancement” is a rather subjective term. For example, medical advancements that seek to address disabilities and disorders are being made all the time, and these could also fall into the category of “enhancement”. Such ambiguities need to be ironed out before any legal framework can be devised, which is why Abney and his associates came up with the following definition:

“In the end, we argued that the best definition of an enhancement is that it’s ‘a medical or biological intervention to the body designed to improve performance, appearance, or capability besides what is necessary to achieve, sustain or restore health.’”

Working from this starting point, Abney and his colleagues made the case in their report that the risk such enhancements pose over and above what is required for normal health helps explain their need for special moral consideration.

These considerations include, but are not limited to: first, the issue of consent, and whether or not a soldier voluntarily submits to enhancement. Second, there is the issue of long-term effects, and whether or not a soldier is made aware of them. And third, there is the issue of what will happen to these people if and when they retire from the services and attempt to reintegrate into normal society.

It’s complicated, and if enhancement is something the powers that be are determined to pursue, then these issues need to be addressed before they become a going concern. The last thing we need is a whole bunch of enhanced soldiers wandering around the countryside, unable to turn off their augmented killer instincts and superhuman strength. Or, at the very least, it would be good to know we have some kind of procedure in place in case they do!

What do you think of when you hear the word “super soldier”? Yeah, me too!

Source: IO9.com