Judgement Day Update: Banning Autonomous Killing Machines

Drone warfare is one of the most controversial issues facing the world today. In addition to ongoing concerns about a lack of transparency and who’s making the life-and-death decisions, there have also been serious concerns about the cost in civilian lives, and the efforts of both the Pentagon and the US government to keep this information from the public.

This past October, the testimony of a Pakistani family to Congress helped to put a human face on the issue. Rafiq ur Rehman, a Pakistani primary school teacher, described how his mother, Momina Bibi, had been killed by a drone strike. His two children – Zubair and Nabila, aged 13 and 9 – were also injured in the attack, which took place on October 24th of this year.

This testimony occurred shortly after the publication of an Amnesty International report, which listed Bibi among 900 other civilians they say have been killed by drone strikes since 2001. Not only is this number far higher than previously reported, but the report claims that the US may have committed war crimes and should stand trial for its actions.

Already, efforts have been mounted to put limitations on drone use and development within the US. Last year, Human Rights Watch and Harvard University released a joint report calling for the preemptive ban of “killer robots”. Shortly thereafter, Deputy Defense Secretary Ashton Carter signed a series of instructions to “minimize the probability and consequences of failures that could lead to unintended engagements.”

However, these efforts officially became international in scope when, on Monday, October 21st, a growing number of human rights activists, ethicists, and technologists converged on the United Nations Headquarters in New York City to call for an international agreement that would ban the development and use of fully autonomous weapons technology.

Known as the “Campaign to Stop Killer Robots,” this international coalition, formed this past April, has demanded that autonomous killing machines be treated like other tactics and tools of war that have been banned under the Geneva Convention – such as chemical weapons or anti-personnel landmines.

As Jody Williams, a Nobel Peace Prize winner and a founding member of the group, said:

If these weapons move forward, it will transform the face of war forever. At some point in time, today’s drones may be like the ‘Model T’ of autonomous weaponry.

According to Noel Sharkey, an Irish computer scientist who is chair of the International Committee for Robot Arms Control, the list of challenges in developing autonomous robots is enormous. They range from the purely technological, such as the ability to properly identify a target using grainy computer vision, to ones that involve fundamental ethical, legal, and humanitarian questions.

As the current drone campaign has shown repeatedly, a teenage insurgent is often hard to distinguish from a child playing with a toy. What’s more, in all engagements in war, there is what is called the “proportionality test” – whether the civilian risks outweigh the military advantage of an attack. At present, no machine exists that would be capable of making these distinctions and judgement calls.

Despite these challenges, militaries around the world – including China, Israel, Russia, and especially the U.S. – are enthusiastic about developing and adopting technologies that will take humans entirely out of the equation, often citing the potential to save soldiers’ lives as a justification. According to Williams, without preventative action, the writing is on the wall.

Consider the U.S. military’s X-47 aircraft, which can take off, land, and refuel on its own and has weapons bays, as evidence of the trend towards greater levels of autonomy in weapons systems. Similarly, the U.K. military is collaborating with BAE Systems to develop a drone called the Taranis, or “God of Thunder,” which can fly faster than the speed of sound and select its own targets.

The Campaign to Stop Killer Robots, a coalition of international and national NGOs, may have only launched recently, but its individual member groups have been working to raise awareness for the last few years. Earlier this month, 272 engineers, computer scientists and roboticists signed onto the coalition’s letter calling for a ban. In addition, the U.N. has already expressed concern about the issue.

For example, the U.N. Special Rapporteur issued a report to the General Assembly back in April that recommended states establish national moratoriums on the development of such weapons. The coalition hopes to follow up on this by asking other nations to join those already seeking to start early talks on the issue at the U.N. General Assembly First Committee on Disarmament and International Security meeting in New York later this month.

On the plus side, there is a precedent for a “preventative ban”: blinding lasers were never used in war, because they were preemptively included in a treaty. On the downside, autonomous weapons technology is not an easily defined system, which makes it more difficult to legislate. If a ban is to be applied, knowing where it begins and ends, and what loopholes exist, is something that will have to be ironed out in advance.

What’s more, there are alternatives to a ban, such as regulation and limitations. Allowing states to develop machines that are capable of handling themselves in non-combat situations, but which require a human operator to green-light the use of weapons, is something the US military has already claimed it is committed to. As far as international law is concerned, this represents a viable alternative to putting a stop to all research.

Overall, it is estimated that we are at least a decade away from a truly autonomous machine of war, so there is time for the law to evolve and prepare a proper response. In the meantime, there is also plenty of time to address the current use of drones and all its consequences. I’m sure I speak for more than myself when I say that I hope it gets better before it gets worse.

And in the meantime, be sure to enjoy this video produced by Human Rights Watch:

Sources: fastcoexist.com, theguardian.com, stopkillerrobots.org

Criminalizing Transhuman Soldiers

It seems to be the trend these days: you take a prediction that was once the domain of science fiction and treat it as impending science fact, then recommend that, before it comes to pass, we pre-emptively create some kind of legal framework or organization to deal with it once it does. Thus far, technologies which are actively being realized – such as autonomous drones – have been addressed, but more and more, concepts and technologies which could be real any day now are making the cut.

It all began last year when the organization known as Human Rights Watch and Harvard University teamed up to release a report calling for the ban of “killer robots”. It was soon followed when the University of Cambridge announced the creation of the Centre for the Study of Existential Risk (CSER) to investigate developments in AI, biotechnology, and nanotechnology and determine if they posed a risk.

And most recently, just as the new year began, a report funded by the Greenwall Foundation examined the legal and ethical implications of using biologically enhanced humans on the battlefield. This report was commissioned in part due to advances being made in biotechnology and cybernetics, but also because of the ongoing and acknowledged efforts by the Pentagon and DARPA to develop super-soldiers.

The report, entitled “Enhanced Warfighters: Risks, Ethics, and Policy”, was written by Keith Abney, Patrick Lin and Maxwell Mehlman of California Polytechnic State University. The group, which investigates ethical and legal issues as they pertain to the military’s effort to enhance human warfighters, received funding from the Greenwall Foundation, an organization that specializes in biomedicine and bioethics.

In a recent interview, Abney expressed the purpose of the report, emphasizing how pre-emptive measures are necessary before a trend gets out of hand:

“Too often, our society falls prey to a ‘first generation’ problem — we wait until something terrible has happened, and then hastily draw up some ill-conceived plan to fix things after the fact, often with noxious unintended consequences. As an educator, my primary role here is not to agitate for any particular political solution, but to help people think through the difficult ethical and policy issues this emerging technology will bring, preferably before something horrible happens.”

What’s more, he illustrated how measures are necessary now, since projects to develop super soldiers are already well underway. These include powered exoskeletons designed to increase human strength and endurance, such as Lockheed Martin’s HULC, Raytheon’s XOS, UC Berkeley’s BLEEX, and others.

In addition, DARPA has numerous projects on the books designed to enhance a soldier’s abilities with cybernetics and biotech. These include VR contact lenses, which enhance normal vision by allowing a wearer to view virtual and augmented reality images without a headset or glasses. There’s also the Cognitive Technology Threat Warning System (CT2WS), a computer-assisted visual aid that instantly identifies threats by augmenting the wearer’s visual faculties.

And in the cognitive realm, there are such programs as Human Assisted Neural Devices (HAND), which seeks to strengthen and restore memories, and the Peak Soldier Performance (PSP) program, which aims to boost human endurance, both physical and cognitive. But of course, since post-traumatic stress disorder is a major problem, DARPA is also busy at work creating drugs and treatments that can erase memories, something which they hope will give mentally scarred soldiers a new lease on life (and military service!)

And of course, the US is hardly alone in this regard. Every industrialized nation in the world, from the EU to East Asia, is involved in some form of Future Soldier or enhanced soldier program. And with nations like China and Russia catching up in several key areas – e.g. stealth, unmanned aerial vehicles and aeronautics – the race is on to create a soldier program that will ensure one nation has the edge.

But of course, as Abney himself points out, “enhancement” is a rather subjective term. For example, medical advancements are being made all the time that seek to address disabilities and disorders, and these could also fall into the category of “enhancement”. Such ambiguities need to be ironed out before any legal framework can be devised, hence Abney and his associates came up with the following definition:

“In the end, we argued that the best definition of an enhancement is that it’s ‘a medical or biological intervention to the body designed to improve performance, appearance, or capability besides what is necessary to achieve, sustain or restore health.’”

Working from this starting point, Abney and his colleagues made the case in their report that the risk such enhancements pose over and above what is required for normal health helps explain their need for special moral consideration.

These include, but are not limited to, the following. First, there is the issue of consent – whether or not a soldier voluntarily submits to enhancement. Second, there is the issue of long-term effects and whether or not a soldier is made aware of them. Third, there is the issue of what will happen to these people if and when they retire from the services and attempt to reintegrate into normal society.

It’s complicated, and if it’s something the powers that be are determined to do, then these issues need to be addressed before they become a going concern. The last thing we need is a whole bunch of enhanced soldiers wandering around the countryside, unable to turn off their augmented killer instincts and superhuman strength. Or, at the very least, it would be good to know we have some kind of procedure in place in case they do!

What do you think of when you hear the word “super soldier”? Yeah, me too!

Source: IO9.com

Scientists Raise the Alarm on Human Enhancements

The concept of technological progress and its potential consequences has been the subject of quite a bit of attention lately. First, there was the announcement from Harvard University and Human Rights Watch that a ban on killer robots was needed before the current pace of innovation led to machines that could kill without human oversight.

Then came the University of Cambridge’s announcement about the creation of the Centre for the Study of Existential Risk (CSER) to evaluate new technologies. And last, there was the news that the DOD had signed a series of instructions to “minimize the probability and consequences of failures that could lead to unintended engagements,” starting at the design stage.

Concordantly, back in early November, the Royal Society, along with the Academy of Medical Sciences, the British Academy, and the Royal Academy of Engineering, concluded a workshop called “Human Enhancement and the Future of Work,” in which they considered the growing impact and potential risks of augmentation technologies. In their final report, they raised serious concerns about the burgeoning trend and how humanity is moving from a model of therapy to one in which human capacities are greatly improved. The implications, they concluded, should be part of a much wider public discussion.

Specifically, the report raised concerns about drugs and digital enhancements that will allow people to work longer, harder and faster. Such technologies could easily give rise to a culture of enhanced competitiveness, more pronounced than anything we currently know, where the latest in cybernetics, bionics and biomedical devices are used to gain an edge, not to remedy medical problems. Currently, things like bionic prostheses are being created to aid amputees and injury victims; but as the technology improves and such devices become more effective than organic limbs, their purpose could change.

What’s more, there are the ethical implications of having such technology available to human beings. If people can upgrade their bodies to enhance their natural abilities, what will it mean for those who get “left behind”? Will the already enormous gulf between the rich and poor expand even further and take on a new dimension? Will those who want to succeed in the business world be forced to scrounge so they can afford the latest upgrades?

Or, as the panel’s final report put it:

“Work will evolve over the next decade, with enhancement technologies potentially making a significant contribution. Widespread use of enhancements might influence an individual’s ability to learn or perform tasks and perhaps even to enter a profession; influence motivation; enable people to work in more extreme conditions or into old age, reduce work-related illness; or facilitate earlier return to work after illness.”

At the same time, however, they acknowledged the potential efficacy of and demand for such technologies, prompting the call for open discourse. Again, from the report:

“Although enhancement technologies might bring opportunities, they also raise several health, safety, ethical, social and political challenges, which warrant proactive discussion. Very different regulatory regimes are currently applied: for example, digital services and devices (with significant cognitive enhancing effects) attract less, if any, regulatory oversight than pharmacological interventions. This raises significant questions, such as whether any form of self-regulation would be appropriate and whether there are circumstances where enhancements should be encouraged or even mandatory, particularly where work involves responsibility for the safety of others (e.g. bus drivers or airline pilots).”

In many ways, this report is overdue, as it offers some rather obvious commentary on a topic that has been the subject of speculation and fiction for some time. For example, in the Sprawl Trilogy, William Gibson explored the idea of human enhancement and the disparity between rich and poor at length. In his world, the rich were ensured clinical immortality through AI and biotech, while everyone else was forced to spend their savings just to afford the latest tech, merely so they could stay in the running.

That said, just about all of the panel’s recommendations were entirely appropriate. They included further investigations into ensuring safety, affordability, and accessibility, not to mention the suggestion that some of these enhancement technologies – be they pharmaceuticals, regenerative medicines, or cybernetics – should be regulated by the government. This last point is especially appropriate given the potential for personal misuse, not to mention exploitation by employers.

With all the harm that could result from technologies that could render human beings “postmortal” or “posthuman”, some degree of oversight is certainly necessary. But of course, the real key is a public educated and informed on the issues of cybernetics, bionics, and human enhancement, and what they could mean for us. As with so much else, the issue is one of choice, and awareness of what the consequences could be. Choose wisely, that’s the only guarantee! Hey, that rhymed… I smell a quote!

Source: IO9.com

Planning For Judgement Day…

Some very interesting things have been taking place in the last month, all of them concerning the possibility that humanity may someday face extinction at the hands of killer AIs. The first took place on November 19th, when Human Rights Watch and Harvard University teamed up to release a report calling for the ban of “killer robots”, a preemptive move to ensure that we as a species never develop machines that could one day turn against us.

The second came roughly a week later, when the Pentagon announced that measures were being taken to ensure that wherever robots do kill – as with drones, remote killer bots, and cruise missiles – the controller will always be a human being. Yes, while Americans were preparing for Thanksgiving, Deputy Defense Secretary Ashton Carter signed a series of instructions to “minimize the probability and consequences of failures that could lead to unintended engagements,” starting at the design stage.

X-47A Drone, the latest “hunter-killer”

And then most recently, and perhaps in response to Harvard’s and HRW’s declaration, the University of Cambridge announced the creation of the Centre for the Study of Existential Risk (CSER). This new body, which is headed up by such luminaries as Huw Price, Martin Rees, and Skype co-founder Jaan Tallinn, will investigate whether recent advances in AI, biotechnology, and nanotechnology might eventually trigger some kind of extinction-level event. The Centre will also look at anthropogenic (human-caused) climate change, as it might not be robots that eventually kill us, but a swelteringly hot climate instead.

All of these developments stem from the same thing: ongoing developments in the fields of computer science, remote systems, and AI. Thanks in part to the creation of the Google Neural Net, increasingly sophisticated killing machines, and predictions that it is only a matter of time before they are capable of making decisions on their own, there is some worry that machines programmed to kill will be able to do so without human oversight. By creating bodies that can make recommendations on the application of such technologies, it is hoped that ethical conundrums and threats can be nipped in the bud. And by legislating that human agency always be the deciding factor, it is further hoped that such a scenario will never come to pass.

The question is, is all this overkill, or does it make perfect sense given the direction military technology and the development of AI are taking? Or, as a third possibility, might it not go far enough? Given the possibility of a “Judgement Day”-type scenario, might it be best to ban all AIs and autonomous robots altogether? Hard to say. All I know is, it’s exciting to live in a time when such things are being seriously contemplated, and are not merely restricted to the realm of science fiction.