The Future is Here: Overcoming Paralysis

Ian Burkhart, a 23-year-old quadriplegic from Dublin, Ohio, was injured in 2010 in a diving accident, breaking his neck on a sandbar and paralyzing his body from the neck down. He was left with some use of his arms, but lost the use of his legs, hands, and fingers. Thanks to a new device known as the Neurobridge though – a device that allows the brain's signals to bypass the severed spinal cord – Burkhart has now moved his right hand and fingers for the first time since the accident.

This device, which was developed jointly by the Ohio State University Wexner Medical Center and the non-profit research organization Battelle, consists of a pea-sized chip containing an array of 96 electrodes, which allows researchers to read detailed neural activity emanating from the patient's brain. The chip was implanted two months ago, when neurosurgeon Dr Ali Rezai of Ohio State University performed the surgery that placed the sensor in the motor cortex of Burkhart's brain.

Battelle has been working on neurosensing technology for almost a decade. As Chad Bouton, the leader of the Neurobridge project at Battelle, explains:

We were having such success in decoding brain activity, we thought, ‘Let’s see if we could remap the signals, go around something like a spinal cord injury and then translate the signals into something that the muscles could understand and help someone paralyzed regain control of their limb’.

During the test, which occurred in June, the implanted chip read and interpreted the electrical activity in Burkhart’s brain and sent it to a computer. The computer then recoded the signal, and sent it to a high-definition electrode stimulation sleeve Burkhart wore on his right arm, a process that took less than a tenth of a second and allowed Burkhart to move his paralysed fingers. Basically, Burkhart is able to move his hand by simply thinking about moving his hand, and the machine does the rest.
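For readers who like to see ideas in code, the read-decode-stimulate loop described above can be sketched in a few lines. Everything here – the function names, the threshold, and the channel counts – is invented for illustration; Battelle's actual system decodes 96-channel data with trained machine-learning models.

```python
# A minimal, illustrative sketch of the Neurobridge signal path:
# read neural activity -> decode intent -> drive the stimulation sleeve.
import time

def decode_motor_intent(neural_samples):
    """Stand-in decoder: map raw electrode readings to an intended movement."""
    # Pretend a simple threshold on mean activity separates "open hand"
    # from "rest" -- the real system uses trained decoding models.
    mean_activity = sum(neural_samples) / len(neural_samples)
    return "open_hand" if mean_activity > 0.5 else "rest"

def build_stimulation_pattern(intent):
    """Translate a decoded intent into per-channel sleeve intensities."""
    patterns = {
        "open_hand": [0.8, 0.6, 0.7, 0.0],  # hypothetical extensor channels
        "rest":      [0.0, 0.0, 0.0, 0.0],
    }
    return patterns[intent]

# Simulated burst of activity from the implant (4 of 96 channels shown).
samples = [0.9, 0.7, 0.8, 0.6]
start = time.perf_counter()
intent = decode_motor_intent(samples)
sleeve_output = build_stimulation_pattern(intent)
elapsed = time.perf_counter() - start

print(intent, sleeve_output)  # open_hand [0.8, 0.6, 0.7, 0.0]
assert elapsed < 0.1  # the article notes the whole loop runs in under 0.1 s
```

The point of the sketch is the division of labor: the implant only records, the computer does the recoding, and the sleeve only stimulates.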

A team led by Chad Bouton at Battelle spent nearly a decade developing the algorithms, software and sleeve. Then, just two years ago, Dr Ali Rezai and Dr Jerry Mysiw were brought on board to design the clinical trials. Burkhart became involved with the study after his doctor mentioned it to him and he learned he was an ideal candidate. He had the exact level of injury the researchers were looking for, is young and otherwise healthy, and lives close to the Ohio State University Wexner Medical Center, where the research is being conducted.

Even so, Burkhart had to think hard before agreeing to the surgery. He also knew that the surgery wouldn’t magically give him movement again. He would have to undergo rigorous training to regain even basic hand function. Mainly, his experience would help move along future technological advances. However, he was excited to be taking part in cutting-edge research which would ultimately help people like him who have suffered from spinal injuries and paralysis.

Post-surgery, Burkhart still had a lot of thinking to do – this time, in order to move his hand. As he explained:

It’s definitely great for me to be as young as I am when I was injured because the advancements in science and technology are growing rapidly and they’re only going to continue to increase… Mainly, it was just the fact that I would have to have brain surgery for something that wasn’t needed… Anyone able bodied doesn’t think about moving their hand, it just happens. I had to do lots of training and coaching.

The hand can make innumerable complex movements with the wrist, the fingers, and the fist. In order for Battelle’s software to read Ian’s mind, it has to look for subtle changes in the signals coming from Ian’s brain. As Bouton explains it, the process is like walking into a crowded room with hundreds of people trying to talk to each other, and you’re trying to isolate one particular conversation in a language that you don’t understand.

At this point, Burkhart can perform a handful of movement patterns, including moving his hand up and down, opening and closing it, rotating it, and drumming on a table with his fingers. All of this can only be done while he's in the hospital, hooked up to the researchers' equipment. But the ultimate goal is to create a device and software package that he can take with him, giving him the ability to bypass his injury and use his hand freely during everyday activities.

This isn’t the only research looking into bringing movement back to the paralyzed. In the past, paralyzed patients have been given brain-computer interfaces, but they have only been able to control artificial limbs – i.e. Zak Water’s mind-controlled leg or the BrainGate’s device that allow stroke victims to eat and drink using a mind-controlled robotic arm. Participants in an epidural stimulator implant study have also been able to regain some movement in their limbs, but this technology works best on patients with incomplete spinal cord injuries.

Burkhart is confident that he can regain even more movement in his hand, and the researchers are approved to try the technology out on four more patients. Ultimately, the system will only be commercially workable with a wireless neural implant, or an EEG headset – like the Emotiv, Insight or Neurosky headsets. The technology is also being considered for stroke rehabilitation, another area where EEG and mind-control technology are being explored as a means to recovery.

From restoring ambulatory ability through mind-controlled limbs and neurosensing devices to rehabilitating stroke victims with mind-reading software, the future is fast shaping up to be a place where no injuries are permanent and physical disabilities and neurological impairments are a thing of the past. I think I can safely speak for everyone when I say that watching these technologies emerge makes it an exciting time to be alive!

And be sure to check out this video from the OSUW Medical Center that shows Ian Burkhart and the Battelle team testing the Neurobridge:


Sources: cnet.com, fastcoexist.com

The Future is Here: Deka Mind-Controlled Arm Gets FDA Approval!

For years, biomedical researchers have been developing robotic prosthetics of greater and greater sophistication. From analog devices that can be quickly and cheaply manufactured by a 3-D printer, to mind-controlled prosthetics that move, to ones that both move and relay sensory information, the technology is growing by leaps and bounds. And just last week, the FDA officially announced it had approved the first prosthetic arm that's capable of performing multiple simultaneous powered movements.

The new Deka arm – codenamed Luke, after Luke Skywalker’s artificial hand – was developed by Dean Kamen, inventor of the Segway. The project began in 2006 when DARPA funded multiple research initiatives in an attempt to create a better class of prosthetic device for veterans returning home from the Iraq War. Now, the FDA’s approval is a huge step for the Deka, as it means the devices are now clear for sale — provided the company can find a commercial partner willing to bring them to market.

Unlike most other prosthetics, the Deka Arm System is a battery-powered device that combines multiple control approaches. Some of the Deka's functions are controlled by myoelectricity, which means the device senses movement in various muscle groups via attached electrodes, then converts those muscle movements into motor control. This gives the user a more natural and intuitive method of controlling the arm than relying on a cross-body pulley system.

The more advanced myoelectric systems can even transmit sensation back to the user, using the same system of electrodes to simulate pressure sensation. This type of control flexibility is essential to creating a device that can address the wide range of needs of various amputees, and the Deka's degree of fine-grained control is remarkable. Not only are users able to perform a wide range of movements and articulations with the hand, they are able to sense what they are doing thanks to the small pads on the fingertips and palm.

Naturally, the issue of price remains, and it is the greatest challenge facing the wide-scale adoption of these types of devices. A simple prosthetic arm is likely to cost $3,000, while a sophisticated prosthesis can run as much as $50,000. In many cases, limbs have a relatively short lifespan, with wear and tear requiring a replacement device every 3 to 4 years. This is why 3-D printed variations, which do not boast much sophistication, are considered a popular option.

Visual presentation is also a major issue, as amputees often own multiple prostheses (including cosmetic ones) simply to avoid the embarrassment of wearing an obviously artificial limb. That's one reason why the Deka Arm System's design has evolved towards a much more normal-looking hand. Many amputees don't want to wear a crude-looking mechanical device.

At present, the prosthetic market is still too broad, and the needs of amputees too specific to declare any single device as a one-size-fits-all success. But the Deka looks as though it could move the science of amputation forward and offer a significant number of veterans and amputees a device that more closely mimics natural human function than anything we’ve seen before. What’s more, combined with mind-controlled legs, bionic eyes and replacement organs, it is a major step forward in the ongoing goal of making disability a thing of the past.

And in the meantime, check out this DARPA video of the Deka Arm being tested:

 


Source: extremetech.com

News in Bionics: Restoring Sensation and Mobility!

It seems like I've been writing endlessly about bionic prosthetics lately, thanks to the many breakthroughs that have been happening almost back to back. But I would be remiss if I didn't share these latest two. In addition to showcasing some of the latest technological innovations, these stories are inspiring and show the immense potential bionic prosthetics have to change lives and help people recover from terrible tragedies.

For instance, on the TED stage this week in Vancouver – which included presentations from astronaut Chris Hadfield, NSA whistleblower Edward Snowden, and anti-corruption activist Charmian Gooch – there was one presentation that really stole the show: Adrianne Haslet-Davis, a former dance instructor and a survivor of the Boston Marathon bombing, dancing again for the first time. And it was all thanks to a bionic limb developed by noted bionics researcher Hugh Herr.

As the director of the Biomechatronics Group at the MIT Media Lab, Herr is known for his work on high-tech bionic limbs and for demonstrating new prosthetic technologies on himself. At 17, he lost both his legs in a climbing accident. After discussing the science of bionic limbs, Herr brought out Adrianne, who for the first time since her leg amputation, performed a short ballroom dancing routine.

This was made possible thanks to a special kind of bionic limb designed by Herr and his colleagues at MIT specifically for dancing. The design process took over 200 days, during which the researchers studied dance, brought in dancers with biological limbs, studied how they moved, and examined the forces they applied on the dance floor. What resulted was a "dance limb" with 12 sensors, a synthetic motor system that can move the joint, and microprocessors that run the limb's controllers.

The system is programmed so that the motor moves the limb in a way that's appropriate for dance. As Herr explained in a briefing after his talk:

It was so new. We had never looked at something like dance. I understand her dream and emotionally related to her dream to return to dance. It's similar to what I went through.

Herr says he's now able to climb at a more advanced level than when he had biological legs.

Haslet-Davis’s new limb is only intended for dancing; she switches to a different bionic limb for regular walking. And while this might seem like a limitation, it in fact represents a major step in the direction of bionics that can emulate a much wider range of human motion. Eventually, Herr envisions a day when bionic limbs can switch modes for different activities, allowing a person to perform a range of different tasks – walking, running, dancing, athletic activity – without having to change prosthetics.

In the past, Herr's work has been criticized by advocates who argue that bionic limbs are a waste of time when many people don't even have access to basic wheelchairs. He argues, however, that bionic limbs – which can cost as much as a nice car – ultimately reduce health care costs. For starters, they allow people to return to their jobs quickly, Herr said, thus avoiding workers' compensation costs.

They can also prevent injuries resulting from prosthetics that don’t emulate normal function as effectively as high-tech limbs. And given the fact that the technology is becoming more widespread and additive manufacturing is leading to lower production costs, there may yet come a day when a bionic prosthetic is not beyond the means of the average person. Needless to say, both Adrianne and the crowd were moved to tears by the moving and inspiring display!

Next, there's the inspiring story of Igor Spetic, a man who lost his right arm three years ago in a workplace accident. Like most people forced to live with the loss of a limb, he quickly came to understand the limitations of prosthetics. While they do restore some degree of ability, the fact that they cannot convey sensation means that wearers are often unaware when they have dropped or crushed something.

Now, Spetic is one of several people taking part in early trials at the Cleveland Veterans Affairs Medical Center, where researchers from Case Western Reserve University are working on prosthetics that offer sensation as well as ability. In a basement lab, the trials consist of connecting his arm to a prosthetic hand, one that is rigged with force sensors plugged into 20 wires protruding from his upper right arm.

These wires lead to three surgically implanted interfaces, seven millimeters long, with as many as eight electrodes apiece encased in a polymer, that surround three major nerves in Spetic's forearm. Meanwhile, a nondescript white box of custom electronics translates information from the sensors on Spetic's prosthesis into a series of electrical pulses that the interfaces can convert into sensations.

According to the trial’s leader, Dustin Tyler – a professor of biomedical engineering at Case Western Reserve University and an expert in neural interfaces – this technology is “20 years in the making”. As of this past February, the implants had been in place and performing well in tests for more than a year and a half. Tyler’s group, drawing on years of neuroscience research on the signaling mechanisms that underlie sensation, has developed a library of patterns of electrical pulses to send to the arm nerves, varied in strength and timing.
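To make the idea of a "library of patterns of electrical pulses... varied in strength and timing" concrete, here is a minimal sketch. The sensation names echo the article, but every parameter value and the simple pulse-train expansion are assumptions for illustration, not Case Western's actual data.

```python
# Illustrative "pattern library": each sensation maps to a pulse recipe
# defined by strength (amplitude), timing (rate), and pulse width.
# All numbers are invented for the sketch.

pattern_library = {
    "cotton_ball": {"amplitude_ma": 0.4, "pulse_hz": 20,  "width_us": 100},
    "pen_tip":     {"amplitude_ma": 0.9, "pulse_hz": 60,  "width_us": 150},
    "sandpaper":   {"amplitude_ma": 0.7, "pulse_hz": 120, "width_us": 80},
}

def pulses_for_sensation(sensation, duration_s):
    """Expand a library entry into a list of (time, amplitude) pulse events."""
    p = pattern_library[sensation]
    period = 1.0 / p["pulse_hz"]
    n = int(duration_s * p["pulse_hz"])
    return [(round(i * period, 4), p["amplitude_ma"]) for i in range(n)]

train = pulses_for_sensation("pen_tip", duration_s=0.1)
print(len(train), train[0])  # 6 (0.0, 0.9) -- six 60 Hz pulses in 0.1 s
```

The real library would be far richer, but the principle is the same: distinct stimulus recipes evoke distinct, nameable sensations.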

Spetic says that these different stimulus patterns produce distinct and realistic feelings in 20 spots on his prosthetic hand and fingers. The sensations include pressing on a ball bearing, pressing on the tip of a pen, brushing against a cotton ball, and touching sandpaper. During the first day of tests, Spetic noticed a surprising side effect: his phantom fist felt open, and after several months the phantom pain was "95 percent gone".

To test the hand’s ability to provide sensory feedback, and hence aid the user in performing complex tasks, Spetic and other trial candidates were tasked with picking up small blocks that were attached to a table with magnets, as well as handling and removing the stems from a bowl of cherries. With sensation restored, he was able to pick up cherries and remove stems 93 percent of the time without crushing them, even blindfolded.

While the results are impressive, Tyler estimates that completing the pilot study, refining stimulation methods, and launching full clinical trials is likely to take 10 years. He is also finishing development of an implantable electronic device to deliver stimuli, so that the technology can make it beyond the lab and into a household setting. Lastly, he is working with manufacturers of prostheses to integrate force sensors and force-processing technology directly into future versions of the devices.

As for Spetic, he has drawn quite a bit of inspiration from the trials and claims that they have left him thinking wistfully about what the future might bring. As he put it, he feels:

…blessed to know these people and be a part of this. It would be nice to know I can pick up an object without having to look at it, or I can hold my wife’s hand and walk down the street, knowing I have a hold of her. Maybe all of this will help the next person.

This represents merely one of several successful attempts to merge nerve stimulation with nerve control, leading to bionic limbs that not only obey the user's commands, but provide sensory feedback at the same time. Given a few more decades of testing and development, we will most certainly be looking at an age where bionic limbs that are virtually indistinguishable from the real thing exist and are readily available.

And in the meantime, enjoy this news story of Adrianne Haslet-Davis performing her ballroom dance routine at TED. I’m sure you’ll find it inspiring!


Sources: fastcoexist.com, technologyreview.com, blog.ted.com

The Future is Here: VR Taste Buds and Google Nose

One of the most intriguing and fastest-growing aspects of digital media is the possibilities it offers for augmenting reality. Currently, that means overlaying images or text on top of the real world through the use of display glasses or projectors. But in time, the range of possibilities might expand far beyond the visual range, incorporating the senses of taste and smell.

That's where devices like the Digital Taste Interface come into play. Developed by Nimesha Ranasinghe, an electrical engineer and the lead researcher of the team at the National University of Singapore, this new technology seeks to combine the worlds of virtual reality and gustation. As Ranasinghe explained in a recent interview with fastcompany.com:

Gustation is one of the fundamental and essential senses, [yet] it is almost unheard of in Internet communication, mainly due to the absence of digital controllability over the sense of taste. To simulate the sensation of taste digitally, we explored a new methodology which delivers and controls primary taste sensations electronically on the human tongue.

The method involves two main modules, the first being a control system which formulates different properties of stimuli – basically, levels of current, frequency, and temperature. These combine to provide thermal changes and electrical stimulation that simulate taste sensations, which are in turn delivered by the second module. This is the tongue interface, which consists of two thin, metal electrodes.

According to Ranasinghe, during the course of clinical trials, subjects reported a range of taste experiences, from sour, salty and bitter sensations to minty, spicy, and sweet. But to successfully communicate between the control systems and sensors, Ranasinghe and the team created a new language format, known as TasteXML (TXML), which specifies the format of specific taste messages.
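The team has named TasteXML (TXML) but not published its schema, so the message below is purely a guess at what such a format might look like – the element and attribute names are invented – built with Python's standard XML tools:

```python
# Hypothetical TXML message: a "sour" stimulus described by the three
# properties the article mentions (current, frequency, temperature).
import xml.etree.ElementTree as ET

msg = ET.Element("txml")
taste = ET.SubElement(msg, "taste", name="sour")
ET.SubElement(taste, "current", ma="0.18")        # stimulation current
ET.SubElement(taste, "frequency", hz="800")       # pulse frequency
ET.SubElement(taste, "temperature", celsius="22") # thermal component

encoded = ET.tostring(msg, encoding="unicode")
print(encoded)
```

A receiver on the tongue-interface side would parse such a message and set its electrodes accordingly – exactly the control/delivery split the article describes.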

While the team is currently in negotiations to make the technology commercially available, there are a few pressing updates in the works for the Digital Taste Interface. The first is a more appealing way to wear the tongue sensors, which currently have to be attached while the mouth is open. To that end, they want an interface that can be held in the mouth – called the "digital lollipop" because it looks like the candy.

In addition to making the system more aesthetically pleasing and appetizing, this will allow for a deeper understanding of how electrical stimulation affects taste sensors on different parts of the tongue. The team also wants to incorporate smell and texture into the experience, to further extend the range of sensations and create a truly immersive virtual experience.

Ultimately, the Digital Taste Interface has many potential benefits and applications, ranging from medical advances to diet regimens and video games. As Ranasinghe explains:

We are exploring different domains such as entertainment (taste changing drink-ware and accessories) and medical (for patients who lost the sense of taste or have a diminished sense of taste). However, our main focus is to introduce the sensation of taste as a digitally controllable media, especially to facilitate virtual and augmented reality domains.

So in the coming years, do not be surprised if virtual simulations come augmented with a full-range of sensory experiences. In addition to being able to interact with simulated environments (i.e. blowing shit up), you may also be able to smell the air, taste the food, and feel like you’re really and truly there. I imagine they won’t even call it virtual reality anymore. More like “alternate reality”!

And of course, there’s a video:


Source:
fastcompany.com

The World's First Brain to Brain Interface!

It finally happened! It seems like only yesterday I was talking about the limitations of Brain to Brain Interfacing (BBI), and how it was still limited to taking place between rats, and between a human and a rat. Actually, it was two days ago, but the point remains. After only a few months of further research, scientists have now performed the first human-to-human interface.

Using a Skype connection, Rajesh Rao, who studies computational neuroscience at the University of Washington, successfully used his mind to control the hand of his colleague, Andrea Stocco. The experiment was conducted on Aug. 12th, less than a month after researchers at Harvard used a non-invasive technique to let a human thought control the movement of a rat's tail.

The operation was quite simple: In his laboratory, Rao put on a skull cap containing electrodes, which was connected to an electroencephalography (EEG) machine. These electrodes read his brainwaves and transmitted them across campus to Stocco who, seated in a separate lab, was equipped with a cap hooked up to a transcranial magnetic stimulation (TMS) machine.

This machine activated a magnetic stimulation coil that was integrated into the cap directly above Stocco's left motor cortex, the part of the brain that controls movements of the hands. Back in Rao's lab, he watched a screen displaying a video game in which the player must tap the spacebar in order to shoot down a rocket, while in Stocco's lab, the computer was linked to that same game.

Instead of tapping the bar, however, Rao merely visualized himself doing so. The EEG detected the electrical impulse associated with that imagined movement, and proceeded to send a signal – via the Skype connection – to the TMS in Stocco's lab. This caused the coil in Stocco's cap to stimulate his left motor cortex, which in turn made his right hand move.

Given that his finger was already resting over the spacebar on his computer, this caused a cannon to fire in the game, successfully shooting down the rocket. Stocco compared the feeling to that of a nervous tic. And to ensure that there was no chance of any outside influence, the Skype feeds were not visible to each other, and Stocco wore noise-cancelling headphones and ear buds.
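The whole one-way chain – imagined movement, EEG detection, network transmission, TMS trigger – can be caricatured in a few lines of Python. The in-memory queue standing in for the Skype link and the boolean "classifier" are, of course, gross simplifications of the real hardware:

```python
# Toy end-to-end sketch of the one-way brain-to-brain pipeline.
from queue import Queue

network_link = Queue()  # stands in for the Skype connection

def eeg_side(imagined_movement: bool):
    """Sender: detect the motor-imagery signal and transmit a trigger."""
    if imagined_movement:
        network_link.put("fire")

def tms_side(game_state: dict):
    """Receiver: a trigger stimulates the motor cortex, and the hand
    resting on the spacebar taps the key."""
    if not network_link.empty() and network_link.get() == "fire":
        game_state["cannon_fired"] = True

game = {"cannon_fired": False}
eeg_side(imagined_movement=True)  # Rao visualizes pressing the spacebar
tms_side(game)                    # Stocco's hand taps the key
print(game)  # {'cannon_fired': True}
```

Note that information flows strictly left to right here, which is exactly why the researchers' next goal is two-way communication.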

In the course of being interviewed, Rao was also quick to state that the technology couldn't be used to read another person's mind, or to make them do things without their willing participation. The researchers now hope to establish two-way communications between participants' brains, as the video game experiment just utilized one-way communication.

Additionally, they would like to transmit more complex packets of information between brains, things beyond simple gestures. Ultimately, they hope that the technology could be used for things like allowing non-pilots to land planes in emergency situations, or letting disabled people transmit their needs to caregivers. And in time, the technology might even be upgraded to involve wireless implants.

One thing that should be emphasized here is the issue of consent. In this study, both men were willing participants, and it is certain that any future experimentation will involve people willingly accepting information back and forth. The same goes for commands, which theoretically could only occur between people willing to be linked to one another.

However, that doesn't mean such links couldn't one day be hacked, which would necessitate that anyone who chose to equip themselves with neural implants and uplinks also get their hands on protection and anti-hacking software. But that's an issue for another age, and no doubt some future crime drama! Dick Wolf, you should be paying me for all the suggestions I'm giving you!

And of course, there’s a video of the experiment, courtesy of the University of Washington. Behold and be impressed, and maybe even a little afraid for the future:


Source:
gizmag.com

The Future is Here: Smart Skin!

When it comes to modern research and development, biomimetics appears to be the order of the day. By imitating the function of biological organisms, researchers seek to improve the function of machinery to the point that it can be integrated into human bodies. Already, researchers have unveiled devices that can do the job of organs, or bionic limbs that use the wearer's nerve signals or thoughts to initiate motion.

But what of machinery that can actually send signals back to the user, registering pressure and stimulation? That's what researchers from the Georgia Institute of Technology have been working on of late, and it has inspired them to create a device that can do the job of the largest human organ of them all – our skin. Back in April, they announced that they had successfully created a brand of "smart skin" that is sensitive enough to rival the real thing.

In essence, the skin is a transparent, flexible array of 8,000 touch-sensitive transistors (aka taxels) that emit electricity when agitated. Each of these comprises a bundle of some 1,500 zinc oxide nanowires, which connect to electrodes via a thin layer of gold, enabling the array to pick up on changes in pressure as low as 10 kilopascals – roughly the sensitivity of human skin.

Mimicking the sense of touch electronically has long been a dream of researchers, and has previously been accomplished by measuring changes in resistance. But the team at Georgia Tech experimented with a different approach: measuring the tiny polarization changes that occur when piezoelectric materials such as zinc oxide are placed under mechanical stress. In these transistors, then, piezoelectric charges control the flow of current through the nanowires.
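As a toy model of this piezotronic approach, one can imagine each taxel's current rising with applied pressure, with a touch registered once the change matches what a 10 kPa press would produce. The constants and the linear response below are invented for the sketch; only the 10 kPa figure comes from the article.

```python
# Toy piezotronic taxel: strain-induced piezoelectric charge modulates
# the current through the nanowire bundle. Constants are illustrative.

def taxel_current(pressure_kpa, baseline_ua=50.0, gain_ua_per_kpa=0.6):
    """Current through one taxel, linearized as a function of pressure."""
    return baseline_ua + gain_ua_per_kpa * pressure_kpa

def touch_detected(pressure_kpa, threshold_kpa=10.0):
    """Register a touch when the current change reaches the ~10 kPa floor."""
    delta = taxel_current(pressure_kpa) - taxel_current(0.0)
    floor = taxel_current(threshold_kpa) - taxel_current(0.0)
    return delta >= floor

print(touch_detected(25.0))  # True  -- a firm press
print(touch_detected(2.0))   # False -- below the skin-like threshold
```

A full array would simply run this readout across all 8,000 taxels to build a pressure map of the surface.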

In a recent news release, lead author Zhong Lin Wang of Georgia Tech's School of Materials Science and Engineering said:

Any mechanical motion, such as the movement of arms or the fingers of a robot, could be translated to control signals. This could make artificial skin smarter and more like the human skin. It would allow the skin to feel activity on the surface.

This, when integrated to prosthetics or even robots, will allow the user to experience the sensation of touch when using their bionic limbs. But the range of possibilities extends beyond that. As Wang explained:

This is a fundamentally new technology that allows us to control electronic devices directly using mechanical agitation. This could be used in a broad range of areas, including robotics, MEMS, human-computer interfaces, and other areas that involve mechanical deformation.

This is not the first time that bionic limbs have come equipped with electrodes to enable sensation. In fact, the robotic hand designed by Silvestro Micera of the Ecole Polytechnique Federale de Lausanne in Switzerland seeks to do the same thing. Using electrodes that connect the fingertips, palm and index finger to the wearer's arm nerves, the device registers pressure and tension in order to help the wearer better interact with their environment.

Building on these two efforts, it is easy to get a glimpse of what future prosthetic devices will look like. In all likelihood, they will be skin-colored and covered with a soft “dermal” layer that is studded with thousands of sensors. This way, the wearer will be able to register sensations – everything from pressure to changes in temperature and perhaps even injury – from every corner of their hand.

As usual, the technology may have military uses, since the Defense Advanced Research Projects Agency (DARPA) is involved. For that matter, the U.S. Air Force, the U.S. Department of Energy, the National Science Foundation, and the Knowledge Innovation Program of the Chinese Academy of Sciences are all funding it as well. So don't be too surprised if bots wearing a convincing suit of artificial skin start popping up in your neighborhood!

Source: news.cnet.com

The Future is Here: Brain to Brain Interfaces!

And I thought the month of February was already an exciting time for technological breakthroughs! But if a recent report from Nature.com is any indication, February will go down in history as the biggest month for breakthroughs ever! Why, just last week, researchers in Natal, Brazil created the first ever electronic link between the brains of two living creatures.

The creatures in question were rats, and the link between their brains enabled one to help the other solve basic puzzles in real time — even though the animals were separated by thousands of kilometers of distance. The experiment was led by Miguel Nicolelis of Duke University, a pioneer in the field of brain-machine interfaces (BMIs), and a team of neurobiologists who’ve been working in the field for some time.

Here's how it works: An "encoder" rat in Natal, Brazil, trained in a specific behavioral task, presses a lever in its cage which it knows will result in a reward. A brain implant records activity from the rat's motor cortex and converts it into an electrical signal that is delivered via neural link to the brain implant of a second "decoder" rat. The second rat's motor cortex processes the signal from rat number one and – despite being thousands of km away and unfamiliar with what rat one is up to – uses that information to press the same lever.
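The encoder/decoder loop above can be sketched as a toy simulation. The single-bit "neural code" and the hard-coded two-thirds success rate are simplifications of my own; the real link transmits patterns of cortical microstimulation, and the two-thirds figure comes from the results discussed further down:

```python
# Toy simulation of the encoder/decoder rat experiment.
import random

def encoder_rat(correct_lever: str) -> str:
    """The trained rat reliably presses the rewarded lever."""
    return correct_lever

def transmit(press: str) -> int:
    """Implant converts motor-cortex activity into a stimulation code."""
    return 1 if press == "left" else 0

def decoder_rat(stim_code: int, accuracy: float = 0.66) -> str:
    """The untrained rat follows the cue about two-thirds of the time."""
    cue = "left" if stim_code == 1 else "right"
    if random.random() < accuracy:
        return cue
    return "right" if cue == "left" else "left"

random.seed(0)
trials = 10_000
hits = sum(decoder_rat(transmit(encoder_rat("left"))) == "left"
           for _ in range(trials))
print(round(hits / trials, 2))  # hit rate close to 0.66
```

Even this crude model makes the asymmetry below easy to appreciate: a two-thirds hit rate is well above chance, yet far from reliable.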

Back in 2011, Nicolelis and his colleagues unveiled the first such interface capable of a bi-directional link between a brain and a virtual body, allowing a monkey to not only mentally control a simulated arm, but receive and process sensory feedback about tactile properties like texture. And earlier this month, his team unveiled a BMI that enables rats to detect normally invisible infrared light via their sense of touch.

However, this latest experiment really takes the cake. Whereas brain-machine interfaces have long been the subject of research, generally for the sake of prostheses, a brain-to-brain interface between two living creatures is something entirely new, especially one that enables realtime sharing of sensorimotor information. And while it's not telepathy, per se, it's certainly something close – what Nicolelis calls "a new central nervous system made of two brains."

Obviously, this kind of breakthrough is impressive in its own right, but according to Nicolelis, the most groundbreaking application of this brain-net (or n-mind) is yet to come:

These experiments demonstrated the ability to establish a sophisticated, direct communication linkage between rat brains, so basically, we are creating an organic computer that solves a puzzle. We cannot predict what kinds of emergent properties would appear when animals begin interacting as part of a brain-net. In theory, you could imagine that a combination of brains could provide solutions that individual brains cannot achieve by themselves.

Naturally, there are some flaws in the process, which were made evident by the less-than-perfect results. For starters, the untrained decoder rats receiving input from a trained encoder only chose the correct lever around two-thirds of the time. Those results are well above random chance, but they are also a far cry from the 95% accuracy achieved when the signals were reversed, going from the untrained decoder to the trained encoder. As any student of science knows, one-way results are not the basis of a sound process.

And I imagine the people who are lobbying to make biosoldiers illegal and limit the use of autonomous drones will be on this like white on rice! Hence, we can probably look forward to many years of research and development before anything akin to human trials or commercial applications of this technology seems realizable.

And of course, there is a video demonstrating the mind link at work. A word of warning first, though: if you're an animal lover, like me, the video might be a little difficult to take. You be the judge:


Source:
IO9, nature.com