Brainwaves can now be used to control an impressive number of things: prosthetics, computers, quadcopters, and even cars. But recent research released by the Technische Universität München (TUM) in Germany indicates that they might also be used to fly an aircraft. Using a simple EEG cap that reads their brainwaves, a team of researchers demonstrated that thoughts alone could navigate a plane.
For the experiment, the research team recruited seven people and fitted each with a cap containing dozens of electroencephalography (EEG) electrodes. They then sat them down in a flight simulator and told them to steer the plane using their thoughts alone. The cap read the electrical signals from their brains, and an algorithm then translated those signals into computer commands.
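To give a rough sense of how such a signals-to-commands pipeline might be structured, here is a minimal sketch in Python. The channel layout, power threshold, and command names are all illustrative assumptions, not details from the TUM study, which has not published its algorithm in this form:

```python
# Hedged sketch: mapping EEG channel activity to steering commands.
# Channel roles, the threshold, and the command set are assumptions.

def band_power(samples):
    """Crude proxy for signal power: mean squared amplitude."""
    return sum(s * s for s in samples) / len(samples)

def decode_command(left_channel, right_channel, threshold=0.5):
    """Map relative power on two hypothetical channels to a steering command."""
    left, right = band_power(left_channel), band_power(right_channel)
    if abs(left - right) < threshold:
        return "hold"  # signals balanced: keep current heading
    return "bank_left" if left > right else "bank_right"

# Toy usage: stronger activity on the left channel banks the plane left.
print(decode_command([1.0, 1.2, 0.9], [0.1, 0.2, 0.1]))  # bank_left
```

A real decoder would filter the raw signal, extract frequency-band features, and run a trained classifier, but the basic shape — measure activity, compare against learned patterns, emit a discrete command — is the same.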
According to the researchers, what was truly impressive was the accuracy with which the test subjects stayed on course. Not to mention that the study participants weren't all pilots and had varying levels of flight experience, with one having none at all. And yet, all seven participants performed well enough to satisfy some of the criteria for obtaining a pilot's license. Several of the subjects even managed to land their planes under poor visibility.
The research was part of an EU-funded program called "Brainflight." As Tim Fricke, an aerospace engineer who heads the project at TUM, explained:
A long-term vision of the project is to make flying accessible to more people. With brain control, flying, in itself, could become easier. This would reduce the work load of pilots and thereby increase safety. In addition, pilots would have more freedom of movement to manage other manual tasks in the cockpit.
With this successful test under their belts, the TU München scientists are focusing in particular on the question of how planes can provide feedback to their "mind pilots". Ordinarily, pilots feel resistance in steering and must exert significant force when pushing their aircraft to its limits, and they rely on that resistance to gauge the state of their flight. This feedback is missing with mind control, and must be addressed before any such system can be adapted to a real plane.
In many ways, I am reminded of the recent breakthroughs being made in mind-controlled prosthetics. After succeeding in creating prosthetic devices that could convert nerve impulses into controls, the next step became creating devices that could stimulate nerves in order to provide sensory feedback. Following this same developmental path, mind-controlled flight could become viable within a few years' time.
Mind-controlled machinery, sensory feedback… what does this sound like to you?
The 2014 FIFA World Cup made history when it opened in Sao Paulo this week, as a 29-year-old paraplegic man named Juliano Pinto kicked a soccer ball with the aid of a robotic exoskeleton. It was the first time a mind-controlled prosthetic was used in a sporting event, and it represented the culmination of months' worth of planning and years' worth of technical development.
The exoskeleton was created with the help of over 150 researchers led by neuroscientist Dr. Miguel Nicolelis of Duke University, whose collaborative effort was called the Walk Again Project. As Pinto successfully made the kickoff with the exoskeleton, the Walk Again Project scientists stood by, watching and smiling proudly inside the Corinthians Arena. And the resulting buzz did not go unnoticed.
Immediately after the kick, Nicolelis tweeted about the groundbreaking event, saying simply: "We did it!" The moment was monumental considering that only a few months ago, Nicolelis was excited just to have people talking about the idea of a mind-controlled exoskeleton being tested in such a grand fashion. As he said in an interview with Grantland after the event:
Despite all of the difficulties of the project, it has already succeeded. You go to Sao Paulo today, or you go to Rio, people are talking about this demo more than they are talking about football, which is unbelievably impossible in Brazil.
Dr. Gordon Cheng, a team member and the lead robotics engineer of the Technical University of Munich, explained how the exoskeleton works in an interview with BBC News:
The basic idea is that we are recording from the brain and then that signal is being translated into commands for the robot to start moving.
The result of many years of development, the mind-controlled exoskeleton represents a breakthrough in restoring ambulatory ability to those who have suffered a loss of motion due to injury. Using metal braces that were tested on monkeys, the exoskeleton relies on a series of wireless electrodes attached to the head that collect brainwaves, which then signal the suit to move. The braces are also stabilized by gyroscopes and powered by a battery carried by the kicker in a backpack.
Originally, a teenage paraplegic was expected to make the kickoff. However, after a rigorous selection process that lasted many months, the 29-year-old Pinto was selected. And in performing the kickoff, he participated in an event designed to galvanize the imagination of millions of people around the world. It's a new age of technology, friends, where disability is no longer a permanent thing.
And in the meantime, enjoy this video of the event:
This summer, the 2014 World Cup will be taking place in Sao Paulo, Brazil; an event that is sure to be a media circus. And to kick off this circus (no pun intended!), FIFA has decided to do something rather special: a paralyzed teenager will make the ceremonial first kick, courtesy of an exoskeleton provided by The Walk Again Project. In addition to opening the games, this event will be the first time that a mind-controlled prosthetic has ever been used in a sporting event.
Though the teenager in question remains to be chosen, the event is scheduled and the exoskeleton tested and ready. Using metal braces that were tested on monkeys, the exoskeleton relies on a series of wireless electrodes attached to the head that collect brainwaves, which then signal the suit to move. The braces are also stabilized by gyroscopes and powered by a battery carried by the kicker in a backpack.
The Walk Again Project, a nonprofit dedicated to producing full-body mind-controlled prosthetics, is a collaboration between academic institutions that include Duke University, the Technical University of Munich, the Swiss Federal Institute of Technology in Lausanne, the Edmond and Lily Safra International Institute of Neuroscience of Natal in Brazil, the University of California at Davis, the University of Kentucky, and the Duke Immersive Virtual Environment facility.
Miguel Nicolelis, the Brazilian neuroscientist at Duke University who is leading the Walk Again Project’s efforts to create the robotic suit, had this to say about the planned event:
We want to galvanize people’s imaginations. With enough political will and investment, we could make wheelchairs obsolete.
Nicolelis is a pioneer in the field of mind-controlled prosthetics. In the 1990s, he helped build the first mind-controlled arm, which rats learned to manipulate so they could get a drink of water, simply by thinking about doing so. In that project, an electronic chip was embedded in the part of each rodent’s brain that controls voluntary muscle movements. Rows of wires that stuck out from the chip picked up electrical impulses generated by brain cells and relayed those signals to a computer.
Researchers studied the signals as the rats pushed a lever to guide the arm that gave them water, and they saw groups of neurons firing at different rates as the rats moved the lever in different directions. An algorithm was developed to decipher the patterns, discern the animal’s intention at any given moment and send commands from the brain directly to the arm instead of to the lever. Eventually, the rats could move the arm without pushing the lever at all.
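The decoding idea described above — different movement intentions produce different patterns of neural firing rates, which an algorithm can learn to classify — can be illustrated with a toy nearest-centroid decoder. The approach and the example firing rates below are assumptions for illustration, not the actual algorithm used in Nicolelis's lab:

```python
# Illustrative sketch: classify movement intent from a firing-rate vector.
# Centroid classification and the toy data are assumptions, not the
# published method.

def fit_centroids(labelled_trials):
    """Average the firing-rate vectors recorded for each movement direction."""
    centroids = {}
    for direction, trials in labelled_trials.items():
        n, dims = len(trials), len(trials[0])
        centroids[direction] = [sum(t[i] for t in trials) / n for i in range(dims)]
    return centroids

def decode_intent(rates, centroids):
    """Pick the direction whose average firing pattern is closest."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda d: dist(rates, centroids[d]))

# Toy data: three recorded neurons, two movement directions.
trials = {
    "left":  [[10.0, 2.0, 1.0], [12.0, 3.0, 1.0]],
    "right": [[2.0, 11.0, 1.0], [3.0, 9.0, 2.0]],
}
centroids = fit_centroids(trials)
print(decode_intent([11.0, 2.5, 1.0], centroids))  # left
```

The key point matches the rat experiment: once the mapping from firing patterns to intentions is learned, the lever becomes unnecessary — the decoded intent can drive the arm directly.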
Using similar brain-machine interfaces, Nicolelis and his colleagues learned to translate the neural signals in primate brains. In 2000, they reported that an owl monkey connected to the Internet had controlled an arm located 600 miles away. Eight years later, the team described a rhesus monkey that was able to dictate the pace of a robot jogging on a treadmill half a world away in Japan.
Small groups of neurons, it seems, are surprisingly capable of communicating with digital devices. Individual cells learn to communicate with computer algorithms more effectively over time by changing their firing patterns, as revealed in a study of a mouse’s brain published last year in Nature. This capacity for extensive plasticity and the ability to learn comes in quite handy when designing a prosthetic.
German-made sensors will relay a feeling of pressure when each foot touches the ground. And months of training on a virtual-reality simulator will have prepared the teenager – selected from a pool of 10 candidates – to do all this using a device that translates thoughts into actions. In an interview with New Scientist, the lead robotic engineer Gordon Cheng of the Technical University of Munich gave some indication of how the suit works:
The vibrations can replicate the sensation of touching the ground, rolling off the toe and kicking off again. There’s so much detail in this, it’s phenomenal.
Capitalizing on that adaptability, several human quadriplegics have received implanted brain chips in FDA-approved clinical trials. One of the first was Matt Nagle, who lost the use of his extremities after being stabbed in the spine. With the aid of electrodes placed in his brain at Brown University in 2004, he learned to raise, lower and drop a piece of hard candy using a primitive jointed arm not connected to his body.
In a widely publicized demonstration of that system, now owned by a company called BrainGate, a 58-year-old woman paralyzed by a stroke sipped a cup of coffee last year using a five-fingered robotic arm not attached to her body. Despite the slickness of the presentation, however, the woman actually had little control over the arm; though aesthetically pleasing, the design was still rudimentary.
However, things have come a long way since then thanks to ongoing research, development and testing. In Nicolelis’s lab, monkeys showed the ability to feel virtual objects displayed on a computer screen when areas of the brain associated with the sense of touch were stimulated. The blueprints for next summer’s soccer exoskeleton include similar sensors that will provide an artificial skin for its human wearer, thus ensuring that they can both move the device and receive sensory feedback.
With the world watching, Nicolelis hopes not only that his "bionic teenager" will be able to feel the ball but also that disabled people everywhere will feel a sense of hope. And why wouldn't they? In this single, incredibly high-profile event, millions of people around the world who struggle with disabilities will witness something truly inspirational. A paralyzed teenager will rise from a wheelchair, kick the World Cup ball, and bring countless millions to their feet.
And if you're waiting until June of 2014 to see this momentous event for yourselves, be sure to check out this promotional video from The Walk Again Project, featuring interviews with the people who made it happen and showcasing the exoskeleton itself:
3-D printing is leading to a revolution in manufacturing, and the list of applications grows with each passing day. But more important is the way it is coming together with other fields of research to make breakthroughs more affordable and accessible. Nowhere is this more true than in the fields of robotics and medicine, where printing techniques are producing a new generation of bionic and mind-controlled prosthetics.
For example, 3D Systems (an additive manufacturing company) and EksoBionics (a company specializing in bionic prosthetic devices) recently partnered to produce a new "bespoke" exoskeleton that will restore ambulatory ability to paraplegics. The prototype was custom-made for a woman named Amanda Boxtel, who was paralyzed in a tragic skiing accident in 1992.
Designers from 3D Systems began by scanning her body, digitizing the contours of her spine, thighs, and shins; a process that helped them mold the robotic suit to her needs and specifications. They then combined the suit with a set of mechanical actuators and controls made by EksoBionics. The result, said 3D Systems, is the first-ever “bespoke” exoskeleton.
Intrinsic to the partnership between 3D Systems and EksoBionics was the common goal of fitting the exoskeleton comfortably to Boxtel's body. One of the greatest challenges with exosuits and prosthetic devices is keeping the hard parts from bumping into "bony prominences," such as the knobs on the wrists and ankles. These areas are not only sensitive, but prolonged exposure to hard surfaces can lead to a slew of health problems over time.
As Scott Summit, the senior director for functional design at 3D Systems, explained it:
[Such body parts] don’t want a hard surface touching them. We had to be very specific with the design so we never had 3D-printed parts bumping into bony prominences, which can lead to abrasions [and bruising].
One problem the designers faced in this case was that a paralyzed person like Boxtel often can't know that bruising is happening, because she can't feel it. This is dangerous because undetected bruises or abrasions can become infected. In addition, because 3D printing allows the creation of very fine details, Boxtel's suit was designed to allow her skin to breathe, meaning she can walk around without sweating too much.
The process of creating the 3D-printed robotic suit lasted about three months, starting when Summit and 3D Systems CEO Avi Reichenthal met Boxtel during a visit to EksoBionics. Boxtel is one of ten EksoBionics “test pilots”, and the exoskeleton was already designed to attach to the body very loosely with Velcro straps, with an adjustable fit. But it wasn’t yet tailored to fit her alone.
That’s where 3D Systems came into play, by using a special 3D scanning system to create the custom underlying geometry that would be used to make the parts that attach to the exoskeleton. As Boxtel put it:
When the robot becomes the enabling device to take every step for the rest of your life, the connection between the body and the robot is everything. So our goal is to enhance the quality of that connection so the robot becomes more symbiotic.
And human beings aren’t the only ones who are able to take advantage of this marriage between 3-D printing and biomedicine. Not surprisingly, animals are reaping the benefits of all the latest technological breakthroughs in these fields as well, as evidenced by the little duck named Dudley from the K911 animal rescue service in Sicamous, Canada.
Not too long ago, Dudley lost a leg when a chicken in the same pen mauled him. But thanks to a 3-D printed leg specially made for him, he can now walk again. It was created by Terence Loring of 3 Pillar Designs, a company that specializes in 3D-printing architectural prototypes. After hearing of Dudley's plight through a friend, Loring decided to see what he could do to help.
Unlike a previous printed limb – the foot fashioned for Buttercup the duck – Loring's goal was to create an entire limb that could move. The first limb he designed had a jointed construction and was fully 3D-printed in plastic. Unfortunately, the leg broke the moment Dudley put it on, forcing Loring to go back to the drawing board for a one-piece design printed from softer plastic.
The subsequent leg he created had no joints and could bend on its own. And when Dudley put it on, he started walking straight away and without hesitation. Issues remain to be solved, like how to prevent friction sores – a problem that Mike Garey (who designed Buttercup’s new foot) solved with a silicone sock and prosthetic gel liner.
Nevertheless, Dudley is nothing if not as happy as a duck in a pond, and it seems very likely that any remaining issues will be ironed out in time. In fact, one can expect that veterinary medicine will fully benefit from the wide range of 3D printed prosthetic devices and even bionic limbs as advancement and research continues to produce new and exciting possibilities.
And in the meantime, enjoy the following videos which show both Amanda Boxtel and Dudley the duck enjoying their new devices and the ways in which they help bring mobility back to their worlds:
There's just no shortage of breakthroughs in the field of biomedicine these days. Whether it's 3D bioprinting, bionics, nanotechnology or mind-controlled prosthetics, every passing week seems to bring more in the way of amazing developments. And given the rate of progress, it's likely going to be just a few years before mortality itself is considered a treatable condition.
Consider the most recent breakthrough in 3D printing technology, which comes to us from the J.B. Speed School of Engineering at the University of Louisville, where researchers used a printed model of a child's heart to help a team of doctors prepare for open-heart surgery. Thanks to these printer-assisted measures, the doctors were able to save the life of a 14-year-old child.
Philip Dydysnki, Chief of Radiology at Kosair Children’s Hospital, decided to approach the school when he and his medical team were looking at ways of treating Roland Lian Cung Bawi, a boy born with four heart defects. Using images taken from a CT scan, researchers from the school’s Rapid Prototyping Center were able to create and print a 3D model of Roland’s heart that was 1.5 times its actual size.
Built in three pieces using a flexible filament, the printing reportedly took around 20 hours and cost US$600. Cardiothoracic surgeon Erle Austin III then used the model to devise a surgical plan, ultimately resulting in the repairing of the heart’s defects in just one operation. As Austin said, “I found the model to be a game changer in planning to do surgery on a complex congenital heart defect.”
Roland has since been released from hospital and is said to be in good health. In the future, this type of rapid prototyping could become a mainstay for medical training and practice surgery, giving surgeons the options of testing out their strategies beforehand. And be sure to check out this video of the procedure from the University of Louisville:
And in another story, improvements in the field of bionics are making a big difference for people suffering from diabetes. For people living with type 1 diabetes, the constant need to extract blood and monitor it can be quite the hassle. Hence, medical researchers are looking for new and non-invasive ways to monitor and adjust sugar levels.
Solutions range from laser blood-monitors to glucose-sensitive nanodust, but the field of bionics also offers solutions. Consider the bionic pancreas that was recently trialled among 30 adults, and has also been approved by the US Food and Drug Administration (FDA) for three transitional outpatient studies over the next 18 months.
The device comprises a sensor inserted under the skin that relays hormone level data to a monitoring device, which in turn sends the information wirelessly to an app on the user’s smartphone. Based on the data, which is provided every five minutes, the app calculates required dosages of insulin or glucagon and communicates the information to two hormone infusion pumps worn by the patient.
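The closed-loop logic described above — take a reading every five minutes, compute a dose, route it to one of two pumps — can be sketched in a heavily simplified form. The target, deadband, and dose formulas below are made-up illustrations; the actual Damiano/Russell device uses a far more sophisticated adaptive dosing algorithm:

```python
# Hedged sketch of a dual-hormone closed loop. All numbers are
# illustrative assumptions, not the device's real dosing rules.

TARGET_MG_DL = 100   # illustrative glucose target
DEADBAND = 20        # no dosing within this band of the target

def dosing_decision(glucose_mg_dl):
    """Return (hormone, units) for one five-minute reading."""
    error = glucose_mg_dl - TARGET_MG_DL
    if error > DEADBAND:
        return ("insulin", round(error * 0.01, 2))    # lower high glucose
    if error < -DEADBAND:
        return ("glucagon", round(-error * 0.02, 2))  # raise low glucose
    return (None, 0.0)                                # in range: do nothing

# One simulated afternoon of readings.
for reading in (95, 180, 60):
    print(reading, dosing_decision(reading))
```

The two-hormone design is the interesting part: unlike insulin-only pumps, the system can actively push glucose back up with glucagon rather than just waiting for a low to correct itself.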
The bionic pancreas has been developed by associate professor of biomedical engineering at Boston University Dr. Edward Damiano, and assistant professor at Harvard Medical School Dr. Steven Russell. To date, it has been trialled with diabetic pigs and in three hospital-based feasibility studies amongst adults and adolescents over 24-48 hour periods.
The upcoming studies will allow the device to be tested by participants in real-world scenarios with decreasing amounts of supervision. The first will test the device’s performance for five continuous days involving twenty adults with type 1 diabetes. The results will then be compared to a corresponding five-day period during which time the participants will be at home under their own care and without the device.
A second study will be carried out using 16 boys and 16 girls with type 1 diabetes, testing the device’s performance for six days against a further six days of the participants’ usual care routine. The third and final study will be carried out amongst 50 to 60 further participants with type 1 diabetes who are also medical professionals.
Should the transitional trials be successful, a more developed version of the bionic pancreas, based on results and feedback from the previous trials, will be put through trials in 2015. If all goes well, Prof. Damiano hopes that the bionic pancreas will gain FDA approval and be rolled out by 2017, when his son, who has type 1 diabetes, is expected to start higher education.
With this latest development, we are seeing how smart technology and non-invasive methods are merging to assist people living with chronic health issues. Together with "smart tattoos" and embedded monitors, developments like these are leading to an age where our health is increasingly in our own hands, and preventative medicine takes precedence over corrective.
These days, prosthetic devices, bionic limbs and exoskeletons continue to advance and amaze. Not only are doctors and medical researchers able to restore mobility and sensation to patients with missing limbs, they are now crossing a threshold where they can restore these abilities and faculties to patients suffering from partial or total paralysis.
This should come as no surprise, seeing as how the latest biomedical advances – which involve controlling robotic limbs with brain-computer interfacing – offer a very obvious solution for paralyzed individuals. In their case, no robotic limbs or bionic attachments are necessary to restore ambulatory motion since these were not lost. Instead, what is needed is to restore motor control to compensate for the severed nerves.
Thanks to researchers working at Case Western Reserve University in Ohio, a way forward is being proposed. Here, a biomedical team is gearing up to combine the BrainGate cortical chip, developed at Brown University, with their own Functional Electrical Stimulation (FES) platform. Through this combination, they hope to remove robots from the equation entirely and go right to the source.
It has long been known that electrical stimulation can directly control muscles, but past attempts to do this artificially have often been inaccurate (and therefore painful and potentially damaging) to the patient. Stimulating the nerves directly using precisely positioned arrays is a much better approach, something that another team at Case Western recently demonstrated through their "nerve cuff electrode".
This electrode is a direct stimulation device that is small enough to be placed around small segments of nerve. The Western team used the cuff to provide an interface for sending data from sensors in the hand back to the brain using sensory nerves in the arm. With FES, the same kind of cuff electrode can also be used to stimulate nerves going the other direction, in other words, to the muscles.
The difficulty in such a scheme is that even if the motor nerves can be physically separated from the sensory nerves and traced to specific muscles, the exact stimulation sequences needed to produce a proper movement are hard to find. To achieve this, another group at Case Western has developed a detailed simulation of how different muscles work together to control the arm and hand.
Their model consists of 138 muscle elements distributed over 29 muscles, which act on 11 joints. The operational procedure is for the patient to watch the image of the virtual arm while they naturally generate neural commands that the BrainGate chip picks up to move the arm. In practice, this means trying to make the virtual arm touch a red spot to make it turn green.
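The training task just described — decoded neural commands nudge a virtual arm toward a red spot, which turns green on contact — can be sketched as a simple loop. The geometry, step size, and success radius are illustrative assumptions, not parameters of the Case Western simulator:

```python
# Minimal sketch of the virtual-arm training trial. In the real system
# the position updates come from BrainGate-decoded neural commands; here
# a deterministic stand-in moves the arm, and all numbers are assumptions.

def step_toward(pos, target, step=1.0):
    """Move one step along each axis toward the target (stand-in for a
    decoded neural command)."""
    def nudge(p, t):
        if abs(t - p) <= step:
            return t
        return p + step if t > p else p - step
    return tuple(nudge(p, t) for p, t in zip(pos, target))

def run_trial(start, target, max_steps=50, radius=0.5):
    pos = start
    for _ in range(max_steps):
        if all(abs(p - t) <= radius for p, t in zip(pos, target)):
            return "green"   # spot touched: success feedback to the patient
        pos = step_toward(pos, target)
    return "red"             # ran out of time: trial failed

print(run_trial((0.0, 0.0), (3.0, 4.0)))  # green
```

The point of the loop is the feedback signal: the color change is what lets the patient (and the decoding algorithm) learn which neural patterns produce the intended movement.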
Currently in clinical trials, the BrainGate2 chip is being developed with the hope of not only stimulating muscles, but generating the same kinds of feedback and interaction that real muscle movement creates. The eventual plan is for the patient and the control algorithm to learn together in tandem, so that a training screen will no longer be needed and the patient will be able to move on their own without calibrating the device.
But at the same time, biotech enhancements that restore sensation to amputees are also improving apace. Consider the bionic hand developed by Silvestro Micera of the École Polytechnique Fédérale de Lausanne in Switzerland. Unlike previous bionic hands, which rely on electrodes to receive nerve signals that control the hand's movement, his device also sends electronic signals back to simulate the feeling of touch.
Back in February of 2013, Micera and his research team began testing their bionic hand, and began clinical trials on a volunteer just last month. Their volunteer, a man named Dennis Aabo Sørensen from Denmark, lost his arm in a car accident nine years ago, and has since become the first amputee to experience artificially-induced sensation in real-time.
In a laboratory setting wearing a blindfold and earplugs, Sørensen was able to detect how strongly he was grasping, as well as the shape and consistency of different objects he picked up with his prosthetic. Afterwards, Sørensen described the experience to reporters, saying:
The sensory feedback was incredible. I could feel things that I hadn’t been able to feel in over nine years. When I held an object, I could feel if it was soft or hard, round or square.
The next step will involve miniaturizing the sensory feedback electronics for a portable prosthetic, as well as fine-tuning the sensory technology for better touch resolution and increased awareness of finger movement. They will also need to assess how long the electrodes can remain implanted and functional in the patient's nervous system, though Micera's team is confident that they will last for many years.
Micera and his team were also quick to point out that Sørensen's psychological strength was a major asset in the clinical trial. Not only had he been forced to adapt to the loss of his arm nine years ago, he was also willing to face the challenge of experiencing touch again, knowing it would last only a short time. As he himself put it:
I was more than happy to volunteer for the clinical trial, not only for myself, but to help other amputees as well… There are two ways you can view this. You can sit in the corner and feel sorry for yourself. Or, you can get up and feel grateful for what you have.
The study was published in the February 5, 2014 edition of Science Translational Medicine, and represents a collaboration called Lifehand 2 between several European universities and hospitals. And although a commercially-available sensory-enhanced prosthetic may still be years away, the study provides the first step towards a fully-realizable bionic hand.
Yes, between implantable electronics that can read brainwaves and nerve impulses, computer programs capable of making sense of it all, and robotic limbs integrated with these machines and our bodies, the future is looking very interesting indeed. In addition to restoring ambulatory motion and sensation, we could be looking at an age where there is no such thing as a "permanent injury".
And in the meantime, be sure to check out this video of Sørensen’s clinical trial with the EPFL’s bionic hand:
Portable EEG devices have come a long way in recent years. From their humble beginnings as large, wire-studded contraptions that cost upwards of $10,000, they have now reached the point where they are small, portable, and affordable. What’s more, they are capable of not only reading brainwaves and interpreting brain activity, but turning that activity into real-time commands and controls.
One such device is the Emotiv Insight, a neuroheadset that was created with the help of a Kickstarter campaign and is now available for preorder. Designed by the same company that produced the EPOC, an earlier brain-computer interface (BCI) released in 2010, the Insight offers many improvements. Unlike its bulky predecessor, the new model is sleeker, lighter, uses five sensors instead of the EPOC's fourteen, and can be linked to your smartphone.
In addition, the Insight uses a new type of hydrophilic polymer sensor that absorbs moisture from the environment. Whereas the EPOC's sensors required that the user first apply saline solution to their scalp, no extra moisture is necessary with this latest model. This is a boon for people who plan on using it repeatedly and don't want to moisten their heads with goo every time they use it.
The purpose behind the Insight and EPOC headsets is quite simple. According to Tan Le, the founder of Emotiv, the company's long-term aim is to take a clinical system (the EEG) from the lab into the real world and to democratize brain research. As already noted, older EEG machines were prohibitively expensive for smaller labs and amateur scientists, which made it difficult to conduct brain research. Le and her colleagues hope to change that.
And it seems that they are destined to get their way. Coupled with similar devices from companies like NeuroSky, the stage seems set for an age when brain monitoring and brain-computer interface research are truly affordable – costing just a few hundred dollars instead of $10,000 – allowing independent labs and skunkworks to bring their own ideas and research to the fore.
As of September 16th, when the Kickstarter campaign officially closed, Emotiv had surpassed its $1 million goal and raised a total of $1,643,117 for the device. Because of this, the company plans to upgrade the headset with a six-axis inertial sensor – to keep track of the user's head movements, gait, tremors, gestures, etc. – a microSD card reader for added security, and a 3-axis magnetometer (i.e. a compass).
In some cases, these new brain-computer interfaces are making it possible for people with disabilities or debilitating illnesses to control robots and prosthetics that assist them with their activities and rehab therapy, or that restore mobility. On a larger front, they are also being adapted for commercial use – gaming and interfacing with personal computers and devices – as well as potential medical applications such as neurotherapy, neuromonitoring, and neurofeedback.
Much like a fitness tracker, these devices could let us know how we are sleeping, monitor our emotional state over time, and make recommendations based on comparative analyses. So in addition to being a viable growth market for aiding people with disabilities, there is also the very real possibility that neuroheadsets will give people a new and exciting way to interface with their machinery and keep "mental records".
Passthoughts are likely to replace passwords, people will be able to identify themselves with brain-activity records, and remote control will take on a whole new meaning! In addition, mental records could become part of our regular medical records and could even be called upon as evidence when trying to demonstrate mental fitness or insanity at trial. Dick Wolf, call me already! I'm practically giving these ideas away!
And be sure to enjoy this video from Emotiv’s Kickstarter site:
Silk implants are becoming the way of the future where brain implants are concerned, thanks to their paradoxical combination of high resiliency and the ability to dissolve. By combining them with nanoelectric circuits or drugs, scientists are exploring several possible applications, ranging from communications devices that control prosthetics and machines to medicinal devices that could treat disabilities and mental illnesses.
And according to a recent study released by the National Institutes of Health, treating epilepsy is just the latest application. According to the study, when administered to a group of epileptic rats, the treatment led to the rats experiencing far fewer seizures. What’s more, this new treatment represents something entirely new in the treatment of neurological disorders.
For starters, Rebecca L. Williams-Karnesky and her colleagues used the silk implants for a timed-release therapy in rats experiencing epileptic seizures. Working on the theory that people with epilepsy suffer from a low level of adenosine – a chemical that the brain releases naturally to suppress seizures (and also perhaps movement during sleep) – they soaked the silk implants in adenosine before implanting them.
The rats who received the silk brain implants still had seizures, but their numbers were reduced fourfold. The implants released the chemical for ten days before completely dissolving. And with time and testing, the treatment could very easily be made available for humans. According to the study’s co-author, Detlev Boison:
Clinical applications could be the prevention of epilepsy following head trauma or the prevention of seizures that often — in about 50 percent of patients — follow conventional epilepsy surgery. In this case, adenosine-releasing silk might be placed into the resection cavity in order to prevent future seizures.
Between the timed release of drugs and nanoelectric circuits that improve neuroplasticity, recall, and relaxation, brain implants are coming a long way. At one time, they were the province of cyberpunk science fiction. But thanks to ongoing research and development, they are quickly jumping from the page and becoming a reality.
Though they currently remain confined to medical tests and laboratories, experts agree that it will be just a few years’ time before they are commercially available. By sometime in the coming decade, medimachines and neural implants will probably become a mainstay, and neurological disorders a fully treatable phenomenon.
Much has been made of the advancements in mind-controlled prosthetics lately. For many, the progress in this field has led to comparisons with the prosthetic hand that Luke Skywalker received at the end of The Empire Strikes Back. Remember how he got a robotic hand that not only looked real but also allowed him to feel pain? Well as it stands, we may be closer to that than previously thought.
Witness the new era of robo-prosthetic devices, ones that will not only restore motion to amputees and people born without limbs, but also sensory perception! Developed by Silvestro Micera of the École Polytechnique Fédérale de Lausanne in Switzerland, it’s the first prosthetic that will provide real-time sensory feedback to its owners. Later this year, a man by the name of Pierpaolo Petruzziello, who lost half his arm in a car accident, will receive the first of its kind, once all the tests are concluded.
Much like the mind-controlled prosthetics that have been making the rounds in recent years, this new device is wired directly to the user’s nervous system with electrodes, allowing them to control its movement. However, in this updated model, the process works both ways. Once the hand’s electrodes are clipped onto two of the arm’s main nerves, the median and the ulnar nerves, it will form a cybernetic connection allowing for the fast and bidirectional flow of information between the patient’s nervous system and the artificial hand.
In this respect, the arm works much as a real one does, using electrical stimuli to both send commands and receive sensory information. Announcing the development of the hand at the recently concluded AAAS conference in Boston, Micera was sure to highlight this aspect of the prosthetic, claiming that increased sensory feelings will improve acceptance of artificial limbs among patients.
Interestingly enough, this model is an updated version of one Micera and his team produced back in 2009, again for use by Petruzziello. He was able to move the bionic hand’s fingers, clench them into a fist and hold objects, and he also reported feeling the sensation of needles being pricked into the hand’s palm. However, this earlier version of the hand had only two sensory zones, whereas the latest prototype will send sensory signals back from all the fingertips, as well as the palm and the wrist, to give a near life-like feeling in the limb.
Once the hand and patient are united, he will wear it for a month to get a proper feel for the prosthetic and test out its many functions. Based on that test-drive, Micera hopes to develop a fully functional and commercially viable model within the next two years.
Just think of it: prosthetics for amputees that will not only allow them to interact with their world again, but will provide them with the sensory information they need to actually feel like a part of it. One step closer to truly providing accident victims and people born without limbs a new and fully-functional lease on life. And perhaps to posthumanism as well!