Restoring Ability: Project NEUWalk

In the past few years, medical science has produced some impressive breakthroughs for those suffering from partial paralysis, but comparatively little for those who are fully paralyzed. In recent years, however, nerve stimulation that bypasses damaged or severed nerves has been proposed as a potential solution. This is the concept behind NEUWalk, a project pioneered by the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland.

Here, researchers have figured out a way to reactivate the severed spinal cords of fully paralyzed rats, allowing them to walk again via remote control. And, the researchers say, their system is just about ready for human trials. The project operates on the principle that the body runs on electricity: the brain moves the body by sending electrical signals down the spinal cord and into the nervous system.

When the spinal cord is severed, the signals can no longer reach the spine below the injury, paralyzing that part of the body. The higher the cut, the greater the paralysis. But an electrical signal sent directly through the spinal cord below the cut via electrodes can take the place of the brain signal, as the team at EPFL, led by neuroscientist Grégoire Courtine, has discovered.

Previous studies have had some success in using epidural electrical stimulation (EES) to improve motor control where spinal cord injuries are concerned. However, electrically stimulating neurons to allow for natural walking is no easy task, and it requires extremely quick and precise stimulation. And until recently, the process of controlling the pulse width, amplitude and frequency in EES treatment was done manually.

This simply isn’t practical, for two reasons. For starters, it is very difficult for a person to manually adjust the level of electrostimulation they require to move their legs while they are trying to walk. Second, the brain does not send electrical signals in an indiscriminate stream to the nerves. Rather, the frequency of the electrical stimulation varies based on the desired movement and neurological command.

To get around this, the team carefully studied all aspects of how electrical stimulation affects a rat’s leg movements – such as its gait – and was therefore able to figure out how to stimulate the rat’s spine for a smooth, even movement, and even take into account obstacles such as stairs. To do this, the researchers put paralyzed rats onto a treadmill and supported them with a robotic harness.

After several weeks of testing, the researchers had mapped out how to stimulate the rats’ nervous systems precisely enough to get them to put one paw in front of the other. They then developed a robust algorithm that could monitor a host of factors, like muscle action and ground reaction force, in real time. By feeding this information into the algorithm, EES impulses could be precisely controlled, extremely quickly.
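A closed-loop controller of this general kind can be sketched in a few lines of Python. To be clear, the function name, gains, and target values below are invented for illustration; the EPFL team's actual algorithm is far more sophisticated:

```python
# Toy sketch of closed-loop EES control (illustrative names and gains only):
# compare the measured movement against a target, then nudge the pulse
# frequency up or down, keeping it inside a safe operating band.

def adjust_stimulation(target_step_height, measured_step_height,
                       frequency_hz, gain=2.0, min_hz=20.0, max_hz=90.0):
    """Proportional controller: raise the pulse frequency when the foot
    trajectory falls short of the target, lower it when it overshoots."""
    error = target_step_height - measured_step_height
    frequency_hz += gain * error
    return max(min_hz, min(max_hz, frequency_hz))

# The foot fell short of the 3 cm target, so the frequency rises:
print(adjust_stimulation(3.0, 1.0, 40.0))  # -> 44.0
```

The point of the closed loop is exactly this feedback: sensor readings continually correct the stimulation, rather than a human dialing in pulse parameters by hand.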

The next step involved severing the spinal cords of several rats in the middle back, completely paralyzing the rats’ lower limbs, and implanting flexible electrodes into the spinal cord at the point of the cut, allowing electrical signals to be sent down to the severed portion of the spine. Combined with the precise stimulation governed by their algorithm, the research team created a closed-loop system that can make paralyzed subjects mobile.

As Grégoire Courtine said of the experiment:

We have complete control of the rat’s hind legs. The rat has no voluntary control of its limbs, but the severed spinal cord can be reactivated and stimulated to perform natural walking. We can control in real-time how the rat moves forward and how high it lifts its legs.

Clinical trials on humans may start as early as June 2015. The team plans to start testing on patients with incomplete spinal cord injuries using a research laboratory called the Gait Platform, housed at EPFL. It consists of a custom treadmill and overground support system, as well as 14 infrared cameras that read reflective markers on the patient’s body and two video cameras for recording the patient’s movement.

Silvestro Micera, a neuroengineer and co-author of the study, expressed hope that this study will help lead the way towards a day when paralysis is no longer permanent. As he put it:

Simple scientific discoveries about how the nervous system works can be exploited to develop more effective neuroprosthetic technologies. We believe that this technology could one day significantly improve the quality of life of people confronted with neurological disorders.

Without a doubt, restoring ambulatory ability to people who have lost limbs or suffered from spinal cord injuries is one of the many amazing possibilities being offered by cutting-edge medical research. Combined with bionic prosthetics, gene therapies, stem cell research and life-extension therapies, we could be looking at an age where no injury is permanent, and life expectancy is far greater.

And in the meantime, be sure to watch this video from the EPFL showing the NEUWalk technology in action:


Sources:
cnet.com, motherboard.com, actu.epfl.ch

The Future is Here: Roombot Transforming Furniture

Robotic arms and other mechanisms have long been used to make or assemble furniture; but thus far, no one has ever created robots that are capable of becoming furniture. However, Swiss researchers are aiming to change that with Roombots, reconfigurable robotic modules that connect to each other to change shape and transform into different types of furniture, based on the needs and specifications of users.

Created by the Biorobotics Laboratory (BioRob) at the École Polytechnique Fédérale de Lausanne (EPFL), the self-assembling Roombots attach to each other via connectors that enable them to take on the desired shape. The team’s main goal is to create self-assembling interactive furniture that can be used in a variety of ways. The modules were designed primarily to help the disabled or elderly by morphing to suit their needs.

Like LEGO bricks, Roombots can be stacked upon each other to create various structures and/or combined with furniture and other objects, changing not only their shape but also their functionality. For instance, a person lying on a Roombot bed could slowly be moved into a seated position, or a table could scoot over to a corner or tilt itself to help a book slide into a person’s hands. The team has reached a number of significant milestones, such as having the Roombots move freely, that bring all this multi-functionality closer.

Each 22 cm-long module (made up of four half-spheres) has a wireless connection, a battery, and three motors that allow it to pivot with three degrees of freedom. Each module also has retractable “claws” that are used to attach to other pieces to form larger structures. With a series of rotations and connections, the modules can change shape and become any of a variety of objects. A special surface with holes adapted to the Roombots’ mechanical claws can also allow the modules to anchor to a wall or floor.
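The modular design described above lends itself to a simple object model. Here is a purely illustrative Python sketch; the class name, joint representation, and claw logic are all invented, not EPFL's actual control software:

```python
# Minimal, purely illustrative model of Roombot-style modules: each unit
# has three rotational joints and claws that latch onto neighbours.
# (The real EPFL hardware and its software interface differ, of course.)

class RoombotModule:
    def __init__(self, name):
        self.name = name
        self.joint_angles = [0.0, 0.0, 0.0]  # three degrees of freedom
        self.attached_to = set()

    def rotate(self, joint, degrees):
        """Pivot one of the three motorized joints, wrapping at 360."""
        self.joint_angles[joint] = (self.joint_angles[joint] + degrees) % 360

    def attach(self, other):
        # Retractable claws latch two modules together symmetrically.
        self.attached_to.add(other.name)
        other.attached_to.add(self.name)

# Two modules pair up, as they must to climb over a convex edge:
a, b = RoombotModule("a"), RoombotModule("b")
a.attach(b)
a.rotate(0, 90)
print(sorted(a.attached_to), a.joint_angles[0])  # -> ['b'] 90.0
```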

The Roombots can even climb up a wall or over a step when the surface is outfitted with connector plates. They’re also capable of picking up connector plates and arranging them to form, say, a table’s surface. Massimo Vespignani, a PhD student at BioRob, explained the purpose of this design and its advantages in a recent interview with Gizmag:

We start from a group of Roombot modules that might be stacked together for storage. The modules detach from this pile to form structures of two or more modules. At this point they can start moving around the room in what we call off-grid locomotion…

A single module can autonomously reach any position on a plane (this being on the floor, walls, or ceiling), and overcome a concave edge. In order to go over convex edges two modules need to collaborate…

The advantage would be that the modules can be tightly packed together for transportation and then can reconfigure into any type of structure (for example a robotic manipulator)…

We can ‘augment’ existing furniture by placing compatible connectors on it and attaching Roombots modules to allow it to move around the house.

The range of applications for this kind of robotics is virtually infinite. For example, as seen in the video below, a series of Roombots serving as feet on a table not only lets it move around the room and come to its owner, but adjusts its height as well. Auke Ijspeert, head of BioRob, envisions that this type of customization could be used for physically challenged people who could greatly benefit from furniture that adapts to their needs and movements.

As he said in a recent statement:

It could be very useful for disabled individuals to be able to ask objects to come closer to them, or to move out of the way. [They could also be used as] ‘Lego blocks’ [for makers to] find their own function and applications.

Meanwhile, design students at ENSCI Les Ateliers in France have come up with several more ideas for uses of Roombots, such as flower pots that can move from window to window around a building, and modular lighting components and sound systems. Similar to MIT’s more complex self-assembling M-Blocks – programmable cube robots with no external moving parts – Roombots represent a step in the direction of self-assembling robots that are capable of taking on just about any task.

For instance, imagine a series of small robotic modules that could be used for tasks like repairing bridges or buildings during emergencies. Simply release them from their container and feed them the instructions, and they assemble to prop up an earthquake-stricken structure or a fallen bridge. At the same time, it is a step in the direction of smart matter and nanotechnology, a futuristic vision that sees the very building blocks of everyday objects as programmable, reconfigurable materials that can change shape or properties as needed.

To get a closer, more detailed idea of what the Roombot can do, check out the video below from EPFL News:


Sources:
gizmag.com, cnet.com, kurzweilai.net

Biomedical Breakthroughs: Bionerves and Restored Sensation

These days, advances in prosthetic devices, bionic limbs and exoskeletons continue to amaze. Not only are doctors and medical researchers able to restore mobility and sensation to patients suffering from missing limbs, they are now crossing a threshold where they can restore these abilities and faculties to patients suffering from partial or total paralysis.

This should come as no surprise, seeing as how the latest biomedical advances – which involve controlling robotic limbs with brain-computer interfacing – offer a very obvious solution for paralyzed individuals. In their case, no robotic limbs or bionic attachments are necessary to restore ambulatory motion since these were not lost. Instead, what is needed is to restore motor control to compensate for the severed nerves.

Thanks to researchers working at Case Western Reserve University in Ohio, a way forward is being proposed. Here, a biomedical team is gearing up to combine the BrainGate cortical chip, developed at Brown University, with their own Functional Electrical Stimulation (FES) platform. Through this combination, they hope to remove robots from the equation entirely and go right to the source.

It has long been known that electrical stimulation can directly control muscles, but past attempts to do this artificially have often been inaccurate (and therefore painful and potentially damaging) to the patient. Stimulating the nerves directly using precisely positioned arrays is a much better approach, something that another team at Case Western recently demonstrated through their “nerve cuff electrode”.

This electrode is a direct stimulation device that is small enough to be placed around small segments of nerve. The Case Western team used the cuff to provide an interface for sending data from sensors in the hand back to the brain using sensory nerves in the arm. With FES, the same kind of cuff electrode can also be used to stimulate nerves going the other direction, in other words, to the muscles.

The difficulty in such a scheme is that even if the motor nerves can be physically separated from the sensory nerves and traced to specific muscles, the exact stimulation sequences needed to produce a proper movement are hard to find. To achieve this, another group at Case Western has developed a detailed simulation of how different muscles work together to control the arm and hand.

Their model consists of 138 muscle elements distributed over 29 muscles, which act on 11 joints. The operational procedure is for the patient to watch the image of the virtual arm while they naturally generate neural commands that the BrainGate chip picks up to move the arm. In practice, this means trying to make the virtual arm touch a red spot to make it turn green.

Currently in clinical trials, the BrainGate2 chip is being developed with the hope of not only stimulating muscles, but also generating the same kinds of feedback and interaction that real muscle movement creates. The eventual plan is for the patient and the control algorithm to learn together in tandem, so that a training screen will not be needed at all and a patient will be able to move on their own without calibrating the device.

But at the same time, biotech enhancements that restore sensation to amputees are also improving apace. Consider the bionic hand developed by Silvestro Micera of the École Polytechnique Fédérale de Lausanne in Switzerland. Unlike previous bionic hands, which rely on electrodes to receive nerve signals to control the hand’s movement, his device sends electronic signals back to simulate the feeling of touch.

Back in February of 2013, Micera and his research team began testing their bionic hand, and began clinical trials on a volunteer just last month. Their volunteer, a man named Dennis Aabo Sørensen from Denmark, lost his arm in a car accident nine years ago, and has since become the first amputee to experience artificially-induced sensation in real time.

In a laboratory setting, wearing a blindfold and earplugs, Sørensen was able to detect how strongly he was grasping, as well as the shape and consistency of different objects he picked up with his prosthetic. Afterwards, Sørensen described the experience to reporters, saying:

The sensory feedback was incredible. I could feel things that I hadn’t been able to feel in over nine years. When I held an object, I could feel if it was soft or hard, round or square.

The next step will involve miniaturizing the sensory feedback electronics for a portable prosthetic, as well as fine-tuning the sensory technology for better touch resolution and increased awareness of finger movement. The researchers will also need to assess how long the electrodes can remain implanted and functional in the patient’s nervous system, though Micera’s team is confident they will last for many years.

Micera and his team were also quick to point out that Sørensen’s psychological strength was a major asset in the clinical trial. Not only had he been forced to adapt to the loss of his arm nine years ago, he was also extremely willing to face the challenge of experiencing touch again, if only for a short period of time. But as he himself put it:

I was more than happy to volunteer for the clinical trial, not only for myself, but to help other amputees as well… There are two ways you can view this. You can sit in the corner and feel sorry for yourself. Or, you can get up and feel grateful for what you have.

The study was published in the February 5, 2014 edition of Science Translational Medicine, and represents a collaboration called Lifehand 2 between several European universities and hospitals. And although a commercially-available sensory-enhanced prosthetic may still be years away, the study provides the first step towards a fully-realizable bionic hand.

Yes, between implantable electronics that can read out brainwaves and nerve impulses, computer programs that are capable of making sense of it all, and robotic limbs that are integrated with these machines and our bodies, the future is looking very interesting indeed. In addition to restoring ambulatory motion and sensation, we could be looking at an age where there is no such thing as a “permanent injury”.

And in the meantime, be sure to check out this video of Sørensen’s clinical trial with the EPFL’s bionic hand:


Sources:
extremetech.com, actu.epfl.ch, neurotechnology.com

Judgement Day Update: Bionic Computing!

IBM has always been at the forefront of cutting-edge technology. Whether it was with the development of computers that could guide ICBMs and rockets into space during the Cold War, or its contributions to the early Internet during the 1990s, the company has managed to stay on the vanguard by constantly looking ahead. So it comes as no surprise that it had plenty to say last month on the subject of the next big leap.

During a media tour of their Zurich lab in late October, IBM presented some of the company’s latest concepts. According to the company, the key to creating supermachines that are 10,000 times faster and more efficient is to build bionic computers cooled and powered by electronic blood. The end result of this plan is what is known as “Big Blue”, a proposed biocomputer that they anticipate will take 10 years to make.

Intrinsic to the design is the merger of computing and biological forms, specifically the human brain. In terms of computing, IBM is relying on the human brain as their template. Through this, they hope to enable processing power that’s densely packed into 3D volumes rather than spread out across flat 2D circuit boards with slow communication links.

On the biological side of things, IBM is supplying computing equipment to the Human Brain Project (HBP) – a $1.3 billion European effort that uses computers to simulate the actual workings of an entire brain. Beginning with mice, but then working their way up to human beings, their simulations examine the inner workings of the mind all the way down to the biochemical level of the neuron.

It’s all part of what IBM calls “the cognitive systems era”, a future where computers aren’t just programmed, but also perceive what’s going on, make judgments, communicate with natural language, and learn from experience. As the description would suggest, it is closely related to artificial intelligence, and may very well prove to be the curtain raiser of the AI era.

One of the key challenges behind this work is matching the brain’s power consumption. The ability to process the subtleties of human language helped IBM’s Watson supercomputer win at “Jeopardy.” That was a high-profile step on the road to cognitive computing, but from a practical perspective, it also showed how much farther computing has to go. Whereas Watson uses 85 kilowatts of power, the human brain uses only 20 watts.

Already, a shift has been occurring in computing, which is evident in the way engineers and technicians are now measuring computer progress. For the past few decades, the method of choice for gauging performance was operations per second, or the rate at which a machine could perform mathematical calculations.

But as computers began to require prohibitive amounts of power to perform various functions and generated far too much waste heat, a new measurement was called for. The measurement that emerged as a result is expressed in operations per joule of energy consumed. In short, progress has come to be measured in terms of a computer’s energy efficiency.
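The shift from raw speed to efficiency is easy to see with the numbers already quoted. In the quick comparison below, only the power figures come from the text; the operation counts are placeholder round numbers, included purely to show how "operations per joule" works:

```python
# Back-of-the-envelope look at the efficiency gap described above.
watson_power_w = 85_000   # Watson's draw, per the article
brain_power_w = 20        # rough figure for a human brain, per the article

# Raw power ratio: how many brains fit in one Watson's power budget.
print(watson_power_w / brain_power_w)        # -> 4250.0

def ops_per_joule(operations_per_second, power_watts):
    # A joule per second is a watt, so this is just ops/s divided by watts.
    return operations_per_second / power_watts

# The same hypothetical 10^15 ops/s throughput at each power level:
print(ops_per_joule(1e15, watson_power_w))   # ~1.18e10 ops per joule
print(ops_per_joule(1e15, brain_power_w))    # -> 5e13 ops per joule
```

By this metric the brain comes out thousands of times more efficient, which is exactly why it became the template.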

But now, IBM is contemplating another method for measuring progress that is known as “operations per liter”. In accordance with this new paradigm, the success of a computer will be judged by how much data-processing can be squeezed into a given volume of space. This is where the brain really serves as a source of inspiration, being the most efficient computer in terms of performance per cubic centimeter.

As it stands, today’s computers consist of transistors and circuits laid out on flat boards that ensure plenty of contact with air that cools the chips. But as Bruno Michel – a biophysics professor and researcher in advanced thermal packaging for IBM Research – explains, this is a terribly inefficient use of space:

In a computer, processors occupy one-millionth of the volume. In a brain, it’s 40 percent. Our brain is a volumetric, dense, object.

In short, communication links between processing elements can’t keep up with data-transfer demands, and they consume too much power as well. The proposed solution is to stack and link chips into dense 3D configurations, a process which is impossible today because stacking even two chips means crippling overheating problems. That’s where the “liquid blood” comes in, at least as far as cooling is concerned.

This process is demonstrated with the company’s prototype system called Aquasar. By branching chips into a network of liquid cooling channels that funnel fluid into ever-smaller tubes, the chips can be stacked together in large configurations without overheating. The liquid passes not next to the chip, but through it, drawing away heat in the thousandth of a second it takes to make the trip.

In addition, IBM is also developing a system called a redox flow battery that uses liquid to distribute power instead of wires. Two types of electrolyte fluid, each with oppositely charged electrical ions, circulate through the system to distribute power, much in the same way that the human body provides oxygen, nutrients and cooling to the brain through the blood.

The electrolytes travel through ever-smaller tubes that are about 100 microns wide at their smallest – the width of a human hair – before handing off their power to conventional electrical wires. Flow batteries can produce between 0.5 and 3 volts, and that in turn means IBM can use the technology today to supply 1 watt of power for every square centimeter of a computer’s circuit board.
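That figure of 1 watt per square centimeter translates directly into a power budget for a stacked-chip layer. A tiny sanity-check sketch, with a hypothetical board size:

```python
# Sanity check of the quoted figure: 1 W of delivered power per square
# centimeter of circuit board. The board dimensions are hypothetical.

POWER_DENSITY_W_PER_CM2 = 1.0  # from the article

def board_power(width_cm, height_cm, density=POWER_DENSITY_W_PER_CM2):
    """Power a flow-battery network could deliver to one board layer."""
    return width_cm * height_cm * density

# A hypothetical 10 cm x 10 cm stacked-chip layer:
print(board_power(10, 10))  # -> 100.0 watts
```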

Already, the IBM Blue Gene supercomputer has been used for brain research by the Blue Brain Project at the École Polytechnique Fédérale de Lausanne (EPFL) in Lausanne, Switzerland. Working with the HBP, their next step will be to augment a Blue Gene/Q with additional flash memory at the Swiss National Supercomputing Center.

After that, they will begin simulating the inner workings of the mouse brain, which consists of 70 million neurons. By the time they are conducting human brain simulations, they plan to be using an “exascale” machine – one that performs 1 exaflops, or a quintillion floating-point operations per second. This will take place at the Juelich Supercomputing Center in northern Germany.

This is no easy challenge, mainly because the brain is so complex. In addition to 100 billion neurons and 100 trillion synapses, there are 55 different varieties of neuron, and 3,000 ways they can interconnect. That complexity is multiplied by differences that appear with 600 different diseases, genetic variation from one person to the next, and changes that go along with the age and sex of humans.

As Henry Markram, who has led the Blue Brain project at EPFL for years, put it:

If you can’t experimentally map the brain, you have to predict it — the numbers of neurons, the types, where the proteins are located, how they’ll interact. We have to develop an entirely new science where we predict most of the stuff that cannot be measured.

With the Human Brain Project, researchers will use supercomputers to reproduce how brains form in a virtual vat. Then, they will see how those brains respond to input signals from simulated senses and a simulated nervous system. If it works, actual brain behavior should emerge from the fundamental framework inside the computer; where it doesn’t work, scientists will know where their knowledge falls short.

The end result of all this will also be computers that are “neuromorphic” – capable of imitating human brains, thereby ushering in an age when machines will be able to truly think, reason, and make autonomous decisions. No more supercomputers that are tall on knowledge but short on understanding. The age of artificial intelligence will be upon us. And I think we all know what will follow, don’t we?

Yep, that’s what! And may God help us all!

Sources: news.cnet.com, extremetech.com

Synchronized VR Triggers Out of Body Experiences

“Spirit Levitation: Out of Body Experience” by Louish.Pixel

An out-of-body experience (OBE) is one of the most mysterious and inexplicable things a human being can undergo. But thanks to new science, triggering one may be as easy as getting a person to watch a video of themselves with their heartbeat projected onto it. According to the study, it’s easy to trick the mind into thinking it belongs to an external body, and to manipulate a person’s self-consciousness by externalizing the body’s internal rhythms.

These findings were made by a team consisting of Dr Jane Aspell – Senior Lecturer in Psychology at Anglia Ruskin University in the UK – and Lukas Heydrich, a PhD student at the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland. Together, the two set out to find how our internal organs contribute to bodily self-consciousness and whether they can be manipulated to induce an OBE.

The underlying goal, according to Aspell, was finding out how our body merges external information – visual, auditory, and olfactory – with information coming from within. How this leads to the perception we call “reality”, and how it could be altered, is what is being studied here for the first time:

If you think about your body, you have several sources of information about it: you can see your hands and legs, you can feel the seat you’re sitting on, you know you are standing upright thanks to your sense of balance etc. There is also a vast number of signals being sent to your brain from inside of your body every second that you are alive: about your heartbeat, your blood pressure, how full your stomach is, what electrolytes are in your blood, how fast you are breathing.

For their experiment, they attached 17 participants to electrocardiogram sensors and had them view videos of their own bodies through virtual reality goggles so that their body appeared to be two meters (6.5 ft) in front of them. Participants were then shown their own heartbeats in the form of a flashing outline around their “body doubles” that pulsed in sync with their own.
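The heart of the setup is simply timing a visual flash to the heartbeat. The sketch below illustrates the general idea only; the crude threshold detector and all function names are mine, not the study's actual software:

```python
# Illustrative sketch of the synchrony manipulation: flash the outline
# either in time with detected heartbeats or deliberately offset.

def detect_beats(ecg_samples, threshold=0.8):
    """Return sample indices where the signal crosses the threshold
    upward -- a crude stand-in for real R-peak detection."""
    beats = []
    for i in range(1, len(ecg_samples)):
        if ecg_samples[i - 1] < threshold <= ecg_samples[i]:
            beats.append(i)
    return beats

def flash_times(beat_indices, sample_rate_hz, delay_s=0.0):
    """Convert beat indices into flash onset times; a nonzero delay
    would produce an asynchronous control condition."""
    return [i / sample_rate_hz + delay_s for i in beat_indices]

# Two fake beats in a 1 kHz trace:
ecg = [0.0] * 10 + [1.0] + [0.0] * 10 + [1.0] + [0.0] * 5
beats = detect_beats(ecg)
print(beats)                     # -> [10, 21]
print(flash_times(beats, 1000))  # -> [0.01, 0.021]
```

With `delay_s=0.0` the outline pulses in sync with the participant's own heart, which is the condition that shifted their sense of self toward the virtual body.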

After a few minutes, many of the participants reported sensations of being in an entirely different part of the room than their physical body, and of feeling that their “selves” were closer to their virtual doubles. According to the team, this is the first study to clearly show how visual signals containing information about the body’s internal organs (i.e. the heartbeat) can change people’s perception of themselves. As Aspell put it:

It confirms that the brain is able to integrate visual information with cardiac information. It seems that the brain is very sensitive to patterns in the world which may relate to self – when the flashing was synchronous with the heartbeat this caused changes to subjects’ self-perception.

While it may sound like technologically-inspired mysticism, the research has several medical applications. One option is to help people with distorted views of themselves – those with anorexia, bulimia or other perceptual disorders – connect with their actual physical appearance. Aspell is currently studying “yo-yo” dieters and says she plans to continue investigating how the internal body shapes who we are.

The Swiss National Science Foundation and the Fondation Bertarelli supported the study which is slated for publication in the APS journal Psychological Science.

Source: gizmag.com

The Future is Here: Blood Monitoring Implants!

nanorobot1

The realm of nanotechnology, which once seemed like the stuff of science fiction, is getting closer to realization with every passing year. And with all the innovations taking place in tiny-scale manufacturing, molecular research, and DNA structures, we could be looking at an age where tiny machines regulate our health, construct buildings, assemble atomic structures, and even contain enough hardware to run complex calculations.

One such innovation was announced back in March by the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland, where researchers created the world’s smallest medical implant capable of monitoring critical chemicals in the blood. Measuring a mere 14 mm in length, the device can measure up to five indicators – such as proteins, glucose, lactate and ATP – and then transmit this information to a smartphone via Bluetooth.

implantable-sensor-640x353

In short, it is capable of providing valuable information that may help track and prevent heart attacks and monitor for indications of harmful conditions, like diabetes. Each sensor is coated with an enzyme that reacts with blood-borne chemicals to generate a detectable signal, and is paired with a wearable battery that provides the 100 milliwatts of power that the device requires by wireless inductive charging through the skin.
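A toy sketch of the kind of logic such an implant's firmware might run can make the idea concrete. All channel names, calibration constants, and safety thresholds below are invented for illustration; they are not the EPFL device's actual parameters:

```python
# Toy model of an implant's monitoring loop: read each enzyme-coated
# sensor, convert the raw signal to a concentration, and flag anything
# outside a safe range (values it would push to a phone via Bluetooth).

CALIBRATION = {          # (slope, intercept) per channel -- made up
    "glucose": (0.5, 1.0),
    "lactate": (0.3, 0.2),
    "atp":     (0.1, 0.0),
}

SAFE_RANGES = {          # acceptable concentration windows -- made up
    "glucose": (3.9, 7.8),
    "lactate": (0.5, 2.2),
    "atp":     (0.0, 1.5),
}

def to_concentration(channel, raw_signal):
    """Apply a simple linear calibration to a raw sensor signal."""
    slope, intercept = CALIBRATION[channel]
    return slope * raw_signal + intercept

def check_readings(raw):
    """Return the channels whose readings fall outside the safe range."""
    alerts = []
    for channel, signal in raw.items():
        value = to_concentration(channel, signal)
        low, high = SAFE_RANGES[channel]
        if not (low <= value <= high):
            alerts.append(channel)
    return alerts

print(check_readings({"glucose": 16.0, "lactate": 2.0, "atp": 5.0}))
# -> ['glucose']
```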

For patient monitoring, such a device has so many useful applications that it is likely to become indispensable once introduced. In cancer treatment, for example, numerous blood tests are often required to calibrate treatments according to the patient’s particular ability to break down and excrete drugs. And since these parameters often change due to the patient’s reaction to said treatments, anything that can provide up-to-the-minute monitoring will spare the patient countless invasive tests.

nanotech-2

In addition, in cases of heart attacks, warning signs are often visible in the hours before the event. As fatigued or oxygen-starved muscle begins to break down, it releases fragments of the heart-specific muscle protein known as troponin. If this protein can be detected before the heart rhythm is disrupted, or before the actual attack, lifesaving preemptive treatment can be initiated sooner.

At the moment, the implants are limited by the number of sensors they hold. But there is no theoretical limit to how many sensors each implant can hold. In the future, such a device could be equipped with electronics to monitor for strokes, blood clots, high cholesterol, cancer cells, HIV, parasites, viruses, and even the common cold (assuming such a thing continues to exist!). Just think about it.

You’re going about your daily activities when suddenly, you get a ringtone alerting you that you’re about to experience a serious health concern. Or maybe that the heavy lunch you just ate raised the level of LDL cholesterol in your bloodstream to an unwanted level. Tell me, on a scale of one to ten, how cool would that be?

Source: extremetech.com