For years, biomedical researchers have been developing robotic prosthetics of greater and greater sophistication. From analog devices that can be quickly and cheaply manufactured by a 3-D printer, to mind-controlled prosthetics that move, to ones that both move and relay sensory information, the technology is growing by leaps and bounds. And just last week, the FDA officially announced it had approved the first prosthetic arm that’s capable of performing multiple simultaneous powered movements.
The new Deka arm – codenamed Luke, after Luke Skywalker’s artificial hand – was developed by Dean Kamen, inventor of the Segway. The project began in 2006 when DARPA funded multiple research initiatives in an attempt to create a better class of prosthetic device for veterans returning home from the Iraq War. Now, the FDA’s approval is a huge step for the Deka, as it means the devices are now clear for sale — provided the company can find a commercial partner willing to bring them to market.
Unlike simpler devices, the Deka Arm System is a battery-powered prosthesis that combines multiple control approaches. Some of the Deka’s functions are controlled by myoelectricity: the device senses activity in various muscle groups via attached electrodes, then converts those muscle signals into motor control. This gives the user a more natural and intuitive method of controlling the arm than relying on a cross-body pulley system.
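The basic myoelectric control loop can be sketched in a few lines. This is a hypothetical illustration, not Deka’s actual firmware: a rectified, smoothed EMG "envelope" from an electrode is compared against a calibrated resting threshold, and activation above that threshold is scaled into a motor speed command.

```python
# Hypothetical sketch of a myoelectric control loop (not Deka's firmware).
# A smoothed EMG envelope from one electrode is mapped to a motor command.

def emg_to_motor_command(envelope_uv, rest_threshold_uv=20.0, max_uv=400.0):
    """Map a smoothed EMG envelope (microvolts) to a motor speed in [0, 1]."""
    if envelope_uv <= rest_threshold_uv:      # below threshold: muscle at rest
        return 0.0
    # Scale activation above the resting threshold into a 0-1 command.
    span = max_uv - rest_threshold_uv
    return min((envelope_uv - rest_threshold_uv) / span, 1.0)

# Example readings: resting, light contraction, strong contraction.
for reading in (10.0, 120.0, 500.0):
    print(reading, "->", round(emg_to_motor_command(reading), 3))
```

In a real device the thresholds are calibrated per user and per electrode site, and several channels are combined to select among grips and joint movements.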
The more advanced myoelectric systems can even transmit sensation back to the wearer, using the same system of electrodes to simulate the feeling of pressure. This kind of control flexibility is essential to creating a device that can address the wide range of needs among amputees, and the Deka’s degree of fine-grained control is remarkable. Not only are users able to perform a wide range of movements and articulations with the hand, they are able to sense what they are doing thanks to the small pads on the fingertips and palm.
Naturally, the issue of price remains, and it is the greatest challenge facing the wide-scale adoption of these kinds of devices. A simple prosthetic arm is likely to cost $3,000, while a sophisticated prosthesis can run as much as $50,000. In many cases, limbs have a relatively short lifespan, with wear and tear requiring a replacement device every 3 to 4 years. That’s why 3-D printed variations, which do not boast much sophistication, are considered a popular option.
Visual presentation is also a major issue, as amputees often own multiple prostheses (including cosmetic ones) simply to avoid the embarrassment of wearing an obviously artificial limb. That’s one reason why the Deka Arm System’s design has evolved towards a much more normal-looking hand. Many amputees don’t want to wear a crude-looking mechanical device.
At present, the prosthetic market is still too broad, and the needs of amputees too specific, to declare any single device a one-size-fits-all success. But the Deka looks as though it could move prosthetic science forward and offer a significant number of veterans and amputees a device that more closely mimics natural human function than anything we’ve seen before. What’s more, combined with mind-controlled legs, bionic eyes and replacement organs, it is a major step forward in the ongoing goal of making disability a thing of the past.
And in the meantime, check out this DARPA video of the Deka Arm being tested:
It seems like I’ve been writing endlessly about bionic prosthetics lately, thanks to the many breakthroughs that have been happening almost back to back. But I would be remiss if I didn’t share these latest two. In addition to showcasing some of the latest technological innovations, these stories are inspiring and show the immense potential bionic prosthetics have to change lives and help people recover from terrible tragedies.
For instance, on the TED stage this week in Vancouver, which included presentations from astronaut Chris Hadfield, NSA whistleblower Edward Snowden, and anti-corruption activist Charmian Gooch, there was one presentation that really stole the show: Adrianne Haslet-Davis, a former dance instructor and a survivor of the Boston Marathon bombing, dancing again for the first time. And it was all thanks to a bionic limb developed by noted bionics researcher Hugh Herr.
As the director of the Biomechatronics Group at the MIT Media Lab, Herr is known for his work on high-tech bionic limbs and for demonstrating new prosthetic technologies on himself. At 17, he lost both his legs in a climbing accident. After discussing the science of bionic limbs, Herr brought out Adrianne, who for the first time since her leg amputation, performed a short ballroom dancing routine.
This was made possible thanks to a special kind of bionic limb designed by Herr and his colleagues at MIT specifically for dancing. The design process took over 200 days, during which the researchers studied dance, brought in dancers with biological limbs, studied how they moved, and examined the forces they applied on the dance floor. What resulted was a “dance limb” with 12 sensors, a synthetic motor system that can move the joint, and microprocessors that run the limb’s controllers.
The system is programmed so that the motor moves the limb in a way that’s appropriate for dance. As Herr explained in a briefing after his talk:
It was so new. We had never looked at something like dance. I understand her dream and emotionally related to her dream to return to dance. It’s similar to what I went through.

Herr says he’s now able to climb at a more advanced level than when he had biological legs.
Haslet-Davis’s new limb is only intended for dancing; she switches to a different bionic limb for regular walking. And while this might seem like a limitation, it in fact represents a major step in the direction of bionics that can emulate a much wider range of human motion. Eventually, Herr envisions a day when bionic limbs can switch modes for different activities, allowing a person to perform a range of different tasks – walking, running, dancing, athletic activity – without having to change prosthetics.
In the past, Herr’s work has been criticized by advocates who argue that bionic limbs are a waste of time when many people don’t even have access to basic wheelchairs. He argues, however, that bionic limbs – which can cost as much as a nice car – ultimately reduce health care costs. For starters, they allow people to return to their jobs quickly, Herr said, thus avoiding workers’ compensation costs.
They can also prevent injuries resulting from prosthetics that don’t emulate normal function as effectively as high-tech limbs. And given that the technology is becoming more widespread and additive manufacturing is lowering production costs, there may yet come a day when a bionic prosthetic is not beyond the means of the average person. Needless to say, both Adrianne and the crowd were moved to tears by the inspiring display!
Next, there’s the inspiring story of Igor Spetic, a man who lost his right arm three years ago in a workplace accident. Like most people forced to live with the loss of a limb, he quickly came to understand the limitations of prosthetics. While they do restore some degree of ability, the fact that they cannot convey sensation means that wearers are often unaware when they have dropped or crushed something.
Now, Spetic is one of several people taking part in early trials at the Cleveland Veterans Affairs Medical Center, where researchers from Case Western Reserve University are working on prosthetics that offer sensation as well as ability. In a basement lab, the trials consist of connecting his arm to a prosthetic hand, one rigged with force sensors that are plugged into 20 wires protruding from his upper right arm.
These wires lead to three surgically implanted interfaces, seven millimeters long, with as many as eight electrodes apiece encased in a polymer, that surround three major nerves in Spetic’s forearm. Meanwhile, a nondescript white box of custom electronics does the job of translating information from the sensors on Spetic’s prosthesis into a series of electrical pulses that the interfaces can translate into sensations.
According to the trial’s leader, Dustin Tyler – a professor of biomedical engineering at Case Western Reserve University and an expert in neural interfaces – this technology is “20 years in the making”. As of this past February, the implants had been in place and performing well in tests for more than a year and a half. Tyler’s group, drawing on years of neuroscience research on the signaling mechanisms that underlie sensation, has developed a library of patterns of electrical pulses to send to the arm nerves, varied in strength and timing.
Spetic says that these different stimulus patterns produce distinct and realistic feelings in 20 spots on his prosthetic hand and fingers. The sensations include pressing on a ball bearing, pressing on the tip of a pen, brushing against a cotton ball, and touching sandpaper. During the first day of tests, Spetic noticed a surprising side effect: his phantom fist felt open, and after several months the phantom pain was “95 percent gone”.
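The core idea of encoding touch as stimulation can be sketched as a toy model. This is purely illustrative and not the Case Western group’s actual pulse library: a fingertip force reading is translated into a pulse train whose amplitude and rate both grow with pressure, the two dimensions (strength and timing) the researchers vary.

```python
# Toy model of force-to-stimulation encoding (illustrative only; not the
# actual Case Western pulse library). A fingertip force sensor reading is
# translated into a pulse train whose strength and rate grow with pressure.

def encode_force(force_newtons, max_force=10.0,
                 max_amplitude_ma=2.0, max_rate_hz=100.0):
    """Return (pulse amplitude in mA, pulse rate in Hz) for a given force."""
    level = max(0.0, min(force_newtons / max_force, 1.0))  # clamp to [0, 1]
    return (level * max_amplitude_ma, level * max_rate_hz)

amp, rate = encode_force(2.5)   # a light grip on an object
print(f"{amp:.2f} mA at {rate:.0f} Hz")
```

The real system maps 20 distinct sensor locations to 20 electrode sites, so each spot on the hand gets its own calibrated pattern.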
To test the hand’s ability to provide sensory feedback, and hence aid the user in performing complex tasks, Spetic and other trial candidates were tasked with picking up small blocks that were attached to a table with magnets, as well as handling and removing the stems from a bowl of cherries. With sensation restored, he was able to pick up cherries and remove stems 93 percent of the time without crushing them, even blindfolded.
Impressive as this is, Tyler estimates that completing the pilot study, refining stimulation methods, and launching full clinical trials is likely to take 10 years. He is also finishing development of an implantable electronic device to deliver stimuli so that the technology can make it beyond the lab and into a household setting. Last, he is working with manufacturers of prostheses to integrate force sensors and force processing technology directly into future versions of the devices.
As for Spetic, he has drawn quite a bit of inspiration from the trials and claims that they have left him thinking wistfully about what the future might bring. As he put it, he feels:
…blessed to know these people and be a part of this. It would be nice to know I can pick up an object without having to look at it, or I can hold my wife’s hand and walk down the street, knowing I have a hold of her. Maybe all of this will help the next person.
This represents merely one of several successful attempts to merge nerve stimulation with nerve control, leading to bionic limbs that not only obey the user’s commands, but provide sensory feedback at the same time. Given a few more decades of testing and development, we will almost certainly be looking at an age where bionic limbs that are virtually indistinguishable from the real thing exist and are readily available.
And in the meantime, enjoy this news story of Adrianne Haslet-Davis performing her ballroom dance routine at TED. I’m sure you’ll find it inspiring!
There seems to be no shortage of medical breakthroughs these days! Whether it’s bionic limbs, 3-D printed prosthetic devices, bioprinting, new vaccines and medicines, nanoparticles, or embedded microsensors, researchers and medical scientists are bringing innovation and technological advancement together to create new possibilities. And in recent months, two breakthroughs in particular have become the focus of attention, offering the possibility of smarter surgery and health monitoring.
First up, there’s the tiny bladder sensor being developed by the Norwegian research group SINTEF. Because patients suffering from paralysis cannot feel when their bladders are full, para- and quadriplegics often suffer from pressure build-up that can damage the bladder and kidneys. This sensor would offer a less invasive means of monitoring their condition, to see whether surgery is required or medication will suffice.
Presently, doctors insert a catheter into the patient’s urethra and fill the bladder with saline solution, a process which is not only uncomfortable but is claimed to provide an inaccurate picture of what’s going on. By contrast, this sensor can be injected into the patient directly through the skin, and could conceivably stay in place for months or even years, providing readings without any discomfort, and without requiring the bladder to be filled mechanically.
Patients would also be able to move around normally, and the risk of infection would reportedly be reduced. Currently, readings are transmitted from the prototypes via a thin wire that extends from the sensor out through the skin, although it is hoped that subsequent versions could transmit wirelessly – most likely to the patient’s smartphone. And given that SINTEF’s resume includes making sensors for the CERN particle collider, you can be confident these sensors will work!
Next month, a clinical trial involving three spinal injury patients is scheduled to begin at Norway’s Sunnaas Hospital. Down the road, the group plans to conduct trials involving 20 to 30 test subjects. Although the sensors are about to be tested in the bladder, they could conceivably be used to measure pressure almost anywhere in the body – sensors that monitor blood pressure and warn of aneurysms or stroke, for example, could be developed.
Equally impressive is the tiny, doughnut-shaped sensor being developed by Prof. F. Levent Degertekin and his research group at the George W. Woodruff School of Mechanical Engineering at Georgia Tech. Designed to assist doctors as they perform surgery on the heart or blood vessels, this device could provide some much needed (ahem) illumination. Currently, doctors and scientists rely on images provided by cross-sectional ultrasounds, which are limited in terms of the information they provide.
As Degertekin explains:
If you’re a doctor, you want to see what is going on inside the arteries and inside the heart, but most of the devices being used for this today provide only cross-sectional images. If you have an artery that is totally blocked, for example, you need a system that tells you what’s in front of you. You need to see the front, back, and sidewalls altogether.
That’s where their new chip comes into play. Described as a “flashlight” for looking inside the human body, it’s basically a tiny doughnut-shaped sensor measuring 1.5 millimeters (less than a tenth of an inch) across, with the hole set up to take a wire that would guide it through cardiac catheterization procedures. In that tiny space, the researchers were able to cram 56 ultrasound transmitting elements and 48 receiving elements.
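With dozens of transmitting and receiving elements in hand, the standard way to turn their echoes into a forward-looking image is delay-and-sum beamforming. The sketch below is a generic textbook illustration with made-up geometry and sample rate, not the Georgia Tech chip’s actual signal chain: each element’s recording is read at the sample corresponding to the echo’s travel time from a chosen focal point, and the readings are summed so that echoes from that point add coherently.

```python
# Generic delay-and-sum beamforming sketch (invented geometry and numbers;
# not the Georgia Tech chip's actual signal chain). Echoes from a chosen
# focal point add coherently when each element is read at the right delay.

SPEED_OF_SOUND = 1540.0     # m/s, typical for soft tissue

def delay_and_sum(element_positions, recordings, focus, sample_rate_hz):
    """Sum each element's recording at the sample matching the echo's
    travel time from the focal point back to that element."""
    total = 0.0
    fx, fy, fz = focus
    for (x, y, z), samples in zip(element_positions, recordings):
        dist = ((x - fx) ** 2 + (y - fy) ** 2 + (z - fz) ** 2) ** 0.5
        delay = int(round(dist / SPEED_OF_SOUND * sample_rate_hz))
        if delay < len(samples):
            total += samples[delay]
    return total

# Two elements 1 mm apart; a reflector 10 mm straight ahead produces echoes
# whose arrival samples (at 40 MHz) differ slightly between elements.
elements = [(0.0, 0.0, 0.0), (0.001, 0.0, 0.0)]
recordings = [[0.0] * 300, [0.0] * 300]
recordings[0][260] = 1.0    # arrival samples computed from the geometry above
recordings[1][261] = 1.0
print(delay_and_sum(elements, recordings, (0.0, 0.0, 0.01), 40e6))
```

Sweeping the focal point over a grid of positions in front of the catheter yields the full volumetric “flashlight” view.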
So that the mini monitor doesn’t boil patients’ blood by generating too much heat, it’s designed to shut its sensors down when they’re not in use. In a statement released from the university, Degertekin explained how the sensor will help doctors to better perform life-saving operations:
Our device will allow doctors to see the whole volume that is in front of them within a blood vessel. This will give cardiologists the equivalent of a flashlight so they can see blockages ahead of them in occluded arteries. It has the potential for reducing the amount of surgery that must be done to clear these vessels.
Next up are the usual animal studies and clinical trials, which Degertekin hopes will be conducted by licensing the technology to a medical diagnostic firm. The researchers are also going to see if they can make their device even smaller – small enough to fit on a 400-micron-diameter guide wire, which is roughly four times the diameter of a human hair. At that size, this sensor will be able to provide detailed, on-the-spot information about any part of the body, and go wherever doctors can guide it.
Such is the nature of the new age of medicine: smaller, smarter, and less invasive, providing better information to both save lives and improve quality of life. Now if we can just find a cure for the common cold, we’d be in business!
3-D printing is leading to a revolution in manufacturing, and the list of applications grows with each passing day. But more important is the way it is coming together with other fields of research to make breakthroughs more affordable and accessible. Nowhere is this more true than in the fields of robotics and medicine, where printing techniques are producing a new generation of bionic and mind-controlled prosthetics.
For example, 3D Systems (an additive manufacturing company) and EksoBionics (a company specializing in bionic prosthetic devices) recently partnered to produce a new “bespoke” exoskeleton that will restore ambulatory ability to paraplegics. The prototype was custom made for a woman named Amanda Boxtel, who was paralyzed in a tragic skiing accident in 1992.
Designers from 3D Systems began by scanning her body, digitizing the contours of her spine, thighs, and shins; a process that helped them mold the robotic suit to her needs and specifications. They then combined the suit with a set of mechanical actuators and controls made by EksoBionics. The result, said 3D Systems, is the first-ever “bespoke” exoskeleton.
Intrinsic to the partnership between 3D Systems and EksoBionics was the common goal of finding a way to fit the exoskeleton comfortably to Boxtel’s body. One of the greatest challenges with exosuits and prosthetic devices is keeping the hard parts from bumping into “bony prominences,” such as the knobs on the wrists and ankles. These areas are not only sensitive, but prolonged exposure to hard surfaces can lead to a slew of health problems, given time.
As Scott Summit, the senior director for functional design at 3D Systems, explained it:
[Such body parts] don’t want a hard surface touching them. We had to be very specific with the design so we never had 3D-printed parts bumping into bony prominences, which can lead to abrasions [and bruising].
One problem that the designers faced in this case was that a paralyzed person like Boxtel often can’t know that bruising is happening, because she can’t feel it. This is dangerous because undetected bruises or abrasions can become infected. In addition, because 3D printing allows the creation of very fine details, Boxtel’s suit was designed to allow her skin to breathe, meaning she can walk around without sweating too much.
The process of creating the 3D-printed robotic suit lasted about three months, starting when Summit and 3D Systems CEO Avi Reichenthal met Boxtel during a visit to EksoBionics. Boxtel is one of ten EksoBionics “test pilots”, and the exoskeleton was already designed to attach to the body very loosely with Velcro straps, with an adjustable fit. But it wasn’t yet tailored to fit her alone.
That’s where 3D Systems came into play, by using a special 3D scanning system to create the custom underlying geometry that would be used to make the parts that attach to the exoskeleton. As Boxtel put it:
When the robot becomes the enabling device to take every step for the rest of your life, the connection between the body and the robot is everything. So our goal is to enhance the quality of that connection so the robot becomes more symbiotic.
And human beings aren’t the only ones who are able to take advantage of this marriage between 3-D printing and biomedicine. Not surprisingly, animals are reaping the benefits of all the latest technological breakthroughs in these fields as well, as evidenced by the little duck named Dudley from the K911 animal rescue service in Sicamous, Canada.
Not too long ago, Dudley lost a leg when a chicken in the same pen mauled him. But thanks to a 3-D printed leg design, especially made for him, he can now walk again. It was created by Terence Loring of 3 Pillar Designs, a company that specializes in 3D-printing architectural prototypes. After hearing of Dudley’s plight through a friend, he decided to see what he could do to help.
Unlike a previous printed limb – the foot fashioned for Buttercup the duck – Loring sought to create an entire limb that could move. The first limb he designed had a jointed construction and was fully 3D-printed in plastic. Unfortunately, the leg broke the moment Dudley put it on, forcing Loring to go back to the drawing board for a one-piece design printed from softer plastic.
The subsequent leg he created had no joints and could bend on its own. And when Dudley put it on, he started walking straight away and without hesitation. Issues remain to be solved, like how to prevent friction sores – a problem that Mike Garey (who designed Buttercup’s new foot) solved with a silicone sock and prosthetic gel liner.
Nevertheless, Dudley is nothing if not as happy as a duck in a pond, and it seems very likely that any remaining issues will be ironed out in time. In fact, one can expect that veterinary medicine will fully benefit from the wide range of 3D printed prosthetic devices and even bionic limbs as advancement and research continues to produce new and exciting possibilities.
And in the meantime, enjoy the following videos which show both Amanda Boxtel and Dudley the duck enjoying their new devices and the ways in which they help bring mobility back to their worlds:
Nanotechnology has long been the dream of researchers, scientists and futurists alike, and for obvious reasons. If machinery were small enough to be microscopic, or so small that it could only be measured on the atomic level, just about anything would be possible. These include constructing buildings and products from the atomic level up, which would revolutionize manufacturing as we know it.
In addition, microscopic computers, smart cells and materials, and electronics so infinitesimally small that they could be merged with living tissues would all be within our grasp. And it seems that at least once a month, universities, research labs, and even independent skunkworks are unveiling new and exciting steps that are bringing us ever closer to this goal.
One such breakthrough comes from the University of North Carolina at Chapel Hill, where biomedical scientists and engineers have joined forces to create the “smart sponge”. Microscopic spheres – just 250 micrometers across, and potentially as small as 0.1 micrometers – these new sponges are similar to nanoparticles in that they are intended to be the next generation of delivery vehicles for medication.
Each sponge is mainly composed of a polymer called chitosan, which does not occur naturally but can be produced easily from the chitin in crustacean shells. The long polysaccharide chains of chitosan form a matrix in which tiny porous nanocapsules are embedded; these capsules can be designed to respond to the presence of some external compound – be it an enzyme, blood sugar, or a chemical trigger.
So far, the researchers have tested the smart sponges with insulin, so the nanocapsules in this case contained glucose oxidase. As the level of glucose in a diabetic patient’s blood increases, it triggers the nanocapsules in the smart sponge to begin releasing hydrogen ions, which impart a positive charge to the chitosan strands. This in turn causes them to spread apart and begin to slowly release insulin into the blood.
The process is also self-limiting: as glucose levels in the blood come down after the release of insulin, the nanocapsules deactivate and the positive charge dissipates. Without all those hydrogen ions in the way, the chitosan can come back together to keep the remaining insulin inside. The chitosan is eventually degraded and absorbed by the body, so there are no long-term health effects.
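That self-limiting feedback loop can be captured in a few lines of simulation. This is a toy model with invented constants, not the UNC group’s actual release kinetics: high glucose opens the chitosan matrix and releases insulin, the released insulin lowers glucose, and once glucose dips below the trigger threshold the matrix closes, sealing in whatever insulin remains.

```python
# Toy simulation of the smart sponge's self-limiting insulin release
# (invented constants, not the UNC group's actual kinetics).

THRESHOLD_MG_DL = 180.0   # assumed glucose level that triggers release

def simulate_sponge(glucose, reservoir, steps=6):
    """Step a crude glucose/insulin feedback loop; return the final state."""
    for _ in range(steps):
        if glucose > THRESHOLD_MG_DL and reservoir > 0:
            dose = min(1.0, reservoir)    # matrix opens: slow, partial release
            reservoir -= dose
            glucose -= dose * 25.0        # released insulin lowers blood sugar
        # below threshold: matrix closes, remaining insulin stays sealed in
    return glucose, reservoir

final_glucose, leftover = simulate_sponge(glucose=240.0, reservoir=4.0)
print(final_glucose, leftover)   # release stops once glucose dips below 180
```

The point the toy model makes is the same one the article does: the sponge stops dosing on its own, with unused insulin held in reserve for the next spike.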
One of the chief benefits of this kind of system, much like with nanoparticles, is that it delivers medication when it’s needed, where it’s needed, and in amounts appropriate to the patient’s needs. So far, the team has had success treating diabetes in rats, but it plans to move on to human trials and to branch out to treat other types of disease.
Cancer is a prime candidate, and the University team believes it can be treated without an activation system of any kind. Tumors are naturally highly acidic environments, which means a lot of free hydrogen ions. And since hydrogen ions are precisely what trigger the diabetic smart sponge anyway, it can be filled with small amounts of chemotherapy drugs that would automatically be released in areas with cancer cells.
Another exciting breakthrough comes from the University of California at Berkeley, where medical researchers are working towards tiny, implantable sensors. As all medical researchers know, the key to understanding and treating neurological problems is gathering real-time, in-depth information on the subject’s brain. Unfortunately, machines like MRI and positron emission tomography (PET) scanners aren’t exactly portable and are expensive to run.
Implantable devices are fast becoming a solution to this problem, offering real-time data that comes directly from the source and can be accessed wirelessly at any time. So far, this has taken the form of temporary medical tattoos or tiny sensors intended to be implanted in the bloodstream. But what the researchers at Berkeley are proposing is something much more radical.
In a recent research paper, they proposed a design for a new kind of implantable sensor: an intelligent dust that can infiltrate the brain, record data, and communicate with the outside world. The preliminary design was undertaken by Berkeley’s Dongjin Seo and colleagues, who described a network of tiny sensors – each package no more than 100 micrometers in diameter. Hence the term they used: “neural dust”.
The smart particles would all contain a very small CMOS sensor capable of measuring electrical activity in nearby neurons. The researchers also envision a system where each particle is powered by a piezoelectric material rather than tiny batteries. The particles would communicate data to an external device via ultrasound waves, and the entire package would also be coated in a polymer, thus making it bio-neutral.
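The readout scheme can be illustrated with a toy model of ultrasonic backscatter (all numbers here are invented for illustration). Each mote reflects the interrogating ultrasound pulse, and the local neural voltage it measures modulates how strongly it reflects; the transceiver then inverts that relationship to recover the voltage.

```python
# Toy illustration of ultrasonic backscatter readout as proposed for "neural
# dust" (all constants invented). A mote reflects the interrogating pulse,
# with reflectance modulated by the neural voltage it senses locally.

def backscatter_amplitude(neural_mv, interrogate_amp=1.0,
                          base_reflectance=0.5, gain_per_mv=0.004):
    """Reflected amplitude grows linearly with sensed voltage (toy model)."""
    reflectance = base_reflectance + gain_per_mv * neural_mv
    return interrogate_amp * reflectance

def decode(reflected_amp, interrogate_amp=1.0,
           base_reflectance=0.5, gain_per_mv=0.004):
    """The subdural transceiver inverts the model to recover millivolts."""
    return (reflected_amp / interrogate_amp - base_reflectance) / gain_per_mv

spike_mv = 70.0                           # a large extracellular spike
echo = backscatter_amplitude(spike_mv)
print(round(decode(echo), 1))             # recovers the sensed voltage
```

The appeal of the scheme is that the mote itself needs no battery and no radio: the same ultrasound that powers it carries its data back out.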
But of course, the dust would need to be complemented by some other implantable devices. These would likely include a larger subdural transceiver that would send the ultrasound waves to the dust and pick up the return signal. The internal transceiver would also be wirelessly connected to an external device on the scalp that contains data processing hardware, a long range transmitter, storage, and a battery.
The benefits of this kind of system are again obvious. In addition to acting like an MRI running in your brain all the time, it would allow for real-time monitoring of neurological activity for the purposes of research and medical monitoring. The researchers also see this technology as a way to enable brain-machine interfaces, something which would go far beyond current methods. Who knows? It might even enable a form of machine-based telepathy in time.
Sounds like science fiction, and it still is. Many issues need to be worked out before something of this nature would be possible or commercially available. For one, more powerful antennae would need to be designed on the microscopic scale in order for the smart dust particles to be able to send and receive ultrasound waves.
Increasing the efficiency of transceivers and piezoelectric materials will also be a necessity to provide the dust with power, otherwise they could cause a build-up of excess heat in the user’s neurons, with dire effects! But most importantly of all, researchers need to find a safe and effective way to deliver the tiny sensors to the brain.
And last, but certainly not least, nanotechnology might be offering improvements in the field of prosthetics as well. In recent years, scientists have made enormous breakthroughs in the field of robotic and bionic limbs, restoring ambulatory mobility to accident victims, the disabled, and combat veterans. But even more impressive are the current efforts to restore sensation as well.
One method, being explored at the Technion-Israel Institute of Technology, involves combining gold nanoparticles with a substrate made of polyethylene terephthalate (PET) – the plastic used in soft drink bottles. From these two materials, the researchers were able to make an ultra-sensitive film capable of transmitting electrical signals to the user, simulating the sensation of touch.
Basically, the gold-polyester nanomaterial experiences changes in conductivity as it is bent, providing an extremely sensitive measure of physical force. Tests showed that the material was able to sense pressures ranging from tens of milligrams to tens of grams, which is ten times more sensitive than any sensors being built today.
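The readout principle is simple enough to sketch. This is a hypothetical calibration with invented constants, not Technion’s published characterization: the applied load is inferred from the fractional change in the film’s resistance relative to its resting value.

```python
# Hypothetical readout for a bend-sensitive resistive film (invented
# constants, not Technion's published calibration). The applied load is
# inferred from the fractional change in the film's resistance.

def pressure_from_resistance(r_ohms, r_rest_ohms=1000.0,
                             sensitivity=0.02):  # fractional dR per gram
    """Estimate the applied load (grams) from measured resistance."""
    fractional_change = (r_ohms - r_rest_ohms) / r_rest_ohms
    return fractional_change / sensitivity

print(pressure_from_resistance(1000.0))   # no load
print(pressure_from_resistance(1100.0))   # 10% resistance change
```

In a prosthetic, the same reading would then be re-encoded as nerve stimulation, closing the loop from touch back to the wearer.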
Even better, the film maintained its sensory resolution after many “bending cycles”, meaning it delivers consistent results and should hold up over long-term use. Unlike many useful materials that can only really be used under laboratory conditions, this film can operate at very low voltages, meaning that it could be manufactured cheaply and actually be useful in real-world situations.
In their research paper, lead researcher Hossam Haick described the sensors as “flowers, where the center of the flower is the gold or metal nanoparticle and the petals are the monolayer of organic ligands that generally protect it.” The paper also states that in addition to providing pressure information (touch), the sensors in their prototype were also able to sense temperature and humidity.
Of course, a great deal of calibration is still needed so that each user’s brain can correctly interpret the electronic signals received from the artificial skin. But this is standard procedure with next-generation prosthetic devices, which rely on two-way electronic signals to provide control signals and feedback.
And these are just some examples of how nanotechnology is seeking to improve and enhance our world. When it comes to sensation and mobility, it offers solutions not only to remedy health problems or limitations, but also to enhance natural abilities. But the long-term possibilities go beyond this by many orders of magnitude.
As a cornerstone to the post-singularity world being envisioned by futurists, nanotech offers solutions to everything from health and manufacturing to space exploration and clinical immortality. And as part of an ongoing trend in miniaturization, it presents the possibility of building devices and products that are even tinier and more sophisticated than we can currently imagine.
It’s always interesting how science works by scale, isn’t it? In addition to dreaming large – looking to build structures that are bigger, taller, and more elaborate – we are also looking inward, hoping to grab matter at its most basic level. In this way, we will not only be able to plant our feet anywhere in the universe, but manipulate it on the tiniest of levels.
As always, the future is a paradox, filling people with both awe and fear at the same time.
When it comes to modern research and development, biomimetics appear to be the order of the day. By imitating the function of biological organisms, researchers seek to improve the function of machinery to the point that it can be integrated into human bodies. Already, researchers have unveiled devices that can do the job of organs, or bionic limbs that use the wearer’s nerve signals or thoughts to initiate motion.
But what of machinery that can actually send signals back to the user, registering pressure and stimulation? That’s what researchers from Georgia Tech have been working on of late, and it has inspired them to create a device that can do the job of the largest human organ of them all – our skin. Back in April, they announced that they had successfully created a type of “smart skin” that is sensitive enough to rival the real thing.
In essence, the skin is a transparent, flexible array that uses 8,000 touch-sensitive transistors (aka taxels), which emit electricity when agitated. Each of these comprises a bundle of some 1,500 zinc oxide nanowires, which connect to electrodes via a thin layer of gold, enabling the array to pick up on changes in pressure as low as 10 kilopascals, which is the same threshold human skin can detect.
Mimicking the sense of touch electronically has long been the dream of researchers, and has previously been accomplished by measuring changes in resistance. But the team at Georgia Tech experimented with a different approach, measuring the tiny polarization changes that occur when piezoelectric materials such as zinc oxide are placed under mechanical stress. In these transistors, then, piezoelectric charges control the flow of current through the nanowires.
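To make the idea concrete, here is a minimal toy model of such a taxel array. The 10-kilopascal detection threshold comes from the article; the linear pressure-to-signal mapping and the grid dimensions are illustrative assumptions, not details of the actual device.

```python
# Toy model of a taxel (touch-sensitive transistor) array.
# The 10 kPa threshold is from the article; the linear response
# above threshold is a simplifying assumption for illustration.

THRESHOLD_KPA = 10.0  # minimum pressure the array can register

def taxel_response(pressure_kpa: float) -> float:
    """Return a normalized signal for one taxel: zero below the
    detection threshold, rising linearly with pressure above it."""
    if pressure_kpa < THRESHOLD_KPA:
        return 0.0
    return (pressure_kpa - THRESHOLD_KPA) / THRESHOLD_KPA

def read_array(pressure_map):
    """Convert a 2-D map of pressures (in kPa) into taxel signals."""
    return [[taxel_response(p) for p in row] for row in pressure_map]

# A fingertip-like press: strongest in the center, fading at the edges.
touch = [
    [0.0, 12.0, 0.0],
    [15.0, 30.0, 15.0],
    [0.0, 12.0, 0.0],
]
signals = read_array(touch)
```

The point of the sketch is simply that each taxel reports independently, so the array as a whole yields a spatial image of pressure, much as skin does.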
In a recent news release, lead author Zhong Lin Wang of Georgia Tech’s School of Materials Science and Engineering said:
Any mechanical motion, such as the movement of arms or the fingers of a robot, could be translated to control signals. This could make artificial skin smarter and more like the human skin. It would allow the skin to feel activity on the surface.
When integrated into prosthetics or even robots, this will allow the user to experience the sensation of touch when using their bionic limbs. But the range of possibilities extends beyond that. As Wang explained:
This is a fundamentally new technology that allows us to control electronic devices directly using mechanical agitation. This could be used in a broad range of areas, including robotics, MEMS, human-computer interfaces, and other areas that involve mechanical deformation.
This is not the first time that bionic limbs have come equipped with electrodes to enable sensation. In fact, the robotic hand designed by Silvestro Micera of the Ecole Polytechnique Federale de Lausanne in Switzerland seeks to do the same thing. Using electrodes that connect the fingertips, palm and index finger to the wearer’s arm nerves, the device registers pressure and tension in order to help the wearer better interact with their environment.
Building on these two efforts, it is easy to get a glimpse of what future prosthetic devices will look like. In all likelihood, they will be skin-colored and covered with a soft “dermal” layer that is studded with thousands of sensors. This way, the wearer will be able to register sensations – everything from pressure to changes in temperature and perhaps even injury – from every corner of their hand.
As usual, the technology may have military uses, since the Defense Advanced Research Projects Agency (DARPA) is involved. For that matter, the U.S. Air Force, the U.S. Department of Energy, the National Science Foundation, and the Knowledge Innovation Program of the Chinese Academy of Sciences are all funding it as well. So don’t be too surprised if bots wearing a convincing suit of artificial skin start popping up in your neighborhood!
The future that is fast approaching us is one filled with possibilities, many of which were once thought to be the province of science fiction. Between tricorders and other new devices that can detect cancer sooner and at a fraction of the cost, HIV vaccines and cures, health-monitoring tattoos and bionic limbs, we could be moving into an age where all known diseases are curable and physical handicaps are a thing of the past.
And in the past few months, more stories have emerged which provide hope for millions of people living with diseases, injuries and disabilities. The first came just over three weeks ago from the University of California, Berkeley, where researchers have been working with an engineered virus which they claim could help cure blindness. As part of a gene therapy program, this treatment has been shown to effectively correct a rare form of inherited blindness.
For the past six years, medical science has been using adeno-associated viruses (AAV) as part of a gene therapy treatment to correct inherited retinal degenerative disease. However, the process has always been seen as invasive, since it involves injecting the AAVs directly into a person’s retina with a needle. What’s more, the process has shown itself to be limited, in that the injected virus does not reach all the retinal cells that need repair.
But as Professor David Schaffer, the lead researcher on the project, stated in an interview with Science Translational Medicine:
[D]octors have no choice because none of the gene delivery viruses can travel all the way through the back of the eye to reach the photoreceptors – the light sensitive cells that need the therapeutic gene.
Building on this and many more years of research, Schaffer and his colleagues developed a new process in which they generated around 100 million variants of AAV and then selected five that were effective in penetrating the retina. They then used the best of these, a strain known as 7m8, to transport genes to cure two types of hereditary blindness in a group of mice.
In each case, the engineered virus delivered the corrective gene to all areas of the retina and restored retinal cells nearly to normal. But more importantly, the virus’ ability to penetrate the retina on its own makes the process far less invasive, and will likely be far more cost-effective when adapted to humans. And the process is apparently very convenient:
[W]e have now created a virus that you just inject into the liquid vitreous humor inside the eye and it delivers genes to a very difficult-to-reach population of delicate cells in a way that is surgically non-invasive and safe. It’s a 15-minute procedure, and you can likely go home that day.
Naturally, clinical trials are still needed, but the results are encouraging, and Professor Schaffer indicated that his team is hard at work, now collaborating with physicians to identify the patients most likely to benefit from this gene-delivery technique.
Next up, there was the announcement back at the end of May that researchers from North Carolina State and the University of North Carolina Chapel Hill had found yet another medical use for nanoparticles. In their case, this consisted of combating a major health concern, especially amongst young people today: diabetes.
In a study that was published in the Journal of Agricultural and Food Chemistry, the collaborating teams indicated that their solution of nanoparticles was able to monitor blood sugar levels in a group of mice, releasing insulin when those levels got too high. Based on the results, the researchers claim that their method will also work for human beings with type 1 diabetes.
Each of the nanoparticles has a core of insulin contained within a degradable shell. When glucose levels in the blood spike to high concentrations, the shell dissolves, releasing insulin and lowering the subject’s blood sugar. The degradable nano-network was shown to work in mice, where a single injection kept blood glucose levels normal for a minimum of 10 days.
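In software terms, the nano-network behaves like a threshold-triggered dispenser. The sketch below is a loose illustration of that logic only; the glucose threshold and dose amounts are assumed values for demonstration, not figures from the study.

```python
# Toy model of a glucose-responsive nano-network: insulin-filled shells
# that dissolve when blood glucose spikes above a threshold.
# The 180 mg/dL trigger and dose size are illustrative assumptions.

RELEASE_THRESHOLD = 180.0  # mg/dL; hyperglycemia trigger (assumed)
INSULIN_PER_DOSE = 1.0     # arbitrary units released per dissolved shell

def network_step(glucose_mg_dl: float, shells_remaining: int):
    """If blood glucose exceeds the threshold and shells remain,
    one shell dissolves and releases its insulin core."""
    if glucose_mg_dl > RELEASE_THRESHOLD and shells_remaining > 0:
        return shells_remaining - 1, INSULIN_PER_DOSE
    return shells_remaining, 0.0

# Simulate a series of blood-glucose readings against a finite supply.
shells = 3
readings = [120.0, 200.0, 150.0, 210.0]
released = []
for g in readings:
    shells, dose = network_step(g, shells)
    released.append(dose)
```

The key design idea the model captures is passivity: nothing measures or decides centrally; each shell responds chemically to its local environment, and the supply is finite until the next injection.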
While the exact cause of this kind of diabetes is unknown, the effects certainly are. Patients living with this genetically-acquired form of the disease require several shots of insulin a day to keep their blood sugar levels under control. And even then, blindness, depression and even death can still result. What’s more, even when insulin shots are carefully calibrated for the individual in question, side-effects can occur.
Hence the genius behind this new method. Not only would it relieve people who have type 1 diabetes from constantly injecting themselves, it would also remove the need to monitor their own blood sugar levels since the nanoparticles would be controlling them automatically.
Zhen Gu, the study’s lead author, claimed that the technology functions essentially like a pancreas. Hence another benefit of the new method: it could render pancreatic transplants, which are often necessary for patients with diabetes, unnecessary.
And last, but certainly not least, comes news from the University of Illinois, where John Rogers is developing a series of bio-absorbable electronic circuits that could help us win the war on drug-resistant bacteria. As part of a growing trend of biodegradable, flexible electronic circuits that operate wirelessly, fighting “superbugs” is just one application for this technology, but a very valuable one.
For some time now, bacteria that are resistant to antibiotics have been spreading, threatening to put the clock back 100 years to the time when routine, minor surgery was life-threatening. Some medical experts are warning that otherwise straightforward operations could soon become deadly unless new ways to fend off these infections are found. And though bacteria can evolve ways of evading chemical assaults, they are still vulnerable to direct assault.
This is how the new bio-absorbable circuits work: by heating up the bacteria. Each circuit is essentially a miniature electric heater that can be implanted into wounds and powered wirelessly to fry bacteria during healing, before dissolving harmlessly into body fluids once its job is done. While this might sound dangerous, keep in mind that it takes only a relatively mild warming to kill bugs without causing discomfort or harm to surrounding tissues.
To fashion the circuits, Rogers and his colleagues used layers of ultra-thin wafers and silk, materials so thin that they disintegrate in water or body fluids, or (in the case of silk) are known to dissolve anyway. For the metal parts, they used extra-thin films of magnesium, which is not only harmless but in fact an essential nutrient. For semiconductors, they used silicon membranes 300 nanometres thick, which also dissolve in water.
In addition to deterring bacteria, Rogers says that implantable, bio-absorbable RF electronics could be used to stimulate nerves for pain relief, and to stimulate bone re-growth, a process long proven to work when electrodes are placed on the skin or directly on the bone. Conceivably they could also be used to precisely control drug release from implanted reservoirs.
In other words, this is just the beginning. When it comes to the future of medicine, just about every barrier that was once considered impassable is suddenly looking quite porous…