Judgement Day Update: Cheetah Robot Unleashed!

There have been lots of high-speed bio-inspired robots in recent years, as exemplified by Boston Dynamics' WildCat. But MIT's Cheetah robot, which made its big debut earlier this month, is in a class by itself. In addition to being able to run at impressive speeds, bound, and jump over obstacles, this particular biomimetic robot is battery-and-motor driven rather than powered by a gasoline engine and hydraulics, and can function untethered (i.e. not connected to a power source).

While gasoline-powered robots are still very much bio-inspired, they are dependent on sheer power to try to match the force and speed of their flesh-and-blood counterparts. They're also pretty noisy, as the demonstration of the WildCat certainly showed (video below). MIT's Cheetah takes the alternative route of applying less power but doing so more efficiently, more closely mimicking the musculoskeletal system of a living creature.

This is not only a reversal of contemporary robotics practice, but a break from history. Historically, to make a robot run faster, engineers made the legs move faster. The alternative is to keep the same stride frequency, but to push down harder at the ground with each step. As MIT's Sangbae Kim explained:

Our robot can be silent and as efficient as animals. The only things you hear are the feet hitting the ground… Many sprinters, like Usain Bolt, don’t cycle their legs really fast. They actually increase their stride length by pushing downward harder and increasing their ground force, so they can fly more while keeping the same frequency.
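The physics behind Kim's point is straightforward: average running speed is just stride frequency times stride length, so at a fixed cadence, a harder push-off (and thus a longer stride) means a faster run. Here is a minimal sketch of that relation, with hypothetical numbers chosen only for illustration (these are not MIT's figures or code):

```python
# Toy model: speed = stride frequency x stride length.
# Pushing down harder lengthens the aerial phase (longer stride),
# so speed rises even though the legs cycle no faster.

def running_speed(stride_frequency_hz, stride_length_m):
    """Average forward speed in m/s."""
    return stride_frequency_hz * stride_length_m

cadence = 2.5  # strides per second (hypothetical)
print(running_speed(cadence, 1.8))  # gentler push: 4.5 m/s (~16 km/h)
print(running_speed(cadence, 3.6))  # harder push:  9.0 m/s (~32 km/h)
```

Doubling the stride length at the same cadence doubles the speed, which is exactly the sprinter's trick Kim describes.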

MIT’s Cheetah uses much the same approach as a sprinter, combining custom-designed high-torque-density electric motors made at MIT with amplifiers that control the motors (also a custom MIT job). These two technologies, combined with a bio-inspired leg, allow the Cheetah to apply exactly the right amount of force to successfully bound across the ground and navigate obstacles without falling over.

When it wants to jump over an obstacle, it simply pushes down harder; and as you can see from the video below, the results speak for themselves. For now, the Cheetah can run untethered at around 16 km/h (10 mph) across grass, and hurdle obstacles up to 33 centimeters high. The Cheetah currently bounds – a fairly simple gait where the front and rear legs move almost in unison – but galloping, where all four legs move asymmetrically, is the ultimate goal.

With a new gait, and a little byte surgery to the control algorithms, MIT hopes that the current Cheetah can hit speeds of up to 48 km/h (30 mph), which would make it the fastest untethered quadruped robot in the world. While this is still a good deal slower than the real thing – real cheetahs can run at up to 96 km/h (60 mph) – it will certainly constitute another big step for biomimetics and robotics.

Be sure to check out the video of the Cheetah's test, and see how it differs from the tests of Boston Dynamics and DARPA's WildCat from October of last year:



Source:
extremetech.com

The Future is Here: First Android Newscasters in Japan

This past week, Japanese scientists unveiled what they claim is the world's first news-reading android. The adolescent-looking "Kodomoroid" – an amalgamation of the Japanese word "kodomo" (child) and "android" – and "Otonaroid" ("otona" meaning adult) introduced themselves at an exhibit entitled Android: What is a Human?, which is being presented at Tokyo's National Museum of Emerging Science and Innovation (Miraikan).

The androids were flanked by robotics professor Hiroshi Ishiguro and Miraikan director Mamoru Mori. Kodomoroid delivered news of an earthquake and an FBI raid to amazed reporters in Tokyo, and even poked fun at her creator, telling Ishiguro: "You're starting to look like a robot!" Otonaroid then fluffed her lines when asked to introduce herself, excusing herself by saying, "I'm a little bit nervous."

Both androids will be working at Miraikan and interacting with visitors, as part of Ishiguro's studies into human reactions to the machines. Ishiguro is well-known for his work with "geminoids", robots that bear a frightening resemblance to their creator. As part of his lecture process, Ishiguro takes his geminoid with him when he travels and even lets it deliver his lectures for him. During an interview with AFP, he explained the reasoning behind this latest exhibit:

This will give us important feedback as we explore the question of what is human. We want robots to become increasingly clever. We will have more and more robots in our lives in the future.

Granted, the unveiling did have its share of bugs. For her part, Otonaroid looked as if she could use some rewiring before beginning her new role as the museum's science communicator, her lips out of sync and her neck movements symptomatic of a bad night's sleep. But Ishiguro insisted both would prove invaluable to his continued research as museum visitors get to have conversations with the 'droids and operate them as extensions of their own bodies.

And this is just one of many forays into a world where the line between robots and humans is becoming blurred. After a successful debut earlier this month, a chatty humanoid called Pepper is set to go on sale as a household companion in Japan starting next year. Designed by SoftBank using technology acquired from French robotics company Aldebaran, each robot will cost around $2,000 – about the same as a laptop.

Pepper can communicate through emotion, speech or body language, and it's equipped with both mics and proximity sensors. It will be possible to install apps and upgrade the unit's functionality, the plan being to make Pepper far smarter than when you first bought it. It already understands 4,500 Japanese words, but perhaps more impressively, Pepper can apparently read its owner's tone of voice to gauge their disposition.

Aldebaran CEO Bruno Maisonnier claims that robots that can recognize human emotion will change the way we live and communicate. This is certainly a big step towards getting robots into our daily lives, at least if you live in Japan (the only place Pepper will be available for the time being). He also believes this is the start of a "robotic revolution" in which robotic household companions that can understand and interact with their human owners will become the norm.

Hmm, a world where robots are increasingly indistinguishable from humans, can do human jobs, and are capable of understanding and mimicking our emotions. Oh, and they live in our houses too? Yeah, I'm just going to ignore the warning bells going off in my head now! And in the meantime, be sure to check out these videos of Kodomoroid, Otonaroid, and Pepper being unveiled for the first time:

World’s First Android Newscasters:


Aldebaran’s Pepper:


Sources:
cnet.com, gizmodo.com, engadget.com, nydailynews.com

The Future is Here: Roombot Transforming Furniture

Robotic arms and other mechanisms have long been used to make or assemble furniture; but thus far, no one has ever created robots that are capable of becoming furniture. However, Swiss researchers are aiming to change that with Roombots, reconfigurable robotic modules that connect to each other to change shape and transform into different types of furniture, based on the needs and specifications of users.

Created by the Biorobotics Laboratory (BioRob) at École polytechnique fédérale de Lausanne (EPFL), the self-assembling Roombots attach to each other via connectors, which enable them to take on the desired shape. The team's main goal is to create self-assembling interactive furniture that can be used in a variety of ways. The modules were designed primarily to help the disabled or elderly by morphing to suit their needs.

Like LEGO bricks, Roombots can be stacked upon each other to create various structures and/or combined with furniture and other objects, changing not only their shape, but also their functionality. For instance, a person lying down on a Roombot bed could slowly be moved into a seated position, or a table could scoot over to a corner or tilt itself to help a book slide into a person's hands. The team has reached a number of significant milestones, such as having the Roombots move freely, that bring all this multi-functionality closer.

Each 22 cm-long module (which is made up of four half-spheres) has a wireless connection, a battery, and three motors that allow the module to pivot with three degrees of freedom. Each module also has retractable "claws" that are used to attach to other pieces to form larger structures. With a series of rotations and connections, the modules can change shape and become any of a variety of objects. A special surface with holes adapted to the Roombots' mechanical claws can also allow the modules to anchor to a wall or floor.
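To make the pivot-and-latch idea concrete, here is a hypothetical sketch of how such modules might be represented in software: three joint angles per module, plus a record of which neighbors its claws are latched onto. All class and method names here are invented for illustration – none of this comes from EPFL's actual codebase.

```python
# Toy model of a reconfigurable module: three rotational DOF and
# retractable claws that latch onto neighboring modules.

class Module:
    def __init__(self, module_id):
        self.module_id = module_id
        self.joint_angles = [0.0, 0.0, 0.0]  # three motors, three DOF
        self.neighbors = set()               # modules latched via claws

    def rotate_joint(self, joint, angle_deg):
        """Pivot one of the three internal motors."""
        self.joint_angles[joint] = angle_deg % 360

    def latch(self, other):
        """Claws engage: both modules record the connection."""
        self.neighbors.add(other.module_id)
        other.neighbors.add(self.module_id)

# Two modules collaborating (e.g. to climb over a convex edge):
a, b = Module("A"), Module("B")
a.latch(b)
a.rotate_joint(0, 90)
print(a.neighbors)  # {'B'}
```

In this picture, "reconfiguring into a chair or table" is just a sequence of latch/unlatch and rotate operations applied across the set of modules.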

The Roombots can even climb up a wall or over a step, when the surface is outfitted with connector plates. They're also capable of picking up connector plates and arranging them to form, say, a table's surface. Massimo Vespignani, a PhD student at BioRob, explained the purpose of this design and its advantages in a recent interview with Gizmag:

We start from a group of Roombot modules that might be stacked together for storage. The modules detach from this pile to form structures of two or more modules. At this point they can start moving around the room in what we call off-grid locomotion…

A single module can autonomously reach any position on a plane (this being on the floor, walls, or ceiling), and overcome a concave edge. In order to go over convex edges two modules need to collaborate…

The advantage would be that the modules can be tightly packed together for transportation and then can reconfigure into any type of structure (for example a robotic manipulator)…

We can ‘augment’ existing furniture by placing compatible connectors on it and attaching Roombots modules to allow it to move around the house.

The range of applications for this kind of robotics is virtually infinite. For example, as seen in the video below, a series of Roombots acting as a table's feet not only lets the table move around the room and come to its owner, but adjust its height as well. Auke Ijspeert, head of BioRob, envisions that this type of customization could be used for physically challenged people, who could greatly benefit from furniture that adapts to their needs and movements.

As he said in a recent statement:

It could be very useful for disabled individuals to be able to ask objects to come closer to them, or to move out of the way. [They could also be used as] ‘Lego blocks’ [for makers to] find their own function and applications.

Meanwhile, design students at ENSCI Les Ateliers in France have come up with several more ideas for uses of Roombots, such as flower pots that can move from window to window around a building, and modular lighting components and sound systems. Similar to MIT's more complex self-assembling M-Blocks – programmable cube robots with no external moving parts – Roombots represent a step in the direction of self-assembling robots that are capable of taking on just about any task.

For instance, imagine a series of small robotic modules that could be used for tasks like repairing bridges or buildings during emergencies. Simply release them from their container and feed them the instructions, and they assemble to prop up an earthquake-stricken structure or a fallen bridge. At the same time, it is a step in the direction of smart matter and nanotechnology – a futuristic vision that sees the very building blocks of everyday objects as programmable, reconfigurable materials that can change shape or properties as needed.

To get a closer, more detailed idea of what the Roombot can do, check out the video below from EPFL News:


Sources:
gizmag.com, cnet.com, kurzweilai.net

News From Space: Robonaut Gets a Pair of Legs!

SpaceX's latest delivery to the International Space Station – which was itself pretty newsworthy – contained some rather interesting cargo: the legs for NASA's robot space station helper. Robotics enthusiasts know this machine as Robonaut 2 (R2), a humanoid robot NASA placed on the space station to automate tasks such as cleaning and routine maintenance. Since its arrival at the station in February 2011, R2 has performed a series of tasks to demonstrate its functionality in microgravity.

Until now, Robonaut navigated around the ISS on wheels. But thanks to a brand-new pair of springy, bendy legs, the space station’s helper robot will now be able to walk, climb, and perform a variety of new chores. These new legs, funded by NASA’s Human Exploration and Operations and Space Technology mission directorates, will provide R2 the mobility it needs to help with regular and repetitive tasks inside and outside the space station. The goal is to free up the crew for more critical work, including scientific research.

NASA says that the new seven-jointed legs are designed for climbing in zero gravity and offer a considerable nine-foot leg span. Michael Gazarik, NASA's associate administrator for space technology in Washington, explained:

NASA has explored with robots for more than a decade, from the stalwart rovers on Mars to R2 on the station. Our investment in robotic technology development is helping us to bolster productivity by applying robotics technology and devices to fortify and enhance individual human capabilities, performance and safety in space.

Taking their design inspiration from the tethers astronauts use while spacewalking, the legs feature a series of "end effectors" – each of which has a built-in vision system designed to eventually automate each limb's approaching and grasping – rather than feet. These allow the legs to grapple onto handrails and sockets located both inside the space station and, eventually, on the ISS's exterior. Naturally, these legs don't come cheap – costing $6 million to develop and an additional $8 million to construct and test for spaceflight.

Robonaut was developed by NASA's Johnson Space Center in collaboration with General Motors and offshore oil field robotics firm Oceaneering. All that corporate involvement isn't accidental; Robonaut isn't designed simply to do chores around the space station. NASA is also using R2 to showcase a range of patented technologies that private companies can license from Johnson Space Center.

The humanoid, task-performing robot is also a NASA technology showcase. In a webcast, the space agency advertised its potential uses in logistics warehouses, medical and industrial robotics, and in toxic or hazardous environments. As NASA dryly puts it:

R2 shares senses similar to humans: the ability to touch and see. These senses allow it to perform in ways that are not typical for robots today.

In addition to these legs, this latest supply drop – performed by a SpaceX Dragon capsule – included a laser communication system for astronauts and an outer space farming system designed to grow lettuce and other salad crops in orbit. We can expect that Robonaut 2 will be assisting in their use and upkeep in the coming months and years. So expect to hear more about this automated astronaut in the near future!

And in the meantime, be sure to check out this cool video of the R2 robotic legs in action:


Sources:
fastcompany.com, nasa.gov

The Future of 3D Printing: Exoskeletons and Limbs

3-D printing is leading to a revolution in manufacturing, and the list of applications grows with each passing day. But more important is the way it is coming together with other fields of research to make breakthroughs more affordable and accessible. Nowhere is this more true than in the fields of robotics and medicine, where printing techniques are producing a new generation of bionic and mind-controlled prosthetics.

For example, 3D Systems (an additive manufacturing company) and EksoBionics (a company specializing in bionic prosthetic devices) recently partnered to produce a new "bespoke" exoskeleton that will restore ambulatory ability to paraplegics. The prototype was custom made for a woman named Amanda Boxtel, who was paralyzed in a tragic skiing accident in 1992.

Designers from 3D Systems began by scanning her body, digitizing the contours of her spine, thighs, and shins – a process that helped them mold the robotic suit to her needs and specifications. They then combined the suit with a set of mechanical actuators and controls made by EksoBionics. The result, said 3D Systems, is the first-ever "bespoke" exoskeleton.

Intrinsic to the partnership between 3D Systems and EksoBionics was the common goal of finding a way to fit the exoskeleton comfortably to Boxtel's body. One of the greatest challenges with exosuits and prosthetic devices is finding ways to avoid the hard parts bumping into "bony prominences," such as the knobs on the wrists and ankles. These areas are not only sensitive, but prolonged contact with hard surfaces can lead to a slew of health problems, given time.

As Scott Summit, the senior director for functional design at 3D Systems, explained it:

[Such body parts] don’t want a hard surface touching them. We had to be very specific with the design so we never had 3D-printed parts bumping into bony prominences, which can lead to abrasions [and bruising].

One problem that the designers faced in this case was that a paralyzed person like Boxtel often can’t know that bruising is happening because they can’t feel it. This is dangerous because undetected bruises or abrasions can become infected. In addition, because 3D-printing allows the creation of very fine details, Boxtel’s suit was designed to allow her skin to breathe, meaning she can walk around without sweating too much.

The process of creating the 3D-printed robotic suit lasted about three months, starting when Summit and 3D Systems CEO Avi Reichenthal met Boxtel during a visit to EksoBionics. Boxtel is one of ten EksoBionics "test pilots", and the exoskeleton was already designed to attach to the body very loosely with Velcro straps, with an adjustable fit. But it wasn't yet tailored to fit her alone.

That’s where 3D Systems came into play, by using a special 3D scanning system to create the custom underlying geometry that would be used to make the parts that attach to the exoskeleton. As Boxtel put it:

When the robot becomes the enabling device to take every step for the rest of your life, the connection between the body and the robot is everything. So our goal is to enhance the quality of that connection so the robot becomes more symbiotic.

And human beings aren't the only ones who are able to take advantage of this marriage between 3-D printing and biomedicine. Not surprisingly, animals are reaping the benefits of all the latest technological breakthroughs in these fields as well, as evidenced by a little duck named Dudley from the K911 animal rescue service in Sicamous, Canada.

Not too long ago, Dudley lost a leg when a chicken in the same pen mauled him. But thanks to a 3-D printed leg specially made for him, he can now walk again. It was created by Terence Loring of 3 Pillar Designs, a company that specializes in 3D-printing architectural prototypes. After hearing of Dudley's plight through a friend, he decided to see what he could do to help.

Unlike a previous printed limb – the foot fashioned for Buttercup the Duck – Loring's aim was to create an entire limb that could move. The first limb he designed had a jointed construction, and was fully 3D-printed in plastic. Unfortunately, the leg broke the moment Dudley put it on, forcing Loring to go back to the drawing board for a one-piece design printed from softer plastic.

The subsequent leg he created had no joints and could bend on its own. And when Dudley put it on, he started walking straight away and without hesitation. Issues remain to be solved, like how to prevent friction sores – a problem that Mike Garey (who designed Buttercup’s new foot) solved with a silicone sock and prosthetic gel liner.

Nevertheless, Dudley is nothing if not as happy as a duck in a pond, and it seems very likely that any remaining issues will be ironed out in time. In fact, one can expect that veterinary medicine will fully benefit from the wide range of 3D-printed prosthetic devices and even bionic limbs as research continues to produce new and exciting possibilities.

And in the meantime, enjoy the following videos which show both Amanda Boxtel and Dudley the duck enjoying their new devices and the ways in which they help bring mobility back to their worlds:

 

Amanda Boxtel taking her first steps in 22 years:

 


Dudley the duck walking again:


Sources: news.cnet.com, 3dsystems.com, 3pillardesigns.com

Biomedical Breakthroughs: Bionerves and Restored Sensation

These days, advances in prosthetic devices, bionic limbs and exoskeletons continue to amaze. Not only are doctors and medical researchers able to restore mobility and sensation to patients who have lost limbs, they are now crossing a threshold where they are able to restore these faculties to patients suffering from partial or total paralysis.

This should come as no surprise, seeing as how the latest biomedical advances – which involve controlling robotic limbs with brain-computer interfacing – offer a very obvious solution for paralyzed individuals. In their case, no robotic limbs or bionic attachments are necessary to restore ambulatory motion, since the limbs themselves were never lost. Instead, what is needed is to restore motor control to compensate for the severed nerves.

Thanks to researchers working at Case Western Reserve University in Ohio, a way forward is being proposed. Here, a biomedical team is gearing up to combine the BrainGate cortical chip, developed at Brown University, with their own Functional Electrical Stimulation (FES) platform. Through this combination, they hope to remove robots from the equation entirely and go right to the source.

It has long been known that electrical stimulation can directly control muscles, but past attempts to do this artificially have often been inaccurate (and therefore painful and potentially damaging) to the patient. Stimulating the nerves directly using precisely positioned arrays is a much better approach, something that another team at Case Western recently demonstrated through their "nerve cuff electrode".

This electrode is a direct stimulation device that is small enough to be placed around small segments of nerve. The Case Western team used the cuff to provide an interface for sending data from sensors in the hand back to the brain using sensory nerves in the arm. With FES, the same kind of cuff electrode can also be used to stimulate nerves going the other direction – in other words, to the muscles.

The difficulty in such a scheme is that even if the motor nerves can be physically separated from the sensory nerves and traced to specific muscles, the exact stimulation sequences needed to make a proper movement are hard to find. To achieve this, another group at Case Western has developed a detailed simulation of how different muscles work together to control the arm and hand.

Their model consists of 138 muscle elements distributed over 29 muscles, which act on 11 joints. The operational procedure is for the patient to watch the image of a virtual arm while they naturally generate the neural commands that the BrainGate chip picks up to move the arm. In practice, this means trying to make the virtual arm touch a red spot to make it turn green.
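That training task can be pictured as a simple loop: decode an intended velocity from neural activity, move the virtual hand accordingly, and flip the target from red to green on contact. The toy sketch below shows the idea; the function names and the stand-in "decoded velocity" are hypothetical, not part of the BrainGate software.

```python
# Toy version of the virtual-arm training task: one decoded movement
# command nudges the hand, and the target reports red/green contact.

def train_step(hand_pos, target_pos, decoded_velocity, tolerance=0.05):
    """Apply one decoded movement command; report whether the target was hit."""
    new_pos = tuple(p + v for p, v in zip(hand_pos, decoded_velocity))
    distance = sum((p - t) ** 2 for p, t in zip(new_pos, target_pos)) ** 0.5
    return new_pos, ("green" if distance <= tolerance else "red")

pos, target = (0.0, 0.0), (0.3, 0.4)
pos, color = train_step(pos, target, (0.3, 0.4))
print(color)  # the hand reached the spot, so it turns green
```

In the real system, the hard part is of course the decoder itself, which must be learned jointly with the patient over many such trials.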

Currently in clinical trials, the BrainGate2 chip is being developed with the hope of not only stimulating muscles, but generating the same kinds of feedback and interaction that real muscle movement creates. The eventual plan is for the patient and the control algorithm to learn in tandem, so that a training screen will not be needed at all and a patient will be able to move on their own without calibrating the device.

But at the same time, biotech enhancements that restore sensation to amputees are also improving apace. Consider the bionic hand developed by Silvestro Micera of the École Polytechnique Fédérale de Lausanne in Switzerland. Unlike previous bionic hands, which rely on electrodes to receive nerve signals to control the hand's movement, his device sends electronic signals back to simulate the feeling of touch.

Back in February of 2013, Micera and his research team began testing their bionic hand, and clinical trials on a volunteer started just last month. Their volunteer, a man named Dennis Aabo Sørensen from Denmark, lost his arm in a car accident nine years ago, and has since become the first amputee to experience artificially-induced sensation in real-time.

In a laboratory setting, wearing a blindfold and earplugs, Sørensen was able to detect how strongly he was grasping, as well as the shape and consistency of different objects he picked up with his prosthetic. Afterwards, Sørensen described the experience to reporters, saying:

The sensory feedback was incredible. I could feel things that I hadn’t been able to feel in over nine years. When I held an object, I could feel if it was soft or hard, round or square.

The next step will involve miniaturizing the sensory feedback electronics for a portable prosthetic, as well as fine-tuning the sensory technology for better touch resolution and increased awareness of finger movement. The team will also need to assess how long the electrodes can remain implanted and functional in the patient's nervous system, though Micera is confident that they would last for many years.

Micera and his team were also quick to point out that Sørensen's psychological strength was a major asset in the clinical trial. Not only had he been forced to adapt to the loss of his arm nine years ago, he was also willing to face the challenge of experiencing touch again, knowing it would be for only a short period of time. But as he himself put it:

I was more than happy to volunteer for the clinical trial, not only for myself, but to help other amputees as well… There are two ways you can view this. You can sit in the corner and feel sorry for yourself. Or, you can get up and feel grateful for what you have.

The study was published in the February 5, 2014 edition of Science Translational Medicine, and represents a collaboration called Lifehand 2 between several European universities and hospitals. And although a commercially-available sensory-enhanced prosthetic may still be years away, the study provides the first step towards a fully-realizable bionic hand.

Yes, between implantable electronics that can read out brainwaves and nerve impulses, computer programs that are capable of making sense of it all, and robotic limbs that are integrated with these machines and our bodies, the future is looking very interesting indeed. In addition to restoring ambulatory motion and sensation, we could be looking at an age where there is no such thing as a "permanent injury".

And in the meantime, be sure to check out this video of Sørensen’s clinical trial with the EPFL’s bionic hand:


Sources:
extremetech.com, actu.epfl.ch, neurotechnology.com

The Future is Here: AirMule’s Autonomous Demo Flight

Vertical Take-Off and Landing (VTOL) craft have been a subject of military development for some time. In addition to being able to deploy from landing strips that are damaged or too small for conventional aircraft, they are able to navigate terrain and land where other craft cannot. Add to that the ability to hover and fly close to the ground, and you have a craft that can also provide support while avoiding IEDs and landmines.

One concept that incorporates all of these features is the AirMule, a compact, unmanned, single-engine vehicle that is being developed by Tactical Robotics in Israel. In January of 2013, the company unveiled the prototype, which they claimed was created for the sake of supporting military personnel, evacuating the wounded, and conducting remote reconnaissance missions.

Now, less than a year later, the prototype aircraft has demonstrated its ability to fly autonomously, bringing it one step closer to carrying out a full mission demo. During the test, which took place in December, the craft autonomously performed a vertical take-off, flew to the end of a runway, then turned around on the spot and flew back to its starting point.

All the while, it maintained altitude using two laser altimeters, while maintaining positioning via a combination of GPS, an inertial navigation system, and optical reference to markers on the ground. These autonomous systems, which allow it to fly on its own, can also be overridden in favor of remote control, in case a mission proves particularly hairy and requires a human operator.

In its current form, the AirMule possesses many advantages over other VTOL craft, such as helicopters. For starters, it weighs only 770 kg (1,700 lb) – as opposed to the Bell UH-1's empty weight of 2,365 kg (5,215 lb) – can carry a payload of up to 640 kg (1,400 lb), has a top speed of 180 km/h (112 mph), and can reach a maximum altitude of 12,000 ft (3,658 m).

In short, it has a better mass-to-payload ratio than a helicopter, comparable performance, and can land and take off within an area of 40 square meters (430.5 sq ft) – significantly smaller than what a manned helicopter requires for a safe landing. The internal rotor blades are reportedly also much quieter than those of a helicopter, giving the matte-black AirMule some added stealth.
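A quick back-of-the-envelope check, using only the numbers quoted above, shows what that ratio means in practice and confirms the article's unit conversions:

```python
# Figures from the article: AirMule empty weight and maximum payload.
airmule_empty_kg = 770
airmule_payload_kg = 640

ratio = airmule_payload_kg / airmule_empty_kg
print(f"{ratio:.2f}")  # ~0.83, i.e. it can lift roughly 83% of its own empty weight

# Sanity checks on the quoted conversions:
print(f"{180 / 1.609344:.0f} mph")  # 180 km/h -> ~112 mph
print(f"{12000 * 0.3048:.0f} m")    # 12,000 ft -> ~3,658 m
```

By contrast, a conventional helicopter carries far more structure, fuel system and drivetrain per kilogram of useful load, which is the point being made above.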

Plans now call for "full mission demonstrations" next year, utilizing a second prototype that is currently under construction. When complete, this vehicle and those like it can be expected to be deployed to many areas of the world, assisting Coalition and other forces in dirty, dangerous environments where landmines, IEDs and other man-made and natural hazards are common.

Alongside machines like the Alpha Dog, LS3 and WildCat – built by Boston Dynamics (recently acquired by Google) to offer transport and support to infantry in difficult terrain – efforts to "unman the front lines" through the use of autonomous drones or remote-controlled robots continue. Clearly, the future battlefield is a place where robots will be offering a rather big hand!

 

And be sure to check this video of the AirMule demonstration, showing the vehicle take-off, hover, fly around, and then come in for a landing:


Sources: gizmag.com, tactical-robotics.com

Tech News: Google Seeking “Conscious Homes”

In Google's drive for world supremacy, a good number of start-ups and developers have been bought up. Between their acquisition of eight robotics companies in the space of six months back in 2013 and their ongoing buyout of anyone in the business of aerospace, voice and facial recognition, and artificial intelligence, Google seems determined to have a controlling interest in all fields of innovation.

And in what is their second-largest acquisition to date, Google announced earlier this month that they intend to get in on the business of smart homes. The company in question is Nest Labs, a home automation company that was founded by former Apple engineers Tony Fadell and Matt Rogers in 2010, and is behind the creation of the Learning Thermostat and the Protect smoke and carbon monoxide detector.

The Learning Thermostat, the company’s flagship product, works by learning a home’s heating and cooling preferences over time, removing the need for manual adjustments or programming. Wi-Fi networking and a series of apps also let users control and monitor the unit from afar, consistent with one of the biggest tenets of smart home technology: connectivity.
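The learning idea described above can be sketched in just a few lines. Nest’s actual algorithms are proprietary, so this is purely a toy illustration, assuming the simplest possible scheme: remember each manual adjustment against the hour it was made, and use the average for that hour thereafter.

```python
from collections import defaultdict

class LearningThermostat:
    """Toy sketch of schedule learning: every manual adjustment is
    recorded against the hour of day it was made, and the learned
    setpoint for an hour is the average of what the user chose then."""

    def __init__(self, default_setpoint=20.0):
        self.default = default_setpoint
        self.history = defaultdict(list)  # hour -> list of chosen temps (Celsius)

    def manual_adjust(self, hour, temperature):
        # The user turns the dial; remember what they wanted and when.
        self.history[hour].append(temperature)

    def setpoint_for(self, hour):
        # Fall back to the default until this hour has been observed.
        temps = self.history.get(hour)
        return sum(temps) / len(temps) if temps else self.default

t = LearningThermostat()
t.manual_adjust(7, 21.0)
t.manual_adjust(7, 23.0)
print(t.setpoint_for(7))  # average of the two morning adjustments: 22.0
print(t.setpoint_for(3))  # unobserved hour falls back to the default: 20.0
```

The real device obviously weighs far more signals (occupancy, day of week, outdoor weather), but the principle is the same: manual interventions become training data, and programming disappears.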

Similarly, the Nest Protect, a combination smoke and carbon monoxide detector, works by differentiating between burnt toast and real fires. Whenever it detects light smoke, a heads-up alarm goes off, which can be quieted by simply waving your hand in front of it. But in a real fire, or where deadly carbon monoxide is detected, a much louder alarm sounds to alert the home’s occupants.

In addition, the device sends a daily battery status report to the Nest mobile app (the same app that controls the thermostats) and is capable of connecting with other units in the home. And since Nest is building a platform for all its devices, if a Nest thermostat is installed in the same home, the Protect can automatically shut it down in the event that carbon monoxide is detected.
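The two-tier alert logic plus the thermostat interconnect amounts to a small state machine. Here is a minimal sketch of that behavior; the thresholds, class names, and `shut_down` method are all invented for illustration and are not Nest’s actual interface.

```python
# Hypothetical sketch of the Protect's alert tiers: light smoke triggers
# a silenceable heads-up, while heavy smoke or any carbon monoxide sounds
# the full alarm and, if a linked thermostat is present, shuts it down so
# the furnace stops circulating CO through the house.

class Thermostat:
    def __init__(self):
        self.running = True

    def shut_down(self):
        self.running = False

def evaluate(smoke_level, co_ppm, linked_thermostat=None,
             heads_up_threshold=0.05, alarm_threshold=0.15):
    if co_ppm > 0 or smoke_level >= alarm_threshold:
        # Real emergency: full alarm, and cut the furnace.
        if linked_thermostat is not None:
            linked_thermostat.shut_down()
        return "alarm"
    if smoke_level >= heads_up_threshold:
        return "heads-up"  # burnt-toast tier; a hand-wave silences it
    return "quiet"

stat = Thermostat()
print(evaluate(0.08, 0))                          # heads-up
print(evaluate(0.20, 0, linked_thermostat=stat))  # alarm
print(stat.running)                               # False: furnace was cut
```

The point of the platform play is that last step: once devices share a network, a detector’s alarm can become an actuator’s command.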

According to a statement released by co-founder Tony Fadell, Nest will continue to be run in-house, but will be partnered with Google in their drive to create a conscious home. On his blog, Fadell explained his company’s decision to join forces with the tech giant:

Google will help us fully realize our vision of the conscious home and allow us to change the world faster than we ever could if we continued to go it alone. We’ve had great momentum, but this is a rocket ship. Google has the business resources, global scale, and platform reach to accelerate Nest growth across hardware, software, and services for the home globally.

Yes, and I’m guessing that the $3.2 billion price tag added a little push as well! Needless to say, some wondered why Apple didn’t try to snatch up this burgeoning company, seeing as how it’s being run by two of its former employees. But according to Fadell, Google founder Sergey Brin “instantly got what we were doing and so did the rest of the Google team” when they got a Nest demo at the 2011 TED conference.

In a press release, Google CEO Larry Page had this to say about bringing Nest into their fold:

They’re already delivering amazing products you can buy right now – thermostats that save energy and smoke/[carbon monoxide] alarms that can help keep your family safe. We are excited to bring great experiences to more homes in more countries and fulfill their dreams!

But according to some, this latest act by Google goes way beyond wanting to develop devices. Sara Watson of Harvard University’s Berkman Center for Internet and Society is one such person; she believes Google is now a company obsessed with viewing everyday activities as “information problems” to be solved by machine learning and algorithms.

Consider Google’s fleet of self-driving vehicles as an example, not to mention their many forays into smartphone and deep learning technology. The home is no different, and a Google-enabled smart home of the future, using a platform such as the Google Now app – which already gathers data on users’ travel habits – could adapt energy usage to your life in even more sophisticated ways.

Seen in these terms, Google’s long-term plans of being at the forefront of the new technological paradigm, where smart technology knows and anticipates our needs and everything is at our fingertips, certainly become more clear. I imagine that their next goal will be to facilitate the creation of household AIs: machine minds that monitor everything within our household, perform maintenance, and ensure energy efficiency.

However, another theory has it that this is in keeping with Google’s push into robotics, led by the former head of Android, Andy Rubin. According to Alexis C. Madrigal of the Atlantic, Nest always thought of itself as a robotics company, as evidenced by the fact that its VP of technology is none other than Yoky Matsuoka, a roboticist and artificial intelligence expert from the University of Washington.

During an interview with Madrigal back in 2012, she explained why. Apparently, Matsuoka saw Nest as being positioned right in a place where it could help machine and human intelligence work together:

The intersection of neuroscience and robotics is about how the human brain learns to do things and how machine learning comes in to augment that.

In short, Nest is a cryptorobotics company that deals in sensing, automation, and control. It may not make a personable, humanoid robot, but it is producing machine intelligences that can do things in the physical world. Seen in this respect, the acquisition was not so much part of Google’s drive to possess all our personal information, but a mere step along the way towards the creation of a working artificial intelligence.

It’s a Brave New World, and it seems that people like Musk, Page, and a slew of futurists are determined to make it happen, and are at the center of it.

Sources: cnet.news.com, (2), newscientist.com, nest.com, theatlantic.com

Judgement Day Update: DARPA Robotics Challenge!

For the past two years, the Defense Advanced Research Projects Agency (DARPA) has been holding a series of trials where robots are tasked with navigating disaster areas and performing tasks with the tools and materials provided. This is known as the Robotics Challenge, which took place from Dec. 20th to 21st and was streamed live from Florida’s Homestead Miami Speedway.

And this year, Google’s Schaft humanoid robot took home the top prize after scoring 27 out of a possible 32 points. IHMC Robotics, based in Florida, grabbed second place, while Carnegie Mellon University’s Team Tartan Rescue placed third. Eight of the top teams that participated in the challenge may receive as much as $1 million in funding from DARPA, ahead of further trials next year with a $2 million prize.

Built by a Japanese start-up (one of Google’s many recent acquisitions), the Schaft is an updated version of the Humanoid Robot Project robot (HRP-2), with hardware and software modifications that include more powerful actuators, a walking/stabilization system, and a capacitor in place of a battery. The robot stands 1.48 m (4.9 ft) tall, weighs in at 95 kg (209 lb), and is generally unappealing to the eye.

However, what it lacks in photogenic quality, it makes up for in performance. Over the course of the trials, the bipedal robot brought stable walking and significant torque power to the fore as it opened doors, wielded hoses, and cut away part of a wall. Team Schaft did lose points, however, when a gust of wind blew a door out of the robot’s hand, and when the robot was unable to exit a vehicle after navigating a driving course successfully.

Check out the video of the Schaft in action:


Initially, over 100 teams applied to compete when the challenge was announced in April of last year. After a series of reviews and virtual challenges, the field was narrowed down to 16 teams competing in four “tracks”. On Track A, Schaft was joined by the RoboSimian, a robot recently built by NASA’s Jet Propulsion Laboratory (JPL). Another primate-like robot was Tartan Rescue’s CHIMP, a red headless robot with rollers on its feet.

At the other end of the spectrum was the Johnson Space Center’s Valkyrie, a bipedal, anthropomorphic robot that honestly looks like something out of anime or Tony Stark’s closet. This latter aspect is due largely to its glowing chest light, though the builders claim that it’s just a bulge to make room in the torso for the linear actuators that move the waist.

Officially designated “R5” by NASA, Val was designed to be a high-powered rescue robot, capable of traversing uneven terrain, climbing ladders, using tools, and even driving. According to the designers, the Valkyrie was given human form because:

a human form makes sense because we’re humans, and these robots will be doing the jobs that we don’t want to be doing because they’re too dangerous. To that end, Valkyrie has seven-degree-of-freedom arms with actuated wrists and hands, each with three fingers and a thumb. It has a head that can tilt and swivel, a waist that can rotate, and six-degree-of-freedom legs complete with feet equipped with six-axis force-torque sensors.

Unfortunately, the robot failed in its tasks this year, scoring 0 points and placing amongst the last three competitors. I guess NASA has some bugs to work out before this patently badass design can go toe-to-toe with other disaster robots. Or perhaps the anthropomorphic concept is just not up to the task. Only time and further trials will tell. And of course, there’s a video of Val in action too:


The B and C track teams are often difficult to tell apart because they all used Atlas robots. Meanwhile, the D track teams brought several of their own robots to the fore. These included Chiron, a robot that resembles a metallic sea louse; Mojovation, a distinctly minimalist robot; South Korea’s Kaist; and China’s Intelligent Pioneer.

DARPA says that the point of the competition is to provide a baseline from which to develop robotics for disaster response. Events such as the 2011 Fukushima nuclear disaster, which not only damaged the reactors but made it impossible for crews to respond in time, demonstrate that robots have a potential role. DARPA believes that robots that could navigate the ruins and work in radioactive environments would have been of great help.

The problem is that current robots simply aren’t up to the task. Specialized robots can’t be built to deal with the unpredictable, full telepresence control is neither practical nor desirable, and most robots tend to be a bit on the delicate side. What’s needed is a robot that can work on its own, use the tools and vehicles at hand, deal with the unpredictable, and is durable and agile enough to operate in the ruins of a building.

That’s where the DARPA Robotics Challenge comes in. Over the next few years, DARPA will use the results of the competition to draw a baseline that will benefit engineers working on the next generation of robots. For now, the top eight teams go on, with DARPA funding, to compete in the Robotics Finals event late next year for a US $2 million prize.

If there’s one thing the current challenge demonstrated, it’s that anthropomorphic designs are not well-suited to the tasks they were given. An ironic outcome, considering that one of the aims of the challenge is to develop robots capable of performing human tasks, but under conditions considered unsafe for humans. As always, the top prize goes to those who can think outside the box!

And in the meantime, enjoy this video of the Robotics Challenge, taken on the second day of the trials.


Sources: gizmag.com, news.cnet.com, wired.com, IO9.com, theroboticchallenge.org

The Future is Here: Smarty Rings

Okay, it’s not exactly here yet, but the implications of this idea could be a game changer. It’s known as the Smarty Ring, a crowdfunded idea being advertised on Indiegogo by a group of inventors in Chennai, India. At its core is a waterproof, stainless steel band that will feature an LED screen and connect to your phone via Bluetooth 4.0 wireless technology.

For some time now, the Chennai-based group has been the source of some controversy, due mainly to the fact that they have no working prototypes of the ring, but also because they have not identified themselves beyond giving their location. They also freely admit that the photos of the Smarty Ring on Indiegogo and on their website are photoshopped.

Surprisingly, this has not prevented them from mounting their campaign to raise money for its development. While the crowdfunding site Kickstarter has rules requiring creators to be clear about the state of a project’s development and to show a prototype “demonstrating the product’s current functionality,” Indiegogo has no such rules.

However, this has not stopped their campaign, which officially closed at 11:00 am ET on Dec. 11th, 2013, from raising a total of $299,349 against their original goal of $40,000. Numerous blueprints of what the ring would look like, including detailed images of its electronics, are also available on their campaign page. What’s more, the group is still taking advance orders and offering discount pricing to anyone who orders before Dec. 30th.

Also, the group has become much less clandestine since the campaign closed. In response to questions, group spokesperson Karthik said the project was founded by Chennai-based mechatronics engineer Ashok Kumar, and that their team of inventors includes electronics and computer engineers with experience in robotics and nanotechnology.

Ultimately, the goal of the project was to create a high-tech gadget that would also double as “high-end fashion jewelry,” according to an email to CBC News from the team’s marketing director, Karthik, who did not give his last name. The group also claims on their website that the average smartphone user checks their phone every six minutes, and promises to make that unnecessary, saving time and the smartphone’s battery life.

According to the Smarty Ring’s site, the features are to include:

  • A clock with stop watch, timer and alarm
  • Notifications of calls, text and email messages, and social networking updates from services such as Facebook, Twitter, and Skype
  • Phone controls that let users accept or reject incoming calls, make outgoing calls to preset numbers, and control music or the phone’s camera
  • A phone tracking feature that beeps when your phone gets more than nine meters away from you
  • Wireless charging, with a guaranteed 24 hours of battery life
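The phone-tracking feature in the list above is the most technically interesting one. Bluetooth has no built-in rangefinder, so a device like this would most likely infer distance from received signal strength (RSSI). Here is a minimal sketch using the standard log-distance path-loss model; the calibration constants are typical BLE values chosen for illustration, not anything published by the Smarty Ring team.

```python
# Hypothetical sketch of the "beep when the phone is more than nine
# meters away" feature. RSSI (in dBm) is converted to a rough distance
# with the log-distance path-loss model:
#     d = 10 ** ((TxPower - RSSI) / (10 * n))

TX_POWER = -59.0   # assumed RSSI measured at 1 m, typical for BLE beacons
PATH_LOSS_N = 2.0  # path-loss exponent (2.0 corresponds to free space)

def estimated_distance(rssi_dbm):
    """Rough distance in meters from a single RSSI reading."""
    return 10 ** ((TX_POWER - rssi_dbm) / (10 * PATH_LOSS_N))

def should_beep(rssi_dbm, limit_m=9.0):
    """True when the phone appears to be beyond the tracking limit."""
    return estimated_distance(rssi_dbm) > limit_m

print(round(estimated_distance(-59.0), 1))  # 1.0 m at the reference RSSI
print(should_beep(-85.0))                   # True: weak signal, ~20 m
print(should_beep(-60.0))                   # False: phone is still close
```

In practice RSSI is noisy enough that a real implementation would smooth several readings before beeping, but the nine-meter threshold maps naturally onto a signal-strength cutoff like this.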

The Smarty Ring team says the retail price for the device will be $275, but backers and people who preorder before Dec. 30th will be able to get one at the reduced price of $175. They estimate that delivery will begin sometime in April of 2014. They are also offering cheaper versions that include only the tracking feature, or the clock and tracking features.

Needless to say, if this is a scam, it is clearly a well-thought-out and elaborate one. Not only is the idea of a smart ring that can connect wirelessly to other devices and do the job of a smartphone entirely within the bounds of current and developing technology, it’s a very cool idea. And if it is in fact real, its realization could mean a new wave of innovation and design for the smart devices market.

Currently, designers and developers are working towards the creation of smartwatches, smartphones, tablets and phablets that are not only smaller and much thinner, but also flexible and transparent. An even smaller device, such as a ring or bracelet, that can do the same job but be far more ergonomic, may be just what the market ordered!

And in the meantime, be sure to enjoy this promotional video from the Smarty Ring website. Check out their website as well, and determine for yourself whether they are liars, inventors, or just plain dreamers:


Sources: cbc.ca, indiegogo.com