Judgement Day Update: Cheetah Robot Unleashed!

There have been lots of high-speed bio-inspired robots in recent years, as exemplified by Boston Dynamics’ WildCat. But MIT’s Cheetah robot, which made its big debut earlier this month, is in a class by itself. In addition to being able to run at impressive speeds, bound, and jump over obstacles, this particular biomimetic robot is driven by batteries and electric motors rather than by a gasoline engine and hydraulics, and can function untethered (i.e. not connected to a power source).

While gasoline-powered robots are still very much bio-inspired, they depend on sheer power to match the force and speed of their flesh-and-blood counterparts. They’re also pretty noisy, as the demonstration of the WildCat certainly showed (video below). MIT’s Cheetah takes the alternate route of applying less power but doing so more efficiently, more closely mimicking the musculoskeletal system of a living creature.

This is not only a reversal of contemporary robotics practice, but a break from history. Historically, to make a robot run faster, engineers made the legs move faster. The alternative is to keep the same stride frequency, but to push down harder at the ground with each step. As MIT’s Sangbae Kim explained:

Our robot can be silent and as efficient as animals. The only things you hear are the feet hitting the ground… Many sprinters, like Usain Bolt, don’t cycle their legs really fast. They actually increase their stride length by pushing downward harder and increasing their ground force, so they can fly more while keeping the same frequency.

MIT’s Cheetah uses much the same approach as a sprinter, combining custom-designed high-torque-density electric motors made at MIT with amplifiers that control the motors (also a custom MIT job). These two technologies, combined with a bio-inspired leg, allow the Cheetah to apply exactly the right amount of force to successfully bound across the ground and navigate obstacles without falling over.
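The “push harder, not faster” strategy Kim describes reduces to some simple stride arithmetic: averaged over a full stride, the legs’ vertical force must equal body weight, so the smaller the fraction of the stride the feet spend on the ground, the harder each foot must push while it is down. The sketch below illustrates that relationship; the 33 kg mass and the duty factors are illustrative assumptions, not MIT’s published numbers.

```python
G = 9.81  # gravitational acceleration, m/s^2

def contact_force(mass_kg, duty_factor):
    """Average vertical force the legs apply during ground contact.

    Over one full stride the average vertical force must equal body
    weight (m * g), so if the feet touch down for only a fraction
    `duty_factor` of the stride, the contact-phase force scales up by
    1 / duty_factor. Pushing down harder buys a longer aerial phase,
    and therefore a longer stride at the same frequency.
    """
    if not 0 < duty_factor <= 1:
        raise ValueError("duty factor must be in (0, 1]")
    return mass_kg * G / duty_factor

# Illustrative 33 kg robot: halving ground-contact time doubles the force
print(contact_force(33, 0.50))  # twice body weight
print(contact_force(33, 0.25))  # four times body weight
```

The same trade-off is what the custom high-torque motors are for: delivering large contact forces without spinning the legs faster.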

When it wants to jump over an obstacle, it simply pushes down harder; and as you can see from the video below, the results speak for themselves. For now, the Cheetah can run untethered at around 16 km/h (10 mph) across grass, and hurdle obstacles up to 33 centimeters high. The Cheetah currently bounds – a fairly simple gait where the front and rear legs move almost in unison – but galloping, where all four legs move asymmetrically, is the ultimate goal.

With a new gait, and a little byte surgery to the control algorithms, MIT hopes that the current Cheetah can hit speeds of up to 48 km/h (30 mph), which would make it the fastest untethered quadruped robot in the world. While this is still a good deal slower than the real thing – real cheetahs can run at up to 96 km/h (60 mph) – it will certainly constitute another big step for biomimetics and robotics.

Be sure to check out the video of the Cheetah’s test, and see how it differs from Boston Dynamics and DARPA’s WildCat tests from October of last year:



Source:
extremetech.com

The Future is Here: First Android Newscasters in Japan

This past week, Japanese scientists unveiled what they claim is the world’s first news-reading android. The adolescent-looking “Kodomoroid” – an amalgamation of the Japanese word “kodomo” (child) and “android” – and “Otonaroid” (“otona” meaning adult) introduced themselves at an exhibit entitled Android: What is a Human?, which is being presented at Tokyo’s National Museum of Emerging Science and Innovation (Miraikan).

The androids were flanked by robotics professor Hiroshi Ishiguro and Miraikan director Mamoru Mori as Kodomoroid delivered news of an earthquake and an FBI raid to amazed reporters in Tokyo. She even poked fun at her creator, telling Ishiguro, “You’re starting to look like a robot!” Otonaroid, meanwhile, fluffed her lines when asked to introduce herself, then excused herself by saying, “I’m a little bit nervous.”

Both androids will be working at Miraikan and interacting with visitors, as part of Ishiguro’s studies into human reactions to the machines. Ishiguro is well known for his work with “geminoids”, robots that bear a frightening resemblance to their creator. As part of his lecture process, Ishiguro takes his geminoid with him when he travels and even lets it deliver his lectures for him. During an interview with AFP, he explained the reasoning behind this latest exhibit:

This will give us important feedback as we explore the question of what is human. We want robots to become increasingly clever. We will have more and more robots in our lives in the future.

Granted, the unveiling did have its share of bugs. For her part, Otonaroid looked as if she could use some rewiring before beginning her new role as the museum’s science communicator, with her lips out of sync and her neck movements symptomatic of a bad night’s sleep. But Ishiguro insisted both would prove invaluable to his continued research as museum visitors get to have conversations with the ‘droids and operate them as extensions of their own bodies.

And this is just one of many forays into a world where the line between robots and humans is becoming blurred. After a successful debut earlier this month, a chatty humanoid called Pepper is set to go on sale as a household companion in Japan starting next year. Designed by SoftBank using technology acquired from French robotics company Aldebaran, each robot will cost around $2,000 – about the same as a laptop.

Pepper can communicate through emotion, speech, or body language, and it’s equipped with both mics and proximity sensors. It will be possible to install apps and upgrade the unit’s functionality, the plan being to make Pepper far smarter than when you first bought it. It already understands 4,500 Japanese words, but perhaps more impressively, Pepper can apparently read its owner’s tone of voice to gauge their disposition.
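As a toy illustration of what “reading tone” might mean, here is a crude rule-based mood guesser over prosodic features (pitch, loudness, speaking rate). This is purely a sketch of the concept: Pepper’s actual emotion engine is proprietary and presumably uses trained statistical models, and every threshold below is invented.

```python
def guess_mood(pitch_hz, energy, speech_rate):
    """Crude rule-based mood guess from prosodic features.

    pitch_hz: average voice pitch; energy: loudness in [0, 1];
    speech_rate: syllables per second. Thresholds are illustrative.
    """
    if energy > 0.7 and pitch_hz > 220 and speech_rate > 4.0:
        return "excited"   # loud, high-pitched, fast
    if energy < 0.3 and speech_rate < 2.5:
        return "sad"       # quiet and slow
    if energy > 0.7 and pitch_hz < 180:
        return "angry"     # loud but low-pitched
    return "neutral"

print(guess_mood(250, 0.8, 5.0))  # excited
```

A real system would replace these hand-picked rules with a classifier trained on labelled speech, but the input/output shape is the same: acoustic features in, a disposition label out.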

Aldebaran CEO Bruno Maisonnier claims that robots that can recognize human emotion will change the way we live and communicate. And this is certainly a big step towards getting robots into our daily lives, at least if you live in Japan (the only place Pepper will be available for the time being). He also believes this is the start of a “robotic revolution” in which robotic household companions that can understand and interact with their human owners will become the norm.

Hmm, a world where robots are increasingly indistinguishable from humans, can do human jobs, and are capable of understanding and mimicking our emotions. Oh, and they live in our houses too? Yeah, I’m just going to ignore the warning bells going off in my head now! And in the meantime, be sure to check out these videos of Kodomoroid, Otonaroid, and Pepper being unveiled for the first time:

World’s First Android Newscasters:


Aldebaran’s Pepper:


Sources:
cnet.com, gizmodo.com, engadget.com, nydailynews.com

The Future is Here: Roombot Transforming Furniture

Robotic arms and other mechanisms have long been used to make or assemble furniture; but thus far, no one has ever created robots that are capable of becoming furniture. However, Swiss researchers are aiming to change that with Roombots, reconfigurable robotic modules that connect to each other to change shape and transform into different types of furniture, based on the needs and specifications of users.

Created by the Biorobotics Laboratory (BioRob) at École polytechnique fédérale de Lausanne (EPFL), the self-assembling Roombots attach to each other via connectors that enable them to take on the desired shape. The team’s main goal is to create self-assembling interactive furniture that can be used in a variety of ways. The modules were designed primarily to help the disabled or elderly by morphing to suit their needs.

Like LEGO bricks, Roombots can be stacked upon each other to create various structures and/or combined with furniture and other objects, changing not only their shape but also their functionality. For instance, a person lying on a Roombot bed could slowly be moved into a seated position, or a table could scoot over to a corner or tilt itself to help a book slide into a person’s hands. The team has reached a number of significant milestones, such as having the Roombots move freely, to bring all this multi-functionality closer.

Each 22 cm-long module (made up of four half-spheres) has a wireless connection, a battery, and three motors that allow it to pivot with three degrees of freedom. Each module also has retractable “claws” that are used to attach to other pieces to form larger structures. With a series of rotations and connections, the modules can change shape and become any of a variety of objects. A special surface with holes adapted to the Roombots’ mechanical claws can also allow the modules to anchor to a wall or floor.
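The description above – three pivoting motors plus retractable claws – suggests a natural software model for a single module. The class below is a hypothetical sketch of that concept, not EPFL’s actual control code; all names are invented.

```python
from dataclasses import dataclass, field

@dataclass
class RoombotModule:
    """One Roombot: three rotary joints plus claws for latching on."""
    joint_angles: list = field(default_factory=lambda: [0.0, 0.0, 0.0])
    latched: set = field(default_factory=set)

    def rotate(self, joint, degrees):
        """Pivot one of the module's three degrees of freedom."""
        if joint not in (0, 1, 2):
            raise ValueError("modules pivot with three degrees of freedom")
        self.joint_angles[joint] = (self.joint_angles[joint] + degrees) % 360

    def latch(self, target):
        """Extend claws to grip a neighbouring module or anchor plate."""
        self.latched.add(target)

    def release(self, target):
        self.latched.discard(target)

# Two modules collaborating, as they must to cross a convex edge
m1, m2 = RoombotModule(), RoombotModule()
m1.latch("m2"); m2.latch("m1")
m1.rotate(0, 90)
print(m1.joint_angles)  # [90.0, 0.0, 0.0]
```

Stacking many such modules, each tracking who it is latched to, is what lets a pile of identical units reconfigure into a table, a chair, or a crawling chain.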

The Roombots can even climb up a wall or over a step, when the surface is outfitted with connector plates. They’re also capable of picking up connector plates and arranging them to form, say, a table’s surface. Massimo Vespignani, a PhD student at BioRob, explained the purpose of this design and its advantages in a recent interview with Gizmag:

We start from a group of Roombot modules that might be stacked together for storage. The modules detach from this pile to form structures of two or more modules. At this point they can start moving around the room in what we call off-grid locomotion…

A single module can autonomously reach any position on a plane (this being on the floor, walls, or ceiling), and overcome a concave edge. In order to go over convex edges two modules need to collaborate…

The advantage would be that the modules can be tightly packed together for transportation and then can reconfigure into any type of structure (for example a robotic manipulator)…

We can ‘augment’ existing furniture by placing compatible connectors on it and attaching Roombots modules to allow it to move around the house.

The range of applications for this kind of robotics is virtually infinite. For example, as seen in the video below, a series of Roombots can act as feet on a table, not only letting it move around the room and come to its owner, but adjusting its height as well. Auke Ijspeert, head of BioRob, envisions that this type of customization could be used by physically challenged people, who could greatly benefit from furniture that adapts to their needs and movements.

As he said in a recent statement:

It could be very useful for disabled individuals to be able to ask objects to come closer to them, or to move out of the way. [They could also be used as] ‘Lego blocks’ [for makers to] find their own function and applications.

Meanwhile, design students at ENSCI Les Ateliers in France have come up with several more ideas for uses of Roombots, such as flower pots that can move from window to window around a building, and modular lighting components and sound systems. Similar to MIT’s more complex self-assembling M-Blocks – programmable cube robots with no external moving parts – Roombots represent a step in the direction of self-assembling robots that are capable of taking on just about any task.

For instance, imagine a series of small robotic modules that could be used for tasks like repairing bridges or buildings during emergencies. Simply release them from their container and feed them the instructions, and they assemble to prop up an earthquake-stricken structure or a fallen bridge. At the same time, it is a step in the direction of smart matter and nanotechnology, a futuristic vision that sees the very building blocks of everyday objects as programmable, reconfigurable materials that can change shape or properties as needed.

To get a closer, more detailed idea of what the Roombot can do, check out the video below from EPFL News:


Sources:
gizmag.com, cnet.com, kurzweilai.net

News From Space: Robonaut Gets a Pair of Legs!

SpaceX’s latest delivery to the International Space Station – which was itself pretty newsworthy – contained some rather interesting cargo: the legs for NASA’s robot space station helper. Robotics enthusiasts know this machine as Robonaut 2 (R2), a humanoid robot NASA placed on the space station to automate tasks such as cleaning and routine maintenance. Since its arrival at the station in February 2011, R2 has performed a series of tasks to demonstrate its functionality in microgravity.

Until now, Robonaut navigated around the ISS on wheels. But thanks to a brand-new pair of springy, bendy legs, the space station’s helper robot will now be able to walk, climb, and perform a variety of new chores. These new legs, funded by NASA’s Human Exploration and Operations and Space Technology mission directorates, will provide R2 the mobility it needs to help with regular and repetitive tasks inside and outside the space station. The goal is to free up the crew for more critical work, including scientific research.

NASA says that the new seven-jointed legs are designed for climbing in zero gravity and offer a considerable nine-foot leg span. Michael Gazarik, NASA’s associate administrator for space technology in Washington, explained:

NASA has explored with robots for more than a decade, from the stalwart rovers on Mars to R2 on the station. Our investment in robotic technology development is helping us to bolster productivity by applying robotics technology and devices to fortify and enhance individual human capabilities, performance and safety in space.

Taking their design inspiration from the tethers astronauts use while spacewalking, the legs feature a series of “end effectors” – each of which has a built-in vision system designed to eventually automate each limb’s approaching and grasping – rather than feet. These allow the legs to grapple onto handrails and sockets located both inside the space station and, eventually, on the ISS’s exterior. Naturally, these legs don’t come cheap, costing $6 million to develop and an additional $8 million to construct and test for spaceflight.

Robonaut was developed by NASA’s Johnson Space Center in collaboration with General Motors and off-shore oil field robotics firm Oceaneering. All that corporate involvement isn’t accidental; Robonaut isn’t designed to simply do chores around the space station. NASA is also using R2 to showcase a range of patented technologies that private companies can license from Johnson Space Center.

The humanoid, task-performing robot is also a NASA technology showcase. In a webcast, the space agency advertised its potential uses in logistics warehouses, medical and industrial robotics, and in toxic or hazardous environments. As NASA dryly puts it:

R2 shares senses similar to humans: the ability to touch and see. These senses allow it to perform in ways that are not typical for robots today.

In addition to these legs, this latest supply drop – performed by a SpaceX Dragon capsule – included a laser communication system for astronauts and an outer space farming system designed to grow lettuce and other salad crops in orbit. We can expect that Robonaut 2 will be assisting in their use and upkeep in the coming months and years. So expect to hear more about this automated astronaut in the near future!

And in the meantime, be sure to check out this cool video of the R2 robotic legs in action:


Sources:
fastcompany.com, nasa.gov

The Future of 3D Printing: Exoskeletons and Limbs

3-D printing is leading to a revolution in manufacturing, and the list of applications grows with each passing day. But more important is the way it is coming together with other fields of research to make breakthroughs more affordable and accessible. Nowhere is this more true than in the fields of robotics and medicine, where printing techniques are producing a new generation of bionic and mind-controlled prosthetics.

For example, 3D Systems (an additive manufacturing company) and EksoBionics (a company specializing in bionic prosthetic devices) recently partnered to produce a new “bespoke” exoskeleton that will restore ambulatory ability to paraplegics. The prototype was custom made for a woman named Amanda Boxtel, who was paralyzed in a tragic skiing accident in 1992.

Designers from 3D Systems began by scanning her body, digitizing the contours of her spine, thighs, and shins; a process that helped them mold the robotic suit to her needs and specifications. They then combined the suit with a set of mechanical actuators and controls made by EksoBionics. The result, said 3D Systems, is the first-ever “bespoke” exoskeleton.

Intrinsic to the partnership between 3D Systems and EksoBionics was the common goal of finding a way to fit the exoskeleton comfortably to Boxtel’s body. One of the greatest challenges with exosuits and prosthetic devices is finding ways to avoid the hard parts bumping into “bony prominences,” such as the knobs on the wrists and ankles. These areas are not only sensitive, but prolonged exposure to hard surfaces can lead to a slew of health problems over time.

As Scott Summit, the senior director for functional design at 3D Systems, explained it:

[Such body parts] don’t want a hard surface touching them. We had to be very specific with the design so we never had 3D-printed parts bumping into bony prominences, which can lead to abrasions [and bruising].

One problem that the designers faced in this case was that a paralyzed person like Boxtel often can’t know that bruising is happening because they can’t feel it. This is dangerous because undetected bruises or abrasions can become infected. In addition, because 3D-printing allows the creation of very fine details, Boxtel’s suit was designed to allow her skin to breathe, meaning she can walk around without sweating too much.

The process of creating the 3D-printed robotic suit lasted about three months, starting when Summit and 3D Systems CEO Avi Reichental met Boxtel during a visit to EksoBionics. Boxtel is one of ten EksoBionics “test pilots”, and the exoskeleton was already designed to attach to the body very loosely with Velcro straps, with an adjustable fit. But it wasn’t yet tailored to fit her alone.

That’s where 3D Systems came into play, by using a special 3D scanning system to create the custom underlying geometry that would be used to make the parts that attach to the exoskeleton. As Boxtel put it:

When the robot becomes the enabling device to take every step for the rest of your life, the connection between the body and the robot is everything. So our goal is to enhance the quality of that connection so the robot becomes more symbiotic.

And human beings aren’t the only ones able to take advantage of this marriage between 3-D printing and biomedicine. Not surprisingly, animals are reaping the benefits of the latest technological breakthroughs in these fields as well, as evidenced by a little duck named Dudley from the K911 animal rescue service in Sicamous, Canada.

Not too long ago, Dudley lost a leg when a chicken in the same pen mauled him. But thanks to a 3-D printed leg design, specially made for him, he can now walk again. It was created by Terence Loring of 3 Pillar Designs, a company that specializes in 3D-printing architectural prototypes. After hearing of Dudley’s plight through a friend, he decided to see what he could do to help.

Unlike the printed foot previously fashioned for Buttercup the duck, Loring sought to create an entire limb that could move. The first limb he designed had a jointed construction and was fully 3D-printed in plastic. Unfortunately, the leg broke the moment Dudley put it on, forcing Loring to go back to the drawing board for a one-piece design printed from softer plastic.

The subsequent leg he created had no joints but could bend on its own. And when Dudley put it on, he started walking straight away and without hesitation. Issues remain to be solved, like how to prevent friction sores – a problem that Mike Garey (who designed Buttercup’s new foot) solved with a silicone sock and prosthetic gel liner.

Nevertheless, Dudley is nothing if not as happy as a duck in a pond, and it seems very likely that any remaining issues will be ironed out in time. In fact, one can expect that veterinary medicine will fully benefit from the wide range of 3D-printed prosthetic devices and even bionic limbs as research continues to produce new and exciting possibilities.

And in the meantime, enjoy the following videos which show both Amanda Boxtel and Dudley the duck enjoying their new devices and the ways in which they help bring mobility back to their worlds:

 

Amanda Boxtel taking her first steps in 22 years:

 


Dudley the duck walking again:


Sources: news.cnet.com, 3dsystems.com, 3pillardesigns.com

Biomedical Breakthroughs: Bionerves and Restored Sensation

These days, prosthetic devices, bionic limbs, and exoskeletons continue to advance and amaze. Not only are doctors and medical researchers able to restore mobility and sensation to patients missing limbs, they are now crossing a threshold where they can restore these abilities and faculties to patients suffering from partial or total paralysis.

This should come as no surprise, seeing as how the latest biomedical advances – which involve controlling robotic limbs with brain-computer interfacing – offer a very obvious solution for paralyzed individuals. In their case, no robotic limbs or bionic attachments are necessary to restore ambulatory motion, since the limbs themselves were never lost. Instead, what is needed is a way to restore motor control and compensate for the severed nerves.

Thanks to researchers working at Case Western Reserve University in Ohio, a way forward is being proposed. There, a biomedical team is gearing up to combine the BrainGate cortical chip, developed at Brown University, with their own Functional Electrical Stimulation (FES) platform. Through this combination, they hope to remove robots from the equation entirely and go right to the source.

It has long been known that electrical stimulation can directly control muscles, but past attempts to do this artificially have often been inaccurate – and therefore painful and potentially damaging – for the patient. Stimulating the nerves directly using precisely positioned arrays is a much better approach, something that another team at Case Western recently demonstrated with their “nerve cuff electrode”.

This electrode is a direct stimulation device that is small enough to be placed around small segments of nerve. The Case Western team used the cuff to provide an interface for sending data from sensors in the hand back to the brain using sensory nerves in the arm. With FES, the same kind of cuff electrode can also be used to stimulate nerves going the other direction – in other words, to the muscles.

The difficulty in such a scheme is that even if the motor nerves can be physically separated from the sensory nerves and traced to specific muscles, the exact stimulation sequences needed to produce a proper movement are hard to find. To achieve this, another group at Case Western has developed a detailed simulation of how different muscles work together to control the arm and hand.

Their model consists of 138 muscle elements distributed over 29 muscles, which act on 11 joints. The operational procedure is for the patient to watch the image of a virtual arm while they naturally generate the neural commands that the BrainGate chip picks up to move the arm. In practice, this means trying to make the virtual arm touch a red spot to make it turn green.
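The red-spot training task boils down to a simple loop: decode a movement command from the chip, advance the virtual arm, and recolor the target on contact. The sketch below is a deliberately simplified 2-D stand-in for that loop (the real model drives 138 muscle elements over 11 joints); all names, timesteps, and distances are illustrative.

```python
import math

def training_step(arm_xy, target_xy, decoded_vel, dt=0.02, touch_radius=0.05):
    """Advance the virtual arm by one decoded velocity command.

    Returns the new arm position and the target colour: red until
    touched, green on contact (the patient's visual feedback).
    """
    x = arm_xy[0] + decoded_vel[0] * dt
    y = arm_xy[1] + decoded_vel[1] * dt
    touched = math.hypot(x - target_xy[0], y - target_xy[1]) <= touch_radius
    return (x, y), ("green" if touched else "red")

# Patient "thinks right" until the arm reaches the spot
pos, colour = (0.0, 0.0), "red"
target = (0.10, 0.0)
while colour == "red":
    pos, colour = training_step(pos, target, (1.0, 0.0))
print(colour)  # green
```

The point of the closed loop is that both sides adapt: the decoder is recalibrated against what the patient intends, while the patient learns which imagined movements drive the arm where they want it.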

Currently in clinical trials, the BrainGate2 chip is being developed with the hope of not only stimulating muscles, but generating the same kinds of feedback and interaction that real muscle movement creates. The eventual plan is for the patient and the control algorithm to learn together in tandem, so that a training screen will not be needed at all and the patient will be able to move on their own without calibrating the device.

But at the same time, biotech enhancements that restore sensation to amputees are also improving apace. Consider the bionic hand developed by Silvestro Micera of the École Polytechnique Fédérale de Lausanne in Switzerland. Unlike previous bionic hands, which rely on electrodes to receive the nerve signals that control the hand’s movement, his device also sends electronic signals back to simulate the feeling of touch.

Back in February of 2013, Micera and his research team began testing their bionic hand, and they began clinical trials on a volunteer just last month. Their volunteer, a man named Dennis Aabo Sørensen from Denmark, lost his arm in a car accident nine years ago, and has since become the first amputee to experience artificially-induced sensation in real-time.

In a laboratory setting, wearing a blindfold and earplugs, Sørensen was able to detect how strongly he was grasping, as well as the shape and consistency of different objects he picked up with his prosthetic. Afterwards, Sørensen described the experience to reporters, saying:

The sensory feedback was incredible. I could feel things that I hadn’t been able to feel in over nine years. When I held an object, I could feel if it was soft or hard, round or square.

The next step will involve miniaturizing the sensory feedback electronics for a portable prosthetic, as well as fine-tuning the sensory technology for better touch resolution and increased awareness of finger movement. They will also need to assess how long the electrodes can remain implanted and functional in the patient’s nervous system, though Micera’s team is confident that they would last for many years.

Micera and his team were also quick to point out that Sørensen’s psychological strength was a major asset in the clinical trial. Not only had he been forced to adapt to the loss of his arm nine years ago, he was also willing to face the challenge of experiencing touch again, knowing it would last for only a short period of time. But as he himself put it:

I was more than happy to volunteer for the clinical trial, not only for myself, but to help other amputees as well… There are two ways you can view this. You can sit in the corner and feel sorry for yourself. Or, you can get up and feel grateful for what you have.

The study was published in the February 5, 2014 edition of Science Translational Medicine, and represents a collaboration called Lifehand 2 between several European universities and hospitals. And although a commercially-available sensory-enhanced prosthetic may still be years away, the study provides the first step towards a fully-realizable bionic hand.

Yes, between implantable electronics that can read brainwaves and nerve impulses, computer programs that are capable of making sense of it all, and robotic limbs that are integrated with these machines and our bodies, the future is looking very interesting indeed. In addition to restoring ambulatory motion and sensation, we could be looking at an age where there is no such thing as a “permanent injury”.

And in the meantime, be sure to check out this video of Sørensen’s clinical trial with the EPFL’s bionic hand:


Sources:
extremetech.com, actu.epfl.ch, neurotechnology.com

The Future is Here: AirMule’s Autonomous Demo Flight

Vertical Take-Off and Landing (VTOL) craft have been a subject of military development for some time. In addition to being able to deploy from landing strips that are damaged or too small for conventional aircraft, they are also able to navigate terrain and land where other craft cannot. Add to that the ability to hover and fly close to the ground, and you have a craft that can provide support while avoiding IEDs and landmines.

One concept that incorporates all of these features is the AirMule, a compact, unmanned, single-engine vehicle being developed by Tactical Robotics in Israel. In January of 2013, the company unveiled the prototype, which it claimed was created for the sake of supporting military personnel, evacuating the wounded, and conducting remote reconnaissance missions.

Now, less than a year later, the prototype aircraft has demonstrated its ability to fly autonomously, bringing it one step closer to carrying out a full mission demo. During the test, which took place in December, the craft autonomously performed a vertical take-off, flew to the end of a runway, then turned around on the spot and flew back to its starting point.

All the while, it maintained altitude using two laser altimeters, while maintaining position via a combination of GPS, an inertial navigation system, and optical reference to markers on the ground. These autonomous systems, which allow it to fly on its own, can also be countermanded in favor of remote control, in case a mission gets particularly hairy and requires a human controller.
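To make that navigation setup concrete, here is a minimal sketch of blending the three position sources mentioned above – GPS, inertial navigation, and optical ground markers – into one estimate. A real flight computer would use a Kalman-style filter that weights each source by its current uncertainty; the fixed weights below are invented purely for illustration.

```python
def fuse_position(gps, ins, markers, weights=(0.4, 0.2, 0.4)):
    """Weighted average of three (x, y) position estimates in metres.

    gps, ins, markers: (x, y) tuples from the three sensor sources.
    weights must sum to 1; a Kalman filter would adapt them online.
    """
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    x = sum(w * p[0] for w, p in zip(weights, (gps, ins, markers)))
    y = sum(w * p[1] for w, p in zip(weights, (gps, ins, markers)))
    return (x, y)

# Three noisy readings of the same hover point blend into one fix
print(fuse_position((10.2, 5.1), (10.0, 5.0), (10.1, 4.9)))
```

Redundancy is the point: if GPS drops out in a canyon or under jamming, the remaining sources still bound the drift.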

In its current form, the AirMule possesses many advantages over other VTOL craft, such as helicopters. For starters, it weighs only 770 kg (1,700 lb) – as opposed to the Bell UH-1’s empty weight of 2,365 kg (5,215 lb) – can carry a payload of up to 640 kg (1,400 lb), has a top speed of 180 km/h (112 mph), and can reach a maximum altitude of 12,000 ft (3,658 m).

In short, it has a better mass-to-carrying-capacity ratio than a helicopter, comparable performance, and can land and take off within an area of 40 square meters (430.5 sq ft), which is significantly smaller than what a manned helicopter requires for a safe landing. The internal rotor blades are reportedly also much quieter than those of a helicopter, giving the matte-black AirMule some added stealth.
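The payload claim is easy to check from the figures quoted above: the AirMule can lift roughly 83% of its own empty weight.

```python
empty_kg, payload_kg = 770, 640     # figures quoted in the article
ratio = payload_kg / empty_kg
print(f"payload / empty weight = {ratio:.2f}")  # 0.83
```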

Plans now call for “full mission demonstrations” next year, utilizing a second prototype that is currently under construction. When complete, this vehicle and those like it can be expected to be deployed to many areas of the world, assisting Coalition and other forces in dirty, dangerous environments where landmines, IEDs, and other man-made and natural hazards are common.

Alongside machines like the Alpha Dog, LS3, and WildCat – machines built by Boston Dynamics (recently acquired by Google) to offer transport and support to infantry in difficult terrain – efforts to “unman the front lines” through the use of autonomous drones or remote-controlled robots continue. Clearly, the future battlefield is a place where robots will be offering a rather big hand!

 

And be sure to check this video of the AirMule demonstration, showing the vehicle take-off, hover, fly around, and then come in for a landing:


Sources: gizmag.com, tactical-robotics.com