Update: 3D-Printed Gun Faces Crackdown

Just a few days ago, Defense Distributed announced the creation of the world’s first gun that is made entirely out of 3D-printed parts. And as anticipated, it didn’t take long for a crackdown to ensue. The group’s leader, Cody Wilson, after conducting the first successful firing test of “The Liberator”, claimed that the blueprints would be uploaded to the open-source website Defcad so they would be available to anyone.

Yesterday, less than a week after the announcement was made, Mr. Wilson claimed that Defcad is “going dark” at the request of the U.S. Department of Defense Trade Controls. Defense Distributed runs the website, which has been a provider of weapons-related 3D printer blueprints since the group was founded.

Defense Distributed’s new magazines

As of yesterday, the site contained only a brief message explaining why the Liberator blueprints were no longer available:

Defcad files are being removed from public access at the request of the U.S. Department of Defense Trade Controls. Until further notice, the United States government claims control of the information.

The group’s Twitter feed also contained the following message:

#DEFCAD has gone dark at the request of the Department of Defense Trade Controls. Take it up with the Secretary of State.

The weapon itself was the result of eight months of research and testing on the part of Wilson and his group. In that time, the group has become a source of controversy due to their dedication to making blueprints for printable gun parts available online. These include components for the AR-15 assault rifle and extended magazines for the AK-47 assault rifle.

However, the Liberator, named in honor of the single-shot pistols that were dropped on France during the Second World War, was the first design to be made entirely out of ABS plastic, making it the first open-source “Wiki-weapon” that would be available to anyone with the means to print it.

As a result of their commitment to open-source weaponry, Defense Distributed has become the subject of penalties and restrictions. In fact, Defcad was created after Makerbot Industries chose to purge all of the group’s gun blueprints from its website. Shortly after they test-fired an AR-15 that included printed parts, Wilson and his associates also had their 3D printer, which they had been leasing, seized.

This latest decision targets their activities at their source. However, the decision to take the plans off of Defcad did not prevent an estimated 10,000 downloads. It is not clear, though, whether those who obtained the plans will be able to print them off at their local printing shop. Only those who already possess a 3D printing unit, which is likely to run them between $1,000 and $3,000, will be able to produce their own version of the Liberator.

In short, this issue is not yet resolved. Knowing Wilson and his admirers, open-source, printable weapons are likely to remain a contentious issue for some time to come…

Source: cbc.ca

The Future is Here: The 3D-printed Robotic Hand

The field of robotics has been advancing by leaps and bounds in recent years, especially where robotic limbs and prosthetics are concerned. But until recently, cost has remained an issue. With top-of-the-line bionic limbs – like the BeBionic, which costs up to $35,000 – out of reach for most amputees, it is little surprise that there are many efforts to create robotic limbs that are both cheaper and more accessible.

Last month, DARPA announced the creation of a robotic hand that could perform complex tasks, and which was made using cheap electronic components. And then there’s Robohand, the online group that creates 3D-printed robotic hands for children with a free, open-source 3D-printing pattern available on Thingiverse for people who wish to make their own.


And now, Christopher Chappell of the U.K. wants to take things a step further with his “Anthromod”. Using Kickstarter, a crowdfunding website, he has started a campaign for a 3D-printed robotic hand that is a little bit more sophisticated than the Robohand, but would cost around $450. In short, the proposed design offers much of the functionality of a bionic limb, but at a cost that is far more affordable.

To break it down, the arm uses a tendon system of elastic bands, with the movement provided by five hobby servos that are in turn built out of off-the-shelf electronics. Wearers will be able to move all four of the unit’s fingers, the thumb and the wrist once the sensors have been calibrated, and the software to control the hand and EEG sensors is available online for free. This all adds up to a unit that is not only more affordable, but easy to assemble, repair and maintain.
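For readers curious what driving such a tendon system might look like in software, here is a minimal sketch of commanding a five-servo hand from a PC. The serial command format, channel numbers and angle range are assumptions made for illustration; this is not the actual Anthromod control software mentioned above.

```python
# Minimal sketch of driving a five-servo, tendon-based hand over a serial link.
# The channel numbers, the "#<channel>P<angle>" command format and the 0-180
# degree range are illustrative assumptions, not the project's real firmware.
import serial  # pyserial

SERVO_CHANNELS = {"thumb": 0, "index": 1, "middle": 2, "ring": 3, "pinky": 4}

def flex_to_angle(flex, min_angle=0, max_angle=180):
    """Map a normalized flex value (0.0 = open, 1.0 = closed) to a servo angle."""
    flex = max(0.0, min(1.0, flex))
    return int(min_angle + flex * (max_angle - min_angle))

def set_digit(port, digit, flex):
    """Send one digit's target angle to a hypothetical serial servo controller."""
    command = f"#{SERVO_CHANNELS[digit]}P{flex_to_angle(flex)}\n"
    port.write(command.encode())

if __name__ == "__main__":
    with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:
        for digit in SERVO_CHANNELS:
            set_digit(port, digit, 0.5)  # half-close every digit
```

In a real build, the flex values would come from whatever input the wearer uses, whether calibrated sensors or the EEG headset mentioned above.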

On the project’s Kickstarter page, Chappell describes the campaign and its long-term goals:

Our Kickstarter campaign is to develop a humanoid robotic hand and arm that is of far lower cost than any other available. We believe that this will open up robotics to a far wider market of makers and researchers than has ever been possible. This should then trigger an explosion of creativity in the areas of robotics, telepresence and ultimately prosthetics.

Much like the InMoov, a 3D-printed android with limited function, the Anthromod represents an age of robotics that is accessible to the public. And with time, it’s not hard to imagine an entire line of enhancements and robotics, such as household servants and cybernetic components, that could be manufactured in-house, provided you’re willing to shell out the money for an industrial-sized 3D printer!

To check out the Anthromod website, click here. And be sure to check out the video below of their hand in action.

Note: As of this article’s writing, Chappell and his colleagues passed their goal of £10,000 and reached a whopping total of £12,086 ($18,808 US). Congratulations folks!


Sources:
news.cnet.com, kickstarter.com

 

The World’s First Completely 3D-Printed Gun

Since its inception, 3D printing has offered people a wide range of manufacturing possibilities, ranging from the creation of intricate prototypes to drugs and even human tissue. However, one of the most controversial manufactured items to come from the technology has been what the Texas-based organization known as Defense Distributed refers to as “Wiki-weapons”, guns that can be made by anyone using downloaded blueprints and a public printer.

Not long ago, the group announced that they had successfully created a working AR-15 assault weapon using some printed parts. This drew sharp criticism from advocates of gun control, in part because the same weapon was used in the Newtown, Connecticut school shooting. However, Cody Wilson, founder of DD, announced that they would continue to pursue their goal of making printed guns, stating that their commitment to the 2nd Amendment took precedence over a single tragedy.

And now, it appears that they have gone a step further, unveiling the world’s first fully 3D-printed weapon. Save for a nail that is used as the firing pin, the gun is made up entirely of printed parts, can fire normal ammunition and is capable of making it past a metal detector. It’s called the Liberator, the product of eight months of labor by Cody and his group, and named in honor of the single-shot pistols that were airdropped by the Allies on France during the Second World War.

In an interview with Forbes, Cody and his group demonstrated their first test firing, which was a success. He also claimed that the Liberator will be capable of connecting to different barrels, allowing it to fire various calibers of ammunition, and he plans to publish the files necessary to print it at home, as well as details on its operation, so that anyone can produce their own.

This is all in keeping with Cody’s vision – being a hardcore libertarian and anarchist – to create a class of weapon that anyone can produce, circumventing the law and the regulatory process. At the same time though, Defense Distributed did decide to include a small chunk of metal in the final design to ensure that the gun couldn’t pass through a metal detector undetected. This is in compliance with the Undetectable Firearms Act, and may have been motivated by the group’s sagging public image.

However, this has not stopped the group from obtaining a federal firearms license this past March, making it a legal gun manufacturer. And once the file is online, anybody will be able to download it. What’s more, all attempts to limit DD’s activities, which include printing firms purging gun parts from their databases, have made Cody even more eager to pursue his aims. In a statement made to Forbes magazine, he said:

You can print a lethal device. It’s kind of scary, but that’s what we’re aiming to show… Everyone talks about the 3D printing revolution. Well, what did you think would happen when everyone has the means of production? I’m interested to see what the potential for this tool really is. Can it print a gun?

Well, Mr. Wilson, we’re about to find out! And if I were a betting man, I would say the “potential” will include more unregistered firearms, a terrorist act or shooting that will involve a partially printed weapon, and Wilson’s continued refusal to reform his ways, citing the 2nd Amendment as always. Libertarians are nothing if not predictable!

Sources: tech.fortune.cnn.com, forbes.com

 

The Future is Here: Using 3D Printing and DNA to Recreate Faces

In what is either one of the most novel or frightening stories involving 3D printing and genetic research, it seems that an artist named Heather Dewey-Hagborg has been using the technology to recreate the faces of litterbugs. This may sound like something out of a dystopian novel – a high-tech means of identifying the perpetrators of petty crimes – but in fact, it is the basis of her latest art project.

It’s known as Stranger Visions, a series of 3D-printed portraits based on DNA samples taken from objects found on the streets of Brooklyn. Using samples of discarded gum and litter collected from the streets, and her work with a DIY biology lab in Brooklyn called Genspace – where she met a number of biologists who taught her everything she now knows about molecular biology and DNA – she was able to reconstruct what the strangers looked like and then printed the phenotypes out as a series of 3D portraits.

According to Dewey-Hagborg, the inspiration for this project came to her while waiting for a therapy session, when she noticed a framed print on the wall that contained a small hair inside the cracked glass. After wondering who the hair belonged to, and what the person looked like, she became keenly aware of the genetic trail left by every person in their daily life, and began to question what physical characteristics could be identified through the DNA left behind on a piece of gum or cigarette butt.

In a recent interview, Dewey-Hagborg explained the rather interesting and technical process behind her art:

So I extract the DNA in the lab and then I amplify certain regions of it using a technique called PCR – Polymerase Chain Reaction. This allows me to study certain regions of the genome that tend to vary person to person, what are called SNPs or Single Nucleotide Polymorphisms.

I send the results of my PCR reactions off to a lab for sequencing and what I get back are basically text files filled with sequences of As, Ts, Cs, and Gs, the nucleotides that compose DNA. I align these using a bioinformatics program and determine what allele is present for a particular SNP on each sample.


Then I feed this information into a custom computer program I wrote which takes all these values which code for physical genetic traits and parameterizes a 3d model of a face to represent them. For example gender, ancestry, eye color, hair color, freckles, lighter or darker skin, and certain facial features like nose width and distance between eyes are some of the features I am in the process of studying.

I add some finishing touches to the model in 3d software and then export it for printing on a 3d printer. I use a Zcorp printer which prints in full color using a powder type material, kind of like sand and glue.
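To give a rough sense of what that last step might look like in code, here is a toy sketch of translating allele calls into parameters a face model could consume. It is not Dewey-Hagborg’s actual program: the mappings are deliberately simplified, and real trait prediction involves many markers and plenty of uncertainty.

```python
# Toy sketch of the "alleles -> face parameters" step described above.
# The genotype-to-trait rules are simplified illustrations, not the
# artist's program or a validated forensic model.

# Allele calls determined from sequencing, keyed by SNP identifier.
sample_alleles = {
    "rs12913832": "GG",  # HERC2 variant commonly associated with blue eyes
    "rs1426654": "AA",   # SLC24A5 variant associated with lighter skin
    "rs4988235": "CT",   # included to show how unmapped SNPs are skipped
}

# Each rule maps a genotype at a SNP to a named parameter for the 3D model.
trait_rules = {
    "rs12913832": {"GG": ("eye_color", "blue"),
                   "AG": ("eye_color", "hazel"),
                   "AA": ("eye_color", "brown")},
    "rs1426654": {"AA": ("skin_tone", 0.2),
                  "AG": ("skin_tone", 0.5),
                  "GG": ("skin_tone", 0.8)},
}

def alleles_to_parameters(alleles):
    """Translate genotype calls into parameters a face model could consume."""
    params = {}
    for snp, genotype in alleles.items():
        rule = trait_rules.get(snp)
        if rule and genotype in rule:
            name, value = rule[genotype]
            params[name] = value
    return params

print(alleles_to_parameters(sample_alleles))
# e.g. {'eye_color': 'blue', 'skin_tone': 0.2}
```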

The resulting portraits are bizarre approximations of anonymous people who unknowingly left their genetic material on a random city street. Naturally, there are plenty of people who wonder how accurate her approximations are. Well, according to Dewey-Hagborg, the portraits bear a “family resemblance” to the subject, and at this time, no one has recognized themselves in any of her exhibitions. Yet…

And of course, there are limitations with this sort of phenotype-DNA identification. For starters, it is virtually impossible to determine the age of a person from their DNA alone. In addition, facial features like scars and hair growth cannot be gauged, so Dewey-Hagborg casts each portrait as if the person were around 25 years of age.

And yet, I cannot help but feel that there is some awesome and terrible potential in what Dewey-Hagborg has created here. While her artistic vision had to do with the subject of identity and anonymity in our society, there is potential here for something truly advanced and invasive. Already it has been suggested that DNA identification could be the way of the future, where everyone’s identity is kept in a massive database that can be used either to track them or to eliminate them as suspects in criminal cases.

But in cases where the person’s DNA is not yet on file, police would no longer need to rely on sketch artists to identify potential perps. Instead, they could just reconstruct their appearances based on a single strand of DNA, and use existing software to correct for age, hair color, facial hair, scars, etc, and then share the resulting images with the public via a public database or press releases.

And as Dewey-Hagborg’s own project shows, the potential for public exposure and identification is huge. With a sophisticated enough process and a quick turnover rate, cities could identify entire armies of litterbugs, polluters, petty criminals and even more dangerous offenders, like pedophiles and stalkers, and publicly shame them by posting their faces for all to see.

But of course, I am forced to acknowledge that Dewey-Hagborg conducted this entire project using a DIY genetics lab and through her own ardent collection process. Whereas some would see here an opportunity for Big Brother to mess with our lives, others would see further potential for a democratic, open process where local communities are able to take genetics and identification into their own hands.

Like I said, the implications and potential being shown here are both awesome and scary!

Source: thisiscolossal.com

NASA’s 3D Printed Moon Base

Sounds like the title of a funky children’s story, doesn’t it? But in fact, it’s actually part of NASA’s plan for building a Lunar base that could one day support inhabitants and make humanity a truly interplanetary species. My thanks to Raven Lunatick for once again beating me to the punch! While I don’t consider myself the jealous type, knowing that my friends and colleagues are in the know before I am on stuff like this always gets me!

In any case, people may recall that back in January of 2013, the European Space Agency announced that it could be possible to build a Lunar Base using 3D printing technology and moon dust. Teaming up with the architecture firm Foster + Partners, they were able to demonstrate that one could fashion entire structures cheaply and quite easily using only regolith, inflatable frames, and 3D printing technology.

And now, it seems that NASA is on board with the idea and is coming up with its own plans for a Lunar base. Much like the ESA’s planned habitat, NASA’s would be located in the Shackleton Crater near the Moon’s south pole, where sunlight (and thus solar energy) is nearly constant along the crater’s rim thanks to the Moon’s slight axial tilt. What’s more, NASA’s plan would also rely on the combination of lunar dust and 3D printing for the sake of construction.

However, the two plans differ in some key respects. For one, NASA’s plan – which goes by the name of SinterHab – is far more ambitious. As a joint research project between space architects Tomas Rousek, Katarina Eriksson and Ondrej Doule and scientists from NASA’s Jet Propulsion Laboratory (JPL), SinterHab is so named because it involves sintering lunar dust: heating it up with microwaves to the point where the dust fuses to become a solid, ceramic-like block.

This would mean that bonding agents would not have to be flown to the Moon, which is called for in the ESA’s plan. What’s more, the NASA base would be constructed by a series of giant spider robots designed by JPL Robotics. The prototype version of this mechanical spider is known as the Athlete rover, which despite being a half-size variant of the real thing has already been successfully tested on Earth.

Each one of these robots is human-controlled, has six 8.2m legs with wheels at the end, and comes with a detachable habitable capsule mounted at the top. Each limb has a different function, depending on what the controller is looking to do. For example, it has tools for digging and scooping up soil samples, manipulators for poking around in the soil, and will have a microwave 3D printer mounted on one of the legs for the sake of building the base. It also has 48 3D cameras that stream video to its operator or a remote controlling station.

The immediate advantages to NASA’s plan are pretty clear. Sintering is quite cheap, in terms of power as well as materials, and current estimates claim that an Athlete rover should be able to construct a habitation “bubble” in only two weeks. Another benefit of the process is that astronauts could use it on the surface of the Moon surrounding their base, binding dust and stopping it from clogging their equipment. Moon dust is extremely abrasive, made up of tiny, jagged particles rather than finely eroded spheres.

Since it was first proposed in 2010 at the International Aeronautical Congress, the concept of SinterHab has been continually refined and updated. In the end, a base built on its specifications will look like a rocky mass of bubbles connected together, with cladding added later. The equilibrium and symmetry afforded by this design not only ensure that grouping modules together will be easy, but also guarantee the structural integrity and longevity of the structures.

As engineers have known for quite some time, there’s just something about domes and bubble-like structures that makes them last. Ever been to St. Peter’s Basilica in Rome, or the Blue Mosque in Istanbul? Ever looked at a centuries-old building with an onion dome and felt awed by its natural beauty? Well, there’s a reason they’re still standing! Knowing that we can expect similar beauty and engineering brilliance down the road gives me comfort.

In the meantime, have a gander at the gallery for the proposed SinterHab base, and be sure to check out this video of the Athlete rover in action:

Sources: Wired.co.uk, robotics.jpl.nasa.gov

The Future is Here: The 3D Scanner!

Okay, that title might be a bit misleading, but after years of developing the technology, it seems that we might have something which is essentially the next best thing. Until recently, 3D printers were designed for use exclusively by trained technicians. And despite the ease with which modern 3D printers can be used, it is still difficult to design and prep the requisite models, a task which requires expertise in modelling software.

But that too could be changing, thanks to MakerBot’s new Digitizer Desktop 3D Scanner. Designed to supplement their printer (the Replicator 2), this device is capable of scanning any object, creating a three-dimensional model, and then uploading it to your printer, where it will be built up into solid form, bit by bit. In effect, people can now create objects as easily as they could print off an inkjet document.

The Digitizer was revealed for the first time at the South by Southwest (SXSW) Conference in Austin, Texas earlier today, where emerging technologies are being showcased alongside the latest in entertainment and music. And while the device was merely a prototype, one which is still undergoing testing and refinement, MakerBot announced that they plan to begin commercial production very soon.

Which makes it official: human beings now have access to Replicator technology and are one step closer to living in a Star Trek universe! Granted, we’re not quite to the point where we can generate anything, including food and precious metals, but this latest development has revealed a future where DIY can encompass just about anything. If an appliance breaks in your home, just scan the faulty component and send it to your printer. No need to contact the manufacturer and activate that troublesome warranty!

Food comes out here, waste goes in the recycling unit!

Naturally, there are concerns about the controversy this will create as well. While the Digitizer Desktop 3D Scanner will certainly be another big step towards making 3D printing more accessible, it’s also sure to add fuel to the debate over the legality and copyright issues of duplicating real-world objects. What’s more, the cost of each unit (most likely a few thousand dollars) is pretty prohibitive for most households, raising the question of real access.

Nevertheless, this is still pretty exciting news. Since the beginning of recorded history, our collective economic models have been based on the idea of resource scarcity. But with further refinements and the ability to generate objects out of more materials (including the organic), all our economic models are likely to change and we could very well be embarking on a future where scarcity has effectively become obsolete.

Yes, with a little more time, research and cool gadgets, we could be witnessing the collapse of financial history, where all units of value will be made useless and, as a species, we’ll be one step closer to economic equilibrium! And I have to admit, this is one area of change that I find exciting, as opposed to scary!

Source: Gizmodo.com

The Future is Here: The 3D Doodler

3D printing technology has been making some serious waves in the scientific, engineering and biomedical communities. It seems that every day, more and more possibilities and applications are being discovered for this revolutionary new process. However, cost remains an obvious issue. With individual printers being large, expensive and tricky to maintain, most people can’t exactly afford to put one in their home and go to town with it.

Enter the 3D Doodler, the world’s first 3-dimensional printing pen that allows you to draw in the air. Much like a standard 3D printer, it employs heated ABS plastic, which cools the moment it is extruded from the device, instantly taking shape. And much like a pen or pencil, it is compact, hand-held, and allows people to literally draw designs into being! It also requires no software or computers, making it inexpensive and easy to use.

And the range of what one can create is pretty much limitless. Using stencils, one can create 3D models for architecture, design specs, and proposed prototypes. Or, if you should so choose, just put the pen to any surface and begin composing shapes, designs and words out of thin air. Art is also an obvious application, since it gives the user the ability to create an endless array of abstract or realistic designs. And of course, modeling, as shown in the video, could become a very popular (and competitive) outlet for its use.


And in another ultra-modern twist, the designers of the 3D Doodler are using their website to solicit funds to help them crowd-fund their idea and make it commercially viable. No trouble there! Of the $30,000 needed to get the prototype off the ground, they have managed to raise a total of $2,106,977 as of this article’s publication. Guess people really do want to see these things get onto the shelf. Look for it at your local hardware or art supply store!

And while you’re waiting, check out the video below to see the pen in action and what things it can make.

Robots Meet the Fashion Industry

Robotics has come a long way in recent years. Why, just take a look at NASA’s X1 robotic exoskeleton, the Robonaut, robotaxis and podcars, the mind-controlled EMT robot suit, Stompy the giant robot, Kenshiro and Roboy, and the 3D printed android. I suppose it was only a matter of time before the world of fashion looked at this burgeoning marketplace and said “me too!”

And here are just some of the first attempts to merge the two worlds. First up, there’s the robot mannequin, a means of making window shopping more fun for consumers. Known as the MarionetteBot, this automaton has already made several appearances in shops in Japan and can be expected to make debut appearances across Asia, North America and the EU soon enough!

Check out the video below to see the robot in action. Created for the Japanese fashion retailer United Arrows, the mannequin uses a Kinect to capture and help analyze the movements of a person while a set of motors pulls a total of 16 wires to match the person’s pose. Though it is not yet fast or limber enough to perfectly mimic the moves of a person, the technology shows promise, and has provided many a window-shopper with plenty of entertainment!
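For a sense of how this kind of pose mimicry could work in principle, here is a minimal sketch that maps tracked joint angles to target lengths for the mannequin’s 16 wires. The joint names, the one-joint-per-wire mapping and the millimetre ranges are guesses made for illustration, not United Arrows’ actual control software.

```python
# Rough sketch of pose mimicry: skeleton-tracker joint angles are converted
# into target lengths for 16 control wires. The joint names, one-joint-per-wire
# mapping and length ranges are illustrative assumptions only.
NUM_WIRES = 16
WIRE_TO_JOINT = {wire: f"joint_{wire}" for wire in range(NUM_WIRES)}

def angle_to_wire_length(angle_deg, slack_mm=50.0, travel_mm=200.0):
    """Convert a joint angle (0-180 degrees) into how far a wire should be reeled in."""
    fraction = max(0.0, min(180.0, angle_deg)) / 180.0
    return slack_mm + fraction * travel_mm

def compute_wire_targets(joint_angles):
    """Given joint angles from the tracker, return a target length (mm) per wire."""
    return {wire: angle_to_wire_length(joint_angles.get(joint, 90.0))
            for wire, joint in WIRE_TO_JOINT.items()}

# Example: pretend the tracker reported one raised elbow; every other joint
# falls back to a neutral 90 degrees.
print(compute_wire_targets({"joint_3": 140.0}))
```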


And next up, there’s the equally impressive FitBot, a shape-shifting mannequin that is capable of emulating thousands of body types. Designed by the British virtual shopping company Fits.Me, the FitBot is designed to help take some of the guesswork out of online shopping, where a good 25% of purchases are regularly returned because they were apparently the wrong size.

But with the FitBots, along with a virtual fitting room, customers will be able to see right away what the clothes will look like on them. The only downside is that you will have to know your exact measurements, because that’s what the software will use to adjust the bot’s body. Click here to visit the company’s website and see how the virtual fitting room works, and be sure to check out their video below:


What does the future hold for the fashion industry and high-tech? Well, already customers are able to see what they look like using Augmented Reality displays, and can get pictures thanks to tablet and mobile phone apps that present them with the image before they make a purchase. Not only does this take a lot of the legwork out of the process, it’s also much more sanitary as far as trying on clothes is concerned. And in a world where clothing can be printed on site, it would be downright necessary.

The "magic mirror"
The “magic mirror”

But in the case of online shopping, it’s likely to take the form of a Kinect-style device in your computer, which scans your body and lets you know what size to get. How cool/lazy would that be? Oh, and as for those AR displays that put you in the clothes you want? They should come with a disclaimer: objects in mirror are less attractive than they appear!

Sources: en.akihabaranews.com, technabob.com

Biotech News: Artificial Ears and Bionic Eyes!

Last week was quite the exciting time for the field of biotechnology! Thanks to improvements in 3D printing and cybernetics – the one seeking to use living cells to print organic tissues and the other seeking to merge the synthetic with the organic – the line between artificial and real is becoming blurrier all the time. And as it turns out, two more major developments were announced just last week which have blurred it even further.

The first came from Cornell University, where a team of biotech researchers demonstrated that it was possible to print a replacement ear using a 3D printer and an injection of living cells. Using a process the team refers to as “high-fidelity tissue engineering”, they used cartilage from a cow for the ear’s interior and overlaid it with artificially generated skin cells to produce a fully organic replacement.

This process builds on a number of breakthroughs in recent years involving 3D printers, stem cells, and the ability to create living tissue by arranging these cells in predetermined patterns. Naturally, the process is still in its infancy; but once refined, it will allow biomedical engineers to print customized ears for children born with malformed ones, or people who have lost theirs to accident or disease.

What’s more, the Cornell research team also envisions a day in the near future when it’ll be possible to cultivate enough of a person’s own tissue so that the growth and implantation can happen all within the lab. And given the recent breakthrough at the Wake Forest Institute for Regenerative Medicine – where researchers were able to create printed cartilage – it won’t be long before all the bio-materials can be created on-site as well.

The second breakthrough, which also occurred this past week, took place in Germany, where researchers unveiled the world’s first high-resolution, user-configurable bionic eye. Known officially as the “Alpha IMS retinal prosthesis”, the device comes to us from the University of Tübingen, where scientists have been working for some time to build and improve upon existing retinal prosthetics, such as the Argus II – a retinal prosthesis developed by the California-based company Second Sight.

Much like its predecessor, the Alpha IMS helps to restore vision by imitating the functions of a normal eye, where light is converted into electrical signals by your retina and then transmitted to the brain via the optic nerve. In an eye that’s been afflicted by macular degeneration or diabetic retinopathy, these signals aren’t generated. Thus, the prosthetic works by essentially replacing the damaged piece of your retina with a computer chip that generates electrical signals that can be understood by your brain.

But of course, the Alpha IMS improves upon previous prosthetics in a number of ways. First, it is connected to your brain via 1,500 electrodes (as opposed to the Argus II’s 60 electrodes), providing unparalleled visual acuity and resolution. Second, whereas the Argus II relies on an external camera to relay data to the implant embedded in your retina, the Alpha IMS is completely self-contained. This allows users to swivel the eye around as they would a normal eye, whereas the Argus II and others like it require the user to turn their head to change their angle of sight.

Here too the technology is still in its infancy and has a long way to go before it can outdo the real thing. For the most part, bionic eyes still rely heavily on the user’s brain to make sense of the alien signals being pumped into it. However, thanks to the addition of configurable settings, patients have a degree of control over how they perceive their environment that the rest of us cannot begin to enjoy. So really, it’s not likely to be too long before these bionic implants improve upon the fleshy ones we come equipped with.
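To make the “light in, stimulation out” idea a little more concrete, here is a toy sketch of what a self-contained chip of this kind essentially has to do: turn each patch of incoming light into a stimulation level, with a couple of user-adjustable knobs standing in for the configurable settings mentioned above. The grid size, current ceiling and linear mapping are simplifying assumptions, not Alpha IMS internals.

```python
# Toy model of a photodiode-based retinal chip: downsample a frame to one
# value per electrode and map intensity to a stimulation level. The 38x38
# grid (~1,444 sites, close to the 1,500 electrodes mentioned above), the
# current ceiling and the linear mapping are illustrative assumptions.
import numpy as np

GRID = 38             # electrodes per side (assumption)
MAX_CURRENT_UA = 2.0  # arbitrary stimulation ceiling, in microamps

def frame_to_stimulation(image, brightness=1.0, contrast=1.0):
    """Average-pool a grayscale frame onto the electrode grid, apply the
    user's brightness/contrast settings, and scale to stimulation currents."""
    img = np.asarray(image, dtype=float) / 255.0
    h, w = img.shape
    bh, bw = h // GRID, w // GRID
    blocks = img[:bh * GRID, :bw * GRID].reshape(GRID, bh, GRID, bw)
    pooled = blocks.mean(axis=(1, 3))
    adjusted = np.clip((pooled - 0.5) * contrast + 0.5 * brightness, 0.0, 1.0)
    return adjusted * MAX_CURRENT_UA

# Example: a synthetic 380x380 frame with a bright square in the middle.
frame = np.zeros((380, 380))
frame[150:230, 150:230] = 255
currents = frame_to_stimulation(frame, brightness=1.0, contrast=1.2)
print(currents.shape)  # (38, 38) stimulation levels, one per electrode
```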

Wow, what a week! It seems that these days, one barely has to wait at all to find that the next big thing is happening right under their very nose. I can foresee a future where people no longer fear getting into accidents, suffering burns, or losing their right eye (or left, I don’t discriminate). With the ability to regrow flesh and cartilage, and replace organic tissues with bionic ones, there may yet come a time when a human can have a close shave with death and be entirely rebuilt.

I foresee death sports becoming a hell of a lot more popular in this future… Well, crap on me! And while we’re waiting for this future to occur, feel free to check out this animated video of the Alpha IMS being installed and how it works:


Sources:
IO9.com, Extremetech.com

The Autonomous Robotic 3D Printer!

Technophiles and fans of post-apocalyptic robo-fiction, your attention please! As if the field of 3D printing was not already impressive and/or scary enough, it seems that patents have been filed for the creation of a machine that can perform the job autonomously. It’s called the Robotic Fabricator, a robot-assisted all-in-one design that can print, mill, drill, and finish a final product – all without human intervention.

Typically, 3D printers require human handlers to oversee the production process: removing unwanted materials such as burrs on plastic and metal parts, repositioning and removing printed objects, and getting rid of powdery residue from the interiors of intricate structures. But this machine, once complete, will take away the need for an operator entirely.


The company responsible for this new concept is iRobot (no joke), the same people who brought us the Roomba vacuum robot. The machine features a flexible pair of robot arms and grippers that exhibit an impressive six degrees of freedom. And the platform is equipped with a series of sensors that tell the computer where it is in the production process, and when to employ the additive technique of 3D printing or the subtractive technique of milling and drilling.


Naturally, iRobot plans to make the machine readily available to industries for the sake of producing and repairing a wide-range of consumer products. In terms of materials, the company claims it will be able to handle everything from ABS, polycarbonate, and silicone rubbers, to urethane rubbers, plastics, and low-melting-temperature metals, as well as combinations of these. What’s more, it will even be able to manufacture components for more autonomous 3D printers!

Picture it, if you dare. If this machine proves successful, it may very well become the precursor for a new breed of machinery that can assemble just about anything from scratch – including itself! As Futurists and Apocalyptics love to point out, machines that are capable of self-replicating and producing new and ever more complex forms of machinery are the key to the future, or to Armageddon.

Both fine choices, depending on what floats your boat!

Sources: IO9.com, www.3ders.org