Jack Andraka is at it again! For those who follow this blog (or subscribe to Forbes or watch TED Talks), this young man probably needs no introduction. But if not, then you might not know that Andraka is the young man who – at 15 years of age – invented an inexpensive litmus test for detecting pancreatic cancer. This invention won him first prize at the 2012 Intel International Science and Engineering Fair (ISEF), and was followed up less than a year later with a handheld device that could detect cancer and even explosives.
And now, Andraka is back with yet another invention: a biosensor that can quickly and cheaply detect water contaminants. His microfluidic biosensor, developed with fellow student Chloe Diggs, recently took the $50,000 first prize among high school entrants in the Siemens We Can Change the World Challenge. The pair developed their credit card-sized biosensor after learning about water pollution in a high school environmental science class.
We had to figure out how to produce microfluidic [structures] in a classroom setting. We had to come up with new procedures, and we custom-made our own equipment.
According to Andraka, the device can detect six environmental contaminants: mercury, lead, cadmium, copper, glyphosate, and atrazine. It costs a dollar to make and takes 20 minutes to run, making it 200,000 times cheaper and 25 times more efficient than comparable sensors. At this point, making scaled-down versions of expensive sensors that can save lives has become second nature to Andraka. And in each case, he is able to do it in a way that is extremely cost-effective.
For example, Andraka’s litmus-test cancer detector was shown to be 168 times faster than current tests, 90% accurate, and 400 times more sensitive. In addition, his paper test costs 26,000 times less than conventional methods – which include CT scans, MRIs, ultrasounds, and cholangiopancreatography. These tests not only involve highly expensive equipment, but are also usually administered only after serious symptoms have manifested themselves.
In much the same vein, Andraka’s handheld cancer/explosive detector was manufactured using simple, off-the-shelf consumer products. Using a simple cell phone case, a laser pointer and an iPhone camera, he was able to craft a device that does the same job as a Raman spectrometer, but at a fraction of the size and cost. Whereas a conventional spectrometer is the size of a room and costs around $100,000, his handheld device is the size of a cell phone and costs about $15 in components.
As part of the project, Diggs and Andraka also developed an inexpensive water filter made out of plastic bottles. Next, they hope to do large-scale testing for their sensor in Maryland, where they live. They also want to develop a cell-phone-based sensor reader that lets users quickly evaluate water quality and post the test results online. Basically, it’s all part of what is fast becoming the digitization of health and medicine, where the sensors are portable and the information can be uploaded and shared.
This isn’t the only project that Andraka has been working on of late. Along with the two other Intel Science Fair finalists – who came together with him to form Team Gen Z – he’s working on a handheld medical scanner that will be entered in the Tricorder XPrize. This challenge offers $10 million to any laboratory or private inventor that can develop a device capable of diagnosing 15 diseases in 30 patients over a three-day period while still being small enough to carry.
For more information on this project and Team Gen Z, check out their website here. And be sure to watch their promotional video for the XPrize competition:
Source: fastcoexist.com

When it comes to the future, it is clear that the concept of the “Internet of Things” holds sway. This idea – which states that all objects will someday be identifiable thanks to a virtual representation on the internet – is at the center of a great deal of innovation that drives our modern economy. Be it wearables, wireless, augmented reality, voice or image recognition, technologies that help us combine the real with the virtual are on the rise.
As Ambarish Mitra, the head of Blippar, stated, AR is already gaining traction among consumers thanks to some of the world’s biggest industrial players recognizing the shift to visually mediated lifestyles. Examples include IKEA’s interactive catalog, Heinz’s AR recipe booklet, and Amazon’s recent integration of the Flow AR technology into its primary shopping app. As this trend continues, we will need a Wikipedia-like database for 3-D objects that will be available to us anytime, anywhere.
For better or for worse, wearable designs of consumer electronics have come to reflect a new understanding in the past few years. Basically, they have come to be extensions of our senses, much as Marshall McLuhan wrote in his 1964 book Understanding Media: The Extensions of Man. Google Glass is representative of this revolutionary change, a step in the direction of users interacting with the environment around them through technology.
Augmented reality has already proven itself to be a multi-million dollar industry – with 60 million users and around half a billion dollars in global revenues in 2013 alone. It’s expected to exceed $1 billion annually by 2015, and combined with a Google Glass-type device, AR could eventually allow individuals to build vast libraries of data that will be the foundation for finding any 3-D object in the physical world.
In a hilarious appearance on “Last Week Tonight” – John Oliver’s HBO show – guest Stephen Hawking spoke about some rather interesting concepts. Among these were the concepts of “imaginary time” and, more interestingly, artificial intelligence. And much to the surprise of Oliver, and perhaps more than a few viewers, Hawking was not too keen on the idea of the latter. In fact, his predictions were just a tad dire.
At worst, this could lead to the machines concluding that humanity is no longer necessary. At best, it would lead to an earthly utopia where machines address all our worries. But in all likelihood, it will lead to a future where the pace of technological change will be impossible to predict. As history has repeatedly shown, technological change brings with it all kinds of social and political upheaval. If it becomes a runaway effect, humanity will find it impossible to keep up.
This past week, the Electronic Entertainment Expo (commonly referred to as E3) kicked off. This annual trade fair, which is presented by the Entertainment Software Association (ESA), is used by video game publishers, accessory manufacturers, and members of the computing industry to present their upcoming games and game-related merchandise. The festivities wrapped up this Friday, and were the source of some controversy and much speculation.
Microsoft and Sony have their respective big guns in the works – Halo 5: Guardians and Uncharted 4: A Thief’s End – but both are scheduled for release in 2015. However, with the brisk sales of the Xbox One and PlayStation 4 consoles, both companies have the luxury of taking their time with big games. Nintendo is not so fortunate, since the jump it made with the Wii U leaves it with a big gap that it apparently isn’t filling.
The software giant bumbled the Xbox One launch last year and alienated many gamers, mainly by focusing on TV and entertainment content instead of gaming and tying several unpopular policies to the console, which included restrictions on used games. The company eventually relented, but the Xbox One still came bundled with the voice- and motion-sensing Kinect peripheral and a price tag that was $100 higher than Sony’s rival PlayStation 4.
That was certainly the focus for Microsoft at E3. TV features weren’t even mentioned during the company’s one-and-a-half-hour press conference on Monday, with Microsoft instead talking up more than 20 upcoming games. As Mike Nichols, corporate vice-president of Xbox and studios marketing, said in an interview:
But the Oculus Rift – the virtual reality headset that was recently bought by Facebook for $2 billion – was undeniably the hottest thing on the show floor. And the demo booth, where people got to try it on and take it for a run, was booked solid throughout the expo. Sony also wowed attendees with demos of its own VR headset, Project Morpheus. And while the PlayStation maker’s effort isn’t as far along in development as the Oculus Rift, it does work and adds legitimacy to the VR field.
Legendary Japanese creator Hideo Kojima also had to defend the torture scenes in his upcoming Metal Gear Solid V: The Phantom Pain, starring Canadian actor Kiefer Sutherland (man loves torture!), which upset some viewers. Kojima said he felt the graphic scenes were necessary to explain the main character’s motivations, and that games will never be taken seriously as culture if they can’t deal with sensitive subjects.
If one were to draw any conclusions from this year’s E3, it would undoubtedly be that times are both changing and staying the same. From console gaming garnering a shrinking share of the gaming market, to the second coming of virtual reality, there is a shift in technology under way that may or may not be good for the current captains of industry. At the same time, the competition to maintain a large share of the market continues, with Sony, Microsoft and Nintendo at the forefront.
The 2014 FIFA World Cup made history when it opened in São Paulo this week, as a 29-year-old paraplegic man named Juliano Pinto kicked a soccer ball with the aid of a robotic exoskeleton. It was the first time a mind-controlled prosthetic was used at a sporting event, and it represented the culmination of months’ worth of planning and years’ worth of technical development.
The mind-controlled exoskeleton represents a breakthrough in restoring ambulatory ability to those who have suffered a loss of motion due to injury. The exoskeleton uses metal braces that were first tested on monkeys, along with a series of wireless electrodes attached to the head that collect brainwaves, which then signal the suit to move. The braces are also stabilized by gyroscopes and powered by a battery carried by the kicker in a backpack.


The study was published online late last month in Lab on a Chip. The study’s senior author, Ali Khademhosseini – PhD, biomedical engineer, and director of the BWH Biomaterials Innovation Research Center – explained the challenge and their goal as follows:
They were also able to successfully embed these functional and perfusable microchannels inside a wide range of commonly used hydrogels, such as methacrylated gelatin or polyethylene glycol-based hydrogels. In the former case, the cell-laden gelatin was used to show how their fabricated vascular networks functioned to improve mass transport, cellular viability and cellular differentiation. Moreover, successful formation of endothelial monolayers within the fabricated channels was achieved.

The OPALS system sought out and locked onto a laser beacon from the Optical Communications Telescope Laboratory ground station at the Table Mountain Observatory in Wrightwood, California. It then transmitted its own 2.5-watt, 1,550-nanometer laser and modulated it to send the video at a peak rate of 50 megabits per second. According to NASA, OPALS transmitted the video in 3.5 seconds instead of the 10 minutes that conventional radio would have required.
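The figures NASA quotes for the OPALS downlink can be sanity-checked with a little arithmetic. The sketch below (assuming only the numbers stated above: a 50 megabit-per-second peak laser rate, a 3.5-second optical transfer, and roughly 10 minutes over conventional radio) works backward to the implied video size and radio data rate:

```python
# Back-of-envelope check of the quoted OPALS figures (laser vs. radio).
# All inputs are the numbers stated in the article, not measured values.

laser_rate_mbps = 50.0   # peak optical downlink rate, in Mbit/s
laser_time_s = 3.5       # quoted transfer time over the laser link
radio_time_s = 10 * 60   # quoted transfer time over conventional radio

# Size of the video implied by the laser rate and transfer time
video_mbit = laser_rate_mbps * laser_time_s
video_mbyte = video_mbit / 8

# Radio data rate implied by sending the same video in ~10 minutes
radio_rate_kbps = video_mbit / radio_time_s * 1000

print(f"Implied video size: {video_mbit:.0f} Mbit (~{video_mbyte:.0f} MB)")
print(f"Implied radio rate: {radio_rate_kbps:.0f} kbit/s")
print(f"Speed-up: {radio_time_s / laser_time_s:.0f}x")
```

The numbers hang together: a ~22 MB video at a few hundred kilobits per second over radio would indeed take on the order of ten minutes, making the laser link roughly 170 times faster.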

The Optionally Piloted Black Hawk (OPBH) operates under Sikorsky’s Manned/Unmanned Resupply Aerial Lifter (MURAL) program, which couples the company’s advanced Matrix aviation software with its man-portable Ground Control Station (GCS) technology. Matrix, introduced a year ago, gives rotary and fixed-wing vertical take-off and landing (VTOL) aircraft a high level of system intelligence to complete missions with little human oversight.
The Optionally Piloted Black Hawk fits into the larger trend of the military finding technological ways of reducing troop numbers. While it can be controlled from a ground control station, it can also make crucial flying decisions without any human input, relying solely on its ‘Matrix’ proprietary artificial intelligence technology. Under the guidance of these systems, it can fly a fully autonomous cargo mission and can operate both ways: unmanned or piloted by a human.
Military aircraft have grown increasingly complex over the past few decades, and automated systems have also evolved to the point that some aircraft can’t be flown without them. However, the complex controls and interfaces require intensive training to master and can still overwhelm even experienced flight crews in emergency situations. In addition, many aircraft, especially older ones, require large crews to handle the workload.
DARPA says that it wants ALIAS not only to be capable of executing a complete mission from takeoff to landing, but also of handling emergencies. It would do this through the use of autonomous capabilities that can be programmed for particular missions, as well as constant monitoring of the aircraft’s systems. But according to DARPA, the development of the ALIAS system will require advances in three key areas.