The Future is Here: The Happiness Blanket

It’s like something out of Huxley’s Brave New World: a blanket that monitors your brain activity and takes on a corresponding color to show just how relaxed you are. Yes, it might sound like a bizarre social experiment, but it is in fact part of a British Airways study to measure the effects of night-time travel between Heathrow and New York, a trip that takes flyers across multiple time zones.

Anyone who has ever done this knows that jet lag can be a real pain in the ass. And for frequent flyers, jet lag has a surprisingly powerful impact on internal clocks and circadian rhythms. Part of the problem arises from the fact that travelers are inside a metal and plastic cylinder that’s about as far from natural as possible, which poses difficulties for the psychologists and others tasked with improving passenger conditions.

Using the happiness blanket, British Airways is trying to tweak those conditions to make air travel more relaxing and better suited to adjusting to a new time zone. The blanket works by using a neurosensor-studded headband to measure brain waves and determine the user’s level of relaxation, while fiber optics woven into the material display this through color patterns. Red indicates minimum relaxation, and blue indicates maximum relaxation.
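
British Airways hasn’t published the blanket’s actual color logic, so here is only a minimal sketch of one plausible scheme: a hypothetical relaxation score in [0, 1] (which would come from the headband’s readings) is blended linearly from red to blue.

```python
def relaxation_to_color(level):
    """Map a relaxation score in [0, 1] to an (R, G, B) color:
    0.0 -> pure red (least relaxed), 1.0 -> pure blue (most relaxed),
    with a linear blend in between. The score itself would come from
    the headband's brainwave readings."""
    level = max(0.0, min(1.0, level))  # clamp noisy sensor readings
    return (round(255 * (1.0 - level)), 0, round(255 * level))
```

A mid-range score of 0.5 lands on purple, which matches how the fiber optics would shade between the two extremes.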

Naturally, there’s also a marketing angle at work here. In truth, there’s no need for the blankets to have a readout mechanism, but it is a nice way of illustrating to the public what’s going on. Using data gleaned from volunteer fliers, British Airways hopes to learn how to adjust the various factors of the cabin environment and routine – including lighting, mealtimes, menus, seating positions, and the types of films shown.

According to British Airways, the key to these adjustments is to provide passengers with the best sleep possible on long flights, which is one reason why the airline has introduced lie-flat seating for business class and above. The more relaxed passengers are, the fewer distractions their brains face while traveling across time zones, and the better their chance to adjust.

As Frank van der Post, British Airways’ managing director, brands and customer experience, said about the experiment:

Using technology like the British Airways ‘happiness blanket’ is another way for us to investigate how our customers’ relaxation and sleep is affected by everything on board, from the amount of light in the cabin, when they eat, to what in-flight entertainment they watch and their position in the seat.

I can smell an industry emerging. High-tech happiness monitoring. And with the growth in neurosensors and EEG headsets, it was really just a matter of time before someone got proactive and decided to mass produce them. I imagine other companies will begin following suit, perhaps to monitor their employees’ happiness, or to gauge customer response to commercials. It all sounds so deliciously quasi-fascist!

And be sure to check out the company’s promotional video:


Sources: gizmag.com, britishairways.com

The Future is Here: Brain Scanning for Pets!

Remember that scene in Disney/Pixar’s Up, where the old man and the little boy discover a dog who, thanks to a special collar, is able to talk to them? As it stands, that movie may have proven to be more prophetic than anyone would have thought. Thanks to improvements in wearable tech and affordable EEG monitors, it may finally be possible to read your dog’s mind and translate it into speech.

This is not the first case of commercial technology being used to monitor an animal’s habits. In recent years, wearable devices have been made available that can track the exercise, sleeping and eating patterns of a dog. But now, thanks to EEG devices like the “No More Woof”, it might be possible to track their thoughts, learning exactly what they think of that new couch, their new dry food, or the neighbor’s cat.

Tomas Mazzetti, the device’s inventor, came up with the idea after he got curious as to what would happen if he strapped an off-the-shelf EEG machine to his mother’s Australian terrier. The observations that followed inspired the launch of a new project for Mazzetti and his team of fellow creatives at the Nordic Society for Invention and Discovery.

This society – which represents a collaboration between the ad agency Studio Total and Swedish retailer MiCasa – has spawned a number of quirky products in the past. These include a rocking chair that charges your iPad, a weather forecasting lamp, and a levitating carpet for small-ish pets. No More Woof is the society’s latest work, and the team recently launched an Indiegogo campaign to raise more funding for research.

So far, Mazzetti and his team have been able to determine three baseline dog emotions to translate into speech: sleepiness, agitation, and curiosity. In time, they hope to be able to decipher hunger pangs as processed by a dog’s brain, and come up with appropriate verbalizations for all:

When the dog is sleepy, we translate to ‘I’m tired.’ And if they are really agitated, we can translate to ‘I’m excited!’ And the most active brainwave is when the dog sees a human face and tries to recognize that face. Then the brain is working overtime.
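
The translation step itself amounts to a lookup once the hard part (classifying EEG into a state) is done. A toy sketch, where the first two phrases come from the quote above and the “curious” line and all state labels are invented here:

```python
# Phrases for the three baseline states. The "sleepy" and "agitated"
# phrases are the ones quoted by the team; "curious" is invented here.
PHRASES = {
    "sleepy": "I'm tired.",
    "agitated": "I'm excited!",
    "curious": "What is that?",
}

def verbalize(state):
    """Map a classified dog brain state to a spoken phrase; states the
    device can't yet decode (like hunger) fall back to silence."""
    return PHRASES.get(state, "...")
```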

Mazzetti and the NSID are also working on finding cheaper EEG machines, after which they can fine-tune the software. They’ve run tests on roughly 20 dogs, and found that the EEG machine communicated better with short-haired pets. If NSID receives more funding, its researchers hope to have something for sale by March or April of next year.

But while Mazzetti’s primary goal is to produce something commercially viable for use with dogs, he’s also hopeful that other research institutions or retailers will pick up where NSID leaves off. For example, what thoughts could be translated if someone were to put a more sophisticated version of No More Woof on the head of a primate, or another highly intelligent mammal?

Looking even further afield, Mazzetti has suggested that such a device could work both ways, translating human speech into concepts that a dog (or other animal) could understand. As we all know, dogs are very good at learning verbal commands, but again, the idea of two-way communication offers possibilities to convey complex messages with other, more highly-intelligent animals.

Could it be possible someday to communicate with simians without the need for sign language, to commune openly with dolphins and orcas, or warn humpback whales about the impending dangers of whalers and deep-sea fishers? Perhaps, and it would certainly be to the benefit of all. Not only would we be able to get our mammalian brethren to better understand us, we might just learn something ourselves!

After all, the line that separates humanity from all other species is a rather fine one, and tends to blur the closer we inspect it. By being able to commune with other species in a way that circumvents “language barriers”, we might just learn that we have more in common than we think, and aren’t such a big, screaming deal after all.

And in the meantime, enjoy this video of the No More Woof in action:


And be sure to check out this clip from Up where Doug (the talking dog) is introduced, with hilarious results!

The Future is Creepy: Reading Consumers’ Brainwaves

Product marketing has always been a high-stakes game, where companies rely on psychology, competitive strategies, and well-honed ad campaigns to appeal to consumers’ instincts. This has never been an exact science, but it may soon be possible for advertisers to simply read your brainwaves to determine what you’re thinking and how much you’re willing to pay.

This past October, the German news site Spiegel Online profiled the provocative work of a Swiss neuroscientist and former sales consultant who is working on a method of measuring brain waves to determine how much a person would be willing to pay for a good or service. Known as “feel-good pricing” to marketing critics, the idea is already inspiring horror and intrigue.

The neuroscientist in question is Kai-Markus Müller, the head of Neuromarketing Labs, who has over 10 years of experience in neuroscience research. According to his test, Starbucks is not actually charging enough for its expensive coffee. In fact, it’s probably leaving profits on the table, because people would still buy the coffee even if it cost more.

To conduct this test, Müller targeted an area in the brain that lights up when things don’t really make sense. When test subjects were presented with the idea of paying 10 cents for coffee, their brains reacted unconsciously because the price seemed too cheap. A coffee for $8, on the other hand, produced a similar reaction because the price seemed too high.

One would think that this method alone would help to determine optimum pricing. To check, Müller then set up a coffee vending machine where people were allowed to set their own price. The two methods matched up, and revealed that people were willing to pay a higher price than what Starbucks actually charges. Somehow, paying less made people think they were selecting an inferior grade of product.
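
The Spiegel piece doesn’t spell out Müller’s math, but the logic of the test can be sketched with a toy model: both too-cheap and too-expensive prices provoke an “incongruity” response, and the best price is the one that provokes the least. The log-distance model and the notion of a single `expected_price` are assumptions made here for illustration, not Müller’s actual method.

```python
from math import log

def incongruity(price, expected_price):
    """Toy stand-in for the brain's 'this doesn't make sense' response:
    the reaction grows with relative distance from the expected price,
    so a 10-cent coffee and an $8 coffee both spike while a fair price
    stays flat."""
    return abs(log(price / expected_price))

def best_price(candidates, expected_price):
    """Pick the candidate price that provokes the weakest response."""
    return min(candidates, key=lambda p: incongruity(p, expected_price))
```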

Naturally, there are those who would be horrified by this idea, feeling that it represents the worst combination of Big Brother surveillance and invasive marketing. This is to be expected whenever “reading brainwaves” is concerned, dredging up images of a rampant consumer society where absolutely no privacy exists, even within the space of your own head.

On the other hand, Müller himself takes issue with the notion of the “transparent consumer”, claiming that “Everyone wins with this method”. As proof, he cited the numerous flops in the consumer economy in the Spiegel Online article. Apparently, roughly 80 percent of all new products disappear from shelves after a short time, mainly because the producers have misjudged the market’s desire for them or what consumers are willing to pay.

It’s all part of a nascent concept known as Neuromarketing, and it is set to hit the market in the coming years. One can expect that consumers will have things to say about it, and no doubt those feelings will come through whenever and wherever producers try to sell you something. Personally, I am reminded of what Orwell wrote in 1984:

“Always the eyes watching you and the voice enveloping you. Asleep or awake, working or eating, indoors or out of doors, in the bath or in bed — no escape. Nothing was your own except the few cubic centimetres inside your skull.”

And perhaps more appropriately, I’m also reminded of what Fry said about advertising in the Season 1 episode of Futurama entitled “A Fishful of Dollars”:

“Leela: Didn’t you have ads in the 21st century?

Fry: Well sure, but not in our dreams. Only on TV and radio, and in magazines, and movies, and at ball games… and on buses and milk cartons and t-shirts, and bananas and written on the sky. But not in dreams, no siree.”

Somehow, truth is always stranger than fiction!

Sources: fastcoexist.com, spiegel.de, neuromarketing-labs.com

The Future is Here: The “Attention Powered” Car

Driver inattention, tunnel vision, and distraction are all major causes of road accidents. And while the law has certainly attempted to remedy this situation by imposing penalties against driving while on the phone, or driving and texting, the problem remains a statistically relevant one. Luckily, Emotiv and the Royal Automobile Club of Western Australia have joined forces to come up with a brilliant – albeit slightly unusual – solution.

It’s known as the “Attention Powered Car”, an automobile that features a neuroheadset made by Emotiv, creator of a range of electroencephalography-based monitoring gear. Basically, the driver straps on the headset, which interfaces with custom software to read the driver’s brainwaves while driving. Any lapses in concentration are picked up by the headset and cause the vehicle to slow down to about 14 km/h (9 mph) as a way of alerting the driver.
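
The article gives only the limp speed (about 14 km/h); the attention scale, the cruise speed, and the threshold in this sketch are invented for illustration. The control rule itself is a simple governor:

```python
def target_speed(attention, cruise_kmh=100.0, limp_kmh=14.0, threshold=0.6):
    """Return the speed the car should hold for an attention score in [0, 1].
    At or above the threshold the car runs normally; any lapse drops it to
    a ~14 km/h crawl to jolt the driver's focus back to the road."""
    return cruise_kmh if attention >= threshold else limp_kmh
```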

In fact, the car – a Hyundai i40 – will only run at full capacity when it senses that drivers are giving their full attention to the task at hand. According to Pat Walker, RAC executive general manager:

The impact of inattention is now comparable to the number of deaths and serious injuries caused by speed and drink driving, which are all contributors to Western Australia consistently having the worst fatality rate of any Australian state. Nationally, it is estimated inattention was a factor in 46 percent of fatal crashes.

The prototype design is largely meant to bring attention to the issue of driver distraction, and also serve as a tool for investigating the problem further. Researchers have been using the car (on a track) to test how various tasks, such as switching radio stations or sending a text message, impact a driver’s attention. Factors measured include blink rate and duration, eye movement, and head tilts.

And while novel and pure science fiction gold, the concept is also long overdue. Given the improvements made in EEG headsets in recent years, as well as computerized vehicles, it was really just a matter of time before someone realized the potential for combining the technologies to create a safer driving experience that still relies on a human operator.

While robot cars may be just around the corner, I imagine most people would prefer to still be in control of their vehicle. Allowing for a neuroband-operated vehicle may be just the thing to marry increased safety with human control, avoiding the specter of that future dystopian cliché where robots handle our every need.

RAC WA has also produced a number of videos about the Attention Powered Car, including the one below. To check out others, simply click on this link and prepare to be impressed.


Sources: news.cnet.com, staging.forthebetter.com.au

The Future is Here: The Insight Neuroheadset

Portable EEG devices have come a long way in recent years. From their humble beginnings as large, wire-studded contraptions that cost upwards of $10,000, they have now reached the point where they are small, portable, and affordable. What’s more, they are capable of not only reading brainwaves and interpreting brain activity, but turning that activity into real-time commands and controls.

One such device is the Emotiv Insight, a neuroheadset that is being created with the help of a Kickstarter campaign and is now available for preorder. Designed by the same company that produced the EPOC, an earlier brain-computer interface (BCI) that was released in 2010, the Insight offers many improvements. Unlike its bulky predecessor, the new model is sleeker and lighter, uses five sensors instead of the EPOC’s fourteen, and can be linked to your smartphone.

In addition, the Insight uses a new type of hydrophilic polymer sensor that absorbs moisture from the environment. Whereas the EPOC’s sensors required that the user first apply saline solution to their scalp, no extra applied moisture is necessary with this latest model. This is a boon for people who plan on using it repeatedly and don’t want to moisten their head with goo every time they do.

The purpose behind the Insight and EPOC headsets is quite simple. According to Tan Le, the founder of Emotiv, the company’s long-term aim is to take a clinical system (the EEG) from the lab into the real world and to democratize brain research. As already noted, older EEG machines were prohibitively expensive for smaller labs and amateur scientists, which made it difficult to conduct brain research. Le and her colleagues hope to change that.

And it seems that they are destined to get their way. Coupled with similar devices from companies like Neurosky, the stage seems set for an age when brain monitoring and brain-computer interface research is truly affordable – costing just a few hundred dollars instead of $10,000 – allowing independent labs and skunkworks to bring their own ideas and research to the fore.

As of September 16th, when the Kickstarter campaign officially closed, Emotiv had surpassed its $1 million goal and raised a total of $1,643,117 for the device. Because of this, the company plans to upgrade the headset with a six-axis inertial sensor – to keep track of the user’s head movements, gait, tremor, gestures, etc. – a microSD card reader for added security, and a 3-axis magnetometer (i.e. a compass).

In some cases, these new brain-to-computer interfaces are making it possible for people with disabilities or debilitating illnesses to control robots and prosthetics that assist them with their activities and rehab therapy, or restore mobility. On a larger front, they are also being adapted for commercial use – gaming and interfacing with personal computers and devices – as well as potential medical science applications such as neurotherapy, neuromonitoring, and neurofeedback.

Much like a fitness tracker, these devices could let us know how we are sleeping, monitor our emotional state over time, and make recommendations based on comparative analyses. So in addition to there being a viable growth market in aiding people with disabilities, there is also the very real possibility that neuroheadsets will give people a new and exciting way to interface with their machinery and keep “mental records”.

Passthoughts are likely to replace passwords, people will be able to identify themselves with brain-activity records, and remote control will take on a whole new meaning! In addition, mental records could become part of our regular medical records, and could even be called upon as evidence when trying to demonstrate mental fitness or insanity at trials. Dick Wolf, call me already! I’m practically giving these ideas away!

And be sure to enjoy this video from Emotiv’s Kickstarter site:


Sources: fastcoexist.com, kickstarter.com

The World’s First Brain-to-Brain Interface!

It finally happened! It seems like only yesterday that I was talking about the limitations of brain-to-brain interfacing (BBI), and how it was still limited to taking place between rats, or between a human and a rat. Actually, it was two days ago, but the point remains. In spite of that, after only a few months of ongoing research, scientists have finally performed the first human-to-human interface.

Using a Skype connection, Rajesh Rao, who studies computational neuroscience at the University of Washington, successfully used his mind to control the hand of his colleague, Andrea Stocco. The experiment was conducted on Aug. 12th, less than a month after researchers at Harvard used a non-invasive technique and a thought to control the movement of a rat’s tail.

This operation was quite simple: In his laboratory, Rao put on a skull cap containing electrodes, which was connected to an electroencephalography (EEG) machine. These electrodes read his brainwaves and transmitted them across campus to Stocco who, seated in a separate lab, was equipped with a cap that was hooked up to a transcranial magnetic stimulation (TMS) machine.

This machine activated a magnetic stimulation coil that was integrated into the cap directly above Stocco’s left motor cortex, the part of the brain that controls movements of the hands. Back in Rao’s lab, he watched a screen displaying a video game, in which the player must tap the spacebar in order to shoot down a rocket; in Stocco’s lab, the computer was linked to that same game.

Instead of tapping the bar, however, Rao merely visualized himself doing so. The EEG detected the electrical impulse associated with that imagined movement, and proceeded to send a signal – via the Skype connection – to the TMS in Stocco’s lab. This caused the coil in Stocco’s cap to stimulate his left motor cortex, which in turn made his right hand move.

Given that his finger was already resting over the spacebar on his computer, this caused a cannon to fire in the game, successfully shooting down the rocket. Stocco compared the feeling to that of a nervous tic. And to ensure that there was no chance of any outside influence, the Skype feeds were not visible to each other, and Stocco wore noise-cancelling headphones and ear buds.
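
The one-way pipeline – decode an imagined key press, forward a trigger over the network, fire the stimulation on the far side – can be sketched as follows. Everything here is a stand-in: the "decoder" is a bare threshold (a real system classifies EEG band-power features), a local TCP socket stands in for the Skype link, and the TMS coil is modelled as a flag.

```python
import json
import socket
import threading

def detect_motor_imagery(signal_power, threshold=0.7):
    """Stand-in for the EEG decoder: report an imagined key press when
    the motor-cortex signal power crosses a threshold. (A real decoder
    classifies band-power features; this threshold is invented.)"""
    return signal_power >= threshold

def receiver(port, coil_fired, ready):
    """The Stocco side: wait for one trigger message and 'fire the TMS
    coil' (modelled here as setting an event)."""
    with socket.socket() as srv:
        srv.bind(("127.0.0.1", port))
        srv.listen(1)
        ready.set()
        conn, _ = srv.accept()
        with conn:
            msg = json.loads(conn.recv(1024).decode())
            if msg.get("event") == "imagined_press":
                coil_fired.set()

def sender(port, signal_power):
    """The Rao side: decode the EEG window and, on a detection, forward
    the trigger over the local TCP link (standing in for Skype)."""
    if detect_motor_imagery(signal_power):
        with socket.socket() as s:
            s.connect(("127.0.0.1", port))
            s.sendall(json.dumps({"event": "imagined_press"}).encode())

# Demo: a strong imagined-movement signal crosses the link, firing the coil.
coil_fired = threading.Event()
ready = threading.Event()
thread = threading.Thread(target=receiver, args=(50007, coil_fired, ready))
thread.start()
ready.wait()
sender(50007, signal_power=0.9)
thread.join()
```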

In the course of being interviewed, Rao was also quick to state that the technology couldn’t be used to read another person’s mind, or to make them do things without their willing participation. The researchers now hope to establish two-way communications between participants’ brains, as the video game experiment just utilized one-way communication.

Additionally, they would like to transmit more complex packets of information between brains, things beyond simple gestures. Ultimately, they hope that the technology could be used for things like allowing non-pilots to land planes in emergency situations, or letting disabled people transmit their needs to caregivers. And in time, the technology might even be upgraded to involve wireless implants.

One thing that should be emphasized here is the issue of consent. In this study, both men were willing participants, and it is certain that any future experimentation will involve people willingly accepting information back and forth. The same goes for commands, which theoretically could only occur between people willing to be linked to one another.

However, that doesn’t preclude the possibility that such links could one day be hacked, which would necessitate that anyone who chose to equip themselves with neural implants and uplinks also get their hands on protection and anti-hacking software. But that’s an issue for another age, and no doubt some future crime drama! Dick Wolf, you should be paying me for all the suggestions I’m giving you!

And of course, there’s a video of the experiment, courtesy of the University of Washington. Behold and be impressed, and maybe even a little afraid for the future:


Source: gizmag.com

The Future is Here: Painting with Thought

In 2007, when artist Heide Pfüetzner was diagnosed with Amyotrophic Lateral Sclerosis (Lou Gehrig’s disease), she considered it a “personal catastrophe”. Given the effects of ALS, which include widespread muscle atrophy that affects mobility, speaking, swallowing, and breathing, this is hardly surprising. And yet, just six years later, an exhibit of her paintings made its debut – all of them created by her mind and a computer.

Known as “Brain on Fire,” the exhibit took place on Easdale, a small island off the west coast of Scotland, this past July. Those who visited the exhibit were treated to a vibrant display of colorful digital paintings that she made using a computer program that lets her control digital brushes, shapes, and colors by concentrating on specific points on the screen.

Pfüetzner, a former English teacher from Leipzig, Germany, “brain paints” using software developed by the University of Wurzburg and German artist Adi Hösle, along with equipment from biomedical engineering firm Gtec. Thanks to the equipment and software, Pfüetzner is able to paint using two monitors and an electrode-laden electroencephalogram (EEG) cap without having to move her hands or leave her chair.

While one screen displays the program’s matrix of tools, another functions like a canvas, showing the picture as it evolves. Images of the various tools flash at different times, and Pfüetzner focuses on the tool she wants to select, causing her brain activity to spike. The computer determines which option she’s focusing on by comparing the timing of the brainwaves to the timing of the desired flashing tool.
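
The matching step can be sketched as a timing comparison. The real system relies on an evoked brain response (on the order of a few hundred milliseconds after the attended flash) and proper signal classification; the fixed lag, tolerance, and counting heuristic below are simplifying assumptions made here.

```python
def select_tool(flash_times, spike_times, lag=0.3, tol=0.05):
    """Guess which tool the artist is focusing on: for each tool, count
    the brain-response spikes that land about `lag` seconds after one of
    that tool's flashes, and return the tool with the most matches."""
    def matches(times):
        return sum(
            any(abs(spike - (t + lag)) <= tol for t in times)
            for spike in spike_times
        )
    return max(flash_times, key=lambda tool: matches(flash_times[tool]))
```

If the brush flashes at 0.0 s, 1.0 s and 2.0 s and spikes are recorded near 0.3 s, 1.3 s and 2.3 s, the brush wins over a tool whose flashes don’t line up with the responses.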

Relying on a Startnext crowdfunding campaign, Pfüetzner was able to raise the $6,500 she needed to hold the exhibit on Easdale. The money went toward printing and framing her work, and toward transporting her, her nursing team, and the medical equipment she needs to Easdale, where the exhibit ran until July 25th.

Pfüetzner admits that prior to becoming ill, she was not too fond of technical equipment and did not like working with computers. But since she became acquainted with the new technology, an EEG cap and brain-computer interface have become her everyday companions. Much like a canvas, brush and paint palette, “brain painting” has become second nature to her.

Between her Startnext page and interviews since her exhibit went public, Pfüetzner had the following to say about her work and the software that makes it possible:

For the first time, this project gives me the opportunity to show the world that the ALS has not been the end of my life… BCI is a pioneer-making technology which allows me to create art and therefore, reconnect to my old life.

For some time now, brain-computer interface (BCI) research has been pushing the realm of the possible, giving a man with locked-in syndrome the ability to tweet using eye movement, or a paraplegic woman the ability to control a robotic arm. And thanks to research teams like the one working at the University of Wurzburg’s labs, the range of BCI applications for the paralyzed is quickly expanding beyond text input and into the realm of visual art.

Though the life expectancy of an ALS patient averages about two to five years from the time of diagnosis, according to the ALS Association, some ALS patients, including physicist and cosmologist Stephen Hawking, have far outlived that prognosis. Given her obvious inspiration and passion, not to mention talent, I sincerely hope Pfüetzner has a long and productive career!

And be sure to enjoy this video from Heide Pfüetzner’s Startnext page. It contains a personal address in German (sadly, I couldn’t find an English translation), followed by members of the University of Wurzburg team explaining how “brain painting” works:


Sources: news.cnet.com, neurogadget.com, startnext.com
