Cyberwars: Stuxnet and Cryptolocker

It’s been quite the year for cybercops, cybercriminals, and all those of us who are caught in between. Between viruses that continue to evolve and viruses that target sensitive information in new ways, it seems clear that the information age is fraught with peril. In addition to the cyberwars raging between nations, there is also the danger of guerrilla warfare and of digital weapons running amok.

Consider the Stuxnet virus, a piece of programming that made headlines last year by sabotaging the Iranian nuclear enrichment program. At the time, the target – not to mention its source (within the US) – seemed all too convenient to have been unintentional. However, this year, Stuxnet is once again garnering attention thanks to its latest target: the International Space Station.

Apparently, this has been the result of the virus having gone rogue, or at least having become too big for its creators to control. In addition to the ISS, the latest reports state that Stuxnet is hitting nuclear plants in countries for which the virus was not originally intended. In one case, the virus even managed to infect an internal network at a Russian power plant that wasn’t connected to the internet at all.

According to Eugene Kaspersky, the famed head of Kaspersky Lab, the virus can travel through methods other than internet connectivity, such as via optical media or a USB drive. Kaspersky claims that this is how it made its way aboard the ISS, and that it was brought aboard on more than one occasion through infected USB drives.

For the moment, it is unclear how this virus will be dealt with, or whether it will continue to grow beyond any single organization’s ability to control it. All that is clear at this point is that this particular virus is no longer answering to its original handlers. For the time being, various nations and multinational corporations are looking to harden their databases and infrastructure against cyber attack, with Stuxnet in mind.

And they are not the only ones who need to be on their guard against intrusion. Average consumers are also at risk of having their data accessed by an unwanted digital visitor, one that goes by the name of Cryptolocker. Designed with aggressive salesmanship – and blackmail – in mind, this virus is bringing fears about personal information being accessed to new heights.

Basically, Cryptolocker works by finding people’s most important and sensitive files and selling them back to them. After locating the files it wants, it contacts a remote server to create a 2048-bit key pair, encrypts the files so they cannot be recovered, and then presents the owner with an ultimatum: pay up, or the key needed to unlock the files will be destroyed.
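To make that mechanism a little more concrete, here is a minimal sketch in Python (using the third-party cryptography package) of the general public-key principle at work – textbook RSA encryption, not CryptoLocker’s actual code. The point is simply that a machine doing the encrypting only ever needs the public half of the 2048-bit key pair, so nothing stored on it can reverse the process:

```python
# A minimal sketch of asymmetric (public-key) encryption: anyone holding
# the public key can encrypt, but only the private key -- which in the
# ransomware scenario never leaves the remote server -- can decrypt.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# The key pair is generated remotely; only the public half is handed out.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

secret = b"a small piece of data, e.g. the key that scrambles the files"
ciphertext = public_key.encrypt(
    secret,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None),
)

# Only the holder of private_key can get the plaintext back.
recovered = private_key.decrypt(
    ciphertext,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None),
)
assert recovered == secret
```

That one-way property is the entire basis of the ransom: the private key sits on the criminals’ server, and holding onto the scrambled files in the hope that the server is eventually seized (more on that below) is the only alternative to paying.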

When the virus first emerged in October of this year, victims were given three days to cough up roughly $200 via Bitcoin or a MoneyPak currency transfer. If the virus’ authors did not receive payment within 72 hours, they said, a single line would be deleted from a text file on some hidden foreign server, forever erasing the only string of numbers that could ever bring the affected files back from the dead.

Some users responded by simply setting their system’s internal clock back. A temporary measure, to be sure, but one which worked by tricking the virus into thinking the deadline had not expired. In addition, the three-day deadline worked against the virus’s makers, since it proved restrictive for the types of people who most often contract a virus like this – i.e. senior citizens and people working on corporate networks.

Such people are more vulnerable to such scams, but seldom have the computer savvy to set up Bitcoin or other such accounts and transfer the money in time. Meanwhile, infecting a corporate server means that a bloated corporate bureaucracy will be responsible for deciding whether or not to pay, not an individual who can decide quickly.

So basically, the designers of Cryptolocker were facing a catch-22. They could not extend the deadline on the virus without diminishing the sense of panic that makes many people pay, but they would continue to lose money as long as people couldn’t pay in time. Their solution: if a victim does not pay up in time, the hackers simply raise the ransom – by a factor of 10!

This gives people more time to mull over the loss of sensitive data and make a decision, but by that point – should they decide to pay up – the price tag has ballooned to $2000. Luckily, this has revealed a crucial bluff in the virus’s workings, by showing that the keys to the encrypted files are in fact not deleted after the three-day time limit.

As such, the security industry is encouraging people to hold on to the useless, encrypted files and wait for the criminal servers to someday be seized by the authorities. Since any ransom paid is a de facto encouragement for hackers to write a similar virus again – or indeed to re-infect the same companies twice – people are currently being told to simply hold out and not pay up.

What’s more, regular backups are the key to protecting your data from viruses like Cryptolocker. Backing up to off-network machines that do not auto-sync will minimize the virus’ potential for damage. The best defense is even simpler: Cryptolocker infects computers via a bogus email attachment disguised as a PDF file, so basic email safety should keep you immune.
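For the tinkerers, here is one very simple way to automate that advice – a minimal sketch, assuming an external drive mounted at a made-up path, rather than a polished backup tool:

```python
# A bare-bones offline backup: copy the source folder into a new
# timestamped snapshot on an external drive that is only plugged in
# while the script runs. Paths here are illustrative assumptions.
import shutil
from datetime import datetime
from pathlib import Path

SOURCE = Path.home() / "Documents"          # what you want to protect
BACKUP_ROOT = Path("/mnt/offline-backup")   # an external, normally unplugged drive

def snapshot() -> Path:
    """Copy SOURCE into a fresh timestamped folder on the backup drive."""
    stamp = datetime.now().strftime("%Y-%m-%d_%H%M%S")
    destination = BACKUP_ROOT / f"documents_{stamp}"
    shutil.copytree(SOURCE, destination)
    return destination

if __name__ == "__main__":
    print(f"Snapshot written to {snapshot()}")
```

Because every run lands in its own timestamped folder and the drive stays unplugged (and un-synced) between runs, an infection that scrambles today’s files has no way of reaching into yesterday’s snapshot.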

Alas, it’s a world of digital warfare, and there are no discernible sides. Just millions of perpetrators, dozens of authorities, and billions of people fearing for the safety and integrity of their data. One can only wonder what an age of quantum computers, graphene and nanotube processors will bring. But more on that later!

Sources: extremetech.com, (2), fastcoexist.com

TBBT’s “Friendship Algorithm”

Recall that hilarious episode of The Big Bang Theory where Sheldon designed the friendship algorithm? Well, like much of what the show does, the hilarity comes with its share of educational value. In fact, half of what makes the show so funny is the way it weaves scientific fact into the story and nerds out on it! For those who actually get it, it’s doubly entertaining.

In this case, Sheldon’s characteristic appraisal of his situation reflected something very real and relatable about algorithms. Essentially, they are step-by-step procedures designed to solve problems. While they pertain to calculation, data processing, and automated reasoning, the concept is something we are already intimately familiar with.

Literally everyone uses algorithms in everyday decision making: thinking steps out in advance, weighing the relevant considerations, coming up with alternate plans, and working toward a desired outcome. Treating it like a computer program, as Sheldon does below, is just an excessively nerdy way of going about it! Enjoy the video recap:
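And for anyone who would rather read the joke as actual code, here is a rough Python rendering of Sheldon’s flowchart – my own paraphrase of the episode rather than an exact transcript, with Howard’s loop-counter fix included:

```python
# A tongue-in-cheek "friendship algorithm": ask yes/no questions and act
# on the answers. Howard's fix -- a loop counter with an escape to the
# "least objectionable activity" -- is what keeps it from running forever.
def friendship_algorithm(ask, max_attempts=3):
    """`ask` is any function that answers a yes/no question with True/False."""
    if ask("Would you like to share a meal?"):
        return "Share a meal"
    for attempt in range(max_attempts):               # Howard's loop counter
        for interest in ("tea", "coffee", "a hot beverage"):
            if ask(f"Would you like {interest}?"):
                return f"Partake in {interest} together"
    return "Partake in least objectionable activity"  # escape the infinite loop

# Example run with canned answers:
answers = iter([False, False, False, True])
print(friendship_algorithm(lambda question: next(answers)))
# -> "Partake in a hot beverage together"
```

The loop counter is the real punchline: without it, the algorithm politely asks about hot beverages forever.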

The Future of Computing: Graphene Chips and Transistors

The basic law of computer evolution, known as Moore’s Law, holds that roughly every two years the number of transistors on a computer chip will double. In practice, this means that every couple of years computing power doubles, effectively making the previous generation of technology obsolete. The period is often quoted as 18 months or less, as the rate of increase itself seems to be increasing.
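A quick back-of-the-envelope check shows just how dramatic that doubling is. The numbers below are my own rough figures (Intel’s 4004, the first single-chip microprocessor, shipped in 1971 with about 2,300 transistors), not anything from the industry roadmap itself:

```python
# Moore's Law as simple arithmetic: start from the Intel 4004 of 1971
# and double the transistor count every two years.
transistors_1971 = 2_300
doublings = (2013 - 1971) / 2            # 21 doublings over 42 years
projected_2013 = transistors_1971 * 2 ** doublings
print(f"{projected_2013:,.0f} transistors")   # ~4.8 billion
```

Twenty-one doublings later you land in the ballpark of five billion transistors, roughly where the biggest CPUs and GPUs of 2013 actually sit, which is why even a small change in the doubling period matters so much.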

This explosion in computing power is due to ongoing improvements in the field of miniaturization. As the component pieces get smaller and smaller, engineers are able to cram more and more of them onto chips of the same size. However, it does make one wonder just how far it will all go. Certainly there is a limit to how small things can get before they cease working.

According to the International Technology Roadmap for Semiconductors (ITRS), a standard established by the industry’s top experts, that limit will be reached in 2015. By then, engineers will have reached the 22-nanometer threshold, below which the copper wiring that currently connects the billions of transistors in a modern CPU or GPU becomes unworkable due to resistance and other mechanical issues.

However, recent revelations about the material known as graphene show that it is not hampered by the same mechanical restrictions. As such, it could theoretically be scaled down to features just a few nanometers across, allowing for the creation of computer chips that are orders of magnitude more dense and powerful while consuming less energy.

Back in 2011, IBM built what it called the first graphene integrated circuit, but in truth, only some of the transistors and inductors were made of graphene, while other standard components (like copper wiring) were still employed. But now, a team at the University of California, Santa Barbara (UCSB) has proposed the first all-graphene chip, where the transistors and interconnects are monolithically patterned on a single sheet of graphene.

In their research paper, “Proposal for all-graphene monolithic logic circuits,” the UCSB researchers say that:

[D]evices and interconnects can be built using the ‘same starting material’ — graphene… all-graphene circuits can surpass the static performances of the 22nm complementary metal-oxide-semiconductor devices.

To build an all-graphene IC (pictured here), the researchers propose exploiting one of graphene’s more interesting qualities: depending on the width of a graphene ribbon, it behaves in different ways. Narrow ribbons of graphene are semiconducting, ideal for making transistors, while wider ribbons are metallic, ideal for gates and interconnects.

For now, the UCSB team’s design is simply a computer model that should technically work, but which hasn’t been built yet. In theory, though, with the worldwide efforts to improve high-quality graphene production and patterning, it should only be a few years before an all-graphene integrated circuit is built. As for full-scale commercial production, that is likely to take a decade or so.

When that happens, though, another explosive period of growth in computing speed, coupled with lower power consumption, is to be expected. From there, subsequent leaps are likely to involve carbon nanotube components, true quantum computing, and perhaps even biotechnological circuits. Oh the places it will all go!

Source: extremetech.com

The Future is Here: Carbon Nanotube Computers

Silicon Valley is undergoing a major shift, one which may require it to rethink its name. This is thanks in no small part to the efforts of a team based at Stanford that is seeking to create the first basic computer built around carbon nanotubes rather than silicon chips. In addition to changing how computers are built, this is likely to improve their efficiency and performance.

What’s more, this change may deal a serious blow to the law of computing known as Moore’s Law. For decades now, the exponential acceleration of technology – which has taken us from room-size computers run by punched paper cards to handheld devices with far more computing power – has depended on the ability to place more and more transistors onto an individual chip.

The result of this ongoing trend in miniaturization has been devices that are becoming smaller, more powerful, and cheaper. The law used to describe this – though “basic rule” would be a more apt description – states that the number of transistors on a chip has been doubling every 18 months or so since the dawn of the information age. This is what is known as “Moore’s Law.”

However, this trend could be coming to an end, mainly because it’s becoming increasingly difficult, expensive and inefficient to keep jamming more tiny transistors onto a chip. In addition, there are the inevitable physical limitations involved, as miniaturization can only go on for so long before it becomes unfeasible.

Carbon nanotubes, which are long chains of carbon atoms thousands of times thinner than a human hair, have the potential to be more energy-efficient than silicon and to outperform computers made with silicon components. Using a technique that involved “burning off” imperfect nanotubes and routing around misaligned ones with an algorithm, the team built a very basic computer with 178 transistors that can do tasks like counting and number sorting.

In a recent release from the university, Stanford professor Subhasish Mitra said:

People have been talking about a new era of carbon nanotube electronics moving beyond silicon. But there have been few demonstrations of complete digital systems using this exciting technology. Here is the proof.

Naturally, this computer is more of a proof of concept than a working prototype. There are still a number of problems with the idea, such as the fact that nanotubes don’t always grow in straight lines and cannot always “switch off” like a regular transistor. The Stanford team’s computer also has limited power due to the limited facilities they had to work with, which did not include access to industrial fabrication tools.

All told, their computer is only about as powerful as the Intel 4004, the first single-chip silicon microprocessor, which was released in 1971. But given time, we can expect more sophisticated designs to emerge, especially if design teams gain access to top-of-the-line fabrication facilities to build their prototypes.

And this research team is hardly alone in this regard. Last year, computing giant IBM managed to create its own transistors using carbon nanotubes and likewise found that they outperformed transistors made of silicon. What’s more, these transistors measured less than ten nanometers across and were able to operate at very low voltage.

Similarly, a research team from Northwestern University in Evanston, Illinois created a logic gate – the fundamental circuit that all integrated circuits are based on – using carbon nanotubes to make transistors that operate in a CMOS-like architecture. And much like IBM’s and the Stanford team’s transistors, it functioned at very low power levels.
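Why does a single working logic gate matter so much? Because one “universal” gate is enough to build every other digital circuit. Here is a tiny illustration in plain Python – nothing nanotube-specific, just the composition idea:

```python
# NAND is a "universal" gate: every other logic function can be built
# from it, which is why one reliable gate -- silicon, nanotube, or
# transcriptor -- is the seed of an entire integrated circuit.
def NAND(a: bool, b: bool) -> bool:
    return not (a and b)

def NOT(a):    return NAND(a, a)
def AND(a, b): return NOT(NAND(a, b))
def OR(a, b):  return NAND(NOT(a), NOT(b))
def XOR(a, b): return AND(OR(a, b), NAND(a, b))

def half_adder(a, b):
    """Adds two bits: the basic building block of binary arithmetic."""
    return XOR(a, b), AND(a, b)   # (sum, carry)

print(half_adder(True, True))     # -> (False, True), i.e. 1 + 1 = 10 in binary
```

Chain enough of those half adders together and you have the arithmetic unit of a CPU, which is why demonstrating one dependable gate in a new material is such a big deal.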

What this demonstrated is that carbon nanotube transistors and other computer components are not only feasible, but are able to outperform transistors many times their size while using a fraction of the power. Hence, it is probably only a matter of time before a fully-functional computer is built – using carbon nanotube components – that will supersede silicon systems and throw Moore’s Law out the window.

Sources: news.cnet.com, (2), fastcolabs.com

Judgement Day Update: The Human Brain Project

Biomimetics is one of the fastest-growing areas of technology today, one which seeks to develop technology capable of imitating biology. The purpose of this, in addition to creating machinery that can be merged with our physiology, is to arrive at a computing architecture that is as complex and sophisticated as the human brain.

While this might sound the slightest bit anthropocentric, it is important to remember that despite their processing power, systems like the D-Wave Two quantum annealer, IBM’s Blue Gene/Q Sequoia supercomputer, and MIT’s ConceptNet 4 knowledge base have all shown themselves to be lacking when it comes to common sense and abstract reasoning. Simply pouring raw computing power into the mix does not make for autonomous intelligence.

As a result of this, new steps are being taken to create a computer that can mimic the very organ that gives humanity these abilities – the human brain. In what is surely the most ambitious step towards this goal to date, an international group of researchers recently announced the formation of the Human Brain Project. Having secured the $1.6 billion they need to fund their efforts, these researchers will spend the next ten years conducting research that cuts across multiple disciplines.

This will involve mapping out the vast network known as the human brain – a network of roughly a hundred billion neurons and the trillions of connections between them that are the source of emotions, abstract thought, and this thing we know as consciousness. And to do so, the researchers will be using a progressively scaled-up, multilayered simulation running on a supercomputer.

In keeping with this bold plan, the team itself is made up of over 200 scientists from 80 different research institutions around the world. Based in Lausanne, Switzerland, the initiative is being put forth by the European Commission, and has even been compared to the Large Hadron Collider in terms of scope and ambition. In fact, some have taken to calling it the “CERN for the brain.”

According to scientists working on the project, the HBP will attempt to reconstruct the human brain piece by piece and gradually bring these cognitive components into the overarching supercomputer. The expected result of this research will be new platforms for “neuromorphic computing” and “neurorobotics,” allowing for the creation of computing and robotic architectures that mimic the functions of the human brain.

According to a statement released by the HBP, Swedish Nobel Laureate Torsten Wiesel had this to say about the project:

The support of the HBP is a critical step taken by the EC to make possible major advances in our understanding of how the brain works. HBP will be a driving force to develop new and still more powerful computers to handle the massive accumulation of new information about the brain, while the neuroscientists are ready to use these new tools in their laboratories. This cooperation should lead to new concepts and a deeper understanding of the brain, the most complex and intricate creation on earth.

Other distinguished individuals quoted in the release include President Shimon Peres of Israel; Paul G. Allen, founder of the Allen Institute for Brain Science; Patrick Aebischer, President of EPFL in Switzerland; and Harald Kainz, Rector of Graz University of Technology in Graz, Austria; as well as a slew of other politicians and academics.

Combined with the work of other research institutions that are producing computer chips and processors modelled on the human brain, and with our growing understanding of the human connectome, I think it would be safe to say that by the time the HBP wraps up, we are likely to see processors capable of demonstrating intelligence, not just in terms of processing speed and memory, but in terms of basic reasoning as well.

At that point, we really ought to consider instituting Asimov’s Three Laws of Robotics! Otherwise, things could get apocalyptic on our asses! 😉


Sources: io9.com, humanbrainproject.eu, documents.epfl.ch

Alan Turing Pardoned… Finally!

When it comes to the history of computing, cryptography and mathematics, few people have earned more renown and respect than Alan Turing. In addition to helping the Allied forces of World War II break the Enigma code, a feat widely credited with helping to turn the tide of the war in Europe, he also played an important role in the development of computers with his “Turing Machine” and devised the Turing Test – a basic benchmark of intelligence for any future AI.
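Since the “Turing Machine” gets name-dropped so often, it is worth seeing how little it actually takes to build one: a tape, a read/write head, and a table of rules. Here is a toy version in Python, using a made-up rule set of my own that simply flips the bits of a binary string:

```python
# A toy Turing machine: a tape, a read/write head, and a rule table.
# This rule set inverts a string of bits, then halts at the blank cell.
def run_turing_machine(tape, rules, state="start"):
    tape = list(tape) + ["_"]          # "_" marks the blank end of the tape
    head = 0
    while state != "halt":
        symbol = tape[head]
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape).rstrip("_")

# Rule table: (state, symbol read) -> (symbol to write, head move, next state)
invert_rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine("10110", invert_rules))   # -> 01001
```

Swap in a different rule table and the same loop can, in principle, compute anything a modern computer can – which is exactly the insight that made Turing’s 1936 paper the foundation of computer science.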

Despite these accomplishments, Alan Turing became the target of government persecution when it was revealed in 1952 that he was gay. At the time, homosexuality was illegal in the United Kingdom, and Alan Turing was charged with “gross indecency” and given the choice between prison and chemical castration. He chose the latter, and after two years of enduring the effects of the drug, he ate an apple laced with cyanide and died.

Officially ruled a suicide, though some have suggested that foul play may have been involved, Turing’s death came at the tender age of 41. Despite his lifelong accomplishments and the fact that he helped to save Britain from a Nazi invasion, he was destroyed by his own government for the simple crime of being gay.

But in a recent landmark decision, the British government indicated that it would support a backbench bill that would posthumously clear his name of all charges. This is not the first time the subject of Turing’s sentencing has been visited by the British Parliament. Though for years it resisted offering an official pardon, then-Prime Minister Gordon Brown did offer an apology in 2009 for the “appalling” treatment Turing received.

However, it was not until now that the government sought to wipe the slate clean and begin to redress the issue, starting with the ruling that ruined the man’s life. The decision came on Friday, when Lord Ahmad of Wimbledon, a government whip, told peers that the government would table the third reading of the Alan Turing bill at the end of October if no amendments are made.

Every year since 1966, the Turing Award – the computing world’s highest honor and its equivalent of the Nobel Prize – has been given by the Association for Computing Machinery for technical or theoretical contributions to the computing community. In addition, on 23 June 1998 – what would have been Turing’s 86th birthday – an English Heritage blue plaque was unveiled at his birthplace in Warrington Crescent, London.

In addition, in 1994, a stretch of the A6010 road – the Manchester city intermediate ring road – was named “Alan Turing Way”, and a bridge carrying the road was named “Alan Turing Bridge”. A statue of Turing was also unveiled in Manchester in 2001, in Sackville Park, between the University of Manchester building on Whitworth Street and the Canal Street gay village.

This memorial statue depicts the “father of Computer Science” sitting on a bench at a central position in the park holding an apple. The cast bronze bench carries in relief the text ‘Alan Mathison Turing 1912–1954’, and the motto ‘Founder of Computer Science’ as it would appear if encoded by an Enigma machine: ‘IEKYF ROMSI ADXUO KVKZC GUBJ’.

But perhaps the greatest and most creative tribute to Turing comes in the form of the statue of him that adorns Bletchley Park, the site of the UK’s main codebreaking operation during World War II. The 1.5-ton, life-size statue of Turing was unveiled on June 19th, 2007. Built from approximately half a million pieces of Welsh slate, it was sculpted by Stephen Kettle and commissioned by the late American billionaire Sidney Frank.

Last year, Turing was also commemorated with a Google doodle in honor of what would have been his 100th birthday. In a fitting tribute to Turing’s code-breaking work, the doodle was designed as an interactive puzzle that spells out the name Google in binary. Unlike previous tributes produced by Google, this one was remarkably complicated. Those who attempted to figure it out apparently had to consult the online source Mashable just to realize what the purpose of it was.
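For the curious, here is what “Google” looks like in ordinary 8-bit ASCII binary – the same general idea the doodle played with, though I can’t vouch for the exact encoding its puzzles used:

```python
# "Google" rendered as standard 8-bit ASCII binary.
word = "Google"
binary = " ".join(format(ord(ch), "08b") for ch in word)
print(binary)
# 01000111 01101111 01101111 01100111 01101100 01100101
```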


For many, this news is seen as a development that has been too long in coming. Much like Canada’s own admission of wrongdoing in the case of Residential Schools, or the Church’s persecution of Galileo, it seems that some institutions are very slow to acknowledge that mistakes were made and injustices committed. No doubt, anyone in a position of power and authority is afraid to admit to wrongdoing for fear that it will open the floodgates.

But as with all things having to do with history and criminal acts, people cannot be expected to move forward until accounts are settled. And for those who would say “get over it already!”, or similar statements which would place responsibility for moving forward on the victims, I would say “just admit you were wrong already!”

Rest in peace, Alan Turing, and may the homophobes who still refuse to admit they’re wrong find the wisdom and self-respect to learn and grow from their mistakes. Orson Scott Card, I’m looking in your direction!

Sources: news.cnet.com, guardian.co.uk

The Future is Here: Cellular Computers!

Computing has come a very long way in a relatively short space of time. Beginning with comparatively basic machines, which relied on arrangements of analogue circuits (such as capacitors and resistors), scientists were able to perform complex calculations, crack supposedly impenetrable cyphers, and even work out how and where to deploy counter-measures against incoming missiles. And as we all know, sometimes you have to look back to the fundamentals if you want to move any farther ahead.

And that’s precisely what researchers at MIT have done with their latest innovation: an analog computer that works inside a living cell! A massive step towards a future where machinery and biology are one and the same, these “cellular computers” were not only able to perform arithmetic, but also more complex operations like taking logarithms and square roots, and even doing power-law scaling.

This news comes on the heels of work by researchers at Stanford who were able to create a biological transistor inside a cell. Relying on DNA and RNA to create a “transcriptor,” the Stanford researchers were able to build a biological logic gate, all on the microscopic scale. When combined with the sorts of digital and analog circuits common to computing, this research could lead to powerful sensing and control platforms built on very small scales.

And like many recent innovations and developments within the world of computing and biotechnology, the possibilities this offers are startling and awesome. For one, all cells work with a certain biological clock, which regulates growth, circadian rhythms, aging, and numerous other biological processes. Thus far, the researchers in question have been hosting their biological computers in bacterial cells. But if they were to develop analogous circuits that operate in mammalian cells, these functions might be put to much better use.

What this means is that we could very well be seeing the beginning of biology that is enhanced and augmented by the addition of technology at the cellular level. And not in the sense of tiny machines or implants – things made of silicon and minerals that would regulate our blood flow, administer drugs or monitor our vitals. No, in this case, we would be talking about machines that are composed of self-regulating DNA and RNA and work in the same way our organic tissues do.

On top of that, we would be able to create things like flash drives and computation software from living tissue, cramming thousands of terabytes of information into a few cells’ worth of genetic material. Human beings would no longer need smartphones, PDAs or tablets, since they would be able to carry all the information they would ever need in their bodies. And the ability to do this could very well lead to the creation of AIs that are not built, but grown, making them virtually indistinguishable from humans.
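That storage claim is less far-fetched than it sounds. Here is a back-of-the-envelope check using nothing but textbook numbers (roughly 650 g/mol per DNA base pair and 2 bits per pair) – my own rough arithmetic, not a figure from the research:

```python
# Rough DNA storage density: how many terabytes fit in a gram of DNA?
AVOGADRO = 6.022e23
base_pairs_per_gram = AVOGADRO / 650        # ~650 g/mol per base pair
bits_per_gram = base_pairs_per_gram * 2     # 2 bits per base pair (A/C/G/T)
terabytes_per_gram = bits_per_gram / 8 / 1e12
print(f"{terabytes_per_gram:,.0f} TB per gram of DNA")   # ~200 million TB
```

At roughly 200 million terabytes per gram, even a microgram of DNA could in principle hold hundreds of terabytes, so “thousands of terabytes in a few cells’ worth of genetic material” is, if anything, on the conservative side.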

And you know what that means, don’t you? The line between biological and artificial would truly begin to dissolve, Voight-Kampff and genetic tests might have to become mandatory, and we could all be looking at robots that look something like this…

Man, the future is awesome and scary!

Sources: Extremetech.com, (2)

IBM Creates First Photonic Microchip

For many years, optical computing has been a subject of great interest for engineers and researchers. As opposed to the current crop of computers, which rely on the movement of electrons in and out of transistors to do logic, an optical computer relies on the movement of photons. Such a computer would confer obvious advantages, mainly in the realm of speed and bandwidth, since optical signals can carry far more data, with far less loss, than electrical signals moving through copper.

While the concept and technology are relatively straightforward, no one had been able to develop photonic components that were commercially viable. All that changed this past December, as IBM became the first company to integrate electrical and optical components on the same chip. As expected, when tested, this new chip was able to transmit data significantly faster than current state-of-the-art copper and optical networks.

But what was surprising was just how big the difference really was. Whereas current interconnects are generally measured in gigabits per second, IBM’s new chip is already capable of shuttling data around at terabits per second. In other words, over a thousand times faster than what we’re currently used to. And since it will be no great task or expense to replace the current generation of electrical interconnects with photonic ones, we could be seeing these chips working alongside our standard CPUs really soon!

This comes after a decade of research and an announcement made back in 2010, specifically that IBM Research was tackling the concept of silicon nanophotonics. And since they’ve proven they can create the chips commercially, they could be on the market within just a couple of years. This is certainly big news for supercomputing and the cloud, where limited bandwidth between servers is a major bottleneck for those with a need for speed!

Cool as this is, there are actually two key breakthroughs to boast about here. First, IBM has managed to build a monolithic silicon chip that integrates both electrical (transistors, capacitors, resistors) and optical (modulators, photodetectors, waveguides) components. Monolithic means that the entire chip is fabricated from a single crystal of silicon on a single production line, and the optical and electrical components are mixed up together to form an integrated circuit.

Second, and perhaps more importantly, IBM was able to manufacture these chips using the same standard process it uses to produce the CPUs for the Xbox 360, PS3, and Wii. This was not easy, according to internal sources, but it means the new chip requires no special production line, which will not only save money in the long run but make the conversion process that much cheaper and easier. From all outward indications, it seems that IBM spent most of the last two years trying to ensure that this aspect of the process would work.

Excited yet? Or perhaps concerned that this boost in speed will mean even more competition and the need to constantly upgrade? Well, given the history of computing and technological progress, both of these sentiments would be right on the money. On the one hand, this development may herald all kinds of changes and possibilities for research and development, with breakthroughs coming within days and weeks instead of years.

At the same time, it could mean that the rest of us will be even more hard-pressed to keep our software and hardware current, which can be frustrating as hell. As it stands, Moore’s Law holds that it takes between 18 months and two years for the number of transistors on a chip (and, roughly speaking, CPU performance) to double. Now imagine that dwindling to just a few weeks, and you’ve got a whole new ballgame!

Source: Extremetech.com