The Economist : Cuckoo Clouds

Tuesday, May 3, 2011

IT COULD turn out to be the biggest breach of data privacy since the advent of the internet. Sony admitted this week that hackers had stolen personal information, possibly including credit-card details, of many of the 77m-plus users of its online-gaming and entertainment networks. The Japanese company did not admit the full extent of the potential risks to its customers until nearly a week after it had taken its PlayStation Network off air, though it insisted that it had done so as soon as it realised how serious the intrusion into its systems had been. Amazon, an American online retailer and provider of “cloud computing” services, has also suffered a lengthy breakdown at one of the giant server farms whose storage and processing facilities it rents to other companies. The two lapses, though unconnected and different in nature, have raised the question of whether customers can really trust the basic idea behind the cloud—that you can buy computing services from the internet, just like gas or water from a utility.

Sony’s security breach is particularly embarrassing because it wants to position its PlayStation console as an entertainment hub capable of delivering films and music over the internet, in addition to video games. An entertainment one-stop-shop of this nature will appeal to consumers only if it is secure and reliable; a DVD, after all, does not suddenly refuse to play for a week. Sony also failed to encrypt some of the personal details of its customers—an elementary error for a company that prides itself on its technological prowess. In Amazon’s case, the problems were caused by a glitch that took longer than expected to resolve, affecting the operations of several internet firms (including Reddit, Quora, HootSuite and Foursquare) that use its services, and denting the reputations of all concerned—as well as that of the cloud itself. But building a totally secure and reliable cloud-based system, or indeed any other kind of computer system, is impossible. More break-ins and breakdowns are inevitable. What matters is that service-providers, consumers and corporate clients all learn the right lessons from the events of the past week.

For providers of online services, the main lesson, beyond the obvious need to adhere to basic principles of computer security, is the importance of being open with customers when things go wrong. This seems to be something that is particularly difficult for Japanese firms, with their consensus-based decision-making and a reluctance to tell superiors when problems arise. Sony remained tight-lipped when it should have been forthcoming. Amazon has also been criticised for providing only a small amount of rather vague information about the outage. One user gave the company an “F” for communication this week; another complained that its updates seemed to have been written by its lawyers rather than its engineers. Consumers, meanwhile, should ensure that they do not use the same passwords on multiple online systems, which exposes them to the danger that a compromise in one system will enable the same credentials to be used to access another. Being able to manage passwords and spot “phishing” e-mails that try to trick recipients into revealing bank details and other information are now important life skills, like it or not. The lesson for companies let down by Amazon’s outage is that they need to be aware of the risks of being too reliant on a single supplier, with cloud computing as with anything else. Firms that use cloud-based systems should be looking at ways to distribute work across multiple providers. Although the cloud has many benefits and is generally quite reliable, it is clearly bound to produce the odd thunderstorm.
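
The danger of reusing one password everywhere can be seen in a toy sketch (all accounts and credentials below are invented for illustration): a credential dump from one breached service can simply be replayed against every other service where the same pair still works.

```python
# Toy illustration of "credential stuffing": credentials leaked from one
# breached service open every other account that shares the same password.
leaked = {("alice@example.com", "hunter2")}   # dump from the breached site

other_sites = {
    "mail":    {("alice@example.com", "hunter2")},     # reused password
    "banking": {("alice@example.com", "s3parate!")},   # unique password
}

for site, accounts in other_sites.items():
    compromised = accounts & leaked            # same e-mail AND same password
    print(site, "compromised" if compromised else "safe")
# → mail compromised
# → banking safe
```

The account with a unique password survives the breach untouched, which is the whole argument for never sharing credentials across systems.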

Reference :

Federated Identity Paper

Wednesday, March 30, 2011

Worth a read.

Reference :

… Virtualisation dates back to the age of mainframe computers. To make better use of them they were sometimes split into smaller “virtual machines”, each of which could run its own operating system and application. But the approach took off only in recent years, when VMware, a software firm, applied it to servers, the powerful computers that populate today’s corporate data centres. VMware and its main rivals, Citrix and Microsoft, have since developed all kinds of software tools to manage virtual machines—moving them between data centres, for example. The success of server virtualisation has inspired IT firms and their customers to do the same thing with other types of hardware, such as devices to store data. Software now pools their capacity and allocates “virtual disks” as needed. Going further, Dropbox, an online storage service, saves identical files only once. Even large files can take only seconds to upload if they already exist somewhere on one of these firms’ disks.
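
The deduplication trick attributed to Dropbox above can be sketched as content-addressed storage: each file is keyed by a hash of its contents, so identical content is stored once and a second "upload" just adds a pointer. (This class and its method names are a minimal illustration, not Dropbox's actual design.)

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: identical content is kept only once."""
    def __init__(self):
        self._blobs = {}   # content hash -> content (stored once)
        self._files = {}   # filename -> content hash (cheap pointer)

    def put(self, name, content: bytes):
        digest = hashlib.sha256(content).hexdigest()
        if digest not in self._blobs:      # genuinely new content: store it
            self._blobs[digest] = content
        self._files[name] = digest         # duplicates only add a pointer

    def get(self, name) -> bytes:
        return self._blobs[self._files[name]]

    def unique_blobs(self) -> int:
        return len(self._blobs)

store = DedupStore()
store.put("a.iso", b"big file contents")
store.put("copy-of-a.iso", b"big file contents")   # already stored: instant "upload"
store.put("b.txt", b"other contents")
print(store.unique_blobs())   # → 2 unique blobs backing 3 files
```

This is why a large file that already exists on the provider's disks can "upload" in seconds: only the hash needs to cross the network.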

Many company computers can already work with applications that run on a central server. But start-ups are pushing the concept further. Desktone offers virtual desktops as an online service. NComputing, a maker of computer terminals, virtualises PCs so they can be shared by up to 30 users. It has already sold more than 2.5m devices, mostly to developing countries and schools. And technology from MokaFive can send an entire virtual machine—complete with operating systems, applications and data—over the network and install it on any PC. Eventually people may no longer need to carry laptops at all. Virtual computers, including data and applications, will follow them everywhere. In the long run, smartphones and other mobile devices may also become shells to be filled as needed. Open Kernel Labs, a start-up in which Citrix has a stake, already lets smartphones run applications, multimedia and radio functions on a single processor, cutting manufacturing costs. Software from Citrix turns the iPad, Apple’s tablet computer, into a terminal for applications that run in a corporate data centre.

How quickly will virtualisation advance? Gartner, a market-research firm, predicts that the overall market for virtualisation software will grow from $2.7 billion this year to $6.3 billion in 2014. There is certainly no lack of demand. Virtualisation lowers costs by enabling firms to make better use of their servers and buy fewer new ones. The technology also allows PCs to be maintained remotely, which is much cheaper. But improved reliability and security are even more of an attraction. Users of MokaFive, for instance, can relaunch their virtual machine should a computer virus infect it. And it can be shut down if a laptop is lost or stolen. Yet the technology also has to overcome a few hurdles. The virtualisation of servers is well understood, but for PCs and mobile devices the technique has yet to mature. In the longer run institutional barriers will prove more of a problem, argues Simon Crosby, Citrix’s chief technology officer. Virtualising IT systems, he says, is only the first step to automating their management.  This is seen as a threat to existing workers and makes many IT departments hesitant to embrace the technology.
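
Gartner's forecast quoted above implies a steep compound growth rate. Assuming the jump from $2.7 billion to $6.3 billion spans three years of growth, the implied annual rate works out as follows:

```python
# Implied compound annual growth rate (CAGR) of the virtualisation-software
# market, from the Gartner figures quoted above ($2.7bn growing to $6.3bn
# in 2014, assumed here to span three years of growth).
start, end, years = 2.7, 6.3, 3
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")   # roughly 33% a year
```

That is, the market would need to grow by nearly a third every year to hit the 2014 figure.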

Still, analysts believe virtualisation will win out. Its impact will be felt throughout the industry. The technology not only makes IT systems more flexible, but allows firms to switch vendors more easily—which will weigh on the vendors’ profits. Big software firms such as Microsoft and Oracle may be hit hardest. But many hardware-makers may suffer as well, since their wares will become even more of a commodity than they are today… Moreover, virtualisation makes it much easier to add new servers or storage devices. Alternatively, firms can simply rent extra capacity from operators of what are called “computing clouds”, such as Amazon Web Services. That outfit has built a network of data centres in which virtual machines and disks can be launched in seconds. As a result, IT systems will increasingly no longer be a capital expense, but an operational cost, like electricity. Yet the most noticeable change for computer users will be that more employees will be allowed to bring their own PC or smartphone to work, says Brian Madden of TechTarget, a consultancy. Companies can install a secure virtual heart on private machines, doing away with the need for a separate corporate device. A “bring your own computer” or “BYOC” movement has already emerged in America. Companies such as Citrix and Kraft Foods pay their employees a stipend, which they can use to buy any PC they want—even an Apple Mac. Such innovations may help to ease growing tensions between workers and IT departments. New privacy regulations and rampant cybercrime are pushing firms to tighten control of company PCs and smartphones. At the same time more and more “digital natives” enter the workforce. They have grown up with the freewheeling internet and do not suffer boring black corporate laptops gladly. Giving workers more freedom while helping firms keep control may prove to be the biggest benefit of virtualisation.

Reference :

EU Privacy Maze

Monday, November 8, 2010

The patchwork of rules across Europe regarding the handling of data poses a hurdle for Microsoft’s efforts to provide cloud-based services, a senior Microsoft attorney said on Thursday.  Countries throughout the European Union have differing rules regarding data retention, privacy, consumer rights, cross-border data transactions and data ownership. This means that companies such as Microsoft may not be able to offer certain types of services due to restrictions on how data is moved or questions of law.  “What needs to be done is to bring a common set of rules and in a few cases maybe a revision or a new set of rules,” said John Vassallo, vice president for E.U. affairs for Microsoft, speaking on the sidelines of Microsoft’s Government Leaders Forum in London.

Countries that are part of the E.U. are bound by the European Commission’s directives, but their interpretation of those rules is often divergent.  For example, under the Data Retention Directive, providers of electronic communications services (ECSes) are required to maintain data such as records of e-mail recipients, for between six months and two years, for law enforcement purposes.  But when it comes to other data, E.U. countries differ on what constitutes an ECS.  Even if two countries agree on what an ECS is, they may differ on how long the provider needs to retain that data, posing more difficulties for companies.  Data sovereignty is also a concern.  For example, multiple states may have an interest in particular data, but could run into conflicting laws and regulations over which entity would have jurisdiction in case of a problem.  If a cloud service provider complies with a demand from law enforcement in one country, that might violate privacy regulations of a user in another jurisdiction.  That also makes it harder for cloud services companies to communicate to their customers under what conditions their data may be exposed.

“You must find a system that all countries at least within the E.U. at first and maybe beyond will agree to,” Vassallo said.  “These things don’t exist today.”  Vassallo said concepts that are being discussed include a “diplomatic immunity” for data, where communications would be treated with the same privilege as diplomats who carry paperwork in briefcases.  Another idea is a “data free zone,” or areas where there are harmonized rules for data transactions, similar to free trade zones.  A universal agreement for data would mean more transparency for consumers while also allowing for the growth of cloud services, which hold the promise of enabling businesses in turn to offer new services.  “The end result is it would be increasing the certainty to 500 million [E.U.] citizens that their rights are going to be treated equally,” Vassallo said.  But “the legislative system is slower than the technology development, and that is always the case,” he said.

Reference :

IT SOUNDS like the plot of an airport thriller or a James Bond film.  A crack team of experts, assembled by a shadowy government agency, develops a cyber-weapon designed to shut down a rogue country’s nuclear programme.  The software uses previously unknown tricks to worm its way into industrial control systems undetected, searching for a particular configuration that matches its target—at which point it wreaks havoc by reprogramming the system, closing valves and shutting down pipelines.  This is not fiction, but fact.  A new software “worm” called Stuxnet (its name is derived from keywords buried in the code) seems to have been developed to attack a specific nuclear facility in Iran.  Its sophistication suggests that it is the work of a well-financed team working for a government, rather than a group of rogue hackers trying to steal secrets or cause trouble.  America and Israel are the obvious suspects.  But Stuxnet’s origins and effects are unknown.

Stuxnet first came to light in June, when it was identified by VirusBlokAda, a security firm in Belarus.  The next month Siemens, a German industrial giant, warned customers that their “supervisory control and data acquisition” (SCADA) management systems, which control valves, pipelines and industrial equipment, were vulnerable to the worm.  It targets a piece of Siemens software, called WinCC, which runs on Microsoft Windows.  For security reasons SCADA systems are not usually connected to the internet.  But Stuxnet can spread via infected memory sticks plugged into a computer’s USB port.  Stuxnet checks to see if WinCC is running.  If it is, it tries to log in, to install a clandestine “back door” to the internet, and then to contact a server in Denmark or Malaysia for instructions.  (Analysis of traffic to these servers is continuing, and may offer the best chance of casting light on Stuxnet’s purpose and origins.)  If it cannot find WinCC, it tries to copy itself on to other USB devices.  It can also spread across local networks via shared folders and print spoolers.
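
The propagation behaviour described above — look for WinCC, dig in and phone home if it is present, otherwise keep spreading — amounts to a simple decision flow. The sketch below is a high-level paraphrase of that description only; every function and attribute name is illustrative, not taken from the worm's actual code.

```python
def next_actions(host: dict) -> list:
    """Illustrative decision flow for the propagation behaviour described
    in the article; all names here are hypothetical paraphrases."""
    if host.get("wincc_running"):
        # Target software found: log in, open a back door, await instructions.
        return ["log_in_to_wincc", "install_backdoor", "contact_command_server"]
    # No WinCC on this machine: keep spreading to reachable media and shares.
    return ["copy_to_usb_devices", "spread_via_shared_folders",
            "spread_via_print_spoolers"]

print(next_actions({"wincc_running": True})[0])    # → log_in_to_wincc
print(next_actions({"wincc_running": False})[0])   # → copy_to_usb_devices
```

The key point the flow captures is that infection and attack are separate stages: most infected machines are only used as stepping stones.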

Initially, Stuxnet seemed to be designed for industrial espionage or to allow hackers to blackmail companies by threatening to shut down vital systems.  But its unusual characteristics suggest another explanation.  WinCC is a rather obscure SCADA system.  Hackers hoping to target as many companies as possible would have focused on more popular systems.  And Stuxnet searches for a particular configuration of industrial equipment as it spreads.  It launches an attack only when it finds a match.  “The bad news is that the virus is targeting a specific process or plant,” says Wieland Simon of Siemens.  “The good news is that most industrial processes are not the target of the virus.” (Siemens says it knows of 15 plants around the world that were infected by Stuxnet, but their operations were unaffected as they were not the intended target.)

Another odd feature is that Stuxnet uses two compromised security certificates (stolen from firms in Taiwan) and a previously unknown security hole in Windows to launch itself automatically from a memory stick.  The use of such “zero-day vulnerabilities” by viruses is not unusual.  But Stuxnet can exploit four entirely different ones in order to worm its way into a system.  These holes are so valuable that hackers would not normally use four of them in a single attack.  Whoever created Stuxnet did just that to boost its chances.  They also had detailed knowledge of Siemens’s industrial-production processes and control systems, and access to the target plant’s blueprints.  In short, Stuxnet was the work neither of amateur hackers nor of cybercriminals, but of a well-financed team.  “Behind this virus there are experts,” says Mr Simon.  “They need money and know-how.”

So what was the target?  Microsoft said in August that Stuxnet had infected more than 45,000 computers.  Symantec, a computer-security firm, found that 60% of the infected machines were in Iran, 18% in Indonesia and 8% in India.  That could be a coincidence.  But if Stuxnet was aimed at Iran, one possible target is the Bushehr nuclear reactor.  This week Iranian officials confirmed that Stuxnet had infected computers at Bushehr, but said that no damage to major systems had been done.  Bushehr has been dogged by problems for years and its opening was recently delayed once again.  Given that history, the latest hitch may not have been Stuxnet’s work.  A more plausible target is Iran’s uranium-enrichment plant at Natanz.  Inspections by the International Atomic Energy Agency, the UN’s watchdog, have found that about half Iran’s centrifuges are idle and those that work are yielding little.  Some say a fall in the number of working centrifuges at Natanz in early 2009 is evidence of a successful Stuxnet attack.  Last year Scott Borg of the United States Cyber-Consequences Unit, a think-tank, said that Israel might prefer to mount a cyber-attack rather than a military strike on Iran’s nuclear facilities.  That could involve disrupting sensitive equipment such as centrifuges, he said, using malware introduced via infected memory sticks.  His observation now looks astonishingly prescient.  “Since the autumn of 2002, I have regularly predicted that this sort of cyber-attack tool would eventually be developed,” he says.  Israel certainly has the ability to create Stuxnet, he adds, and there is little downside to such an attack, because it would be virtually impossible to prove who did it.  So a tool like Stuxnet is “Israel’s obvious weapon of choice”.  Some have even noted keywords in Stuxnet’s code drawn from the Bible’s Book of Esther—in which the Jews fight back to foil a plot to exterminate them.

Reference :