FT: Server In The Sky
Saturday, March 28, 2009
A revolution that is sweeping through corporate data centres is the untold secret of modern business. The information factories of the digital age are going through a transition that is every bit as significant as the advent of the moving assembly line was to manufacturing nearly a century ago. Companies that master the new techniques of information processing, or make the right bets about when to hand over control of that operation to someone else with greater skills, stand to reap significant benefits in the form of lower costs and greater usability of their data. But this all comes with significant risks: not only of falling behind as information processing enters a new industrial-scale era but, conversely, of losing control of one of a business’s key assets – its information.

The shift is happening largely out of view. Guarded by the priesthood of the IT department and locked away from prying eyes for security reasons, data centres operate beyond the average business manager’s realm of consciousness. The places where the inner workings of business are conducted – where invoices are processed, transactions recorded and corporate secrets stored – are often taken for granted.

Every now and then, though, something happens to shine a light on this arcane world. Such a moment came last week, with signs of a realignment among some of the world’s leading technology concerns. Cisco Systems, the world’s biggest maker of networking equipment, said it would start selling servers, the back-room machines that are the workhorses of corporate computing, setting up a showdown with Hewlett-Packard and IBM. It also emerged that IBM was in the late stages of negotiations to buy Sun Microsystems, bringing together two of the biggest suppliers of servers and other technologies for data centres.

The maturing of the IT industry and a steep slide into recession provided the immediate impetus for these moves. But something else is at work.
After a technology era characterised by the rise of the PC, a new centralisation is taking place in computing and the biggest suppliers of technology are being forced to respond. A catchphrase has been coined to describe this new approach: “cloud computing”.

If the past quarter-century was characterised by a decentralisation of computing, with information processing and storage placed on every desktop and laptop, the coming era is set to bring greater consolidation of computing power in “clouds”, or large-scale, distributed computing facilities. Even Microsoft, a company that came to dominate the PC era, is racing to create one of the world’s biggest computing clouds, although it insists this will co-exist with existing forms of personal computing for years to come.

The economies of scale that come from consolidating computing in fewer places, and the availability of fast internet connections that make it easy to tap into this resource, account for the shift. As a result, data centres – whether run by large companies or by internet services groups such as Google – are assuming an increased share of the world’s information processing workload.
There are striking signs of how this reconfiguration is taking shape. According to Rick Rashid, head of research at Microsoft, a handful of internet companies – his employer as well as Google, Yahoo and Amazon.com among them – are already buying 20% of the world’s output of servers. The massive new data centres these companies are building harness an amount of computing power that far exceeds anything assembled by private companies before, he adds.

These companies use much of this brute computing power to run their own online services, including internet search and electronic commerce. Increasingly, though, they are also offering their capacity as a substitute for the data processing and storage that takes place on their customers’ machines. This comes in two forms.

Some is made available in the form of services. For instance, rather than store documents and run a word processor on your own PC, you can now just access Google Docs, which mimics many of the functions that Microsoft’s Office software has carried out in the past. In the business world, this approach is known as “software as a service” and involves the provision of corporate applications such as accounting or customer relationship management by companies including Salesforce.com and NetSuite. Small companies in particular are turning to these services to escape the headache of maintaining their own technology…

The second new form of outsourced computing involves the provision of raw data processing power and storage capacity: companies buy access to someone else’s data centre to boost their own capacity at times of need, or even to replace it altogether. Amazon.com has emerged as the unlikely early leader in this business. More than half the online bookseller’s computing resources are being consumed by other companies, which run their own applications in its data centres, says Werner Vogels, Amazon’s chief technology officer. Customers include the New York Times and Nasdaq.
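The second form described above – raw storage and processing bought as a service – can be pictured as an application writing to a remote object store instead of a local disk. The sketch below is purely illustrative: the `RemoteObjectStore` class and its dictionary-backed storage are hypothetical stand-ins, not any real provider’s API, though Amazon’s actual storage service works along broadly similar put/get lines over the network.

```python
# Hypothetical sketch of "storage as a service": an application hands
# named blobs of data to a provider-hosted store rather than a local disk.
# The dict here simulates capacity that really lives in someone else's
# data centre; a real client would make network calls instead.

class RemoteObjectStore:
    """Illustrative stand-in for a provider-hosted object store."""

    def __init__(self):
        self._objects = {}  # key -> bytes, simulating remote capacity

    def put(self, key: str, data: bytes) -> None:
        """Upload an object under a key."""
        self._objects[key] = data

    def get(self, key: str) -> bytes:
        """Retrieve a previously stored object."""
        return self._objects[key]


store = RemoteObjectStore()
store.put("invoices/2009-03.csv", b"id,amount\n1,99.50\n")
print(store.get("invoices/2009-03.csv").decode())
```

The point of the interface is that the caller no longer knows or cares which physical machine holds the data – exactly the decoupling that lets a customer treat someone else’s data centre as overflow capacity.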
The same forces that are leading to the aggregation of computing power in the internet services companies are also prompting many big companies to centralise more of their internal computing resources, taking advantage of economies of scale that are bringing down the unit cost of information processing and storage. For instance, General Electric’s data centres now consume about 45% of the company’s IT budget, up from 25% three years ago, as it builds up a more centralised resource that can be used by all of its divisions, says Greg Simpson, GE’s chief technology officer. While internet companies are creating “public clouds”, many big companies like GE say they are creating private ones of their own, even as they weigh the benefits of shifting some of their computing to outside services suppliers.

There is a danger, however, in overstating the near-term impact of new developments of this nature. “The ‘cloud’ is definitely hyped right now,” says Dante Malagrino at Cisco. He describes the term as a “fancy way of saying ‘a network of resources’”. It is also the case that in IT, there is seldom anything really new under the sun. The centralisation trend is a return to life before the fragmentation brought about by PCs and the proliferation of department-level servers in business to handle tasks such as e-mail. Forerunners of the cloud have gone by names like “utility computing” and “on-demand computing”.

Sweeping visions of this kind are easy to lay out yet take years to unfold. Companies do not abandon their earlier investments in technology but layer new technologies on top of old, creating a patchwork of information architectures. Most data centres resemble archaeological digs. Nor will change happen overnight. Companies are deeply cautious about how they handle their information assets and do not experiment with new IT systems lightly. Concerns about security and reliability will act as a drag for years.
“Cloud computing” has become useful shorthand for a trend that has shown signs of accelerating as software-as-a-service and the outsourcing of raw computing power are adopted more widely. Yet there are reasons why some computing will continue to take place locally that go beyond a desire by users to keep more direct control of their own data…
Behind the rise of cloud computing lies a process revolution that has been taking place on the information production line. It has been compared to the impact on the automotive industry of Toyota’s manufacturing system, which brought a step change in quality and a reduction in costs.

“The typical IT organisation is used to a highly customised way of doing things,” says Russ Daniels, chief technology officer of cloud computing at Hewlett-Packard. Adopting more standardised systems “will turn IT into more of a manufacturing process than an art”, he adds. In turn, that could lead companies to decide they no longer want to invest in staying on the cutting edge of the information processing business. The danger, says Mr Daniels, is that this could leave companies without the skills needed to understand and direct their IT.

The catalyst for this new industrialisation of IT has been a simple but powerful idea: unchaining computing tasks from the physical machines on which they take place. Until recently, companies bought a new server each time they added a new application. The need to commission, house, maintain and power all those servers has become one of IT’s biggest costs.

The antidote to this has been a technology known as “virtualisation”, which makes it possible to run more than one application on each machine. In essence, each application is tricked by the virtualisation software into thinking it is running on its own dedicated server: many new “virtual machines” can exist on a smaller number of actual servers. Breaking the link between a computing task and the hardware of individual computers does not only lead to more efficient use of capacity: virtual machines can be shifted between servers as they are running, or even between data centres, with no interruption. This virtualisation trend has taken hold quickly over the past two to three years.
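The efficiency gain from virtualisation can be made concrete with a toy consolidation exercise. The simulation below is not any vendor’s hypervisor – a real scheduler balances load dynamically – but a simple first-fit packing of virtual machines onto shared hosts shows why the one-server-per-application model wastes capacity. All numbers and the `consolidate` function are invented for illustration.

```python
# Illustrative simulation of server consolidation through virtualisation.
# Under the old model, each application gets its own physical server.
# With virtualisation, applications become "virtual machines" packed onto
# shared hosts; first-fit packing by CPU demand stands in (crudely) for
# what a hypervisor's placement logic does.

SERVER_CAPACITY = 100  # one physical server's CPU, in percent

def consolidate(vm_loads):
    """Place each VM on the first server with room; return servers used."""
    servers = []  # each entry is that server's used capacity
    for load in vm_loads:
        for i, used in enumerate(servers):
            if used + load <= SERVER_CAPACITY:
                servers[i] = used + load  # fits on an existing host
                break
        else:
            servers.append(load)  # no room anywhere: commission a new server
    return len(servers)

# Ten applications, each using a fraction of one server's capacity.
loads = [20, 30, 10, 40, 25, 15, 30, 20, 10, 35]
print("one server per application:", len(loads))    # old model: 10 machines
print("after consolidation:", consolidate(loads))   # prints 3
```

With these invented loads, ten dedicated machines collapse to three shared ones – the same arithmetic, at vastly larger scale, that is driving down the figures GE reports below.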
A year ago, says GE’s Mr Simpson, his company bought a new server in 85% of the instances when it had a new computing task to handle; by the end of last year, that proportion had fallen to 50%, with “virtual machines” making up the difference. It is a short step from there to shifting some computing tasks from a company’s own computers to those of an external service provider.