Facebook is bucking the trend toward server virtualization and is interested in microservers for inexpensive growth and quick failover, the company’s lab director said Tuesday. The social networking giant came out in support of Intel’s plans for an expanded lineup of processors for microservers, with Gio Coglitore, director of Facebook Labs, speaking at an Intel press briefing in San Francisco. At the event, Intel said it would introduce four new chips for microservers this year and in 2012, ranging from a 45-watt Xeon to an Atom-based processor with less than 10 watts of power consumption. All will have server-class features, such as 64-bit compatibility and ECC (error-correcting code) memory. Facebook has tested microservers in production and is interested in the architecture for its massive data centers, Coglitore said. The inclusion of those server features is key to the company being able to use microservers, he said.

Microservers, a concept Intel introduced in 2009, are small, low-power, single-processor servers that can be packed into a data center more densely than rack or blade servers. The microservers in a rack typically share power and cooling and may also share storage and network connections, said Boyd Davis, vice president of the Intel Architecture Group and general manager of data center group marketing.

Manufacturers including Dell, SeaMicro and Tyan have adopted the architecture, which has been most popular among large cloud service providers for large-scale, low-end hosting and Web serving, according to Intel. The company expects microservers to remain about 10 percent of its server processor market.
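
The density argument is easy to put rough numbers on. Here is a back-of-the-envelope sketch in Python; the 10 kW rack power budget is our own assumption, only the CPU wattages Intel quoted are counted, and real servers draw additional power for memory, storage and cooling, so these are upper bounds rather than a capacity plan:

```python
# Hypothetical rack-density comparison. Only CPU power draw is counted;
# memory, storage, fans and PSU losses would lower these figures.
RACK_BUDGET_W = 10_000  # assumed rack power budget, not an Intel figure

cpu_tdp_w = {
    "45 W Xeon (E3-1260L class)": 45,
    "20 W Xeon (E3-1220L)": 20,
    "15 W Sandy Bridge part": 15,
    "sub-10 W Atom (2012)": 9,  # "less than 10 watts"; exact TDP unknown
}

for part, tdp in cpu_tdp_w.items():
    print(f"{part:28} -> up to {RACK_BUDGET_W // tdp} CPUs per rack")
```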

Front-end Web servers are where Facebook is nearly ready to start using microservers, according to Coglitore. “With Intel’s announcement, it’s just about to happen,” he said. Facebook will probably start implementing microservers on a large scale late this year or early next year.

Facebook uses a variety of server types across different parts of its data centers, but the company’s aversion to virtualization extends throughout its infrastructure, Coglitore said. “We find in our testing that a realized environment brings efficiencies and brings the ability to scale more effectively,” Coglitore said. “If virtualization was the right approach, we would be a virtualized environment.”

Facebook wants to be able to balance its computing load across many systems and potentially lose a server without degrading the user experience. “As you start to virtualize, the importance of that individual server is greatly enhanced, and when you have that at scale, it becomes very difficult,” Coglitore said. He prefers to think of computing units as faceless, interchangeable “foot soldiers.” Virtualization makes it harder to treat hardware resources that way, he said, and a virtualization software layer also tends to create lock-in.

In addition, though Facebook could take advantage of more powerful server platforms for some functions, it sometimes turns to lower-end systems for budgetary reasons. Facebook prefers to change servers every two to three years, following the chip refresh cycles of Intel and other processor makers, Coglitore said.

Intel currently ships a 45-watt Xeon and a 30-watt Xeon processor for microservers. Upcoming microserver chips include the 45-watt E3-1260L and 20-watt E3-1220L, which are already shipping to server makers, and an unnamed 15-watt part based on the new Sandy Bridge architecture. The Atom-based, sub-10-watt microserver processor coming next year also does not yet have a name, Intel’s Davis said.
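
Coglitore’s faceless, interchangeable “foot soldiers” describe what is now commonly called stateless scale-out: any healthy node can serve any request, so a failed server is simply routed around rather than recovered. A minimal Python sketch of the idea; the node names and pool size are invented for illustration:

```python
import random

class NodePool:
    """A toy pool of interchangeable web nodes: no single node matters."""

    def __init__(self, nodes):
        self.nodes = set(nodes)
        self.down = set()

    def mark_down(self, node):
        # e.g. a failed health check; the node is skipped, not failed over
        self.down.add(node)

    def pick(self):
        healthy = list(self.nodes - self.down)
        if not healthy:
            raise RuntimeError("no healthy nodes left")
        return random.choice(healthy)  # any foot soldier will do

pool = NodePool(f"web{i:03d}" for i in range(700))
pool.mark_down("web042")   # one server dies...
print(pool.pick())         # ...requests simply go to another node
```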

Reference : http://www.cio.com/article/print/677076

ROUND ROCK in Texas and Palo Alto in California are half a continent apart, but Dell and Hewlett-Packard (HP), two tech behemoths that, respectively, have their headquarters in those cities, have much in common. The two personal-computer makers boast impressive records as innovators—Dell in supply-chain management and HP in fundamental research—though each has lost some of its creative spark in recent years. Both are battling to remain relevant in a rapidly changing information-technology landscape.

Since Michael Dell returned in 2007 to the helm of the company he founded, he has overhauled its operations in a bid to revive its fortunes. Léo Apotheker, HP’s boss since last year, is due to announce a new strategy for his company on March 14th. He has been hinting that he, too, will want to make significant changes at a firm that is still reeling from the traumatic departure of its previous boss, Mark Hurd, after a fuss over his relationship with a female marketing contractor.

Among the trends the two firms are grappling with is the growing popularity of tablet computers, smartphones and other devices that let consumers work and surf the web on the move. Apple’s wildly popular iPad and other tablet offerings are starting to have an impact on low-end laptop sales. Gartner, a research firm, now reckons that global PC shipments will grow 10.5% this year, to 388m units, down from its previous forecast of almost 16%, partly because consumers are switching with such enthusiasm to tablets.

An even more important trend sweeping the industry is the growth of cloud computing. This lets companies store and process vast amounts of data in huge warehouses of servers run by third parties. The data can then be accessed over the internet whenever and wherever needed. New competitors such as Amazon and Rackspace Hosting have jumped into this market and are trying to persuade companies they would be better off renting capacity “in the cloud” than buying their own servers from the likes of Dell and HP. Of course, the cloud-services providers themselves buy lots of servers, mainly Dell and HP ones—but their huge size means they can drive a hard bargain on prices.

The emergence of the cloud is also partly responsible for a third trend, “verticalisation”. This is the increasing tendency of makers of IT hardware, operating systems and applications to move into each other’s area of business, because their corporate customers no longer want to shop around for all these different bits and splice them together themselves. They now want all-in-one solutions they can just take out of the box and switch on—and which are well integrated with their cloud-computing systems. Furthermore, for providers like Dell and HP, spreading vertically into other parts of the IT business also offers the best hope for growth in a market that is, overall, maturing.

This has created a free-for-all, with software firms swallowing up hardware firms and vice versa. HP and Dell now face stiff competition from the likes of Oracle, a software firm that bought Sun Microsystems, a hardware maker (Oracle hired Mr Hurd as its president after he left HP); and from Cisco, a maker of networking gear which has moved into the market for servers. This rapid restructuring of the industry is prompting the big IT firms to get out of low-margin businesses, as Japan’s Hitachi did this week when it sold its disk-drive business to Western Digital for $4.3 billion.

HP and Dell dominate the PC business, together with Taiwan’s Acer, so they are vulnerable to any shift away from such computers. Declarations of the death of the PC are premature, not least because demand from emerging markets is still growing. But Apple’s stunning success with its iPad in rich countries, and corporate customers’ demands for IT firms to provide a full service, not just bulk quantities of desktops and servers, mean that neither company can afford to be complacent. The PC is not dead, but its importance henceforth is going to be significantly diminished.

Dell is likely to fare a bit better than HP in this new world because its sales of PCs, which were 23% of the $16 billion of revenue it earned in its latest quarter, are more heavily geared towards companies, and they will probably keep buying desktops for some time to come. HP, which counted on PC sales for 32% of its $32 billion of revenue in its latest quarter, relies more heavily on purchases by consumers, whose heads are easily turned by new gadgets. However, both firms have plenty to worry about. “The computing market is fragmenting and devices are specialising,” says George Shiffler of Gartner. Dell’s relative strength in the corporate market is becoming steadily less relevant, though. Companies are increasingly being swayed by what IT gear consumers (especially their own staff) are buying. So firms like HP and Dell need to come up with successful mass-market devices or risk seeing rivals’ gadgets eat into their corporate sales. That is why they have recently unveiled tablets of their own. Dell’s Streak 7 is based on Google’s Android operating system, whereas HP’s TouchPad runs on webOS, an operating system developed by Palm, a company that HP bought last year. In a recent Bloomberg interview Mr Apotheker said HP would start installing webOS in its PCs too, in addition to Windows, to create a broader platform for it.

This highlights a difference in philosophy between the two companies. Dell places huge emphasis on making products that are “open, capable and affordable”. It is open to borrowing operating systems and other stuff from third parties and splicing them together itself. HP seems less willing—at least for now—to embrace such openness, preferring to use its in-house operating system. But if a successful ecosystem of “apps” evolves around Android, and tablets based on Google’s operating system turn out to be as popular as Android-based smartphones, it may have to rethink its approach.

Although they are still paying plenty of attention to the PC and server markets, both Dell and HP have been working overtime to reduce their dependence on them—notably by splashing out on cloud-related businesses such as data-storage providers. In 2008 Dell paid $1.4 billion for EqualLogic, or more than ten times its target’s revenues. EqualLogic is now on track to deliver $800m of annual revenue, justifying what appeared to be an extravagant purchase price. Last year, Dell added to its cloud capabilities when it bought Boomi, which helps move software applications to the cloud, and InSite One, which offers cloud-based medical archiving.

But Dell missed out on its most ambitious target last year when HP waltzed off with 3PAR, another data-storage firm, after a ferocious bid battle. HP paid $2.4 billion for the company, way above Dell’s initial offer of $1.15 billion. After the dust had settled, Mr Dell claimed HP had overpaid for 3PAR, which had revenues of just $194m, whereas Dell had shown discipline. Yet Dell’s final offer was a whopping $2.35 billion, only a smidgen below its rival’s. The firm got a consolation prize in December, when it bought Compellent Technologies, another data-storage firm, for $960m.

HP and Dell are destined to cross swords again as they go further down the verticalisation route and emulate IBM, a hardware maker which has built powerful software and services businesses with fat margins. Both firms have already made big acquisitions of service providers, with HP buying EDS in 2008 and, the next year, Dell buying Perot Systems (which, like EDS, was founded by Ross Perot, a former American presidential candidate). But they are bound to seek more such deals. Mr Apotheker, who used to run SAP, a German software giant, has made little secret of his desire to see HP expand in highly profitable software businesses. Last month HP bought Vertica, whose software helps firms analyse massive amounts of data fast, and there has been speculation that the company will consider more ambitious targets.

Dell’s executives argue that the firm’s diversification strategy is already bearing fruit. As evidence of this, Brian Gladden, its finance chief, points to the net profits of $927m it earned in the three months to late January, almost three times the figure a year earlier. Admittedly, Dell benefited from low component costs and an uptick in new PC purchases by companies following the economic crisis. But its results also reflect the growth of its services businesses, which have higher margins. These now account for 41% of the revenues of Dell’s large-enterprise business, up from a third at the start of 2008.

HP’s recent record has been less impressive, which is reflected in the relative performance of its shares. In particular, whereas Dell and other firms such as IBM and Accenture have seen an increase in service-related sales recently, HP’s services revenue slipped 2% in its latest quarter. That partly reflects the fact that, although it is strong in outsourcing thanks to EDS, it does less well in higher-margin consulting—a weakness that Mr Apotheker ought to address.

HP’s boss will also need to decide whether the computers-to-calculators company, which has $126 billion of annual revenues and 325,000 employees, can continue to dabble in so many areas. IBM, for instance, sold its PC business to Lenovo to focus on building its services activities. HP is unlikely to follow suit, considering that it is still the leader of the PC pack. Nor does it look set to jettison its printing business, which is having to adapt to an increasingly digital world. But if it is serious about beefing up its software and services activities, HP may have to make sacrifices elsewhere.

At the very least, Mr Apotheker will have to be clear when he speaks next week about the strategic logic that binds HP’s disparate activities together. And he will need to emphasise that he is prepared to undo some of the savage cost-cutting undertaken during Mr Hurd’s reign, which dented the company’s innovation engine and sapped its morale. HP has already started to reinvest in its services business and in January it brought in several new board members, including Shumeet Banerji, the boss of Booz & Company, a consulting firm, and Dominique Senequier, the chief executive of AXA Private Equity, whose experience will be invaluable as it mulls new investment. Mr Apotheker likes to joke that since moving to Palo Alto from Europe to run HP he has learnt how to say “awesome” and “cool” like a true Californian. Now his task is to persuade customers and investors to use those same words about HP.

Reference : http://www.economist.com/node/18332916/print

The Economist : Wikipedia’s Lessons

Tuesday, January 18, 2011

IT MAY not stir up international outrage like its semi-namesake WikiLeaks, but Wikipedia sparks debate. The free online encyclopedia, which celebrates its tenth birthday on January 15th, is a symbol of unpaid collaboration and one of the most popular destinations on the internet, attracting some 400m visitors a month. It also faces serious charges of elitism. Wikipedia offers more than 17m articles in 270 languages. Every day thousands of people edit entries or add new ones in return for nothing more than the satisfaction of contributing to the stock of human knowledge. Wikipedia relies on its users’ generosity to fill its coffers as well as its pages. Recent visitors to the website were confronted with images of Jimmy Wales, a co-founder, and a request for donations. The campaign was annoying but effective, raising $16m in 50 days.

With its emphasis on bottom-up collaboration and the broad dissemination of knowledge, the online encyclopedia is in many ways an incarnation of the fundamental values of the web. But Wikipedia also reveals some of the pitfalls of the increasingly popular “crowdsourcing” model of content creation. One is maintaining accuracy. On the whole, Wikipedia’s system of peer reviewing does a reasonable job of policing facts. But it is vulnerable to vandalism. Several politicians and TV personalities have had their deaths announced on Wikipedia while they were still in fine fettle. Some observers argue the site should start paying expert editors to produce and oversee content, and sell advertising to cover the cost. Problems with accuracy “are an inevitable consequence of a free-labour approach”, argues Alex Konanykhin of WikiExperts, which advises organisations on how to create Wikipedia articles. (The very existence of such outfits hints at Wikipedia’s importance, as well as its susceptibility to outside influence.) The encyclopedia’s bosses retort that such concerns are overblown and that taking advertising would dent its appeal to users.

That still leaves the site with another, bigger, headache in the form of declining engagement. The number of regular contributors to Wikipedia’s English-language encyclopedia dropped from around 54,000 at its peak in March 2007 to some 35,000 in September 2010. A similar trend has been visible in some foreign-language versions of the encyclopedia. Wikipedia’s leaders say this reflects the fact that the large majority of subjects have now been written about. Perhaps, but some evidence suggests that neophytes are being put off by Wikipedia’s clique of elite editors. One study by researchers at Xerox’s Palo Alto Research Centre looked at the number of times editorial changes were subsequently reversed. It found that roughly a quarter of the edits posted by occasional contributors were undone in late 2008, compared with less than 2% of those posted by the most active editors. And it noted that this gap had widened considerably over time.

Mr Wales dismisses the notion of a clique as “silly”. Sue Gardner, the head of the nonprofit Wikimedia Foundation that operates Wikipedia, notes that people have reason to be protective about high-quality content. Still, the organisation seems concerned. Ms Gardner wants to learn from sites such as Yelp and Facebook that are good at encouraging users to contribute. She hints that in future, contributors whose changes are undone may receive a message urging them to remain active on the site. Time, perhaps, to consult Wikipedia’s entry on motivation—though it turns out to be one of the articles flagged as needing expert attention.
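
The metric behind the PARC figures boils down to a simple revert rate per contributor cohort. A Python sketch of that calculation, using an invented edit log; a real analysis would classify editors by activity level and detect reverts from page histories:

```python
from collections import Counter

# Invented edit log: (contributor cohort, whether the edit was later undone).
edits = [
    ("occasional", True), ("occasional", False),
    ("occasional", True), ("occasional", False),
    ("elite", False), ("elite", False), ("elite", False), ("elite", False),
]

total, reverted = Counter(), Counter()
for cohort, was_reverted in edits:
    total[cohort] += 1
    reverted[cohort] += was_reverted  # bool counts as 0 or 1

for cohort in total:
    print(f"{cohort}: {reverted[cohort] / total[cohort]:.0%} of edits undone")
```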

Reference : http://www.economist.com/node/17911276/print

Droid Really Does Lead iPhone

Friday, November 12, 2010

According to a new mobile phone report released Wednesday by Gartner, the technology research company, sales of smartphones and mobile devices continued to surge in the third quarter.  Gartner said in the report that a staggering 417 million mobile phones were sold worldwide during this past quarter, up 35 percent from the same period last year.  The report also notes that the sale of smartphones, which include phones on the Android platform from Google, the BlackBerry from Research In Motion, and the iPhone from Apple, grew 96 percent from last year.

Android took top honors in the report. Android-based phones “accounted for 25.5 percent of worldwide smartphone sales, making it the No. 2 operating system” in the world, right behind Symbian, Nokia’s operating system. The growth of Android is especially startling considering that it was released only two years ago. In terms of hardware, Nokia, Samsung and LG remained the top three device manufacturers in the mobile phone market, but Apple crept into fourth place, selling 13.5 million iPhones and beating out R.I.M., which sold nearly 12 million BlackBerrys.

Gartner said Apple’s record sales numbers could have been higher if the company didn’t have supply constraints. It is expected that sales of the iPhone in the United States will grow in the first quarter of 2011 if the iPhone becomes available on the Verizon network, as reports suggest. Nokia still beat out most of its competitors, selling 110 million phones, even though it has had trouble getting certain mobile phone components, including cameras and displays. Gartner predicts that cellphone sales will continue to grow at 30 percent year over year.
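
The report’s percentages imply year-ago baselines that are easy to back out. A quick sanity check in Python, assuming the figures are plain year-over-year comparisons:

```python
# Back out the implied year-ago figure from Gartner's growth rate.
q3_2010_handsets = 417e6   # units sold worldwide in Q3 2010
total_growth = 0.35        # handset sales up 35% year over year

q3_2009_handsets = q3_2010_handsets / (1 + total_growth)
print(f"implied Q3 2009 handset sales: {q3_2009_handsets / 1e6:.0f}m")  # ~309m

# Apple's reported 13.5m iPhones as a share of all handsets sold:
print(f"iPhone share of Q3 2010 handsets: {13.5e6 / q3_2010_handsets:.1%}")  # ~3.2%
```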

References :
http://bits.blogs.nytimes.com/2010/11/10/android-takes-2nd-place-in-mobile-sales-worldwide/
The Financial Times, Nov 11th 2010

Wipro Offers Virtual Desktop Service

Thursday, October 14, 2010

Wipro Technologies has introduced Desktop as a Service (Wipro DaaS), a technology aimed at providing users with a range of differing desktop experiences according to their particular needs. The new service, which has been announced in partnership with Microsoft, is based on Citrix’s XenDesktop and FlexCast technologies. The beta service, which has been designed for specific vertical sectors such as education, manufacturing, banking and healthcare, will become generally available in the first quarter of next year.

The software can either be hosted at Wipro’s datacentres or within the user organisation itself. Its appeal lies in sparing IT departments the need to customise each desktop for individual users’ own needs, by offering a plug-and-play, pre-configured set-up. The system would enable desktop virtualisation to be implemented with little capital cost. Patrick O’Rourke, Microsoft’s director of technology communications for the server and tools business, said that the system would incorporate Microsoft’s own VDI Premium Suite, which in turn is based on Citrix XenDesktop, to help customers virtualise, manage, stream and remotely display applications. “It will enable organisations to speed up VDI deployment, delivering appropriate desktops to each user.”
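
In other words, a desktop profile is defined once per vertical and then stamped out per user rather than built by hand. A Python sketch of what such a pre-configured profile might look like; the field names and values are invented for illustration and are not Wipro’s actual schema:

```python
# Hypothetical DaaS pool definitions -- illustrative only, not Wipro's API.
daas_pools = {
    "banking": {
        "hosting": "wipro-datacentre",      # or "on-premise"
        "base_image": "win7-banking-gold",
        "apps": ["office", "teller-suite"],
        "delivery": "xendesktop-flexcast",
        "persistent": False,                # stateless desktop, rebuilt at logoff
    },
    "healthcare": {
        "hosting": "on-premise",
        "base_image": "win7-clinical-gold",
        "apps": ["office", "pacs-viewer"],
        "delivery": "xendesktop-flexcast",
        "persistent": True,                 # clinicians keep local state
    },
}

def provision(user, vertical):
    """Stamp out a pre-configured desktop instead of customising per user."""
    pool = daas_pools[vertical]
    return f"{user}: {pool['base_image']} via {pool['delivery']} ({pool['hosting']})"

print(provision("jsmith", "banking"))
```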

In a statement, Deepak Jain, Wipro’s senior vice president of technology infrastructure services (TIS), said: “Customers are looking for virtual desktop and virtualized application solutions that allow for a modular approach at a low operating expense (OPEX). We see a need to simplify the deployment and management of desktop virtualisation.” Wipro said that pricing had not yet been decided for the new service.

Reference : http://www.cio.com/article/print/623315

When Dataprise, an IT services company, helped a customer with a desktop virtualization project last year, it found itself dealing with desktop virtualization’s dirty little secret: No one — including vendors — seems to know how to license the software. Having run a successful pilot, Dataprise’s client wanted to take the next step and deploy 700 virtual desktops, says Chris Sousa, director of infrastructure service at Dataprise. That’s when the trouble began.

Like many businesses, the customer — a manufacturer of fiber-optic cable — had an enterprise agreement with Microsoft, but its IT staff wasn’t sure exactly what was covered in a virtualized environment. Apparently, neither was Microsoft, says Sousa, who noted that he called the company repeatedly seeking information. “We’d get a different answer from a different person on a different day,” he says. Sousa’s experience is not unusual. In a study conducted by Info-Tech Research Group last year, Microsoft Windows licensing was the most-cited pain point for people implementing desktop virtualization, according to John Sloan, lead research analyst at Info-Tech Research Group.

Microsoft responds that it has tried to improve its virtualization pricing policies. Most recently, the software giant relaxed its licensing rules for virtual desktops and expanded rights to access a given virtual desktop from more than one computer. The changes are “a step in the right direction,” says Sloan, but he feels that Microsoft “hasn’t gone as far as many would like.” For example, although the new roaming rights allow users to log into their virtual desktops from devices outside of the corporate firewall, such as home PCs or airport kiosks, the virtual desktop is still licensed to a specific corporate PC. That means that a user may not be able to access his virtual desktop from another corporate PC, like one in a branch office, Sloan explains.

Confused yet? You’re not alone. Even with all the changes, Microsoft licensing “is still so complicated that users and even resellers don’t understand it,” says Barb Goldworm, president and chief analyst of consultancy Focus LLC. Not only are the specific vendor rules confusing, but IT managers also mix up the licensing of the virtualization software (which serves as a connection broker and a virtual desktop running on a back-end hypervisor) and the licensing of the software that actually runs on the desktop (the operating system and applications).

But the problem is bigger than just Microsoft. All software vendors are struggling with this issue to some extent. When Citrix introduced XenDesktop 4 last fall, it changed from its traditional model — concurrent licensing — to one license per named user. But customers quickly complained that they needed more flexibility. In some industries, for example, multiple users share the same device. So Citrix quickly added per-device licensing and brought back concurrent licensing for its Virtual Desktop Infrastructure edition, says Calvin Hsu, director of product marketing at Citrix.

In some cases, IT managers throw up their hands and look for other options. When Michael Goodman, vice president and director of information systems and technology at Crescent State Bank in Cary, N.C., discovered that he’d have to buy two licenses for the same Windows operating system — one for a thin client and one for the operating system running on the server — “it really knocked down my payback period on the ROI.” That was one of the reasons he skipped thin clients and went with a Pano Logic device, which serves as a dumb terminal connected to the operating system, which is running on a server in the data center.

In other cases, IT managers simply wing it, making a good-faith effort to pay the proper licensing fees without knowing exactly what licensing fees are required, which is what Sousa’s client did. “We were trying to be upstanding citizens and not rip anybody off, but we couldn’t get definitive answers,” he says.

Software licensing for virtual desktops is incredibly complex, confusing and, in some cases, prohibitively expensive… The problem is multifaceted. Like an onion, if you peel away one layer, another appears.

At its most basic, the problem reflects a fundamental shift in the industry: software is being divorced from hardware at a faster rate than ever before, mostly because of virtualization. As software vendors deal with this shift, they are experimenting with different approaches. Some still tie the software license to a specific piece of hardware, some are moving to a user-based license, others sell concurrent-user licenses and still others offer a mix of all three.

On top of that, there are different flavors of virtualization at the desktop level, such as virtual desktop infrastructure (VDI), application virtualization and operating-system streaming. Different types of licensing plans can apply to the different flavors. Moreover, there are many different layers of software in any virtualized environment — the operating system, the virtualization software itself, the applications — each of which has its own licensing requirements.

The confusion over licensing of Microsoft products is tripping up small and midsize companies in particular, because they may not have Software Assurance plans, says Sloan. Large enterprises that are covered through SA and enterprise agreements sometimes don’t feel they need to keep track of all of the details, even though they should.
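
Those three models (per device, per named user, and concurrent use) produce very different license counts for the same workforce, which is much of why comparisons are so confusing. A toy Python comparison with invented numbers:

```python
# License counts for one hypothetical workforce under the three models
# described above. All numbers are illustrative assumptions.
employees = 1000          # named users
devices_per_user = 2.5    # office PC + thin client + occasional home PC
peak_concurrency = 0.6    # share of users logged in at the busiest hour

per_named_user = employees
per_device = int(employees * devices_per_user)
concurrent = int(employees * peak_concurrency)

print(f"per named user: {per_named_user} licenses")   # 1000
print(f"per device:     {per_device} licenses")       # 2500
print(f"concurrent use: {concurrent} licenses")       # 600
```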

Bill Galinsky, senior vice president of global IT infrastructure at software vendor CA Technologies, started an internal desktop virtualization pilot project in January 2010. So far, he has virtualized 500 desktops, and he expects to reach 2,000 of the company’s 13,000 employees within a year.

When Galinsky started the pilot, he bought Microsoft VECD licenses for the virtual desktops. “That was something we nailed early on with Microsoft — it wasn’t a challenge, it was just [a matter of] understanding what that cost was,” he says. Although he knew that Microsoft has since changed some license policies, Galinsky said he wasn’t specifically aware that as of July 1 the VECD was going away and those rights would now be included in the SA. “If [the VECD] went away,” he says, “I’ll be asking Microsoft for a refund or a credit.” In any event, those changes are probably covered under CA’s SA and enterprise agreements, which for all practical purposes base the licensing on the number of users rather than pieces of hardware, he says. “In our case, our enterprise agreement works out to a ratio of around 1 to 1.27. So every employee can run 1.27 copies of the operating system and Microsoft Office.”

Vince Kellen, CIO at the University of Kentucky, is also facing the pricing conundrum as he considers how to virtualize about 1,000 desktops on campus. “It’s a challenge to get the software licensing that you want,” he says. But, in his case, Microsoft and other big software vendors aren’t the problem. Kellen says he’s covered under enterprisewide contracts geared for academic institutions, “but as soon as we get into other software outside of our normal contracts, it can get more difficult.” Some of the university’s smaller vendors, especially those selling niche academic and clinical applications and specialized math or statistical software packages, are “a little harder to work through the contracting,” Kellen says. Over time, he hopes that software vendors can find a less expensive pricing model that is desktop-virtualization-friendly — one that licenses concurrent users instead of specific named users, for instance. “This will be hard for smaller vendors, I think, as larger vendors have a broader portfolio of software products and perhaps business models which will give them flexibility,” Kellen adds.

The whole concept of software licensing is morphing as virtualization grows and consumer electronics invades corporate IT. “As corporate employees start using many different devices — smartphones, laptops, iPads — corporations are asking, ‘How many licenses am I going to have to buy?’” says Buchholz.

Reference : http://www.cio.com/article/print/615323