Dell Acquires SaaS Player Boomi

Friday, November 5, 2010

Dell this week said it will acquire Boomi, which operates a service designed to simplify the integration of cloud services with premises-based infrastructure.  Terms of the deal were not disclosed.  Boomi is a closely held startup based in Berwyn, PA, that offers a platform called AtomSphere, described as a software-as-a-service (SaaS) integration offering that eases data transfer between cloud and premises-based applications.  AtomSphere requires no appliances, software or development, according to Dell.

Boomi’s solution is implemented in a range of cloud-based apps, notably Salesforce.com’s CRM suite, as well as a variety of business application services.  “With AtomSphere, we were able to eliminate 95 percent of our integration costs and now have a completely transparent platform that allows us to analyze customer data real-time,” said Salesforce operations manager Tom Fox, in a prepared statement. According to Dell, Boomi’s technology appeals to customers and systems integrators by connecting data between premises-based and cloud-based apps to “ensure business processes are optimized, data is accurate and workflow is reliable.”  Dell also said that Boomi’s technology decreases integration time from months to weeks.
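In concept, the kind of cloud-to-premises data transfer AtomSphere automates can be pictured with the rough sketch below: read records from a cloud application over REST and upsert them into an on-premises database.  The endpoint, token, field names and local table here are hypothetical placeholders for illustration only, not Boomi’s actual interfaces.

```python
# Rough sketch of a cloud-to-premises data sync. The endpoint, token and
# schema are hypothetical placeholders, not Boomi's AtomSphere API.
import sqlite3
import requests  # third-party HTTP client

CLOUD_API = "https://crm.example.com/api/customers"  # hypothetical endpoint
API_TOKEN = "replace-with-a-real-token"

def fetch_cloud_customers():
    """Pull customer records from the cloud application."""
    resp = requests.get(CLOUD_API, headers={"Authorization": f"Bearer {API_TOKEN}"})
    resp.raise_for_status()
    return resp.json()  # assumed shape: list of {"id", "name", "email"} dicts

def upsert_local(customers, db_path="premises.db"):
    """Write the records into an on-premises SQLite table."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customers (id TEXT PRIMARY KEY, name TEXT, email TEXT)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO customers (id, name, email) VALUES (:id, :name, :email)",
        customers,
    )
    conn.commit()
    conn.close()

if __name__ == "__main__":
    upsert_local(fetch_cloud_customers())
```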

Reference : http://thejournal.com/articles/2010/11/03/dell%C2%A0to-acquire-cloud-integrator-boomi.aspx

Gartner : Cloud Computing Spend Update

Friday, October 8, 2010

Cloud computing services accounted for 12.5% of overall IT budgets, according to a research report released this week by Gartner.  The report found that 39% of those surveyed allocated portions of their IT budgets to cloud computing, while 44% said they procured services from outside providers.  Of those, 46% said they will increase that spending in the next budget year by an average of 32%.  Gartner said it surveyed more than 1,500 IT professionals in 40 countries between April and July of this year.

Another key finding of the report revealed that one-third of cloud spending came from last year’s budget, another third was new spending, and 14% was diverted from a different budget category.  “Overall, these are healthy investment trends for cloud computing,” said Gartner analyst Bob Igou, in a statement.  “This is yet another trend that indicates a shift in spending from traditional IT assets such as the data center assets and a move toward assets that are accessed in the cloud.”  Igou, who authored the study, pointed out that 24.1% of budgets covered data center systems and business applications, 19.7% went to PCs and related gear, 13.7% to telecom costs, and 30% to IT personnel.  For the next budget year, the report found, 40% of respondents will increase spending on development of cloud-based applications and solutions, while 56% will spend the same amount.  Forty-three percent said they will increase spending on implementing cloud computing for internal and/or restricted use, and 32% will increase spending on such environments for external and/or public use.
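As a quick sanity check on those figures, the budget shares Igou cites, together with the 12.5% cloud-services share quoted at the top of the report, add up to the full IT budget – assuming, as seems intended, that the categories are non-overlapping.  A minimal worked check:

```python
# Budget shares quoted in the Gartner report; assuming the categories are
# non-overlapping, they account for the entire IT budget.
budget_share = {
    "data center systems & business applications": 24.1,
    "PCs and related gear": 19.7,
    "telecom": 13.7,
    "IT personnel": 30.0,
    "cloud computing services": 12.5,
}
print(f"Total share covered: {sum(budget_share.values()):.1f}%")  # -> 100.0%
```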

Reference : http://thejournal.com/articles/2010/09/24/gartner-says-cloud-spending-on-the-rise.aspx

Gartner is advising corporations to adopt a new style of enterprise architecture called “emergent architecture,” which the analyst firm says is necessary to respond to the growing complexity in markets, economies, networks and companies.  Also known as middle-out enterprise architecture and light EA, the emergent architecture approach is best summarized as “architect the lines, not the boxes, which means managing the connections between different parts of the business rather than the actual parts of the business themselves,” Bruce Robertson, research VP at Gartner, said in a statement released Tuesday.  “The second key characteristic is that it models all relationships as interactions via some set of interfaces, which can range from completely informal and manual – for example, sending handwritten invitations to a party via postal letters – to highly formal and automated, such as credit-card transactions across the Visa network,” Robertson said.  Gartner has identified seven properties that differentiate emergent architecture from the traditional approach to EA (a code sketch of the “architect the lines” idea follows the list):

  1. Non-deterministic: In the past, enterprise architects applied centralised decision-making to design outcomes. Using emergent architecture, they instead must decentralise decision-making to enable innovation.
  2. Autonomous actors: Enterprise architects can no longer control all aspects of architecture as they once did. They must now recognise the broader business ecosystem and devolve control to constituents.
  3. Rule-bound actors: Where in the past enterprise architects provided detailed design specifications for all aspects of the EA, they must now define a minimal set of rules and enable choice.
  4. Goal-oriented actors: Previously, the only goals that mattered were the corporate goals but this has now shifted to each constituent acting in their own best interests.
  5. Local Influences: Actors are influenced by local interactions and limited information. Feedback within their sphere of communication alters the behaviour of individuals. No individual actor has data about all of an emergent system. EA must increasingly coordinate.
  6. Dynamic or Adaptive Systems: The system (the individual actors as well as the environment) changes over time. EA must design emergent systems that sense and respond to changes in their environment.
  7. Resource-Constrained Environment: An environment of abundance does not enable emergence; rather, the scarcity of resources drives emergence.
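One way to read “architect the lines, not the boxes” in concrete terms is to specify only the interaction contract between parts of the business and leave each autonomous actor free to implement it as it sees fit.  The toy sketch below illustrates that idea with Robertson’s informal-to-formal spectrum; the channel names and invoice handling are invented for the example, not part of Gartner’s model.

```python
# Toy illustration of "architect the lines, not the boxes": the architecture
# defines only the interaction contract (the Protocol); the actors behind it
# stay autonomous. Names and behaviour are invented for the example.
from typing import Protocol

class InvoiceChannel(Protocol):
    """The 'line': a minimal rule set every actor must honour."""
    def submit(self, invoice_id: str, amount: float) -> bool: ...

class ManualPostalChannel:
    """An informal, manual interaction (paperwork sent by post)."""
    def submit(self, invoice_id: str, amount: float) -> bool:
        print(f"Printing and mailing invoice {invoice_id} for {amount:.2f}")
        return True

class CardNetworkChannel:
    """A highly formal, automated interaction (a card-network transaction)."""
    def submit(self, invoice_id: str, amount: float) -> bool:
        print(f"Submitting invoice {invoice_id} for {amount:.2f} to the card network")
        return True

def settle(invoices: list[tuple[str, float]], channel: InvoiceChannel) -> int:
    """The architect constrains only the channel contract, not which
    concrete actor sits behind it."""
    return sum(channel.submit(invoice_id, amount) for invoice_id, amount in invoices)

if __name__ == "__main__":
    invoices = [("INV-1", 120.0), ("INV-2", 75.5)]
    settle(invoices, ManualPostalChannel())
    settle(invoices, CardNetworkChannel())
```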

Gartner said that enterprise architects must be ready to embrace the inversion of control.  Where in the past they controlled all EA decision making, they must now accept that business units demand more autonomy.  For example, they must understand that employees demand to use their personal devices, that there is increased integration with partners and suppliers, that customers demand access to information using the technology of their choice, and that regulators require more information.  “The traditional top-down style worked well when applied to complex, fixed functions – that is, human artefacts, such as aircraft, ships, buildings, computers and even EA software,” said Mr Robertson.  “However, it works poorly when applied to an equally wide variety of domains because they do not behave in a predictable way.  The traditional approach ends up constraining the ability of an emergent domain to change because it is never possible to predict – and architect for – all the possible avenues of evolution.”

References :
http://www.gartner.com/it/page.jsp?id=1124112
http://www.intelligententerprise.com/showArticle.jhtml?articleID=219400106

FT : Micropayments Surge

Monday, May 4, 2009

A diverse set of companies is doing big business selling cheap online goods and services across the web.  Apple has sold more than 6bn songs for $1 apiece on iTunes.  Skype took in $550m last year selling cheap internet calling minutes.  Tencent, the China internet portal, generated $719m last year selling low-cost virtual goods.  Consumers’ growing willingness to pay small dollar amounts is matched by a payments infrastructure that is finally robust enough to accommodate demand.  This new generation of businesses and shifting consumer behaviour signal the arrival of microtransactions as a lucrative area of experiment for internet groups.  “Ten years ago there was a lack of content and a lack of willingness to charge for small amounts,” Bruce Cundiff, director of payments research at Javelin Strategy and Research, said.  “Today consumers are accustomed to downloading and paying for it.”  One of the few internet companies to go public this year is a microtransactions leader.  Changyou.com, a spin-off of Sohu, which offers free online games in China but makes its money selling virtual goods, raised $128m in its initial public offering last month.  Virtual worlds such as Stardoll, Habbo and Club Penguin each take in between $30m and $150m annually through microtransactions, selling members low-cost accessories and enhanced features.

Critics used to say that microtransactions would not take off, arguing that consumers would not pay for online content.  Yet as the market for online advertising recedes, groups are turning to microtransactions to support their sites.  Analysts say consumers are growing more accustomed to spending small amounts of money casually for digital content.  Phone companies were the first to enable this, adding small charges for ringtones to a customer’s monthly bill.  Apple introduced the iTunes store and Skype began charging pennies for online calling.  This softened consumer resistance, opening the door for a generation of online games and applications to levy similar charges….. Eric Schmidt, Google chief executive, last month called on newspapers to adopt a micropayments model for their websites.  Charging pennies for each article read online, said Mr Schmidt, might help publishers survive the collapse of advertising revenue.  His call reiterated the argument from a recent Time magazine article by Walter Isaacson.  “We need something like digital coins or an E-ZPass digital wallet,” Mr Isaacson wrote.  There should be “a one-click system with a really simple interface that will permit impulse purchases of a newspaper”.  Several US publishers that do not currently charge have suggested they might start charging for content.  Journalism Online, a new company founded by former media executives, will launch an e-commerce platform that news sites can use to charge for individual articles.  But even as micropayments gain traction on social networking sites and online games, history suggests the model might not work so smoothly with newspapers.  People have come to expect online news to be free, and news is quickly disseminated on the web.
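Isaacson’s “digital coins or an E-ZPass digital wallet” amounts to a small prepaid balance debited a few cents per article with one click, topped up in larger increments so card fees stay viable.  A minimal sketch of that model, with invented prices and behaviour:

```python
# Minimal sketch of an E-ZPass-style prepaid wallet for per-article
# micropayments. Prices and behaviour are invented for illustration.
class MicropaymentWallet:
    def __init__(self, balance_cents=0):
        self.balance_cents = balance_cents
        self.purchases = []

    def top_up(self, cents):
        """Load the wallet in one larger card transaction to keep fees viable."""
        self.balance_cents += cents

    def buy_article(self, article_id, price_cents=5):
        """One-click impulse purchase: debit a few cents if funds allow."""
        if self.balance_cents < price_cents:
            return False  # prompt the reader to top up instead
        self.balance_cents -= price_cents
        self.purchases.append(article_id)
        return True

wallet = MicropaymentWallet()
wallet.top_up(500)                    # $5.00 loaded once
wallet.buy_article("ft-2009-05-04")   # 5 cents deducted for one article
print(wallet.balance_cents)           # -> 495
```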

Reference : http://www.ft.com/cms/s/0/b4335d52-36b1-11de-af40-00144feabdc0.html

Google Apps has gained a directory tool designed to simplify and accelerate the setup of this hosted collaboration and communication suite.  With the new Directory Sync, Apps can tap into existing LDAP-based user directories, such as the ones in IBM’s Lotus Domino and Microsoft’s Active Directory, so that administrators don’t have to set up a separate directory in the Google suite.  This functionality will likely appeal in particular to a segment of the collaboration market that Google is very interested in attracting: enterprise IT departments.  Google Apps has mostly been adopted in small and medium-size companies, and groups within large organizations, although the suite has nabbed large deployments in universities and government settings.  The new tool, which comes from technology Google acquired when it bought Postini, runs behind customers’ firewalls and offers a one-way delivery of directory information to Google Apps.  “The utility offers many of the customization settings, tests and simulations originally developed and refined for the Postini directory sync tool,” wrote Navneet Goel, Google enterprise product manager, in a blog posting Thursday.

The LDAP (Lightweight Directory Access Protocol) component is available at no additional cost for administrators of the Premier, Education and Partner versions of Apps.  It will be available as a software download that can be loaded onto an on-premise server, said Rajen Sheth, Google Apps senior product manager.  Until now, administrators have had several ways of loading user directory data into Apps, including a user-provisioning API (application programming interface), a Web-based interface for manual data entry and a bulk-uploading capability, Sheth said.  However, the new tool is tightly integrated into Apps and offers more directory management features than the other options, he said.  LDAP support is a basic requirement for any enterprise software-as-a-service offering, Gartner analyst Matt Cain said via e-mail.  “Organizations want to manage as few directories as possible and they want a secure one-way upload to the cloud.  It’s another example of Google gaining enterprise prowess from the Postini buy.”  “It’s good to see Google taking steps to show they are serious about the enterprise IT administrators.  Certainly, LDAP is how they do a lot of management of enterprise accounts,” said Rebecca Wettemann, an analyst with Nucleus Research.  “There’s still more that Google needs to do, but this is a strong step forward.”  In enterprises, Apps is often a complement, not a substitute, for collaboration platforms like the ones from IBM Lotus and Microsoft, so this directory utility will come in handy for IT staffers in those situations, she said.  With this now in place, Google would do well to give Apps administrators a tool to manage the content that the suite’s users create, something that other products, like Microsoft’s SharePoint, offer for their own platforms, she said…..
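The one-way flow Directory Sync provides – read users from the existing LDAP directory behind the firewall, then push them up to the hosted suite – can be sketched roughly as follows.  The read side uses the third-party ldap3 library; the provision_user function and all host and directory names are placeholders, not Google’s actual provisioning API.

```python
# Rough sketch of a one-way LDAP-to-cloud directory sync. The read side uses
# the third-party ldap3 library; provision_user and the directory names are
# placeholders, NOT Google's actual provisioning API.
from ldap3 import ALL, SUBTREE, Connection, Server

def read_ldap_users(host, bind_dn, password, base_dn):
    """Pull user entries from the on-premises LDAP directory (e.g. Active Directory)."""
    server = Server(host, get_info=ALL)
    conn = Connection(server, user=bind_dn, password=password, auto_bind=True)
    conn.search(
        search_base=base_dn,
        search_filter="(objectClass=person)",
        search_scope=SUBTREE,
        attributes=["cn", "mail"],
    )
    return [{"name": str(e.cn), "email": str(e.mail)} for e in conn.entries]

def provision_user(user):
    """Placeholder for the upload step to the hosted suite's admin interface."""
    print(f"Would provision {user['email']} ({user['name']}) in the hosted directory")

if __name__ == "__main__":
    users = read_ldap_users(
        host="ldap.corp.example.com",                  # hypothetical host
        bind_dn="cn=sync,dc=corp,dc=example,dc=com",   # hypothetical service account
        password="replace-me",
        base_dn="ou=people,dc=corp,dc=example,dc=com",
    )
    for user in users:   # one-way: the LDAP directory stays the source of truth
        provision_user(user)
```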

Reference : http://www.cio.com/article/491413/Google_Apps_Gains_LDAP_Support

FT : Server In The Sky

Saturday, March 28, 2009

A revolution that is sweeping through corporate data centres is the untold secret of modern business.  The information factories of the digital age are going through a transition that is every bit as significant as the advent of the moving assembly line was to manufacturing nearly a century ago.  Companies that master the new techniques of information processing, or make the right bets about when to hand over control of that operation to someone else with greater skills, stand to reap significant benefits in the form of lower costs and greater usability of their data.  But this all comes with significant risks: not only of falling behind as information processing enters a new industrial-scale era but, conversely, of losing control of one of a business’s key assets – its information.  The shift is happening largely out of view.  Guarded by the priesthood of the IT department and locked away from prying eyes for security reasons, data centres operate beyond the average business manager’s realm of consciousness.  The places where the inner workings of business are conducted – where invoices are processed, transactions recorded and corporate secrets stored – are often taken for granted.  Every now and then, though, something happens to shine a light on this arcane world.  Such a moment came last week, with signs of a realignment among some of the world’s leading technology concerns.  Cisco Systems, the world’s biggest maker of networking equipment, said it would start selling servers, the back-room machines that are the workhorses of corporate computing, setting up a showdown with Hewlett-Packard and IBM.  It also emerged that IBM was in the late stages of negotiations to buy Sun Microsystems, bringing together two of the biggest suppliers of servers and other technologies for data centres.  The maturing of the IT industry and a steep slide into recession provided the immediate impetus for these moves.  But something else is at work.  After a technology era characterised by the rise of the PC, a new centralisation is taking place in computing and the biggest suppliers of technology are being forced to respond. A catchphrase has been coined to describe this new approach: “cloud computing”.  If the past quarter-century was characterised by a decentralisation of computing, with information processing and storage placed on every desk- and laptop, the coming era is set to bring greater consolidation of computing power in “clouds”, or large-scale, distributed computing facilities.  Even Microsoft, a company that came to dominate the PC era, is racing to create one of the world’s biggest computing clouds, although it insists this will co-exist with existing forms of personal computing for years to come.  The economies of scale that come from consolidating computing in fewer places, and the availability of fast internet connections that make it easy to tap into this resource, account for the shift.  As a result, data centres – whether run by large companies or by internet services groups such as Google – are assuming an increased share of the world’s information processing workload.

There are striking signs of how this reconfiguration is taking shape.  According to Rick Rashid, head of research at Microsoft, a handful of internet companies, including his employer Microsoft as well as Google, Yahoo and Amazon.com, is already buying 20% of the world’s output of servers.  The massive new data centres these companies are building harness an amount of computing power that far exceeds anything assembled by private companies before, he adds.  These companies use much of this brute computing power to run their own online services, including internet search and electronic commerce.  Increasingly, though, they are also offering their capacity as a substitute for the data processing and storage that takes place on their customers’ machines.  This comes in two forms.  Some is made available in the form of services.  For instance, rather than store documents and run a word processor on your own PC, you can now just access Google Docs, which mimics many of the functions that Microsoft’s Office software has carried out in the past.  In the business world, this approach is known as “software as a service” and involves the provision of corporate applications such as accounting or customer relationship management by companies including Salesforce.com and NetSuite.  Small companies in particular are turning to these services to escape the headache of maintaining their own technology….. The second new form of outsourced computing involves the provision of raw data processing power and storage capacity: companies buy access to someone else’s data centre to boost their own capacity at times of need, or even to replace it altogether.  Amazon.com has emerged as the unlikely early leader in this business.  More than half the online bookseller’s computing resources are being consumed by other companies, which run their own applications in its data centres, says Werner Vogels, Amazon’s chief technology officer.  Customers include the New York Times and Nasdaq.  The same forces that are leading to the aggregation of computing power in the internet services companies are also prompting many big companies to centralise more of their internal computing resources, taking advantage of economies of scale that are bringing down the unit cost of information processing and storage.  For instance, General Electric’s data centres now consume about 45% of the company’s IT budget, up from 25% three years ago, as it builds up a more centralised resource that can be used by all of its divisions, says Greg Simpson, GE’s chief technology officer.  While internet companies are creating “public clouds”, many big companies like GE say they are creating private ones of their own, even as they weigh the benefits of shifting some of their computing to the outside services suppliers.  There is a danger, however, in overstating the near-term impact of new developments of this nature.  “The ‘cloud’ is definitely hyped right now,” says Dante Malagrino at Cisco.  He describes the term as a “fancy way of saying ‘a network of resources’”.  It is also the case that in IT, there is seldom anything really new under the sun.  The centralisation trend is a return to life before the fragmentation brought about by PCs and the proliferation of department-level servers in business to handle tasks such as e-mail.  Forerunners of the cloud have gone by names like “utility computing” and “on-demand computing”.  Sweeping visions of this kind are easy to lay out yet take years to unfold.  
Companies do not abandon their earlier investments in technology but layer new technologies on top of old, creating a patchwork of information architectures.  Most data centres resemble archaeological digs.  Nor will change happen overnight.  Companies are deeply cautious about how they handle their information assets and do not experiment with new IT systems lightly.  Concerns about security and reliability will act as a drag for years.  “Cloud computing” has become useful shorthand for a trend that has shown signs of accelerating as software-as-a-service and the outsourcing of raw computing power are adopted more widely.  Yet there are reasons why some computing will continue to take place locally that go beyond a desire by users to keep more direct control of their own data…..
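The “raw data processing power and storage capacity” model described above – buying capacity in someone else’s data centre rather than adding local disks and servers – is, in practice, a few API calls.  A present-day sketch using the third-party boto3 client for Amazon’s S3 storage service; the bucket name and file contents are placeholders, and credentials are assumed to come from the environment.

```python
# Hedged sketch of renting raw storage in a provider's data centre: store and
# retrieve an object in Amazon S3 via the third-party boto3 client.
# Bucket name and contents are placeholders; credentials come from the environment.
import boto3

s3 = boto3.client("s3")  # picks up AWS credentials from environment/config files

# Push a file into the provider's data centre instead of onto local disk.
s3.put_object(
    Bucket="example-archive-bucket",   # placeholder bucket name
    Key="reports/2009-03.csv",
    Body=b"date,revenue\n2009-03-01,1234\n",
)

# Pull it back on demand, from anywhere with a network connection.
obj = s3.get_object(Bucket="example-archive-bucket", Key="reports/2009-03.csv")
print(obj["Body"].read().decode())
```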

Behind the rise of cloud computing lies a process revolution that has been taking place on the information production line.  It has been compared to the impact on the automotive industry of Toyota’s manufacturing system, which brought a step change in quality and a reduction in costs.  “The typical IT organisation is used to a highly customised way of doing things,” says Russ Daniels, chief technology officer of cloud computing at Hewlett-Packard.  Adopting more standardised systems “will turn IT into more of a manufacturing process than an art”, he adds.  In turn, that could lead companies to decide they no longer want to invest in staying on the cutting edge of the information processing business.  The danger, says Mr Daniels, is that this could leave companies without the skills needed to understand and direct their IT.  The catalyst for this new industrialisation of IT has been a simple but powerful idea: unchaining computing tasks from the physical machines on which they take place.  Until recently, companies bought a new server each time they added a new application.  The need to commission, house, maintain and power all those servers has become one of IT’s biggest costs.  The antidote to this has been a technology known as “virtualisation”, which makes it possible to run more than one application on each machine.  In essence, each application is tricked by the virtualisation software into thinking it is running on its own dedicated server: many new “virtual machines” can exist on a smaller number of actual servers.  Breaking the link between a computing task and the hardware of individual computers does not only lead to more efficient use of capacity: virtual machines can be shifted between servers as they are running, or even between data centres, with no interruption.  This virtualisation trend has taken hold quickly over the past two to three years. A year ago, says Greg Simpson, chief technology officer at General Electric, his company bought a new server in 85% of the instances when it had a new computing task to handle: by the end of last year, that proportion had fallen to 50%, with “virtual machines” making up the difference.  It is a short step from there to shifting some computing tasks from a company’s own computers to those of an external service provider.
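The consolidation arithmetic behind virtualisation – many “virtual machines” packed onto a smaller number of physical servers – can be illustrated with a simple first-fit-decreasing placement.  The capacities and workload figures below are made-up numbers, not GE’s.

```python
# Illustration of server consolidation through virtualisation: a simple
# first-fit-decreasing placement of virtual machines onto physical hosts.
# Capacities and VM sizes are made-up numbers for the example.

def consolidate(vm_loads, host_capacity):
    """Pack VM workloads (as % of one host's capacity) onto as few hosts as
    first-fit-decreasing manages."""
    hosts = []
    for load in sorted(vm_loads, reverse=True):
        for host in hosts:
            if sum(host) + load <= host_capacity:
                host.append(load)
                break
        else:
            hosts.append([load])  # no existing host has room: power on another
    return hosts

# Ten applications that once would have meant ten dedicated servers...
vm_loads = [30, 25, 20, 15, 40, 10, 35, 5, 20, 30]   # % of one host's capacity
hosts = consolidate(vm_loads, host_capacity=100)
print(f"{len(vm_loads)} virtual machines fit on {len(hosts)} physical servers")
# -> 10 virtual machines fit on 3 physical servers
```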

Reference : http://www.ft.com/cms/s/0/6b15ff9a-19a5-11de-9d34-0000779fd2ac.html

FT Podcast : Cloud Computing

Wednesday, March 25, 2009

An interesting discussion of utility computing trends, covering internal and external clouds, among other topics.
(audio runs for 5min 36sec)