Enterprise IT: Let’s Go to the Cloud
Paul Sagawa / Artur Pylak
203.901.1633 / 203.901.1634
sagawa@ / email@example.com
March 23, 2011
- We are bearish on near-term IT spending, and concerned about the implications of the cloud for many vendors. Global government spending and Japan represent nearly 30% of the enterprise IT market, and both are likely to decline in the face of obvious pressures. Strong spending from non-Japanese corporate buyers is unlikely to be enough to meet aggressive market expectations in the face of these headwinds. Meanwhile, the ongoing architectural shift to virtualization in the data center is a boon to leading IT vendors, but a further transition to cloud-based operations in the intermediate future looms as a catalyst for the commoditization of major product categories. Internet-savvy software vendors, IT consultants, carrier-class gear makers and hosts with scale and skills are expected to prosper
- Enterprise IT spending faces severe headwinds over the next year or two. Specifically, governments at all levels have grown from 12% of total enterprise IT spending to nearly 18% over the last decade. Scrutiny on current deficits, debt levels and future social spending commitments is pushing local, state and national governments toward austerity, here and around the globe. We believe government will shrink sharply as a percentage of enterprise IT, with absolute spending declining. Meanwhile, Japan represented almost 11% of the total enterprise IT market in 2010, a significant market that will now come under pressure in the wake of the earthquake, tsunami and nuclear crisis. Together, more than 25% of industry revenues will be under severe pressure, a major risk to vendors with exposure
- Recovering corporate spending in 2011 will be driven by virtualization. Surveys of IT managers put virtualization at the top of the priority list. Moving applications off PCs and onto data center servers cuts costs, facilitates software deployment, enables user collaboration and remote access, and increases IT management control. With 2011 enterprise IT growth (ex. Government and Japan) projected to be 4.4%, companies facilitating the move to virtualization with strength in corporate data centers without undue exposure to desktop PCs – e.g. VMware, EMC, IBM, Accenture, etc. – should be beneficiaries
- Rapid growth in cloud infrastructure driven by consumer tablet/smartphone boom. Internet-based data centers operated by consumer-oriented cloud companies like Google, Amazon, Facebook and Apple, and by commercial content delivery network providers like Akamai, Level3 and Limelight have become an important growth segment for IT spending. Capital investment by this sector should continue at breakneck pace in 2011 as the explosive growth of tablets and smartphones, the step-function jump in wireless network speeds as 4G rolls out, and the demand for on-line video drive extraordinary traffic growth
- Enterprises are also moving toward a cloud future, albeit with caution. Even as 2011 shapes up as a big year for bringing applications into enterprise data centers via virtualization, organizations on the cutting edge have begun the further move to deploy their applications to virtual machines on servers operated by third parties in the Internet cloud. Cloud software vendors targeting collaborative standard applications with need for broad accessibility, such as CRM, have been early success stories. While shifting applications to the cloud has clear benefits – i.e. major scale economies, ubiquitous access, superior off-site performance, ease and speed of upgrades/deployment, etc. – the significant enterprise concerns – i.e. data security, transition costs, management control, regulatory issues, etc. – will take time to resolve. Parsing applications by their degree of standardization and breadth of access required, only the most customized, with narrow and fixed access requirements, are likely to be immune from a transition to the cloud
- The cloud is bad for leading IT vendors. Virtualization is a boon to the traditional IT equipment vendor, pulling spending into the data center environment where long-standing relationships, turn-key solutions, and stellar customer support translate to premium prices. The cloud is a 180-degree turn, taking spending away from the enterprise data center and moving investment to large-scale on-line hosts with the expertise and buying power to get the lowest possible prices on bare-bones servers, storage, and networking gear, then adding their own value with internally developed software infrastructure typically based on open-source programs. As virtualization peaks, and enterprises shift to the cloud, vendors that thrived in the corporate data center – e.g. VMware, EMC, Cisco, HP, Dell, IBM, Oracle, etc. – may find themselves structurally uncompetitive with the lowest cost alternatives. This may take years to play out, but will be an evident threat within 3-5 years
- The cloud is good for web-savvy software, IT consultants, carrier-class gear, and hosts with scale and skills. We believe that cloud infrastructure will become the primary growth driver for the IT market. As this architecture begins to supersede traditional enterprise data centers, companies offering software-as-a-service (SaaS) that exploit the advantages of the cloud and minimize enterprise concerns will take significant market share from software vendors that focus too intently on evolving their data center based products. Salesforce.com has been a pioneer in this regard. We also believe that the move to the cloud will create opportunity for IT consultants such as Accenture and IBM. As cloud-hosted applications add to the already prodigious Internet traffic growth, we anticipate ongoing investment in carrier class optical and IP networking gear, a boon for vendors like Juniper, Ciena, F5 and others. Finally, we believe web hosting will reward operators with substantial scale and expertise in software development, such as Google, Amazon, Microsoft and IBM
First a Recession, and Now This?
2011 was expected to be a major rebound year for enterprise IT spending, with the aftermath of the financial crisis in the rear view mirror and strong corporate profits in hand. Bullish observers laid out prospects for a robust PC upgrade cycle, even while CIOs invested to pull applications into data centers, rapidly deploying virtualization technology to smooth the provision of new services, increase IT department control and cut future costs via improved efficiency. Then, something happened.
In its 1QFY11 report in November, networking bellwether Cisco Systems lowered guidance on its concerns for Government IT spending, prompting a 20%+ drop in the company’s stock. While other IT vendors have shown less concern, we note that Cisco’s vaunted internal information systems and emphasis on market forecasting have earned the company a reputation for prescience. Looking into Cisco’s uncharacteristically nervous projections, we see ample reasons for caution. From 2000 to 2010, global spending by governments on IT rose from just under 12% of total IT spending to more than 17.5% of the total, rising at a nearly 9.6% CAGR, while all other enterprise IT spending grew just 3.6% (Exhibit 1). Importantly, government spending remained strong during the economic downturns at the beginning and end of the decade, helping to blunt the impact of weak technology investment in the private sector. It seems almost certain that this run up in government technology investment will now reverse.
Europe was the first shoe to drop, and was called out directly by Cisco in its November conference call. Much of the European Union is facing austerity, not just the well documented trouble cases of Greece and Ireland, but big countries like Spain and Italy, and even leaders like the UK and France. In the US, the mid-term elections brought the issue of deficits and long-term obligations under scrutiny, catalyzing urgent moves to cut budgets not just by the Federal Government, but by many states and municipalities with newfound fear of insolvency. The Middle East is the setting for three major armed conflicts amongst passionate grass roots uprisings and unprecedented sectarian tensions across the region. Even the BRIC nexus of growth economies has seen average government spending deficits of more than 4% annually (Exhibit 2).
These realities are not conducive to continued growth in government IT spending. While many government entities have not yet finalized their budgets for 2011, the scale of overall cuts under discussion and the relative political ease with which IT spending can be cut vs. politically sensitive entitlements portend a sizeable reduction. Already the US DoD has advocated cutting $58 million in funding for IT at the Defense Advanced Research Projects Agency (DARPA) – the agency that has been credited with the invention of the Internet. The UK government has already identified £1B in IT contracts that it plans to stop or reduce in scope, with more savings expected as it completes its review.
We are skeptical of recent market research that has downplayed the impact of government spending pressures on IT. While it may be true that IT management within government organizations have not yet felt pressure to reduce spending, we believe that they are unlikely to see the cuts coming until they happen. Ahead of the IT spending crash of 2001, technology customers remained defiantly bullish until cuts were forced upon them from above and long-standing buying commitments were broken. Government budget cuts will come from above and we do not believe that they are well accounted for in most 2011 IT spending forecasts.
The Japanese market was 10.8% of global enterprise IT spending in 2010, and was projected by Gartner to be flat, at best, for 2011 (Exhibit 3). Given the gravity of the recent earthquake, tsunami and nuclear crisis, we view flat spending as very optimistic. As the impact of the tragedy plays out, we believe IT spending projects will give way to the more basic needs of recovery. Until a Japanese economic recovery is firmly underway, we expect a sharp curtailment of discretionary IT spending. We do not believe that expectations for enterprise IT companies yet reflect this reality.
Together with government IT spending, Japan puts more than 25% of enterprise IT spending at risk through at least year end. We believe this is a sufficient overhang to the entire sector to call into question expectations for near term growth. As of January, Gartner had been projecting 4.5% growth for Japanese IT spending in 2011, and 3.6% growth for Government. Both of these figures appear very aggressive.
Beyond the Gloom
Meanwhile, the remaining 75% of the enterprise IT market is likely to accelerate from the 3.4% growth of 2010, as corporations cope with explosive growth in data and explore the opportunities afforded by emerging new technologies (Exhibit 3). Users are demanding smartphones and tablets, while faster and more ubiquitous networks allow them to access enterprise applications from almost anywhere. Meanwhile, cloud computing has moved past the vague conceptual phase toward becoming a real architecture with dramatic potential benefits. However, first things first, and for most IT departments the priority for 2011 is virtualization.
Virtualization is Real
The concept of virtualization dates back to the dawn of the mainframe. Simply, virtualization inserts a layer of software on top of a physical server or storage system that allows that resource to be shared by various users and applications while keeping the activity of each distinct usage wholly separate. For example, a user pulls up an application on a desktop computer. That application is actually running on a “virtual machine” managed by the virtualization software layer and running on top of a data center server. To the user, it looks and feels like it is on the desktop PC, but by running it on the server, the software can be more quickly deployed, better maintained, more easily upgraded, more efficiently run, and at lower costs to boot (Exhibit 4).
There are several types of virtualization that can be broadly classified in four categories: operating system, server, storage, and application. The most widely visible is operating system virtualization, which essentially is the concept of running more than one virtual machine on a single physical computer. A virtual machine is an independent instance of an operating system with one or more applications running that uses the local resources of the host machine. An operating system cannot distinguish between a virtual or physical machine nor can other applications or computers on a network. This allows for multiple applications within separate virtual machines to run on a single physical computer. A familiar example is enabling a Mac OS X computer to run a Windows application like MS Access.
On servers, virtualization can enable the efficient utilization of resources. As users come onto a network, servers can be powered up and resources managed for optimal performance; conversely, during off-peak times, servers can idle. Also, when a server or disk is being serviced or upgraded, resources can be shifted in the background, allowing an application to continue running uninterrupted. Storage virtualization is similar: a Storage Area Network (SAN) is a distributed storage network that appears to users and applications as a single physical device, so a file need not reside on a local user’s computer. Application virtualization follows the same principle. You may be running MS Office on your machine even though the application suite is not installed locally; the virtualized application communicates with the local OS, middleware, plugins, and other applications, and your system provides the processing power and RAM to run it, but nothing is stored locally.
VMware pioneered virtualization as a commercial product in 2001 and faced no competition until Microsoft entered the market with Virtual Server in 2005. Virtualization has only recently become a mainstream IT product, used by over 60% of enterprises according to a 2010 Forrester survey. Gartner maintains that over 80% of enterprises have some virtualization program or project as of early 2011 and that 49% of x86-architecture workloads are run in virtual machines (Exhibit 5). Growth of virtualization has been explosive, with VMware and Microsoft benefitting (Exhibit 6).
Double Rainbow! What Does It Mean!?
The primary position of virtualization in enterprise IT has far reaching implications for technology vendors. First, it shifts processing demand from ever faster desktop devices to ever larger servers. It shifts storage from local hard drives to shared storage systems. It means software is deployed centrally rather than in individual copies on each PC. It also means that the architecture of the data center becomes that much more complicated.
This is good for most of the big players in data center IT. Enterprise IT managers need help with the transition to virtualization, so they contract for consulting services from the likes of IBM and Accenture. Enterprises buy turn-key virtualization solutions, favoring market leaders like VMware, EMC, Brocade, NetApp, IBM, HP, Citrix and Cisco. Virtualization supports seamless migration of most enterprise software, so major software vendors like Microsoft and Oracle have opportunity to lever their products to take advantage of the new architecture. On the flip side, virtualization reduces the urgency to upgrade PCs, so Microsoft, Dell, HP and others may take a hit, particularly as smartphones and tablets take hold in the enterprise market.
Meanwhile, Back in the Cloud …
While enterprises move applications onto virtual machines in the data center, consumers have embraced the cloud, which we will define as applications served from publicly accessible commercial data centers connected directly to the Internet. This is in contrast to traditional applications which are stored and run directly on user devices – note that most “apps” downloaded to smartphones and tablets are merely access shortcuts to cloud-based services which rely on internet data centers for processing and storage. This is also in contrast to “private clouds” operated by enterprises from their own data centers and accessible only by authorized users (Exhibit 7).
Cloud-based applications like search, social networking, and streaming media have become the focus of the consumer computing experience, and in response, leaders like Google, Amazon, Facebook and Apple have ramped annual IT investments into the billions of dollars, while content delivery networks like Akamai, Level 3 and Limelight have built out their own data center assets to provide cloud service for consumer cloud application providers such as Netflix (Exhibit 8). The rise of the smartphone and tablet, abetted by the introduction of 4G wireless broadband, is quickly expanding the opportunities for users to access their apps, and users are responding. As such, the strong growth in IT spending by this segment can be expected to continue.
And Back to the Enterprise …
From many perspectives, virtualization is a stepping stone for enterprises on the way to the cloud. The underlying technologies of virtualization make “private cloud” networks possible, where computing capabilities are pooled through an in-house data center and PCs are nothing more than dumb terminals accessing enterprise applications via their browsers or customized “apps”, as they would a consumer service like Facebook. This approach standardizes access for users and allows organizations to replace aging PCs with inexpensive tablets or netbooks rather than pricey upgraded PCs.
The next step is the public cloud, which, at a minimum, involves moving computing and storage out of internally operated data centers and into commercially operated internet accessible data centers. The least aggressive move is closely akin to traditional data center outsourcing – an organization’s software platforms and applications are ported, as is, directly onto a 3rd party host’s infrastructure, and the host takes responsibility for server, storage and network access capacity. A more aggressive approach would be to just move the applications, and rely on a cloud host for basic software platforms – i.e. operating systems, database engines, etc. The most aggressive approach would be to rely on the cloud provider for applications as well, a la Salesforce.com. In all three of these levels – infrastructure only, platform and applications – the cloud capability and software is typically paid for as an on-going service, rather than a one-time purchase.
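The division of responsibility across these three levels can be sketched as follows. This is an illustrative outline only; the layer names and the `managed_by_enterprise` helper are our own simplifications, not an industry-standard taxonomy.

```python
# Illustrative mapping of the three public-cloud outsourcing levels
# described above to the layers the cloud host manages in each case.
# Layer names are simplified for illustration.

LAYERS = ["hardware", "os_and_platform", "applications"]

# Which layers the host takes responsibility for under each model
SERVICE_MODELS = {
    "infrastructure_only": {"hardware"},
    "platform":            {"hardware", "os_and_platform"},
    "applications":        {"hardware", "os_and_platform", "applications"},
}

def managed_by_enterprise(model):
    """Layers the enterprise still manages itself under a given model."""
    return [layer for layer in LAYERS if layer not in SERVICE_MODELS[model]]

print(managed_by_enterprise("platform"))       # ['applications']
print(managed_by_enterprise("applications"))   # []
```

Under the "platform" model the enterprise keeps only its own applications; under the most aggressive model, a la Salesforce.com, the host runs everything.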
Look How Much You’ll Save!
The economics of shifting to public clouds are compelling. The Association for Computing Machinery estimated the cost of traditional storage for a small to medium sized business to be in the range of $3.75 per GB per month. Amazon offers storage below $0.15 per GB per month. In-house processing has been estimated at $2.50-3.50 per CPU hour, while Microsoft offers the same level of service for $0.12 per CPU hour. In-house enterprise computing is more expensive by a factor of 20 or more (Exhibit 9).
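The back-of-the-envelope arithmetic behind the "factor of 20" can be made explicit. The figures below are simply the estimates cited above (the ACM storage estimate, Amazon's storage pricing, and the low end of the per-CPU-hour range); they are illustrative, not current price quotes.

```python
# Compare the in-house vs. cloud cost figures cited above.

def cost_ratio(in_house, cloud):
    """How many times more expensive in-house provisioning is."""
    return in_house / cloud

storage_ratio = cost_ratio(3.75, 0.15)  # $/GB/month: ACM estimate vs. Amazon
compute_ratio = cost_ratio(2.50, 0.12)  # $/CPU-hour: low-end estimate vs. Microsoft

print(f"Storage: in-house is {storage_ratio:.0f}x cloud cost")
print(f"Compute: in-house is {compute_ratio:.0f}x cloud cost")
```

Both ratios land in the 20-25x range, consistent with the factor-of-20 claim.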
What makes these economics possible? Most enterprises operate their server hardware at about 5-15% load capacity, with plenty of redundancies to ensure high service levels. With the aid of virtualization technologies and economies of scale, cloud providers can often match these service levels more efficiently, adjusting resources according to demand and operating hardware at greater load capacities. Since applications and data are not tied to specific physical devices such as servers and storage units, servers can be powered up or down depending on demand. A client can expect to receive the services agreed upon in a Service Level Agreement (SLA).
Beyond economics, shifting to the public cloud also yields operational advantages. Software upgrades, administered by the cloud host, happen more quickly and with less user disruption. Capacity can expand and contract with usage patterns – seasonal and event driven peaks and troughs can be anticipated and accommodated economically. Within limits, cloud hosts can typically provision additional service capacity on demand for customers. Users gain access to enterprise applications from anywhere that there is Internet, typically with faster response times and lower latency than if they were served from an internally operated data center.
Seems Too Good to Be True
While the economic case is fairly compelling, there are real obstacles to public cloud deployment. First, many organizations fear that moving their mission critical computing capability to a commercial host is a significant security risk. While we see this objection as far bigger in concept than in reality, most enterprises will exercise significant caution until a track record of airtight security can be established. As such, applications deemed vulnerable to breach will be amongst the last to move. Second, the benefits of moving to a public cloud scale with the breadth of the user base and the need to accommodate software changes. Stable applications, or highly customized idiosyncratic applications used by a few users concentrated in a few locations, will see far less cost and operational upside than average. Third, moving applications from private data centers to public cloud hosts entails significant transition costs.
Considering the universe of enterprise applications along the axes of degree of customization and the breadth of required access, certain categories stand out as more apt candidates for the cloud than others (Exhibit 10). For example, office productivity software and sales force management software must address a broad and mobile audience, and are typically suited to a fairly standardized approach. Not surprisingly, these areas are at the forefront of the move to the public cloud. In contrast, manufacturing and operations systems are typically highly customized, accessed on a need-to-know basis, and considered mission critical. Such systems may never be deemed appropriate for the cloud by many organizations. Given all of this, we believe the enterprise shift to the public cloud will be gradual, with in-house virtualization remaining a bigger driver of spending until mid-decade. Of the $89B spent in 2010 on Enterprise Application Software, $9.2B was attributed to software delivered via the cloud, and that segment is expected to grow at a 15.8% CAGR through 2014.
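The two-axis framework above can be sketched as a simple scoring exercise: lower customization and broader required access make an application a better cloud candidate. The scoring function and the per-application scores below are hypothetical illustrations of the framework, not figures from Exhibit 10.

```python
# Toy scoring of cloud suitability along the two axes discussed above.
# Inputs are on a 0-1 scale; weights and scores are hypothetical.

def cloud_suitability(customization, access_breadth):
    """Higher result = better candidate for the public cloud."""
    return (1.0 - customization) * 0.5 + access_breadth * 0.5

apps = {
    "office productivity": (0.1, 0.9),  # standardized, broadly accessed
    "sales force mgmt":    (0.2, 0.9),  # standardized, mobile audience
    "manufacturing ops":   (0.9, 0.2),  # highly customized, narrow access
}

ranked = sorted(apps, key=lambda a: cloud_suitability(*apps[a]), reverse=True)
print(ranked)  # office productivity and CRM rank ahead of manufacturing
```

Under any reasonable weighting, the standardized broad-access categories sort to the top, matching the pattern of early cloud adoption described above.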
Dark Clouds on the Horizon
The shift to in-house virtualization is great for high end, value added, data center information technology vendors. Enterprise CIOs typically do not have the resources and expertise internally to define and implement their own virtualization strategy, and thus, rely on turn-key, integrated solutions from trusted suppliers. Not so for public cloud hosts.
Google, Amazon and their ilk do not buy turn-key, integrated solutions. These companies differentiate themselves on their own IT expertise, and typically, develop their own, proprietary, value-added software to deliver and manage their cloud-based services. As a result, cloud host data centers typically feature commodity hardware – cheap, undifferentiated rack servers and disk storage arrays – and free, open-source software platforms customized in-house. Once enterprises begin in earnest to shift their IT usage to the cloud, this will have a dramatic and, seemingly, unanticipated effect on the demand for servers, storage systems, software platforms, and networking equipment.
We believe demand for value-added, integrated solutions will peak and wane, with demand growth coming from the most commoditized product categories. Vendors that thrived in the corporate data center – e.g. VMware, EMC, Cisco, HP, Dell, IBM, Oracle, etc. – may find themselves structurally uncompetitive with the lowest cost alternatives (Exhibit 11).
The Silver Lining
Obviously, the cloud is a significant opportunity for those that would be cloud hosts. Given the asset intensive nature of the business, fortune should favor the bold and the big. Google, Amazon and Microsoft have already built substantial networks of cloud-connected data centers, initially to serve their own consumer internet businesses, but poised to serve the enterprise market as well. This head start will make them formidable competitors with scale and expertise advantages.
The cloud is also an opportunity for software vendors that can exploit the advantages of the cloud – e.g. ubiquitous access, superior speed, facile collaboration, lower costs, etc. – while addressing enterprise concerns for security and control. This transition opens the door for challengers to wrest swaths of market share from traditional software vendors that are slow or ineffective in addressing the cloud architecture. Already Salesforce.com has co-opted the fast growing field of customer relationship management with its purpose-built-for-the-cloud software-as-a-service solutions. Google Docs has established a foothold in office productivity suites – a business Microsoft has dominated even more thoroughly than desktop operating systems.
We note that the push to data center virtualization and then on to outsourcing via cloud providers is an intricate transition strewn with risks seen and unseen, and requiring detailed planning and sure execution. We also note that few enterprise IT organizations are able to maintain the skills necessary to manage such an exodus on their own, and that CIOs will be distrustful of giving a cloud hosting partner unchecked power. As such, we see a major opportunity for IT consulting players to facilitate these changes and to counterbalance the power of the cloud host. Here, the obvious players are IBM and Accenture, amongst others.
Finally, a transition of enterprise applications to the Internet will place considerable demands on carrier infrastructure to accommodate the precise requirements of mission critical applications. Combining this with the overwhelming volumes generated by on-line video suggests long-term demand growth for carrier-class networking gear. Optical suppliers, such as Ciena, Alcatel-Lucent, and others, and their component suppliers, such as JDS-Uniphase, Avago, and Finisar, etc., are in disfavor now, but will have the benefit of secular demand to offset the inherent lumpiness in their businesses. Networking suppliers – e.g. Cisco, Juniper, F5, Riverbed, etc. – will also benefit from the long term trends.