Cloud Data Centers: Bigger, Faster, Cheaper!



Data processing architecture has swung like a pendulum, from the centralized mainframe era, to decentralized PCs, and now back to the cloud.  Modern cloud data centers feature distributed architectures that share computing and storage resources across widely dispersed locations at nearly limitless scale, delivering exceptional performance at extraordinarily low cost.  These considerable advantages stem from four major factors: Scale economies in purchasing technology and hiring talent yield cost and skill advantages.  Scope puts servers much closer to users, improving response time and cutting communications costs.  Innovative data center design can dramatically reduce capital and operating costs.  Finally, software innovation pioneered by Google, and echoed in the open source Hadoop project, allows applications to run on processor arrays and data sets of nearly limitless size.  Against these advantages, enterprises will weigh transition costs, slowly receding security concerns and application-specific idiosyncrasies to determine which applications should shift to the cloud and how quickly.  As the change proceeds, private data center investment will wane and traditional IT will commoditize, with the leading cloud providers, IT consultants and their customers the big winners.

Data processing has shifted from the decentralization of the PC era, back to centralized data centers, and ultimately, the cloud.  In the mainframe era, computing was expensive, scarce and rationed by central IT departments.  Minicomputers reduced cost and complexity, making it reasonable to dedicate computing to departmental groups, a trend punctuated by the PC ethos of pushing processing right to the desktop.  Improving networks started the pendulum back with the rise of client/server approaches that augmented local processing with shared resources, a concept extended by virtualization, which gives users a private slice of a shared data center.  Cloud computing completes the swing, allowing minimally capable access devices to harness the nearly limitless processing and storage available on the web.

Key innovations – e.g. ubiquitous access, distributed architecture, MapReduce and portable platforms – mitigate the disadvantages and amplify the advantages of centralization.  Cloud processing is only possible due to innovations that have emerged over the past decade.  Fast, ubiquitous internet access is now a given; distributed data center architecture puts servers close by; MapReduce-style processing keeps users and applications distinct while parceling demands out in parallel to processors and storage; and portable platforms are cheap and easy to use.  The access, performance and ease-of-use burdens of centralized processing are largely eliminated, while the cost and control benefits are magnified.

Cloud hosts have substantial scale advantages in constructing data centers, connecting them and building world-class expertise in running them.  The advantages of cloud architecture start with economies of scale.  The top cloud hosts have become the largest IT buyers in the world, leveraging their own data center needs as well as those of commercial customers, and as such can exact vendor concessions on hardware, software and communications unavailable to most enterprises.  These companies can also justify building world-class IT design and management expertise.

Distributed data centers put servers closer to users, improving application response times and reducing communications costs.  The distance between a user and a server has a dramatic impact on performance – each additional router hop adds latency and compounds the chance of error – favoring the distributed data center architecture exemplified by the top cloud players.  Proximity also reduces communications costs and facilitates increasingly mobile workers.
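The proximity argument can be made concrete with a back-of-the-envelope calculation: round-trip time is roughly propagation delay in fiber plus a per-hop router delay.  The figures below (fiber speed, per-hop delay, distances, hop counts) are illustrative assumptions, not measurements.

```python
# Rough round-trip latency estimate: fiber propagation plus per-hop delay.
# All constants here are illustrative assumptions for the sketch.

FIBER_KM_PER_MS = 200.0   # light travels roughly 200 km per ms in optical fiber
PER_HOP_MS = 0.5          # assumed processing/queuing delay at each router hop

def round_trip_ms(distance_km, hops):
    """Estimate round-trip time for a request traversing `hops` routers."""
    one_way = distance_km / FIBER_KM_PER_MS + hops * PER_HOP_MS
    return 2 * one_way

# A nearby server in a distributed data center vs. a distant centralized one:
near = round_trip_ms(distance_km=100, hops=4)     # -> 5.0 ms
far = round_trip_ms(distance_km=4000, hops=15)    # -> 55.0 ms
print(near, far)
```

Under these assumptions the nearby server responds an order of magnitude faster, which is the intuition behind distributing data centers toward users.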

Leading edge data center designs cut power needs and costs dramatically, while minimizing equipment and facilities spending.  With technology costs falling, electricity has become a major expense for data center operators.  The top cloud players, such as Google and Microsoft, are achieving power efficiency levels that private data centers cannot approach.  These designs also employ commodity hardware and open source software customized to proprietary specs – low cost technology choices unavailable to smaller and less sophisticated buyers – deployed to maximum efficiency.
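The standard yardstick for this efficiency is Power Usage Effectiveness (PUE): total facility power divided by the power delivered to IT equipment, so a PUE of 1.0 would mean every watt goes to computing.  The comparison below is a sketch with hypothetical power figures, not reported numbers for any operator.

```python
# Power Usage Effectiveness: total facility power / IT equipment power.
# The kW figures below are hypothetical, chosen only to illustrate the ratio.

def pue(total_facility_kw, it_equipment_kw):
    """Ratio of all power drawn (cooling, lighting, losses, IT) to IT power."""
    return total_facility_kw / it_equipment_kw

# Hypothetical enterprise data center: half the power goes to overhead.
enterprise = pue(total_facility_kw=2000, it_equipment_kw=1000)   # -> 2.0

# Hypothetical leading-edge cloud facility: modest overhead.
cloud = pue(total_facility_kw=1200, it_equipment_kw=1000)        # -> 1.2

print(enterprise, cloud)
```

In this sketch the enterprise facility burns a full watt of overhead for every watt of computing, while the cloud facility spends only a fraction – the kind of gap that makes electricity a decisive cost advantage at scale.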

Software innovations pioneered by Google and inherent in open source solutions make cloud data centers limitlessly scalable and unusually efficient.  Google’s search franchise stems from MapReduce, a revolutionary way of breaking application tasks into small pieces allocated evenly across a huge array of processors sharing an equally huge bank of storage.  Thus, IT resources can be used as efficiently as possible, scaling from a single core to many thousands as needed.  This functionality is mimicked in the open source standard Hadoop, which forms the basis for most cloud processing architectures.  This technology amplifies the benefits of scale and scope for the cloud relative to smaller data centers.
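The idea is easy to see in miniature with word counting, the canonical MapReduce illustration: mappers process independent chunks of input in parallel, and reducers merge the partial results.  The sketch below stands plain function calls in for what Hadoop would distribute across cluster nodes; the function names are our own, not Hadoop’s API.

```python
# Toy MapReduce-style word count. In a real cluster (e.g. Hadoop), each map
# and reduce step would run on a separate node; here they are plain functions.
from collections import Counter
from functools import reduce

def map_phase(chunk):
    """Mapper: emit partial word counts for one independent chunk of text."""
    return Counter(chunk.split())

def reduce_phase(counts_a, counts_b):
    """Reducer: merge two partial counts into one."""
    return counts_a + counts_b

def word_count(chunks):
    # Each chunk is mapped independently (the parallelizable part),
    # then the partial results are folded together.
    partials = [map_phase(c) for c in chunks]
    return reduce(reduce_phase, partials, Counter())

chunks = ["the cloud scales", "the cloud is cheap", "scales without limit"]
print(word_count(chunks)["cloud"])   # -> 2
```

Because each chunk is processed independently, the same logic runs unchanged whether the work is split across one core or thousands – which is precisely the scalability property the text describes.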

Migration of enterprise apps to the cloud will be slow but substantial, proceeding as transition costs and security concerns are addressed.  While cloud providers can offer dramatic cost and performance improvements vs. private data centers, transition costs can be significant, potentially requiring changes to existing applications and/or infrastructure software.  Cloud providers are addressing the security issues that have been the biggest barrier to adoption, but many enterprises will require a longer track record before making the jump.

The biggest and most technically sophisticated hosts will dominate, commoditizing traditional IT, but providing opportunities for IT consultants.  Companies with big, consumer-driven web franchises – namely Google, Amazon, Facebook and Microsoft – start the cloud hosting game with huge scale and scope advantages in enterprise hosting.  Most hardware vendors will be commoditized as demand shifts from private to cloud data centers, while many software vendors struggle to transition, even as the cloud opens cheaper and better alternatives for their customers.  The exceptions will be component hardware (e.g. disk drives and processors), SaaS software, and IT consulting.

For our full research notes, please visit our published research site.
