Rackspace versus Amazon: The big data edition

By Derrick Harris

Rackspace is busy building a Hadoop service, giving the company one more avenue to compete with cloud kingpin Amazon Web Services. However, the two services — along with several others on the market — highlight just how different seemingly similar cloud services can be.
Rackspace has been on a tear over the past few months, releasing new features that map closely to the core features of the Amazon Web Services platform, only with a Rackspace flavor that favors service over scale. Its next target is Amazon Elastic MapReduce, which Rackspace will counter with its own Hadoop service in 2013. If AWS and Rackspace are, indeed, the No. 1 and No. 2 cloud computing providers around, it might be easy enough to make a decision between the two platforms.
In the cloud, however, the choices are never as simple as black or white.

Amazon versus Rackspace is a matter of control

Discussing its forthcoming Hadoop service during a phone call on Friday, Rackspace CTO John Engates highlighted the fundamental product-level differences between his company and its biggest competitor, AWS. Right now, for users, it’s primarily a question of how much control they want over the systems they’re renting — and Rackspace comes down firmly on the side of maximum control.

For Hadoop specifically, Engates said Rackspace’s service will “really put [users] in the driver’s seat in terms of how they’re running it” by giving them granular control over how their systems are configured and how their jobs run (courtesy of the OpenStack APIs, of course). Rackspace is even working on optimizing a portion of its cloud so the Hadoop service will run on servers, storage and networking gear designed specifically for big data workloads. Essentially, Engates added, Rackspace wants to give users the experience of owning a Hadoop cluster without actually owning any of the hardware.
“It’s not MapReduce as a service,” he added, “it’s more Hadoop as a service.”
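Rackspace hasn't published an API for the service yet, but the OpenStack compute API it builds on is public. As a rough sketch of the kind of programmatic control Engates describes, assuming the python-novaclient library and invented image and flavor names, booting the raw nodes of a cluster might look like this:

```python
# A minimal sketch using python-novaclient; the image and flavor
# names are invented, and Rackspace's actual Hadoop API may differ.
from novaclient.v1_1 import client

nova = client.Client("myuser", "mypassword", "mytenant",
                     "https://identity.example.com/v2.0")

# A hypothetical big-data-optimized flavor and a base Hadoop image.
flavor = nova.flavors.find(name="bigdata.8core")
image = nova.images.find(name="hadoop-node-base")

# Boot a small set of worker nodes, one API call per server.
workers = [nova.servers.create("hadoop-worker-%d" % i, image, flavor)
           for i in range(4)]
```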
The company partnered with Yahoo spinoff Hortonworks on this in part because of its expertise and in part because its open source vision for Hadoop aligns closely with Rackspace’s vision around OpenStack. “The guys at Hortonworks are really committed to the real open source flavor of Hadoop,” Engates said.
Rackspace’s forthcoming Hadoop service appears to contrast somewhat with Amazon’s three-year-old and generally well-received Elastic MapReduce service. The latter lets users write their own MapReduce jobs and choose the number and types of servers they want, but doesn’t give users system-level control on par with what Rackspace seems to be planning. For the most part, it comports with AWS’s tried-and-true strategy of giving users some control of their underlying resources, but generally trying to offload as much of the operational burden as possible.
Elastic MapReduce also isn’t open source, but is an Amazon-specific service designed around Amazon’s existing S3 storage system and other AWS features. When AWS did choose to offer a version of Elastic MapReduce running a commercial Hadoop distribution, it chose MapR’s high-performance but partially proprietary flavor of Hadoop.
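Elastic MapReduce's programmatic surface, at least, is already public. A minimal sketch using the boto Python library (the S3 bucket names are placeholders) illustrates the knobs EMR exposes: the job definition plus the number and type of instances, but not system-level configuration.

```python
# A minimal sketch of boto's Elastic MapReduce API; the S3 bucket
# names are placeholders.
from boto.emr.connection import EmrConnection
from boto.emr.step import StreamingStep

conn = EmrConnection("<aws_access_key>", "<aws_secret_key>")

# A streaming step: mapper script plus S3 input and output paths.
step = StreamingStep(
    name="word count",
    mapper="s3n://example-bucket/wordcount/mapper.py",
    reducer="aggregate",
    input="s3n://example-bucket/input",
    output="s3n://example-bucket/output")

# Users pick how many servers and which type, but not how the
# underlying systems are configured.
jobid = conn.run_jobflow(
    name="my job flow",
    log_uri="s3n://example-bucket/logs",
    steps=[step],
    num_instances=4,
    master_instance_type="m1.small",
    slave_instance_type="m1.small")
```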

It doesn’t stop with Hadoop

Rackspace is also considering getting into the NoSQL space, perhaps with hosted versions of the open source Cassandra and MongoDB databases, and here too it likely will take a different tack than AWS. For one, Rackspace still has a dedicated hosting business to tie into, where some customers still run EMC storage area networks and NetApp network-attached storage arrays. That means Rackspace can’t afford to lock users into a custom-built service that doesn’t take their existing infrastructure into account or that favors raw performance over enterprise-class features.
Rackspace needs stuff that’s “open, readily available and not unique to us,” Engates said. Pointing specifically to AWS’s fully managed and internally developed DynamoDB service, he suggested, “I don’t think it’s in the fairway for most customers that are using Amazon today.”
Perhaps, but early DynamoDB success stories such as IMDb, SmugMug and Tapjoy suggest the service isn’t without an audience willing to pay for its promise of a high-performance, low-touch NoSQL data store.

Which is better? Maybe neither

There’s plenty of room for debate over whose approach is better, but the answer for many would-be customers might well be neither. When it comes to hosted Hadoop services, both Rackspace and Amazon have to contend with Microsoft’s newly available HDInsight service on its Windows Azure platform, as well as IBM’s BigInsights service on its SmartCloud platform. Google appears to have something cooking in the Hadoop department, as well. For developers who think all these infrastructure-level services are too much work, higher-level services such as Qubole, Infochimps or Mortar Data might look more appealing.
The NoSQL space is rife with cloud services, too, primarily focused on MongoDB but also including hosted Cassandra and CouchDB-based services.
In order to stand apart from the big data crowd, Engates said Rackspace is going to stick with its company-wide strategy of differentiation through user support. Thanks to its partnership with Hortonworks and the hybrid nature of OpenStack, for example, Rackspace is already helping customers deploy Hadoop in their private cloud environments while its public cloud service is still in the works. “We want to go where the complexity is,” he said, “where the customers value our [support] and expertise.”

Gartner sees cloud computing, mobile development putting IT on edge

By Jack Vaughan
ORLANDO, Fla. -- A variety of mostly consumer-driven forces challenge enterprise IT and application development today. Cloud computing, Web information, mobile devices and social media innovations are converging to dramatically change modern organizations and their central IT shops. Industry analyst group Gartner Inc. describes these forces as forming a nexus, a connected group of phenomena, which is highly disruptive.

As in earlier shifts to client-server computing and Web-based e-commerce, the mission of IT is likely to come under serious scrutiny. How IT should adjust was a prime topic at this week's Gartner ITxpo conference held in Orlando, Fla.

"A number of things are funneling together to cause major disruption to your environment," David Cearley, vice president and Gartner fellow, told CIOs and other technology leaders among the more than 10,000 assembled at the Orlando conference.

"Mobile has outweighed impact in terms of disruption. The mobile experience is eclipsing the desktop experience," said Cearley, who cited the large number of mobile device types as a challenge to IT shops that had over the years achieved a level of uniformity via a few browsers and PC types. "You need to prepare for heterogeneity," he told the audience.

Crucially, mobile devices coupled with cloud computing can change the basic architecture of modern corporate computing.

"You need to think about 'cloud-client' architecture, instead of client-server," Cearley said. "The cloud becomes your control point for what lives down in the client. Complex applications won't necessarily work in these mobile [settings]."

This could mean significant changes in skill sets required for enterprise application development. The cloud-client architecture will call for better design skills for front-end input. Development teams will have to make tradeoffs between use of native mobile device OSes and HTML5 Web browser alternatives, according to Cearley. The fact that these devices are brought into the organization by end users acting as consumers also is a factor.

"The user experience is changing. Users have new expectations," he said. For application architects and developers, he said, there are "new design skills required around how applications communicate and work with each other."

The consequences are complex. For example, software development and testing are increasingly not simply about whether software works, according to one person on hand at Gartner ITxpo. "We get great pressures to improve the quality of service we provide. It's not just the software, it's how the customer interacts with the software," said Dave Miller, vice president of software quality assurance at FedEx Services.

Cloud computing and mobile apps on the horizon

Shifts in software architecture, such as those seen today in cloud and mobile applications, have precedent, said Miko Matsumura, senior vice president of platform marketing and developer relations at San Mateo, Calif.-based mobile services firm Kii Inc. In the past, there have been "impedance mismatches," for example, in the match between object software architecture and relational data architecture, he noted.

Mobile development is forcing conventional architecture to evolve. "What happens is you see the breakdown of the programming model," said Matsumura, a longtime SOA thought leader whose firm markets a type of mobile back end as a service offering. "Now we have important new distinctions. A whole class of developers treat mobile from the perspective of cloud."

"Now, if you are a mobile developer, you don't need to think about the cloud differently than you think of your mobile client," he said. With "client cloud," he continued, "it's not a different programming model, programming language or programming platform." That is a different situation than we find in most development shops today, where Java or .NET teams and JavaScript teams work with different models, languages and platforms.

The changes in application development and application lifecycles are telling, according to Gartner's Jim Duggan, vice president for research. "Mobile devices and cloud -- a lot of these things challenge conventional development," he said. He said Gartner estimates that by 2015 mobile applications will outnumber those for static deployment by four to one. This will mean IT will have to spend more on developer training, and will likely do more outsourcing as well.

"Deciding which [are the skills that] you need to have internally is going to change the way your shop evolves," he said.

Look out; the chief digital officer is coming

Underlying the tsunami of disruptive consumerist forces Gartner sees is a wholesale digitization of commerce. At the conference Gartner analysts said such digitization of business segments will lead to a day when every budget becomes an IT budget. Firms will begin to create the role of chief digital officer as part of the business unit leadership, Gartner boldly predicted, estimating 25% of organizations will have chief digital officers in place by 2015.

The chief digital officer title may grow, according to Kii's Matsumura. Often today, marketing departments are controlling the mobile and social media agendas within companies, he said.

"We are seeing a convergence point between the human side of the business -- in the form of marketing -- and the 'machine-side,' in the form of IT," he said.

What It Will Take to Sell Cloud to a Skeptical Business

The other week, I covered a report that asserts Fortune 500 CIOs are seeing mainly positive results from their cloud computing efforts. The CIOs seem happy, but that doesn’t mean others in the organization share that enthusiasm.
Andi Mann, vice president for strategic solutions at CA Technologies, for one, questions whether CIOs are the right individuals to talk to about the success of cloud computing. And he takes the original Navint Partners report – which was based on qualitative interviews with 20 CIOs – to task.
“Twenty CIOs are happy, so far, with cloud computing. But what about their bosses, their employees, their organization’s partners and their customers?” he asks. “Are they happy?”
In order to understand the business success of cloud computing or any innovative technology, Mann points out, “we need to (a) talk to more than just CIOs, (b) be certain we’re asking all the right questions, and (c) hit a larger sample than just four percent of the Fortune 500.”
CA Technologies did just that, commissioning IDG Research Services to conduct a global study on the state of business innovation. “To be ‘innovative’ means your organization is capitalizing on cutting edge advancements such as cloud computing, big data, enterprise mobility, devops, the IT supply chain, and so on,” he says.
And Mann delivers this thought-provoking news: “After talking to 800 IT and business executives at organizations with more than $250 million in revenue, we didn’t find a horde of happy campers. Yes, some CIOs are happy with the cloud, and that’s a good thing. But here’s the rub. Firstly, CIOs are not the principal drivers of innovation. Users are. Second, our research found business leaders are, in many cases, unhappy with what their CIOs are doing.”
Business executives are not as excited as IT is about cloud computing, either, Mann continues. “Forty percent of IT respondents say their organization plans to invest in cloud computing over the next 12 months, versus 31% of business executives. Also, business executives give IT low marks in enabling and supporting innovation. One in two rate IT poorly in terms of having made sufficient investment in resources and spending on innovation; 48% think IT isn’t moving fast enough; 37% think IT doesn’t have the right skills; and 40% say IT isn’t receptive to new ideas when approached by the business.”
While cloud offers IT leaders the ability to keep the lights on for less, that’s only one dimension to consider, Mann says. “The crucial question to answer is if the IT organization is innovating with the cloud. Savings are one thing. Moving the organization ahead and keeping it technologically competitive is another.”
Mann provides examples of questions the business needs to be asking before a cloud approach is contemplated: “The business wants IT to explore the world of digital technologies and deliver answers. How can you align mobile and consumerization with the business interest? How do you connect with millions of people over social media? How do you find a new market and get into that market in a different geography or a different product area? How do you use technology to smooth out the bumps in an M&A activity?”
CIOs need to help the business understand that the cloud has many other business advantages beyond cost efficiencies, says Mann. “But the only way to capitalize on those advantages is for the executive leadership to allocate budget appropriately. We need to soften the focus on cost cutting, and sharpen the focus on IT empowerment and business alignment.”
To get there, he says, “CIOs need to make a case for greater investment in cloud computing, not simply to cut costs longer term, but to drive innovation and competitive advantage. Can the CIO win additional budget if she could achieve success according to real business KPIs such as return on equity, return on assets, shareholder value, revenue growth, or competitive position? If the CIO could reliably and measurably move the needle on such things, instead of just auditing costs, the ‘happiness gap’ between the business and IT could be closed.”
It’s time to increase our understanding of the broader and deeper impact of cloud on organizations. “We’ve been talking about IT-to-business alignment for generations. Yet we keep surveying the IT guys when we should be surveying the organizations they serve,” says Mann. “I’m concerned that, statistically, a significant percentage of Fortune 500 business executives might not be happy with the cloud. And that, by far, is the more important number for CIOs to measure and improve.”

Rackspace Looks To Compete With Amazon In Cloud Storage Services

Peter Suciu for redOrbit.com – Your Universe Online
Computer racks have long offered a bit of flexibility when it comes to size. Now open cloud vendor Rackspace is looking to provide the same sort of flexibility in the virtual sense for users, who previously had to buy cloud server storage based not on their needs but on what the provider assigned.
This twist is clearly aimed at the likes of Amazon Web Services Elastic Block Store, and similar efforts that assign bulk storage, even to those who have more specific storage size needs. The Rackspace Cloud Block Storage solution, which is reportedly powered by OpenStack, is intended to complement the company’s core business services.
And because it is powered by OpenStack, this flexibility won’t come with extra complexity: the service has been designed to provide a common interface that lets customers interact with the storage from their existing server applications.
The Rackspace block storage will offer the performance characteristics of either disk drives or solid-state devices. The pricing options are also competitive: the standard disk-based service is priced at $0.15 per GB per month, while the more expensive solid-state option starts at $0.70 per GB per month.
By comparison, Amazon offers a similar service at $0.10 per GB per month but adds a fee of $0.10 per million I/O operations. Rackspace has noted that it will not charge for I/O, looking instead to simplify the pricing of storage.
The I/O fee suggests these options are aimed at customers with different storage needs, with Rackspace an option for active data as opposed to Amazon’s more archive-oriented use.
“We don’t charge for I/Os. We think it makes sense to simplify the pricing,” John Engates, CTO of the San Antonio, Texas-based cloud and hosted service provider, told InformationWeek on Monday.
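The trade-off is easy to quantify. Here is a back-of-the-envelope comparison using the list prices quoted above; the workload figures are assumptions chosen to illustrate an I/O-heavy case:

```python
# Back-of-the-envelope monthly cost comparison using the list prices
# quoted above. The workload figures are assumptions for illustration.
GB_STORED = 500        # assumed volume size, in GB
IO_OPS = 2e9           # assumed I/O operations per month (I/O-heavy)

rackspace = GB_STORED * 0.15                        # flat $0.15/GB, no I/O fee
amazon = GB_STORED * 0.10 + (IO_OPS / 1e6) * 0.10   # $0.10/GB + $0.10/M I/Os

print("Rackspace standard: $%.2f" % rackspace)      # $75.00
print("Amazon EBS:         $%.2f" % amazon)         # $250.00 on this workload
```

At low I/O volumes the math flips in Amazon's favor, which is consistent with the active-data-versus-archive positioning described above.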
Rackspace could also be entering the market as pricing remains an issue. In March, Amazon Web Services and Microsoft each dropped prices on their respective cloud services. At the time it appeared that Amazon Web Services was looking for big customers, while Microsoft was looking to lure in entry-level customers and developers.
Both could see increased competition as Rackspace looks to shake up the space.
The OpenStack-based storage from Rackspace could be used to hold active application data, including database transactions, messages and even files that are accessed regularly, as opposed to the Amazon solution, which is really meant to hold infrequently accessed files and data.
Rackspace held off launching a cloud storage service as it waited for the OpenStack framework to be ready. The company, which had pioneered micro servers in the cloud, has been seen as slow to provide services adapted to the applications customers wanted to run.
With OpenStack now an option, Rackspace is looking to fill the active-storage void for small and medium-sized businesses (SMBs) and enterprises, including those that weren’t ready to deploy to the cloud before.


How Big Data and Cloud Computing Are Pushing Networks to the Brink


A lot is expected of corporate networks these days. Companies are trying to add new services and support new devices. There’s always more data that has to keep flowing, more stuff being connected to the network. And the network is expected to perform, no matter what. There are now about five billion devices connected to the Internet and billions of individual users, all expecting their networks to perform.
The folks at Juniper Networks started to wonder if the world of networking has reached some kind of fundamental inflection point. They got together with the people at Forrester Research and surveyed 150 senior IT executives to try to get a better handle on how big trends facing the enterprise, like cloud computing and big data, are affecting enterprise networks.
The findings are kind of interesting and sort of troubling. While cloud computing and software-as-a-service products such as Salesforce.com tend to save money and time by taking dedicated hardware and software out of the equation, using them puts new demands on the network: 58 percent of those surveyed said cloud services had added enough demand to their networks that they had to upgrade the networking hardware.
Cloud services tend to go hand in hand with an increased usage of mobile devices: 47 percent of businesses have seen increased demand from employees bringing their own devices to work.
The complications for networks have grown past the point where you can simply add more bandwidth and hope for the best. The survey found that 86 percent of the companies have been unable to spin up new services or support certain business demands because their networks were simply not up to the task. Another 74 percent reported that their networks had become too complex, while 35 percent said their networks had become “too rigid to manage.”
So that leaves IT organizations at a point where networks are under more demand than ever, and less able to meet those demands. It can’t go on like that. “We’re reaching the point where the effectiveness of networks is inversely proportional to the volume of information they contain,” Juniper CIO Bask Iyer told me last week.
The solution is to make sure that all the bits used to build the network work together well. The old way — running networks mainly by just adding more bandwidth — won’t get the job done, Iyer says. The network has to be built with overarching business objectives in mind, with teams that are usually separate — security, manufacturing, quality control — getting more intimately involved with building the network than they have been before. Naturally, that’s the opening for a larger discussion about the implications of the research. And, of course, Juniper is holding a Web event later today to explain what it means.

New OpenStack clouds mean something for everyone

By Barb Darrow

If there isn’t an OpenStack cloud you fancy, wait a second, there’s more — a lot more — in the pipeline. Cloudscaling, Metacloud and Dreamhost will all preview their take on the open-source cloud this week at the OpenStack Summit in San Diego.
Don’t fret if the OpenStack clouds now available from Hewlett-Packard, Rackspace, Internap and a handful of private-cloud-centric startups don’t suit your needs. There will be more options to choose from very shortly.
This week at the OpenStack Summit in San Diego, new flavors of the open-source cloud will be unveiled by Cloudscaling, Dreamhost and Metacloud, among others. Here’s a roundup of some of the noteworthy news:
Cloudscaling’s Open Cloud System 2.0: An early proponent of OpenStack, Cloudscaling made waves with news that its new private cloud will facilitate deployment of modern applications and also allow those applications to “scale up” to both the Amazon and Google Compute Engine public clouds as needed. Open Cloud System will support the Amazon S3 and EC2 APIs as well as the relevant GCE APIs, Cloudscaling CEO Michael Grant told me. The product is slated to come online by year’s end.
The GCE support raised eyebrows, but to Grant it was a no-brainer. “We actually see customers wanting choice [in public cloud] and also see in Google Compute Engine a real commitment to public cloud, [it is] a serious offering by Google,” he said. (Check out Cloudscaling CTO Randy Bias’ blog about why Google Compute Engine is a “game changer.”)

Open Cloud 2.0 targets modern, “dynamic” applications — software as a service, big data, media-intensive and mobile apps as opposed to legacy applications that now run in data centers, Grant said. Bias left the door open to supporting other leading public cloud infrastructure, including Rackspace, down the line.
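Supporting the EC2 API means, in principle, that existing Amazon tooling can be pointed at an Open Cloud System endpoint unchanged. Here is a minimal sketch of the idea using the boto library; the endpoint hostname and credentials are hypothetical:

```python
# A minimal sketch of EC2 API compatibility: point boto, the common
# Python AWS library, at a hypothetical EC2-compatible endpoint.
from boto.ec2.connection import EC2Connection
from boto.ec2.regioninfo import RegionInfo

# Hypothetical Open Cloud System endpoint standing in for AWS.
region = RegionInfo(name="ocs-region", endpoint="ec2.cloud.example.com")
conn = EC2Connection("<access_key>", "<secret_key>", region=region)

# The same calls an EC2 script would make should work unchanged.
for reservation in conn.get_all_instances():
    for instance in reservation.instances:
        print("%s %s" % (instance.id, instance.state))
```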
Metacloud: This Pasadena, Calif.-based startup emerged from stealth just in time to tout its own private cloud solution at the summit. Backed by Storm Ventures and AME Cloud Ventures — the venture capital firm headed by former Yahoo CEO Jerry Yang — Metacloud said its cloud is already in use by at least one (unnamed) Fortune 100 company.
The year-old company has impressive DNA in scale-out computing. CEO and co-founder Steve Curry formerly managed Yahoo’s global storage operations. Co-founder and CTO Sean Lynch was SVP of technology operations at Ticketmaster Entertainment, and Rafi Khardalian, Metacloud’s VP of operations, was director of systems architecture at Ticketmaster Entertainment.
Dreamhost’s DreamCompute: Web-hosting veteran Dreamhost is launching its take on the OpenStack public cloud with DreamCompute. This public cloud is one of the first services to use Ceph as its storage subsystem, hardly surprising since Los Angeles-based Dreamhost spun off Inktank as a company to back Ceph as an alternative to the Swift storage used by other OpenStack implementations.

Prospective users can sign up for a private beta of the service this week. “This is a production-grade public cloud service launching on OpenStack but with Ceph in the box,” said Dreamhost CEO Simon Anderson. “It delivers live migration and operating efficiency from a storage perspective, which is a good next step for OpenStack.”


Anderson said DreamCompute will give Amazon a run for its money on pricing. For the company’s existing Dreamhost dedicated server, virtual private server and dedicated hosting services, for example, inbound and outbound data transfer is always free. For DreamObjects object storage, inbound data transfer is free, while outbound transfers cost $0.07 per GB. While DreamCompute pricing won’t be announced for a few weeks, Anderson said simplicity and cost savings are paramount for developers and entrepreneurs wary of Amazon’s confusing price matrix.
Rackspace updates tools: OpenStack pioneer Rackspace will announce software development kits (SDKs) for PHP and Java to help developers write for its public OpenStack cloud in their language of choice. “We want to lift all boats; this will make it easier for developers to consume the cloud,” Jim Curry, GM of Rackspace’s Private Cloud business, said last week.
In addition, Curry said 25 percent of the Fortune 100 companies have downloaded that private cloud implementation since it became available on Aug. 15.
Cisco Edition for OpenStack: This week the community will be able to download Cisco Edition for OpenStack, which combines reference architectures, documentation and automation scripts to make it easier for service providers and private cloud customers to deploy OpenStack in production, said Cisco CTO (and OpenStack vice chairman) Lew Tucker.
Cisco, as it has discussed, is also baking OpenStack support into its networking gear. The goal is to unify physical and virtual networking through programmable interfaces and make all of that available through OpenStack’s Quantum networking service, the company said.
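Quantum exposes that virtual networking through a REST API with a Python client. As a hedged sketch (the credentials, endpoint and names below are invented), this is the kind of programmatic call Cisco wants its physical gear to honor:

```python
# A minimal sketch of OpenStack's Quantum networking API; the
# credentials, endpoint and names here are invented for illustration.
from quantumclient.v2_0 import client

quantum = client.Client(username="myuser", password="mypassword",
                        tenant_name="mytenant",
                        auth_url="https://identity.example.com/v2.0")

# Define a virtual network and a subnet on it programmatically.
net = quantum.create_network({"network": {"name": "app-tier"}})
net_id = net["network"]["id"]
quantum.create_subnet({"subnet": {"network_id": net_id,
                                  "ip_version": 4,
                                  "cidr": "10.0.0.0/24"}})
```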

Now bring on the customers

So there’s no dearth of available or soon-to-be-available OpenStack distributions. And there is no shortage of customers testing these clouds. The big question is when significant numbers of customers actually start using these clouds in production and as options or adjuncts to Amazon’s public cloud or their own VMware-based private infrastructure. That’s why last week’s news that Comcast is aboard OpenStack was newsworthy. With its broadband reach to consumers and small businesses, Comcast could drive real OpenStack user adoption. When and if these businesses are ready to make the move, there will be plenty of OpenStack iterations to choose from.

AT&T to offer Secure Cloud Services, Joins hand with IBM

AT&T and IBM have joined hands to offer a service which gives clients access to IBM’s back end infrastructure through secure private lines of AT&T.
Dennis Quan, IBM’s VP for SmartCloud Infrastructure described in an interview about the service- “It’s an integrated, end-to-end solution that provides access to cloud computing resources, and a secure pipe over which to access them. It’s going to unleash a lot of pent up demand for cloud computing because we’ve made the consumption of cloud a lot simpler.”
Both IBM and AT&T plan to start the service in the first quarter of 2013. The service will be mostly used by the Customer base of IBM.
According to a statement issued by AT&T Business Solutions CEO Andy Geisse- “AT&T and IBM are delivering a new, network-enabled cloud service that marries the security and speed of AT&T’s global network with the control and management capabilities of IBM’s enterprise cloud,”

The service will be accessible only to customers who sign up for IBM’s SmartCloud Enterprise+ service. Quan did not give details of the charges for access to the AT&T private VPN over which the service will run, but he did say the compute and network resources available through the partnership would be sold on a pay-as-you-go basis.
IBM said the service will give enterprises a secure, one-to-one connection. The SmartCloud platform will include virtual machines and storage, as well as application and security services. By partnering with AT&T, IBM is signaling its intent to expand its reach in the cloud computing market; the company is aiming for $7 billion in cloud-related sales by 2015.

OpenNebula cloud — bigger than expected in business

By Barb Darrow
OpenNebula, the European-rooted open-source cloud platform, is used by more businesses and in more countries than many might expect. At the ripe old age of 7, OpenNebula remains a quiet force compared to, say, OpenStack.
OpenNebula, the open-source cloud behind the European Space Agency and CERN, may be bigger in private industry than many anticipated.
According to new survey data from C12G Labs, the company behind OpenNebula, 43 percent of the 600 users responding represent business accounts, compared with 17 percent in research and less than 10 percent in academia. Maybe this shouldn’t be a shocker, given that OpenNebula’s customer page lists such companies as Akamai, Dell, IBM, SAP and Telefonica.


OpenNebula: world traveler

Also a bit surprising was that geographic distribution was much more heterogeneous than expected for a technology born and raised in Europe. Sure, nearly half (49 percent) of respondents are in Europe or Russia, but a healthy 23 percent are in North America, where the open-source cloud buzz centers on the newer OpenStack platform, backed by Rackspace, IBM, HP, Cisco and other tech powers, and on OpenStack’s slapfight with CloudStack and Eucalyptus, two other open-source cloud platforms vying for the limelight (and customers).

What gets lost in that hubbub, which will doubtless ramp up next week at the OpenStack Summit, is that OpenNebula is more mature and battle-tested than its competitors.
OpenNebula should get more credit, said Carl Brooks, cloud analyst for Tier1 Research. “I am a big fan because it’s so organically driven … it’s seven years old, it’s parked in some of the biggest organizations out there, and that all happened without a massive wave of hype,” Brooks said.

Other fun facts

Most of the respondents (79 percent) are deploying OpenNebula in private rather than public clouds. The KVM hypervisor was most popular, used by 42 percent of respondents. VMware was second at 27 percent, although VMware was the preferred hypervisor of 55 percent of the big deployers, those with 500 or more nodes.
Ignacio Llorente, director of OpenNebula, will be at Structure Europe next week to discuss how truly interoperable cloud technology can transform business. Based on OpenNebula’s traction so far, he knows whereof he speaks.

Amazon's Share Of Government Cloud Computing 'Accelerating'

As Amazon announces new services for government customers, it says the 1,800 government and education customers that now use Amazon Web Services prove "rapid" adoption of its cloud computing products.

As government agencies continue to adopt cloud computing, Amazon is among those reaping the rewards: the company announced Wednesday that more than 300 government agencies and 1,500 educational institutions now use Amazon Web Services.

According to Amazon, the 1,800-plus customers represent its "rapid growth in the public sector."
"Government agencies and education institutions are rapidly accelerating their adoption of the AWS Cloud," Teresa Carlson, VP of Amazon Web Services' public sector business, said in a statement.
Amazon's growth isn't surprising. State and local governments have been moving quickly to cloud services as a way to save money. Federal CIO Steven VanRoekel and his predecessor Vivek Kundra have been championing the adoption of cloud computing at the federal level to cut costs and improve government IT services with a Cloud First mandate, which requires federal agencies to consider cloud computing as part of most new information technology acquisitions.


The announcement of continued growth in Amazon's public sector business came as the company also announced new features for government customers at Amazon's AWS Public Sector Summit in Washington, D.C., on Wednesday. The new features are for AWS GovCloud, a dedicated community cloud for U.S. government customers that meets strict federal requirements, including arms-control rules such as the International Traffic in Arms Regulations (ITAR).

Likely chief among the new features is the high-performance computing capability made available through Amazon's Compute Cluster Instances, which has already been used for things like molecular and genomic modeling and analysis, and which can leverage big data technologies such as MapReduce. Even before launching a government version of this service, government agencies have used Amazon for supercomputing: The U.S. Air Force Research Laboratory, for example, has used Amazon to simulate a gas turbine and related airflow dynamics.

Among the other new offerings are Elastic Load Balancing, Auto Scaling, CloudWatch alarms, the Simple Notification Service, and the Simple Queue Service. Amazon says that the addition of these features should make it easier for government customers to scale their cloud services and to ensure those services' reliability. "With the new services and features added today in AWS GovCloud, public sector customers now have greater capabilities to rapidly design, build, and deploy high-performance applications," Carlson said.
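These services are designed to compose. As a hedged sketch using the boto library (the region, topic name, and instance ID below are placeholders, and GovCloud uses its own dedicated endpoints), a CloudWatch alarm can publish to a Simple Notification Service topic when load spikes:

```python
# A minimal sketch wiring a CloudWatch alarm to an SNS topic with boto.
# Region, topic name and instance ID are placeholders; AWS GovCloud
# has its own dedicated endpoints.
import boto.sns
import boto.ec2.cloudwatch
from boto.ec2.cloudwatch import MetricAlarm

sns = boto.sns.connect_to_region("us-east-1")
topic = sns.create_topic("ops-alerts")
topic_arn = topic["CreateTopicResponse"]["CreateTopicResult"]["TopicArn"]

cw = boto.ec2.cloudwatch.connect_to_region("us-east-1")
alarm = MetricAlarm(
    name="high-cpu",
    namespace="AWS/EC2",
    metric="CPUUtilization",
    statistic="Average",
    comparison=">",
    threshold=80,
    period=300,
    evaluation_periods=2,
    alarm_actions=[topic_arn],                # notify operators via SNS
    dimensions={"InstanceId": "i-12345678"})  # placeholder instance
cw.put_metric_alarm(alarm)
```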

A wide array of government agencies, from NASA to Douglas County, Nebraska, and from the University of Oxford to the Centers for Disease Control and Prevention, are now among the customers of Amazon Web Services. Some of the recent services hosted with Amazon include CDC BioSense 2.0, a service that collects information from health facilities as part of an effort to improve official response to diseases and healthcare trends.

So Far, So Good: Fortune 500 CIOs Seem Happy With Cloud Computing

Many organizations are still in the early stages of their cloud computing journeys, and the reports are: so far, so good. No major flaws or “gotchas” have emerged in nascent cloud engagements, and CIOs are saying full steam ahead.  Still needed, however, are more security assurances, and more vendor flexibility.

That’s the key takeaway from a new report just published by Navint Partners, LLC, which finds large companies are seeing mainly positive results from their cloud computing efforts. The consulting company convened a roundtable with 20 CIOs from Fortune 500 companies to discuss their progress and concerns about cloud computing.

Nine out of 10 respondents, for example, say they have received 100% of the savings they expected from their cloud computing projects.  In addition, four out of five say their cloud efforts have helped their organizations achieve some sort of competitive advantage, and two-thirds say cloud has helped their organization’s efficiency and effectiveness.

These are all vague terms, of course, and the Navint report dug a little deeper into CIOs’ heads to see where they see these benefits unfolding. Robert Summers, CIO of tax preparation firm Jackson Hewitt, cautions that while positive, these are likely very preliminary results. “Many [executives and business managers] think it will take five or more years to fully realize the tangible benefits of cloud implementation.” Summers’ points are borne out in another recent study from the Cloud Security Alliance and ISACA, which estimates that cloud maturity is still at least three years out at most organizations.

Close to half of the CIOs participating in Navint’s roundtable even indicate they already have a dedicated budget for cloud computing. A majority of cloud budgets, then, are discretionary, and Summers states that “many organizations regard cloud expenses as research and development. Organizations are analyzing how the cloud demand differs across departments before appropriating fixed funding for long‐term contracts.”

Another possibility, not mentioned in the report, is that cloud spending may be so tightly baked into business processes that it often occurs outside the IT department. Many analysts, including Gartner, predict that soon, business units outside of IT will spend more on technology than IT itself, and cloud will be a big part of this.

Nevertheless, more money will be going into private cloud efforts than public cloud resources over the next two years — which is not surprising, given the huge internal IT inventories within Fortune 500 companies. Close to half of the CIOs contributing to the Navint report expect to boost their private cloud budgets by more than 20%, and about a third will be increasing spending on public cloud services.

Fortune 500 CIOs, then, seem happy with cloud, but there are still issues. Security remains the most pressing concern, but, surprisingly, CIOs also complain about cloud vendor inflexibility. “Despite highly advanced security and fraud countermeasures employed by cloud vendors, CIOs and other executives regard security guarantees and redundancy policies with guarded pessimism,” the report notes. This concern still holds back a lot of companies from moving mission-critical applications to the cloud. “The security seems to be there today; it will become less of an issue, but people want to see it,” as Summers put it.

However, while one of the most profound advantages seen with cloud is the ability to scale up to spikes in demand, such as the seasonal surge a tax preparation firm sees, this option seems to be lacking in cloud vendors’ offerings. Jackson Hewitt’s Summers, for example, found that in general, cloud vendors offered very few options for firms and industries with peak business activity condensed into certain parts of the year. “None of the major players [we] spoke to had a pay-as-you-go option,” he says. “All the agreements were about the same: you pay a standard amount for the entire year, and the provider agrees to handle some spikes in usage as a percentage of the base. We needed a more custom arrangement.”