Mumbai, Nov 9 — EMC Data Storage Systems India, a subsidiary of US-based EMC Corporation, plans to train around 30,000 people in cloud computing, data science and big data analytics by 2013 through its new course, a top company official said Thursday.
Rajesh Janey, president, EMC India and SAARC, told IANS that the Indian cloud computing market (use of computing resources delivered as a service over the internet), currently estimated at $4 billion, was likely to touch $4.5 billion and the business opportunity in big data (huge data difficult to process with existing tools) is expected to touch $300 million in a couple of years.
"However, there is a huge shortage of manpower in these domains," he said.
According to Janey, the new certification courses will be offered through the EMC Academic Alliance, which has tie-ups with over 250 educational institutes in India, as well as to EMC's customers and partners.
Janey said that in the digital era, individual needs and sentiment had become more prominent, and businesses and governments had the opportunity to understand them on a mass scale in real time and take steps that could transform services and delivery.
"Cloud computing is transforming IT and big data is transforming business. But there is shortage of people with requisite skills. It is estimated globally that there is a shortage of around 192,000 data scientists," Janey said.
He said the new data science and big data analytics training and certification helped build a solid foundation for data analytics with focus on opportunities and challenges presented by big data.
The new Cloud Infrastructure and Services Associate-Level Training and Certification provides an essential foundation of technical principles and concepts, enabling IT professionals to make informed business and technical decisions on migrating to the cloud.
Citing an EMC-Zinnov study, Janey said digital information is creating new opportunities in cloud computing and data.
Janey said the cloud opportunity in India is expected to create 100,000 jobs by 2015, up from 10,000 in 2011.
Apart from addressing domestic demand, India has the opportunity to address the global big data market, which is expected to touch $25 billion.
India's potential to tap this market is around $1 billion by 2015 at a compounded annual growth rate (CAGR) of 83 percent, Janey said citing a NASSCOM CRISIL study.
Rackspace versus Amazon: The big data edition
By Derrick Harris
Rackspace is busy building a Hadoop service, giving the company one more avenue to compete with cloud kingpin Amazon Web Services. However, the two services — along with several others on the market — highlight just how different seemingly similar cloud services can be.
Rackspace has been on a tear over the past few months releasing new features that map closely to the core features of the Amazon Web Services platform, only with a Rackspace flavor that favors service over scale. Its next target is Amazon Elastic MapReduce, which Rackspace will be countering with its own Hadoop service in 2013. If AWS and Rackspace are, indeed, the No. 1 and No. 2 cloud computing providers around, it might be easy enough to make a decision between the two platforms.
In the cloud, however, the choices are never as simple as black or white.
Amazon versus Rackspace is a matter of control
Discussing its forthcoming Hadoop service during a phone call on Friday, Rackspace CTO John Engates highlighted the fundamental product-level differences between his company and its biggest competitor, AWS. Right now, for users, it’s primarily a question of how much control they want over the systems they’re renting — and Rackspace comes down firmly on the side of maximum control.

For Hadoop specifically, Engates said Rackspace’s service will “really put [users] in the driver’s seat in terms of how they’re running it” by giving them granular control over how their systems are configured and how their jobs run (courtesy of the OpenStack APIs, of course). Rackspace is even working on optimizing a portion of its cloud so the Hadoop service will run on servers, storage and networking gear designed specifically for big data workloads. Essentially, Engates added, Rackspace wants to give users the experience of owning a Hadoop cluster without actually owning any of the hardware.
“It’s not MapReduce as a service,” he added, “it’s more Hadoop as a service.”
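Rackspace hasn’t published an API for the service yet, but the kind of instance-level control Engates describes is usually exercised through the standard OpenStack compute API. Below is a minimal, hypothetical sketch using the python-novaclient library; the endpoint, credentials, flavor and image names are illustrative assumptions, not Rackspace specifics.

    # Hypothetical sketch: provisioning Hadoop worker nodes through the
    # OpenStack compute API. Endpoint, credentials, flavor and image
    # names are illustrative assumptions, not Rackspace specifics.
    from novaclient.v1_1 import client

    nova = client.Client("demo-user", "demo-api-key", "demo-project",
                         "https://identity.example.com/v2.0/")

    flavor = nova.flavors.find(name="m1.large")           # assumed flavor
    image = nova.images.find(name="ubuntu-12.04-hadoop")  # assumed image

    # Boot four identically configured worker nodes for the cluster.
    workers = [nova.servers.create(name="hadoop-worker-%d" % i,
                                   image=image, flavor=flavor)
               for i in range(4)]

The point of the sketch is the level of abstraction: the user, not the provider, decides instance sizes, images and cluster topology.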
The company partnered with Yahoo spinoff Hortonworks on this in part because of its expertise and in part because its open source vision for Hadoop aligns closely with Rackspace’s vision around OpenStack. “The guys at Hortonworks are really committed to the real open source flavor of Hadoop,” Engates said.
Rackspace’s forthcoming Hadoop service appears to contrast somewhat with Amazon’s three-year-old and generally well-received Elastic MapReduce service. The latter lets users write their own MapReduce jobs and choose the number and types of servers they want, but doesn’t give users system-level control on par with what Rackspace seems to be planning. For the most part, it comports with AWS’s tried-and-true strategy of giving users some control of their underlying resources, but generally trying to offload as much of the operational burden as possible.
Elastic MapReduce also isn’t open source, but is an Amazon-specific service designed around Amazon’s existing S3 storage system and other AWS features. When AWS did choose to offer a version of Elastic MapReduce running a commercial Hadoop distribution, it chose MapR’s high-performance but partially proprietary flavor of Hadoop.
It doesn’t stop with Hadoop
Rackspace is also considering getting into the NoSQL space, perhaps with hosted versions of the open source Cassandra and MongoDB databases, and here too it likely will take a different tack than AWS. For one, Rackspace still has a dedicated hosting business to tie into, where some customers still run EMC storage area networks and NetApp network-attached storage arrays. That means Rackspace can’t afford to lock users into a custom-built service that doesn’t take their existing infrastructure into account or that favors raw performance over enterprise-class features.

Rackspace needs stuff that’s “open, readily available and not unique to us,” Engates said. Pointing specifically to AWS’s fully managed and internally developed DynamoDB service, he suggested, “I don’t think it’s in the fairway for most customers that are using Amazon today.”
Perhaps, but early DynamoDB success stories such as IMDb, SmugMug and Tapjoy suggest the service isn’t without an audience willing to pay for its promise of a high-performance, low-touch NoSQL data store.
Which is better? Maybe neither
There’s plenty of room for debate over whose approach is better, but the answer for many would-be customers might well be neither. When it comes to hosted Hadoop services, both Rackspace and Amazon have to contend with Microsoft’s newly available HDInsight service on its Windows Azure platform, as well as IBM’s BigInsights service on its SmartCloud platform. Google appears to have something cooking in the Hadoop department, as well. For developers who think all these infrastructure-level services are too much work, higher-level services such as Qubole, Infochimps or Mortar Data might look more appealing.

The NoSQL space is rife with cloud services, too, primarily focused on MongoDB but also including hosted Cassandra and CouchDB-based services.
In order to stand apart from the big data crowd, Engates said Rackspace is going to stick with its company-wide strategy of differentiation through user support. Thanks to its partnership with Hortonworks and the hybrid nature of OpenStack, for example, Rackspace is already helping customers deploy Hadoop in their private cloud environments while its public cloud service is still in the works. “We want to go where the complexity is,” he said, “where the customers value our [support] and expertise.”
Gartner sees cloud computing, mobile development putting IT on edge
By Jack Vaughan
ORLANDO, Fla. -- A variety of mostly consumer-driven forces challenge enterprise IT and application development today. Cloud computing, Web information, mobile devices and social media innovations are converging to dramatically change modern organizations and their central IT shops. Industry analyst group Gartner Inc. describes these forces as forming a nexus, a connected group of phenomena, which is highly disruptive.
As in earlier shifts to client-server computing and Web-based e-commerce, the mission of IT is likely to come under serious scrutiny. How IT should adjust was a prime topic at this week's Gartner ITxpo conference held in Orlando, Fla.
"A number of things are funneling together to cause major disruption to your environment," David Cearley, vice president and Gartner fellow, told CIOs and other technology leaders among the more than 10,000 assembled at the Orlando conference.
"Mobile has outweighed impact in terms of disruption. The mobile experience is eclipsing the desktop experience," said Cearley, who cited the large number of mobile device types as a challenge to IT shops that had over the years achieved a level of uniformity via a few browsers and PC types. "You need to prepare for heterogeneity," he told the audience.
Crucially, mobile devices coupled with cloud computing can change the basic architecture of modern corporate computing.
"You need to think about 'cloud-client' architecture, instead of client-server," Cearley said. "The cloud becomes your control point for what lives down in the client. Complex applications won't necessarily work in these mobile [settings]."
This could mean significant changes in skill sets required for enterprise application development. The cloud-client architecture will call for better design skills for front-end input. Development teams will have to make tradeoffs between use of native mobile device OSes and HTML5 Web browser alternatives, according to Cearley. The fact that these devices are brought into the organization by end users acting as consumers also is a factor.
"The user experience is changing. Users have new expectations," he said. For application architects and developers, he said, there are "new design skills required around how applications communicate and work with each other."
The consequences are complex. For example, software development and testing is increasingly not simply about whether software works or not, according to one person on hand at Gartner ITxpo. "We get great pressures to improve the quality of service we provide. It's not just the software, it's how the customer interacts with the software," said Dave Miller, vice president of software quality assurance at FedEx Services.
Cloud computing and mobile apps on the horizon
Shifts in software architecture, such as those seen today in cloud and mobile applications, have precedents, said Miko Matsumura, senior vice president of platform marketing and developer relations at San Mateo, Calif.-based mobile services firm Kii Inc. In the past, there have been "impedance mismatches," for example, between object-oriented software architecture and relational data architecture, he noted.
The effect of mobile development means conventional architecture is evolving. "What happens is you see the breakdown of the programming model," said Matsumura, a longtime SOA thought leader whose firm markets a type of mobile back end as a service offering. "Now we have important new distinctions. A whole class of developers treat mobile from the perspective of cloud."
"Now, if you are a mobile developer, you don't need to think about the cloud differently than you think of your mobile client," he said. With "client cloud," he continued, "it's not a different programming model, programming language or programming platform." That is a different situation than we find in most development shops today, where Java or .NET teams and JavaScript teams work with different models, languages and platforms.
The changes in application development and application lifecycles are telling, according to Gartner's Jim Duggan, vice president for research. "Mobile devices and cloud -- a lot of these things challenge conventional development," he said. He said Gartner estimates that by 2015 mobile applications will outnumber those for static deployment by four to one. This will mean that IT will have to spend more on training of developers, and will likely do more outsourcing as well.
"Deciding which [are the skills that] you need to have internally is going to change the way your shop evolves," he said.
Look out; the chief digital officer is coming
Underlying the tsunami of disruptive consumerist forces Gartner sees is a wholesale digitization of commerce. At the conference Gartner analysts said such digitization of business segments will lead to a day when every budget becomes an IT budget. Firms will begin to create the role of chief digital officer as part of the business unit leadership, Gartner boldly predicted, estimating 25% of organizations will have chief digital officers in place by 2015.
The chief digital officer title may grow, according to Kii's Matsumura. Often today, marketing departments are controlling the mobile and social media agendas within companies, he said.
"We are seeing a convergence point between the human side of the business -- in the form of marketing -- and the 'machine-side,' in the form of IT," he said.
What It Will Take to Sell Cloud to a Skeptical Business
The other week, I covered a report that asserts Fortune 500 CIOs are seeing mainly positive results from their cloud computing efforts. The CIOs seem happy, but that doesn’t mean others in the organization share that enthusiasm.
Andi Mann, vice president for strategic solutions at CA Technologies, for one, questions whether CIOs are the right individuals to talk to about the success of cloud computing. And he takes the original Navint Partners report – which was based on qualitative interviews with 20 CIOs – to task.
“Twenty CIOs are happy, so far, with cloud computing. But what about their bosses, their employees, their organization’s partners and their customers?” he asks. “Are they happy?”
In order to understand the business success of cloud computing or any innovative technology, Mann points out, “we need to (a) talk to more than just CIOs, (b) be certain we’re asking all the right questions, and (c) hit a larger sample than just four percent of the Fortune 500.”
CA Technologies did just that, commissioning IDG Research Services to conduct a global study on the state of business innovation. “To be ‘innovative’ means your organization is capitalizing on cutting edge advancements such as cloud computing, big data, enterprise mobility, devops, the IT supply chain, and so on,” he says.
And Mann delivers this thought-provoking news: “After talking to 800 IT and business executives at organizations with more than $250 million in revenue, we didn’t find a horde of happy campers. Yes, some CIOs are happy with the cloud, and that’s a good thing. But here’s the rub. Firstly, CIOs are not the principal drivers of innovation. Users are. Second, our research found business leaders are, in many cases, unhappy with what their CIOs are doing.”
Business executives are not as excited as IT is about cloud computing, either, Mann continues. “Forty percent of IT respondents say their organization plans to invest in cloud computing over the next 12 months, versus 31% of business executives. Also, business executives give IT low marks in enabling and supporting innovation. One in two rate IT poorly in terms of having made sufficient investment in resources and spending on innovation; 48% think IT isn’t moving fast enough; 37% think IT doesn’t have the right skills; and 40% say IT isn’t receptive to new ideas when approached by the business.”
While cloud offers IT leaders the ability to keep the lights on for less, that’s only one dimension to consider, Mann says. “The crucial question to answer is if the IT organization is innovating with the cloud. Savings are one thing. Moving the organization ahead and keeping it technologically competitive is another.”
Mann provides examples of questions the business needs to be asking before a cloud approach is contemplated: “The business wants IT to explore the world of digital technologies and deliver answers. How can you align mobile and consumerization with the business interest? How do you connect with millions of people over social media? How do you find a new market and get into that market in a different geography or a different product area? How do you use technology to smooth out the bumps in an M&A activity?”
CIOs need to help the business understand that the cloud has many other business advantages beyond cost efficiencies, says Mann. “But the only way to capitalize on those advantages is for the executive leadership to allocate budget appropriately. We need to soften the focus on cost cutting, and sharpen the focus on IT empowerment and business alignment.”
To get there, he says, “CIOs need to make a case for greater investment in cloud computing, not simply to cut costs longer term, but to drive innovation and competitive advantage. Can the CIO win additional budget if she could achieve success according to real business KPIs such as return on equity, return on assets, shareholder value, revenue growth, or competitive position? If the CIO could reliably and measurably move the needle on such things, instead of just auditing costs, the ‘happiness gap’ between the business and IT could be closed.”
It’s time to increase our understanding of the broader and deeper impact of cloud on organizations. “We’ve been talking about IT-to-business alignment for generations. Yet we keep surveying the IT guys when we should be surveying the organizations they serve,” says Mann. “I’m concerned that, statistically, a significant percentage of Fortune 500 business executives might not be happy with the cloud. And that, by far, is the more important number for CIOs to measure and improve.”
Rackspace Looks To Compete With Amazon In Cloud Storage Services
Peter Suciu for redOrbit.com – Your Universe Online
Computer racks have long offered a bit of flexibility when it comes to the size options. Now open cloud vendor Rackspace is looking to provide the same sort of flexibility in the virtual sense for users, who previously had to buy cloud server storage based not on needs, but rather on what the provider assigned.
This twist is clearly aimed at the likes of Amazon Web Services’ Elastic Block Store and similar offerings that assign bulk storage, even to those who have more specific storage size needs. The Rackspace Cloud Block Storage solution, which is reportedly powered by OpenStack, is intended to complement the company’s core business services.
And because it is powered by OpenStack, this flexibility won’t come with added complexity: the service has been designed to provide a common interface, allowing customers to interact with the storage directly from their running server applications.
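Rackspace hasn’t published the exact client workflow; as a sketch of what a “common interface” for block storage typically looks like in the OpenStack world, here is a hypothetical create-and-attach sequence using python-novaclient. The credentials, endpoint, server name and volume size are all assumptions for illustration.

    # Hypothetical sketch of an OpenStack-style block storage workflow:
    # create a volume and attach it to a running server. Credentials,
    # endpoint, names and sizes are assumed for illustration.
    from novaclient.v1_1 import client

    nova = client.Client("demo-user", "demo-api-key", "demo-project",
                         "https://identity.example.com/v2.0/")

    # Ask for exactly the capacity needed, not a provider-assigned size.
    volume = nova.volumes.create(100, display_name="app-data")  # 100 GB

    # Attach it to an existing server so apps see an ordinary block device.
    server = nova.servers.find(name="app-server-1")  # assumed server name
    nova.volumes.create_server_volume(server.id, volume.id, "/dev/vdb")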
Rackspace’s block storage will offer the performance characteristics of either disk drives or solid-state devices. Pricing is also on par with rivals: the standard disk-backed service is priced at $0.15 per gigabyte per month, while the more expensive solid-state option will begin at $0.70 per gigabyte.
By comparison, Amazon offers a similar service at $0.10 per GB per month but adds an I/O fee of $0.10 per million I/O operations. Rackspace has noted that it will not charge for I/O, looking instead to simplify the pricing of storage.
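To see how the two models diverge, here is a small worked example using the per-gigabyte and per-I/O figures quoted above; the volume size and monthly I/O count are assumptions chosen purely for illustration.

    # Rough monthly cost comparison using the figures quoted above.
    # The volume size and I/O count are assumed for illustration only.
    GB_STORED = 500            # assumed volume size in GB
    MONTHLY_IOS = 2000000000   # assumed 2 billion I/O operations

    rackspace_standard = GB_STORED * 0.15                     # no I/O fee
    amazon_ebs = GB_STORED * 0.10 + (MONTHLY_IOS / 1000000.0) * 0.10

    print("Rackspace standard: $%.2f" % rackspace_standard)   # $75.00
    print("Amazon EBS:         $%.2f" % amazon_ebs)           # $250.00

Under an I/O-heavy profile, the flat per-gigabyte model comes out well ahead; for a mostly idle volume, Amazon’s lower per-gigabyte rate wins, which is consistent with the active-data versus archive positioning described below.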
The cost for I/O suggests that the two options are aimed at customers with different storage needs, with Rackspace positioning itself for active data as opposed to Amazon’s archive-oriented options.
“We don’t charge for I/Os. We think it makes sense to simplify the pricing,” John Engates, CTO of the San Antonio, Texas-based cloud service and hosted service provider told Information Week on Monday.
Rackspace could also be entering the market as pricing remains an issue. In March, Amazon Web Services and Microsoft each dropped prices on their respective cloud services. At the time it appeared that Amazon Web Services was looking for big customers, while Microsoft was looking to lure in entry-level customers and developers.
Both could see increased competition as Rackspace looks to shake up the space.
Rackspace’s OpenStack-based storage could be used to hold active application data, including database transactions, messages and even files that are accessed regularly, as opposed to the Amazon solution, which is really meant to hold infrequently accessed files and data.
Rackspace had held off the launch of a cloud storage service as it waited for the framework of OpenStack to be ready. The company, which had pioneered micro servers in the cloud, has been seen as slow to provide a service that adapted to the applications customers wanted to run.
With OpenStack now an option, Rackspace is looking to fill the void of active storage for small and medium sized businesses (SMBs), and enterprises, including those that weren’t ready to deploy to the cloud before.
How Big Data and Cloud Computing Are Pushing Networks to the Brink
A lot is expected of corporate networks these days. Companies are trying to add new services and support new devices. There’s always more data that has to keep flowing, more stuff being connected to it. And the network is expected to perform, no matter what. Now there are about five billion devices connected to the Internet and billions of individual users, all expecting their networks to perform.
The folks at Juniper Networks started to wonder if the world of networking has reached some kind of fundamental inflection point. They got together with the people at Forrester Research and surveyed 150 senior IT executives to try to get a better handle on how big trends facing the enterprise, like cloud computing and big data, are affecting enterprise networks.
The findings are kind of interesting and sort of troubling. While cloud computing and software-as-a-service products such as Salesforce.com tend to save money and time by taking dedicated hardware and software out of the equation, using them puts new demands on the network: 58 percent of those surveyed said cloud services had added enough demand to their networks that they had to upgrade the networking hardware.
Cloud services tend to go hand in hand with an increased usage of mobile devices: 47 percent of businesses have seen increased demand from employees bringing their own devices to work.
The complications for networks have grown past the point where you can simply add more bandwidth and hope for the best. The survey found that 86 percent of the companies surveyed have been unable to spin up new services or support certain business demands because their networks were simply not up to the task. Another 74 percent reported that their networks had become complex, while 35 percent said their networks had become “too rigid to manage.”
So that leaves IT organizations at a point where networks are under more demand than ever, and less able to meet those demands. It can’t go on like that. “We’re reaching the point where the effectiveness of networks is inversely proportional to the volume of information they contain,” Juniper CIO Bask Iyer told me last week.
The solution is to make sure that all the bits used to build the network work together well. The old way — running networks mainly by just adding more bandwidth — won’t get the job done, Iyer says. The network has to be built with overarching business objectives in mind, with teams that are usually separate — security, manufacturing, quality control — getting more intimately involved with building the network than they have been before. Naturally, that’s the opening for a larger discussion about the implications of the research. And, of course, Juniper is holding a Web event later today to explain what it means.
New OpenStack clouds mean something for everyone
By Barb Darrow
If there isn’t an OpenStack cloud you fancy, wait a second, there’s more — a lot more — in the pipeline. Cloudscaling, Metacloud and Dreamhost will all preview their take on the open-source cloud this week at the OpenStack Summit in San Diego.
Don’t fret if the OpenStack clouds now available from Hewlett-Packard, Rackspace, Internap and a handful of private-cloud-centric startups don’t suit your needs. There will be more options to choose from very shortly.

This week at the OpenStack Summit in San Diego, new flavors of the open-source cloud will be unveiled by Cloudscaling, Dreamhost and Metacloud, among others. Here’s a roundup of some of the noteworthy news:
Cloudscaling’s Open Cloud System 2.0: An early proponent of OpenStack, Cloudscaling made waves with news that its new private cloud will facilitate deployment of modern applications and also allow those applications to “scale up” to both Amazon and Google Compute Engine public clouds as needed. Open Cloud System will support the Amazon S3 and EC2 APIs as well as the relevant GCE APIs, Cloudscaling CEO Michael Grant told me. The product is slated to come online by year’s end.
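Cloudscaling hasn’t published sample code for this compatibility layer; as a general illustration of what supporting the S3 API means in practice, an S3 client library such as boto can simply be pointed at a non-Amazon endpoint. The hostname and credentials below are hypothetical.

    # Illustrative sketch of S3 API compatibility: the boto library
    # pointed at a non-Amazon endpoint. Hostname and credentials are
    # hypothetical, not published Cloudscaling sample code.
    import boto
    from boto.s3.connection import OrdinaryCallingFormat

    conn = boto.connect_s3(
        aws_access_key_id="DEMO-KEY",
        aws_secret_access_key="DEMO-SECRET",
        host="storage.ocs.example.com",        # assumed OCS endpoint
        calling_format=OrdinaryCallingFormat())

    # Existing S3-based tooling works unchanged against the new endpoint.
    bucket = conn.create_bucket("demo-backups")
    key = bucket.new_key("hello.txt")
    key.set_contents_from_string("written via the S3 API")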
The GCE support raised eyebrows, but to Grant it was a no-brainer. “We actually see customers wanting choice [in public cloud] and also see in Google Compute Engine a real commitment to public cloud, [it is] a serious offering by Google,” he said. (Check out Cloudscaling CTO Randy Bias’ blog about why Google Compute Engine is a “game changer.”)
Open Cloud 2.0 targets modern, “dynamic” applications — software as a service, big data, media-intensive and mobile apps as opposed to legacy applications that now run in data centers, Grant said. Bias left the door open to supporting other leading public cloud infrastructure, including Rackspace, down the line.
Metacloud: This Pasadena, Calif.-based startup emerged from stealth just in time to tout its own private cloud solution at the summit. Backed by Storm Ventures and AME Cloud Ventures — the venture capital firm headed by former Yahoo CEO Jerry Yang — Metacloud said its cloud is already in use by at least one (unnamed) Fortune 100 company.
The year-old company has impressive DNA in scale-out computing. CEO and co-founder Steve Curry formerly managed Yahoo’s global storage operations. Co-founder and CTO Sean Lynch was SVP of technology operations at Ticketmaster Entertainment, and Rafi Khardalian, Metacloud’s VP of operations, was director of systems architecture at Ticketmaster Entertainment.
Dreamhost’s DreamCompute: Web-hosting veteran Dreamhost is launching its take on the OpenStack public cloud with DreamCompute. This public cloud is one of the first services to use Ceph as its storage subsystem, which is hardly surprising since Los Angeles-based Dreamhost spun off Inktank as a company to back Ceph as an alternative to the Swift storage used by other OpenStack implementations.
Prospective users can sign up for a private beta of the service this week. “This is a production-grade public cloud service launching on OpenStack but with Ceph in the box,” said Dreamhost CEO Simon Anderson. “It delivers live migration and operating efficiency from a storage perspective, which is a good next step for OpenStack.”
Anderson said DreamCompute will give Amazon a run for its money on pricing. For the company’s existing Dreamhost dedicated server, virtual private server and dedicated hosting services, for example, inbound and outbound data transfer is always free. For DreamObjects storage, inbound data transfer is free, while outbound transfers cost $0.07 per GB. While DreamCompute pricing won’t be announced for a few weeks, Anderson said simplicity and cost savings are paramount for developers and entrepreneurs wary of Amazon’s confusing price matrix.
Rackspace updates tools: OpenStack pioneer Rackspace will announce software development kits (SDKs) for PHP and Java to help developers write for its public OpenStack cloud in their language of choice. “We want to lift all boats, this will make it easier for developers to consume the cloud,” Jim Curry, GM of Rackspace’s Private Cloud business, said last week.
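The SDKs themselves target PHP and Java; by way of illustration in Python, the first exchange any such SDK wraps is a token request against the OpenStack identity (Keystone) API. The endpoint and credentials below are hypothetical placeholders.

    # What an OpenStack SDK wraps under the hood: a Keystone v2.0 token
    # request. Endpoint and credentials are hypothetical placeholders.
    import json
    import requests

    payload = {"auth": {"passwordCredentials": {
        "username": "demo-user", "password": "demo-password"}}}

    resp = requests.post("https://identity.example.com/v2.0/tokens",
                         data=json.dumps(payload),
                         headers={"Content-Type": "application/json"})
    token = resp.json()["access"]["token"]["id"]

    # Subsequent API calls carry the token in the X-Auth-Token header.
    print("Authenticated; token begins %s..." % token[:8])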
In addition, Curry said 25 percent of the Fortune 100 companies have downloaded that private cloud implementation since it became available on Aug. 15.
Cisco Edition for OpenStack: This week the community will be able to download Cisco Edition for OpenStack, which combines reference architectures, documentation and automation scripts to make it easier for service providers and private cloud customers to deploy OpenStack in production, said Cisco CTO (and OpenStack vice chairman) Lew Tucker.
Cisco, as it has discussed, is also baking OpenStack support into its networking gear. The goal is to unify physical and virtual networking through programmable interfaces and make all of that available through OpenStack’s Quantum networking service, the company said.
Now bring on the customers
So there’s no dearth of available or soon-to-be-available OpenStack distributions coming online. And there is no shortage of customers testing these clouds. The big question is when significant numbers of customers will actually start using these clouds in production and as options or adjuncts to Amazon’s public cloud or their own VMware-based private infrastructure. That’s why last week’s news that Comcast is aboard OpenStack was newsworthy. With its broadband reach to consumers and small businesses, Comcast could drive real OpenStack user adoption. When and if these businesses are ready to make the move, there will be plenty of OpenStack iterations to choose from.
AT&T to Offer Secure Cloud Services, Joins Hands with IBM
AT&T and IBM have joined hands to offer a service that gives clients access to IBM’s back-end infrastructure over AT&T’s secure private lines.
The service will be accessible only to customers who sign up for IBM’s SmartCloud Enterprise+ service. IBM’s Dennis Quan did not give any details of the charges for access to the AT&T private VPN network over which the service will run. He did say, however, that the compute and network resources available through the partnership would be sold on a pay-as-you-go basis.
IBM said the service will give enterprises a secure, one-to-one connection. The SmartCloud platform will include access to virtual machines and storage, as well as application and security services. By signing up with AT&T, IBM signaled its intent to increase its reach in the cloud computing market; the company is aiming for $7 billion in cloud-related sales by 2015.
Dennis Quan, IBM’s VP for SmartCloud Infrastructure, described the service in an interview: “It’s an integrated, end-to-end solution that provides access to cloud computing resources, and a secure pipe over which to access them. It’s going to unleash a lot of pent up demand for cloud computing because we’ve made the consumption of cloud a lot simpler.”

Both IBM and AT&T plan to start the service in the first quarter of 2013. The service is expected to be used mostly by IBM’s existing customer base.
In a statement, AT&T Business Solutions CEO Andy Geisse said: “AT&T and IBM are delivering a new, network-enabled cloud service that marries the security and speed of AT&T’s global network with the control and management capabilities of IBM’s enterprise cloud.”
OpenNebula cloud — bigger than expected in business
By Barb Darrow
OpenNebula — the open-source cloud behind the European Space Agency and CERN — may be bigger in private industry than many anticipated.
According to new survey data from C12G Labs, the company behind OpenNebula, 43 percent of 600 users responding are in business accounts compared to 17 percent in research, and less than 10 percent in academia. Maybe this shouldn’t be a shocker given that OpenNebula’s customer page lists such companies as Akamai, Dell, IBM, SAP and Telefonica.
OpenNebula: world traveler
Also a bit surprising was that geographic distribution was much more heterogeneous than expected for a technology born and raised in Europe. Sure, nearly half (49 percent) of respondents are in Europe or Russia, but a healthy 23 percent are in North America, where the open-source cloud buzz is much more around the newer OpenStack cloud platform backed by Rackspace, IBM, HP, Cisco and other tech powers, and OpenStack’s slapfight with CloudStack and Eucalyptus, two other open-source cloud platforms vying for the limelight (and customers).

What gets lost in that hubbub — which will doubtless ramp up next week at the OpenStack Summit — is that OpenNebula is more mature and battle-tested than its competitors.
OpenNebula should get more credit, said Carl Brooks, cloud analyst for Tier1 Research. “I am a big fan because it’s so organically driven … it’s seven years old, it’s parked in some of the biggest organizations out there, and that all happened without a massive wave of hype,” Brooks said.
Other fun facts
Most of the respondents (79 percent) are deploying OpenNebula in private, not public, clouds. The KVM hypervisor was most popular, with 42 percent of respondents using it. VMware was second at 27 percent, although it was the preferred hypervisor of 55 percent of the big deployers (those with 500 or more nodes).
Ignacio Llorente, director of OpenNebula, will be at Structure Europe next week to discuss how truly interoperable cloud technology can transform business. Based on OpenNebula's traction so far, he knows whereof he speaks.
Feature photo courtesy of Flickr user Editor B
Amazon's Share Of Government Cloud Computing 'Accelerating'
As Amazon announces new services for government customers, it says the 1,800 government and education customers that now use Amazon Web Services prove "rapid" adoption of its cloud computing products.
By J. Nicholas Hoover
As government agencies continue to adopt cloud computing, Amazon is among those reaping the rewards: the company announced Wednesday that more than 300 government agencies and 1,500 educational institutions now use Amazon Web Services.
According to Amazon, the 1,800-plus customers represent its "rapid growth in the public sector."
"Government agencies and education institutions are rapidly accelerating their adoption of the AWS Cloud," Teresa Carlson, VP of Amazon Web Services' public sector business, said in a statement.
Amazon's growth isn't surprising. State and local governments have been moving quickly to cloud services as a way to save money. Federal CIO Steven VanRoekel and his predecessor Vivek Kundra have been championing the adoption of cloud computing at the federal level to cut costs and improve government IT services with a Cloud First mandate, which requires federal agencies to consider cloud computing as part of most new information technology acquisitions.
The announcement of continued growth in Amazon's public sector business came as the company also announced new features for government customers at Amazon's AWS Public Sector Summit in Washington, D.C., on Wednesday. The new features are for AWS GovCloud, a dedicated community cloud for U.S. government customers that meets strict federal arms control regulations.
Likely chief among the new features is the high-performance computing capability made available through Amazon's Compute Cluster Instances, which has already been used for things like molecular and genomic modeling and analysis, and which can leverage big data technologies such as MapReduce. Even before launching a government version of this service, government agencies have used Amazon for supercomputing: The U.S. Air Force Research Laboratory, for example, has used Amazon to simulate a gas turbine and related airflow dynamics.
Among the other new offerings are elastic load balancing, auto scaling, CloudWatch alarms, the Simple Notification Service, and the Simple Queue Service. Amazon says that the addition of these features should make it easier for government customers to scale their cloud services and to ensure those services' reliability. "With the new services and features added today in AWS GovCloud, public sector customers now have greater capabilities to rapidly design, build, and deploy high-performance applications," Carlson said.
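For readers who want a feel for how these building blocks fit together, here is a minimal sketch of queueing and notification calls against the GovCloud region. It uses the modern boto3 SDK, which postdates this announcement, and it assumes an account already provisioned with AWS GovCloud (US) credentials; the queue and topic names are hypothetical.
```python
# Hedged sketch: exercising the Simple Queue Service and Simple Notification
# Service in the AWS GovCloud (US) region with boto3. Assumes GovCloud
# credentials are configured; the resource names below are illustrative only.
import boto3

REGION = "us-gov-west-1"  # the original AWS GovCloud (US) region

sqs = boto3.client("sqs", region_name=REGION)
sns = boto3.client("sns", region_name=REGION)

# Create a queue and send it a message.
queue_url = sqs.create_queue(QueueName="agency-jobs")["QueueUrl"]
sqs.send_message(QueueUrl=queue_url, MessageBody="nightly genomics batch")

# Create a topic and publish an alert, e.g. one triggered by a CloudWatch alarm.
topic_arn = sns.create_topic(Name="agency-alerts")["TopicArn"]
sns.publish(TopicArn=topic_arn, Message="cluster utilization above 80 percent")
```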
A wide array of government agencies, from NASA to Douglas County, Nebraska, and from the University of Oxford to the Centers for Disease Control and Prevention, are now among the customers of Amazon Web Services. Some of the recent services hosted with Amazon include CDC BioSense 2.0, a service that collects information from health facilities as part of an effort to improve official response to diseases and healthcare trends.
So Far, So Good: Fortune 500 CIOs Seem Happy With Cloud Computing
Many organizations are still in the early stages of their cloud computing journeys, and the reports are: so far, so good. No major flaws or “gotchas” have emerged in nascent cloud engagements, and CIOs are saying full steam ahead. Still needed, however, are more security assurances, and more vendor flexibility.
That’s the key takeaway from a new report just published by Navint Partners, LLC, which finds large companies are seeing mainly positive results from their cloud computing efforts. The consulting company convened a roundtable with 20 CIOs from Fortune 500 companies to discuss their progress and concerns about cloud computing.
Nine out of 10 respondents, for example, say they have received 100% of the savings they expected from their cloud computing projects. In addition, four out of five say their cloud efforts have helped their organizations achieve some sort of competitive advantage, and two-thirds say cloud has helped their organization’s efficiency and effectiveness.
These are all vague terms, of course, and the Navint report dug a little deeper into CIOs' heads to see where they see these benefits unfolding. Robert Summers, CIO of tax preparation firm Jackson Hewitt, cautions that while positive, these are likely very preliminary results. "Many [executives and business managers] think it will take five or more years to fully realize the tangible benefits of cloud implementation." Summers' points are borne out in another recent study from the Cloud Security Alliance and ISACA, which estimates that cloud maturity is still at least three years out at most organizations.
Close to half of the CIOs participating in Navint’s roundtable even indicate they already have a dedicated budget for cloud computing. A majority of cloud budgets, then, are discretionary, and Summers states that “many organizations regard cloud expenses as research and development. Organizations are analyzing how the cloud demand differs across departments before appropriating fixed funding for long‐term contracts.”
Another possibility, not mentioned in the report, is that cloud spending may be so tightly baked into business processes that it often occurs outside the IT department. Many analysts, including Gartner, predict that soon business units outside of IT will spend more on technology than IT itself, and cloud will be a big part of this.
Nevertheless, more money will be going into private cloud efforts than public cloud resources over the next two years — which is not surprising, given the huge internal IT inventories within Fortune 500 companies. Close to half of the CIOs contributing to the Navint report expect to boost their private cloud budgets by more than 20%, and about a third will be increasing spending on public cloud services.
Fortune 500 CIOs, then, seem happy with cloud, but there are still issues. Security remains the most pressing concern, and, surprisingly, CIOs also complain about cloud vendor inflexibility. "Despite highly advanced security and fraud countermeasures employed by cloud vendors, CIOs and other executives regard security guarantees and redundancy policies with guarded pessimism," the report notes. This concern still holds back a lot of companies from moving mission-critical applications to the cloud. "The security seems to be there today; it will become less of an issue, but people want to see it," as Summers put it.
However, while one of the most profound advantages of cloud is the ability to scale up to spikes in demand, such as the seasonal rush a tax preparation firm sees, this option seems to be lacking in cloud vendors' offerings. Jackson Hewitt's Summers, for example, found that in general, cloud vendors offered very few options for firms and industries with peak business activity condensed into certain parts of the year. "None of the major players [we] spoke to had a pay-as-you-go option," he says. "All the agreements were about the same: you pay a standard amount for the entire year, and the provider agrees to handle some spikes in usage as percentage of the base. We needed a more custom arrangement."
Cloud Services Market Expanding to $109 Billion in 2012
It wasn't too terribly long ago that "cloud computing" was a loosey-goosey marketing term being thrown around by anyone and everyone in the software space. And now? There's been a marked shift towards cloud-based services, which is a market that research firm Gartner predicts will grow 19.6 percent to $109 billion by the end of 2012.
"The cloud services market is clearly a high-growth sector within the overall IT marketplace," said Ed Anderson, research director at Gartner. "The key to taking advantage of this growth will be understanding the nuances of the opportunity within service segments and geographic regions, and then prioritizing investments in line with the opportunities."
In 2011, Gartner says the total public cloud services market was $91.4 billion. By 2016, the firm forecasts it will top $206.6 billion.
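As a sanity check, the growth rates implied by those endpoints can be recomputed directly; the snippet below is just arithmetic on the numbers quoted in this article, not data from Gartner's full report.
```python
# Back-of-the-envelope check of the Gartner figures quoted above.
market_2011 = 91.4  # total public cloud services market in 2011, $B
market_2012 = market_2011 * 1.196  # the 19.6 percent growth forecast for 2012
print(f"2012 forecast: ${market_2012:.1f}B")  # ~109.3, matching the $109B figure

# Implied compound annual growth rate from 2011 to the 2016 forecast.
market_2016 = 206.6
cagr = (market_2016 / market_2011) ** (1 / 5) - 1
print(f"implied 2011-2016 CAGR: {cagr:.1%}")  # roughly 17.7 percent per year
```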
Gartner's forecast is anything but absurd. Mobile users are increasingly dependent on the cloud, in part because they're somewhat forced to be, and also because of the convenience factor. Think about the number of media tablets that skimp on built-in storage, some of which don't offer a way to expand via a microSD card slot.
On the IT side, Gartner notes that business process as a service (BPaaS) is the largest cloud-based segment, accounting for more than two thirds of the market. In other words, firms are increasingly outsourcing their payroll, printing, ecommerce, and other business processes to the cloud.
While on the topic of cloud computing, be sure to check out our fancy cloud managing guide for some indispensable tips!
Will OpenStack Usher in a Cloud Revolution?
OpenStack is seen by many as the project that will, finally, make the cloud enterprise ready. Detractors don’t believe it and point to political in-fighting, domination by large vendors and a lack of maturity.
By Jeff Vance
If you’re already tired of all the hype surrounding cloud computing, you’d better brace yourself for another cycle focused on OpenStack, the open-source cloud platform that’s touted as the operating system for the cloud.
The forces aligning behind OpenStack are impressive ones. The project originated in NASA, was moved along in partnership with Rackspace and is now spearheaded by the OpenStack Foundation. OpenStack is backed by 180 member companies, including biggies like AT&T, HP, IBM, Cisco, VMware and Intel.
Oh, and the Foundation has $10 million in funding. Not bad.
Despite its roster of supporters, OpenStack has plenty of detractors too. Before acquiring network virtualization startup Nicira, recently added member VMware bad-mouthed OpenStack as an “ugly sister” to vCloud (the other ugly sisters being Citrix-led CloudStack and Eucalyptus).
I should also note that while you don’t hear much out of Amazon, AWS is the 800-pound, thus far silent, gorilla in the room.
Just when you were hoping the enterprise cloud picture was getting clearer, along comes political in-fighting about whose cloud is more open and which will meet the performance standards needed for enterprise-class cloud computing.
VMware Joins – Consensus or Kiss of Death?
OpenStack was originally seen as an open alternative to VMware’s proprietary dominance over data center virtualization and what would eventually turn into proprietary clouds.
Boris Renski, EVP of cloud startup Mirantis and a Gold Member of OpenStack, isn’t happy about VMware’s participation in OpenStack. On the Mirantis blog, Renski wrote:
"Subduing OpenStack is exactly what VMware did by joining the foundation. Every enterprise considering OpenStack that we ever encountered at Mirantis was primarily interested in OpenStack as an open alternative to proprietary VMware. While in reality OpenStack and VMware are different kinds of beasts, perception-wise there is no argument: enterprises see OpenStack as a substitute for VMware. Now, with VMware in the OpenStack foundation, every enterprise buyer will rightfully ask the question: 'If OpenStack is not competing with VMware, then what the hell is OpenStack?'”
Not everyone in OpenStack feels this way. “Boris was one of two board members to vote against VMware. Two out of twenty-four,” said Josh McKenty, a co-founder of OpenStack and the CEO of Piston Cloud Computing.
“Look at hypervisors, they [VMware] have 90 percent market share. Every cloud has a hypervisor in it somewhere. VMware is in best position to make all of that work, and when we talk to end users, they all tell us they want VMware in OpenStack,” he said.
McKenty added that what is good for the overall OpenStack community in this case might not work out as well for Mirantis. We’ll see.
Performance and Maturity
Other detractors point to the relative immaturity of OpenStack and performance issues.
German company Dolphin IT Services downloaded OpenStack, gave it a test run, but then abandoned it.
“It became apparent quite fast that the product was not mature enough to be deployed productively. A lot needed to be done on our side in order for it to work correctly. There were still some features we dearly needed that were not implemented yet – or not sufficiently implemented – like billing and an appealing Web front end,” said Andreas Kunter, CEO of Dolphin IT Services.
The company instead adopted the cloud platform from startup OnApp. “[With OnApp] we could basically deploy out-of-the-box. It integrated nicely with our chosen billing platform, and allowed us to grow our business without having to pay large amounts upfront,” he said.
One of the main concerns Kunter had with OpenStack is support. “Support after deployment is often underestimated,” he said. “We know a lot about virtualization and hypervisor technology, but we are still learning while maintaining our cloud. When things go wild and you need to solve it fast, it is good to have proficient support to back you up.”
Plenty of companies, Rackspace included, intend to make money supporting OpenStack. However, it will take time to build those support teams out, and to get them up to speed on all of the ins and outs of the sprawling OpenStack project.
The other two major complaints that detractors have about OpenStack are: 1) big vendors like Cisco, HP, IBM, Intel and VMware could dominate the project in ways that won’t be beneficial to the larger community and 2) OpenStack performance isn’t yet competitive with the likes of AWS.
As for complaint one, anytime you have that many big vendors working together, people will be wary of them. Would it be better for them to simply offer an array of competing products, rather than putting their proprietary layers and services over a common core?
At the very least, the threat of vendor lock-in, a very real cloud concern, is lessened with OpenStack.
Complaint two is trickier.
Cloud storage startup Nasuni has tested all the major bulk cloud storage service providers for performance, stability and scalability. “We’re neutral about OpenStack,” said Andres Rodriguez, CEO of Nasuni. “But we’re going to rigorously test any product or service we may offer to our own customers.”
Rodriguez pointed out that Nasuni didn’t specifically test OpenStack, but since Rackspace developed and uses OpenStack, Nasuni regards Rackspace Cloud as the OpenStack showcase. Based on benchmarks Nasuni set, only six out of the sixteen cloud storage providers they tested passed.
Rackspace did indeed pass, but the cloud storage test found that Rackspace lagged far behind leader Amazon in a number of key metrics, including speed, both read and write errors, and stability.
“Remember, cloud storage is all about scale, and there’s a huge gap between Amazon and everyone else, just based on scale alone. Most of Rackspace’s business is still colocation. Their cloud storage footprint is dwarfed by Amazon,” he said.
While cloud storage is new to most vendors, Amazon has been working away on this problem for the past twelve or fifteen years. “By the time they went public, they’d already accumulated seven years or so of real-world operational experience. Challengers are coming in a decade late,” he added.
The Future of OpenStack
Storage is only one part of OpenStack, and arguably one of its least mature parts. It’s premature to write it off as something that will never challenge Amazon, especially as storage demands continue to skyrocket.
And not all storage is the same. Mission-critical storage is much different than, say, backups of non-critical media files.
McKenty, the OpenStack co-founder, said that his next major step for OpenStack is to get the separate projects better joined beneath a common framework. With so many developers working independently, the projects can grow apart. “It’s a challenge to figure out how to keep the projects loosely coupled, so you don’t stifle the creativity of developers, yet linked, so everything works well together,” he said.
He believes it’ll be another release or two before they reach that goal, but when they do, OpenStack will have a single command line that will ensure interoperability at the most basic level.
“We’ve learned plenty of lessons along the way,” said Wayne Walls, one of Rackspace’s key OpenStack developers. “We quickly learned that you could have about a billion different variations of an OpenStack cloud where you tell people to download it and go run it as they see fit. But that’s very hard to support at scale.”
Walls believes that OpenStack is nearly past its awkward “immature” phase. Companies are taking OpenStack and developing products around it. If you look at some of the startups in the OpenStack Foundation, including Mirantis and Piston Cloud Computing, all of their messaging focuses on OpenStack. You would have a hard time getting much VC funding if OpenStack was a doomed science project.
“OpenStack has 550,000 lines of code today, with close to 500 developers around the world contributing code,” Walls said. “This model pushes best of breed to the top. Whether it’s networking or storage or whatever else, the world’s top experts decide how things are done.”
The other key variable that Walls pointed to is the fact that no matter “how deep in the weeds” you get with OpenStack, it’s very easy to pick up your assets and move them. It’s not nearly as easy to do so with closed APIs.
Of course, this is important if enterprises are ever going to gain confidence in public clouds, but it’s equally important for private clouds. This kind of portability means an OpenStack cloud, almost by definition, is a hybrid cloud, allowing you to move back and forth between private and public environments almost at will.
That's a competitive advantage OpenStack is building, and one that competing solutions will have trouble matching.
Why corporate strategy needs to change with the cloud
By Prabhakar Gopalan
Prabhakar Gopalan looks at how the skill sets and mindsets of corporate strategists need to change in the cloud computing era. This means everything from learning to accept new revenue models to being technologically savvy enough to actually build new products.
Cloud computing changes everything, including corporate strategy as a practice. I have listed five reasons why, although I'm sure there are many more. Long story short: Corporate strategists need to get out of their 20th century mindset and into the 21st century.
1. Emergent strategy rules
For years, the practice of strategy has been about analyzing value chains and applying frameworks like Porter's five forces or newer strategic-intent-driven ideas like Blue Ocean Strategy. The problem with those framework-driven ideas is that they assume a very static, deterministic model of the world. They work when the variables required to solve a problem are already well known, few in number and slow to change.
Cloud computing doesn't operate in that intentional strategy space. There are a lot of unknowns, many of which can change rapidly. A small firm could develop something valuable very quickly and scale it to millions of users in a very short time, and all the equations about competitor reaction and supply and demand forecasts become irrelevant (hey, that's why we have auto scaling!). The frameworks have to be discarded for more agile ways of solving problems.
2. Subject matter expertise in technology matters
As traditional growth frameworks and models are rendered useless (see No. 1 above), so too are the operators of those models. A consultant with an MBA but no subject matter expertise in the cloud or its underlying technologies can tell your firm very little about what it means to design or develop a product for the future.
Want strategists? Hire technologists and entrepreneurs who can talk real subject matter, produce prototypes and demos, and delight your customers. Who wants another boring PowerPoint that debates whether the next big widget market is going to be $5 billion or $500 billion next year? Hire makers, and go make it.
Traditional corporate strategy teams staffed with ex-big-consulting-firm consultants should find the exit door as soon as they can if they are in a meeting to discuss growth plans incorporating the cloud. Move them to areas of business strategy that cloud computing doesn't heavily affect (are there still some?) or retrain them. Even the billionaire mayor of New York City wants to retrain himself and learn coding!
3. Product teams inform corporate growth in the cloud, not corporate strategy teams
If you are reaching out to your corporate strategy team to figure out what the next wave of innovation is, you have already missed the boat. Ask your product strategy team that is in front of your customer every day. They can tell you where the next pocket of growth is going to be. They are spending the time with the customer and know what to make while the corporate drones are still analyzing the spreadsheets and profit pools of a business that is past its prime, or analyzing a market entry problem when you are already locked out of the market.
GrubHub CEO Matt Maloney was right on when he said, "The take-away was that a strategy may play out on paper, but the only way to truly test the validity of your product is to put it in front of customers."
4. Long tail + subscription economy = Granular growth
Firms that are used to billion-dollar businesses built largely on lock-in, on slow incremental-feature-driven innovation after years of squeezing customers, and on near monopoly in market segments are finally seeing how the cloud's production, distribution and pricing channels are changing the economics of growth. A large software firm used to getting thousands of dollars for licensed software now has to compete with a nimble firm that can deliver the same or better software for just a few dollars a month. Production, distribution and pricing have all changed the basis of competition.
To compete in this space, corporate strategy has to reset expectations of what growth means: can larger firms deal with smaller doses of revenue? It took Amazon Web Services almost eight years to top a billion dollars in revenue (see How Big is AWS), while the closest competitor, Rackspace, has taken almost the same time to get to $188 million in annual revenue.
So, the road to large revenues and profits is going to be much longer and more measured. For example, the average revenue per user (ARPU) for Evernote, a popular SaaS note-taking application, is less than $2 for most versions of its product.
5. Revisiting growth via M&A
Large incumbents of non-cloud technologies have resorted to M&A as a growth path. The business cases for these M&A projects are based on synergy: acquiring intellectual property and distributing it through existing channels that were developed over many years.
The problem is that the synergy component depends on cutting costs by a lot. And when you welcome the cost cutters, they don't know what inspires and spurs innovation and experimentation. They go at the budget with their chopping block and cut costs across the board. R&D budgets get tossed out, and what you get is an innovative cloud startup squeezed into a non-cloud firm that doesn't understand what generates continuous innovation.
The next is IP acquisition. Here again, the IP that many of the innovative firms have developed is in design — user interface and user experience — which means discarding the old and making something really simple and easy to use. It is hard to measure that IP in traditional terms. Further, most of the development is based on open source and crowd sourced technologies, so there’s not a whole lot of proprietary technology to own when you buy a cloud computing startup.
Lastly, the distribution channel of the incumbents is of very little use because cloud services aren’t delivered in the same way previous services are delivered. Thousands of firms are developing applications over weekend hackathons using open source technologies and uploading them to app stores or their own sites accessible to anyone with an Internet connection. Few incumbents have significant investments in the value chain of these developers.
What to look for in your strategy leadership?
Companies need strategic decision makers and executives with the ability to overcome their own cognitive biases against radical new methods of delivering services to customers. In other words, they need executives who view the world through the lens of modern computing and business paradigms, who can take delight in looking at the latest app and getting curious about its AARRR metrics, and who can take a startup approach to market size and traction. These things will significantly affect their ability to experiment and grow the corporate strategy practice in the brave, new cloudy world.
Prabhakar Gopalan is an entrepreneur and a product guy. Opinions expressed here do not reflect those of his employers. You can follow Prabhakar on Twitter: @PGopalan. Prabhakar is the founder of Simple Idea Labs, and kanban2go.com is one of their first experiments. Prabhakar writes and talks about products, strategy and chaos.
Surprise! VMware will join OpenStack
By Barb Darrow
Never say never. VMware is about to join the OpenStack Foundation, a group initially backed by other industry giants as a counterweight to VMware’s server virtualization dominance. Intel and NEC are also on deck to join as Gold OSF members.
Just in time for VMworld, VMware is about to join the OpenStack Foundation as a Gold member, along with Intel and NEC, according to a post on the OpenStack Foundation Wiki. The applications for membership are on the agenda of the August 28 OpenStack Foundation meeting.
A year ago, a VMware-OpenStack hookup would have been seen as unlikely. When Rackspace and NASA launched the OpenStack Project more than two years ago, it was seen as a competitive response to VMware’s server virtualization dominance inside company data centers and to Amazon’s heft in public cloud computing. Many tech companies including but not limited to Rackspace, IBM, Hewlett-Packard, Citrix, Red Hat and Microsoft saw VMware as a threat and were bound and determined to keep the company from extending its virtualization lock into the cloud.
But things change. VMware's surprise acquisition of Nicira and DynamicOps last month showed there might be a thaw in the air. For one thing, Nicira is an OpenStack player. By bringing Nicira and DynamicOps into the fold, VMware appeared to be much more willing to work with non-VMware-centric infrastructure, as GigaOM's Derrick Harris reported at the time.
This is a symbolic coup for OpenStack and its biggest boost since IBM and Red Hat officially joined as Platinum members in April. And it's especially important since Citrix, a virtualization rival to VMware, undercut its own OpenStack participation last April by pushing CloudStack as an alternative open-source cloud stack.
OpenStack Gold members, which include Cloudscaling, Dell, MorphLabs, Cisco Systems, and NetApp, pay a fee pegged at 0.25 percent of their revenue — at least $50,000 but capped at $200,000, according to the foundation wiki. (VMware's fee will be $66,666, according to the application, submitted by VMware CTO Steve Herrod, which is linked on the wiki post.) Platinum members — AT&T, Canonical, HP, Rackspace, IBM, Nebula, Red Hat, and SUSE — pay $500,000 per year with a three-year minimum commitment.
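The Gold fee schedule is easy to express as a formula. The sketch below simply encodes the floor and cap described above; note that VMware's quoted $66,666 does not fall straight out of this formula, so it presumably reflects proration or some other adjustment not described in the wiki post.
```python
# Hedged sketch of the OpenStack Foundation Gold member fee as described
# above: 0.25 percent of annual revenue, with a $50,000 floor and $200,000 cap.
def gold_member_fee(annual_revenue: float) -> float:
    return min(max(0.0025 * annual_revenue, 50_000), 200_000)

print(gold_member_fee(10_000_000))     # floor applies: 0.25% is only $25,000
print(gold_member_fee(40_000_000))     # $100,000, within the band
print(gold_member_fee(1_000_000_000))  # cap applies: $200,000
```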
Selecting a private cloud is harder than you think
Most providers that started with public clouds now do private clouds, and private clouds in turn become even more confusing
By David Linthicum
As InfoWorld's Ted Samson reported last week, Rackspace has released Rackspace Private Cloud Software, which is the same complete version of Essex OpenStack the company runs in its own hosted private clouds. This move was designed to get Rackspace more traction in the cloud computing market, targeting the greater-than-expected spending on private clouds by enterprises.
Rackspace's latest release reminded me of what's been happening steadily in cloud computing over the last few years. While we keep discussing public clouds, almost like it's a religion, enterprise IT continues to gravitate toward private clouds in a big way. Not surprisingly, traditional on-premise vendors like Hewlett-Packard, Microsoft, and IBM have responded in kind. But so too have the "traditional" cloud vendors, such as Rackspace and, with its Eucalyptus partnership, Amazon Web Services.
However, selecting a private cloud is harder than you may think, even when dealing with vendors you already know. No relevant standards exist, other than the emerging open source initiatives, and what constitutes a private cloud seems to be in the eye of the beholder. When you evaluate private clouds, you'll find it's difficult to compare them.
For example, consider features such as how a private cloud provisions resources, manages tenants, and handles use-based accounting, logging, application management, databases, and even security. All vendors approach these very important aspects of cloud computing in a different way, and some have skipped over one or two features altogether.
That's why it's hard to select a private cloud vendor. Of course, it always comes down to understanding technology and business requirements. But even with that necessary prerequisite accomplished, making sense of who has what and what it actually provides requires some detective work. Look beyond the hype to what is actually ready for deployment and how it all works together.
Cloud computing: The great equalizer
BY HERNS A. HERMIDA
CLOUD COMPUTING has emerged as the 'great equalizer' in business, and is rapidly changing the way many companies operate.
Local companies are now adopting cloud-based, capex-free business solutions that enable them to be more flexible, cost efficient and globally competitive. It is a paradigm shift in which money usually spent on maintaining expensive in-house or on-premise hosted applications turns into savings, without the need to invest heavily in hardware, software licenses, or IT personnel dedicated to maintaining these systems.
With the cloud, companies are now able to quickly implement easy-to-use enterprise applications for internal use, resulting in a far more productive workplace because of employees’ ability to work anywhere on any device, and helping the company focus on its core business and generate revenue earlier than otherwise projected.
The Budget Equalizer
With minimal investment on equipment and licenses, taking operations to the cloud is perhaps the most ideal way to do business today. It’s truly a game changer like no other.
In the past, only large corporations with the resources and capital to invest in IT were able to streamline internal processes and manage operations through costly IT systems such as ERP (enterprise resource planning), CRM (customer relationship management), and database management. Small and medium enterprises, or SMEs, on the other hand, had to make do with what their limited IT budgets could provide to stay competitive in the market.
Not anymore. More SMEs are now getting the same capabilities as their larger, more established counterparts, thanks to the cloud. Today, we see numerous business applications available in the cloud including ERP, CRM, financial systems, unified communications, and proprietary applications.
Amid all the value-added features that enterprise cloud apps offer, however, it is the "capex-free" come-on that has proven to be the driving force behind the growing migration to the cloud. Because these business tools are paid for on a per-account, per-year (or per-month) basis, companies are able to manage their funds better and take their products and services to market faster, all while enjoying the benefits of cloud computing for their day-to-day operations.
The Productivity Equalizer
Consider this: a global leader in consumer cosmetic products now uses its own social app, riding on its cloud computing infrastructure, to enable its sales consultants to connect with clients and each other and to receive company and product updates in real time, anytime, anywhere.
Productivity is enhanced by the accessibility of cloud-based applications. As long as there is Internet access in the area, work gets done. Remote and traveling personnel can easily access and edit files and systems in real time from wherever they are located. Cloud applications not only cut companies' software costs by more than half but also increase productivity through real-time collaboration among multiple users.
Cloud-based business applications range from workforce management and sales distribution to customer relationship management and even customized applications, all of which aim to improve workplace productivity, and help organizations achieve business success.
Projects involving hardware installations, such as servers, are also ramped up faster in a cloud environment. Resources are easily scaled up or down depending on usage, while changes are easily made without the need to buy additional hardware.
A very simple way for a company to get one foot in the cloud is to enhance its office email experience with a secure web-based service instead of a terminal-based email client. This way, work email can be accessed practically anywhere, improving response times and overall productivity.
The Security Equalizer
According to a recent article, a leading Spanish bank ran an initial pilot in which 7,000 of its employees used cloud-based email and productivity applications. After the pilot surfaced no significant issues, the rest of the bank's 110,000-strong workforce now trusts the cloud even with sensitive financial data.
With any computing solution, whether cloud-based or not, security is always an important issue. And it is one of the reasons many corporations remain hesitant to adopt a cloud computing mindset.
The best cloud providers implement multiple layers of security to ensure the confidentiality, integrity, and availability of critical information. ISO 27001:2005, the international standard for information security management systems, certifies exactly this. By being ISO 27001 certified, cloud providers assure customers of world-class security measures for their cloud services, giving them peace of mind for their business.
Though cloud security depends largely on the service provider's ability to ensure physical and network security and the resiliency of its data center environment, it also rests on customers' ability to safeguard the log-in credentials that most corporate IT platforms use to limit end-user access to sensitive data.
The Effectiveness Equalizer
Large companies stand to gain from the cloud's shared resources, which allow more flexible operations and healthier bottom lines. And if large companies in established industries can make noteworthy strides on the cloud, there is no reason SMEs cannot enjoy the same progress with the same inexpensive, secure, and productivity-enhancing infrastructure.
One of the Philippines' leading low-cost airlines migrated from its legacy email system to a cloud-based email platform to enhance accessibility, security, and overall employee productivity. Because cloud email services are highly scalable, the company can adjust its number of users and its online storage allocation as required, and pay only for what it uses. Such flexibility is hard to find outside cloud services.
That was just for email.
A leading transport and logistics company in the country uses the cloud for various facets of its business, such as account management, marketing automation, database management, pipeline management, and business analytics and reporting. It is a testament to how the cloud can be an essential component of any business, even one with an existing ERP.
The Philippine economy is poised to hugely benefit from local enterprises’ continuing adoption of cloud solutions, especially once the technology reaches its tipping point in our developing nation.
Herns A. Hermida is the Vice President for Consulting Services - Business Solutions Group of IP-Converge.
OpenStack cloud fluffer growing faster than Linux
Two years in, and still not quite a cloud operating system
By Timothy Prickett Morgan
Analysis The OpenStack cloud controller, launched two years ago to the day by techies at NASA Ames Research Center and Rackspace Hosting, has come a long way in its infancy.
With Hewlett-Packard and Rackspace ramping up clouds based on the current "Essex" release, OpenStack is only just learning to walk. But it won't be long before companies looking to deploy private clouds – or to sell them as a service – will not only be looking for OpenStack to run, but to run well, run fast, and run networking in addition to compute and storage.
It's a lot to ask of a two-year-old, but OpenStack has a lot of mentors and advisors. If the OpenStack Foundation does what it is supposed to do, then it will be a meritocracy, not dominated by Rackspace, that will allow some of the best minds in the world to perfect a cloud operating system.
They may call it by different names, but the data center is the new server and a cloud operating system is exactly what Citrix Systems, Microsoft, Red Hat, and VMware have been seeking to build with their various software stacks.
The idea is to turn static physical hardware into malleable virtual hardware at the compute, storage, and network layers, allowing it to be programmatically controlled. Applications and their required software underpinnings can be puffed up in an instance and shuffled around inside a private cloud, or pushed out to a public cloud as workloads and economics dictate.
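As a minimal sketch of what "programmatically controlled" means in practice, the snippet below boots a virtual server through a cloud API; it assumes an OpenStack-style cloud and the python-novaclient library, and every name, credential, and endpoint in it is hypothetical.

    # Minimal sketch: "puff up" an instance entirely from code.
    # Assumes an OpenStack-style cloud and python-novaclient; all
    # names, credentials, and endpoints are hypothetical.
    from novaclient.v1_1 import client

    nova = client.Client("demo_user", "demo_password", "demo_tenant",
                         "https://cloud.example.com/v2.0/")

    image = nova.images.find(name="ubuntu-12.04")   # hypothetical image
    flavor = nova.flavors.find(name="m1.small")     # hypothetical flavor
    server = nova.servers.create("web-01", image, flavor)
    print(server.id, server.status)  # new instance, building in the background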
Rackspace wanted to transform itself from a simple hosting provider to a cloud provider, so it started work on cloudy storage under its Mosso subsidiary back in 2006. That effort got beefed up when Rackspace acquired Jungle Disk in October 2008, on the same day that it bought Slicehost, which had created its own compute cloud biz.
The Mosso cloud storage and Jungle Disk access code morphed into the Swift storage controller, which Rackspace currently uses in production. The Ozone compute controller, based on ideas from Slicehost, was underway when the NASA techies and the Rackspace techies decided to pool their efforts and at the same time foster an open source community to drive development of an alternative cloud control freak.
At the time, Eucalyptus was the early favorite as an alternative to VMware's "virtual data center operating system." It was embracing multiple hypervisors and virtual machine images to gain leverage over VMware and what would eventually become vSphere and vCloud Director. Canonical, the commercial entity behind Ubuntu Linux, embraced Eucalyptus and embedded it inside of its distro, and server makers fell all over themselves saying they were going to build Eucalyptus clouds.
Everyone talks about OpenStack now. That is, unless they are talking about CloudStack, the alternative now championed by Citrix Systems. Citrix left the OpenStack community (more or less) and open sourced CloudStack under an Apache license back in April. Citrix and the rest of the OpenStack community have a fundamental difference of opinion that cannot be resolved.
OpenStack, which is more or less driven by Rackspace at this point, wants to build a cloud fabric that has its own set of APIs. Citrix and the CloudStack community want to create an open source fabric that adheres to the API sets created by Amazon for its eponymous Web Services cloud. You can imagine that Rackspace would rather crawl through 500 miles of the west Texas desert than kowtow to Amazon, which pretty much rules the public cloud these days.
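To see how the two camps diverge in practice, here is a hedged sketch of the same "launch a server" request in each API dialect: the OpenStack-native compute API (via python-novaclient) and Amazon's EC2 API (via boto). All credentials, endpoints, and IDs are hypothetical placeholders.

    # The same intent, two rival API dialects. All credentials,
    # endpoints, and IDs below are hypothetical.

    # OpenStack-native API, the dialect Rackspace is betting on:
    from novaclient.v1_1 import client as nova_client
    nova = nova_client.Client("demo", "secret", "demo_tenant",
                              "https://cloud.example.com/v2.0/")
    nova.servers.create("web-01",
                        nova.images.find(name="ubuntu-12.04"),
                        nova.flavors.find(name="m1.small"))

    # Amazon's EC2 API, the dialect CloudStack adheres to:
    import boto.ec2
    ec2 = boto.ec2.connect_to_region("us-east-1",
                                     aws_access_key_id="AKIA...",
                                     aws_secret_access_key="secret")
    ec2.run_instances("ami-12345678", instance_type="m1.small")

Code written against one dialect does not run against the other without a translation layer, which is precisely the lock-in argument each camp levels at its rival.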
OpenStack ramps faster than Linux
Since OpenStack launched two years ago, a flurry of IT vendors and independents have rallied to its banner. Jim Curry, general manager of the Cloud Builders program at Rackspace and the man who spearheads the OpenStack efforts from inside Rackspace, says that the enthusiasm building around OpenStack rivals that of Linux in its early days.
Curry did a little math, and he tells El Reg that it took 828 weeks to get 180 companies contributing code to the Linux kernel, whereas OpenStack had lined up contributors from 166 companies in 84 weeks. In terms of actual code commits, Linux had 200 contributors by week 615 of its existence, while OpenStack had 206 contributors by week 84.
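A quick back-of-the-envelope check, using only the figures Curry quotes, shows how steep that difference in ramp rate is:

    # Back-of-the-envelope check on Curry's figures, quoted above.
    linux_rate = 180 / 828.0     # companies contributing per week, Linux
    openstack_rate = 166 / 84.0  # companies contributing per week, OpenStack
    print(round(linux_rate, 2))                   # ~0.22
    print(round(openstack_rate, 2))               # ~1.98
    print(round(openstack_rate / linux_rate, 1))  # roughly a 9x steeper ramp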
To show you how developer interest has grown, there were 75 people in attendance at the first OpenStack Design Summit in July 2010. There were more than 1,600 people at the event in April when the "Essex" release was launched.
From Austin to Diablo to Essex
At the moment, there have been over 200,000 downloads from the OpenStack repository, and that doesn't count all of the versions that have been repackaged by Canonical, SUSE Linux, Piston Computing, and Stackops. There are over 100 commercial clouds running OpenStack today, including ones from AT&T, HP, Deutsche Telekom, DreamHost, Korea Telecom, NTT, and Internap.
And, of course, soon Rackspace itself will offer a commercial cloud. It is currently in the process of beta testing the "Essex" release in production. (The Swift controller has been used for years in production at Rackspace for the Cloud Files service; what is changing is the move to the Nova compute controller and all of the other tools that wrap around the two.)
The Essex release had 235 code contributors, according to Curry. Meanwhile the OpenStack Foundation, formed late last year, is moving towards taking over the steering of the various OpenStack projects that make up the cloud control freak.
Curry says that the foundation will take control in the next two to three months, when the trademarks and other intellectual property will be transferred to the foundation, and that the goal is for this to happen before the OpenStack Design Summit in October. The governance documents for the foundation have been drafted and are sufficiently rigorous for Red Hat to join up. OpenStack is searching for an executive director at the moment and is doing interviews.
Jim Curry, GM of OpenStack Cloud Builders at Rackspace
"We used to manage the project, but we are now acting as a member of the community," says Curry.
In the wake of the initial code dump from NASA, Rackspace was pretty much the only code contributor to the project, but now Rackspace's contributions are down to around 50 per cent – and that is against contributions that are two to three times as great as when the project was first started two years ago. So the amount of code coming out of Rackspace is actually growing, but that contributed by other community members is growing much faster.
"The rest of the community is taking responsibility for the project, and that is what we want. We didn't open source this because we wanted to control this," Curry says.
In fact, Curry says that Rackspace did not expect the foundation to take control of OpenStack until 2013 or 2014, so if anything, getting it done by this October is early.
Incidentally, the OpenStack Foundation just announced in a blog post that individuals in the OpenStack community can now formally join the foundation and therefore have a say in nominating board members.
OpenStack: Folsom and beyond
With the Linux kernel and its related operating system layers in the late 1990s and early 2000s, each subsequent release was important because so much functionality was being added every time Linus Torvalds gave the go-ahead. Linux today is much more sophisticated and useful, but the changes are arguably not as jarring as they were, because Linux and the process by which it is created are both more mature.
As a two-year-old, OpenStack is still, in many ways, trying to code wings onto the airplane as the fuselage rockets down the runway towards liftoff. It is a fun time for OpenStack contributors, and this, says Curry, is one of the reasons why the ramp for OpenStack is steeper than that for Linux at the same point in its cycle.
"The promise of an open cloud is a very interesting problem to work on," says Curry. "We got into a market that clearly needed a solution."
It took a dot-com bust to really put the shine on Linux and take the bloom off Unix and proprietary machines.
The next release of OpenStack is code-named "Folsom," and it has just passed its third milestone, with the goal of getting it out the door in the last week of September. It will add Quantum virtual networking, Cinder block storage, and substantial enhancements to the Nova compute controller, the Glance virtual machine imaging system, and the Horizon management dashboard.
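For a flavor of what Quantum's virtual networking adds, here is a minimal sketch of defining a network and subnet in software; the import path and call signatures are assumptions based on the Quantum client library of the era, and all names, credentials, and addresses are hypothetical.

    # Minimal sketch: define a virtual network in software, the kind
    # of thing Quantum brings to Folsom. Import path and signatures
    # are assumptions based on the era's Quantum client; all names,
    # credentials, and addresses are hypothetical.
    from quantumclient.v2_0 import client

    quantum = client.Client(username="demo", password="secret",
                            tenant_name="demo_tenant",
                            auth_url="https://cloud.example.com/v2.0/")

    net = quantum.create_network({"network": {"name": "demo-net"}})
    net_id = net["network"]["id"]
    quantum.create_subnet({"subnet": {"network_id": net_id,
                                      "ip_version": 4,
                                      "cidr": "10.0.0.0/24"}})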
"Folsom is coming along well, and we have a lot of new things coming," says Curry. But speaking on a personal basis, not as someone in charge of OpenStack but as a member of the community, Curry wants the OpenStack community to slow down a bit.
"I am glad that all of these features are being added, but my personal focus would be to make this more usable," he says. "The speed of feature additions is not always conducive to ease of deployment and management."
That, ultimately, may be the other big selling point for OpenStack over other alternatives, aside from it being free if you are smart enough to support it by yourself (just like Linux). The Penguin was never going to be able to knock Windows off the desktop because of the overwhelming familiarity (and for many, contempt) that we all have with Windows as it has grown up on our desktops and moved into servers over the decades.
But in the cloud operating system racket, nothing is familiar because, quite honestly, so few companies are really doing anything more than server virtualization. OpenStack has an even chance of beating VMware vCloud, even if Citrix tries to go its own way with the Apache-licensed CloudStack and hitches its future on compatibility with AWS.
Microsoft is always a wild card because there are so many Windows servers out there and it can give away for free what others have to charge for. Red Hat is trying to do its own thing with OpenShift and CloudForms, but it may just give in, move to OpenStack once it matures, and ship a commercial-grade variant with support.
Year three promises to be interesting for OpenStack. That's for sure.