May 2014 Archives

My 10 minutes with Google's datacentre VP

avenkatraman

Google's Joe Kava speaking at the Google EU Data Center Summit (Photo credit: Tom Raftery)

At the Datacentres Europe 2014 conference in Monaco, I had the chance not just to hear Google's datacentre VP Joe Kava deliver a keynote speech on how the search giant uses machine learning to achieve energy efficiency, but also to speak to him one-to-one for 10 minutes.

Here is my quick Q&A with him:

What can smaller datacentre operators learn from Google's datacentres? There's a feeling among many CIOs and IT teams that Google can afford to pump millions into its facilities to keep them efficient.

Joe Kava: That attitude is not correct. In 2011, we published an exhaustive "how to" instruction set explaining how datacentres can be made more energy efficient without spending a lot of money. We can demonstrate it through our own use cases. Google's network division, which is the size of a medium enterprise, had a technology refresh, and by spending between $25,000 and $50,000 per site we could improve its high-availability features and bring its PUE down from 2.2 to 1.5. The savings were so great that the IT spend paid for itself in just seven months. You show me a CIO who wouldn't like a payback in seven months.
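To put rough numbers on that kind of claim, here is a minimal sketch of the arithmetic. PUE is total facility power divided by IT equipment power, so cutting it from 2.2 to 1.5 shrinks the overhead energy bill; the IT load, electricity price and upgrade cost below are my own illustrative assumptions, not Google's figures.

```python
# Illustrative sketch only: the IT load, energy price and upgrade cost
# are assumptions for the sake of the arithmetic, not Google's figures.
IT_LOAD_KW = 200        # assumed IT equipment load per site (kW)
ENERGY_PRICE = 0.10     # assumed electricity price ($ per kWh)
UPGRADE_COST = 50_000   # top end of the $25,000-$50,000 quoted per site
HOURS_PER_YEAR = 24 * 365

def facility_kwh_per_year(pue, it_load_kw=IT_LOAD_KW):
    """PUE = total facility power / IT power, so total energy = PUE * IT energy."""
    return pue * it_load_kw * HOURS_PER_YEAR

annual_saving = (facility_kwh_per_year(2.2) - facility_kwh_per_year(1.5)) * ENERGY_PRICE
payback_months = UPGRADE_COST / (annual_saving / 12)

print(f"Annual saving: ${annual_saving:,.0f}")   # ~$123,000 on these assumptions
print(f"Payback: {payback_months:.1f} months")   # ~5 months on these assumptions
```

The actual payback clearly depends on the real IT load and tariff at each site, which is why Kava's seven-month figure is specific to Google's refresh.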

Are there any factors, such as strict regulations, that are stifling the datacentre sector?

It is always better for an industry to regulate itself than to have the government do it; self-regulation fosters innovation. There are many players in the industry that voluntarily regulate themselves in terms of data security and carbon emissions. One example is how, since 2006, the industry has rallied behind the PUE metric and taken energy-efficiency tools to heart.

What impact is IoT having on datacentres?

Joe Kava: IoT (internet of things) is definitely having an impact on datacentres. As more data is created and the cloud is adopted en masse, IT will naturally have to think differently about datacentres and their efficiency. IoT brings huge opportunities to datacentres.

What is your one piece of advice to CIOs?

You may think I am saying this because I am from Google, but I strongly feel that most organisations that operate their own datacentres shouldn't be doing it. That's not their core competency. Even if they do everything correctly, and even if they have a big budget to build a resilient, highly efficient datacentre, they cannot compete with the quick turnaround and the scalability that dedicated third-party providers can offer.

Tell us something about Google's datacentres that we do not know

It is astounding what we can achieve in terms of efficiency with good old-fashioned testing, development and diligence. The datacentre team constantly questions its parameters and pushes the boundaries to find new ways to save money through efficiency. We design and build a lot of our own components, and I am not just talking about servers and racks. We even design and build our own cooling infrastructure and develop our own components of the power architecture that goes into a facility.

It is a better way of doing things.

Are you building a new datacentre in Europe?

(Smiles broadly) We are always looking at expanding our facilities.

How do you feel about the revelations of the NSA surveillance project and how it has affected third-party datacentre users' confidence?

It is a subject I feel very strongly about, but it is a question that I will let Google's press and policy team handle.

Thank you Joe

Thank you!

 



No such thing as absolute freedom from vendor lock-in, even in open source, proves Red Hat

avenkatraman

OpenStack is a free, open source cloud computing platform that promises users freedom from vendor lock-in. When it was alleged that Red Hat would not support customers who run other versions of the OpenStack cloud on its Linux operating system, its president Paul Cormier passionately shared the company's vision of open source, but steered clear of stating wholeheartedly that it WILL support its users no matter which version of OpenStack they use.

Any CIO worth his salt will admit that support services can be a deal-breaker when deciding to invest in technology.

Red Hat customers opt for the vendor's commercial version of Linux (RHEL) over free Linux distributions because they want to use its support services and make their IT enterprise-class. This has helped Red Hat build a $10bn empire around Linux and become the dominant provider of commercial open source platforms.

OpenStack (Photo credit: Wikipedia)

So when Cormier says: "Users are free to deploy Red Hat Enterprise Linux with any OpenStack offering, and there is no requirement to use our OpenStack technologies to get a Red Hat Enterprise Linux subscription," and, separately, "Our OpenStack offerings are 100% open source. In addition, we provide support for Red Hat Enterprise Linux OpenStack Platform," customers who want full support are still likely to pick Red Hat's OpenStack cloud on the Red Hat operating system, resulting in supplier lock-in.

Cormier justified the position: "Enterprise-class open source requires quality assurance. It requires standards. It requires security. OpenStack is no different. To cavalierly 'compile and ship' untested OpenStack offerings would be reckless. It would not deliver open source products that are ready for mission-critical operations, and we would never put our customers in that position or at risk."

Yes, Red Hat has to seek growth from its cloud offerings, and as an open source leader it has to protect the reputation of the open cloud as enterprise-ready.

Red Hat's efforts in the open source industry are commendable. For instance, it acquired Ceph provider Inktank last month and said it would open source Inktank's closed-source monitoring offering.

But as the open source poster child, it also has a responsibility to contribute more to the spirit of the open cloud and to invest more in open source technology to give users absolute freedom to choose the cloud they like.

Competition among cloud providers is getting fiercer. To grab a larger share of the growing market, some cloud providers are slashing cloud costs while others are differentiating by offering managed services.  But snatching flexibility and freedom from cloud users is never a good idea.

But it would be unfair to single out Red Hat and demand that it alone open up its ecosystem. HP, IBM, VMware and Oracle are all part of the OpenStack project, and all have their own versions of the OpenStack cloud.

As Cormier says, "We would celebrate and welcome competitors like HP showing commitment to true open source by open sourcing their entire software portfolio."

Until then it's a murky world. What open source? What open cloud? 



Using cloud for test and development environments? Avoid this costly mistake

avenkatraman

Using cloud services for application testing or software development is becoming a common practice because of cloud's scalability, agility, ease of deployment and cost savings.

But some users are not reaping the cost-saving benefit, and in some cases are even seeing cloud costs soar, because of a simple error: they are not turning instances off when they are not in use.

Time and again, purveyors of cloud computing have highlighted scalability as its hallmark, and time and again users have listed the ability to scale resources up and down as one of the biggest cost-saving factors of the cloud.

But when discussing cloud costs and myths with a public cloud consultancy recently, I was shocked to learn that many enterprises using the cloud for testing and development forget to scale down their testing environment at the end of the day and end up paying for idle IT resources, defeating the purpose of using cloud computing.

Building a test and dev lab in the cloud has its benefits: it saves the team the time of building the entire environment from the ground up, and should the new software not work, they can launch another iteration quickly. But the main benefit is the lower cost.

But forgetful app testers and software developers may leave instances running and pay for cloud capacity through the hours of the night when no activity takes place on the infrastructure.

On the public cloud, turning off unused instances and capacity does not delete the testing environment, which means developers can simply scale the system up the next day and start from where they left off.
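To make that concrete, here is a minimal sketch of a nightly stop/morning start routine, assuming the AWS SDK for Python (boto3) and EBS-backed test instances; the Environment=test tag, the region and the scheduling are my own illustrative assumptions.

```python
# Minimal sketch, assuming boto3 and EBS-backed instances tagged
# Environment=test; the tag, region and scheduling are assumptions.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

def test_instance_ids(state):
    """Return the IDs of test-environment instances in the given state."""
    resp = ec2.describe_instances(Filters=[
        {"Name": "tag:Environment", "Values": ["test"]},
        {"Name": "instance-state-name", "Values": [state]},
    ])
    return [i["InstanceId"]
            for r in resp["Reservations"] for i in r["Instances"]]

def end_of_day():
    ids = test_instance_ids("running")
    if ids:
        ec2.stop_instances(InstanceIds=ids)   # EBS volumes are kept, so nothing is lost

def next_morning():
    ids = test_instance_ids("stopped")
    if ids:
        ec2.start_instances(InstanceIds=ids)  # pick up where the team left off
```

Wired up to a scheduler of your choice, a script along these lines stops the meter running overnight without throwing the environment away.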

But the practice of leaving programs running on the cloud is so common that cloud suppliers, management companies, and consultancies have all developed tools to help customers mitigate this waste.

For instance, AWS provides CloudWatch alarms which help customers set parameters on their instances so they automatically shut down if they are idle or underutilised.
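As a rough sketch of what such an alarm might look like with boto3, the snippet below stops an instance whose average CPU utilisation stays low for several hours; the instance ID, region, threshold and idle window are illustrative assumptions, not AWS recommendations.

```python
# Sketch of a CloudWatch alarm that stops an idle instance; the
# instance ID, region, threshold and idle window are illustrative.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="eu-west-1")

cloudwatch.put_metric_alarm(
    AlarmName="stop-idle-test-instance",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=3600,                 # look at hourly averages
    EvaluationPeriods=4,         # idle for four consecutive hours
    Threshold=5.0,               # below 5% CPU counts as idle
    ComparisonOperator="LessThanThreshold",
    # Built-in EC2 action that stops the instance when the alarm fires
    AlarmActions=["arn:aws:automate:eu-west-1:ec2:stop"],
)
```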

Another tool it offers is AWS Trusted Advisor, available free to customers on Business-level support or above. It looks at their account activity and shows them how they can save money by shutting down instances, buying Reserved Instances or moving to Spot Pricing.
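For accounts on that support tier, the Trusted Advisor checks are also exposed programmatically through the AWS Support API; a rough boto3 sketch of pulling the cost-optimisation checks might look like the following (the category string and output handling are assumptions on my part).

```python
# Rough sketch: reading Trusted Advisor cost-optimisation checks through
# the AWS Support API (Business-level support or above); assumes boto3.
import boto3

support = boto3.client("support", region_name="us-east-1")  # Support API is served from us-east-1

for check in support.describe_trusted_advisor_checks(language="en")["checks"]:
    if check["category"] == "cost_optimizing":            # assumed category string
        result = support.describe_trusted_advisor_check_result(
            checkId=check["id"], language="en")["result"]
        print(check["name"], "->", result["status"])
```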

"In 2013 alone, it generated more than a million recommendations for customers, helping customers realise over $207m in cost reductions," AWS spokesman told me.

Cloud costs can be slashed by following good practices in capacity planning and resource provisioning, but that is a strategic-level effort; quick savings can be achieved with simple, common-sense measures such as running instances only when necessary.

Perhaps it is time to think of cloud resources as utilities: if you don't leave the lights on when you leave work, why leave idle instances running on the pay-as-you-use cloud?

That's $207m in efficiency savings for the customers of just one cloud provider. Imagine.

 


AWS may be building a datacentre in Germany, but will the cloud data remain safe and private?

avenkatraman

As public cloud provider AWS looks to expand its datacentre footprint in Europe in the post-Prism world, it may have picked Germany because of the country's stricter regulations around data sovereignty. But the recent US court ruling ordering Microsoft to hand over one customer's email data held in its Dublin datacentre suggests that data in the cloud, regardless of where it is stored, may not really be private and secure.

While AWS has not explicitly said it is building a datacentre in Germany, at its London Summit last week Stephen Schmidt, its vice-president and chief information security officer, told me that the company is always looking to expand and that a Wall Street Journal article was "pretty explicit" about where its next datacentre might be.

The WSJ article quotes senior vice-president Andy Jassy naming Germany as the next datacentre location because of AWS's "significant business in Germany", where customers could be demanding that their data resides within the country.

According to Chris Bunch from Cloudreach, a UK cloud consultancy that implements AWS clouds, AWS is growing so fast and has such market dominance that adding capacity for further growth is clearly sensible; he expects AWS to have built a datacentre in the region within the next 12 months.

Amazon already has three infrastructure facilities in Frankfurt, with seven others in London, Paris and Amsterdam. In addition to these ten Edge locations, it has three EC2 availability zones in Ireland, catering to EU customers.

But just as one would hail a potential AWS datacentre in Germany as a credible move to protect user data in the cloud, a US magistrate court judgment has ordered Microsoft to give the district court access to the contents of one of its customers' emails stored on a server in Dublin. Microsoft challenged the decision, but the judge rejected its challenge.

Microsoft said: "The US government doesn't have the power to search a home in another country, nor should it have the power to search the content of email stored overseas."

"Microsoft's argument is simple, perhaps deceptively so," Judge Francis said in an official document, quashing Microsoft's challenge.

"It has long been the law that a subpoena requires the recipient to produce information in its possession, custody, or control regardless of the location of that information," he said.

Well, perhaps we still have a long way to go before the rules of data sovereignty are upheld, but with AWS's growing customer portfolio it will be good news to have public cloud data reside in Germany, which has some of the strongest and toughest data protection regulations in the world.



About this Archive

This page is an archive of entries from May 2014 listed from newest to oldest.

June 2014 is the next archive.

Find recent content on the main index or look in the archives to find all content.
