Thin provisioning makes server virtualisation more efficient

In recent years, server virtualisation has probably had the single largest impact on storage. Organisations have moved a high percentage of servers from direct-attached storage (DAS) into a consolidated virtual environment, supported by NAS storage or a SAN.

Some organisations took the opportunity to streamline their logical volumes during the re-provisioning process -- whether a physical-to-virtual migration or a new virtual machine build -- in order to break away from the old mindset of 2 x 73 GB hard disks = OS + application data capacity. The advent of 146 GB drives made matters worse, as organisations scrambled for legacy hardware -- in the form of smaller-capacity drives -- to provide efficient storage platforms for simple infrastructure requirements.

Even in today's virtualised environments we still have storage containers for virtual machines -- big buckets of 300 GB to 500 GB or larger volumes -- to house the virtual machine data. The net efficiency of these storage containers over DAS may be a vast improvement, but there is still overhead within these architectures.

In larger deployments of, say, 30-plus hosts, where we expect to see a conservative consolidation ratio of 8:1, this can mean 240 virtual machines, each with its own 10 GB OS drive plus a data drive of perhaps 20 GB to 60 GB -- roughly 7 TB to 17 TB of pre-allocated storage. In the detailed design, an organisation would treat performance optimisation as a critical success factor for the virtualised servers to deliver a service back to the business, which involves balancing load against disk capacity.
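The arithmetic above can be sketched as a quick back-of-the-envelope calculation; the host count, consolidation ratio and per-VM drive sizes are the illustrative figures from the text, not measurements from any real estate:

```python
# Worked example of the thick-provisioned capacity figures above.
hosts = 30
consolidation_ratio = 8            # VMs per host (conservative 8:1)
vms = hosts * consolidation_ratio  # 240 virtual machines

os_gb = 10                         # OS drive per VM
data_gb_min, data_gb_max = 20, 60  # data drive range per VM

low = vms * (os_gb + data_gb_min)   # lower bound, in GB
high = vms * (os_gb + data_gb_max)  # upper bound, in GB
print(f"{vms} VMs need {low:,} GB to {high:,} GB of pre-allocated storage")
```

Even at the low end that is over 7 TB committed up front, regardless of how much data the virtual machines actually write.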

Although running lean is not everybody's cup of tea, it simplifies business-as-usual storage provisioning, optimises capacity planning through tools and avoids unnecessary capacity purchases.
Andrew McCreath
Engagement partner, GlassHouse Technologies (UK)
Thin provisioning can make this process more efficient by providing a storage-on-demand model: no more over-allocation, and a very lean operating environment. Capacity planners can work with 'actual numbers' at the storage level and compare those to business application requirements. In many cases, thin provisioning is more than just a tool; it is a methodology and a strategy. Although running lean is not everybody's cup of tea, it adds value to the organisation at many levels, such as simplifying business-as-usual storage provisioning, optimising capacity planning through tools, and avoiding unnecessary capacity purchases.
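As a rough illustration of the storage-on-demand model (a simplified sketch, not any vendor's implementation), a thin volume advertises its full logical size to the host but draws physical capacity from the pool only as blocks are first written:

```python
class ThinVolume:
    """Minimal sketch of thin provisioning: logical capacity is
    promised up front, physical blocks are allocated on first write."""

    def __init__(self, logical_gb):
        self.logical_gb = logical_gb  # capacity the host sees
        self.written = set()          # 1 GB "blocks" actually backed

    def write(self, block):
        if not 0 <= block < self.logical_gb:
            raise ValueError("write beyond logical size")
        self.written.add(block)       # allocate on first write only

    @property
    def consumed_gb(self):            # what the array really uses
        return len(self.written)

vol = ThinVolume(logical_gb=60)       # host believes it has 60 GB
for blk in range(12):                 # but only 12 GB is ever written
    vol.write(blk)
print(vol.logical_gb, vol.consumed_gb)  # prints: 60 12
```

The gap between the two figures is the point: capacity planners work with the 12 GB of consumed storage, not the 60 GB that would have been pre-allocated under thick provisioning.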

Who are the regular adopters of thin provisioning today? Organisations with a solid grasp of ITIL and capacity planning, and medium-sized businesses that have control over their distributed or regional data centres.

Who should be adopting thin provisioning? Almost anyone, where the price is right. Large corporations with core data centres and centralised data warehouses will see the largest ROI, as will companies that feel the budget pinch each year and need to deliver the same with less. Go claim back those unused gigabytes; a penny saved is a penny earned.

About the author: Andrew McCreath is an engagement partner with GlassHouse Technologies (UK), a global provider of IT infrastructure services, and has more than 16 years' experience in infrastructure and management information systems. Prior to joining GlassHouse, Andrew managed multi-million-dollar projects with Accenture, Credit Suisse First Boston, Kimberly-Clark, Société Générale and EMC. He currently specialises in server virtualisation and data centre consolidation.
