Removing the guesswork from storage performance planning

From a performance standpoint, storage is possibly the most poorly understood component of the enterprise data center, yet it is also one of the most vital and expensive. In most large enterprises, storage now represents the largest portion of the physical IT infrastructure budget and increasingly influences overall application performance.

Predicting how storage systems will perform in the real world, and how to optimize both performance and expenditure, has become a central focus for data center planners. This is driving a significant movement among large enterprises toward a new generation of storage performance analysis tools that are more accurate, more flexible, and more empowering.

Determining how storage will perform in a given environment is complicated, due to the interaction between servers, networks, virtualization layers, and storage, along with the wide diversity in application workloads. Most vendor-provided specifications and benchmarks do not give accurate insight into how products will actually perform under real-world workloads. Vendors simply don't have the tools to figure this out, so they typically provide educated guesses for performance sizing.

Although application performance management (APM) tools are very good at assessing potential performance problems, they can't isolate storage performance from overall application response times. Storage performance testing needs to be a standalone process: an IT best practice that can help determine the optimal products to deploy and identify bottlenecks in a particular storage system or network switch.
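To illustrate why isolating storage matters, even a few lines of Python can time raw file writes independently of any application stack. This is only a minimal sketch, not a substitute for a real load generator; the function name, file path, and block size are arbitrary choices for illustration:

```python
import os
import statistics
import time

def measure_write_latency(path, block_size=4096, iterations=100):
    """Time synchronous block writes to isolate storage latency from app logic."""
    buf = os.urandom(block_size)
    latencies = []
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC)
    try:
        for _ in range(iterations):
            start = time.perf_counter()
            os.write(fd, buf)
            os.fsync(fd)  # force the write to the device so we time storage, not the page cache
            latencies.append(time.perf_counter() - start)
    finally:
        os.close(fd)
        os.remove(path)
    return {
        "avg_ms": statistics.mean(latencies) * 1000,
        "p99_ms": sorted(latencies)[int(iterations * 0.99) - 1] * 1000,
    }

stats = measure_write_latency("/tmp/iotest.bin")
```

Because nothing but the write and flush sits inside the timed region, a spike here points at the storage path rather than at the application.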

One common approach in many organizations is the use of script-based tools such as Iometer and VDBench. These are open-source tools that can provide basic performance information if the IT organization doesn't mind investing heavily in scripting tests and writing custom reports. Iometer was conceived in 1998, with a first release in 2001. Without a corporate backer, it offers no dedicated support and has suffered significant lulls in developer activity; at the time of writing, access to the user manual was broken.
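To give a flavor of the scripting involved, a minimal VDBench parameter file for a synthetic 70/30 read/write test might look like the following. This is a sketch based on the commonly documented storage/workload/run definition format; the device path, rates, and durations are placeholders, not a tested configuration:

```
# storage definition: the raw device under test (placeholder path)
sd=sd1,lun=/dev/sdb

# workload definition: 4 KB transfers, 70% reads, fully random access
wd=wd1,sd=sd1,xfersize=4k,rdpct=70,seekpct=100

# run definition: drive maximum I/O rate for 60 seconds, report every second
rd=run1,wd=wd1,iorate=max,elapsed=60,interval=1
```

Each scenario change means editing and re-running files like this, which is where the heavy scripting investment the article describes comes from.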

Such tools also require dozens of servers and hundreds of VMs to generate enough load to test high-performance storage systems such as all-flash arrays. Every change to the infrastructure can necessitate a rewrite of the scripts. This ties up skilled personnel in scripting and in maintaining the in-house platforms required to recreate production environments. As Jim Miller, storage analyst for Enterprise Management Associates, noted in his Impact Brief on the subject, “The net effect is that relatively few useful storage tests are run.”

A new generation of storage testing tools from Load DynamiX takes a different approach, focusing on making performance evaluation easier, more flexible, and more accurate. These tools accurately reproduce production workloads to test flash and hybrid storage systems.

They enable easy what-if analysis across a dozen or more key parameters. The simulated production workloads can in turn be modified and replayed time and time again to more accurately test the likely variations in expected load for new, yet-to-be-deployed systems. Load DynamiX uses purpose-built appliances to deliver real-world workloads that previously could not be generated in a cost-effective manner.
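The idea behind what-if analysis across many parameters can be sketched as a simple sweep over a workload parameter space. The parameter names and values below are hypothetical illustrations, not Load DynamiX's actual interface; commercial tools expose many more knobs (compression, dedupe ratio, access patterns, hot spots):

```python
from itertools import product

# Hypothetical workload parameter space for a what-if sweep (illustrative only).
BLOCK_SIZES = ["4k", "8k", "64k"]
READ_PCTS = [0, 50, 70, 100]
QUEUE_DEPTHS = [1, 8, 32]

def workload_matrix():
    """Enumerate every combination of parameters as one test case each."""
    return [
        {"xfersize": bs, "rdpct": rd, "qdepth": qd}
        for bs, rd, qd in product(BLOCK_SIZES, READ_PCTS, QUEUE_DEPTHS)
    ]

cases = workload_matrix()
print(len(cases))  # 3 x 4 x 3 = 36 distinct workload scenarios
```

Even three modest parameters yield 36 scenarios, which shows why hand-scripting each one quickly becomes impractical and why automated workload modeling pays off.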

As a consequence, a rapidly growing number of enterprise IT departments, including GE, Cisco, T-Mobile, AT&T, and Go Daddy, use Load DynamiX for storage performance planning, spanning technology and vendor selection, configuration optimization, and change validation.

Moving away from older, less accurate, and less insightful solutions allows these large enterprises to more confidently and flexibly adapt their storage to constantly changing environments and requirements. Storage architects now have validated, easy, and cost-effective insight into how workloads will perform on potential storage systems and software updates before the new infrastructure and applications are deployed.

Len Rosenthal is the vice president of Marketing for Load DynamiX.
