Stephen Foskett, Pack Rat

Understanding the accumulation of data

In Praise of Performance Comparisons

May 14, 2010 By Stephen

I’ve long been critical of poorly executed performance comparisons and the “fastest is always best” mentality behind them. Who really cares if a Honda minivan accelerates quicker than a Toyota when no real-world owner will ever keep the accelerator floored from stop to highway speed? The same goes for enterprise gear: Is an oversubscribed backplane really a problem when most network switches hum along at 20% load? But, although it sounds inconsistent, I still love reading the performance “comparos” in Car & Driver, and I remain convinced that the enterprise IT world needs lab tests and performance comparisons.
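
To put rough numbers behind that oversubscription point, here is a back-of-the-envelope sketch. The port counts, uplink speeds, and the 20% load figure are illustrative assumptions, not measurements from any particular switch:

```python
# Back-of-the-envelope oversubscription math. All figures are
# hypothetical, chosen only to illustrate the argument above.
EDGE_PORTS = 48            # assumed access ports on the switch
EDGE_SPEED_GBPS = 10       # assumed 10 GbE per access port
UPLINK_PORTS = 4           # assumed uplinks toward the core
UPLINK_SPEED_GBPS = 40     # assumed 40 GbE per uplink

edge_capacity = EDGE_PORTS * EDGE_SPEED_GBPS        # 480 Gb/s of possible ingress
uplink_capacity = UPLINK_PORTS * UPLINK_SPEED_GBPS  # 160 Gb/s toward the core

print(f"Oversubscription ratio: {edge_capacity / uplink_capacity:.0f}:1")

# At a typical average load, demand never comes close to the uplinks.
AVERAGE_LOAD = 0.20        # the ~20% utilization cited above
demand = edge_capacity * AVERAGE_LOAD
print(f"Average demand: {demand:.0f} Gb/s vs. {uplink_capacity} Gb/s of uplink capacity")
```

Under these assumptions a 3:1 oversubscribed switch only becomes a bottleneck when sustained demand exceeds 160 Gb/s, far above what a 20% average load ever generates.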

Is Maximum Performance Relevant?

[Photo: Loping along at 75 mph in a Corvette Z06]

My opinion on maximum performance might seem clear from the paragraph above, but it’s more nuanced than that. Although maximum performance is rarely important in itself, it is often an indicator of more relevant qualities. For example, a car that lags behind all others in absolute acceleration or top speed might be similarly unable to deliver satisfying performance around town. And, all other elements being equal, the quicker car may have better engineering.

Consider the now-infamous Tolly report comparing the HP c7000 blade system with Cisco’s UCS. Many complained that the test was unfair to Cisco, and I noted that the testers cherry-picked favorable results. Yet this report did start a discussion about HP’s blade products, oversubscription, Ethernet and FCoE versus Virtual Connect and Flex10, and the merits of blade personality. Although the performance test wasn’t the smack-down HP seems to have wanted, the company would probably judge the report a success.

Microsoft recently demonstrated that their software iSCSI initiator (in combination with Intel’s Xeon 5500 and 10 GbE adapters) can achieve wire-speed throughput and one million IOPS. This was a particularly wise benchmark even though it neither demonstrated a real-world use case nor directly compared to the performance of competing protocols. No, the report was news-worthy because it demonstrated a level of performance that defied conventional wisdom. The Microsoft/Intel iSCSI test was analogous to Nissan’s record-setting lap of the Nürburgring Nordschleife in their GT-R: It set the world on notice that they were a serious contender.
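
Wire-speed throughput and a million IOPS are two different feats, and quick arithmetic shows why. The block sizes below are my own illustrative assumptions, not the actual parameters of the Microsoft/Intel test:

```python
# Relating IOPS, block size, and throughput. Block sizes here are
# illustrative assumptions, not the report's actual test parameters.
WIRE_SPEED_BYTES_PER_SEC = 10e9 / 8  # one 10 GbE link ~= 1.25 GB/s, ignoring protocol overhead

# A million small I/Os per second moves surprisingly little data...
SMALL_IO = 512                       # bytes per I/O (assumed)
iops = 1_000_000
print(f"{iops:,} IOPS x {SMALL_IO} B = {iops * SMALL_IO / 1e6:.0f} MB/s")

# ...while saturating the wire with large transfers takes very few IOPS.
LARGE_IO = 256 * 1024                # 256 KiB per I/O (assumed)
print(f"Wire speed at {LARGE_IO // 1024} KiB blocks ~= {WIRE_SPEED_BYTES_PER_SEC / LARGE_IO:,.0f} IOPS")
```

Hitting both extremes, even in separate runs, demonstrates that the initiator itself is not the bottleneck in either regime.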

So maximum-performance tests can be useful to get the world talking and challenge the status quo. They can also demonstrate innate technical superiority, though one has to investigate such claims fully to see whether they are being made fairly.

Comparing Contenders

Although maximum performance should always be taken with a grain of salt, comparisons can take many other factors into consideration. The real value of performance comparisons comes when an attempt is made to model real-world usage. Holistic evaluation, taking both objective and subjective metrics into account, can help buyers separate the wheat from the chaff.

I very much respect the spirit behind “real world” benchmarks like SPC Benchmark-1. The creators attempted to reflect actual enterprise workloads for storage systems, including email servers, databases, and OLTP. Although many criticize the exact specifications or application of these tests, I applaud that they are rooted in what end users actually do with storage.
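
To make “modeling a workload” concrete: a synthetic benchmark essentially draws I/O requests from a distribution meant to resemble an application. The read/write mix and block sizes below are placeholder assumptions of mine, a minimal sketch rather than SPC-1’s actual (far more detailed) specification:

```python
import random

# Toy OLTP-ish workload profile. These parameters are placeholder
# assumptions, not SPC-1's real specification.
READ_FRACTION = 0.6                  # assumed 60/40 read/write mix
BLOCK_SIZES = (4096, 8192, 16384)    # assumed transfer sizes in bytes
BLOCK_WEIGHTS = (0.5, 0.3, 0.2)      # assumed frequency of each size
CAPACITY = 100 * 2**30               # assumed 100 GiB test area

def next_request():
    """Draw one synthetic I/O request from the assumed profile."""
    op = "read" if random.random() < READ_FRACTION else "write"
    size = random.choices(BLOCK_SIZES, weights=BLOCK_WEIGHTS)[0]
    offset = random.randrange(CAPACITY // size) * size  # aligned random placement
    return op, offset, size

# A real test driver would issue these requests against a storage target
# and record latency; here we just show the request stream.
for _ in range(5):
    print(next_request())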

I was similarly impressed by the Data Center Infrastructure Group’s new Midrange Array Buyer’s Guide. Jerome Wendt and company assessed every storage system across a slice of the market and laid out the facts in an easy-to-understand format. My initial examination of the results was reassuring: The Guide passed my “sniff test”, with systems I know to be good near the top. One can argue the merits of each system’s placement, but I am certain that end users will be able to use this document to create “short lists” of solid products to evaluate.

Then there are people like Howard Marks at DeepStorage, Dennis Martin at Demartek, and the folks at ESG Labs. Each is doing yeoman’s work trying to generate real-world use cases and comparisons. This is what benchmarking and testing should be all about: helping people make sense of the confusing array of products on the market. I applaud them for turning hands-on time into suggestions for improvement, guides for usage, and fodder for comparison. No one will run out and buy “that specific device” based on a benchmark, but they might open their eyes and consider “these few.” That sounds like a win to me.

Filed Under: Enterprise storage, Everything, Virtual Storage Tagged With: benchmarks, Car & Driver, Cisco, DCIG, Dennis Martin, ESG, ESG Labs, Howard Marks, HP, Intel, IOPS, Jerome Wendt, Microsoft, Nissan, performance, SPC Benchmark-1, SPC-1, Tolly
