Waves of innovation and waves of companies crash on the storage market, yet the same incumbent leaders and product lines survive for decades. Are things changing? It can be hard to see at times, but real progress has been made.
The difference between traditional compression and modern data deduplication is somewhat hazy. And it doesn’t help that various implementations fall all along the spectrum from “mildly interesting” to “cutting edge!”
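The core distinction can be illustrated with a toy sketch: deduplication indexes chunk hashes and so finds identical blocks anywhere in a dataset, while a stream compressor like zlib only finds redundancy inside its (roughly 32 KiB) history window. This is a minimal illustration, not how any shipping product is implemented; the function names and chunk size are my own.

```python
import hashlib
import os
import zlib

def dedup_savings(data: bytes, chunk_size: int = 4096) -> float:
    """Fraction saved by storing each unique fixed-size chunk only once."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    unique = {hashlib.sha256(c).digest() for c in chunks}
    return 1 - (len(unique) * chunk_size) / len(data)

def compression_savings(data: bytes) -> float:
    """Fraction saved by zlib, which only sees a limited history window."""
    return 1 - len(zlib.compress(data, 9)) / len(data)

# 20 distinct 4 KiB chunks of random (incompressible) data, with the whole
# 80 KiB sequence repeated 10 times. Duplicate chunks sit 80 KiB apart --
# beyond zlib's window, but trivially found by a chunk-hash index.
pattern = b"".join(os.urandom(4096) for _ in range(20))
data = pattern * 10
print(f"dedup savings:       {dedup_savings(data):.0%}")
print(f"compression savings: {compression_savings(data):.0%}")
```

On this contrived dataset, deduplication reclaims the vast majority of the space while whole-stream compression reclaims essentially none; real implementations blur the line by doing both, which is why the spectrum is so wide.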
The latest beta of the server version of Microsoft’s forthcoming Windows 8 operating system includes a handy tool related to the new data deduplication feature. DDPEVAL will test a given dataset using the new deduplication and compression engine and report the savings to be expected. And it works even on non-Windows 8 systems!
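Invocation is straightforward: copy the DDPEVAL executable from a Windows 8 Server beta machine to the system you want to evaluate and point it at a path. The paths below are illustrative, and I have omitted optional flags; check the tool's own help output for the full syntax.

```shell
REM Point DDPEVAL at a local volume, a directory, or a UNC share:
ddpeval.exe E:\Data
ddpeval.exe \\fileserver\share
```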
Tomorrow, I will be in San Francisco for TechTarget’s Storage Decisions conference. This show does a good job on the editorial side, suggesting timely topics and bringing in folks like Dennis Martin, Mark Staimer, and Jon Toigo. I will have two presentations on data reduction and storage virtualization in the main conference track – both are updated from my New York sessions.
Native Format Optimization (NFO) makes a lot of sense, since it addresses a common user error in a practical way and allows capacity savings to “trickle down” to backups, e-mail systems, and archives. But wholesale compression and deduplication of primary storage may not be worth much, especially since the cost of disk keeps dropping dramatically.