Integrating solid state storage as a VMware cache isn’t a trivial task. In fact, it’s become the core challenge for some of the best minds in storage, and few real answers have yet emerged. This will be a primary area of focus for me and others who watch and comment on virtualization and enterprise storage!
No small storage company has had more press coverage and “buzz” than “ioMemory” maker Fusion-io. I have long marveled at the company’s ability to attract attention, but this has rubbed some analysts the wrong way. How, they ask, can a premium company with proprietary products compete over the long term as component vendors enter its space?
Eye-Fi (the company) would rather we focus on the capabilities of its card than on its technical components. But any self-respecting geek is going to want to know what makes it tick! I’d rather not cut open my card to get a peek at the chips inside, but Eye-Fi released some official details about the components used in the X2 series of cards, and a quick Google search revealed all that I needed to know.
After testing the Iomega USB 3.0 SSD extensively, both in benchmarks and in real-world use, I’m sold on it. The only outstanding question is the unit’s high price: the 64 GB drive starts at an attainable $190, but the big 256 GB drive is downright expensive at $620 (street price). It’s hard to knock the drive’s performance, component choices, or build quality, but is it worth more than a budget laptop?
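For context, here is a quick back-of-the-envelope per-gigabyte calculation using only the street prices quoted above (the drive names and prices come from this review; nothing else is assumed):

```python
# Rough price-per-gigabyte comparison for the quoted street prices.
drives = {
    "Iomega USB 3.0 SSD 64 GB": (190, 64),    # $190 street price
    "Iomega USB 3.0 SSD 256 GB": (620, 256),  # $620 street price
}

for name, (price_usd, capacity_gb) in drives.items():
    print(f"{name}: ${price_usd / capacity_gb:.2f} per GB")
```

Interestingly, the 256 GB model works out cheaper per gigabyte (about $2.42/GB versus roughly $2.97/GB for the 64 GB model), even though its absolute price is the harder pill to swallow.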
I’m building a home/lab server to run a variety of workloads, but VMware ESX is chief among these. Sadly, VMware ESX is especially picky about network interface cards (NICs): although many are supported, most are intended for servers and are therefore expensive and hard to find at retail. So I set out browsing through the VMware ESX HCL, Newegg, and Amazon to find the best network card for my home lab machine. Here’s what I’ve found out so far.
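One practical tip when doing this kind of hunt: match cards by PCI vendor/device ID (i.e., by chipset) rather than by retail product name, since the HCL lists chipsets and retail boxes often carry different branding. Below is a minimal sketch of that cross-check, assuming a Linux box with `lspci` available; the short list of Intel device IDs is purely illustrative and is not an authoritative copy of the HCL, so always confirm against VMware’s official list:

```python
import re
import subprocess

# Illustrative (not exhaustive) PCI IDs for Intel gigabit NICs commonly used
# in ESX whitebox builds -- a placeholder list, not the official VMware HCL.
CANDIDATE_IDS = {
    "8086:10d3": "Intel 82574L",
    "8086:105e": "Intel 82571EB dual-port",
    "8086:100e": "Intel 82540EM (classic e1000)",
}

def installed_nic_ids():
    """Return (pci_id, raw_line) tuples for Ethernet devices from `lspci -nn`."""
    out = subprocess.run(["lspci", "-nn"], capture_output=True,
                         text=True, check=True).stdout
    nics = []
    for line in out.splitlines():
        if "Ethernet controller" in line:
            match = re.search(r"\[([0-9a-f]{4}:[0-9a-f]{4})\]", line)
            if match:
                nics.append((match.group(1), line))
    return nics

if __name__ == "__main__":
    for pci_id, desc in installed_nic_ids():
        status = CANDIDATE_IDS.get(pci_id, "not in my candidate list -- check the HCL")
        print(f"{pci_id}: {status}")
        print(f"    {desc}")
```

Running this on a prospective lab box (or on a machine with the card you’re eyeing already installed) tells you the chipset ID to search for on the HCL, which is far more reliable than going by the name printed on the retail box.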