I recently read a great article about investing in “new tech” in order to save a bunch of bucks. I know, it seems counter-intuitive but hear me out.
For starters, given today’s changing workloads and newer application software stacks, businesses truly need better overall performance. Besides, older application software likely carries a much higher security risk. And older legacy hardware is probably less reliable, with a disproportionate share of IT dollars going just to its maintenance support contract.
Surprisingly, according to figures provided by Intel, the most up-to-date server solutions are actually less expensive than those based on the previous generation of processors, especially when you consider the cost per virtual machine. With new technology you get a much denser virtual environment, not to mention improved reliability and security.
Generally, you need upgrades in three areas: processors, storage and networks. Intel’s calculations show that if an organization upgrades its servers and storage but keeps its older network infrastructure in place, CPU utilization drops from 45% to 31% — the old network can’t feed the new hardware fast enough. So even with new server and storage solutions in place, the network leaves the overall application compute environment bottlenecked.
However, if you can update all three (servers, storage and networks), the benefits pile up. Courtesy of Intel, here is a breakdown for a hypothetical IT environment requiring 125 VMs:
As you can see, the “new” system costs about $124K less than the old one, even though its individual components are more expensive. The new solution’s cost per VM is roughly half that of the legacy system. What’s not to like?
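The cost-per-VM comparison is simple division, but it’s worth seeing how the numbers relate. Here is a minimal sketch of the arithmetic — the dollar totals below are hypothetical placeholders chosen only to match the article’s stated ~$124K gap and roughly-half cost-per-VM ratio, not Intel’s actual figures:

```python
# Illustrative only: hypothetical totals, not Intel's published numbers.
# The article states the new environment costs about $124K less overall
# and that its cost per VM is roughly half the legacy system's.

VM_COUNT = 125  # the article's hypothetical environment

# Assumed totals (hypothetical), picked to reproduce the ~$124K difference:
legacy_total = 250_000  # assumed legacy server/storage/network cost
new_total = 126_000     # assumed modernized cost (~$124K less)

legacy_per_vm = legacy_total / VM_COUNT
new_per_vm = new_total / VM_COUNT

print(f"Legacy cost per VM: ${legacy_per_vm:,.0f}")      # $2,000
print(f"New cost per VM:    ${new_per_vm:,.0f}")         # $1,008
print(f"Total savings:      ${legacy_total - new_total:,}")  # $124,000
```

Whatever the real totals are, the takeaway is the same: dividing by the VM count is what makes denser virtualization on newer hardware show up as lower cost per workload.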