The Next Platform, 3/21/16
By Lazarus Vekiarides, CTO and Co-founder, ClearSky Data
Today’s tech users are spoiled. The devices we use daily, such as our phones and laptops, process information rapidly.
Our Internet connections are fast and reliable (well, most of the time). As a result, we can access personalized content – from news to entertainment – on demand. In this climate, it’s no surprise that we expect to be able to access and transfer whatever data we need quickly, whenever we need it.
Yet, in the enterprise IT space, storage pros haven’t been able to resolve the latency issues that arise when they move large amounts of data to the cloud, especially when unpredictable Internet performance is a factor. That factor is now prevalent in an enterprise world consisting of geographically distributed islands of compute. If you’re not considering physical distance – and even the speed of light – when choosing a storage system, you’re going to face unexpected delays and issues, and your users, who are accustomed to on-demand access for everything else, won’t tolerate the slowdown. If you’re not careful, these problems can kill your cloud initiative or business transformation plan.
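The speed-of-light point is easy to quantify. Light in fiber propagates at roughly 200,000 km/s (about two-thirds of its speed in a vacuum), which puts a hard floor under round-trip time no matter how fast the storage at either end is. The sketch below illustrates that floor; the distances chosen are illustrative assumptions, not figures from this article, and real round trips are longer once routing and queuing are added.

```python
# Back-of-the-envelope lower bound on network round-trip time.
# Assumes light propagates in fiber at ~200,000 km/s (an approximation;
# the exact figure depends on the glass's refractive index).

FIBER_SPEED_KM_PER_S = 200_000  # assumed propagation speed in fiber

def min_round_trip_ms(distance_km: float) -> float:
    """Physics-imposed floor on RTT for one request/response pair, in ms."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

# Illustrative distances (hypothetical, for scale only):
for label, km in [("same metro", 100),
                  ("cross-country", 4000),
                  ("transatlantic", 6000)]:
    print(f"{label:>14}: >= {min_round_trip_ms(km):.1f} ms")
```

Even the best-case 40 ms for a 4,000 km round trip dwarfs the sub-millisecond response times users see from local flash, which is why distance alone can sink a cloud storage design.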
Latency is an old conversation when it comes to IT infrastructure. It’s not uncommon for storage pros to feel bombarded – by vendors and even the media – with questions about how latency affects their storage systems’ performance. However, performance and latency are not one and the same. And in many cases, it’s network latency, not storage infrastructure latency, that creates the hurdle most IT teams are unequipped to jump. This issue is never more relevant than when the public cloud comes into play.