Channel Futures, February 15th, 2018
Is solving network latency issues the key to continued cloud growth? That's what Laz Vekiarides of ClearSky thinks.
In computer networking, latency refers to the delay between the time data is sent by one party and when it is received by another. Although we tend to think of the internet as the great enabler of instantaneous communication, in reality, even the fastest networks suffer from latency. The delays might only be a few milliseconds, but they still exist.
The delays grow longer as the geographic distance between the data's origin point and its endpoint increases: latency rises by roughly one millisecond for every 60 miles traveled. If you're in Boston and trying to open websites hosted on servers in China, response times will be slower than they would be for sites hosted in the United States.
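The distance rule of thumb above can be sketched in a few lines of Python. This is an illustrative back-of-the-envelope estimator, not a measurement tool; the function name and the example city distances are assumptions for demonstration, and real-world latency also depends on routing, congestion, and processing delays.

```python
def estimate_latency_ms(distance_miles: float) -> float:
    """Rough latency estimate using the article's rule of thumb:
    about 1 millisecond of delay per 60 miles of distance."""
    return distance_miles / 60.0


if __name__ == "__main__":
    # Hypothetical example distances (approximate, for illustration only).
    routes = {
        "Boston -> New York (~200 mi)": 200,
        "Boston -> San Francisco (~2,700 mi)": 2700,
        "Boston -> Shanghai (~7,300 mi)": 7300,
    }
    for route, miles in routes.items():
        print(f"{route}: ~{estimate_latency_ms(miles):.1f} ms")
```

Even this simplified model makes the article's point visible: a cross-country or transpacific request pays a distance penalty that no amount of bandwidth can remove.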