Originally posted to Data Center POST

As technology has advanced, people’s patience has declined. With smartphones and AI assistants, we have grown to expect everything instantly. We want the answer to any question with a simple ‘Okay, Google,’ or the latest movie readily available on our phones while we wait for the next train to arrive. This must-have-it-now expectation extends to organizations across most industries. Fast, easy-to-access data and the insights it provides have become the norm in business operations, pushing the tech industry to evolve rapidly in an attempt to meet this demand.

However, there is one major obstacle to meeting this demand: latency. Latency is the time it takes for a data packet to travel from its origin to its destination. The type of connection being used plays a part, but distance also largely influences speed, because no matter how fast the connection is, data has to physically travel between two points, and that takes time. The back-of-envelope sketch below shows how quickly distance adds up.
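
To put rough numbers on that, here is a minimal sketch in Python. It assumes a signal in optical fiber travels at roughly two-thirds the speed of light in a vacuum (about 200,000 km/s), and the example distances are rounded illustrations rather than measured fiber routes.

```python
# Back-of-envelope propagation delay: a minimal sketch.
# Assumption: signals in optical fiber travel at roughly two-thirds
# of light speed in a vacuum, i.e. about 200,000 km/s.
FIBER_SPEED_KM_PER_S = 200_000

def one_way_delay_ms(distance_km: float) -> float:
    """Best-case one-way propagation delay in milliseconds."""
    return distance_km / FIBER_SPEED_KM_PER_S * 1000

# Illustrative, rounded distances (not measured cable paths).
for label, km in [("Same metro area", 50),
                  ("New York to Chicago", 1_200),
                  ("New York to London", 5_600)]:
    # Round trips double these figures, and switching and
    # queuing delays come on top of them.
    print(f"{label}: ~{one_way_delay_ms(km):.2f} ms one way")
```

Even before any equipment in the path adds its own delay, physics sets a floor: a transatlantic packet spends tens of milliseconds simply in transit.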

Another factor is network complexity: the number of nodes and alternative paths within a network, which means data doesn’t always travel along the same route. If the most direct route is unavailable, data is rerouted over other connections, which can increase network latency, as the sketch below illustrates.
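
As a minimal sketch of that effect, the toy example below finds the lowest-delay path through a small hypothetical network, then removes a link to show the cost of falling back to a longer route. The node names and per-link delays are invented for illustration only.

```python
import heapq

def shortest_delay(links, src, dst):
    """Dijkstra over an undirected graph of per-link delays (ms)."""
    graph = {}
    for (u, v), w in links.items():
        graph.setdefault(u, []).append((v, w))
        graph.setdefault(v, []).append((u, w))
    best = {src: 0}
    heap = [(0, src)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == dst:
            return d
        if d > best.get(node, float("inf")):
            continue  # stale entry, a shorter path was already found
        for nxt, w in graph.get(node, []):
            if d + w < best.get(nxt, float("inf")):
                best[nxt] = d + w
                heapq.heappush(heap, (d + w, nxt))
    return float("inf")

# Toy topology: per-link delays in milliseconds (hypothetical values).
links = {
    ("A", "B"): 5, ("B", "D"): 5,                  # direct route: 10 ms
    ("A", "C"): 8, ("C", "E"): 9, ("E", "D"): 7,   # detour: 24 ms
}

print(shortest_delay(links, "A", "D"))  # 10 ms via A-B-D

# If the A-B link fails, traffic is rerouted along the longer path.
del links[("A", "B")]
print(shortest_delay(links, "A", "D"))  # 24 ms via A-C-E-D
```

The same data, the same endpoints, yet more than double the delay, simply because the network had to take the scenic route.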

Lastly, growing technology trends such as the internet of things (IoT) and artificial intelligence (AI) can further complicate matters. These technologies generate and process large amounts of data while consuming massive resources to complete their tasks, which can clog up the bandwidth and congest data routes.
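
To make that bandwidth pressure concrete, here is a rough, hypothetical calculation. The link capacity, per-device data rate, and router buffer size are all assumed values chosen for illustration, not measurements from any real deployment.

```python
# Rough illustration of link saturation and queuing delay.
# All figures are assumptions picked for round numbers.
LINK_CAPACITY_MBPS = 1_000   # assume a 1 Gbps uplink
PER_DEVICE_MBPS = 2          # assume each IoT sensor streams 2 Mbps

devices_at_saturation = LINK_CAPACITY_MBPS / PER_DEVICE_MBPS
print(f"~{devices_at_saturation:.0f} devices fill the link")

# Once the link is full, queues build up: a packet arriving behind
# a full buffer waits for everything ahead of it to be sent first.
QUEUE_BYTES = 1_000_000      # assume a 1 MB buffer on the router
queue_delay_ms = QUEUE_BYTES * 8 / (LINK_CAPACITY_MBPS * 1_000_000) * 1000
print(f"A full buffer alone adds ~{queue_delay_ms:.0f} ms of delay")
```

Under these assumptions, a few hundred chatty devices are enough to saturate a gigabit link, and every millisecond spent waiting in a queue lands on top of the physical propagation delay discussed above.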

To read the full article, please click here.