Latency


Definition of Latency in the Network Encyclopedia.

What is Latency?

Latency is the delay that occurs when a packet or signal is transmitted from one part of a network to another. A network with high latency can experience unpredictable delays.


These delays usually do not affect ordinary data transfers appreciably, since connectionless protocols such as the Internet Protocol (IP) tolerate variable delivery times. They do, however, have a serious impact on real-time transmissions such as streaming audio and video, because the human ear and eye can easily detect latency in these forms of transmission.



The term “latency” can also refer to the delay in forming a connection, such as the 15 to 30 seconds required to establish a modem connection.

Intrinsic latency in a transmission is caused by the finite transmission speed of the electrical signals through the wires (or the light signals through the fiber-optic cabling). Intrinsic latency cannot be eliminated but is usually quite small.
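To get a feel for how small intrinsic latency typically is, the propagation delay over a fiber link can be estimated from the link length and the signal speed. The sketch below is illustrative only; the velocity factor of 0.67 is an assumed typical value for fiber-optic cable, not a figure from this article.

```python
# Rough estimate of intrinsic (propagation) latency over a fiber link.
# Assumption: signals travel through fiber at about 2/3 the speed of
# light in a vacuum (a commonly cited velocity factor).

SPEED_OF_LIGHT_KM_S = 300_000      # approx. speed of light in a vacuum (km/s)
FIBER_VELOCITY_FACTOR = 0.67       # assumed typical value for fiber

def propagation_delay_ms(distance_km: float) -> float:
    """Return the one-way propagation delay in milliseconds."""
    signal_speed = SPEED_OF_LIGHT_KM_S * FIBER_VELOCITY_FACTOR
    return distance_km / signal_speed * 1000  # seconds -> milliseconds

# Example: a 4,000 km cross-country fiber run adds roughly 20 ms one way.
print(round(propagation_delay_ms(4000), 1))
```

Even over thousands of kilometers, the intrinsic delay stays in the tens of milliseconds, which is why device processing usually dominates total latency.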

Much greater latency is usually introduced into a network by gateway devices such as routers and bridges, which process packets and perform protocol conversion.

The latency for a bridge is thus the time delay between the moment when a packet enters one port of the bridge and the moment when it leaves another port, usually a fraction of a millisecond.
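In practice, latency through a chain of such devices is often observed end to end rather than per device, by timing a round trip to a remote host. The sketch below measures the time to complete a TCP handshake as a rough latency probe; `tcp_rtt_ms` is a hypothetical helper name, and using the handshake time as a latency proxy is an assumption of this example, not a method from the article.

```python
import socket
import time

def tcp_rtt_ms(host: str, port: int = 80, timeout: float = 2.0) -> float:
    """Rough round-trip latency estimate: the time taken to complete
    a TCP handshake with the given host and port."""
    start = time.perf_counter()
    # create_connection performs the full TCP three-way handshake,
    # so the elapsed time approximates one network round trip.
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000  # seconds -> milliseconds

# Example usage (requires network access):
# print(f"{tcp_rtt_ms('example.com'):.1f} ms")
```

This measures round-trip rather than one-way delay, and includes handshake processing on both ends, so it overstates pure transmission latency slightly.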



