Definition of Latency in the Network Encyclopedia.

What is Latency?

Latency is the delay that occurs when a packet or signal is transmitted from one part of a network to another. A network with high latency can experience unpredictable delays.

Network Latency

These delays usually do not affect bulk data transmission appreciably, since connectionless network protocols such as the Internet Protocol (IP) tolerate variable delivery times. They do, however, seriously degrade real-time transmissions such as streaming audio and video, because the human ear and eye can easily detect latency in these forms of transmission.

The term “latency” can also refer to the delay in forming a connection, such as the 15 to 30 seconds required to establish a modem connection.
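Connection-establishment latency of this kind can be measured directly by timing how long a connection takes to open. The sketch below times a TCP connection attempt; the host and port are placeholders, and the result is only a rough proxy for round-trip latency plus handshake overhead.

```python
import socket
import time

def connect_latency_ms(host: str, port: int = 80) -> float:
    """Time how long it takes to establish a TCP connection,
    a rough proxy for connection-setup latency."""
    start = time.perf_counter()
    # create_connection performs the full TCP handshake before returning
    with socket.create_connection((host, port), timeout=5):
        pass
    return (time.perf_counter() - start) * 1000
```

For example, `connect_latency_ms("example.com")` would report the setup delay in milliseconds; a dial-up modem's 15 to 30 seconds corresponds to 15,000 to 30,000 ms on this scale.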

Intrinsic latency in a transmission is caused by the finite transmission speed of the electrical signals through the wires (or the light signals through the fiber-optic cabling). Intrinsic latency cannot be eliminated but is usually quite small.
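Intrinsic latency can be estimated from the propagation speed of the medium. A minimal sketch, assuming the common rule of thumb that signals in fiber-optic cable travel at roughly two-thirds the speed of light in a vacuum:

```python
SPEED_OF_LIGHT = 299_792_458  # meters per second, in a vacuum
VELOCITY_FACTOR = 2 / 3       # typical fraction of c for fiber-optic cable

def intrinsic_latency_ms(distance_km: float) -> float:
    """One-way propagation delay over fiber, in milliseconds."""
    distance_m = distance_km * 1000
    return distance_m / (SPEED_OF_LIGHT * VELOCITY_FACTOR) * 1000
```

By this estimate, a 1,000 km fiber link has an irreducible one-way delay of about 5 ms, which illustrates why intrinsic latency is small but can never be eliminated.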

Much greater latency is usually introduced into a network by internetworking devices such as routers and bridges, which must process packets and, in some cases, perform protocol conversion.

The latency for a bridge is thus the time delay between the moment when the packet enters one port of the bridge and the moment when it leaves another port – usually a fraction of a millisecond.

