Definition of Latency in the Network Encyclopedia.
What is Latency?
Latency is the delay that occurs when a packet or signal travels from one part of a network to another. A network with high latency exhibits long or unpredictable delays.

These delays usually do not affect ordinary data transmission appreciably, since network protocols such as Internet Protocol (IP) are connectionless. They do, however, have a serious impact on real-time transmissions such as streaming audio and video, because the human ear and eye can easily detect latency in these forms of transmission.
The term “latency” can also refer to the delay in forming a connection, such as the 15 to 30 seconds required to establish a modem connection.
Intrinsic latency in a transmission is caused by the finite transmission speed of the electrical signals through the wires (or the light signals through the fiber-optic cabling). Intrinsic latency cannot be eliminated but is usually quite small.
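The intrinsic latency described above can be estimated directly from distance and signal speed. As a rough sketch, the following assumes light propagates through fiber-optic cabling at about two-thirds the speed of light in a vacuum (roughly 200,000 km/s); the distance figure is illustrative.

```python
# Estimate intrinsic (propagation) latency over a fiber-optic link.
# Assumption: signal speed in fiber is ~200,000 km/s (about 2/3 of c).

SPEED_IN_FIBER_KM_S = 200_000  # approximate signal speed in fiber optics

def propagation_delay_ms(distance_km: float) -> float:
    """Return the one-way intrinsic latency in milliseconds."""
    return distance_km / SPEED_IN_FIBER_KM_S * 1000

# A 1,000 km fiber run contributes about 5 ms of one-way latency.
print(round(propagation_delay_ms(1000), 2))  # → 5.0
```

This illustrates why intrinsic latency, while unavoidable, is usually quite small compared with the delays added by intermediate devices.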
Much greater latency is usually introduced into a network by gateway devices such as routers and bridges, which process packets and perform protocol conversion.
The latency of a bridge is thus the time delay between the moment a packet enters one port of the bridge and the moment it leaves another port – usually a fraction of a millisecond.
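End-to-end latency of this kind can be observed in practice by timing a round trip across the network. The sketch below times a TCP connection handshake to a host; the host name is an illustrative assumption, and dedicated tools such as ping use ICMP echo requests instead.

```python
# A minimal sketch: measure round-trip latency by timing how long a
# TCP handshake takes to complete. Not a substitute for ICMP ping,
# but it exercises the full network path including any routers
# and bridges in between.

import socket
import time

def tcp_connect_latency_ms(host: str, port: int = 80,
                           timeout: float = 2.0) -> float:
    """Return the time in milliseconds to complete a TCP handshake."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000

# Example usage (requires network access; host is illustrative):
# print(tcp_connect_latency_ms("example.com"))
```

Because the measurement includes every device along the path, repeated samples also reveal how variable (jittery) the latency is.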