Network Performance Parameters
Network performance refers to the quality of service of the network from the customer’s point of view.
There are many different ways to measure the performance of a network, because every network differs in nature and design. Performance can also be modeled and simulated rather than measured; one example is using state transition diagrams to model the performance of a queue, or using a network simulator.
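As a small illustration of the modeling approach, the sketch below simulates an M/M/1 queue (a single server with exponential arrivals and service times) to estimate the mean waiting time. This is a minimal sketch rather than a real network simulator; the function name and parameter values are illustrative.

```python
import random

def simulate_mm1(arrival_rate, service_rate, num_customers, seed=42):
    """Estimate the mean waiting time in an M/M/1 queue.

    Assumes exponential interarrival and service times; rates are in
    customers per unit time. Illustrative sketch, not a real simulator.
    """
    random.seed(seed)
    clock = 0.0            # arrival time of the current customer
    server_free_at = 0.0   # time at which the server next becomes idle
    total_wait = 0.0

    for _ in range(num_customers):
        clock += random.expovariate(arrival_rate)   # next arrival
        start = max(clock, server_free_at)          # wait if the server is busy
        total_wait += start - clock
        server_free_at = start + random.expovariate(service_rate)

    return total_wait / num_customers

# With arrival rate 0.8 and service rate 1.0, queueing theory predicts a
# mean wait of rho / (mu - lambda) = 0.8 / 0.2 = 4.0 time units.
print(simulate_mm1(arrival_rate=0.8, service_rate=1.0, num_customers=100_000))
```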
The following measures are often considered important:
- Bandwidth
- Throughput
- Latency
- Jitter
- Error Rate
Bandwidth
Bandwidth, usually measured in bits per second, is the maximum rate at which information can be transferred: the maximum data transfer rate of a particular route. It may be referred to as network bandwidth, data bandwidth, or digital bandwidth.
This is different from the use of the term in signal processing, wireless communication, modem data transfer, digital communications, and electronics, where bandwidth refers to the width of a signal's frequency band. That analog bandwidth is measured in hertz: the range between the lowest and highest frequencies over which the signal can be carried while staying within a well-defined level of degradation.
However, the actual transmission rate that can be achieved depends not only on the signal bandwidth but also on the noise in the channel.
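This trade-off is captured by the Shannon-Hartley theorem, C = B log2(1 + S/N), which bounds the achievable rate for a given analog bandwidth and signal-to-noise ratio. A minimal sketch follows; the function name and the example figures are illustrative.

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley channel capacity in bits per second.

    bandwidth_hz: analog bandwidth of the channel in hertz
    snr_linear:   signal-to-noise ratio as a plain ratio (not dB)
    """
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 1 MHz channel with a 30 dB signal-to-noise ratio (a ratio of 1000):
snr = 10 ** (30 / 10)
print(f"{shannon_capacity(1e6, snr) / 1e6:.2f} Mbit/s")   # ~9.97 Mbit/s
```

Several related notions of bandwidth are commonly distinguished: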
- Network Bandwidth Capacity
- Network Bandwidth Consumption
- Asymptotic Bandwidth
- Multimedia Bandwidth
- Bandwidth in Web Hosting
- Internet Connection Bandwidth
Throughput
Throughput is the actual transfer rate achieved: the number of messages delivered successfully per unit of time. It is limited by the available bandwidth, the achievable signal-to-noise ratio, and hardware limitations.
Throughput is measured over a time interval; the choice of an appropriate time window often dominates the calculation, as does the decision of whether or not to count delay as part of the measurement.
Several factors can affect the throughput of a communication system, including the limitations of the underlying analog physical medium, the available processing power of system components, and end-user behavior. Once protocol overhead is taken into account, the useful rate of transferred data can be considerably lower than the maximum achievable throughput; this useful part is commonly referred to as goodput.
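The sketch below illustrates the difference, assuming the typical per-packet header sizes for TCP over IPv4 over Ethernet (14 + 20 + 20 bytes, without options); the function name and the throughput figure are illustrative.

```python
def goodput_ratio(payload_bytes, overhead_bytes):
    """Fraction of the raw throughput that carries useful data."""
    return payload_bytes / (payload_bytes + overhead_bytes)

# Assumed per-packet overhead for TCP over IPv4 over Ethernet:
# 14 (Ethernet) + 20 (IPv4) + 20 (TCP) = 54 bytes of headers.
overhead = 14 + 20 + 20
payload = 1460               # TCP payload of a full 1500-byte-MTU packet

raw_throughput_mbps = 100.0  # hypothetical measured raw throughput
ratio = goodput_ratio(payload, overhead)
print(f"goodput ~ {raw_throughput_mbps * ratio:.1f} Mbit/s")   # ~96.4 Mbit/s
```

Commonly cited throughput measures include: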
- Maximum Throughput
- Maximum Theoretical Throughput
- Asymptotic Throughput
- Peak Measured Throughput
- Maximum Sustained Throughput
Latency
Latency is the delay between the transmitter sending a signal and the receiver decoding it. Roughly, it is the sum of the signal's travel time and the processing time at every node the information passes through.
The speed of light imposes a minimum propagation time on all electromagnetic signals: the delay cannot be reduced below t = s / c_m, where s is the distance and c_m is the speed of light in the medium. In optical fiber, this works out to roughly one millisecond of round-trip time (RTT) per 100 km / 62 miles between nodes. Further delays occur at intermediate nodes; in packet-switched networks, queuing adds to the delay.
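The bound t = s / c_m can be computed directly. A minimal sketch, assuming signals in optical fiber travel at roughly two-thirds of the vacuum speed of light (the names are illustrative):

```python
C_VACUUM = 299_792_458           # speed of light in vacuum, m/s
C_FIBER = C_VACUUM * 2 / 3       # common approximation for optical fiber

def min_rtt_ms(distance_km, medium_speed=C_FIBER):
    """Lower bound on round-trip time imposed by propagation alone."""
    one_way_s = distance_km * 1000 / medium_speed
    return 2 * one_way_s * 1000  # round trip, in milliseconds

print(f"{min_rtt_ms(100):.2f} ms")   # ~1.00 ms for 100 km, as noted above
```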
More generally, latency is the time interval between a stimulus and the response, or the delay between the cause and the effect of some physical change in the system being observed. Latency is physically a consequence of the limited speed at which any physical interaction can propagate; that speed is always less than or equal to the speed of light. As a result, every physical system exhibits some form of latency, regardless of the nature of the stimulus. The following forms of latency are commonly distinguished:
- Communication Latency
- Audio Latency
- Operational Latency
- Mechanical Latency
- Computer Hardware and Operating System Latency
Jitter
In electronics and telecommunications, jitter is an undesirable deviation from the true periodicity of an assumed periodic signal, often measured against a reference clock source. Jitter can be observed in characteristics such as the frequency of successive pulses, the amplitude of the signal, or the phase of periodic signals. It is a significant and usually undesirable factor in the design of almost all communication links (e.g., USB, PCI-e, SATA, OC-48). In clock recovery applications it is called timing jitter.
Jitter can be quantified in the same terms as any time-varying signal, for example as root mean square (RMS) or peak-to-peak displacement. Like other time-varying signals, jitter can also be expressed in terms of spectral density.
Jitter can be caused by electromagnetic interference and crosstalk with carriers of other signals. It can make a monitor screen flicker, affect processor performance in personal computers, introduce clicks or other undesired effects in audio signals, and cause loss of data transferred between network devices. How much jitter is tolerable depends on the affected application.
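In packet networks, jitter is usually measured as packet delay variation. One common formulation is the interarrival jitter estimator defined for RTP in RFC 3550, sketched below; the transit-time figures are made up for illustration.

```python
def rtp_jitter(transit_times):
    """Interarrival jitter estimator from RFC 3550 (RTP).

    transit_times: per-packet transit times (receive time minus send
    time) in milliseconds; any fixed clock offset cancels out in the
    differences, so the two clocks need not be synchronized.
    """
    jitter = 0.0
    for prev, curr in zip(transit_times, transit_times[1:]):
        d = abs(curr - prev)              # delay variation between packets
        jitter += (d - jitter) / 16.0     # exponentially smoothed estimate
    return jitter

# Transit times that vary by a few milliseconds from packet to packet:
print(f"{rtp_jitter([50.0, 52.0, 49.0, 55.0, 51.0]):.2f} ms")
```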
There are three types of jitter.
- Random Jitter
- Deterministic Jitter
- Total Jitter
Error Rate
In digital transmission, the number of bit errors is the number of bits received over a communication channel that have been altered by noise, interference, distortion, or bit synchronization errors.
Bit Error Rate (BER) is the number of bit errors divided by the total number of bits transferred during the studied time interval. BER is a dimensionless performance measure, often expressed as a percentage.
The bit error probability p_e is the expected value of the BER. The BER can be regarded as an approximate estimate of the bit error probability; the estimate is accurate over a long time interval and a high number of bit errors.
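As a minimal illustration, BER can be computed by comparing the transmitted and received bit streams; the bit patterns below are made up.

```python
def bit_error_rate(sent_bits, received_bits):
    """BER = number of altered bits / total number of bits transferred."""
    assert len(sent_bits) == len(received_bits)
    errors = sum(s != r for s, r in zip(sent_bits, received_bits))
    return errors / len(sent_bits)

sent     = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
received = [1, 0, 0, 1, 0, 0, 1, 1, 1, 1]   # two bits corrupted in transit
print(f"BER = {bit_error_rate(sent, received):.0%}")   # 20%
```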