Throughput is a term used to describe the rate at which data is successfully delivered over a network. It may refer to the speed at which data is transmitted over a packet radio, Ethernet, or other type of communication channel. It may also refer to the rate at which network messages are transmitted through nodes on a network.
Throughput is also a measure of the productivity of a computing device or service: the amount of work a system completes in a specified period of time, and an indicator of how efficiently its memory and network communications perform. The term was first used to evaluate the productivity of computer processors, originally expressed as batch jobs completed per unit of time; later variants incorporate more detail, such as the type of work being done and the number of concurrent users.
For instance, a factory manager who wants to determine the throughput of a bolt production line divides the number of bolts produced by the time taken to produce them: if the line turns out three thousand bolts in one minute, its throughput is three thousand bolts per minute. Similarly, a cell phone store manager can estimate the throughput of the store's inventory: if the store sells five hundred phones over a twenty-five-hour sales period, its throughput is approximately twenty phones per hour.
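The calculation in both examples is the same division of work completed by elapsed time. A minimal sketch in Python (the function name and figures are illustrative, not from the original text):

```python
def throughput(units_completed: float, elapsed_time: float) -> float:
    """Return work completed per unit of time (units / time)."""
    return units_completed / elapsed_time

# Bolt line: 3,000 bolts produced in one minute.
print(throughput(3000, 1))    # 3000.0 bolts per minute

# Phone store: 500 phones sold over 25 hours of sales time.
print(throughput(500, 25))    # 20.0 phones per hour
```

The same formula applies whether the "units" are bolts, phones, transactions, or network packets; only the units of measurement change.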
Network throughput is the rate at which data is actually transferred over a network. It is closely related to, but distinct from, bandwidth: bandwidth is the maximum capacity of a link, while throughput is the rate actually achieved. Throughput is typically measured in bits per second, or in larger units such as megabits or gigabits per second. By examining how packets move through a network, administrators can pinpoint performance bottlenecks and fix them.