In computing, buffer underrun or buffer underflow is a state occurring when a buffer used to communicate between two devices or processes is supplied with data at a lower speed than the data is read from it. (The term is distinct from buffer overflow, a condition where a portion of memory being used as a buffer has a fixed size but is filled with more than that amount of data.) An underrun forces the program or device reading from the buffer to pause its processing while the buffer refills. This can cause undesired and sometimes serious side effects, because the data being buffered is generally not suited to stop-start access of this kind.
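A minimal single-threaded sketch of the condition, using hypothetical rates (3 units arriving per tick against 4 required) and a 16-unit buffer, shows the consumer repeatedly stalling once the buffer drains:

```c
#include <stdio.h>

/* Hypothetical simulation of a buffer underrun: the producer deposits
 * 3 units per tick while the consumer needs 4, so the buffer slowly
 * drains until the consumer must stall and wait for it to refill. */
int main(void) {
    const int capacity = 16;   /* buffer size, in units          */
    const int produce  = 3;    /* units arriving per tick        */
    const int consume  = 4;    /* units the reader needs per tick */
    int level = capacity;      /* start with a full buffer       */

    for (int tick = 1; tick <= 20; tick++) {
        level += produce;
        if (level > capacity)
            level = capacity;              /* buffer cannot overfill */
        if (level < consume) {             /* underrun: too little data */
            printf("tick %2d: UNDERRUN (only %d units); consumer stalls\n",
                   tick, level);
            continue;                      /* consumer pauses this tick */
        }
        level -= consume;
        printf("tick %2d: consumed %d units, %2d left in buffer\n",
               tick, consume, level);
    }
    return 0;
}
```

Because the net drain is one unit per tick, the run shows exactly the stop-start pattern described above: steady consumption, then a stall every few ticks once the initial reserve is spent.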
In concurrent programming, a buffer underrun can be considered a form of resource starvation.
The terms buffer underrun and buffer underflow are also used to mean buffer underwrite, a condition similar to buffer overflow, but where the program is tricked into writing before the beginning of the buffer, overwriting data that may be stored there, such as permission bits.
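As an illustration, the following C fragment is a deliberately contrived sketch: the struct layout, the `is_admin` field, and the unchecked index are all hypothetical, and writing outside the array bounds is undefined behaviour, so the actual effect depends on the compiler and memory layout. It shows how an unchecked negative index can land on a permission flag stored just before the buffer:

```c
#include <stdio.h>
#include <string.h>

/* Illustrative only: an index below zero writes before the start of
 * `buf`, clobbering whatever happens to sit there. In this hypothetical
 * layout the adjacent field is a permission flag. This is undefined
 * behaviour in C; real layouts vary by compiler and platform. */
struct record {
    int  is_admin;     /* permission bit stored just before the buffer */
    char buf[8];
};

int main(void) {
    struct record r = { 0, "" };
    int index = -1;                  /* attacker-controlled, never checked */
    /* The underwrite: a byte written at buf[-1] may fall inside is_admin. */
    memset(&r.buf[index], 1, 1);     /* undefined behaviour */
    printf("is_admin = %d\n", r.is_admin);
    return 0;
}
```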
Buffer underruns are often the result of transitory issues involving the connection being buffered: either a connection between two processes, where other processes compete for CPU time, or a physical link, where other devices compete for bandwidth.
The simplest guard against such problems is to increase the size of the buffer: if an incoming data stream needs to be read at 1 bit per second, a buffer of 10 bits would allow the connection to be blocked for up to 10 seconds before failing, whereas one of 60 bits would allow a blockage of up to a minute. However, this requires more memory to be available to the process or device, which can be expensive. This approach also assumes that the buffer starts full (requiring a potentially significant pause before the reading process begins) and that it will always remain full unless the connection is currently blocked. If the data does not, on average, arrive faster than it is consumed, any blockages on the connection are cumulative: "dropping" one bit every minute on a hypothetical connection with a 60-bit buffer would lead to a buffer underrun if the connection remained active for an hour. In real-time applications, a large buffer size also increases the latency between input and output, which is undesirable in low-latency applications such as video conferencing.
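The sizing arithmetic behind that example can be made explicit: with a full buffer of a given size and a steady average deficit that is never made up, the reserve lasts size divided by deficit. A small sketch, with the numbers chosen to mirror the hypothetical 60-bit case:

```c
#include <stdio.h>

/* Back-of-the-envelope sizing: a full buffer of `size_bits` drained by a
 * steady, never-recovered deficit is exhausted after size / deficit
 * minutes. The values mirror the hypothetical example in the text. */
int main(void) {
    const double size_bits          = 60.0;  /* buffer capacity in bits     */
    const double deficit_per_minute = 1.0;   /* one bit "dropped" per minute */

    double minutes_to_underrun = size_bits / deficit_per_minute;
    printf("A %g-bit buffer losing %g bit/min underruns after %g minutes\n",
           size_bits, deficit_per_minute, minutes_to_underrun);
    return 0;
}
```

The output, 60 minutes, matches the hour quoted above; doubling the buffer merely doubles the time to failure while also doubling the worst-case latency.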