In telecommunications, guard intervals are used to ensure that distinct transmissions do not interfere with one another. These transmissions may belong to different users (as in TDMA) or to the same user (as in OFDM).
The purpose of the guard interval is to introduce immunity to propagation delays, echoes and reflections, to which digital data is normally very sensitive.
In OFDM, each symbol is preceded by a guard interval. As long as the echoes fall within this interval, they do not affect the receiver's ability to decode the actual data, because the receiver only interprets the signal outside the guard interval.
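In practice the OFDM guard interval is usually filled with a cyclic prefix, a copy of the tail of the symbol, so that an echo delayed by less than the guard interval still looks like a circular shift to the receiver's FFT. A minimal sketch of the idea, using NumPy with illustrative symbol and guard lengths not tied to any particular standard:

```python
import numpy as np

N_FFT = 64      # illustrative useful symbol length (subcarriers)
N_GUARD = 16    # illustrative guard interval (1/4 of the symbol)

# One OFDM symbol: random QPSK subcarriers converted to the time domain.
subcarriers = np.exp(1j * np.pi / 2 * np.random.randint(0, 4, N_FFT))
symbol = np.fft.ifft(subcarriers)

# Guard interval as a cyclic prefix: copy the last N_GUARD samples
# to the front of the symbol before transmission.
tx = np.concatenate([symbol[-N_GUARD:], symbol])

# The receiver discards the guard interval and interprets only the
# remaining N_FFT samples, so an echo delayed by fewer than N_GUARD
# samples cannot smear into the part that is decoded.
rx_payload = tx[N_GUARD:]
recovered = np.fft.fft(rx_payload)

assert np.allclose(recovered, subcarriers)
```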
In TDMA, each user's timeslot ends with a guard period, to avoid data loss and to reduce interference with the following user caused by propagation delay. A user's timeslot is therefore protected from interference from the preceding user by the guard period at the end of that preceding user's timeslot. It is a common misconception that each TDMA timeslot begins with a guard interval; the ETSI specifications (such as GSM 05.05) define the guard period as being at the end of each timeslot, protecting against data loss within that timeslot and against interference with the following one.
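As an illustration of the guard period sitting at the end of a timeslot, the sketch below lays out the fields of a GSM normal burst in bit periods; this is a simplified summary of the burst structure, not a substitute for the specification:

```python
# Simplified field layout of a GSM normal burst, in bit periods.
# The guard period is the final field of the 156.25-bit timeslot.
NORMAL_BURST = [
    ("tail bits",         3),
    ("data",              57),
    ("stealing flag",     1),
    ("training sequence", 26),
    ("stealing flag",     1),
    ("data",              57),
    ("tail bits",         3),
    ("guard period",      8.25),   # at the end of the timeslot
]

assert sum(length for _, length in NORMAL_BURST) == 156.25
```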
Longer guard intervals allow more distant echoes to be tolerated, but they reduce channel efficiency, since the guard interval carries no new data. For example, in DVB-T, four guard intervals are available (given as fractions of the useful symbol period): 1/4, 1/8, 1/16 and 1/32.
Hence, 1/32 gives the lowest protection and the highest data rate; 1/4 results in the best protection but the lowest data rate.
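The efficiency cost is easy to quantify: with a guard interval of fraction g of the useful symbol period, only 1/(1+g) of the transmitted symbol period carries data. A short sketch of the overhead for the four DVB-T options:

```python
from fractions import Fraction

# DVB-T guard interval options, as fractions of the useful symbol period.
guard_fractions = [Fraction(1, 4), Fraction(1, 8), Fraction(1, 16), Fraction(1, 32)]

for g in guard_fractions:
    useful_share = 1 / (1 + g)   # share of air time carrying data
    print(f"guard {g}: {float(useful_share):.1%} of the symbol period is useful data")
```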
Radio waves propagate at the speed of light, about 3.3 µs per kilometre (about 5.4 µs per mile). Ideally, the guard interval is just longer than the delay spread of the channel.
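As a worked example, the extra path length an echo may travel and still fall inside a given guard interval is simply the guard duration multiplied by the propagation speed; this is a rough free-space figure, and the actual delay spread depends on the environment:

```python
SPEED_OF_LIGHT = 3.0e8   # m/s, free-space approximation

def max_excess_path_m(guard_interval_s: float) -> float:
    """Longest extra path an echo can travel and still land inside the guard interval."""
    return guard_interval_s * SPEED_OF_LIGHT

# e.g. a 0.8 us guard interval tolerates echoes with up to ~240 m of excess path
print(max_excess_path_m(0.8e-6))   # 240.0
```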
The standard symbol guard interval used in 802.11 OFDM is 0.8 μs. To increase data rate, 802.11n added optional support for a 0.4 μs guard interval. This provides an 11% increase in data rate.
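The 11% figure follows directly from the symbol timing: 802.11 OFDM uses a 3.2 µs useful symbol, so the full symbol shrinks from 4.0 µs to 3.6 µs when the guard interval is halved. A quick check:

```python
useful = 3.2e-6     # 802.11 OFDM useful symbol duration (s)
long_gi = 0.8e-6    # standard guard interval
short_gi = 0.4e-6   # optional short guard interval (802.11n)

rate_gain = (useful + long_gi) / (useful + short_gi) - 1
print(f"{rate_gain:.1%}")   # ~11.1% more symbols per second, hence data rate
```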
The shorter guard interval results in a higher packet error rate when the delay spread of the channel exceeds the guard interval, or when timing synchronization between the transmitter and receiver is not precise. A scheme could be developed to work out whether a short guard interval would benefit a particular link; to reduce complexity, manufacturers typically implement the short guard interval only as a final rate-adaptation step, when the device is already running at its highest data rate.
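A minimal sketch of such a decision rule, assuming the link can estimate its delay spread and knows whether it is already at its top rate; the threshold, margin and function name are hypothetical, not taken from any standard or driver:

```python
def use_short_guard_interval(delay_spread_s: float,
                             at_highest_rate: bool,
                             short_gi_s: float = 0.4e-6,
                             margin: float = 0.8) -> bool:
    """Hypothetical rule: enable the short GI only as the last rate-adaptation
    step, and only if the estimated delay spread fits comfortably inside it."""
    return at_highest_rate and delay_spread_s < margin * short_gi_s

print(use_short_guard_interval(0.2e-6, at_highest_rate=True))   # True
print(use_short_guard_interval(0.5e-6, at_highest_rate=True))   # False
```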