Deinterlacing is the process of converting interlaced video, such as common analog television signals or 1080i-format HDTV signals, into a non-interlaced (progressive) form.
An interlaced video frame consists of two fields captured in sequence: one scans the odd lines of the image sensor, the other the even lines. Analog television employed this technique because it required less transmission bandwidth and reduced the perceived flicker that progressive scan would produce at a similar frame rate. CRT-based displays showed interlaced video correctly thanks to their fully analog nature. Newer displays are inherently digital, in that the display comprises discrete pixels; consequently, the two fields need to be combined into a single frame, which leads to various visual defects that the deinterlacing process should try to minimize.
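The simplest way to combine two fields is to "weave" them back together line by line, which also illustrates where the defects come from. A minimal sketch, assuming each field is a half-height grayscale NumPy array (the function name and the field-order convention here are illustrative, not any standard's definition):

```python
import numpy as np

def weave(top_field: np.ndarray, bottom_field: np.ndarray) -> np.ndarray:
    """Interleave two half-height fields into one full-height frame.

    If the scene moved between the moments the two fields were
    captured, moving edges show "combing" (serrated) artifacts,
    which is exactly the defect deinterlacers try to suppress.
    """
    h, w = top_field.shape
    frame = np.empty((2 * h, w), dtype=top_field.dtype)
    frame[0::2] = top_field     # rows 0, 2, 4, ... from one field
    frame[1::2] = bottom_field  # rows 1, 3, 5, ... from the other
    return frame
```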
Deinterlacing has been researched for decades and employs complex processing algorithms; however, consistent results have been very hard to achieve.
Both video and photographic film capture a series of frames (still images) in rapid succession; however, television systems read the captured image by scanning the image sensor serially, line by line. In analog television, each frame is divided into two consecutive fields, one containing all the even lines, the other all the odd lines. The fields are captured in succession at twice the nominal frame rate. For instance, the PAL and SECAM systems deliver 25 frames/s, or 50 fields/s, while the NTSC system delivers 29.97 frames/s, or 59.94 fields/s. This process of dividing frames into half-resolution fields at double the frame rate is known as interlacing.
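Conceptually, the split is just a separation of alternating rows. A minimal sketch, assuming a grayscale frame stored as a NumPy array (in a real interlaced camera the two fields are separate exposures taken 1/50 s or 1/59.94 s apart, not slices of a single image):

```python
import numpy as np

def split_into_fields(frame: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Divide one frame into its two fields: even-numbered rows in one,
    odd-numbered rows in the other (zero-based row indices)."""
    return frame[0::2], frame[1::2]
```

Transmitting the two halves of each frame in succession is what turns, for example, 25 frames/s of PAL material into 50 fields/s on the wire.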
Since the interlaced signal contains the two fields of a video frame captured at two different times, it enhances motion perception for the viewer and reduces flicker by taking advantage of the persistence-of-vision effect. This results in an effective doubling of temporal resolution compared with non-interlaced footage (for frame rates equal to field rates). However, an interlaced signal requires a display that can natively show the individual fields in sequence, and only traditional CRT-based TV sets can do so, owing to their electronic scanning and lack of apparent fixed resolution.
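On a fixed-pixel display, that doubled temporal resolution can be preserved by turning each field into a full frame of its own, interpolating the lines the field lacks; this approach is commonly called "bob" deinterlacing. A sketch under simple assumptions (linear averaging of neighbouring field lines; production deinterlacers use far more elaborate, often motion-adaptive, filters):

```python
import numpy as np

def bob(field: np.ndarray, is_top: bool) -> np.ndarray:
    """Rebuild a full-height frame from a single field by averaging
    neighbouring field lines, yielding frames at the field rate
    (e.g. 50 frames/s from PAL) at the cost of vertical detail."""
    f = field.astype(np.float64)
    h, w = f.shape
    frame = np.empty((2 * h, w))
    if is_top:                                 # field holds rows 0, 2, 4, ...
        frame[0::2] = f
        frame[1:-1:2] = (f[:-1] + f[1:]) / 2   # interpolate the odd rows
        frame[-1] = f[-1]                      # replicate the last line
    else:                                      # field holds rows 1, 3, 5, ...
        frame[1::2] = f
        frame[2::2] = (f[:-1] + f[1:]) / 2     # interpolate the even rows
        frame[0] = f[0]                        # replicate the first line
    return frame
```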