Phase noise


In signal processing, phase noise is the frequency domain representation of rapid, short-term, random fluctuations in the phase of a waveform, caused by time domain instabilities ("jitter"). Generally speaking, radio frequency engineers speak of the phase noise of an oscillator, whereas digital system engineers work with the jitter of a clock.

Historically there have been two conflicting yet widely used definitions for phase noise. Some authors define phase noise to be the spectral density of a signal's phase only, while the other definition refers to the phase spectrum (which pairs up with the amplitude spectrum in the spectral estimation of the signal itself). Both definitions yield the same result at offset frequencies well removed from the carrier. At close-in offsets, however, the two definitions differ.

The IEEE defines phase noise as ℒ(f)=Sφ(f)/2 where the "phase instability" Sφ(f) is the one-sided spectral density of a signal's phase deviation. Although Sφ(f) is a one-sided function, it represents "the double-sideband spectral density of phase fluctuation". The phase noise expression ℒ(f) is pronounced "script ell of f".
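The IEEE definition above amounts to a simple conversion: halve the one-sided phase-deviation spectral density and, in practice, quote the result logarithmically in dBc/Hz. A minimal sketch of that conversion (the function name and the example value are illustrative, not from the source):

```python
import math

def phase_noise_dbc_per_hz(s_phi):
    """Convert a one-sided phase-deviation spectral density S_phi(f),
    in rad^2/Hz, to phase noise L(f) in dBc/Hz, using the IEEE
    definition L(f) = S_phi(f) / 2."""
    return 10 * math.log10(s_phi / 2)

# Illustrative value: S_phi = 2e-12 rad^2/Hz at some offset f
# gives L(f) = 1e-12, i.e. -120 dBc/Hz.
print(phase_noise_dbc_per_hz(2e-12))
```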

An ideal oscillator would generate a pure sine wave. In the frequency domain, this would be represented as a single pair of Dirac delta functions (positive and negative conjugates) at the oscillator's frequency, i.e., all the signal's power is at a single frequency. All real oscillators have phase modulated noise components. The phase noise components spread the power of a signal to adjacent frequencies, resulting in noise sidebands. Oscillator phase noise often includes low frequency flicker noise and may include white noise.
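The power-spreading effect described above can be seen numerically: adding random phase fluctuations to a pure tone leaves its total power essentially unchanged but reduces the power in the carrier bin, the difference having moved into the sidebands. A small self-contained sketch (the sample count, carrier bin, and RMS phase deviation are illustrative assumptions):

```python
import cmath
import math
import random

random.seed(0)
n = 1024      # number of samples (assumption)
fc = 64       # carrier placed exactly at DFT bin 64 (assumption)
sigma = 0.3   # RMS phase deviation in radians (illustrative)

# Ideal tone vs. the same tone with random phase fluctuations ("jitter").
ideal = [math.cos(2 * math.pi * fc * k / n) for k in range(n)]
noisy = [math.cos(2 * math.pi * fc * k / n + random.gauss(0.0, sigma))
         for k in range(n)]

def bin_power(x, m):
    """Power in DFT bin m of the real sequence x (direct DFT, O(n))."""
    c = sum(x[k] * cmath.exp(-2j * math.pi * m * k / len(x))
            for k in range(len(x)))
    return abs(c) ** 2 / len(x) ** 2

def total_power(x):
    """Mean-square value of the sequence x."""
    return sum(v * v for v in x) / len(x)

# For the ideal cosine, the +fc and -fc bins together hold all the
# power (1/2 for unit amplitude). Phase noise suppresses the carrier
# and redistributes the missing power into adjacent bins (sidebands).
carrier_ideal = 2 * bin_power(ideal, fc)
carrier_noisy = 2 * bin_power(noisy, fc)
```

With the values above, `carrier_ideal` is 0.5, while `carrier_noisy` comes out lower even though the total power of both sequences is nearly identical, which is the noise-sideband picture in miniature.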
