In particle physics, the parton model is a model of hadrons, such as protons and neutrons, proposed by Richard Feynman. It is useful for interpreting the cascades of radiation (parton showers) produced from QCD processes and interactions in high-energy particle collisions.
Parton showers are simulated extensively in Monte Carlo event generators, in order to calibrate and interpret (and thus understand) processes in collider experiments. As such, the name is also used to refer to algorithms that approximate or simulate the process.
The parton model was proposed by Richard Feynman in 1969 as a way to analyze high-energy hadron collisions. Any hadron (for example, a proton) can be considered a composition of a number of point-like constituents, termed "partons". The parton model was immediately applied to electron-proton deep inelastic scattering by Bjorken and Paschos.
Later, with the experimental observation of Bjorken scaling, the validation of the quark model, and the confirmation of asymptotic freedom in quantum chromodynamics, partons were matched to quarks and gluons. The parton model remains a justifiable approximation at high energies, and others have extended the theory over the years.
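For reference, Bjorken scaling has a compact statement: at large momentum transfer Q², the proton structure function F₂ becomes approximately independent of Q² at fixed x, and the parton model expresses it as a charge-weighted sum over parton distribution functions. The notation below is the standard textbook one, supplied for illustration rather than drawn from this text:

\[
F_2(x, Q^2) \;\longrightarrow\; F_2(x) = \sum_i e_i^2\, x\, f_i(x),
\qquad x = \frac{Q^2}{2\, p \cdot q},
\]

where p is the proton four-momentum, q is the momentum transferred by the virtual photon, e_i is the electric charge of parton species i, and f_i(x) is the probability density for finding that parton carrying momentum fraction x.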
Just as accelerated electric charges emit QED radiation (photons), the accelerated coloured partons will emit QCD radiation in the form of gluons. Unlike the uncharged photons, the gluons themselves carry colour charges and can therefore emit further radiation, leading to parton showers.
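To make concrete how the event generators mentioned above turn this cascade into an algorithm, the following is a deliberately minimal sketch of a final-state shower for a single quark, assuming a fixed strong coupling, a simplified q → qg splitting kernel, and a generic evolution variable t with a hard infrared cutoff. All constants and cutoff values here are illustrative assumptions, not the algorithm of any particular generator; real showers run the coupling, conserve momentum, and evolve every parton in the event.

```python
import math
import random

# --- Illustrative constants (assumptions, not tuned generator values) ---
ALPHA_S = 0.118   # fixed strong coupling; real showers use a running coupling
C_F = 4.0 / 3.0   # colour factor for gluon emission off a quark
T_CUT = 1.0       # infrared cutoff on the evolution scale t (assumed)
Z_MIN = 0.01      # cutoff on the momentum fraction z, regulating z -> 1

def p_qq(z):
    """q -> q g splitting kernel (unregularised DGLAP form)."""
    return C_F * (1.0 + z * z) / (1.0 - z)

def emission_density():
    """Integral of (alpha_s / 2 pi) * P(z) over the allowed z range:
    the emission probability per unit ln(t), here a constant c."""
    n, total = 1000, 0.0
    dz = (1.0 - 2.0 * Z_MIN) / n
    for i in range(n):
        z = Z_MIN + (i + 0.5) * dz
        total += p_qq(z) * dz
    return ALPHA_S / (2.0 * math.pi) * total

def next_scale(t, c):
    """Sample the scale of the next emission from the Sudakov
    (no-emission) form factor Delta(t', t) = (t'/t)**c by inverting
    Delta = R for a uniform random number R."""
    return t * random.random() ** (1.0 / c)

def sample_z():
    """Rejection-sample z from P(z); the kernel is increasing on
    [Z_MIN, 1 - Z_MIN], so its maximum sits at the upper endpoint."""
    p_max = p_qq(1.0 - Z_MIN)
    while True:
        z = Z_MIN + (1.0 - 2.0 * Z_MIN) * random.random()
        if random.random() * p_max < p_qq(z):
            return z

def shower(t_start):
    """Evolve one quark downward from t_start, recording (t, z) for
    each gluon emission until the scale falls below the cutoff."""
    c = emission_density()
    emissions, t = [], t_start
    while True:
        t = next_scale(t, c)
        if t < T_CUT:
            break
        emissions.append((t, sample_z()))
    return emissions

if __name__ == "__main__":
    random.seed(1)
    for t, z in shower(t_start=1.0e4):  # start at roughly (100 GeV)^2
        print(f"emission at t = {t:10.2f}, z = {z:.3f}")
```

The key design point, shared with production showers, is the Sudakov sampling in next_scale: rather than testing for an emission step by step, the algorithm draws the scale of the next emission directly from the no-emission probability, which is what resums the leading logarithms of the cascade.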
The parton model is defined in a reference frame where the hadron has infinite momentum, a valid approximation at high energies. Thus, parton motion is slowed by time dilation, and the hadron charge distribution is Lorentz-contracted, so incoming particles will be scattered "instantaneously and incoherently".
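The time-dilation step can be made explicit: a parton process with characteristic timescale τ in the hadron rest frame is stretched, in a frame where the hadron carries momentum P much greater than its mass M, by the boost factor γ ≈ P/M, while a hard probe with momentum transfer Q interacts over a time of order 1/Q. Schematically (standard kinematics, not taken from this text):

\[
\tau_{\text{lab}} \;=\; \gamma\,\tau \;\approx\; \frac{P}{M}\,\tau \;\gg\; \frac{1}{Q},
\]

so the partons are effectively frozen and mutually non-interacting for the duration of the scattering, which is what "instantaneously and incoherently" expresses.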