A thermographic camera (also called an infrared camera or thermal imaging camera) is a device that forms an image using infrared radiation, similar to an ordinary camera that forms an image using visible light. Instead of the 400–700 nanometre range of the visible-light camera, infrared cameras are sensitive to wavelengths from about 1,000 nm (1 µm) to about 14,000 nm (14 µm). The practice of capturing and analyzing such images is called thermography.
Infrared was discovered in 1800 by Sir William Herschel as a form of radiation beyond red light. These "infrared rays" (infra is the Latin prefix for "below") were used mainly for thermal measurement. There are four basic laws of IR radiation: Kirchhoff's law of thermal radiation, the Stefan–Boltzmann law, Planck's law, and Wien's displacement law. Until World War I, the development of detectors focused mainly on thermometers and bolometers. Leopoldo Nobili fabricated the first thermocouple in 1829, which paved the way for Macedonio Melloni to show that a person 10 metres away could be detected with his multielement thermopile. The bolometer, invented by Langley in 1878, could detect radiation from a cow 400 metres away and was sensitive to temperature differences of one hundred-thousandth of a degree Celsius.
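For reference, the four laws named above can be stated compactly in their standard forms (symbols follow the usual physics conventions; this is a summary, not material from the original article):

```latex
% Kirchhoff's law of thermal radiation: at thermal equilibrium,
% spectral emissivity equals spectral absorptivity.
\varepsilon(\lambda, T) = \alpha(\lambda, T)

% Stefan–Boltzmann law: total radiant exitance of a black body
% grows as the fourth power of absolute temperature.
j^{*} = \sigma T^{4}, \qquad \sigma \approx 5.670 \times 10^{-8}\ \mathrm{W\,m^{-2}\,K^{-4}}

% Planck's law: spectral radiance of a black body at wavelength \lambda.
B(\lambda, T) = \frac{2 h c^{2}}{\lambda^{5}} \,
                \frac{1}{e^{\,h c / (\lambda k_{B} T)} - 1}

% Wien's displacement law: the peak-emission wavelength shifts
% inversely with temperature.
\lambda_{\max} = \frac{b}{T}, \qquad b \approx 2.898 \times 10^{-3}\ \mathrm{m\,K}
```

Wien's law explains why room-temperature objects (around 300 K) emit most strongly near 10 µm, squarely within the operating range of thermographic cameras.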
The first advanced application of IR technology in the civilian sector may have been a device to detect the presence of icebergs and steamships using a mirror and thermopile, patented in 1913. This was soon outdone by the first true IR iceberg detector, which did not use thermopiles, patented in 1914 by R. D. Parker. It was followed by G. A. Barker's 1934 proposal to use an IR system to detect forest fires. The technique was not truly industrialized until 1935, when it was used to analyze heating uniformity in hot steel strips.