
Film speed


Film speed is the measure of a photographic film's sensitivity to light, determined by sensitometry and measured on various numerical scales, the most recent being the ISO system. A closely related ISO system is used to measure the sensitivity of digital imaging systems.

Relatively insensitive film, with a correspondingly lower speed index, requires more exposure to light to produce the same image density as a more sensitive film, and is thus commonly termed a slow film. Highly sensitive films are correspondingly termed fast films. In both digital and film photography, the reduced exposure that higher sensitivities permit generally comes at the cost of image quality, through coarser film grain or higher image noise of other kinds: in short, the higher the sensitivity, the grainier the image. Ultimately, sensitivity is limited by the quantum efficiency of the film or sensor.
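
On the ISO arithmetic scale this trade-off is reciprocal: doubling the speed number halves the exposure needed to reach the same image density. The short sketch below illustrates that rule of thumb; the helper name and the metered example values are illustrative assumptions, not taken from the article.

```python
# Minimal sketch of the reciprocal relationship between ISO arithmetic speed
# and required exposure: doubling the speed halves the exposure.
# The function name and example values are illustrative, not from the article.

def equivalent_shutter_time(base_time_s: float, base_iso: int, new_iso: int) -> float:
    """Shutter time giving the same image density at the same aperture,
    assuming required exposure scales as 1 / (ISO arithmetic speed)."""
    return base_time_s * (base_iso / new_iso)

if __name__ == "__main__":
    # A scene metered at 1/60 s on ISO 100 film:
    base = 1 / 60
    for iso in (100, 200, 400, 800):
        t = equivalent_shutter_time(base, 100, iso)
        print(f"ISO {iso:4d}: {t:.5f} s  (about 1/{round(1 / t)} s)")
```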

The first known practical sensitometer, which allowed measurements of the speed of photographic materials, was invented by the Polish engineer Leon Warnerke (the pseudonym of Władysław Małachowski, 1837–1900) in 1880, among the achievements for which he was awarded the Progress Medal of the Photographic Society of Great Britain in 1882. It was commercially available from 1881.

The Warnerke Standard Sensitometer consisted of a frame holding an opaque screen with an array of typically 25 numbered, gradually pigmented squares, brought into contact with the photographic plate during a timed test exposure under a phosphorescent tablet that had been excited beforehand by the light of a burning magnesium ribbon. The speed of the emulsion was then expressed in degrees Warnerke (sometimes seen as Warn. or °W.), corresponding to the last number visible on the exposed plate after development and fixation. Each number represented an increase of 1/3 in speed; typical plate speeds at the time were between 10° and 25° Warnerke.
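
One reading of "an increase of 1/3 in speed" per number is that each Warnerke degree multiplies the speed by a factor of 4/3. Under that assumption, and only as an interpretation not stated explicitly in the article, the relative speed between two Warnerke readings could be sketched as follows (the helper name is hypothetical):

```python
# Hedged sketch: relative speed between two Warnerke readings, assuming each
# degree corresponds to a factor of 4/3 in speed. This interpretation of
# "an increase of 1/3 in speed" is an assumption, not stated in the article.

def relative_speed(degrees: int, reference_degrees: int = 10) -> float:
    """Speed of an emulsion rated `degrees` °W relative to a reference rating."""
    return (4 / 3) ** (degrees - reference_degrees)

if __name__ == "__main__":
    for d in (10, 15, 20, 25):
        print(f"{d}°W is roughly {relative_speed(d):.1f}x the speed of 10°W")
```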

