
Fifth magnitude star


The apparent magnitude (m) of a celestial object is a measure of its brightness as seen by an observer on Earth, adjusted to the value it would have in the absence of the atmosphere. The brighter an object appears, the lower its magnitude value (an inverse relation). The Sun, at apparent magnitude −26.7, is the brightest object in the sky. The magnitude scale is logarithmic: a difference of 1 in magnitude corresponds to a change in brightness by a factor of the fifth root of 100, or about 2.512.
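
This logarithmic relation is Pogson's rule: a difference of five magnitudes corresponds to exactly a factor of 100 in brightness, so a flux ratio F1/F2 maps to a magnitude difference m1 − m2 = −2.5 log10(F1/F2). The short Python sketch below (the function names are illustrative, not from any standard library) makes the arithmetic concrete:

```python
import math

def brightness_ratio(m1, m2):
    """Brightness (flux) ratio F1/F2 of two objects with apparent
    magnitudes m1 and m2; five magnitudes span exactly a factor of 100."""
    return 100 ** ((m2 - m1) / 5)

def magnitude_difference(f1, f2):
    """Magnitude difference m1 - m2 for a given flux ratio F1/F2."""
    return -2.5 * math.log10(f1 / f2)

print(brightness_ratio(1.0, 2.0))        # one step: ~2.512 (fifth root of 100)
print(brightness_ratio(1.0, 6.0))        # five steps: exactly 100.0
print(magnitude_difference(100.0, 1.0))  # 100x brighter: -5.0 (lower magnitude)
```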

The measurement of apparent magnitudes or brightnesses of celestial objects is known as photometry. Apparent magnitudes are used to quantify the brightness of sources at ultraviolet, visible, and infrared wavelengths. An apparent magnitude is usually measured in a specific passband corresponding to some photometric system such as the UBV system. In standard astronomical notation, an apparent magnitude in the V ("visual") filter band is denoted either as m_V or often simply as V, as in "m_V = 15" or "V = 15" for a 15th-magnitude object.
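
To illustrate how such a magnitude relates to a physical quantity, a magnitude in a given passband can be converted to a flux density once the band's zero point is fixed. A minimal sketch, assuming the AB-system zero point of 3631 Jy (Vega-based zero points for the UBV bands differ slightly):

```python
def flux_from_magnitude(m, zero_point_jy=3631.0):
    """Flux density in janskys (1 Jy = 1e-26 W m^-2 Hz^-1) for apparent
    magnitude m, given the passband's zero-point flux; the 3631 Jy default
    is the AB-system convention and is an assumption here."""
    return zero_point_jy * 100 ** (-m / 5)

# A 15th-magnitude object (m_V = 15) under this zero point:
print(flux_from_magnitude(15.0))  # ~3.6e-3 Jy
```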

The scale used to indicate magnitude originates in the Hellenistic practice of dividing stars visible to the naked eye into six magnitudes. The brightest stars in the night sky were said to be of first magnitude (m = 1), whereas the faintest were of sixth magnitude (m = 6), which is the limit of human visual perception (without the aid of a telescope). Each grade of magnitude was considered twice the brightness of the following grade (a logarithmic scale), although that ratio was subjective as no photodetectors existed. This rather crude scale for the brightness of stars was popularized by Ptolemy in his Almagest and is generally believed to have originated with Hipparchus.
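
The gap between the ancient convention and the modern definition is easy to quantify: if each magnitude step were a factor of 2, five steps would span only a factor of 32, whereas the modern scale (formalized by Norman Pogson in 1856) fixes five steps at exactly 100:

```python
# Historical assumption: each magnitude step doubles the brightness.
print(2 ** 5)          # five steps: 32
# Modern definition: five steps span exactly a factor of 100, so one step is:
print(100 ** (1 / 5))  # ~2.512
```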

