In astronomy, magnitude is a logarithmic measure of the brightness of an object, measured in a specific wavelength range, or passband, usually in the visible or near-infrared spectrum. An imprecise but systematic determination of the magnitude of objects was introduced in ancient times by Hipparchus.
Astronomers use two different definitions of magnitude: apparent magnitude and absolute magnitude. The apparent magnitude (m, or vmag for the visible spectrum) is the brightness of an object as it appears in the night sky from Earth, while the absolute magnitude (M, MV in the visible band, or H for small Solar System bodies) describes the intrinsic brightness of an object as it would appear if it were placed at a standard reference distance. This distance is 10 parsecs for stars, and 1 astronomical unit from both the Sun and the observer for planets and small Solar System bodies. A minor planet's size is typically estimated from its absolute magnitude combined with its presumed albedo.
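As a minimal sketch of that size estimate, the snippet below applies the widely used relation D ≈ (1329 km / √albedo) × 10^(−H/5); the function name is illustrative, and the Ceres values quoted in the comment are approximate.

```python
import math

def minor_planet_diameter_km(h_mag: float, albedo: float) -> float:
    """Estimate a minor planet's diameter in kilometres from its
    absolute magnitude H and an assumed geometric albedo, using the
    standard relation D = 1329 / sqrt(albedo) * 10**(-H / 5)."""
    return 1329.0 / math.sqrt(albedo) * 10 ** (-h_mag / 5)

# Example: Ceres, with H ≈ 3.3 and geometric albedo ≈ 0.09,
# comes out at roughly 970 km, close to its measured ~940 km.
print(f"{minor_planet_diameter_km(3.3, 0.09):.0f} km")
```

Because the albedo is usually only presumed rather than measured, diameters derived this way can easily be uncertain by tens of percent.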
The brighter an object appears, the lower the value of its magnitude, with the brightest objects reaching negative values. The Sun has an apparent magnitude of −27, the full moon −13, the brightest planet, Venus, measures −5, and Sirius, the brightest visible star in the night sky, is at −1.5. An apparent magnitude can also be assigned to man-made objects in Earth orbit: the brightest satellite flares are ranked at −9, and the International Space Station (ISS) appears at a magnitude of −6. The scale is logarithmic and defined such that each step of one magnitude changes the brightness by a factor of the fifth root of 100, or approximately 2.512. For example, a magnitude 1 star is exactly a hundred times brighter than a magnitude 6 star, as the difference of five magnitude steps corresponds to 2.512⁵, or 100.
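To make the arithmetic concrete, here is a short Python sketch of the magnitude-to-brightness-ratio relation described above; the function name is illustrative.

```python
def brightness_ratio(m1: float, m2: float) -> float:
    """Brightness ratio of object 1 to object 2 given their magnitudes.
    Each magnitude step is a factor of 100**(1/5) ≈ 2.512, so a
    difference of dm magnitudes corresponds to 100**(dm / 5)."""
    return 100 ** ((m2 - m1) / 5)

# A magnitude 1 star versus a magnitude 6 star: five steps -> exactly 100x.
print(brightness_ratio(1, 6))              # 100.0
# The Sun (−27) versus the full moon (−13): 14 magnitudes,
# so the Sun appears roughly 400,000 times brighter.
print(round(brightness_ratio(-27, -13)))   # ≈ 398107
```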