Stellar Magnitude
Stars in general differ greatly in brightness from one another; the magnitude scale was developed as a way of measuring and comparing the brightness of stars.
The brightness of a star as seen with the naked eye is called the apparent brightness of the star. The apparent brightness of a star depends on its actual (or intrinsic) brightness as well as its distance from us.
Earlier Classifications and Observations
During the 2nd century BC, Hipparchus first classified the stars according to their apparent brightness and cataloged about 1000 stars into 6 groups:
- The 1st magnitude group contained the brightest stars, the 2nd magnitude group the next brightest, and so on.
- The faintest stars visible to the naked eye were placed in the 6th magnitude group.
Around 1830, John Herschel determined that a star of the 1st magnitude group was about 100 times brighter than a star of the 6th magnitude group.
In 1856 N. R. Pogson defined the magnitude scale by fixing the ratio of the brightness of a first magnitude star to that of a 6th magnitude one as exactly 100:1.
- Pogson assumed that equal ratios of brightness correspond to equal differences in magnitude. Under this assumption, the ratio of the brightnesses of 2 stars whose magnitudes differ by unity is the 5th root of 100, i.e. about 2.512 (see the quick check below).
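As a quick check of Pogson's ratio, the short Python sketch below (illustrative, not part of the original notes) computes the fifth root of 100:

```python
# Pogson's ratio: the brightness ratio corresponding to a
# difference of exactly one magnitude is the fifth root of 100.
pogson_ratio = 100 ** (1 / 5)
print(pogson_ratio)  # ~2.5118864..., usually quoted as 2.512
```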
Mathematical Formulation
Suppose B1 denotes the brightness of a star of the first magnitude group and B6 denotes the brightness of a star of the sixth magnitude group, so that B1/B6 = 100.
The brightnesses of 2 stars whose apparent magnitudes differ by unity will differ by a factor of 2.512.
If Bm and Bn (n > m) denote the brightnesses of 2 stars having magnitudes m and n respectively, then

Bm / Bn = (2.512)^(n - m) = 100^((n - m)/5)

Equivalently, taking logarithms, m - n = -2.5 log10(Bm / Bn).
The above equation shows that the higher the magnitude of a star, the lower its brightness, and vice versa.
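This relation is easy to put into code. The following Python sketch (the function name brightness_ratio is my own choice, not from the text) computes the brightness ratio of two stars from their apparent magnitudes:

```python
def brightness_ratio(m: float, n: float) -> float:
    """Return Bm / Bn, the brightness ratio of a star of
    magnitude m to a star of magnitude n (Pogson's relation)."""
    return 100 ** ((n - m) / 5)

# A difference of one magnitude corresponds to a factor of ~2.512,
# and a difference of five magnitudes to a factor of exactly 100.
print(brightness_ratio(1, 2))  # ~2.512
print(brightness_ratio(1, 6))  # 100.0
```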
The apparent magnitude of the brightest star, Sirius, is -1.4; among the planets, Venus is the brightest, with a magnitude of -4.4. The apparent magnitude of the Sun is -26.8, so the Sun appears about 10 billion times brighter than the brightest star Sirius.
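Plugging the figures quoted above into the same relation confirms the Sun/Sirius comparison:

```python
# Sun (magnitude -26.8) versus Sirius (magnitude -1.4):
# the magnitude difference is 25.4, so the brightness ratio is
ratio = 100 ** ((-1.4 - (-26.8)) / 5)
print(f"{ratio:.2e}")  # ~1.45e+10, i.e. about 10 billion
```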