Star brightness and apparent magnitude
Hipparchus of ancient Greece classified the stars in the night sky by their brightness. He labeled the brightest stars ‘magnitude 1’ and the faintest stars visible to the naked eye ‘magnitude 6,’ dividing the stars in between into magnitudes 2, 3, 4, and 5 according to their brightness.
By the 19th century, scientists could measure the brightness of stars accurately. They found that a magnitude 1 star is about 100 times brighter than a magnitude 6 star, so each difference of 1 magnitude corresponds to a brightness ratio of about 2.5 (more precisely, the fifth root of 100, about 2.512).
Magnitude defined this way, from how bright a star appears to our eyes on Earth, is called the star’s apparent magnitude.
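Because each magnitude step is a fixed brightness ratio, the difference in magnitudes directly gives the ratio of brightness. The short Python sketch below illustrates this; the function name brightness_ratio is only an illustrative choice, not part of any standard library.

```python
def brightness_ratio(mag_faint: float, mag_bright: float) -> float:
    """How many times brighter the star of magnitude mag_bright is
    than the star of magnitude mag_faint.

    Each 5-magnitude difference is exactly a factor of 100,
    so the ratio is 100 ** (difference / 5).
    """
    return 100 ** ((mag_faint - mag_bright) / 5)


print(brightness_ratio(6, 1))  # ~100: magnitude 1 vs magnitude 6
print(brightness_ratio(2, 1))  # ~2.512: a single magnitude step
```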
Absolute magnitude
Even a bright star looks faint when it is far away, so apparent magnitude alone cannot be used to compare the energy that stars emit. To compare the energy stars emit, we need to know how bright each star would be when viewed from the same distance.
Absolute magnitude is the brightness a star would have if it were placed 10 pc (parsecs) from Earth.
For example, the Sun looks very bright, with an apparent magnitude of −26.8, but its absolute magnitude is only 4.8.
The relation between a star’s magnitude and distance
If a star’s apparent magnitude is m, its absolute magnitude is M, and its distance is d (in pc), then the following relation holds:
\( m - M = 5(\log_{10}{d} - 1) \)
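As a quick check, the sketch below applies this relation in both directions. It is a minimal Python example with illustrative function names; the Sun’s distance of 1 AU is about 4.848 × 10⁻⁶ pc, and plugging that in together with its apparent magnitude of −26.8 recovers the absolute magnitude of roughly 4.8 quoted above.

```python
import math


def absolute_magnitude(m: float, d_pc: float) -> float:
    """Absolute magnitude M from apparent magnitude m and distance d in pc,
    using m - M = 5 * (log10(d) - 1)."""
    return m - 5 * (math.log10(d_pc) - 1)


def distance_pc(m: float, M: float) -> float:
    """Distance in parsecs, solving the same relation for d."""
    return 10 ** ((m - M) / 5 + 1)


# The Sun: 1 AU is about 4.848e-6 pc, apparent magnitude about -26.8.
print(absolute_magnitude(-26.8, 4.848e-6))  # ~4.8, as stated above
```

The same relation, solved for d as in distance_pc, lets us estimate how far away a star is once both its apparent and absolute magnitudes are known.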