Admin and founder of ‘The Secrets Of The Universe’ and a former intern at the Indian Institute of Astrophysics, Bangalore, I am a science student pursuing a Master’s in Physics in India. I love to study and write about stellar astrophysics, relativity, and quantum mechanics.
When we look up at the sky, we see many celestial objects. Some are very bright (the Sun and the Moon), while others are barely visible to the naked eye (nebulae and faint stars). The brightness of an object depends on several parameters. While the distance to the object is one major factor, its brightness is also a function of its energy output. So there must be a definite scheme for classifying celestial objects according to their brightness. In this eighth article of the Basics of Astrophysics series, let us learn about the concept of magnitude in astrophysics.
Before we start, I want to emphasize that the concept of magnitude is important not only for astrophysicists but also for amateur astronomers who love sky gazing. In this article, we will also learn about the types of magnitudes used by astronomers. So let us start.
What Is Magnitude In Astrophysics?
Generally, the magnitude of a quantity refers to its numeric value. For example, consider the velocity v = 40 km/hr due east. This vector quantity has a magnitude and a direction: the magnitude is 40 km/hr, and the direction is east. But the concept of magnitude in astrophysics has nothing to do with vectors. In astrophysics, magnitude is a logarithmic measure of the energy we receive from a body, in a given band or over the entire EM spectrum. Roughly speaking, it is a measure of the “brightness” of an object.
Types of Magnitudes
There are three main types of magnitudes in Astrophysics. Each type has its own significance and applications.
The apparent magnitude of an object is a unitless number that describes how bright the object appears as seen from Earth. To understand this concept, we need to go back in history and see where the idea originated.
In the 2nd century BC, Hipparchus first classified the stars according to their apparent brightness, cataloging some 1,000 stars into 6 groups. The group 1 stars were the brightest, the group 2 stars were fainter than group 1, and so on; the group 6 stars were the faintest. In the 2nd century AD, Ptolemy used the same approach in his own catalog. In the early 19th century, the British astronomer William Herschel found that the stars of group 1 were about 100 times brighter than the stars of group 6. Later, in 1856, N. R. Pogson formalized a scale in which the ratio of the brightness of objects in two successive groups is the same.
This leads to the conclusion that a first magnitude star is about 2.512 times brighter than a second magnitude star, and so on for each successive step. In this way, a sixth magnitude star is 100 times fainter than a first magnitude star: 2.512 is the fifth root of 100. Mathematically, if B(m) and B(n) are the brightnesses of two stars with magnitudes m and n (n > m), then
B(m)/B(n) = 100^((n−m)/5) ≈ (2.512)^(n−m)
This equation tells us something very important: the brighter the object, the lower its magnitude. This means that an object with magnitude −4 is brighter than an object with magnitude, say, +2.
The apparent magnitude of the Sun is −26.74, and that of Sirius, the brightest star in the night sky, is −1.46. So, according to the above equation, the Sun turns out to be more than 10 billion times brighter than Sirius. The image below illustrates the concept briefly. In a nutshell, the apparent magnitude of an object is a number that gives a measure of its brightness: the smaller the number, the brighter the object.
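The Pogson relation above is easy to evaluate numerically. As a minimal sketch (the function name is my own, and the magnitudes are the values quoted in this article):

```python
def brightness_ratio(m, n):
    """Ratio B(m)/B(n) of the brightnesses of two objects with
    magnitudes m and n, via the Pogson relation 100**((n - m) / 5).
    If n > m, the first object is the brighter one."""
    return 100 ** ((n - m) / 5)

# Five magnitude steps correspond to exactly a factor of 100:
print(brightness_ratio(1, 6))

# Sun (m = -26.74) vs Sirius (m = -1.46): a ratio of roughly 10^10
print(brightness_ratio(-26.74, -1.46))
```

Writing the ratio as `100**((n - m)/5)` rather than `2.512**(n - m)` avoids rounding error, since 2.512 is itself only an approximation to the fifth root of 100.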
The apparent magnitude depends on the luminosity of an object, the total amount of energy it radiates per second. Besides this, it also depends on a tricky parameter: the distance. Since distances vary enormously, the apparent magnitude of an astronomical object does not reveal its actual brightness or luminosity. As an example, consider the Sun and the red supergiant star Betelgeuse. The apparent magnitude of the former is −26.74, while that of the latter is +0.50. By the equation above, the Sun appears about 80 billion times brighter than Betelgeuse. But in reality, Betelgeuse is roughly 100,000 times more luminous than the Sun. This is why a new term was introduced: the absolute magnitude.
In the concept of absolute magnitude, we fix the distance of an object at a standard value. The chosen distance is 10 parsecs (32.6 light years). The absolute magnitude of an object is thus the apparent magnitude it would have if it were placed 10 parsecs away. We imagine all objects placed at this distance and compare their luminosities. Mathematically, if m is the apparent magnitude of an object, M is its absolute magnitude, and D is its distance from Earth in parsecs, then
m − M = 5 log₁₀(D) − 5
The quantity m − M depends only on the distance, and hence it is known as the distance modulus. The equation relates an object’s absolute and apparent magnitudes to its distance from us. The absolute magnitudes of most stars lie between about −10 and +15. The absolute magnitude of the Sun is +4.8, which shows that it is an average star in the stellar population: if the Sun were placed at a distance of 10 parsecs, it would appear as a very faint speck in the night sky.
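The distance modulus can be inverted to recover the absolute magnitude. Here is a small sketch (the function name is my own; 206,265 is the number of astronomical units in a parsec):

```python
import math

def absolute_magnitude(m, d_parsec):
    """Absolute magnitude M from apparent magnitude m and distance in
    parsecs, via the distance modulus m - M = 5*log10(D) - 5."""
    return m - 5 * math.log10(d_parsec) + 5

# Sun: m = -26.74 at a distance of 1 AU = 1/206265 parsec,
# which recovers an absolute magnitude of about +4.8
print(absolute_magnitude(-26.74, 1 / 206265))

# Sanity check: at exactly 10 parsecs, M equals m by definition
print(absolute_magnitude(4.8, 10))
```

Reassuringly, plugging in the Sun’s apparent magnitude and its distance in parsecs reproduces the value of +4.8 quoted above.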
When stars are observed visually, the result is termed the visual magnitude. Our retina is insensitive to a wide range of the electromagnetic spectrum. A star can instead be photographed through different filters, and the magnitudes so obtained are termed photovisual magnitudes. Among the filter sets used by researchers are u, g, r, i, and z. You can find magnitude data in resources such as SDSS or SIMBAD, which are widely used by astrophysicists in their research.
All the magnitudes defined so far cover only limited regions of the spectrum. A stellar magnitude based on the radiation measured over the entire electromagnetic spectrum is known as the bolometric magnitude.
But since no single detector is sensitive to all wavelengths of the stellar spectrum, corrections need to be applied to convert any other magnitude into a bolometric magnitude. In particular, the difference between the bolometric magnitude and the photovisual (visual) magnitude is termed the bolometric correction (BC).
The BC of the Sun is −0.11 (under the standard convention, the BC is never positive). The importance of this magnitude in astrophysics can be understood from a simple example. Arcturus is an orange giant star in the constellation Boötes. Visually, it is about 110 times brighter than the Sun, but in the infrared it is some 180 times more powerful. So the total (bolometric) output of Arcturus is far greater than its visual output would suggest.
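Applying a bolometric correction is just a one-line adjustment to the visual magnitude. A minimal sketch, using the Sun’s values quoted above (the function name is my own):

```python
def bolometric_magnitude(m_v, bc):
    """Apply the bolometric correction: M_bol = M_V + BC.
    Since BC <= 0 by convention, M_bol is never fainter than M_V."""
    return m_v + bc

# Sun: visual absolute magnitude ~ +4.8 with BC = -0.11 (values from
# this article) gives a bolometric absolute magnitude of about +4.69
print(bolometric_magnitude(4.8, -0.11))
```

Because the correction is negative, the bolometric magnitude is always numerically smaller (i.e., brighter) than the visual magnitude, reflecting the energy emitted outside the visual band.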
When I was a teenager, the concept of magnitude was the first topic I came across in astrophysics. I was confused because, back then, my physics teacher had just introduced me to the notion of vectors. I later asked him about the connection between the two, and he politely replied, “It is a totally different concept.” The concept of magnitude in astrophysics was an important lesson for me back then, because it helped me locate the planets in the sky.
I placed this article in the 8th slot because we will need it in the upcoming articles. We have now begun a very important topic: stellar astrophysics, and we will now focus on the concepts of this field. In the next article, we will again see the importance of spectroscopy in astrophysics and learn how trillions upon trillions of stars are classified into just 7 categories. I hope you are enjoying the series. Feel free to contact me (Email: firstname.lastname@example.org).