Question
A star A is 100 times brighter than a star B. Then m_B − m_A, the difference in their apparent magnitudes, is-
A. 100
B. 0.01
C. 5
D. 0.2
Solution
We should know that the difference in their apparent magnitudes is equal to −2.5 times the logarithm of the ratio of their intensities:
∴ m_B − m_A = −2.5 log₁₀(I_B / I_A)
Complete step by step answer:
The brightness of a star depends on its composition and how far it is from Earth. Astronomers define star brightness in terms of apparent magnitude (how bright the star appears from Earth) and absolute magnitude (how bright the star would appear at a standard distance of 32.6 light-years, or 10 parsecs).
On this magnitude scale, a brightness ratio of 100 is set to correspond exactly to a magnitude difference of 5. As magnitude is a logarithmic scale, one can always convert a brightness ratio I_B / I_A into the equivalent magnitude difference m_B − m_A by the formula:
∴ m_B − m_A = −2.5 log₁₀(I_B / I_A)
We know that the difference in their apparent magnitudes is equal to −2.5 times the logarithm of the ratio of their intensities. Thus,
m_B − m_A = −2.5 log₁₀(I_B / I_A) ...(1)
where I_A = intensity of star A and I_B = intensity of star B.
Since star A is 100 times brighter, I_A = 100 I_B. Substituting into equation (1):
m_B − m_A = −2.5 × log₁₀(I_B / (100 I_B)) = −2.5 × (log₁₀ 1 − log₁₀ 100) ...(2)
m_B − m_A = −2.5 × (0 − 2) = −2.5 × (−2) = 5 ...(3)
So, the correct answer is “Option C”.
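The calculation above can be verified numerically; here is a minimal Python sketch (the function name and the arbitrary intensity units are chosen for illustration):

```python
import math

def magnitude_difference(i_b, i_a):
    """Pogson's relation: m_B - m_A = -2.5 * log10(I_B / I_A)."""
    return -2.5 * math.log10(i_b / i_a)

# Star A is 100 times brighter than star B (arbitrary intensity units).
i_b = 1.0
i_a = 100.0
print(magnitude_difference(i_b, i_a))  # ≈ 5.0, i.e. a magnitude difference of 5
```

The sign convention matters: because brighter objects have smaller magnitudes, the brighter star A ends up with the lower magnitude, so m_B − m_A comes out positive.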
Note:
We should know that magnitude in astronomy measures the brightness of a star or other celestial body. The brighter the object, the lower the number assigned as its magnitude.