Question
A bulb when cold has 1 ohm resistance. It draws a current of 0.3 ampere when glowing from a source of three volts. Calculate the resistance of the bulb when glowing and explain the reason for the difference in resistance.
Solution
Ohm's Law relates current, voltage, and resistance in an electrical circuit. For metallic conductors, resistance increases with temperature. As the temperature rises, the vibrational motion of the atoms of the conductor increases, so the probability of collisions between atoms and electrons rises. As a result, the resistance of the conductor increases.
Complete step-by-step solution:
Given: resistance of the bulb when cold, R_cold = 1 Ω
Current, I = 0.3 A
Voltage, V = 3 V
We need to calculate the resistance of the bulb when glowing.
We will use Ohm's law:
V = IR
Rearranging for resistance:
R = V/I
Substituting the values of voltage and current:
R = 3/0.3
⟹ R = 10 Ω
The resistance of the bulb when glowing is therefore 10 Ω.
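The calculation above can be sketched as a short script; the variable names are illustrative, and the values come from the problem statement:

```python
# Ohm's law: V = I * R, rearranged to R = V / I
V = 3.0  # volts, from the problem statement
I = 0.3  # amperes, from the problem statement

R_glowing = V / I  # resistance of the bulb while glowing
print(R_glowing)   # 10.0 ohms
```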
The reason for the difference in resistance is that resistance depends on temperature. While the bulb glows, current flows through the filament and heats it, and the resistance of the filament increases as its temperature rises. Hence the bulb's resistance when glowing is greater than when it is cold.
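To illustrate the temperature dependence, a rough sketch using the linear model R = R_cold(1 + αΔT). The temperature coefficient α is an assumed textbook value for a tungsten filament (about 4.5 × 10⁻³ per kelvin), not a number given in the problem, so the result is only an order-of-magnitude estimate:

```python
# Linear temperature model of resistance: R = R_cold * (1 + alpha * delta_T)
R_cold = 1.0      # ohms, resistance when cold (from the problem)
R_glowing = 10.0  # ohms, resistance when glowing (from the Ohm's-law result)
alpha = 4.5e-3    # per kelvin, ASSUMED coefficient for a tungsten filament

# Solve for the temperature rise delta_T
delta_T = (R_glowing / R_cold - 1) / alpha
print(round(delta_T))  # roughly 2000 K
```

A temperature rise on the order of a couple of thousand kelvin is consistent with how hot an incandescent filament actually runs, which is why the tenfold change in resistance is physically plausible.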
Note: Resistance is difficult to measure directly in an operating circuit, so Ohm's Law is useful whenever it needs to be calculated. Rather than switching off the circuit to measure resistance, a technician can find R from the measured voltage and current using Ohm's Law.