
Question


A bulb when cold has \(1\) ohm resistance. It draws a current of \(0.3\) ampere when glowing from a source of three volts. Calculate the resistance of the bulb when glowing and explain the reason for the difference in resistance.

Explanation

Solution

Ohm's law relates the current, voltage, and resistance in an electrical circuit. For a metallic conductor, resistance increases with temperature: as the temperature rises, the vibrational motion of the atoms of the conductor increases, so collisions between the conduction electrons and the atoms become more frequent. As a result, the resistance of the conductor increases.
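As a rough guide (this relation is not part of the original problem), the resistance of a metallic filament can be modeled by the standard linear approximation, where \(R_0\) is the resistance at a reference temperature \(T_0\) and \(\alpha\) is the temperature coefficient of resistivity of the filament material:

\[R(T) \approx R_0\left[1 + \alpha\,(T - T_0)\right]\]

For a typical filament metal such as tungsten, \(\alpha\) is positive, so a hot glowing filament has a noticeably larger resistance than a cold one.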

Complete step-by-step solution:
Given: Resistance of the bulb when cold, \(R_{cold} = 1\ \Omega\)
Current, \(I = 0.3\ \text{A}\)
Voltage, \(V = 3\ \text{V}\)
We need to calculate the resistance of the bulb when glowing.
We will use Ohm's law:
\(V = IR\)
Rearranging for the resistance:
\(R = \dfrac{V}{I}\)
Substituting the values of voltage and current:
\(R = \dfrac{3}{0.3} = 10\ \Omega\)
The resistance of the bulb when glowing is \(10\ \Omega\).
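For readers who want to check the arithmetic programmatically, here is a minimal Python sketch (not part of the original solution) that applies \(R = V/I\) with the given values:

```python
# Minimal sketch: compute the glowing resistance from Ohm's law, R = V / I.
V = 3.0            # applied voltage in volts
I = 0.3            # measured current in amperes
R_glowing = V / I
print(f"Resistance when glowing: {R_glowing:.1f} ohm")  # prints 10.0 ohm
```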
The reason for the difference in resistance is that resistance depends on temperature. When the bulb is glowing, current flows through the filament and its temperature rises, and the resistance of the filament increases with temperature. Hence, when the bulb glows, its resistance is greater than when it is cold.

Note: Resistance cannot be measured directly in an operating circuit, so Ohm's law is useful when it needs to be determined. Rather than switching off the circuit to measure the resistance, a technician can find R from the measured voltage and current using this form of Ohm's law.