
Question


How do you calculate the ionization energy of a hydrogen atom in its ground state?


Solution

Ionization energy is the minimum energy required to remove an electron from the outermost orbital of a gaseous atom or ion. A greater ionization energy means it is more difficult to remove the electron from its orbit. It is usually quoted per mole of atoms and expressed in $kJ\,mol^{-1}$.

Complete answer:
We can calculate the ionization energy of a hydrogen atom in its ground state using Bohr's model of the hydrogen atom, according to which electrons can only orbit the nucleus in specific shells or orbits of fixed radius.
According to Bohr's model, an electron absorbs energy in the form of a photon and is excited to a higher energy level. After jumping to the higher energy level, the electron is in a less stable state and therefore emits a photon to return to its stable energy level.
The energy difference between the energy levels involved in the transition can be calculated as follows:
$\Delta E = E_{n_f} - E_{n_i}$
We know that for the hydrogen atom the energy of a level can be expressed in terms of the principal quantum number as $E_n = \dfrac{-13.6}{n^2}\ \text{eV}$. Substituting this into the expression, the energy difference for a transition can be written as follows:
$\Rightarrow \Delta E = \dfrac{-13.6}{n_f^2} - \left( \dfrac{-13.6}{n_i^2} \right)$
$\Rightarrow \Delta E = -13.6\left( \dfrac{1}{n_f^2} - \dfrac{1}{n_i^2} \right)$
Now we need to calculate the ionization energy of a hydrogen atom in its ground state, i.e., the electron leaves the atom and is assumed to reach infinity. Thus the transition takes place from $n = 1$ to $n = \infty$, which means $n_f = \infty$ and $n_i = 1$. Substituting these values:
$\Rightarrow \Delta E = -13.6\left( \dfrac{1}{\infty} - 1 \right)$
$\Rightarrow \Delta E = -13.6(0 - 1) \qquad \left[ \because \dfrac{1}{\infty} = 0 \right]$
$\Rightarrow \Delta E = 13.6\ \text{eV}$
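
As a quick numerical check, the same substitution can be carried out in a few lines of Python. This is a minimal sketch, assuming a helper function `delta_E` of our own naming; `math.inf` is used for the final level so that $1/n_f^2$ evaluates to zero.

```python
import math

def delta_E(n_i, n_f):
    """Energy change (in eV) for a hydrogen-atom transition, using E_n = -13.6/n^2 eV."""
    return -13.6 * (1 / n_f**2 - 1 / n_i**2)

# Ionization from the ground state: n_i = 1, n_f -> infinity
print(delta_E(1, math.inf))  # 13.6 eV
```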
Converting the energy difference into joules:
$\Rightarrow \Delta E = 2.18 \times 10^{-18}\ \text{J} \qquad \left[ \because 1\ \text{eV} = 1.602 \times 10^{-19}\ \text{J} \right]$
This is the required ionization energy for the single electron of one hydrogen atom. But we know that ionization energy usually refers to one mole of atoms. So the ionization energy of one mole of hydrogen atoms, i.e., $6.023 \times 10^{23}$ atoms, will be as follows:
$I.E. = 2.18 \times 10^{-18} \times 6.023 \times 10^{23}\ \text{J}\,\text{mol}^{-1}$
$\Rightarrow I.E. = 1313 \times 10^{3}\ \text{J}\,\text{mol}^{-1}$
$\Rightarrow I.E. = 1313\ \text{kJ}\,\text{mol}^{-1}$
Thus, we can conclude that the ionization energy of a hydrogen atom is $1313\ \text{kJ}\,\text{mol}^{-1}$.
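
For completeness, the unit conversions above can also be reproduced numerically. This is a minimal sketch using the same constants quoted in the solution (the eV-to-joule factor and the Avogadro number as written above); the variable names are our own.

```python
E_eV = 13.6                      # ionization energy per atom, in eV
eV_to_J = 1.602e-19              # 1 eV in joules
N_A = 6.023e23                   # Avogadro's number as used above, mol^-1

E_J = E_eV * eV_to_J             # ~2.18e-18 J per atom
E_kJ_per_mol = E_J * N_A / 1000  # ~1313 kJ mol^-1
print(E_J, E_kJ_per_mol)
```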

Note:
It is important to note that Bohr's model is only applicable to a single electron system. The infinity level here represents the highest possible energy an electron may have as a part of a hydrogen atom. It represents the point at which the ionization of a hydrogen atom occurs to form a positively charged ion.