Thermionic Emission

Part of Vacuum Tubes

Thermionic emission is the escape of electrons from a heated metal surface — the fundamental phenomenon underlying all vacuum tubes.

Why This Matters

Everything in vacuum tube technology traces back to one physical fact: heat a metal sufficiently and electrons escape from its surface. Thomas Edison noticed this effect in 1883 when he found that current could flow between a heated filament and a separate plate inside an evacuated glass bulb. Owen Richardson worked out the physics in detail and won a Nobel Prize for it in 1928. The devices built on this principle powered all electronic technology for fifty years.

Understanding thermionic emission explains why tubes need heaters, why cathode temperature affects maximum current, why tube emission declines with age, and why different cathode materials have such different emission efficiencies. This understanding is the basis for diagnosing emission-limited tube failures and for evaluating whether a salvaged tube still has useful life remaining.

For anyone building vacuum tubes from scratch — a real possibility for communities that want sustainable electronics capability — thermionic emission physics directly governs the choice of cathode material, the operating temperature required, and the expected emission current from a given cathode area.

Richardson-Dushman Equation

The maximum emission current density from a heated surface is given by the Richardson-Dushman equation:

J = A × T² × e^(−φ/kT)

where:

  • J = emission current density (amperes per square meter)
  • A = Richardson constant, a material property (amperes per m² per K²)
  • T = absolute temperature (Kelvin)
  • φ = work function of the surface (electron-volts, eV)
  • k = Boltzmann’s constant (8.617 × 10⁻⁵ eV/K)
  • e = base of natural logarithm (2.718)

The work function φ is the energy barrier an electron must overcome to escape the metal surface. Electrons in a metal fill energy states up to the Fermi level; to escape, an electron needs thermal energy at least φ above that level. Only the small high-energy tail of the thermal distribution qualifies, and at higher temperatures that tail contains more electrons.

The exponential term e^(−φ/kT) dominates the equation. It describes how the fraction of electrons with enough energy to escape varies with temperature. This function is extremely sensitive to temperature: a 10% increase in temperature (from 1000K to 1100K) can increase emission by orders of magnitude, depending on the work function.
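A short numerical sketch makes this concrete. The Python below evaluates the Richardson-Dushman equation for pure tungsten, using the A ≈ 600,000 A/m²K² and φ ≈ 4.5 eV values quoted later in this article; the function name `emission_density` is illustrative, not a standard library routine.

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann's constant in eV/K

def emission_density(a_richardson, temp_k, work_fn_ev):
    """Richardson-Dushman current density J = A * T^2 * e^(-phi/kT), in A/m^2."""
    return a_richardson * temp_k**2 * math.exp(-work_fn_ev / (K_BOLTZMANN_EV * temp_k))

# Pure tungsten: A ~ 600,000 A/m^2 K^2, phi ~ 4.5 eV
j_cool = emission_density(6e5, 1000, 4.5)
j_hot = emission_density(6e5, 1100, 4.5)
print(f"J at 1000K: {j_cool:.3g} A/m^2")
print(f"J at 1100K: {j_hot:.3g} A/m^2")
print(f"increase: {j_hot / j_cool:.0f}x")  # roughly two orders of magnitude
```

For a high work function like tungsten's, the 10% temperature rise multiplies emission by well over a hundred; for an oxide cathode (φ ≈ 1.2 eV) the same rise yields only a few-fold increase, which is why the sensitivity depends on the work function.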

Work Functions and Emission Efficiency

The work function is the key material property for cathode selection. Lower work function means more emission at lower temperature.

Pure tungsten: φ ≈ 4.5 eV. At 2200°C (2473K), the emission is adequate but requires enormous heater power per unit emission. The practical emission efficiency is about 1-5 mA/W.

Thoriated tungsten: φ ≈ 2.6 eV (for the thorium monolayer surface). At 1600°C (1873K), emission is comparable to pure tungsten at 2200°C. The efficiency is approximately 50-100 mA/W.

Barium-strontium oxide on nickel: φ ≈ 1.0-1.3 eV. At 700-900°C (973-1173K), emission is prolific. Efficiencies of 100-400 mA/W make this the preferred cathode for all receiving tubes.

The Richardson constant A also varies by material. For tungsten, A ≈ 600,000 A/m²K². For oxide cathodes, A is effectively much lower (the Richardson-Dushman equation strictly describes metallic emission, and oxide-cathode emission involves more complex mechanisms), but the very low work function dominates and produces high emission at moderate temperatures.
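Because the emission mechanisms differ across these materials, absolute currents are hard to compare directly, but the exponential factor e^(−φ/kT) alone shows why a low work function wins. A sketch in Python, using the work functions and operating temperatures given above (the oxide figures are illustrative midpoints of the quoted ranges):

```python
import math

K_EV = 8.617e-5  # Boltzmann's constant, eV/K

# (material, work function in eV, typical operating temperature in K)
cathodes = [
    ("pure tungsten",      4.5, 2473),  # 2200 C
    ("thoriated tungsten", 2.6, 1873),  # 1600 C
    ("oxide (Ba/Sr)",      1.2, 1073),  # ~800 C, midpoint of the 700-900 C range
]

for name, phi, temp in cathodes:
    factor = math.exp(-phi / (K_EV * temp))
    print(f"{name:20s} exp(-phi/kT) = {factor:.1e}")
```

Despite running some 1400°C cooler, the oxide cathode's exponential factor comes out thousands of times larger than tungsten's, which is what makes its heater efficiency so much better.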

Temperature and Emission

The exponential dependence of emission on temperature means small temperature changes have large effects. For an oxide cathode with φ = 1.2 eV:

At 900°C (1173K): exponential term = e^(−1.2/(8.617×10⁻⁵ × 1173)) = e^(−11.87) ≈ 7.0 × 10⁻⁶

At 850°C (1123K): exponential term = e^(−1.2/(8.617×10⁻⁵ × 1123)) = e^(−12.40) ≈ 4.1 × 10⁻⁶

A 50°C temperature reduction (from 900 to 850°C) cuts the exponential term by a factor of about 1.7, a 41% reduction in emission from only a 4.3% drop in absolute temperature (1173K to 1123K). This steep dependence has practical implications:
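The arithmetic above can be checked with a few lines of Python (the T² prefactor, ignored here as in the text, would add only another ~9% to the ratio):

```python
import math

K_EV = 8.617e-5  # Boltzmann's constant, eV/K
PHI = 1.2        # oxide-cathode work function from the text, eV

def boltzmann_factor(temp_k):
    return math.exp(-PHI / (K_EV * temp_k))

hot = boltzmann_factor(1173)   # 900 C
cool = boltzmann_factor(1123)  # 850 C
print(f"ratio: {hot / cool:.2f}")          # ~1.70
print(f"reduction: {1 - cool / hot:.1%}")  # ~41%
```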

Operating tubes at or near specification voltage is critical. A 10% reduction in heater voltage reduces cathode temperature and can reduce emission by 30-50%. Tubes operated at low heater voltage exhibit weak emission, distortion, and reduced gain — often misdiagnosed as a dying tube when the real problem is a low heater supply voltage.

Conversely, excess heater voltage accelerates cathode aging. Operating an oxide cathode at 10% above rated heater voltage raises the temperature by approximately 30-50°C, increasing emission but also accelerating the depletion of the active barium layer. The cathode’s finite supply of active material depletes faster at higher temperature. Running tubes on standardized voltages (6.3V ±5%) maximizes service life.

Emission Decline and Tube Aging

New oxide cathodes have a thick layer of barium oxide with a surface of free barium atoms providing prolific emission. Over thousands of hours of operation:

  1. Barium evaporates from the surface — the most energetic surface barium atoms have enough energy to escape into the vacuum, where they are deposited on the glass and getter.

  2. Contamination poisons the surface — residual gases (even at 10⁻⁵ Pa vacuum) slowly react with surface barium, converting active metallic barium to barium oxide, hydroxide, or carbonate, which have higher effective work functions.

  3. Ion bombardment sputters the oxide coating — positive ions formed by ionization of residual gas are accelerated to the cathode by the electric field and mechanically displace oxide material.

  4. The oxide-metal interface resistance increases — a layer of barium orthosilicate (Ba₂SiO₄) forms between the oxide coating and the nickel substrate, increasing the effective series resistance of the cathode circuit.

The practical result: over 5,000 to 15,000 hours of operation (depending on tube type and operating conditions), emission current at a given operating point decreases. The tube first shows signs of distortion at high signal levels (insufficient emission to supply peak currents), then reduced output power, then reduced gain, until finally the tube cannot sustain normal operation.

Measuring Emission

A tube tester measures emission by applying a standard plate voltage and measuring the resulting plate current. Most commercial tube testers include a “mutual conductance” (transconductance) measurement that simulates actual operating conditions more accurately than a simple DC emission test.

For field testing without a tube tester, insert a 1-ohm resistor between cathode and ground in the socket. Apply normal operating voltages. Measure the cathode current (mV across the 1-ohm resistor = mA of current). Compare against specified plate current at the same operating conditions. A tube with less than 70% of specified plate current has reduced emission and will perform below specification.
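The conversion and the 70%-of-spec rule are simple enough to capture in a couple of helper functions; the names below are illustrative, not from any standard test tool:

```python
def cathode_current_ma(mv_across_sense: float) -> float:
    """With a 1-ohm sense resistor, millivolts measured equal milliamps of cathode current (I = V/R)."""
    return mv_across_sense

def emission_ok(measured_ma: float, spec_ma: float, threshold: float = 0.70) -> bool:
    """Pass if measured plate current is at least 70% of the specified value."""
    return measured_ma / spec_ma >= threshold

# Hypothetical example: tube specified for 10 mA plate current, 6.2 mV read across the resistor
print(emission_ok(cathode_current_ma(6.2), 10.0))  # False: reduced emission
```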

The maximum emission current test (grid driven fully positive) reveals the total emission capacity. Compare to the tube’s published maximum emission figure. Tubes with maximum emission below 60% of specification are effectively worn out and should be replaced.