The Zanstra method is a technique for determining the temperature of the central stars of planetary nebulae. It was developed by Herman Zanstra in 1927.

It is assumed that the nebula is optically thick in the Lyman continuum, meaning that all ionizing photons from the central star are absorbed inside the nebula. Based on this assumption, the ratio of the stellar flux at a reference frequency to the flux in a nebular line such as Hβ can be used to determine the central star's effective temperature.
For a pure hydrogen nebula, ionization equilibrium requires that the number of ionizing photons emitted per unit time by the central star be balanced by the rate at which protons and electrons recombine to neutral hydrogen inside the Strömgren sphere of the nebula. Ionizations can only be caused by photons with at least the frequency $\nu_0$ of the Lyman limit:

$$\int_{\nu_0}^{\infty} \frac{L_\nu}{h\nu}\,d\nu = \int_0^{r_1} n_p n_e \alpha_B\,dV$$
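The left-hand side of this balance, the stellar ionizing-photon rate, can be evaluated numerically. The sketch below assumes the central star radiates as a blackbody and integrates its photon flux above the Lyman limit with a simple trapezoidal rule; the function name, grid resolution, and integration cutoff are illustrative choices, not part of the method itself.

```python
import math

H = 6.62607015e-27   # Planck constant [erg s]
K_B = 1.380649e-16   # Boltzmann constant [erg/K]
C = 2.99792458e10    # speed of light [cm/s]
NU_0 = 3.288e15      # Lyman-limit frequency [Hz]

def ionizing_photon_rate(t_eff, radius_cm):
    """Ionizing photon rate of a blackbody star:
    Q = 4 pi R^2 * pi * int_{nu0}^inf B_nu / (h nu) dnu.

    Integrates the photon flux above the Lyman limit with a
    trapezoidal rule on a logarithmic frequency grid.
    """
    n = 2000
    lo = math.log(NU_0)
    hi = math.log(NU_0 + 50 * K_B * t_eff / H)  # tail is negligible beyond this
    total, prev = 0.0, None
    for i in range(n + 1):
        nu = math.exp(lo + (hi - lo) * i / n)
        # photon rate per unit area, frequency, solid angle:
        # B_nu / (h nu) = (2 nu^2 / c^2) / (exp(h nu / k T) - 1)
        f = 2.0 * nu**2 / C**2 / math.expm1(H * nu / (K_B * t_eff))
        if prev is not None:
            total += 0.5 * (prev[1] + f) * (nu - prev[0])
        prev = (nu, f)
    return 4.0 * math.pi**2 * radius_cm**2 * total
```

For a hot central star (T_eff around 10^5 K with a radius of roughly 0.1 solar radii) this gives a rate of order 10^46 to 10^47 photons per second.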
Here, $r_1$ is the radius of the Strömgren sphere, $n_p$ and $n_e$ are the proton and electron number densities, $L_\nu$ is the spectral luminosity of the central star, and $\alpha_B$ is the case B recombination coefficient of hydrogen.
The ratio between the number of photons emitted by the nebula in the Hβ line and the number of ionizing photons from the central star can then be estimated:

$$\frac{L_{\mathrm{H}\beta}}{\displaystyle\int_{\nu_0}^{\infty}\frac{L_\nu}{h\nu}\,d\nu} \approx h\nu_{\mathrm{H}\beta}\,\frac{\alpha^{\mathrm{eff}}_{\mathrm{H}\beta}}{\alpha_B}$$

where $\alpha^{\mathrm{eff}}_{\mathrm{H}\beta}$ is the effective recombination coefficient for the Hβ line.
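Plugging in standard tabulated case B values for hydrogen at an electron temperature of 10^4 K (quoted here as illustrative inputs, not taken from this article) shows that roughly one Hβ photon is emitted for every eight to nine ionizing photons absorbed:

```python
# Case B recombination coefficients for hydrogen at T_e = 10^4 K.
# Standard tabulated values, used here as illustrative inputs.
ALPHA_B = 2.59e-13          # total case B coefficient [cm^3 s^-1]
ALPHA_HBETA_EFF = 3.03e-14  # effective Hbeta coefficient [cm^3 s^-1]

# Fraction of absorbed ionizing photons reprocessed into Hbeta photons
hbeta_per_ionizing = ALPHA_HBETA_EFF / ALPHA_B   # ~0.117

print(f"Hbeta photons per ionizing photon: {hbeta_per_ionizing:.3f}")
print(f"i.e. about 1 Hbeta photon per {1 / hbeta_per_ionizing:.1f} ionizing photons")
```

This weak temperature sensitivity of the recombination coefficients is what makes the Hβ flux a reliable counter of ionizing photons.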
Given a stellar reference frequency $\nu_s$, the Zanstra ratio is defined as

$$Z = \frac{L_{\nu_s}}{\displaystyle\int_{\nu_0}^{\infty}\frac{L_\nu}{h\nu}\,d\nu} = h\nu_{\mathrm{H}\beta}\,\frac{\alpha^{\mathrm{eff}}_{\mathrm{H}\beta}}{\alpha_B}\,\frac{F_{\nu_s}}{F_{\mathrm{H}\beta}}$$

with $F_{\nu_s}$ the stellar flux at the reference frequency and $F_{\mathrm{H}\beta}$ the nebular Hβ flux, both as observed at Earth. Because only the ratio of the two fluxes enters, $Z$ is independent of the distance to the nebula. If the central star is modeled as a blackbody, $Z$ depends only on the effective temperature, so comparing the measured ratio with the predicted one yields the Zanstra temperature.
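To turn a measured ratio into a temperature, $Z$ is computed as a function of $T$ for a model spectrum and the relation is inverted. A minimal sketch, assuming a blackbody spectrum (so that all geometric factors cancel in the ratio) and inverting $Z(T)$ by bisection; the frequency grid, temperature bracket, and function names are choices made here for illustration:

```python
import math

H = 6.62607015e-27   # Planck constant [erg s]
K_B = 1.380649e-16   # Boltzmann constant [erg/K]
C = 2.99792458e10    # speed of light [cm/s]
NU_0 = 3.288e15      # Lyman-limit frequency [Hz]

def planck(nu, t):
    """Blackbody specific intensity B_nu [erg s^-1 cm^-2 Hz^-1 sr^-1]."""
    return 2.0 * H * nu**3 / C**2 / math.expm1(H * nu / (K_B * t))

def zanstra_ratio(t_eff, nu_s):
    """Z = B_nu(nu_s) / int_{nu0}^inf B_nu/(h nu) dnu for a blackbody.

    Since L_nu is proportional to B_nu for a blackbody, the 4 pi^2 R^2
    factor cancels in the ratio. Trapezoidal rule on a log grid.
    """
    n = 2000
    lo = math.log(NU_0)
    hi = math.log(NU_0 + 60 * K_B * t_eff / H)
    total, prev = 0.0, None
    for i in range(n + 1):
        nu = math.exp(lo + (hi - lo) * i / n)
        f = planck(nu, t_eff) / (H * nu)
        if prev is not None:
            total += 0.5 * (prev[1] + f) * (nu - prev[0])
        prev = (nu, f)
    return planck(nu_s, t_eff) / total

def zanstra_temperature(z_obs, nu_s, t_lo=2e4, t_hi=5e5):
    """Invert Z(T) by bisection; Z decreases as T rises, because hotter
    stars emit relatively more ionizing photons."""
    for _ in range(80):
        t_mid = 0.5 * (t_lo + t_hi)
        if zanstra_ratio(t_mid, nu_s) > z_obs:
            t_lo = t_mid   # too cool: Z still above the observed value
        else:
            t_hi = t_mid
    return 0.5 * (t_lo + t_hi)
```

Given an observed flux ratio converted into $Z$ via the constant factor $h\nu_{\mathrm{H}\beta}\,\alpha^{\mathrm{eff}}_{\mathrm{H}\beta}/\alpha_B$, `zanstra_temperature` returns the blackbody temperature that reproduces it.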