In electromagnetics, the antenna factor (AF, units: m⁻¹, reciprocal meter) is defined as the ratio of the electric field E (units: V/m or μV/m) to the voltage V (units: V or μV) induced across the terminals of an antenna:
$$\mathrm{AF} = \frac{|E|}{|V|}$$
If all quantities are expressed logarithmically in decibels instead of SI units, the above equation becomes
$$\mathrm{AF}_{\mathrm{dB/m}} = E_{\mathrm{dBV/m}} - V_{\mathrm{dBV}}$$
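The equivalence of the linear and logarithmic forms can be checked numerically. The field and voltage values below are illustrative examples, not taken from the text:

```python
import math

# Hypothetical measured values (illustrative assumptions):
E = 0.2    # electric field at the antenna, V/m
V = 0.01   # voltage induced across the antenna terminals, V

# Linear antenna factor, units 1/m
af_linear = E / V

# Logarithmic form: AF_dB/m = E_dBV/m - V_dBV
E_dBVm = 20 * math.log10(E)   # field strength in dBV/m
V_dBV = 20 * math.log10(V)    # terminal voltage in dBV
af_dB = E_dBVm - V_dBV

# The two forms agree: 20*log10(AF) equals the dB difference
assert abs(20 * math.log10(af_linear) - af_dB) < 1e-9
print(af_linear, af_dB)  # 20.0 (1/m) and ~26.02 dB/m
```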
The voltage measured at the output terminals of an antenna is not a direct measure of the field intensity: it is affected by the antenna's gain, its aperture characteristics, and loading effects.[1]
For a magnetic field, with units of A/m, the corresponding antenna factor is in units of A/(V⋅m). For the relationship between the electric and magnetic fields, see the impedance of free space.
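The impedance of free space mentioned above ties the two antenna factors together: for a plane wave, E = Z₀H, so a magnetic antenna factor is the electric one divided by Z₀. A quick numerical check (a sketch using the SI constant values):

```python
import math

# Vacuum permeability and permittivity (SI values)
mu0 = 4 * math.pi * 1e-7    # H/m
eps0 = 8.8541878128e-12     # F/m

# Impedance of free space: Z0 = sqrt(mu0 / eps0) ~ 377 ohms
Z0 = math.sqrt(mu0 / eps0)
print(round(Z0, 2))  # 376.73

# For a plane wave E = Z0 * H, so an electric antenna factor E/V
# divided by Z0 gives the magnetic antenna factor H/V in A/(V*m).
af_electric = 20.0                 # example value, 1/m (assumption)
af_magnetic = af_electric / Z0     # A/(V*m)
```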
For a 50 Ω load, knowing that $P_D A_e = P_r = V^2/R$ and $E^2 = \sqrt{\mu_0/\varepsilon_0}\, P_D \approx 377\, P_D$:

$$\mathrm{AF} = \frac{E}{V} = \sqrt{\frac{377\, P_D}{50\, P_D A_e}} = \frac{9.73}{\lambda \sqrt{G}}$$
where $P_D$ is the power density of the incident wave, $P_r$ is the power delivered to the load $R$, $A_e = \lambda^2 G / (4\pi)$ is the antenna's effective aperture, $\mu_0$ is the vacuum permeability, $\varepsilon_0$ is the vacuum permittivity, $\lambda$ is the wavelength, and $G$ is the antenna gain.
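The numeric constant 9.73 follows from substituting the effective aperture into the square root: $\sqrt{377 \cdot 4\pi / 50} \approx 9.73$. A sketch verifying this, with an assumed wavelength and the textbook linear gain of a half-wave dipole as example inputs:

```python
import math

# Example inputs (assumptions, not from the text):
wavelength = 0.3   # m, i.e. about 1 GHz
G = 1.64           # linear gain of an ideal half-wave dipole

# Effective aperture A_e = lambda^2 * G / (4*pi)
A_e = wavelength**2 * G / (4 * math.pi)

# Antenna factor from the derivation: sqrt(377 / (50 * A_e))
af_exact = math.sqrt(377 / (50 * A_e))

# Closed-form result: 9.73 / (lambda * sqrt(G))
af_approx = 9.73 / (wavelength * math.sqrt(G))

# The two agree to the precision of the rounded constant 9.73
assert abs(af_exact - af_approx) / af_exact < 1e-3
print(round(af_exact, 2))  # ~25.34 (1/m)
```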
For antennas that are not characterized by a physical aperture area, such as monopoles and dipoles made of thin rod conductors, the effective length (units: m) is used instead to relate the induced voltage to the electric field.
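Since the effective length relates voltage to field by V = E·ℓ_eff, the antenna factor of such an antenna is simply 1/ℓ_eff in this idealization (loading effects ignored). A sketch, assuming the textbook effective length λ/π of an ideal half-wave dipole:

```python
import math

# Assumed example: ideal half-wave dipole with sinusoidal current
# distribution, for which the effective length is lambda / pi.
wavelength = 0.3                  # m (assumption)
l_eff = wavelength / math.pi      # effective length, m

# V = E * l_eff  =>  AF = E / V = 1 / l_eff  (loading effects ignored)
af = 1 / l_eff                    # antenna factor, 1/m
print(round(af, 3))  # ~10.472
```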