Unitarity gauge explained

In theoretical physics, the unitarity gauge or unitary gauge is a particular choice of gauge fixing in a gauge theory with spontaneous symmetry breaking. In this gauge, the scalar fields responsible for the Higgs mechanism are transformed into a basis in which their Goldstone boson components are set to zero. In other words, the unitarity gauge minimizes the number of scalar degrees of freedom that appear explicitly in the Lagrangian.
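The elimination of the Goldstone components can be illustrated with the Standard Model Higgs doublet (a standard textbook parametrization; the symbols below follow common conventions rather than anything defined in this article):

```latex
% Polar parametrization of the Higgs doublet:
% pi^a are the three Goldstone fields, h the physical Higgs, v the vacuum expectation value
\Phi(x) \;=\; \exp\!\left(\frac{i\,\pi^a(x)\,\sigma^a}{2v}\right)
\frac{1}{\sqrt{2}}\begin{pmatrix} 0 \\ v + h(x) \end{pmatrix}

% The unitary gauge is reached by the SU(2) gauge transformation
U(x) \;=\; \exp\!\left(-\frac{i\,\pi^a(x)\,\sigma^a}{2v}\right),
\qquad
\Phi(x) \;\longrightarrow\; U(x)\,\Phi(x)
\;=\; \frac{1}{\sqrt{2}}\begin{pmatrix} 0 \\ v + h(x) \end{pmatrix}
```

The Goldstone fields \(\pi^a\) are absorbed into the longitudinal polarizations of the massive gauge bosons, leaving only the physical Higgs field \(h\).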

The gauge was introduced to particle physics by Steven Weinberg[1][2] in the context of the electroweak theory. In electroweak theory, the degrees of freedom in the unitarity gauge are the massive spin-1 W+, W− and Z bosons with three polarizations each, the photon with two polarizations, and the scalar Higgs boson.
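This enumeration can be checked by counting degrees of freedom before and after symmetry breaking (a standard bookkeeping exercise, not from the article itself):

```latex
% Before electroweak symmetry breaking:
% 4 massless gauge fields (W^1, W^2, W^3, B) x 2 transverse polarizations
% + 4 real components of the complex Higgs doublet
4 \times 2 \;+\; 4 \;=\; 12

% After symmetry breaking, in the unitarity gauge:
% 3 massive vectors (W^+, W^-, Z) x 3 polarizations
% + photon x 2 polarizations + 1 Higgs scalar
3 \times 3 \;+\; 2 \;+\; 1 \;=\; 12
```

The totals agree because the three Goldstone bosons are not removed from the theory; they reappear as the longitudinal polarizations of the W± and Z.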

The unitarity gauge is usually used in tree-level calculations. For loop calculations, other gauge choices, such as the 't Hooft–Feynman gauge, often reduce the mathematical complexity of the calculation.
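The reason loop calculations favor other gauges can be seen in the massive vector propagator (standard results for a vector boson of mass \(m\); the general \(R_\xi\) form is included for comparison and is not taken from this article):

```latex
% Massive vector propagator in the unitarity gauge:
D^{\mu\nu}_{\text{unitary}}(k) \;=\;
\frac{-i}{k^2 - m^2 + i\epsilon}
\left( g^{\mu\nu} - \frac{k^\mu k^\nu}{m^2} \right)

% In a general R_xi gauge (unitarity gauge is the limit xi -> infinity):
D^{\mu\nu}_{R_\xi}(k) \;=\;
\frac{-i}{k^2 - m^2 + i\epsilon}
\left( g^{\mu\nu} - (1-\xi)\,\frac{k^\mu k^\nu}{k^2 - \xi m^2} \right)
```

The \(k^\mu k^\nu / m^2\) term in the unitarity-gauge propagator does not fall off at large momenta, which complicates power counting in loop integrals; in the 't Hooft–Feynman gauge (\(\xi = 1\)) the propagator reduces to \(-i\,g^{\mu\nu}/(k^2 - m^2 + i\epsilon)\), at the cost of keeping Goldstone and ghost fields in intermediate steps.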

References

  1. Weinberg, Steven (1971). "Physical Processes in a Convergent Theory of the Weak and Electromagnetic Interactions". Physical Review Letters. 27 (24): 1688–1691. doi:10.1103/PhysRevLett.27.1688. Bibcode:1971PhRvL..27.1688W.
  2. Weinberg, Steven (1973). "General Theory of Broken Local Symmetries". Physical Review D. 7 (4): 1068–1082. doi:10.1103/PhysRevD.7.1068. Bibcode:1973PhRvD...7.1068W.