In theoretical physics, quantum field theory (QFT) is the theoretical framework for constructing quantum mechanical models of subatomic particles in particle physics and quasiparticles in condensed matter physics. It is a set of notions and mathematical tools that combines classical fields, special relativity, and quantum mechanics, and, when combined with the cluster decomposition principle, it may be the only way to do so while retaining the ideas of quantum point particles and locality. QFT was historically widely believed to be truly fundamental. It is now believed, primarily due to the continued failures of quantization of general relativity, to be only a very good low-energy approximation, i.e. an effective field theory, to some more fundamental theory.

QFT treats particles as excited states of an underlying field, and these excitations are called field quanta. In quantum field theory, quantum mechanical interactions among particles are described by interaction terms among the corresponding underlying quantum fields. These interactions are conveniently visualized by Feynman diagrams, which are a formal tool of relativistically covariant perturbation theory, serving to evaluate particle processes.

History

Even though QFT is an unavoidable consequence of the reconciliation of quantum mechanics with special relativity (Weinberg (1995)), historically, it emerged in the 1920s with the quantization of the electromagnetic field (the quantization being based on an analogy with the eigenmode expansion of a vibrating string with fixed endpoints).

Early development

The first achievement of quantum field theory, namely quantum electrodynamics (QED), is "still the paradigmatic example of a successful quantum field theory" (Weinberg (1995)). Ordinary quantum mechanics (QM) cannot give an account of photons, which constitute the prime case of relativistic 'particles'. Since photons have zero rest mass, and correspondingly travel in vacuum at the speed of light c, a non-relativistic theory such as ordinary QM cannot give even an approximate description. Photons are implicit in the emission and absorption processes which have to be postulated, for instance, when one of an atom's electrons makes a transition between energy levels. The formalism of QFT is needed for an explicit description of photons. In fact most topics in the early development of quantum theory (the so-called old quantum theory, 1900–25) were related to the interaction of radiation and matter and thus should be treated by quantum field theoretical methods. However, quantum mechanics as formulated by Dirac, Heisenberg, and Schrödinger in 1926–27 started from atomic spectra and did not focus much on problems of radiation.

As soon as the conceptual framework of quantum mechanics was developed, a small group of theoreticians tried to extend quantum methods to electromagnetic fields. A good example is the famous paper by Born, Jordan & Heisenberg (1926). (P. Jordan was especially acquainted with the literature on light quanta and made seminal contributions to QFT.) The basic idea was that in QFT the electromagnetic field should be represented by matrices in the same way that position and momentum were represented in QM by matrices (the oscillator operators of matrix mechanics). The ideas of QM were thus extended to systems having an infinite number of degrees of freedom, i.e. an infinite array of quantum oscillators.

The inception of QFT is usually considered to be Dirac's famous 1927 paper on "The quantum theory of the emission and absorption of radiation". Here Dirac coined the name "quantum electrodynamics" (QED) for the part of QFT that was developed first. Dirac supplied a systematic procedure for transferring the characteristic quantum phenomenon of discreteness of physical quantities from the quantum-mechanical treatment of particles to a corresponding treatment of fields. Employing the theory of the quantum harmonic oscillator, Dirac gave a theoretical description of how photons appear in the quantization of the electromagnetic radiation field. Later, Dirac's procedure became a model for the quantization of other fields as well. These first approaches to QFT were further developed during the following three years. P. Jordan introduced creation and annihilation operators for fields obeying Fermi–Dirac statistics. These differ from the corresponding operators for Bose–Einstein statistics in that the former satisfy anti-commutation relations while the latter satisfy commutation relations.

The methods of QFT could be applied to derive equations resulting from the quantum-mechanical (field-like) treatment of particles, e.g. the Dirac equation, the Klein–Gordon equation and the Maxwell equations. Schweber points out that the idea and procedure of second quantization goes back to Jordan, in a number of papers from 1927, while the expression itself was coined by Dirac. Some difficult problems concerning commutation relations, statistics, and Lorentz invariance were eventually solved. The first comprehensive account of a general theory of quantum fields, in particular, the method of canonical quantization, was presented by Heisenberg & Pauli in 1929–30. Whereas Jordan's second quantization procedure applied to the coefficients of the normal modes of the field, Heisenberg & Pauli started with the fields themselves and subjected them to the canonical procedure. Heisenberg and Pauli thus established the basic structure of QFT as presented in modern introductions to QFT. Fermi and Dirac, as well as Fock and Podolsky, presented different formulations which played a heuristic role in the following years.

Quantum electrodynamics rests on two pillars; see, e.g., the short and lucid "Historical Introduction" of Scharf (2014). The first pillar is the quantization of the electromagnetic field, i.e., it is about photons as the quantized excitations or 'quanta' of the electromagnetic field. This procedure will be described in some more detail in the section on the particle interpretation. As Weinberg points out, the "photon is the only particle that was known as a field before it was detected as a particle", so that it is natural that QED began with the analysis of the radiation field. The second pillar of QED consists of the relativistic theory of the electron, centered on the Dirac equation.

The problem of infinities

The emergence of infinities

Quantum field theory started with a theoretical framework that was built in analogy to quantum mechanics. Although there was no unique and fully developed theory, quantum field theoretical tools could be applied to concrete processes. Examples are the scattering of radiation by free electrons, Compton scattering, the collision between relativistic electrons or the production of electron–positron pairs by photons. Calculations to the first order of approximation were quite successful, but most people working in the field thought that QFT still had to undergo a major change. On the one side, some calculations of effects for cosmic rays clearly differed from measurements. On the other side, and more threatening from a theoretical point of view, calculations of higher orders of the perturbation series led to infinite results. The self-energy of the electron as well as vacuum fluctuations of the electromagnetic field seemed to be infinite. The perturbation expansions did not converge to a finite sum and even most individual terms were divergent.

The various forms of infinities suggested that the divergences were more than failures of specific calculations. Many physicists tried to avoid the divergences by formal tricks (truncating the integrals at some value of momentum, or even ignoring infinite terms), but such rules were not reliable, violated the requirements of relativity, and were not considered satisfactory. Others came up with the first ideas for coping with infinities by a redefinition of the parameters of the theory and using a measured finite value, for example of the charge of the electron, instead of the infinite 'bare' value. This process is called renormalization.

From the point of view of the philosophy of science, it is remarkable that these divergences did not give enough reason to discard the theory. The years from 1930 to the beginning of World War II were characterized by a variety of attitudes towards QFT. Some physicists tried to circumvent the infinities by more-or-less arbitrary prescriptions, others worked on transformations and improvements of the theoretical framework. Most of the theoreticians believed that QED would break down at high energies. There was also a considerable number of proposals in favor of alternative approaches. These proposals included changes in the basic concepts, e.g. negative probabilities and interactions at a distance instead of a field-theoretical approach, as well as a methodological change to phenomenological methods that focus on relations between observable quantities without an analysis of the microphysical details of the interaction, the so-called S-matrix theory, where the basic elements are amplitudes for various scattering processes.

Despite the feeling that QFT was imperfect and lacking rigor, its methods were extended to new areas of application. In 1933, Fermi's theory of beta decay started from concepts describing the emission and absorption of photons, transferred them to beta radiation, and analyzed the creation and annihilation of electrons and neutrinos described by the weak interaction. Further applications of QFT outside of quantum electrodynamics succeeded in nuclear physics with the strong interaction. In 1934, Pauli & Weisskopf showed that a new type of field, the scalar field, described by the Klein–Gordon equation, could be quantized. This is another example of second quantization. This new theory for matter fields could be applied a decade later when new particles, pions, were detected.

The taming of infinities

After the end of World War II more reliable and effective methods for dealing with infinities in QFT were developed, namely coherent and systematic rules for performing relativistic field theoretical calculations, and a general renormalization theory. At three famous conferences, the Shelter Island Conference 1947, the Pocono Conference 1948, and the 1949 Oldstone Conference, developments in theoretical physics were confronted with relevant new experimental results. In the late forties, there were two different ways to address the problem of divergences. One of these was discovered by Richard Feynman, the other one (based on an operator formalism) by Julian Schwinger and, independently, by Shin'ichirō Tomonaga.

In 1949, Freeman Dyson showed that the two approaches are in fact equivalent and fit into an elegant field-theoretic framework. Thus, Freeman Dyson, Feynman, Schwinger, and Tomonaga became the inventors of renormalization theory. The most spectacular successes of renormalization theory were the calculations of the anomalous magnetic moment of the electron and the Lamb shift in the spectrum of hydrogen. These successes were so outstanding because the theoretical results were in better agreement with high-precision experiments than anything in physics encountered before. Nevertheless, mathematical problems lingered on and prompted a search for rigorous formulations (discussed below).

The rationale behind renormalization is to avoid divergences that appear in physical predictions by shifting them into a part of the theory where they do not influence empirical statements. Dyson could show that a rescaling of charge and mass ('renormalization') is sufficient to remove all divergences in QED consistently, to all orders of perturbation theory. A QFT is called renormalizable if all infinities can be absorbed into a redefinition of a finite number of coupling constants and masses. A consequence for QED is that the physical charge and mass of the electron must be measured and cannot be computed from first principles.

Perturbation theory yields well-defined predictions only in renormalizable quantum field theories; luckily, QED, the first fully developed QFT, belonged to this class of renormalizable theories. There are various technical procedures to renormalize a theory. One way is to cut off the integrals in the calculations at a certain value Λ of the momentum which is large but finite. This cut-off procedure is successful if, after taking the limit Λ → ∞, the resulting quantities are independent of Λ.
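
As a schematic illustration (a standard textbook integral, not a calculation specific to this article), a logarithmically divergent one-loop integral regulated by such a cutoff behaves as follows; renormalization then absorbs the ln Λ piece into a redefined coupling so that predictions expressed in terms of measured quantities stay finite:

```latex
% Euclidean one-loop integral with momentum cutoff \Lambda (schematic):
\int^{\Lambda} \frac{d^4 k}{(2\pi)^4}\,\frac{1}{(k^2 + m^2)^2}
  = \frac{1}{16\pi^2}\left[\ln\frac{\Lambda^2 + m^2}{m^2}
    - \frac{\Lambda^2}{\Lambda^2 + m^2}\right]
  \;\sim\; \frac{1}{16\pi^2}\,\ln\frac{\Lambda^2}{m^2}
  \qquad (\Lambda \to \infty).
```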

Feynman's formulation of QED is of special interest from a philosophical point of view. His so-called space-time approach is visualized by the celebrated Feynman diagrams, which appear to depict paths of particles. Feynman's method of calculating scattering amplitudes is based on the functional integral formulation of field theory. A set of graphical rules can be derived so that the probability of a specific scattering process can be calculated by drawing a diagram of that process and then using that diagram to write down the precise mathematical expressions for calculating its amplitude in relativistically covariant perturbation theory.

The diagrams provide an effective way to organize and visualize the various terms in the perturbation series, and they naturally account for the flow of electrons and photons during the scattering process. External lines in the diagrams represent incoming and outgoing particles, internal lines are connected with virtual particles and vertices with interactions. Each of these graphical elements is associated with mathematical expressions that contribute to the amplitude of the respective process. The diagrams are part of Feynman's very efficient and elegant algorithm for computing the probability of scattering processes.
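
As a minimal sketch of this translation (using the standard tree-level QED rules; the particular process, electron–electron scattering by one-photon exchange, is chosen here purely for illustration):

```latex
% One tree-level diagram for e^- e^- \to e^- e^-: external lines give
% spinors u, \bar{u}; each vertex gives -ie\gamma^\mu; the internal photon
% line of momentum q gives the propagator -ig_{\mu\nu}/q^2.
i\mathcal{M} = \left[\bar{u}(p_1')(-ie\gamma^\mu)u(p_1)\right]
  \frac{-i g_{\mu\nu}}{q^2}
  \left[\bar{u}(p_2')(-ie\gamma^\nu)u(p_2)\right],
  \qquad q = p_1 - p_1'.
```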

The idea of particles traveling from one point to another was heuristically useful in constructing the theory. This heuristic, based on Huygens' principle, is useful for concrete calculations and actually gives the correct particle propagators as derived more rigorously. Nevertheless, an analysis of the theoretical justification of the space-time approach shows that its success does not imply that particle paths need be taken seriously. General arguments against a particle interpretation of QFT clearly exclude that the diagrams represent actual paths of particles in the interaction area. Feynman himself was not particularly interested in ontological questions.

The golden age: gauge theory and the standard model

In 1933, Enrico Fermi had already established that the creation, annihilation and transmutation of particles in the weak interaction beta decay could best be described in QFT, specifically by his quartic fermion interaction. As a result, field theory had become a prospective tool for other particle interactions. In the beginning of the 1950s, QED had become a reliable theory which no longer counted as preliminary. However, it took two decades from writing down the first equations until QFT could be applied successfully to important physical problems in a systematic way.

The theories explored relied on—indeed, were virtually fully specified by—a rich variety of symmetries pioneered and articulated by Murray Gell-Mann. The new developments made it possible to apply QFT to new particles and new interactions and fully explain their structure.

In the following decades, QFT was extended to describe not only the electromagnetic force but also the weak and strong interactions, so that new Lagrangians were found which contain new classes of particles or quantum fields. The search still continues for a more comprehensive theory of matter and energy, a unified theory of all interactions.

The new focus on symmetry led to the triumph of non-Abelian gauge theories (the development of such theories was pioneered in 1954–60 with the work of Yang and Mills; see Yang–Mills theory) and spontaneous symmetry breaking (by Yoichiro Nambu). Today, there are reliable theories of the strong, weak, and electromagnetic interactions of elementary particles which have an analogous structure to QED: they are the dominant framework of particle physics.

A combined renormalizable theory associated with the gauge group SU(3) × SU(2) × U(1) is dubbed the standard model of elementary particle physics (even though it is a full theory, and not just a model) and was assembled by Sheldon Glashow, Steven Weinberg and Abdus Salam in 1959–67 (see Electroweak unification), and Frank Wilczek, David Gross and David Politzer in 1973 (see Asymptotic freedom), on the basis of conceptual breakthroughs by Peter Higgs, François Englert, Robert Brout, Martin Veltman, and Gerard 't Hooft.

According to the standard model, there are, on the one hand, six types of leptons (e.g. the electron and its neutrino) and six types of quarks, where the members of both groups are all fermions with spin 1/2. On the other hand, there are spin 1 particles (thus bosons) that mediate the interaction between elementary particles and the fundamental forces, namely the photon for the electromagnetic interaction, two W bosons and one Z boson for the weak interaction, and the gluons for the strong interaction. The linchpin of the symmetry-breaking mechanism of the theory is the spin 0 Higgs boson, discovered 40 years after its prediction.

Renormalization group

Parallel breakthroughs in the understanding of phase transitions in condensed matter physics led to novel insights based on the renormalization group. They emerged in the work of Leo Kadanoff (1966) and Kenneth Geddes Wilson & Michael Fisher (1972)—extending the work of Ernst Stueckelberg–André Petermann (1953) and Murray Gell-Mann–Francis Low (1954)—which led to the seminal reformulation of quantum field theory by Kenneth Geddes Wilson in 1975. This reformulation provided insights into the evolution of effective field theories with scale, which classified all field theories, renormalizable or not (cf. subsequent section). The remarkable conclusion is that, in general, most observables are "irrelevant", i.e., the macroscopic physics is dominated by only a few observables in most systems.

During the same period, Kadanoff (1969) introduced an operator algebra formalism for the two-dimensional Ising model, a widely studied mathematical model of ferromagnetism in statistical physics. This development suggested that quantum field theory describes its scaling limit. Later, there developed the idea that a finite number of generating operators could represent all the correlation functions of the Ising model.

Conformal field theory

The existence of a much stronger symmetry for the scaling limit of two-dimensional critical systems was suggested by Alexander Belavin, Alexander Polyakov and Alexander Zamolodchikov in 1984, which eventually led to the development of conformal field theory, a special case of quantum field theory, which is presently utilized in different areas of particle physics and condensed matter physics.

Historiography

The first chapter in Weinberg (1995) is a very good short description of the earlier history of QFT. A detailed account of the historical development of QFT can be found in Schweber (1994).

Varieties of approaches

Most theories in standard particle physics are formulated as relativistic quantum field theories, such as QED, QCD, and the Standard Model. QED, the quantum field-theoretic description of the electromagnetic field, approximately reproduces Maxwell's theory of electrodynamics in the low-energy limit, with small non-linear corrections to the Maxwell equations required due to virtual electron–positron pairs.

Perturbative and non-perturbative approaches

In the perturbative approach to quantum field theory, the full field interaction terms are approximated as a perturbative expansion in the number of particles involved. Each term in the expansion can be thought of as forces between particles being mediated by other particles. In QED, the electromagnetic force between two electrons is caused by an exchange of photons. Similarly, intermediate vector bosons mediate the weak force and gluons mediate the strong force in QCD. The notion of a force-mediating particle comes from perturbation theory, and does not make sense in the context of non-perturbative approaches to QFT, such as with bound states.
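
Schematically, an observable is computed as a power series in the coupling, with each order collecting the Feynman diagrams of the corresponding complexity. A classic instance (quoted here for illustration; the leading term is Schwinger's) is the electron's anomalous magnetic moment:

```latex
% Perturbative series in the fine-structure constant \alpha \approx 1/137:
a_e \;=\; \frac{g_e - 2}{2}
    \;=\; \frac{\alpha}{2\pi}
    \;-\; 0.32848\ldots\left(\frac{\alpha}{\pi}\right)^{2}
    \;+\; \cdots
```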

QFT and gravity

There is currently no complete quantum theory of the remaining fundamental force, gravity. Many of the proposed theories to describe gravity as a QFT postulate the existence of a graviton particle that mediates the gravitational force. Presumably, the as yet unknown correct quantum field-theoretic treatment of the gravitational field will behave like Einstein's general theory of relativity in the low-energy limit. Quantum field theory of the fundamental forces itself has been postulated to be the low-energy effective field theory limit of a more fundamental theory such as superstring theory.

Definition

Quantum electrodynamics (QED) has one electron field and one photon field; quantum chromodynamics (QCD) has one field for each type of quark; and, in condensed matter, there is an atomic displacement field that gives rise to phonon particles. Edward Witten describes QFT as "by far" the most difficult theory in modern physics – "so difficult that nobody fully believed it for 25 years."

Dynamics

Ordinary quantum mechanical systems have a fixed number of particles, with each particle having a finite number of degrees of freedom. In contrast, the excited states of a quantum field can represent any number of particles. This makes quantum field theories especially useful for describing systems where the particle number may change over time, a crucial feature of relativistic dynamics. A QFT is thus an organized infinite array of oscillators.
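
A minimal numerical sketch of this picture (an illustration added here, not drawn from the text): a single field mode is a harmonic oscillator whose n-th excitation is read as an n-particle state, and truncating the infinite-dimensional space to a finite matrix makes the ladder-operator algebra easy to check.

```python
# Sketch: one bosonic field mode as a truncated quantum harmonic oscillator.
import numpy as np

dim = 8  # truncate the infinite-dimensional Fock space for illustration
a = np.diag(np.sqrt(np.arange(1, dim)), k=1)  # annihilation: a|n> = sqrt(n)|n-1>
adag = a.conj().T                             # creation operator a†
number = adag @ a                             # number operator N = a† a

vacuum = np.zeros(dim)
vacuum[0] = 1.0
two_particle = adag @ adag @ vacuum / np.sqrt(2)  # normalized two-quantum state |2>

print(np.allclose(number @ two_particle, 2 * two_particle))  # True: N|2> = 2|2>
comm = a @ adag - adag @ a   # [a, a†] = 1 except in the highest truncated level
print(np.allclose(comm[:-1, :-1], np.eye(dim - 1)))          # True
```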

States

QFT interaction terms are similar in spirit to those between charges with electric and magnetic fields in Maxwell's equations. However, unlike the classical fields of Maxwell's theory, fields in QFT generally exist in quantum superpositions of states and are subject to the laws of quantum mechanics.

Because the fields are continuous quantities over space, there exist excited states with arbitrarily large numbers of particles in them, providing QFT systems with effectively an infinite number of degrees of freedom. Infinite degrees of freedom can easily lead to divergences of calculated quantities (e.g., the quantities become infinite). Techniques such as renormalization of QFT parameters or discretization of spacetime, as in lattice QCD, are often used to avoid such infinities so as to yield physically plausible results.

Fields and radiation

The gravitational field and the electromagnetic field are the only two fundamental fields in nature that have infinite range and a corresponding classical low-energy limit, which greatly diminishes and hides their "particle-like" excitations. In 1905, Albert Einstein attributed "particle-like" and discrete exchanges of momenta and energy, characteristic of "field quanta", to the electromagnetic field. Originally, his principal motivation was to explain the thermodynamics of radiation. Although the photoelectric effect and Compton scattering strongly suggest the existence of the photon, they might alternatively be explained by a mere quantization of emission; more definitive evidence of the quantum nature of radiation is now taken up into modern quantum optics, as in the antibunching effect.

Principles

Associated phenomena

Beyond the most general features of quantum field theories, special aspects such as renormalizability, gauge symmetry, and supersymmetry are outlined below.

Renormalization

Early in the history of quantum field theory, as detailed above, it was found that many seemingly innocuous calculations, such as the perturbative shift in the energy of an electron due to the presence of the electromagnetic field, yield infinite results. The reason is that the perturbation theory for the shift in an energy involves a sum over all other energy levels, and there are infinitely many levels at short distances, so that, even though each gives a finite contribution, the sum diverges.
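
In second-order perturbation theory, for example, the shift of a level involves a sum over all intermediate states, and it is this unbounded sum that can diverge:

```latex
% Second-order shift of the state |0\rangle under an interaction H_{int} (schematic):
\Delta E_0 \;=\; \sum_{n \neq 0}
  \frac{\left|\langle n \,|\, H_{\mathrm{int}} \,|\, 0 \rangle\right|^{2}}{E_0 - E_n}
```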

Many of these problems are related to failures in classical electrodynamics that were identified but unsolved in the 19th century, and they basically stem from the fact that many of the supposedly "intrinsic" properties of an electron are tied to the electromagnetic field that it carries around with it. The energy carried by a single electron—its self-energy—is not simply the bare value, but also includes the energy contained in its electromagnetic field, its attendant cloud of photons. The energy in a field of a spherical source diverges in both classical and quantum mechanics, but as discovered by Weisskopf with help from Furry, in quantum mechanics the divergence is much milder, going only as the logarithm of the radius of the sphere.
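
Schematically, for a charge smeared over a small radius r (orders of magnitude only; the logarithmic form is Weisskopf's quantum result):

```latex
% Classical vs. quantum electron self-energy (schematic):
E_{\mathrm{classical}} \sim \frac{e^2}{4\pi\varepsilon_0\, r}
  \;\xrightarrow{\;r \to 0\;}\; \infty,
\qquad
\delta m_{\mathrm{QED}} \sim m\,\alpha\,\ln\frac{\hbar}{m c\, r}
  \quad \text{(only logarithmic in } r\text{)}.
```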

The solution to the problem, presciently suggested by Stueckelberg, independently by Bethe after the crucial experiment by Lamb and Retherford (the Lamb–Retherford experiment), implemented at one loop by Schwinger, and systematically extended to all loops by Feynman and Dyson, with converging work by Tomonaga in isolated postwar Japan, comes from recognizing that all the infinities in the interactions of photons and electrons can be isolated into a redefinition of a finite number of quantities in the equations, replacing them with the observed values: specifically the electron's mass and charge. This is called renormalization. The technique of renormalization recognizes that the problem is tractable and essentially purely mathematical; and that, physically, extremely short distances are at fault.

In order to define a theory on a continuum, one may first place a cutoff on the fields, by postulating that quanta cannot have energies above some extremely high value. This has the effect of replacing continuous space by a structure where very short wavelengths do not exist, as on a lattice. Lattices break rotational symmetry, and one of the crucial contributions made by Feynman, Pauli and Villars, and modernized by 't Hooft and Veltman, is a symmetry-preserving cutoff for perturbation theory (this process is called regularization). There is no known symmetrical cutoff outside of perturbation theory, so for rigorous or numerical work people often use an actual lattice.

On a lattice, every quantity is finite but depends on the spacing. When taking the limit to zero spacing, one makes sure that the physically observable quantities like the observed electron mass stay fixed, which means that the constants in the Lagrangian defining the theory depend on the spacing. By allowing the constants to vary with the lattice spacing, all the results at long distances become insensitive to the lattice, defining a continuum limit.
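
Schematically, in a scalar theory with quartic coupling λ the one-loop mass correction grows like 1/a², so the bare mass must be tuned with the spacing to hold the physical mass fixed (the constant c depends on the details of the lattice regularization; this example is an assumption for illustration, not taken from the text):

```latex
% Tuning a bare Lagrangian parameter with the lattice spacing a (one loop, schematic):
m_0^2(a) \;\approx\; m_{\mathrm{phys}}^2 \;-\; c\,\frac{\lambda}{a^{2}},
\qquad m_{\mathrm{phys}} \text{ held fixed as } a \to 0.
```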

The renormalization procedure only works for a certain limited class of quantum field theories, called renormalizable quantum field theories. A theory is perturbatively renormalizable when the constants in the Lagrangian only diverge at worst as logarithms of the lattice spacing for very short spacings. The continuum limit is then well defined in perturbation theory, and even if it is not fully well defined non-perturbatively, the problems only show up at distance scales that are exponentially small in the inverse coupling for weak couplings. The Standard Model of particle physics is perturbatively renormalizable, and so are its component theories (quantum electrodynamics/electroweak theory and quantum chromodynamics). Of the three components, quantum electrodynamics is believed to not have a continuum limit by itself, while the asymptotically free SU(2) and SU(3) weak and strong color interactions are nonperturbatively well defined.

The renormalization group, as developed along Wilson's breakthrough insights, relates effective field theories at a given scale to those at contiguous scales. It thus describes how renormalizable theories emerge as the long-distance, low-energy effective field theory for any given high-energy theory. As a consequence, renormalizable theories are insensitive to the precise nature of the underlying high-energy short-distance phenomena (the macroscopic physics is dominated by only a few "relevant" observables). This is a blessing in practical terms, because it allows physicists to formulate low-energy theories without detailed knowledge of high-energy phenomena. It is also a curse, because once a renormalizable theory such as the standard model is found to work, it provides very few clues to higher-energy processes.
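
As a toy illustration of such scale evolution (assuming the one-loop QED beta function with a single charged fermion, so the numbers understate the real-world running), one can integrate the renormalization group equation numerically and compare with its analytic solution:

```python
# Sketch: one-loop running of the QED coupling, d(alpha)/d(ln mu) = (2/3pi) alpha^2.
import numpy as np

def run_alpha(alpha0, mu0, mu, steps=10_000):
    """Euler-integrate the one-loop beta function from scale mu0 up to mu."""
    dt = np.log(mu / mu0) / steps            # step in t = ln(mu)
    alpha = alpha0
    for _ in range(steps):
        alpha += (2.0 / (3.0 * np.pi)) * alpha**2 * dt
    return alpha

alpha_me = 1 / 137.036                        # coupling near the electron-mass scale
alpha_mz = run_alpha(alpha_me, mu0=0.511e-3, mu=91.19)  # run up to ~M_Z (in GeV)

# One-loop analytic solution for comparison:
exact = alpha_me / (1 - (2 * alpha_me / (3 * np.pi)) * np.log(91.19 / 0.511e-3))
print(alpha_mz, exact)  # both ~ 0.00744 (about 1/134.5): the coupling grows with energy
```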

The only way high-energy processes can be seen in the standard model is when they allow otherwise forbidden events, or else if they reveal predicted compelling quantitative relations among the coupling constants of the theories or models.

On account of renormalization, the couplings of QFT vary with scale, thereby confining quarks into hadrons, allowing the study of weakly-coupled quarks inside hadrons, and enabling speculation on ultra-high energy behavior.

Gauge freedom

A gauge theory is a theory that admits a symmetry with a local parameter. For example, in every quantum theory, the global phase of the wave function is arbitrary and does not represent something physical. Consequently, the theory is invariant under a global change of phases (adding a constant to the phase of all wave functions, everywhere); this is a global symmetry. In quantum electrodynamics, the theory is also invariant under a local change of phase; that is, one may shift the phase of all wave functions so that the shift may be different at every point in space-time. This is a local symmetry. However, in order for a well-defined derivative operator to exist, one must introduce a new field, the gauge field, which also transforms in order for the local change of variables (the phase in our example) not to affect the derivative. In quantum electrodynamics, this gauge field is the electromagnetic field. Such a change of local variables is termed a gauge transformation.
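
In the QED example, this structure takes the standard textbook form:

```latex
% Local U(1) phase transformation, the compensating gauge-field shift, and the
% covariant derivative built so that D_\mu\psi transforms like \psi itself:
\psi(x) \to e^{\,i\alpha(x)}\,\psi(x), \qquad
A_\mu(x) \to A_\mu(x) - \tfrac{1}{e}\,\partial_\mu \alpha(x), \qquad
D_\mu = \partial_\mu + i e A_\mu .
```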

By Noether's theorem, for every such symmetry there exists an associated conserved current. The aforementioned symmetry of the wavefunction under global phase changes implies the conservation of electric charge. Since the excitations of fields represent particles, the particle associated with excitations of the gauge field is the gauge boson, e.g., the photon in the case of quantum electrodynamics.
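
For the electron field, the conserved current and charge take the familiar form (written here without the conventional factor of e):

```latex
% Noether current of the global U(1) phase symmetry and its conserved charge:
j^\mu = \bar{\psi}\,\gamma^\mu \psi, \qquad
\partial_\mu j^\mu = 0, \qquad
Q = \int d^{3}x \; j^{0} = \text{constant in time}.
```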

The degrees of freedom in quantum field theory are local fluctuations of the fields. The existence of a gauge symmetry reduces the number of degrees of freedom, simply because some fluctuations of the fields can be transformed to zero by gauge transformations, so they are equivalent to having no fluctuations at all, and they, therefore, have no physical meaning. Such fluctuations are usually called "non-physical degrees of freedom" or gauge artifacts; usually, some of them have a negative norm, making them inadequate for a consistent theory. Therefore, if a classical field theory has a gauge symmetry, then its quantized version (the corresponding quantum field theory) will have this symmetry as well. In other words, a gauge symmetry cannot have a quantum anomaly.

In general, the gauge transformations of a theory consist of several different transformations, which may not be commutative. These transformations are combined into the framework of a gauge group; infinitesimal gauge transformations are the gauge group generators. Thus, the number of gauge bosons is the group dimension (i.e., the number of generators forming the basis of the corresponding Lie algebra).
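
Concretely, the generators close under commutation into a Lie algebra; for SU(N) there are N² − 1 of them, which is why SU(3) yields eight gluons in the list below:

```latex
% Generators T^a and structure constants f^{abc} of the gauge group:
\left[T^{a}, T^{b}\right] = i f^{abc}\, T^{c}, \qquad
\dim SU(N) = N^{2} - 1 .
```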

All the known fundamental interactions in nature are described by gauge theories (possibly barring the Higgs multiplet couplings, if considered in isolation). These are:

  • Quantum chromodynamics, whose gauge group is SU(3). The gauge bosons are eight gluons.
  • The electroweak theory, whose gauge group is U(1) × SU(2) (a direct product of U(1) and SU(2)). The gauge bosons are the photon and the massive W± and Z⁰ bosons.
  • Gravity, whose classical theory is general relativity, relies on the equivalence principle, which is essentially a form of gauge symmetry. Its action may also be written as a gauge theory of the Lorentz group on tangent space.

Supersymmetry

Supersymmetry assumes that every fundamental fermion has a superpartner that is a boson and vice versa. Its gauge theory, supergravity, is an extension of general relativity. Supersymmetry is a key ingredient for the consistency of string theory.

It was utilized in order to solve the so-called hierarchy problem of the standard model, that is, to explain why particles not protected by any symmetry (like the Higgs boson) do not receive radiative corrections to their mass driving it to the larger scales such as that of GUTs, or the Planck mass of gravity. The way supersymmetry protects scale hierarchies is the following: since for every particle there is a superpartner with the same mass but different statistics, any loop in a radiative correction is cancelled by the loop corresponding to its superpartner, rendering the theory UV finite.
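
Schematically, the quadratically divergent contributions to the Higgs mass from a fermion loop (Yukawa coupling λ_f) and from its scalar superpartner loop (quartic coupling λ_s) come with opposite signs and cancel when supersymmetry relates the couplings:

```latex
% Cancellation of quadratic divergences at cutoff \Lambda (schematic):
\delta m_{H}^{2} \;\propto\; \left(\lambda_{s} - \lambda_{f}^{2}\right)\Lambda^{2}
\;\longrightarrow\; 0
\qquad \text{when } \lambda_{s} = \lambda_{f}^{2}.
```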

Since, however, no superpartners have been observed, if supersymmetry exists it must be badly broken (through a so-called soft term, which breaks supersymmetry without ruining its helpful features). The simplest models of this breaking require that the energy of the superpartners not be too high; in these cases, supersymmetry could be observed by experiments at the Large Hadron Collider. However, to date, after the observation of the Higgs boson there, no such superparticles have been discovered.

Axiomatic approaches

The preceding description of quantum field theory follows the spirit in which most physicists approach the subject. However, it is not mathematically rigorous. Over the past several decades, there have been many attempts to put quantum field theory on a firm mathematical footing by formulating a set of axioms for it. Finding proper axioms for quantum field theory is still an open and difficult problem in mathematics. One of the Millennium Prize Problems—proving the existence of a mass gap in Yang–Mills theory—is linked to this issue. These attempts fall into two broad classes.

Wightman axioms

The first class of axioms, first proposed during the 1950s, includes the Wightman, Osterwalder–Schrader, and Haag–Kastler systems. They attempted to formalize the physicists' notion of an "operator-valued field" within the context of functional analysis and enjoyed limited success. It was possible to prove that any quantum field theory satisfying these axioms satisfied certain general theorems, such as the spin-statistics theorem and the CPT theorem. Unfortunately, it proved extraordinarily difficult to show that any realistic field theory, including the Standard Model, satisfied these axioms. Most of the theories that could be treated with these analytic axioms were physically trivial, being restricted to low dimensions and lacking interesting dynamics. The construction of theories satisfying one of these sets of axioms falls in the field of constructive quantum field theory. Important work was done in this area in the 1970s by Segal, Glimm, Jaffe and others.

Topological quantum field theory

During the 1980s, a second set of axioms based on topological ideas was proposed. Before 1980, all states of matter could be classified by geometry and the principle of broken symmetry. For example, Einstein's theory of general relativity is based on the geometrical curvature of space and time, while crystals, magnets and superconductors can all be classified by the symmetries they break. In 1980, the quantum Hall effect provided the first example of a state of matter that has no spontaneously broken symmetry; its characterization depends on its topology and not on its geometry (see geometry versus topology). The quantum Hall effect can be described by extending quantum field theory into an effective topological quantum field theory based on Chern–Simons theory. This line of investigation, which extends quantum field theories to topological quantum field theories, is associated most closely with Michael Atiyah and Graeme Segal, and was notably expanded upon by Edward Witten, Richard Borcherds, and Maxim Kontsevich. The main impact of topological quantum field theory has been in condensed matter physics, where physicists have observed exotic quasiparticles such as magnetic monopoles and Majorana fermions. Topological field considerations could have radical applications in a new form of electronics called spintronics and in topological quantum computers. The Standard Model allows for topological terms but is generally not formulated as a topological quantum field theory.

Topological quantum field theory has also had broad impact in mathematics, with important applications in representation theory, algebraic topology, and differential geometry.

Haag's theorem

From a mathematically rigorous perspective, there exists no interaction picture in a Lorentz-covariant quantum field theory. This implies that the perturbative approach of Feynman diagrams in QFT is not strictly justified, despite producing extremely precise predictions validated by experiment. This is called Haag's theorem, but most particle physicists relying on QFT largely shrug it off, as it does not in practice limit the power of the theory.