In mathematics, a quadratic irrational number (also known as a quadratic irrational or quadratic surd) is an irrational number that is a solution to some quadratic equation with rational coefficients which is irreducible over the rational numbers.[1] Since fractions in the coefficients of a quadratic equation can be cleared by multiplying both sides by their least common denominator, a quadratic irrational is an irrational root of some quadratic equation with integer coefficients. The quadratic irrational numbers, a subset of the complex numbers, are algebraic numbers of degree 2, and can therefore be expressed as
\frac{a+b\sqrt{c}}{d},
for integers a, b, c, d; with b, c and d non-zero, and with c square-free. When c is positive, we get real quadratic irrational numbers, while a negative c gives complex quadratic irrational numbers which are not real numbers. This defines an injection from the quadratic irrationals to quadruples of integers, so their cardinality is at most countable; since on the other hand every square root of a prime number is a distinct quadratic irrational, and there are countably many prime numbers, they are at least countably infinite; hence the quadratic irrationals are a countable set.
Quadratic irrationals are used in field theory to construct field extensions of the field of rational numbers Q. Given the square-free integer c, the augmentation of Q by quadratic irrationals using √c produces a quadratic field Q(√c). For example, the inverses of elements of Q(√c) are of the same form as the above algebraic numbers:
\frac{d}{a+b\sqrt{c}}=\frac{ad-bd\sqrt{c}}{a^2-b^2c}.
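For instance, taking a = 3, b = 1, c = 2 and d = 1 in this formula gives
\frac{1}{3+\sqrt{2}}=\frac{3-\sqrt{2}}{9-2}=\frac{3-\sqrt{2}}{7}.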
Quadratic irrationals have useful properties, especially in relation to continued fractions, where we have the result that all real quadratic irrationals, and only real quadratic irrationals, have periodic continued fraction forms. For example
\sqrt{3}=1.732\ldots=[1;1,2,1,2,1,2,\ldots]
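As an illustrative sketch (not part of the original article), the following Python function computes the first partial quotients of √n for a non-square natural number n, using the standard recurrence for quadratic surds; the function name cf_sqrt and the default number of terms are arbitrary choices.

from math import isqrt

def cf_sqrt(n, terms=12):
    # Partial quotients of sqrt(n) for non-square n, via the recurrence
    # m_{k+1} = d_k*a_k - m_k,  d_{k+1} = (n - m_{k+1}^2) / d_k,
    # a_{k+1} = floor((a_0 + m_{k+1}) / d_{k+1}).
    a0 = isqrt(n)
    if a0 * a0 == n:
        raise ValueError("n must not be a perfect square")
    quotients = [a0]
    m, d, a = 0, 1, a0
    for _ in range(terms - 1):
        m = d * a - m
        d = (n - m * m) // d
        a = (a0 + m) // d
        quotients.append(a)
    return quotients

print(cf_sqrt(3))   # begins 1, 1, 2, 1, 2, ...  (repeating block 1, 2)
print(cf_sqrt(14))  # begins 3, 1, 2, 1, 6, 1, 2, 1, 6, ...  (repeating block 1, 2, 1, 6)

The repetition visible in the partial quotients is the periodicity described above.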
The repeating tails of these expansions correspond to periodic orbits of the Gauss map
h(x)=1/x-\lfloor1/x\rfloor,
which deletes the first partial quotient from the continued fraction of a number in the interval (0, 1).
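For example, the fractional part of √3 has a period-two orbit under h, matching the repeating block 1, 2 in the expansion above:
h(\sqrt{3}-1)=\frac{1}{\sqrt{3}-1}-1=\frac{\sqrt{3}-1}{2},\qquad h\!\left(\frac{\sqrt{3}-1}{2}\right)=\frac{2}{\sqrt{3}-1}-2=\sqrt{3}-1.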
We may rewrite a quadratic irrationality as follows:
\frac{a+b\sqrt{c}}{d}=\frac{a+\sqrt{b^2c}}{d}.
It follows that every quadratic irrational number can be written in the form
\frac{a+\sqrt{c}}{d}.
This expression is not unique.
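For instance,
\frac{1+2\sqrt{3}}{2}=\frac{1+\sqrt{12}}{2}=\frac{2+\sqrt{48}}{4},
so the same quadratic irrational admits several representations of this shape.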
Fix a non-square, positive integer c congruent to 0 or 1 modulo 4, and define the set S_c as
S_c=\left\{\frac{a+\sqrt{c}}{d}:a,d\text{ integers with }d\text{ even and non-zero, and }c\equiv a^2\pmod{2d}\right\}.
Every quadratic irrationality lies in some set S_c, since the congruence condition can always be met by scaling the numerator and denominator by a suitable integer.
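For example, the golden ratio (1 + √5)/2 lies in S_5, since here a = 1, d = 2 and 5 ≡ 1² (mod 4).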
A matrix
\begin{pmatrix}\alpha&\beta\\ \gamma&\delta\end{pmatrix}
with integer entries and \alpha\delta-\beta\gamma=1 can be used to transform a number y in S_c. The transformed number is
z=\frac{\alpha y+\beta}{\gamma y+\delta}.
If y is in S_c, then z is too.
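For instance, the matrix
\begin{pmatrix}1&1\\ 0&1\end{pmatrix}
sends y = (1 + √5)/2 in S_5 to z = y + 1 = (3 + √5)/2, which again lies in S_5 (here a = 3, d = 2 and 5 ≡ 3² (mod 4)).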
The relation between y and z above is an equivalence relation (it arises from a group action of the integer matrices of determinant 1 on the set S_c), so S_c splits into equivalence classes of quadratic irrationalities.
There are finitely many equivalence classes of quadratic irrationalities in S_c. One way to see this is to consider the map \phi from binary quadratic forms of discriminant c to S_c given by
\phi(tx^2+uxy+vy^2)=\frac{-u+\sqrt{c}}{2t}.
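For example, the form x^2 + xy - y^2 has discriminant 1^2 - 4\cdot 1\cdot(-1) = 5 and
\phi(x^2+xy-y^2)=\frac{-1+\sqrt{5}}{2}.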
A computation shows that \phi is a bijection that respects the matrix action on each set, so the equivalence classes of quadratic irrationalities in S_c correspond to the equivalence classes of binary quadratic forms of discriminant c, of which there are only finitely many by a theorem of Lagrange.
Through the bijection \phi, expanding a number in S_c as a continued fraction corresponds to reducing the associated quadratic form, and the eventually periodic behaviour of the continued fraction is reflected in the eventually periodic behaviour of the form under reduction.
The definition of quadratic irrationals requires them to satisfy two conditions: they must satisfy a quadratic equation and they must be irrational. The solutions to the quadratic equation ax² + bx + c = 0 are
\frac{-b\pm\sqrt{b^2-4ac}}{2a}.
Thus quadratic irrationals are precisely those real numbers in this form that are not rational. Since b and 2a are both integers, asking when the above quantity is irrational is the same as asking when the square root of an integer is irrational. The answer to this is that the square root of any natural number that is not a square number is irrational.
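For example, the equation x² − x − 1 = 0 has the solutions (1 ± √5)/2, which are quadratic irrationals because 5 is not a square number.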
The square root of 2 was the first such number to be proved irrational. Theodorus of Cyrene proved the irrationality of the square roots of non-square natural numbers up to 17, but stopped there, probably because the algebra he used could not be applied to the square root of numbers greater than 17. Euclid's Elements Book 10 is dedicated to classification of irrational magnitudes. The original proof of the irrationality of the non-square natural numbers depends on Euclid's lemma.
Many proofs of the irrationality of the square roots of non-square natural numbers implicitly assume the fundamental theorem of arithmetic, which was first proven by Carl Friedrich Gauss in his Disquisitiones Arithmeticae. This asserts that every integer has a unique factorization into primes. For any rational non-integer in lowest terms there must be a prime in the denominator which does not divide into the numerator. When the numerator is squared that prime will still not divide into it because of the unique factorization. Therefore, the square of a rational non-integer is always a non-integer; by contrapositive, the square root of an integer is always either another integer, or irrational.
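(For instance, 7/3 is in lowest terms, and (7/3)² = 49/9 is again a non-integer because 3 still does not divide 49.)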
Euclid used a restricted version of the fundamental theorem and some careful argument to prove the theorem. His proof is in Euclid's Elements Book X Proposition 9.[2]
The fundamental theorem of arithmetic is not actually required to prove the result, however. There are self-contained proofs by Richard Dedekind,[3] among others. The following proof was adapted by Colin Richard Hughes from a proof of the irrationality of the square root of 2 found by Theodor Estermann in 1975.[4] [5]
If D is a non-square natural number, then there is a natural number n such that:
n² < D < (n + 1)²,
so in particular
0 < √D − n < 1.
If the square root of D is rational, then it can be written as the irreducible fraction p/q, so that q is the smallest possible denominator, and hence the smallest positive integer for which q√D is also an integer. Then:
(√D − n)q√D = qD − nq√D,
which is thus also an integer. But 0 < √D − n < 1, so (√D − n)q < q. Hence (√D − n)q is a positive integer smaller than q which, multiplied by √D, gives an integer. This is a contradiction, because q was defined to be the smallest such number. Therefore, √D cannot be rational.
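As a concrete instance of the argument, take D = 2, so n = 1. If √2 = p/q with q the smallest positive integer making q√2 an integer, then (√2 − 1)q = q√2 − q is a positive integer smaller than q, and (√2 − 1)q · √2 = 2q − q√2 is again an integer, contradicting the minimality of q.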