Newton's identities


In mathematics, Newton's identities, also known as the Newton–Girard formulae, give relations between two types of symmetric polynomials, namely between power sums and elementary symmetric polynomials. Evaluated at the roots of a monic polynomial P in one variable, they allow expressing the sums of the k-th powers of all roots of P (counted with their multiplicity) in terms of the coefficients of P, without actually finding those roots. These identities were found by Isaac Newton around 1666, apparently in ignorance of earlier work (1629) by Albert Girard. They have applications in many areas of mathematics, including Galois theory, invariant theory, group theory, combinatorics, as well as further applications outside mathematics, including general relativity.

Mathematical statement

Formulation in terms of symmetric polynomials

Let x1, …, xn be variables, and for k ≥ 1 denote by pk(x1,…,xn) the k-th power sum:

p_k(x_1,\ldots,x_n) = \sum_{i=1}^{n} x_i^k = x_1^k + x_2^k + \cdots + x_n^k,

and for k ≥ 0 denote by ek(x1,…,xn) the elementary symmetric polynomial that is the sum of all distinct products of k distinct variables, so in particular

e_0(x_1,\ldots,x_n) = 1,
e_1(x_1,\ldots,x_n) = x_1 + x_2 + \cdots + x_n,
e_2(x_1,\ldots,x_n) = \sum_{1 \le i < j \le n} x_i x_j,
⋮
e_n(x_1,\ldots,x_n) = x_1 x_2 \cdots x_n,
e_k(x_1,\ldots,x_n) = 0 \quad\text{for } k > n.

Then Newton's identities can be stated as

k\, e_k(x_1,\ldots,x_n) = \sum_{i=1}^{k} (-1)^{i-1}\, e_{k-i}(x_1,\ldots,x_n)\, p_i(x_1,\ldots,x_n),

valid for all n ≥ k ≥ 1. Concretely, one gets for the first few values of k:

e_1 = p_1,
2e_2 = e_1 p_1 - p_2,
3e_3 = e_2 p_1 - e_1 p_2 + p_3.

The form and validity of these equations do not depend on the number n of variables (although the point where the left-hand side becomes 0 does, namely after the n-th identity), which makes it possible to state them as identities in the ring of symmetric functions. In that ring one has

e_1 = p_1,
2e_2 = e_1 p_1 - p_2,
3e_3 = e_2 p_1 - e_1 p_2 + p_3,
4e_4 = e_3 p_1 - e_2 p_2 + e_1 p_3 - p_4,

and so on; here the left-hand sides never become zero. These equations allow the ei to be expressed recursively in terms of the pk; to be able to do the inverse, one may rewrite them as

p_1 = e_1,
p_2 = e_1 p_1 - 2e_2,
p_3 = e_1 p_2 - e_2 p_1 + 3e_3,
p_4 = e_1 p_3 - e_2 p_2 + e_3 p_1 - 4e_4,
⋮

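For a concrete check of these recursions, take n = 3 and (x1, x2, x3) = (1, 2, 3). Then e1 = 6, e2 = 11, e3 = 6 and p1 = 6, p2 = 14, p3 = 36, and indeed 2e2 = e1p1 − p2 = 36 − 14 = 22 and 3e3 = e2p1 − e1p2 + p3 = 66 − 84 + 36 = 18, as the identities require.
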
Application to the roots of a polynomial

Now view the xi as parameters rather than as variables, and consider the monic polynomial in t with roots x1,…,xn:

P(t) = \prod_{i=1}^{n} (t - x_i) = t^n - a_1 t^{n-1} + a_2 t^{n-2} - \cdots + (-1)^n a_n,

where the coefficients are given by the elementary symmetric polynomials in the roots: ak = ek(x1,…,xn). Now consider the power sums of the roots

s_k = p_k(x_1,\ldots,x_n) = x_1^k + x_2^k + \cdots + x_n^k.

Then according to Newton's identities these can be expressed recursively in terms of the coefficients of the polynomial using

s_1 = a_1,
s_2 = a_1 s_1 - 2a_2,
s_3 = a_1 s_2 - a_2 s_1 + 3a_3,
s_4 = a_1 s_3 - a_2 s_2 + a_3 s_1 - 4a_4,
⋮

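For instance, for the quadratic P(t) = t^2 − 5t + 6, whose roots happen to be 2 and 3, one has a1 = 5 and a2 = 6, so s1 = 5 and s2 = a1s1 − 2a2 = 25 − 12 = 13; this indeed equals 2^2 + 3^2, and it is obtained without ever computing the roots.
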
Application to the characteristic polynomial of a matrix

When the polynomial above is the characteristic polynomial of a matrix A (in particular when A is the companion matrix of the polynomial), the roots are the eigenvalues of the matrix, counted with their algebraic multiplicity. For any positive integer k, the matrix Ak has as eigenvalues the powers xik, and each eigenvalue of A contributes its multiplicity to that of the eigenvalue xik of Ak. Then the coefficients of the characteristic polynomial of Ak are given by the elementary symmetric polynomials in those powers xik. In particular, the sum of the xik, which is the k-th power sum sk of the roots of the characteristic polynomial of A, is given by its trace:

s_k = \operatorname{tr}(A^k).

The Newton identities now relate the traces of the powers Ak to the coefficients of the characteristic polynomial of A. Using them in reverse to express the elementary symmetric polynomials in terms of the power sums, one can find the characteristic polynomial by computing only the powers Ak and their traces.

This computation requires computing the traces of the matrix powers Ak and solving a triangular system of equations. Both can be done in complexity class NC (solving a triangular system can be done by divide-and-conquer). Therefore, the characteristic polynomial of a matrix can be computed in NC. By the Cayley–Hamilton theorem, every matrix satisfies its characteristic polynomial, and a simple transformation then makes it possible to find the matrix inverse in NC as well.

Rearranging the computations into an efficient form leads to the Faddeev–LeVerrier algorithm (1840); a fast parallel implementation of it is due to L. Csanky (1976). Its disadvantage is that it requires division by integers, so in general the field should have characteristic 0.
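To make the trace-based procedure concrete, here is a minimal Python sketch (NumPy and the function name charpoly_from_traces are choices made for this illustration; it is the plain sequential recursion, not Csanky's parallel formulation):

import numpy as np

def charpoly_from_traces(A):
    # Illustrative sketch: coefficients of det(tI - A) = t^n + c[1] t^(n-1) + ... + c[n],
    # obtained from the traces of the powers of A via Newton's identities.
    n = A.shape[0]
    s = [0.0]                      # s[k] = tr(A^k); s[0] is unused
    M = np.eye(n)
    for _ in range(n):
        M = M @ A
        s.append(np.trace(M))
    e = [1.0]                      # e[k] = k-th elementary symmetric polynomial of the eigenvalues
    for k in range(1, n + 1):
        # Newton's identity: k e_k = sum_{i=1}^{k} (-1)^(i-1) e_{k-i} s_i
        e.append(sum((-1) ** (i - 1) * e[k - i] * s[i] for i in range(1, k + 1)) / k)
    # the coefficient of t^(n-k) in det(tI - A) is (-1)^k e_k
    return [(-1) ** k * e[k] for k in range(n + 1)]

A = np.array([[2.0, 1.0], [0.0, 3.0]])
print(charpoly_from_traces(A))     # [1.0, -5.0, 6.0], i.e. t^2 - 5t + 6

The division by k in each step of the recursion is exactly where the assumption of characteristic 0 (or characteristic larger than n) mentioned above enters.
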

Relation with Galois theory

For a given n, the elementary symmetric polynomials ek(x1,…,xn) for k = 1,…, n form an algebraic basis for the space of symmetric polynomials in x1,…,xn: every polynomial expression in the xi that is invariant under all permutations of those variables is given by a polynomial expression in those elementary symmetric polynomials, and this expression is unique up to equivalence of polynomial expressions. This is a general fact known as the fundamental theorem of symmetric polynomials, and Newton's identities provide explicit formulae in the case of power sum symmetric polynomials. Applied to the monic polynomial with all coefficients ak considered as free parameters, this means that every symmetric polynomial expression S(x1,…,xn) in its roots can be expressed instead as a polynomial expression P(a1,…,an) in terms of its coefficients only, in other words without requiring knowledge of the roots. This fact also follows from general considerations in Galois theory (one views the ak as elements of a base field, the roots live in an extension field whose Galois group permutes them according to the full symmetric group, and the field fixed under all elements of the Galois group is the base field).
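As a simple illustration of this, for a quadratic t^2 − a1t + a2 with roots x1 and x2, the symmetric expression (x1 − x2)^2 can be rewritten as p2 − 2e2 = (e1^2 − 2e2) − 2e2 = a1^2 − 4a2, the familiar discriminant, without the roots themselves ever appearing.
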

The Newton identities also permit expressing the elementary symmetric polynomials in terms of the power sum symmetric polynomials, showing that any symmetric polynomial can also be expressed in terms of the power sums. In fact, the first n power sums also form an algebraic basis for the space of symmetric polynomials.

Related identities

There are a number of (families of) identities that, while they should be distinguished from Newton's identities, are very closely related to them.

A variant using complete homogeneous symmetric polynomials

Denoting by hk the complete homogeneous symmetric polynomial that is the sum of all monomials of degree k, the power sum polynomials also satisfy identities similar to Newton's identities, but not involving any minus signs. Expressed as identities in the ring of symmetric functions, they read

k\, h_k = \sum_{i=1}^{k} h_{k-i}\, p_i,

valid for all n ≥ k ≥ 1. Contrary to Newton's identities, the left-hand sides do not become zero for large k, and the right-hand sides contain ever more non-zero terms. For the first few values of k, one has

h_1 = p_1,
2h_2 = h_1 p_1 + p_2,
3h_3 = h_2 p_1 + h_1 p_2 + p_3.

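As a quick check with two variables: h2 = x1^2 + x1x2 + x2^2, and indeed h1p1 + p2 = (x1 + x2)^2 + (x1^2 + x2^2) = 2(x1^2 + x1x2 + x2^2) = 2h2.
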
These relations can be justified by an argument analogous to the proof by comparing coefficients in power series given above, based in this case on the generating function identity

\sum_{k=0}^{\infty} h_k(x_1,\ldots,x_n)\, t^k = \prod_{i=1}^{n} \frac{1}{1 - x_i t}.

Proofs of Newton's identities, like the ones given below, cannot easily be adapted to prove these variants.

Expressing elementary symmetric polynomials in terms of power sums

As mentioned, Newton's identities can be used to express elementary symmetric polynomials recursively in terms of power sums. Doing so requires the introduction of integer denominators, so it can be done in the ring ΛQ of symmetric functions with rational coefficients:

e_1 = p_1,
e_2 = \tfrac12 (p_1^2 - p_2),
e_3 = \tfrac16 (p_1^3 - 3 p_1 p_2 + 2 p_3),
e_4 = \tfrac1{24} (p_1^4 - 6 p_1^2 p_2 + 3 p_2^2 + 8 p_1 p_3 - 6 p_4),

and so forth. Applied to a monic polynomial, these formulae express the coefficients in terms of the power sums of the roots: replace each ei by ai and each pk by sk.
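For instance, if the power sums of the roots of a monic cubic are s1 = 6, s2 = 14 and s3 = 36, these formulae give a1 = 6, a2 = ½(36 − 14) = 11 and a3 = ⅙(216 − 3·6·14 + 2·36) = 6, so the polynomial is t^3 − 6t^2 + 11t − 6 = (t − 1)(t − 2)(t − 3).
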

Expressing complete homogeneous symmetric polynomials in terms of power sums

The analogous relations involving complete homogeneous symmetric polynomials can be similarly developed, giving equations

h_1 = p_1,
h_2 = \tfrac12 p_1^2 + \tfrac12 p_2 = \tfrac12 (p_1^2 + p_2),
h_3 = \tfrac16 p_1^3 + \tfrac12 p_1 p_2 + \tfrac13 p_3 = \tfrac16 (p_1^3 + 3 p_1 p_2 + 2 p_3),
h_4 = \tfrac1{24} p_1^4 + \tfrac14 p_1^2 p_2 + \tfrac18 p_2^2 + \tfrac13 p_1 p_3 + \tfrac14 p_4 = \tfrac1{24} (p_1^4 + 6 p_1^2 p_2 + 3 p_2^2 + 8 p_1 p_3 + 6 p_4),
h_n = \sum_{m_1+2m_2+\cdots+nm_n=n} \prod_{i=1}^{n} \frac{p_i^{m_i}}{m_i!\, i^{m_i}},

and so forth, in which there are only plus signs. These expressions correspond exactly to the cycle index polynomials of the symmetric groups, if one interprets the power sums pi as indeterminates: the coefficient of any monomial p1^m1 p2^m2 ⋯ pl^ml in the expression for hk is equal to the fraction of all permutations of k elements that have m1 fixed points, m2 cycles of length 2, …, and ml cycles of length l. Explicitly, this coefficient can be written as 1/N, where N = m1!·1^m1 · m2!·2^m2 ⋯ ml!·l^ml; this N is the number of permutations commuting with any given permutation π of the given cycle type. The expressions for the elementary symmetric functions have coefficients with the same absolute value, but a sign equal to the sign of π, namely (−1)^(m2+m4+⋯).
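To see the correspondence in a small case: in the expression for h3 above, the monomial p1p2 (so m1 = 1, m2 = 1) has coefficient 1/2; indeed N = 1!·1^1 · 1!·2^1 = 2, and exactly 3 of the 6 permutations of three elements consist of a fixed point and a 2-cycle, a fraction of 1/2. Such a permutation is odd, which matches the corresponding term −(1/2)p1p2 in the expression for e3 above.
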

It can be proved by:

Expressing power sums in terms of elementary symmetric polynomials

One may also use Newton's identities to express power sums in terms of elementary symmetric polynomials, which does not introduce denominators:

p_1 = e_1,
p_2 = e_1^2 - 2 e_2,
p_3 = e_1^3 - 3 e_1 e_2 + 3 e_3,
p_4 = e_1^4 - 4 e_1^2 e_2 + 2 e_2^2 + 4 e_1 e_3 - 4 e_4,
⋮

The first four formulas were obtained by Albert Girard in 1629 (thus before Newton).[1] The general formula is

p_k = (-1)^{k}\, k \sum_{m_1+2m_2+\cdots+km_k=k} \frac{(m_1+m_2+\cdots+m_k-1)!}{m_1!\, m_2! \cdots m_k!} \prod_{i=1}^{k} (-e_i)^{m_i},

which can be proved as follows: