Induction Proofs And Applications Of The AM-GM Inequality


Abstract

This paper explores the Inequality of Arithmetic and Geometric Means (AM-GM inequality), one of the fundamental inequalities in mathematics, with applications in many areas. It provides background on the AM-GM inequality, a precise statement of the inequality, several induction-based proofs, and applications of the inequality illustrated with solved examples. The sheer variety of proofs and applications demonstrates the inequality's central place in mathematics.

Introduction

The AM-GM inequality states that the arithmetic mean of any set of non-negative real numbers is always greater than or equal to the geometric mean of that same set of numbers. In mathematical notation, the AM-GM inequality states that for non-negative real numbers x_{1},x_{2},\cdots ,x_{n}

\frac{x_{1}+x_{2}+\cdots + x_{n}}{n} \geq \sqrt[n] {x_{1}x_{2}\cdots x_{n}}

and equality holds if and only if x_{1}=x_{2}=\cdots =x_{n}.
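
As a quick numerical sanity check of the statement (the sample values below are arbitrary, not from the text), the inequality can be verified directly:

```python
import math

# Arbitrary sample of non-negative reals (illustrative only):
xs = [1.0, 4.0, 2.0, 8.0]

am = sum(xs) / len(xs)               # arithmetic mean: 3.75
gm = math.prod(xs) ** (1 / len(xs))  # geometric mean: 64**0.25 ~ 2.83

assert am >= gm                      # AM-GM holds for this sample

# Equality case: all entries identical, so AM == GM.
ys = [3.0, 3.0, 3.0]
assert math.isclose(sum(ys) / len(ys), math.prod(ys) ** (1 / len(ys)))
print(am, gm)
```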

The AM-GM inequality is one of the most famous inequalities in algebra. Within the study of inequalities it is fundamental, and it underpins the development of other, more complex results. For example, several inequalities generalize the AM-GM inequality, such as the Weighted AM-GM inequality and the Mean Inequality Chain.

There is a rich history in mathematics surrounding the AM-GM inequality, with many different proofs developed over time. The first written proof was given by Augustin-Louis Cauchy (1789-1857) in his book Cours d'analyse [1, p. 306]. There, Cauchy proved the AM-GM inequality using forward-backward induction (also known as Cauchy induction). Since then, a multitude of new proofs of the general AM-GM inequality have appeared. For the Weighted AM-GM inequality, George Polya provided a proof using exponential functions and some calculus [2, p. 18]. Most proofs of the general AM-GM inequality use induction.

For the special case of two non-negative real numbers, a wide variety of methods have been used. The case can be proved with simple algebra and the trivial inequality, but there are also many geometric proofs [3], a surprising connection between geometry and an algebraic concept.

The AM-GM inequality has also been further developed and generalized. For example, the Mean Inequality Chain, an inequality relating the root-mean square, arithmetic mean, geometric mean, and harmonic mean, directly extends the AM-GM inequality; its proof also draws on other inequalities, such as the Cauchy-Schwarz inequality and Jensen's inequality [4].

This paper explores the Inequality of Arithmetic and Geometric Means (AM-GM inequality). First, the introduction covers what the inequality states, its importance, and its historical development. Then, several induction proofs of the AM-GM inequality are presented. Lastly, applications of the inequality are explored, with worked examples.

Definitions

The AM-GM inequality deals with the arithmetic mean and the geometric mean, thus it is useful to have the definitions of the two means.

For the arithmetic mean, let x_{1},x_{2},\cdots ,x_{n} be a set of n quantities of a certain quantitative variable x. The arithmetic mean A of the set is the sum of the quantities divided by the number of quantities n [5]. In mathematical terms

A=\frac{\sum_{i=1}^{n} x_{i}}{n}=\frac{x_{1}+x_{2}+\cdots + x_{n}}{n}

For the geometric mean, again let x_{1},x_{2},\cdots ,x_{n} be a set of n non-negative quantities of a certain quantitative variable x. The geometric mean G of the set is the nth root of the product of the quantities [5]. Written mathematically

G=\sqrt[n] {x_{1}x_{2}\cdots x_{n}}

For positive quantities, the geometric mean can also be expressed in a logarithmic form, where

\ln(G)=\frac{\sum_{i=1}^{n} \ln (x_{i})}{n}

Using the definitions of these two means, the AM-GM inequality takes exactly the form shown above.
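
The two definitions, together with the logarithmic form of G, translate directly into code. This is a small illustrative sketch with arbitrary sample data (the function names are ours):

```python
import math

def arithmetic_mean(xs):
    """A = (x_1 + ... + x_n) / n."""
    return sum(xs) / len(xs)

def geometric_mean(xs):
    """G = (x_1 * ... * x_n)^(1/n), for non-negative xs."""
    return math.prod(xs) ** (1 / len(xs))

def geometric_mean_log(xs):
    """Equivalent logarithmic form: ln G = (sum of ln x_i) / n, for positive xs."""
    return math.exp(sum(math.log(x) for x in xs) / len(xs))

xs = [2.0, 8.0]                    # arbitrary sample data
print(arithmetic_mean(xs))         # 5.0
print(geometric_mean(xs))          # 4.0
print(geometric_mean_log(xs))      # same value, computed via logs
```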

Proofs

There are various proofs of the AM-GM inequality. This section presents three of the simpler proofs, each based on induction.

Cauchy Induction

Cauchy's proof [1, p. 306] is, as discussed, the earliest proof of the AM-GM inequality. For a modern exposition of this proof, see [6, Ch. 1, Ex. 9].
Let P(n) be the proposition that the AM-GM inequality holds for a set of n non-negative numbers. Written mathematically, for non-negative real numbers x_{1},x_{2},\cdots ,x_{n},

\frac{x_{1}+x_{2}+\cdots + x_{n}}{n} \geq \sqrt[n] {x_{1}x_{2}\cdots x_{n}}

and equality holds if and only if x_{1}=x_{2}=\cdots =x_{n}.
As with all induction proofs, a base case is required. The smallest non-trivial case, P(2), can be proved by simple algebra. From the original form:

\frac{x_{1}+x_{2}}{2} \geq \sqrt[] {x_{1}x_{2}}

The inequality may be manipulated

x_1 + x_2 \geq 2\sqrt{x_1 x_2}
x_1^2 + 2x_1x_2 + x_2^2 \geq 4x_1x_2
x_1^2 - 2x_1x_2 + x_2^2 \geq 0
(x_1 - x_2)^2 \geq 0

which is true by the trivial inequality. The left side equals 0 only when x_{1}=x_{2}. Note that all the steps shown are reversible, so the base case n=2 is proved. Using the induction hypothesis that P(n) holds, we now prove that the inequality also holds for 2n.

P(2n):\quad \frac{x_1 + x_2 + \dots + x_{2n}}{2n} \geq \sqrt[2n]{x_1 \cdot x_2 \cdot \dots \cdot x_{2n}}

We can split this set of 2n numbers into two groups of n each, (x_1,\ldots,x_n) and (x_{n+1},\ldots,x_{2n}). Applying the induction hypothesis P(n) to each group, we have

\frac{x_1 + \cdots + x_n}{n} \geq \sqrt[n]{x_1 \cdots x_n}

\frac{x_{n+1} + \cdots + x_{2n}}{n} \geq \sqrt[n]{x_{n+1} \cdots x_{2n}}

Adding the above inequalities, dividing by 2, and applying the AM-GM inequality for the case n=2 gives

\frac{x_1 + x_2 + \cdots + x_{2n}}{2n} \geq \frac{\sqrt[n]{x_1 \cdots x_n} + \sqrt[n]{x_{n+1} \cdots x_{2n}}}{2}
\geq \sqrt{\sqrt[n]{x_1 \cdots x_n} \cdot \sqrt[n]{x_{n+1} \cdots x_{2n}}}
= \sqrt[2n]{x_1 x_2 \cdots x_{2n}}

which is exactly P(2n). However, the equality condition must be shown as well. From the two uses of the induction hypothesis P(n), equality requires x_{1}=\cdots=x_{n} and x_{n+1}=\cdots=x_{2n}, and from the use of P(2) it also requires

\sqrt[n]{x_1 \ldots x_n} = \sqrt[n]{x_{n+1} \ldots x_{2n}}

These equality conditions come from the uses of P(n) and P(2) in the proof, and together they hold only when x_{1}=x_{2}=\cdots=x_{2n}. Thus the equality condition is also proved. The steps for this part of the proof are reversible as well, making the argument valid. Since the base case P(2) holds and P(n) implies P(2n), the AM-GM inequality is proved for all n that are powers of 2.
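
The doubling step can be illustrated numerically. The sketch below uses arbitrary sample values for the case n = 2, 2n = 4, and checks each link of the chain above:

```python
import math

# Arbitrary sample values: 2n = 4 numbers split into two halves of n = 2.
xs = [1.0, 9.0, 4.0, 16.0]
first, second = xs[:2], xs[2:]

am = lambda v: sum(v) / len(v)
gm = lambda v: math.prod(v) ** (1 / len(v))

# P(n) applied to each half:
assert am(first) >= gm(first)        # 5 >= 3
assert am(second) >= gm(second)      # 10 >= 8

# P(2) applied to the two geometric means:
mid = (gm(first) + gm(second)) / 2
assert mid >= math.sqrt(gm(first) * gm(second))

# Chaining the steps bounds the full 2n-term geometric mean:
assert am(xs) >= mid >= gm(xs)
print(am(xs), mid, gm(xs))           # 7.5 5.5 ~4.899
```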

Lastly, we prove that P(n) implies P(n-1). In order to do this, we can set

x_n = \frac{x_1 + \cdots + x_{n-1}}{n-1}

and by our induction hypothesis

\frac{1}{n}\left(x_1+\dots+x_{n-1}+\frac{x_1+\dots+x_{n-1}}{n-1}\right)

\geq \sqrt[n]{x_1 \ldots x_{n-1} \left(\frac{x_1 + \ldots + x_{n-1}}{n-1}\right)}

This substitution is allowed because the chosen x_n is a non-negative real number, and it is compatible with the equality conditions. The equality holds if and only if

x_1 = \ldots = x_{n-1} = \frac{x_1 + \ldots + x_{n-1}}{n-1}

which is in fact implied by x_{1} = \ldots = x_{n-1}. Manipulating the inequality gives

\frac{x_1 + \cdots + x_{n-1}}{n-1} \geq \sqrt[n]{x_1 \cdots x_{n-1} \left(\frac{x_1 + \cdots + x_{n-1}}{n-1}\right)}

Raising both sides to the nth power, dividing by the common factor \frac{x_1 + \cdots + x_{n-1}}{n-1}, and taking the (n-1)th root yields

\frac{x_1 + \cdots + x_{n-1}}{n-1} \geq \sqrt[n-1]{x_1 \cdots x_{n-1}}

Thus P(n) implies P(n-1), with the equality condition preserved. Because the base case P(2) is true, and P(n) implies both P(2n) and P(n-1), all positive integers n are covered (excluding the trivial case n=1). Therefore, by induction, the AM-GM inequality holds.

Induction With Calculus

This proof also uses induction, but this time calculus will be used along with it. We let P(n) be the proposition that the AM-GM inequality is true for a set of n non-negative numbers, x_{1}, x_{2}, \ldots, x_{n}, and

\frac{x_1 + x_2 + \ldots + x_n}{n} \geq \sqrt[n]{x_1 x_2 \ldots x_n}

with equality if and only if x_{1}=x_{2}=\ldots=x_{n}, as in the previous section.

For the base case, we use n=1, which is trivially true because both sides are equal. Now assume P(n); we must prove P(n+1). Rearranging P(n+1), we must show

\frac{x_1 + x_2 + \ldots + x_{n+1}}{n+1} - \left(x_1x_2\ldots x_{n+1}\right)^{\frac{1}{n+1}} \geq 0

We let x_{n+1} = t, and

f(t) = \frac{x_1 + x_2 + \cdots + x_n + t}{n+1} - \left(x_1 x_2 \ldots x_n t\right)^{\frac{1}{n+1}}

Thus, we want to show f(t) \geq 0 for all t \geq 0.
Firstly, if t=0,

f(0) = \frac{x_1 + x_2 + \ldots + x_n}{n + 1}

Since x_{1}, x_{2}, \ldots, x_{n} are all non-negative, f(0) \geq 0. To show that f(t) \geq 0 for all non-negative t, we locate the global minimum of f for t > 0 and show that the minimum value is non-negative. To do this, calculus is used. Using basic derivative rules to find the first and second derivatives, we have

f'(t) = \frac{1}{n+1} \left(1 - \left(x_1 x_2 \ldots x_n\right)^{\frac{1}{n+1}} t^{-\frac{n}{n+1}}\right)

f''(t) = \frac{n}{(n+1)^2} (x_1 x_2 \ldots x_n)^{\frac{1}{n+1}} t^{-\frac{n}{n+1}-1}

Looking at the second derivative, f''(t) > 0 for all t > 0, so there is a unique critical point t_0 where

f'(t_0) = \frac{1}{n+1} \left(1 - (x_1 x_2 \ldots x_n)^{\frac{1}{n+1}} t_0^{-\frac{n}{n+1}} \right) = 0

Solving for t_0 gives

t_0 = \left( x_1 x_2 \ldots x_n \right)^{\frac{1}{n}}

which happens to be the geometric mean of x_1, \ldots, x_n. Because f''(t) > 0 for all t > 0, the function is strictly convex, so t_0 is a global minimum. Evaluating f at t_0, we have

f(t_0) = \frac{x_1 + \ldots + x_n + (x_1 \ldots x_n)^{\frac{1}{n}}}{n+1} - (x_1 \ldots x_n)^{\frac{1}{n+1}} (x_1 \ldots x_n)^{\frac{1}{n(n+1)}}

= \frac{x_1 + \ldots + x_n}{n+1} + \frac{1}{n+1} (x_1 \ldots x_n)^{\frac{1}{n}} - (x_1 \ldots x_n)^{\frac{1}{n}}

= \frac{x_1 + \ldots + x_n}{n+1} - \frac{n}{n+1} (x_1 \ldots x_n)^{\frac{1}{n}}

= \frac{n}{n+1} \left(\frac{x_1 + \ldots + x_n}{n} - (x_1 \ldots x_n)^{\frac{1}{n}}\right)

By the induction hypothesis P(n), the bracketed difference is non-negative, and \frac{n}{n+1} is positive, so f(t_0) \geq 0. Because the value at the global minimum is non-negative, f(t) \geq 0 for all t \geq 0. For the equality condition: if x_{1},x_{2},\ldots,x_{n} are all equal, their geometric mean t_0 equals that same common value, so the equality condition for P(n+1) is x_{1}=x_{2}=\ldots=x_{n}=x_{n+1}. Thus P(n) implies P(n+1), for both the inequality and the equality condition, completing the induction proof.
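
The shape of f can be checked numerically. The sketch below uses arbitrary values for x_1, ..., x_n and confirms that f is non-negative on a sampled grid, with its minimum near the geometric mean t_0:

```python
import math

# Arbitrary fixed values x_1..x_n; t plays the role of x_{n+1}.
xs = [1.0, 2.0, 4.0]
n = len(xs)
P = math.prod(xs)

def f(t):
    return (sum(xs) + t) / (n + 1) - (P * t) ** (1 / (n + 1))

t0 = P ** (1 / n)        # critical point: the geometric mean of x_1..x_n

# f stays non-negative on a sampled grid, with its minimum at (about) t0.
grid = [0.01 * k for k in range(1, 2001)]
assert all(f(t) >= -1e-12 for t in grid)
assert abs(min(grid, key=f) - t0) < 0.02
print(t0, f(t0))         # minimum value f(t0) = (n/(n+1)) * (AM - GM) >= 0
```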

Replacement of Quantities

This proof follows the one given in "The Art of Problem Solving" by Rusczyk and Lehoczky [7]. It uses inductive ideas similar to the previous two proofs: quantities are repeatedly replaced so that the arithmetic mean A of the set stays fixed while the geometric mean G strictly increases, until every quantity equals A, at which point G reaches its maximum value A.

The start of the proof involves a lemma.

Lemma: Suppose that x and y are positive real numbers with x > y. Decrease x and increase y by some amount d > 0 such that x - d \geq y + d. Then (x - d)(y + d) > xy.

Proof: Essentially, decreasing x by d and increasing y by d increases the product while the average of the two stays the same. To prove the lemma, we must show that

(x - d)(y + d) > xy

Expanding and rearranging, this is equivalent to

(x - y)d - d^2 > 0

Since x - d \geq y + d, we have x - y \geq 2d, therefore

(x - y)d - d^2 \geq (2d)d - d^2 = d^2 > 0

Thus (x - d)(y + d) > xy, proving the lemma.

Moving on to the main proof, let x_1,x_2,\ldots,x_n be positive real numbers with arithmetic mean A and geometric mean G. If all x_i are equal, then A=G=x_i. Otherwise, let x_j be the number closest to, but not equal to, A; without loss of generality, x_j<A. Then some numbers must be greater than A; let x_k be the greatest of them. Since x_j is the closest to A, evidently x_k-A \geq A-x_j. Adding x_j to both sides and regrouping, we have

x_k - A + x_j \geq A - x_j + x_j = A
x_k - (A - x_j) \geq x_j + (A - x_j)

so the condition of the lemma is satisfied, with A - x_j playing the role of the adjustment amount. Replace x_j and x_k by the two new terms A and x_k+x_j-A. By the lemma, the average of the set stays the same while the product, and hence the geometric mean, strictly increases. Each application of this process turns one more term of the set into A, so after finitely many applications all the numbers equal A. Thus, for positive real numbers x_1,x_2,\ldots,x_n with fixed arithmetic mean A, the product and the geometric mean are maximized exactly when all elements equal A, in which case A=G. Since the geometric mean only increased along the way, the original G satisfies G \leq A. Therefore,
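
The replacement process can be carried out mechanically. The sketch below is an illustrative implementation (the helper name equalize_step is ours, not from the text): each step fixes one more term at A while the product grows.

```python
import math

def equalize_step(xs, tol=1e-12):
    """One replacement step: pick a term x_j below the mean A and the largest
    term x_k, and replace them by A and x_k + x_j - A. The sum (and mean) is
    unchanged while, by the lemma, the product strictly increases."""
    A = sum(xs) / len(xs)
    off = [i for i, x in enumerate(xs) if abs(x - A) > tol]
    if not off:
        return xs, False                          # all terms already equal A
    j = min(off, key=lambda i: abs(xs[i] - A))    # term closest to (but not) A
    if xs[j] > A:
        j = min(off, key=lambda i: xs[i])         # fall back to a term below A
    k = max(range(len(xs)), key=lambda i: xs[i])  # largest term (above A)
    ys = list(xs)
    ys[j], ys[k] = A, xs[k] + xs[j] - A
    return ys, True

xs = [1.0, 2.0, 9.0]                 # arbitrary starting values, mean A = 4
prods = [math.prod(xs)]
changed = True
while changed:
    xs, changed = equalize_step(xs)
    if changed:
        prods.append(math.prod(xs))
print(xs)      # [4.0, 4.0, 4.0] -- every term driven to A
print(prods)   # [18.0, 28.0, 64.0] -- strictly increasing up to A**n
```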

A = \frac{x_1+x_2+\cdots+x_n}{n} \geq \sqrt[n]{x_1x_2\cdots x_n} = G

These are some of the simpler proofs of the AM-GM inequality; others involve more advanced concepts. Induction is the main method for proving the general inequality and underlies most proofs, but there are also approaches that do not involve induction. For example, the AM-GM inequality follows directly from Jensen's inequality applied to the concave logarithm function [8]. Evidently, the AM-GM inequality can be proved from a variety of starting points.

Applications

The AM-GM inequality is one of the most fundamental inequalities within mathematics. Thus, it plays an important role in algebra and the study of inequalities, with useful applications in those areas.

Using the AM-GM inequality to solve other inequalities

The AM-GM inequality is particularly useful for proving other inequalities. Given an inequality we wish to prove, we may attempt to rearrange it into a form of the AM-GM inequality; since the AM-GM inequality is known to be true from the proofs above, the original inequality follows. With this method, it is important to ensure that every manipulation step is reversible.
For a simple example [7, Ch. 14, p. 160], suppose we wish to prove that, for positive reals a and b,

\frac{a}{b} + \frac{b}{a} \geq 2

We may directly prove this by applying the AM-GM inequality

\frac{1}{2} \left( \frac{a}{b} + \frac{b}{a} \right) \geq \sqrt{\frac{a}{b} \cdot \frac{b}{a}} = 1

Note that this inequality could also be proved by manipulating it and applying the trivial inequality, but that route requires more steps than applying the AM-GM inequality directly.
Usually, however, the AM-GM inequality cannot be applied directly, and some manipulation is needed first. For example [7, Ch. 14, p. 161], suppose we wish to prove that

x^2 + y^2 + z^2 \geq xy + yz + xz

If we attempt to directly apply the AM-GM inequality we get

\frac{x^2 + y^2 + z^2}{3} \geq \sqrt[3]{x^2 y^2 z^2}

which does not resemble our initial inequality. Instead, we can split the task into three separate inequalities

\frac{x^2+y^2}{2} \geq \sqrt{x^2y^2} = xy

\frac{y^2+z^2}{2} \geq \sqrt{y^2z^2} = yz

\frac{x^2+z^2}{2} \geq \sqrt{x^2z^2} = xz

Adding up the three inequalities, we get our desired inequality above.
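
A numerical spot-check of this argument over random reals (the small tolerances guard against floating-point rounding; note \sqrt{x^2 y^2} = |xy| \geq xy, so the chain in fact works for all reals):

```python
import random

random.seed(0)
for _ in range(1000):
    x, y, z = (random.uniform(-10, 10) for _ in range(3))
    # Each pairwise AM-GM step, applied to the squares:
    assert (x*x + y*y) / 2 >= x*y - 1e-9
    assert (y*y + z*z) / 2 >= y*z - 1e-9
    assert (x*x + z*z) / 2 >= x*z - 1e-9
    # Summing the three gives the target inequality:
    assert x*x + y*y + z*z >= x*y + y*z + x*z - 1e-9
print("all checks passed")
```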

The AM-GM inequality can not only be used to prove inequalities, but it can also be helpful in solving certain equations as well. For example, if we wish to solve

a^2 + b^2 + 8 = 4(a + b)

for positive integers a,b, doing some slight manipulation we have

(a^2+4)+(b^2+4) = 4a+4b

Applying the AM-GM inequality, we get

a^2+4 \geq 2\sqrt{4a^2} = 4a

b^2+4 \geq 2\sqrt{4b^2} = 4b

Adding up the two inequalities, we get

a^2 + b^2 + 8 \geq 4a + 4b = 4(a + b)

which closely mirrors the original equation. Since we need equality on both sides, we invoke the equality condition of the AM-GM inequality: all terms must be equal, so a^2=4 and b^2=4, giving a=b=2. This is a neat application of the AM-GM inequality: using the equality condition to solve equations that can be written in AM-GM form.
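
A brute-force search over a small, arbitrarily chosen range confirms the equality-condition argument:

```python
# Brute-force check: the AM-GM equality argument predicts (a, b) = (2, 2)
# as the only positive-integer solution. The search bound 100 is arbitrary.
solutions = [(a, b)
             for a in range(1, 100)
             for b in range(1, 100)
             if a*a + b*b + 8 == 4*(a + b)]
print(solutions)   # [(2, 2)]
```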

Clearly, the AM-GM inequality is important for proving inequalities, with some surprising uses in solving equations. The general method is to first manipulate the inequality or equation into a form to which the AM-GM inequality applies; the application then yields the desired proof, or reveals key information about the original equation.

AM-GM inequality in optimization

Another important application of the AM-GM inequality within mathematics is optimization. When we wish to find the maximum or minimum of some expression, the AM-GM inequality is a useful tool for producing an inequality with the desired expression on one side. There are essentially two steps in such optimization problems: establish a bound on the expression (here, through the AM-GM inequality), and show that the bound is actually attained.

One use of the AM-GM inequality for optimization is in geometry. For example [7, Ch. 14, p. 164], suppose we wish to find the maximum volume of a rectangular prism with surface area 150. Letting the length, width, and height be l,w,h respectively, the surface area condition is

2(lw+lh+wh) = 150

Therefore

lw+lh+wh = 75

If we apply the AM-GM inequality to the above three terms we have

\frac{lw + lh + wh}{3} \geq \sqrt[3]{(lw)(lh)(wh)}=(lwh)^{\frac{2}{3}}

Rearranging the inequality and substituting the value for lw+lh+wh, we have

V = lwh \leq 125

To show that the maximum is attained, we use the equality condition of the AM-GM inequality, lw=lh=wh. Combining this with the original equation easily gives l=w=h=5. Substituting into V=lwh confirms that this does indeed yield the maximum of 125.
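
A numerical spot-check: sampling boxes that satisfy the surface-area constraint over an arbitrary grid, the volume never exceeds the AM-GM bound of 125.

```python
import itertools

def h_from(l, w):
    # 2(lw + lh + wh) = 150  =>  h = (75 - lw) / (l + w)
    return (75 - l * w) / (l + w)

best = 0.0
# Arbitrary grid of (l, w) values; h is determined by the constraint.
for l, w in itertools.product([0.5 * k for k in range(1, 25)], repeat=2):
    h = h_from(l, w)
    if h > 0:
        best = max(best, l * w * h)
print(best)    # 125.0 -- attained at l = w = h = 5, never exceeded
```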

The AM-GM inequality can also be used to find minimums; in that case, we want the desired expression on the greater-than-or-equal side. For example, if xyz=125 for positive reals x, y, z and we want the minimum value of x+y+z, we need x+y+z on the greater side in order to get a minimum value. Applying the AM-GM inequality, we have

\frac{x+y+z}{3} \geq \sqrt[3]{xyz} = 5

Therefore x+y+z\geq 15, with equality when x=y=z=5, so the minimum value is 15. The key idea in using the AM-GM inequality for optimization is the direction of the inequality sign. To find a minimum, the desired expression should sit on the greater-than-or-equal side; to find a maximum, it should sit on the less-than-or-equal side. Then, if the given values determine the other side of the inequality, the maximum or minimum of the desired expression can be obtained.
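
The same can be spot-checked numerically by sampling random positive triples rescaled to have product exactly 125:

```python
import random

random.seed(1)
best = float("inf")
for _ in range(10000):
    x, y, z = (random.uniform(0.1, 10) for _ in range(3))
    s = (125 / (x * y * z)) ** (1 / 3)   # rescale so the product is 125
    x, y, z = s * x, s * y, s * z
    best = min(best, x + y + z)
print(best)    # close to, but never below, the minimum 15 (at x = y = z = 5)
```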

Conclusion

In this paper, several proofs of the AM-GM inequality were first presented, all based mainly on induction. Then, multiple applications of the AM-GM inequality in mathematics were shown. Firstly, the AM-GM inequality is an important tool for proving other non-standard inequalities, by manipulating them into a form of the AM-GM inequality. Secondly, the AM-GM inequality can be used to solve certain equations via its equality condition. Thirdly, the AM-GM inequality may be used for optimization problems, since it allows us to find minimums or maximums of expressions. From all these applications, it is clear that the AM-GM inequality makes significant contributions to mathematics.

  1. R. E. Bradley & C. E. Sandifer. (2010). Cauchy's Cours d'analyse. Springer, New York, NY.
  2. G. H. Hardy, J. E. Littlewood & G. Polya. (1952). Inequalities (2nd ed.). Cambridge: Cambridge University Press.
  3. J. Wilson. (2012). Using the Arithmetic Mean-Geometric Mean Inequality in Problem Solving. University of Georgia.
  4. Root-Mean Square-Arithmetic Mean-Geometric Mean-Harmonic Mean Inequality. https://artofproblemsolving.com/wiki/index.php/Root-Mean_Square-Arithmetic_Mean-Geometric_Mean-Harmonic_mean_Inequality.
  5. Y. Dodge. (2008). The Concise Encyclopedia of Statistics. Springer, New York, NY.
  6. R. L. Graham, D. E. Knuth & O. Patashnik. (1989). Concrete Mathematics: A Foundation for Computer Science. Addison-Wesley.
  7. R. Rusczyk & S. Lehoczky. (1994). The Art of Problem Solving, Vol. 2: and Beyond. AoPS.
  8. H. Angelidakis. (2020). Convexity and Jensen's Inequality (and the AM-GM Inequality). https://pub.towardsai.net/convexity-and-jensens-inequality-and-the-am-gm-inequality-7593348b9954.
