Section 4.4 AM/GM Inequality and Cauchy-Schwarz
In this section we cover some very useful inequalities that allow us to separate terms in a product. As a warm-up, let's begin with the following.
Lemma 4.4.1.
Let \(u,v\in\RR\text{.}\) Then
\[
uv \leq \frac{u^2+v^2}{2}\text{.}
\]
Notice that on one side of the above inequality, the terms \(u\) and \(v\) are multiplied together, and on the other side, they are separate (but now squared).
Proof.
Since \((u-v)^2\geq 0\text{,}\) we get
\[
u^2 - 2uv + v^2 \geq 0\text{.}
\]
So we can add \(2uv\) to both sides to get
\[
u^2 + v^2 \geq 2uv\text{,}
\]
and then we can multiply both sides by \(\frac{1}{2}\) to get \(uv \leq \frac{u^2+v^2}{2}\) as desired.
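As a quick numerical check (an illustration, not part of the argument), take \(u=3\) and \(v=5\text{:}\)
\[
uv = 15 \leq \frac{3^2+5^2}{2} = 17\text{,}
\]
and the gap \(17-15=2\) is exactly \(\frac{(u-v)^2}{2}\text{,}\) which is precisely what the proof discards when it drops the term \((u-v)^2\geq 0\text{.}\)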
Exercise 4.4.2.
When does equality hold in the previous lemma? That is, when do we have \(uv = \frac{u^2+v^2}{2}\text{?}\)
Let's retrace the proof of Lemma 4.4.1 backwards, trying to use only equalities. If \(2uv=u^2+v^2\text{,}\) then
\[
(u-v)^2 = u^2 - 2uv + v^2 = 0\text{,}
\]
and this last equality holds if and only if \(u=v\text{.}\)
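To see both possibilities concretely:
\[
u=v=4:\quad uv = 16 = \frac{4^2+4^2}{2}\text{,}\qquad u=4,\ v=2:\quad uv = 8 \lt \frac{4^2+2^2}{2} = 10\text{.}
\]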
From this, we deduce the famous Arithmetic Mean-Geometric Mean inequality:
Theorem 4.4.3. AM/GM Inequality.
Let \(n\in\NN\text{,}\) and let \(x_{1},\dots,x_{n}\geq 0\) be nonnegative real numbers. Then
\[
\sqrt[n]{x_1 x_2 \cdots x_n} \leq \frac{x_1 + x_2 + \cdots + x_n}{n}\text{,}
\]
with equality if and only if \(x_1=x_2=\cdots=x_n\text{.}\)
That is, the geometric mean on the left is at most the arithmetic mean on the right.
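For example, with \(n=3\) and \(x_1=1\text{,}\) \(x_2=2\text{,}\) \(x_3=4\text{,}\) we get
\[
\sqrt[3]{1\cdot 2\cdot 4} = 2 \leq \frac{1+2+4}{3} = \frac{7}{3}\text{,}
\]
and the inequality is strict because the \(x_i\) are not all equal.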
Proof.
Write \(a = (x_1 + \cdots + x_n)/n\) for the arithmetic mean. First, let us consider the case in which we have \(x_1=\cdots=x_n=a\text{.}\) In that case, we have equality, because both the arithmetic and geometric means are equal to \(a\text{.}\) In particular, the claim is true for \(n=1\text{.}\)
Now for an induction argument, we may assume that the claim holds for \(n=k\text{,}\) where \(k\in\NN\text{,}\) and we aim to prove it for \(n=k+1\text{.}\) Furthermore, we may assume that at least two of the \(x_i\) are unequal to \(a\) (the case where all of them equal \(a\) is settled above, and it is impossible for exactly one of them to differ from the mean). One of these must be larger than \(a\) and another must be smaller, so after relabeling we are free to assume that \(x_k\lt a\) and \(x_{k+1}\gt a\text{;}\) in particular \(a\gt x_k\geq 0\text{,}\) so \(a\gt 0\text{.}\) We aim to show the strict inequality
\[
x_1 x_2 \cdots x_{k+1} \lt a^{k+1}\text{,}
\]
which says exactly that the geometric mean of \(x_1,\dots,x_{k+1}\) is strictly less than the arithmetic mean \(a\text{.}\)
If any of the \(x_i\) are zero, then the product on the left vanishes while \(a^{k+1}\gt 0\text{,}\) so we have our strict inequality; we may therefore assume that every \(x_i\) is positive.
Now set \(\xi_k = x_k + x_{k+1}-a\text{;}\) note that \(\xi_k \gt 0\) since \(x_{k+1}\gt a\text{.}\) Moreover \(a = (x_1+x_2+\cdots +x_{k-1}+\xi_k)/k\text{,}\) because \(x_1+\cdots+x_{k-1}+\xi_k = (x_1+\cdots+x_{k+1})-a = (k+1)a - a = ka\text{.}\) So by the induction hypothesis applied to the \(k\) nonnegative numbers \(x_1,\dots,x_{k-1},\xi_k\text{,}\) we have
\[
x_1 x_2 \cdots x_{k-1}\,\xi_k \leq a^k\text{,}
\]
with equality if and only if \(x_1=x_2=\cdots=x_{k-1}=\xi_k\text{.}\) Now note that
\[
x_k x_{k+1} = \xi_k a - (a-x_k)(x_{k+1}-a) \lt \xi_k a\text{,}
\]
since \(a-x_k\gt 0\) and \(x_{k+1}-a\gt 0\text{.}\) Since \(x_1\cdots x_{k-1}\gt 0\text{,}\) we deduce that
\[
x_1 x_2 \cdots x_{k+1} = (x_1\cdots x_{k-1})\,(x_k x_{k+1}) \lt (x_1\cdots x_{k-1})\,\xi_k a \leq a^k \cdot a = a^{k+1}\text{,}
\]
as desired.
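It may help to run the induction step once with concrete numbers. Take \(k=2\) and \((x_1,x_2,x_3)=(1,2,6)\text{,}\) so that \(a=3\text{,}\) \(x_2\lt a\lt x_3\text{,}\) and \(\xi_2 = 2+6-3 = 5\text{.}\) Then
\[
x_1\xi_2 = 5 \leq a^2 = 9\text{,}\qquad x_2x_3 = 12 = \xi_2 a - (a-x_2)(x_3-a) = 15-3 \lt \xi_2 a = 15\text{,}
\]
so \(x_1x_2x_3 = 12 \lt x_1\xi_2 a = 15 \leq a^3 = 27\text{,}\) in agreement with \(\sqrt[3]{12}\approx 2.29 \lt 3\text{.}\)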
The next useful inequality is the Cauchy-Schwarz inequality:
Theorem 4.4.4. Cauchy-Schwarz Inequality.
Let \(n\in\NN\text{,}\) and let \(x_{1},\dots,x_{n},y_{1},\dots,y_{n}\in\RR\text{.}\) Then
\[
\left|x_1y_1 + x_2y_2 + \cdots + x_ny_n\right| \leq \sqrt{x_1^2+\cdots+x_n^2}\,\sqrt{y_1^2+\cdots+y_n^2}\text{.}
\]
Proof.
Let \(a=\sqrt{x_1^2+\cdots+x_n^2}\text{,}\) and let \(b=\sqrt{y_1^2+\cdots+y_n^2}\text{.}\) We aim to show that
\[
\left|x_1y_1 + x_2y_2 + \cdots + x_ny_n\right| \leq ab\text{.}
\]
If \(a=0\) or \(b=0\text{,}\) then all of the \(x_i\) or all of the \(y_i\) are zero, so both sides vanish and we are done; thus we may assume \(a,b\gt 0\text{.}\)
Now by the AM/GM Inequality, we have, for each \(i\text{,}\)
\[
\frac{|x_i|}{a}\cdot\frac{|y_i|}{b} \leq \frac{1}{2}\left(\frac{x_i^2}{a^2} + \frac{y_i^2}{b^2}\right)\text{.}
\]
Now if we sum these over all \(i\text{,}\) the right hand side becomes
\[
\frac{1}{2}\left(\frac{x_1^2+\cdots+x_n^2}{a^2} + \frac{y_1^2+\cdots+y_n^2}{b^2}\right) = \frac{1}{2}\left(\frac{a^2}{a^2} + \frac{b^2}{b^2}\right) = 1\text{,}
\]
while the left hand side becomes \(\frac{|x_1y_1|+\cdots+|x_ny_n|}{ab}\text{,}\) which is at least \(\frac{\left|x_1y_1+\cdots+x_ny_n\right|}{ab}\) by the triangle inequality. Hence \(\left|x_1y_1+\cdots+x_ny_n\right|\leq ab\text{,}\) and so the proof is complete.
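As a quick check with \(n=2\text{,}\) \((x_1,x_2)=(1,2)\) and \((y_1,y_2)=(3,4)\text{:}\)
\[
|1\cdot 3 + 2\cdot 4| = 11 \leq \sqrt{1^2+2^2}\,\sqrt{3^2+4^2} = \sqrt{5}\cdot 5 \approx 11.18\text{.}
\]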
Why is this useful? On the left we have a mixed product of \(x\)-terms and \(y\)-terms, and we are able to relate it to a product of two terms where now the \(x\)'s and \(y\)'s are separated.
Example 4.4.5.
Show that for all \(a,b\in\RR\text{,}\)
\[
\frac{a+b}{2} \leq \sqrt{\frac{a^2+b^2}{2}}\text{.}
\]
Which of our above inequalities should we try using? Lemma 4.4.1 and AM/GM don't seem directly relevant, since we don't have something like \(ab\) on the left, so let's see if we can use the Cauchy-Schwarz inequality. Rewrite \(\frac{a+b}{2}=a\cdot \frac{1}{2} +b\cdot\frac{1}{2}\text{;}\) then the Cauchy-Schwarz inequality (with \(n=2\text{,}\) \(x=(a,b)\text{,}\) and \(y=(\frac{1}{2},\frac{1}{2})\)) implies
\[
\left|\frac{a+b}{2}\right| = \left|a\cdot\tfrac{1}{2}+b\cdot\tfrac{1}{2}\right| \leq \sqrt{a^2+b^2}\cdot\sqrt{\tfrac{1}{4}+\tfrac{1}{4}} = \sqrt{\frac{a^2+b^2}{2}}\text{,}
\]
which gives the claim, since \(\frac{a+b}{2}\leq\left|\frac{a+b}{2}\right|\text{.}\)
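For instance, with \(a=1\) and \(b=3\text{,}\) this says
\[
\frac{1+3}{2} = 2 \leq \sqrt{\frac{1^2+3^2}{2}} = \sqrt{5} \approx 2.24\text{.}
\]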