Distance Inequality

Problem

Let $a,b,c\in\left[0,\frac{3}{2}\right]\,$ satisfy $a+b+c=3.\,$ Prove that $a^2+b^2+c^2\le\displaystyle\frac{9}{2}.$

Solution 1

The triples $(a,b,c)\,$ that satisfy the constraints lie in the triangle with vertices $A=\left(\frac{3}{2},\frac{3}{2},0\right),\,$ $B=\left(\frac{3}{2},0,\frac{3}{2}\right),\,$ and $C=\left(0,\frac{3}{2},\frac{3}{2}\right).\,$ On the other hand, $a^2+b^2+c^2\,$ is the square of the distance from the point $(a,b,c)\,$ to the origin. Thus the question reduces to finding the point of $\Delta ABC\,$ farthest from the origin.
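
Indeed, the feasible set is the cross-section of the cube $\left[0,\frac{3}{2}\right]^3\,$ by the plane $a+b+c=3.\,$ For example, on the face $c=0\,$ the constraints leave no room:

$\displaystyle a+b=3,\;a\le\frac{3}{2},\;b\le\frac{3}{2}\;\Rightarrow\;a=b=\frac{3}{2},$

which gives the vertex $A;\,$ $B\,$ and $C\,$ are obtained by symmetry.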

The point of the plane nearest to the origin is clearly the center of the triangle, as it is the foot of the perpendicular from the origin to the plane $a+b+c=3.\,$ Imagine an expanding sphere centered at the origin. After reaching the center of the triangle, it intersects the plane in expanding circles centered at the center of $\Delta ABC.\,$ The last position in which the points of such a circle still satisfy the constraints is when the circle passes through the vertices of the triangle. At these points,

$a^2+b^2+c^2=\displaystyle\frac{9}{4}+\frac{9}{4}+0=\frac{9}{2}.$

It follows that throughout the triangle $a^2+b^2+c^2\le\frac{9}{2},\,$ with equality only at the vertices.
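
For comparison, the center of the triangle, i.e., the centroid $\frac{1}{3}(A+B+C)=(1,1,1),\,$ is the foot of the perpendicular from the origin; there the minimum over the triangle is attained:

$a^2+b^2+c^2=1+1+1=3.$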

Just for the fun of it, note that without the constraint $a,b,c\in [0,\frac{3}{2}]\,$ (keeping only $a,b,c\ge 0\,$ and $a+b+c=3),\,$ the inequality to prove would be $a^2+b^2+c^2\le 9,\,$ with equality at the points $(3,0,0),\,$ $(0,3,0),\,$ $(0,0,3).$
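
Indeed, for nonnegative $a,b,c\,$ with $a+b+c=3,$

$a^2+b^2+c^2\le a^2+b^2+c^2+2(ab+bc+ca)=(a+b+c)^2=9,$

with equality exactly when two of the variables vanish.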

More interesting is, perhaps, the asymmetric case where $a,b,c\in [0,2].\,$ The inequality to prove appears to be $a^2+b^2+c^2\le 5,\,$ with equality at points $(2,1,0)\,$ and permutations.
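
The same geometric argument applies: the cross-section of the cube $[0,2]^3\,$ by the plane $a+b+c=3\,$ is a hexagon whose vertices are the six permutations of $(2,1,0),\,$ and the distance to the origin is maximized at these vertices, where

$a^2+b^2+c^2=4+1+0=5.$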

Solution 2

Since $a,b,c\in\left[0,\frac{3}{2}\right],\,$ there exist $\lambda_1,\lambda_2,\lambda_3\in [0,1]\,$ such that

$\displaystyle\begin{align} a &= \lambda_1\cdot 0+(1-\lambda_1)\frac{3}{2};\\ b &= \lambda_2\cdot 0+(1-\lambda_2)\frac{3}{2};\\ c &= \lambda_3\cdot 0+(1-\lambda_3)\frac{3}{2}. \end{align}$

The constraint rewrites as $\displaystyle\sum_{k=1}^3(1-\lambda_k)\frac{3}{2}=3,\;$ so that $\displaystyle\sum_{k=1}^3\lambda_k=1.\,$ Further, using Jensen's inequality,

$\displaystyle\begin{align} a^2+b^2+c^2 &= \sum_{k=1}^3\left[\lambda_k\cdot 0+(1-\lambda_k)\frac{3}{2}\right]^2\\ &\le 0^2\sum_{k=1}^3\lambda_k+\left(\frac{3}{2}\right)^2\sum_{k=1}^3(1-\lambda_k)\\ &=\frac{9}{4}\left(3-\sum_{k=1}^3\lambda_k\right)\\ &=\frac{9}{4}\cdot 2=\frac{9}{2}. \end{align}$
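
The inequality step is Jensen's inequality for the convex function $x\mapsto x^2,\,$ applied to each term separately:

$\displaystyle\left[\lambda_k\cdot 0+(1-\lambda_k)\frac{3}{2}\right]^2\le\lambda_k\cdot 0^2+(1-\lambda_k)\left(\frac{3}{2}\right)^2,\quad k=1,2,3.$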

Illustration

Distance inequality

Acknowledgment

The problem has been kindly posted at the CutTheKnotMath facebook page by Leo Giugiuc, with the comment "Almost new year happy. Beautiful and a little complicated." Solution 2 is by Marian Dinca. The illustration is by Nassim Nicholas Taleb.

 

Inequalities with the Sum of Variables as a Constraint


Copyright © 1996-2018 Alexander Bogomolny
