Problem: (2002 USAMO - Proposed, Gabriel Carroll Original) Let $n$ be an integer greater than $1$. Suppose that $n$ distinct positive integers are given, all less than $n^2$. Prove that there exist two of them whose greatest common divisor is less than $n$.
Solution: We will use two basic facts about divisors. First, $ \gcd(a,b) \mid (a-b) $: the greatest common divisor of $ a $ and $ b $ divides their difference, since $ \gcd(a,b) \mid a $ and $ \gcd(a,b) \mid b $. Second, if $ a \mid b $ and $ b $ is positive, then $ a \le b $, since $ b = ka $ for some integer $ k \ge 1 $.
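Both facts are easy to confirm numerically. Here is a quick sanity check in Python (the sample pairs are arbitrary; this is just a spot check, not part of the proof):

```python
from math import gcd

# Spot-check both facts on a few arbitrary pairs (a, b).
for a, b in [(18, 12), (35, 14), (100, 64)]:
    g = gcd(a, b)
    assert (a - b) % g == 0   # fact 1: gcd(a, b) divides a - b
    assert g <= a and g <= b  # fact 2 applied to g | a and g | b
print("both facts hold on the sample pairs")
```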
Suppose one of the numbers, $ m $, is less than $ n $. Let $ p $ be any of the other numbers. Then $ \gcd(m,p) \mid m \Rightarrow \gcd(m,p) \le m < n $ and we are done. So assume all the numbers are at least $ n $.
Partition the interval $ [n, n^2-1] $ into sets of $ n $ consecutive integers. We get
$ [n, 2n-1] $, $ [2n, 3n-1] $, $\ldots$, $ [n^2-n, n^2-1] $.
It's easy to see there are $ n-1 $ sets (the first begins with $ 1 \cdot n $ and the last with $ (n-1) \cdot n $). Since all $ n $ of our integers now lie in $ [n, n^2-1] $, the Pigeonhole Principle guarantees that two of them fall into the same set; call them $ q > r $.
We have $ 0 < q-r < n $, so $ \gcd(q,r) \mid (q-r) \Rightarrow \gcd(q,r) \le q-r < n $. Hence there exist two of the integers whose greatest common divisor is less than $ n $. QED.
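For the skeptical reader, the full statement can also be verified by brute force for small $ n $; the Python sketch below is purely illustrative and of course not a substitute for the proof:

```python
from itertools import combinations
from math import gcd

# Exhaustively check, for small n, that any n distinct positive
# integers below n^2 contain a pair whose gcd is less than n.
for n in range(2, 6):
    for nums in combinations(range(1, n * n), n):
        assert any(gcd(q, r) < n for q, r in combinations(nums, 2)), (n, nums)
print("statement verified for n = 2, 3, 4, 5")
```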
--------------------
Comment: This is a pretty interesting problem, combining basic number theory with a nice application of the Pigeonhole Principle. The key step is noticing that if two distinct numbers lie in an interval of $ n $ consecutive integers, then their greatest common divisor is less than $ n $. The numbers below $ n $ have to be handled separately (any such number trivially yields a gcd less than $ n $), so that only $ n-1 $ intervals remain and the Pigeonhole Principle applies.
--------------------
Practice Problem: Find all integer values $ a,b,c $ such that the equation $ ax+by = c $ has a solution $ (x,y) $ in the integers.