We will now consider some properties of the square roots of positive integers, beginning with one of the most famous examples, √2. Here is a surprisingly simple proof of irrationality, which we will later generalize to other cases.*
For the sake of contradiction, we will assume temporarily that √2 is a rational number, which we will denote a/b, where a and b are integers and b is nonzero. This means that:
a/b = √2
(a/b)² = 2
a²/b² = 2
a² = 2b²
Consider the number of factors of 2 in a. The number a² has exactly twice as many factors of 2 as a does, so a² has an even number of such factors (possibly zero). By the same reasoning, b² also has an even number of factors of 2. So on the left side of a² = 2b², we have an even number of factors of 2; on the right, we have an even number of factors of 2 from b², plus one extra from the coefficient 2, giving an odd number in total. This means that the same number has both an even and an odd number of factors of 2, which is simply impossible. Therefore, by contradiction, the square root of 2 is irrational.
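The parity argument above can be checked numerically. The sketch below (a minimal illustration, not part of the proof) counts factors of 2 and confirms that a² always carries an even count while 2b² always carries an odd one:

```python
def count_factors(n, p):
    """Count how many times the prime p divides n."""
    count = 0
    while n % p == 0:
        n //= p
        count += 1
    return count

for a in range(1, 100):
    # a^2 has exactly twice as many factors of 2 as a: an even number.
    assert count_factors(a * a, 2) == 2 * count_factors(a, 2)
    # 2*a^2 has one extra factor of 2: an odd number.
    assert count_factors(2 * a * a, 2) % 2 == 1
```

Since a² = 2b² would force one number to have both parities at once, the loop's two assertions can never both point at the same value.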
It is interesting to note that this proof can be adapted to other primes with only minor changes to the argument: in each case, we reach a contradiction about the number of some prime factor in the result. Furthermore, the proof also works for composite numbers if we break them down into a product of prime factors and account for the count of each of those factors. Considering the square root of 6, for example, we see that a² = 2 * 3 * b², so the same number would have both an odd and an even number of factors of 2 (and likewise of 3). (Perfect-square cases such as the square root of 4 fail because we cannot reach a contradiction about the number of any prime factor. In this particular case, we have a² = 4b², giving an even number of factors of 2 on each side, so there is no such contradiction.)
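The same kind of numerical check illustrates both the composite case and the perfect-square exception. This sketch (again only an illustration) shows the obstruction for c = 6 and its absence for c = 4:

```python
def count_factors(n, p):
    """Count how many times the prime p divides n."""
    count = 0
    while n % p == 0:
        n //= p
        count += 1
    return count

for b in range(1, 50):
    # c = 6: 6*b^2 always has an odd number of factors of 2 and of 3,
    # so it can never equal a perfect square a^2.
    assert count_factors(6 * b * b, 2) % 2 == 1
    assert count_factors(6 * b * b, 3) % 2 == 1
    # c = 4: 4*b^2 keeps an even count of factors of 2 -- no obstruction,
    # consistent with √4 = 2 being rational.
    assert count_factors(4 * b * b, 2) % 2 == 0
```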
It is reasonable to believe that the method above can show that the square root of any integer other than a perfect square is irrational. However, I have also created a general proof that covers all of these cases together, again by contradiction. We will assume temporarily that some integer c that is not a perfect square has as its square root a rational number a/b in simplest form, meaning that a and b have no common factors other than 1. This implies that:
(a/b)² = c
a²/b² = c
For a/b to be the square root of the integer c, this fraction must reduce to an integer. However, because a and b are relatively prime, a² and b² are also relatively prime (squaring a number introduces no new prime factors), so the only common factor of the numerator and denominator is 1. It follows that the only way for the fraction to be an integer is for b² to equal 1, and hence b = 1. But then a/b is an integer, making c = a² a perfect square and contradicting our assumption. Therefore, unless an integer is a perfect square, its square root is irrational. With generalization, this proof can also be extended to cover any nth root of an integer for any integer n > 1: unless an integer c is a perfect nth power, its nth root is irrational.
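The key lemma here is that squaring preserves coprimality. A quick sketch with the standard-library gcd confirms it over a range of pairs:

```python
from math import gcd

# If gcd(a, b) == 1, then gcd(a^2, b^2) == 1 as well:
# squaring repeats each number's prime factors without adding new ones,
# so the fraction a^2/b^2 is already in lowest terms.
for a in range(1, 40):
    for b in range(1, 40):
        if gcd(a, b) == 1:
            assert gcd(a * a, b * b) == 1
```

This is exactly why a²/b² can only be an integer when b = 1: a fraction in lowest terms with denominator greater than 1 is never an integer.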
*This is not the only possible proof of this fact; other methods do exist. We can, for example, use the polynomial x² − 2 and the Rational Root Theorem (see "Rationality or Irrationality of Polynomial Roots") to demonstrate irrationality. This polynomial has the positive and negative square roots of 2 as roots (by applying the difference-of-squares factoring pattern a² − b² = (a + b)(a − b)). However, when we apply the Rational Root Theorem to the original polynomial, we see that the only possible rational roots are 1/1, 2/1, and their opposites, none of which equals √2: 1² = (−1)² = 1 and 2² = (−2)² = 4. Then, by contradiction, √2 is irrational.
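The footnote's candidate check can be sketched directly. Since any rational root p/q of x² − 2 (in lowest terms) must have p dividing the constant term 2 and q dividing the leading coefficient 1, the only candidates are ±1 and ±2, and none of them is a root:

```python
from fractions import Fraction

# Rational Root Theorem candidates for x^2 - 2:
# numerator divides 2, denominator divides 1.
candidates = [Fraction(p, 1) for p in (1, -1, 2, -2)]

# None of the candidates satisfies x^2 - 2 = 0,
# so the polynomial has no rational roots at all.
assert all(x * x - 2 != 0 for x in candidates)
```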