
Euclid, anyone?

Assume the square root of 2 is rational. Then it can be expressed as a fraction, and this fraction is reducible to a unique expression of the form x/y, with x and y both whole numbers and with no whole number z greater than 1 such that z divides x and z divides y.

Since √2 = x/y
then 2 = x²/y²
and 2y² = x².

Then 2 divides x² (i.e., x² is even), and x is even. (If x were odd, x² would be odd.)
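
To spell that parenthetical out: if x were odd, we could write x = 2k + 1 for some whole number k, and then

x² = (2k + 1)² = 4k² + 4k + 1 = 2(2k² + 2k) + 1,

which is odd.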

Define x₁ such that x = 2x₁.

Then 2y² = (2x₁)² = 4x₁²
and y² = 2x₁².

Then 2 divides y², and therefore 2 divides y (by the same reasoning as above).

Then both x and y are divisible by 2. I.e., there is a whole number z greater than 1 (namely, 2) that divides both x and y, contradicting the assumption above.

Therefore the square root of 2 is irrational.
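
(An aside for the formally inclined: this theorem can nowadays be checked by machine. Below is a minimal sketch in Lean 4, assuming the Mathlib library is available; if I have the name right, Mathlib records this exact theorem as irrational_sqrt_two.)

    -- A sketch, assuming Lean 4 with the Mathlib library on hand.
    import Mathlib

    -- Mathlib proves this theorem under the name `irrational_sqrt_two`;
    -- this `example` asks Lean to confirm it matches the statement above:
    -- the real square root of 2 is irrational.
    example : Irrational (Real.sqrt 2) := irrational_sqrt_two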

(For my cousin Mark, who didn't think terribly much of it.)

September 27, 2002 12:58 AM

Comments (and TrackBacks)

One of my top 3 arguments. Evah.

Posted by Graham on September 27, 2002 8:52 AM

What are the other two? Please tell!

Posted by M on September 27, 2002 12:23 PM

Cantor's Diagonal proof and Euclid's proof of the infinitude of primes. Easy. After that they start to get obscure.

Posted by Graham on September 29, 2002 3:22 PM

Ah, so. My friend was a substitute teacher for a year and would teach Cantor's diagonal proof to 7th and 8th grade classes -- I always thought that alone was sufficiently good reason to go be a sub. And it was what I was going to show my cousin next, but then he went and acted all bored on me. I'll see what he thinks of it some other time....

Posted by M on September 29, 2002 5:01 PM

I often show my calc students Euclid's proof. Some of them get pretty excited - lots get horribly confused about proof by contradiction.

Posted by Graham on September 29, 2002 6:53 PM

A learned friend once told me that the Greeks would keep the 'root 2 is irrational' proof from all but their most advanced students because they thought it was a loophole in mathematics that would challenge all their teaching. Since they had great difficulty conceiving of a number that was neither odd nor even, only the most gifted were let into the secret of such numbers' existence.

Posted by umop apisdn on September 30, 2002 10:50 AM

lots get horribly confused about proof by contradiction.

That is a healthy confusion to have, though. On some level, reductio is (of course) highly suspicious; it's not really clear that you're actually talking about anything when you make an assumption that can't possibly be true....

Posted by M on September 30, 2002 2:57 PM

it's not really clear that you're actually talking about anything when you make an assumption that can't possibly be true....

What you need is one of two things: A rock-hard confidence that everything is possible until we prove otherwise (there are some interesting points here about the role of temporality in mathematics, but nevermind), or a willingness to assign the same ontological status to math as to unicorns. In the second case, think about sentences like, "Unicorns are like horses with horns." In one sense, you're not "talking about anything" — in another, what you said is (almost) tautological, so darn well better be true.

An ability to tolerate this sort of slipperiness is, in my experience, a major stumbling block between passing your calculus class and being a math major.

Posted by Graham on September 30, 2002 3:29 PM

I'd like to retract that last sentence above on the grounds that (a) it makes me sound like an arrogant ass; and (b) I'm not really sure it's true. Thank you.

Posted by Graham on September 30, 2002 4:19 PM

It's an interesting idea, though. The slipperiness isn't so much mathematical as it is philosophical -- it only arises if you ask what your doing actually means. Maybe philosophical rigor -- unless you're remarkably good at resolving the inevitable conundrums -- really does interfere with one's ability to do mathematics.

(Of course, in practice, it could just as well be that it's a lack of philosophical rigor that's the problem....)

P.S. Your remark didn't sound arrogant to me; what you observe is what you observe.

Posted by M on September 30, 2002 7:38 PM

You're right - as long as you don't look down, you don't fall into the canyon. We should all learn life's lessons from Wile E. Coyote. Most working mathematicians regard ontological debates about mathematical objects as a waste of time and, even more, as an impediment to working. Just don't look down.

I tried to be careful in that potentially arrogant comment to say "tolerate", rather than "resolve", or the like. If you spend too much time thinking about the slipperiness, you find yourself wanting to do an entirely different kind of (meta)mathematics. That's a lovely thing, and I'm glad there are people doing it, but part of the reason I'm glad about it is that it means I don't have to. I'm much happier playing with my rings and things of doubtful ontological status.

Posted by Graham on October 1, 2002 11:38 AM

Ah, I am jealous. I loved algebra and analysis and topology, but in the end their beauty wasn't enough for me. Not that I quite understand why; I still haven't figured out just what it is I'm looking for.

Posted by M on October 2, 2002 12:39 AM

i would like to state for the record that i love both of you.

Posted by k on October 6, 2002 3:07 AM



