I cringe when hearing "Math teaches you to think".
It's a well-meaning but ineffective appeal that only satisfies existing fans (see: "Reading takes you anywhere!"). What activity, from crossword puzzles to memorizing song lyrics, doesn't help you think?
Math seems different, and here's why: it's a specific, powerful vocabulary for ideas.
Imagine a cook who only knows the terms "yummy" and "yucky". He makes a bad meal. What's wrong? Hrm. There's no way to describe it! Too mild? Salty? Sweet? Sour? Cold? These specific critiques become hazy variations of the "yucky" bucket. He probably wouldn't think "Needs more umami".
Words are handholds that latch onto thoughts. You (yes, you!) think with extreme mathematical sophistication. Your common-sense understanding of quantity includes concepts refined over millennia (zero, decimals, negatives).
What we call "Math" are just the ideas we haven't yet internalized.
Let's explore our idea of quantity. It's a funny notion, and some languages only have words for one, two and many. They never thought to subdivide "many", and you never thought to refer to your East and West hands.
Here's how we've refined quantity over the years:
- We have "number words" for each type of quantity ("one, two, three... five hundred seventy nine")
- The "number words" can be written with symbols, not regular letters, like lines in the sand. The unary (tally) system has a line for each object.
- Shortcuts exist for large counts (Roman numerals: V = five, X = ten, C = hundred)
- We even have a shortcut to represent emptiness: 0
- The position of a symbol is a shortcut for other numbers. 123 means 100 + 20 + 3.
- Numbers can have incredibly small, fractional differences: 1.1, 1.01, 1.001...
- Numbers can be negative, less than nothing (Wha?). This represents "opposite" or "reverse", e.g., negative height is underground, negative savings is debt.
- Numbers can be 2-dimensional (or more). This isn't yet commonplace, so it's called "Math" (scary M).
- Numbers can be undetectably small, yet still not zero. This is also called "Math".
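The notations in the list above can be sketched in a few lines of code. This is my own illustration, not from the article: the same quantity rendered as tally marks, Roman numerals, and a positional expansion like "123 means 100 + 20 + 3".

```python
def unary(n):
    """Tally system: one line in the sand per object."""
    return "|" * n

def roman(n):
    """Roman numerals: shortcut symbols for larger counts."""
    symbols = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
               (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
               (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]
    result = ""
    for value, glyph in symbols:
        while n >= value:   # greedily take the largest shortcut available
            result += glyph
            n -= value
    return result

def positional(n):
    """Positional notation: each digit's place stands for a power of ten."""
    digits = str(n)
    return " + ".join(str(int(d) * 10 ** (len(digits) - i - 1))
                      for i, d in enumerate(digits) if d != "0")

print(unary(5))        # |||||
print(roman(27))       # XXVII
print(positional(123)) # 100 + 20 + 3
```

Notice how much work each refinement saves: the tally version of 2024 is two thousand strokes, the Roman version is a handful of symbols, and the positional version is four digits.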
Our concept of numbers shapes our world. Why do ancient years go from BC to AD? We needed separate labels for "before" and "after", which weren't on a single scale.
Why did the stock market set prices in increments of 1/8 until 2000 AD? The system was built on centuries-old conventions. Ask a modern trader if they'd rather go back.
Why is the decimal system useful for categorization? You can always find room for a new decimal between any two others, so you can progressively classify an item (1, 1.3, 1.38, 1.386).
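That classification trick rests on a simple fact: between any two distinct decimals there is always another, so a category can be refined forever without renumbering its neighbors. A minimal sketch (my example, using exact fractions to avoid floating-point noise):

```python
from fractions import Fraction

def between(a, b):
    """Exact midpoint of two decimal values -- always a new number
    strictly between them, however close they already are."""
    return (Fraction(a) + Fraction(b)) / 2

# Refine a category between 1.38 and 1.39:
print(between("1.38", "1.39"))  # 277/200, i.e. 1.385
```

Library classification systems like Dewey Decimal exploit exactly this: a new subtopic slots in between existing codes without disturbing them.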
Why do we accept the idea of a vacuum, empty space? Because you understand the notion of zero. (Maybe true vacuums don't exist, but you get the theory.)
Why is anti-matter or anti-gravity palatable? Because you accept that positives could have negatives that act in opposite ways.
How could the universe come from nothing? Well, how can 0 be split into 1 and -1?
Our math vocabulary shapes what we're capable of thinking about. Multiplication and division, which eluded geniuses a few thousand years ago, are now homework for grade schoolers. All because we have better ways to think about numbers.
We have decent knowledge of one noun: quantity. Imagine improving our vocabulary for structure, shape, change, and chance. (Oh, I mean, the important-sounding Algebra, Geometry, Calculus and Statistics.)
Caveman Chef Og doesn't think he needs more than yummy/yucky. But you know it'd blow his mind, and his cooking, to understand sweet/sour/salty/spicy/tangy.
We're still cavemen when thinking about new ideas, and that's why we study math.