
Adding-up rules and Hockey Sticks

July 7, 2011

So I’m at the math program HCSSiM, teaching for three weeks in a “workshop,” which means I am responsible for teaching 12 teenagers the basic language and techniques of math: things like induction, proof by contradiction, the pigeon-hole principle, and how to correctly use phrases like “without loss of generality we can assume…” and “the following is a well-defined function…”, as well as familiarity with basic group theory, graph theory, number theory, cardinality, and fun things like Pascal’s triangle.

It’s really beautiful, classical math, and the students are eager and fantastically bright. They are my temporary brood, and I adore them and feed them chocolate at evening problem sets.

It’s also a fine opportunity to do some silly math doodling just for fun, the only rules being you can’t use a computer to look anything up until you’re done, and you can only use the stuff your kids at the program already learned. I’m going to describe what my mom and I, and then a junior (Amber Verser) and senior (Benji Fisher) staff member at the math program, figured out in the last couple of days. It’s super cool and turns out to be at least 400 years old.

One of the most common examples of proof by induction is the formula for the sum of the counting numbers up to n:

1 + 2 + 3 + … + n = n(n+1)/2

And then, once you figure that out, you move on to the next case:

1^2 + 2^2 + 3^2 + … + n^2 = n(n+1)(2n+1)/6.

If you’re really into it, you can put the next case on the problem set:

1^3 + 2^3 + 3^3 + … + n^3 = (n(n+1)/2)^2.
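If you want to convince yourself of these three formulas before proving anything, here’s a quick brute-force check in Python (just a sketch, not part of the argument):

```python
# Check the three closed forms against a brute-force sum of powers.
def power_sum(d, n):
    """Brute-force 1^d + 2^d + ... + n^d."""
    return sum(i**d for i in range(1, n + 1))

for n in range(1, 50):
    assert power_sum(1, n) == n * (n + 1) // 2
    assert power_sum(2, n) == n * (n + 1) * (2 * n + 1) // 6
    assert power_sum(3, n) == (n * (n + 1) // 2) ** 2
```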

Two patterns emerge when you add up successive dth powers up to n.

  1. It’s a polynomial of degree d+1, and
  2. The roots of the polynomial are symmetric about -1/2 (mom noticed this!).

How do you prove those two facts?

If you think it’s totally easy, stop reading now and give it a shot. There are about a million things you could try and none of them seem to work. I’ll wait.

…okay, let’s say you gave up, or already know, or don’t care. (Why are you still reading if you don’t care?!)

First, let’s generalize the question: if we add up the values of some degree d polynomial at i = 0, 1, 2, …, n, we want to prove the result is a degree d+1 polynomial in n. That this is equivalent to the first statement above is pretty easy to see by re-arranging the terms of the double sum over i and over the terms of the polynomial in question. But it still seems like you need to know at least a formula for 0^d + 1^d + 2^d + … + n^d, which is of course where we started.
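One way to sanity-check the generalized claim numerically: if the cumulative sums of a degree d polynomial really lie on a degree d+1 polynomial in n, their (d+2)nd finite difference must vanish. A small sketch, with an arbitrary made-up cubic:

```python
# If sums[n] = p(0) + ... + p(n) is a degree d+1 polynomial in n,
# applying the difference operator d+2 times must give all zeros.
def diffs(seq):
    return [b - a for a, b in zip(seq, seq[1:])]

def p(x):
    return 2 * x**3 - 5 * x + 7   # an arbitrary degree-3 example, so d = 3

sums, running = [], 0
for i in range(12):
    running += p(i)
    sums.append(running)          # sums[n] = p(0) + ... + p(n)

seq = sums
for _ in range(5):                # difference d+2 = 5 times
    seq = diffs(seq)
assert all(x == 0 for x in seq)
```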

But that’s where Pascal’s triangle comes in! We can generate Pascal’s triangle by the familiar rule “add up two consecutive numbers and put the answer below,” but we can also think of the element in the nth row and kth (tilted) column of Pascal’s triangle as the number of ways to choose k things from n things, which is referred to as “n choose k,” where we start both the row and column counts at 0, not at 1. That definition satisfies the addition law because, if we have n things, we can label one as “special,” and then the size k subsets of the n things divide into two categories: the ones that contain the special guy and the ones that don’t. If they do, then we need only find k-1 other things in the remaining n-1 size set, and the number of ways to do that is given by the element in row n-1 and column k-1. If they don’t contain the special guy, we need to find k things in the remaining n-1 size set, and the number of ways to do that is given by the element in row n-1 and column k.

On the other hand, we also know a formula for the numbers in Pascal’s triangle: the guy in the nth row and kth column is given by n!/(k!(n-k)!), which is a degree k polynomial in n. (This is because we can label all of the guys 1 through n, and just take the first k guys, and there are n! ways to label n things, but we don’t actually care about the order among the first k or among the last n-k.)

For example, in the second column, where we are looking at “n choose 2” for various n, we have the formula n(n-1)/2. This is a LOT like n^2, just with extra lower-order terms. In the third column, you’re working with the formula n(n-1)(n-2)/6, which is like n^3 with extra stuff. In other words, the formula for “n choose k” is a degree k polynomial in n which we can think of as a stand-in for n^k. Awesome.
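The addition rule and the factorial formula are easy to play off against each other. Here’s a sketch that builds the triangle from the addition rule, compares it to Python’s built-in `math.comb`, and checks the column formulas above:

```python
import math

# Build Pascal's triangle from the addition rule alone, then check each
# entry against the factorial formula n!/(k!(n-k)!) via math.comb.
def pascal(rows):
    tri = [[1]]
    for n in range(1, rows):
        prev = tri[-1]
        tri.append([1] + [prev[k - 1] + prev[k] for k in range(1, n)] + [1])
    return tri

for n, row in enumerate(pascal(12)):
    for k, entry in enumerate(row):
        assert entry == math.comb(n, k)

# The kth column really is a degree k polynomial in n:
for n in range(3, 12):
    assert math.comb(n, 2) == n * (n - 1) // 2
    assert math.comb(n, 3) == n * (n - 1) * (n - 2) // 6
```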

The last ingredient is something called the “Hockey Stick Theorem,” which you gotta love just because of the name. It states that if we add up the values in a column, from the top of the column down to the nth row, then the sum will be the number just below and to the right, and the whole picture will resemble a hockey stick.

The proof of the Hockey Stick Theorem is trivial: the answer is of course the sum of the two entries above it, and we have one in the sum already, but the other isn’t… but that other one is the sum of the two above it, one of which is again already in the sum but the other isn’t… and you keep going until you get to the top edge of Pascal’s triangle, where the missing number is just 0.
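The theorem is also easy to test numerically; a quick sketch (the helper name is mine):

```python
import math

# Hockey Stick Theorem: summing column k of Pascal's triangle from row k
# down to row n gives the entry at row n+1, column k+1.
def hockey_stick_sum(k, n):
    return sum(math.comb(i, k) for i in range(k, n + 1))

for k in range(6):
    for n in range(k, 15):
        assert hockey_stick_sum(k, n) == math.comb(n + 1, k + 1)
```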

Why does the Hockey Stick Theorem give us what we want? Going back to our generalized statement, we want to show the sum of the values of any degree d polynomial at i = 0, 1, 2, …, n is a degree d+1 polynomial. Well, use the dth column and make a hockey stick from the top down to row n. Then the sum is in the (n+1)st row, in the (d+1)st column, which we know is a degree d+1 polynomial in n. (Strictly, this handles the polynomials “i choose d”; linearity takes care of the rest, as we’ll see next.) Woohoo!

One way of looking at this is that we were actually asking the wrong question: instead of asking what the sum of the dth powers is, we should perhaps have been asking what the sum of the dth column of Pascal’s triangle is. In other words, there is a better basis for the vector space of polynomials than the x^d, namely the “x choose d”. In fact, if there were an agreement in the world that the “x choose d” polynomials should actually be the standard basis (by the way, these basis polynomials would be called “Pascalinomials”!), then the hockey stick theorem would be the last word on how those things add up. As it stands, to figure out the actual formula for the sum of the dth powers for i = 0, 1, 2, …, n, we need to write down the relevant row of the change-of-basis matrix from one basis to the other.
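To make the change of basis concrete: the coefficients expressing x^d in the “x choose j” basis are j! times the Stirling numbers of the second kind (a standard fact, not something derived in this post), so the hockey stick immediately hands you a sum-of-powers formula. A sketch, with helper names of my own choosing:

```python
import math

# x^d = sum_j S(d, j) * j! * C(x, j), where S is a Stirling number of the
# second kind; summing each column with the hockey stick then gives
# sum_{i=0}^{n} i^d = sum_j S(d, j) * j! * C(n+1, j+1).
def stirling2(d, j):
    """Stirling number of the second kind, via its recurrence."""
    if d == j:
        return 1
    if j == 0 or j > d:
        return 0
    return j * stirling2(d - 1, j) + stirling2(d - 1, j - 1)

def power_sum_via_hockey_stick(d, n):
    return sum(stirling2(d, j) * math.factorial(j) * math.comb(n + 1, j + 1)
               for j in range(1, d + 1))

for d in range(1, 6):
    for n in range(12):
        assert power_sum_via_hockey_stick(d, n) == sum(i**d for i in range(n + 1))
```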

As for the second pattern, the symmetry of the roots, we simply need to extend the definition of F(n), the sum of dth powers from 0 to n, to the case where n is negative, by iteratively using the relation:

F(n) = F(n-1) + n^d, or equivalently

F(n-1) = F(n) - n^d.

Then we have F(0) = 0, F(-1) = 0, F(-2) = (-1)^(d+1), F(-3) = (-1)^(d+1) - (-2)^d = (-1)^(d+1)(1^d + 2^d), …, and it’s easy to prove that, for any n,

F(n) = (-1)^(d+1)F(-n-1).

This means that if -1/2 + a is a root, then so is -1/2 - a = -(-1/2 + a) - 1.
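The recursion and the symmetry are easy to check numerically; a sketch:

```python
# Extend F, the sum of dth powers from 0 to n, to negative n via
# F(m-1) = F(m) - m^d, then check the symmetry F(n) = (-1)^(d+1) F(-n-1).
def F(d, n):
    if n >= 0:
        return sum(i**d for i in range(n + 1))
    total = 0
    for m in range(0, n, -1):   # step down from F(0) = 0: m = 0, -1, ..., n+1
        total -= m**d
    return total

for d in range(1, 6):
    for n in range(10):
        assert F(d, n) == (-1) ** (d + 1) * F(d, -n - 1)
```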

Categories: math education
  1. Lior
    July 7, 2011 at 7:41 pm

    The proof using the basis “x choose d” is the “right” one, but there are others. Here’s one which may also be suitable for your group of students, based on integration being inverse to differentiation: define a map on polynomials by $(D(f))(x)=f(x)-f(x-1)$. [The matrix of this is triangular in the standard basis of monomials so] it’s not hard to check by induction on degree that $D$ maps polynomials of degree $d$ surjectively onto polynomials of degree $d-1$ [as long as $d!$ is invertible in the base ring].


  2. July 8, 2011 at 12:29 pm

    Lovely stuff.

    There’s a blog that has a lot of the same mathematical taste as this one at

    http://patternsinpractice.wordpress.com/

    Right now, there’s a daily posting of the problem sets that are being used in the summer course for teachers at PCMI.

    Al Cuoco

