A Probability Problem That Uses No Probability: Part II

Yesterday I deprived you of a derivation, and now I plug that hole. On reflection, I much prefer the method of Michael Penn, who sometimes spends half a video arming himself with a set of 'tools' before facing the task at hand, deploying his arsenal where necessary. This feels more pedagogically elegant, and in future, posts "like this" will precede posts "like yesterday's".

We are trying to prove that a certain alternating series is equal to the natural logarithm of 2, namely

\[ 1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \cdots = \sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n} = \ln 2. \]

The following derivation (I won't use the word proof, as addressing issues of convergence would detract from the plot) contains some, for lack of a better phrase, magic tricks.


\[ \sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n} = \sum_{n=1}^{\infty} (-1)^{n+1} \left[ \frac{x^n}{n} \right]_0^1 \]

This is a clever rewriting: the feeling of substituting the bounds and subtracting fractions might trigger a memory of computing the definite integral of a polynomial.


\[ \sum_{n=1}^{\infty} (-1)^{n+1} \left[ \frac{x^n}{n} \right]_0^1 = \sum_{n=1}^{\infty} (-1)^{n+1} \int_0^1 x^{n-1} \, dx = \int_0^1 \sum_{n=1}^{\infty} (-1)^{n+1} x^{n-1} \, dx \]

Then the Fundamental Theorem of Calculus allows us to write this back in terms of an integral, and we quietly slip the infinite sum inside the integral sign.
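The identity behind this step, that the integral of \(x^{n-1}\) over \([0, 1]\) is exactly \(1/n\), is easy to sanity-check numerically. Here is a quick sketch in Python; the `midpoint_integral` helper is my own illustration, not anything from the derivation itself.

```python
# Check numerically that the integral of x^(n-1) over [0, 1] equals 1/n,
# the Fundamental Theorem of Calculus step in the derivation.

def midpoint_integral(f, a, b, steps=100_000):
    """Approximate the integral of f over [a, b] with the midpoint rule."""
    h = (b - a) / steps
    return sum(f(a + (i + 0.5) * h) for i in range(steps)) * h

for n in range(1, 6):
    approx = midpoint_integral(lambda x, n=n: x ** (n - 1), 0.0, 1.0)
    print(f"n={n}: integral ≈ {approx:.6f}, 1/n = {1.0 / n:.6f}")
```

The two columns agree to the accuracy of the midpoint rule, which is all the reassurance we need here.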


\[ \sum_{n=1}^{\infty} (-1)^{n+1} x^{n-1} = 1 - x + x^2 - x^3 + \cdots \]

If we've encountered any infinite series then it's the geometric series, and we notice this fits the bill with first term 1 and common ratio -x!


\[ 1 - x + x^2 - x^3 + \cdots = \frac{1}{1 - (-x)} = \frac{1}{1 + x}, \qquad |x| < 1 \]

And luckily we have a nice formula for an infinite geometric series: the first term divided by one minus the common ratio.
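For a value of x inside the radius of convergence, the partial sums really do crowd around the closed form; a minimal sketch (the `geometric_partial_sum` name is mine, purely for illustration):

```python
# Compare partial sums of the geometric series 1 - x + x^2 - x^3 + ...
# against the closed form 1/(1+x), for a sample x with |x| < 1.

def geometric_partial_sum(x, terms):
    """Sum the first `terms` terms of 1 - x + x^2 - x^3 + ..."""
    return sum((-x) ** n for n in range(terms))

x = 0.5
closed_form = 1.0 / (1.0 + x)
for terms in (5, 10, 20):
    print(f"{terms} terms: {geometric_partial_sum(x, terms):.10f} vs {closed_form:.10f}")
```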


\[ \int_0^1 \frac{dx}{1 + x} = \Big[ \ln(1 + x) \Big]_0^1 = \ln 2 - \ln 1 = \ln 2 \]

And this is an integral that hopefully looks familiar! We can integrate directly, substitute the bounds and voilà!

While the first steps might have felt a little out of the blue, let's review the structure of the argument. We had an infinite series, but the only series we know how to sum is the geometric series. So we transform our series into a geometric series inside an integral, apply our trusty formula, then pull the result out of the integral by computing an antiderivative.

But if we're going to go to the effort of learning a trick, what's the fun in only using it once? I hear you scream, "But Fin, can we calculate the values of any other series with this method?!" I'm so glad you asked. I include below a similar use, going in the other direction this time; the explanations of the steps are identical to the above:


\[ \frac{\pi}{4} = \arctan 1 = \int_0^1 \frac{dx}{1 + x^2} = \int_0^1 \left( 1 - x^2 + x^4 - x^6 + \cdots \right) dx = 1 - \frac{1}{3} + \frac{1}{5} - \frac{1}{7} + \cdots \]

Again: clever rewriting, Fundamental Theorem of Calculus, geometric series formula, interchange of integral and infinite sum with no justification. These results feel like two sides of the same coin: is there any way we could have derived both in one fell swoop? We can, using complex numbers. It feels a little circular, since we need the Taylor series of the natural logarithm, something we haven't used here, but we can derive both results by calculating the complex logarithm of the number 1 + i in two different ways. Firstly we use Euler's formula to write 1 + i as \sqrt{2}\,e^{i\pi/4}, and then we use the Taylor expansion of \log(1 + z) at z = i and compare real and imaginary parts: like any wise and economical professor, I leave this as an exercise.
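Both series values, and the complex-logarithm connection between them, can be checked numerically with nothing but the Python standard library. A quick sketch (partial sums converge slowly, so the agreement is only to a few decimal places):

```python
import cmath
import math

# Partial sums of the two alternating series derived above.
ln2_series = sum((-1) ** (n + 1) / n for n in range(1, 100_001))
pi4_series = sum((-1) ** n / (2 * n + 1) for n in range(100_000))

print(ln2_series, math.log(2))   # alternating harmonic series vs ln 2
print(pi4_series, math.pi / 4)   # Leibniz series vs pi/4

# The complex logarithm ties the two together:
# log(1 + i) = ln(sqrt(2)) + i*pi/4, i.e. real part ln(2)/2, imaginary part pi/4.
z = cmath.log(1 + 1j)
print(z.real, math.log(2) / 2)
print(z.imag, math.pi / 4)
```

The real and imaginary parts of log(1 + i) are exactly the two constants our series converge to (up to that factor of 2 in the real part), which is the "one fell swoop" in numerical form.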
This post has been a bit more technical than previous ones but I promise something more philosophical about measure theory tomorrow!



