Wednesday, December 2, 2015

Infinitesimals

So now we get to the bit that looks kind of weird to many mathematicians, as opposed to just being weird to laypeople.
One of the things we take for granted about the real numbers, or indeed about the elements of any field, is that if \(b\) is not 0 and \(ab = 0\), then \(a\) has to be 0. The technical jargon is that a field has no "zero-divisors".
For our purposes, we're going to break this rule. We're going to define what are called "infinitesimals". We say that a number \(s\) is infinitesimal if \(s\) is not equal to 0, but for some positive integer \(n\), \(s^n = 0\). This isn't true for any real or complex number, so infinitesimals are necessarily a new thing.
We'll assume for all of our infinitesimals here that \(s^2 = 0\). We could have infinitesimals where \(s^3 = 0\) but \(s^2 \neq 0\), but we don't need them for what I want to do. So now we explicitly define an infinitesimal to be an object \(s\) where \(s \neq 0\) but \(s^2 = 0\).
If \(s\) is an infinitesimal and \(a\) is a nonzero real number, then \(as\) is also an infinitesimal. We can add infinitesimals to regular numbers to get things like \(a + bs\), and we can multiply such sums:
$$(a + bs)(x + ys) = ax + ays + bsx + bsys = ax + (ay + bx)s$$ Here we've decided that for \(s\) an infinitesimal, \(as = sa\) just like with regular numbers. We also say that for two infinitesimals \(s\) and \(t\), \(st = ts\). Just to make life simpler.
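To see this arithmetic in action, here's a minimal sketch in Python of numbers of the form \(a + bs\), for a single infinitesimal \(s\) with \(s^2 = 0\). The class name `Dual` and the attribute names are my own choices for illustration; nothing here is part of the construction above.

```python
# A sketch of arithmetic with a single infinitesimal s (s*s = 0),
# representing a + b*s as the pair (a, b).
class Dual:
    def __init__(self, a, b=0):
        self.a = a  # the "regular" part
        self.b = b  # the coefficient of s

    def __add__(self, other):
        return Dual(self.a + other.a, self.b + other.b)

    def __sub__(self, other):
        return Dual(self.a - other.a, self.b - other.b)

    def __mul__(self, other):
        # (a + bs)(x + ys) = ax + (ay + bx)s, since the s^2 term vanishes
        return Dual(self.a * other.a, self.a * other.b + self.b * other.a)

    def __repr__(self):
        return f"{self.a} + {self.b}s"

print(Dual(2, 3) * Dual(5, 7))  # 10 + 29s, i.e. 2*5 + (2*7 + 3*5)s
```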
Also note that if \(s\) is an infinitesimal and \(t\) is an infinitesimal, then \(st\) is either infinitesimal or 0. We'll say that \(s\) and \(t\) are independent infinitesimals if they're both infinitesimal and \(st\) is not 0. I don't think this is standard convention, but I need a term for this.
So let's talk about calculus for a moment.
Consider the function \(f(x) = x^n\). What happens if we evaluate \(f(x + s)\)?
Recall the binomial theorem:
$$(x + y)^n = x^n + \binom{n}{1}x^{n-1}y + \binom{n}{2}x^{n-2}y^2 + \ldots + y^n$$ Plugging in \(s\) for \(y\) and recalling that \(s^2 = 0\), we get that
$$(x + s)^n = x^n + nx^{n-1}s.$$ This looks like \(f(x)\) plus \(f'(x)s\). Extending to polynomials and then to Taylor series tells us that for functions that are equal to their Taylor series,
$$f(x+s) = f(x) + f'(x)s.$$ Hence our infinitesimals allow us to take derivatives!
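Continuing the `Dual` sketch from above (still just an illustration), evaluating a polynomial at \(x + s\) does exactly this: the coefficient of \(s\) comes out as the derivative.

```python
# Evaluate f(u) = u^3 - 2u at 4 + s; the result is f(4) + f'(4)s.
def f(u):
    return u * u * u - Dual(2) * u

print(f(Dual(4, 1)))  # 56 + 46s, and indeed f(4) = 56, f'(4) = 3*16 - 2 = 46
```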
Perhaps more reassuringly, we can rewrite the previous expression as
$$f'(x) = \frac{f(x+s) - f(x)}{s}$$ where we have to be a bit careful about dividing by an infinitesimal; we can only do it because \(f(x+s)-f(x)\) is a multiple of \(s\); dividing a regular number by \(s\) gets us into trouble.
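In the same sketch, that careful division amounts to this: only an element of the form \(bs\) may be divided by \(s\), and the quotient is just \(b\). The `divide_by_s` helper below is, again, purely illustrative.

```python
# "Divide by s": defined only for pure multiples of s, i.e. 0 + b*s.
def divide_by_s(d):
    if d.a != 0:
        raise ValueError("only a multiple of s can be divided by s")
    return d.b

x, fx = Dual(4, 1), f(Dual(4, 0))
print(divide_by_s(f(x) - fx))  # 46, recovering f'(4)
```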
We'll use this trick of using infinitesimals to mimic derivatives several times; every time we do, we could have used ordinary derivatives instead, but that would have led to a more complicated setup. We can do this only because I'm going to be doing everything algebraically: no inequalities, no limits, no integrals, only these derivatives that we get from infinitesimals. Occasionally this will mean that I leave out some conditions necessary for various things to exist, instead just assuming that we're in whatever case is needed for those things to exist. I'll try to note when that happens.
Anyway, for the nonce we're going to say that for a field \(\mathbb{k}\) and independent infinitesimals \(s\) and \(t\), the set \(\mathbb{k}[s,t]\) is the set of elements of the form \(a + bs + ct + dst\) where \(a, b, c\) and \(d\) are in \(\mathbb{k}\). Note that we can add, subtract, and multiply in this set without any problems; dividing is tricky, so we'll try to only divide by numbers in \(\mathbb{k}\).
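For completeness, here's the same kind of sketch for \(\mathbb{k}[s,t]\), with elements \(a + bs + ct + dst\); multiplication just expands the product and drops every term containing \(s^2\), \(t^2\), or higher powers. As before, the names are made up for illustration.

```python
# A sketch of k[s, t]: elements a + b*s + c*t + d*s*t with s^2 = t^2 = 0
# and s*t = t*s not 0 (s and t are independent infinitesimals).
class Dual2:
    def __init__(self, a, b=0, c=0, d=0):
        self.a, self.b, self.c, self.d = a, b, c, d

    def __add__(self, other):
        return Dual2(self.a + other.a, self.b + other.b,
                     self.c + other.c, self.d + other.d)

    def __mul__(self, other):
        # Expand and keep only the 1, s, t, and st terms.
        return Dual2(
            self.a * other.a,
            self.a * other.b + self.b * other.a,
            self.a * other.c + self.c * other.a,
            self.a * other.d + self.d * other.a
            + self.b * other.c + self.c * other.b,
        )

    def __repr__(self):
        return f"{self.a} + {self.b}s + {self.c}t + {self.d}st"

s, t = Dual2(0, 1), Dual2(0, 0, 1)
print(s * t)  # 0 + 0s + 0t + 1st: the product st survives
print(s * s)  # 0 + 0s + 0t + 0st: s squares to 0
```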
