A number of excellent writeups in Big-oh notation describe the use of O(f(x)) and o(f(x)) as measurements of the time and space requirements of an algorithm. No one, however, has thought to mention how they are used in maths, where they are sometimes known as Landau symbols.
The first important thing to remember is that little o notation is a local measurement of the behaviour of a function. In computer science, your functions generally take an integer as their argument: the size of the input. For example, sorting a list of n items is an O(n log n) operation, which also makes it o(n²), and so on (see the quick numerical check below).
This means that the only point at which you can study such a function locally is infinity. This may sound like a bit of a contradiction, so I'll explain. If you want to study the behaviour of a function at some point, it is not enough to take a few values here and there: you want to be able to get closer and closer to the point you're looking at, without ever reaching it. For the integers you can only do that at infinity: you just make n bigger and bigger (i.e. you take the limit as n goes to infinity). You can't do it at, say, 3, because you can't get any closer than 2 or 4 without actually hitting 3.
This, however, is maths, and I'll be considering real-valued functions on an open interval I of the real line (sorry for the mumbo jumbo, but it's needed to make things work properly), so given any point, for example 3, I can get as close as I want without ever reaching it: 2, 2.5, 2.9, 2.94, 2.99999 and so on.
So whenever you use these notations in maths, it should be clear which point you are looking at.
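To make the computer-science example above concrete, here is a quick numerical check (a sketch of my own, nothing more) that n log n is o(n²) at infinity: the ratio tends to 0 as n grows.

    import math

    # (n log n) / n^2 should tend to 0 as n grows, which is exactly
    # what "n log n is o(n^2) at infinity" means.
    for n in (10, 10**3, 10**6, 10**9):
        print(n, (n * math.log(n)) / n**2)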
Before I give a formal definition, I'll give a brief reason why these notations can be useful. The most obvious one is when you are using some kind of series (Taylor series, Bertrand series, Laurent series, etc.) to expand a function around a point (again, the point can be infinity) and you want to see how big your error is. An approximation is no good if you don't know how precise it is. In other words, you want to bound your error, i.e. make sure it is a little o of something.
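For instance (a standard textbook example, not one taken from the writeups above), the 2nd order Taylor expansion of the exponential at 0 can be written with its error recorded as a little o:

    % 2nd order Taylor expansion of e^x at x0 = 0: the error made by the
    % polynomial approximation goes to 0 faster than x^2 as x -> 0.
    \[
      e^{x} = 1 + x + \frac{x^{2}}{2} + o(x^{2}) \qquad (x \to 0).
    \]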
Definitions
So, if x0 is a point of our interval I, or infinity, f is a function from I to R, and g is a positive function on I, then:
- f ∈ o(g) if lim x→x0 f(x)/g(x) = 0;
- f ∈ O(g) if ∃ M ∈ R such that |f(x)| ≤ M g(x) for all x ∈ I close enough to x0 (when x0 is infinity, read "for all large enough x").
You may note that f ∈ o(g) is stronger than f ∈ O(g), and indeed implies it, so from now on we will only talk about o(g).
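Here is a quick sketch of why the implication holds, using the definitions above (my own wording of the standard argument):

    % If lim_{x -> x0} f(x)/g(x) = 0, then |f(x)|/g(x) < 1 for all x in I
    % close enough to x0, so
    \[
      |f(x)| \le 1 \cdot g(x) \quad \text{for all such } x,
    \]
    % which is exactly the condition f \in O(g), with M = 1.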
Just in case you hadn't grasped that the point you are studying is important, consider the functions f: x → x and g: x → x² on the interval ]0,∞[.
f(x)/g(x) = 1/x and g(x)/f(x) = x, so
- at 0, g ∈ o(f)
- but at ∞, f ∈ o(g)
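Here is a quick numerical illustration of that (just evaluating the two ratios; a sketch of my own):

    # f(x) = x and g(x) = x^2 on ]0, infinity[.
    f = lambda x: x
    g = lambda x: x**2

    # Near 0 the ratio g/f = x tends to 0, so g is o(f) there...
    for x in (0.1, 0.01, 0.001):
        print("g(x)/f(x) at x =", x, "is", g(x) / f(x))

    # ...while at infinity the ratio f/g = 1/x tends to 0, so f is o(g) there.
    for x in (10, 100, 1000):
        print("f(x)/g(x) at x =", x, "is", f(x) / g(x))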
Properties
These are easily deduced from the properties of limits. The proofs are left as an exercise for the reader (a sample is worked out after the list).
Throughout this section we consider f ∈ o(F) and g ∈ o(G), where f and g are functions from I to R, F and G are positive functions on I, and as before we are working at a point x0.
- Fg ∈ o(FG)
- the product fg is also in o(FG) (this is not quite the same as the previous line)
- if g is nowhere 0 on I, then f/g ∈ o(F/|g|): dividing both sides by the same function preserves the relation
- if F ∈ o(G), then f + g ∈ o(G)
- still assuming F ∈ o(G), we also have F + f ∈ o(G): if you take something small and add something smaller, what you have is still small
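As a sample of what the exercise looks like (a sketch using only the limit definition above), here are the first two properties:

    \begin{align*}
      \frac{F(x)\,g(x)}{F(x)\,G(x)} &= \frac{g(x)}{G(x)} \to 0
        \quad (x \to x_0), && \text{so } Fg \in o(FG);\\
      \frac{|f(x)\,g(x)|}{F(x)\,G(x)} &= \frac{|f(x)|}{F(x)} \cdot \frac{|g(x)|}{G(x)}
        \to 0 \cdot 0 = 0, && \text{so } fg \in o(FG).
    \end{align*}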
You may be wondering what the point of all this is. If I Taylor expand 1/(1-x) at 0 as 1 + x, it's obvious that the error is an o(x) (or an O(x²)).
The answer is: because you will make mistakes. You might want to Taylor expand 1/(1-x)² to 2nd order and be tempted to say that it is 1 + 2x + x². It's not. It's 1 + 2x + 3x² + o(x²).
Where did the mistake happen? If you multiply out (1 + x + o(x))² you get
1 + 2x + x² + 2x·o(x) + o(x)² + 2o(x) = 1 + 2x + o(x) + x² + o(x²) = 1 + 2x + o(x).
If you look at the properties, you will see that the o(x) "kills" the x² (at 0, x² is itself o(x)), so all you have really obtained is the 1st order expansion. If you want the correct 2nd order expansion, you need to square a 2nd order expansion of 1/(1-x).
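If you don't want to take my algebra on trust, a computer algebra system will check it; here is a sketch using sympy (assuming it is installed):

    import sympy as sp

    x = sp.symbols('x')

    # The correct 2nd order expansion of 1/(1-x)^2 at 0: 1 + 2x + 3x^2 + O(x^3).
    print(sp.series(1/(1 - x)**2, x, 0, 3))

    # Squaring the 1st order truncation 1 + x gives an x^2 coefficient of 1,
    # which is wrong: the discarded o(x) term contributes at order x^2.
    print(sp.expand((1 + x)**2))

    # Squaring the 2nd order truncation 1 + x + x^2 and keeping terms up to x^2
    # recovers the correct coefficient 3.
    print(sp.expand((1 + x + x**2)**2))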
Lastly, you may have noticed a slight ambiguity in the notation: o(f) and O(f) don't just denote the two sets, they can also mean an element of those sets. It should, however, be obvious from the context which one is meant.
Sources:
http://mathworld.wolfram.com/LandauSymbols.html
http://mathworld.wolfram.com/AsymptoticNotation.html