If f(n) = o(g(n)), then is 2^(f(n)) = o(2^(g(n)))?
Solution 1
It is, at least if g(n) tends to positive infinity as n tends to positive infinity (if g(n) doesn't, counterexamples are easy to find).
Sketch of a proof:
Prerequisite: g(n) tends to positive infinity as n tends to positive infinity.
f(n) in o(g(n)) means:
for every eps > 0 there exists an n0 such that abs(f(n)) < abs(g(n))*eps for all n > n0.
From this it follows (once n is large enough that g(n) > 0, which eventually holds since g(n) -> +inf):
2^abs(f(n)) < 2^(g(n)*eps) = (2^eps)^g(n) (for n > n0).
For eps < 1:
(2^eps)^n is in o(2^n) (as 2^eps < 2)
it follows that for every eps2 > 0 there exists an n1 such that for all n > n1
(2^eps)^n < eps2*(2^n).
Choosing eps2 = eps yields:
(2^eps)^n < eps*(2^n) for all n > n1 (n1 is dependent on eps)
Because g(n) -> +inf as n -> +inf, there exists an n2 such that
g(n) > n1 for all n > n2.
From this it follows that
(2^eps)^g(n) < eps*2^g(n) for all n > n2.
So for every 0 < eps < 1 there exists an n3 >= max(n0, n2) such that
2^abs(f(n)) < 2^abs(g(n)*eps) = (2^eps)^g(n) < eps*2^g(n) for all n > n3.
The same n3 then works for every eps3 >= 1 as well, since
2^abs(f(n)) < eps*2^g(n) < eps3*2^g(n) for all n > n3.
So for every eps > 0 there exists an n3 such that
2^abs(f(n)) < eps*2^g(n) for all n > n3.
Because 2^f(n) <= 2^abs(f(n)) for all n, and 2^x > 0 for all real x, it follows that
abs(2^f(n)) < abs(eps*2^g(n)) for all n > n3
q.e.d.
If something is unclear or wrong, please comment.
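A quick numerical sanity check of the conclusion (just an illustration, not part of the proof; the functions f(n) = n and g(n) = n^2 are my own choice): here f = o(g) and g -> inf, so 2^f(n)/2^g(n) should go to 0.

```python
def ratio(n):
    # 2^f(n) / 2^g(n) for f(n) = n and g(n) = n**2,
    # computed as 2^(f(n) - g(n)) to avoid overflow
    return 2.0 ** (n - n ** 2)

# The ratio shrinks rapidly toward 0 as n grows, as the proof predicts.
for n in (2, 5, 10):
    print(n, ratio(n))
```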
EDIT: Some thoughts on other g(n):
Case: some subsequence of g(n) is bounded, i.e. there is an x such that no n0 exists with abs(g(n)) >= x for all n > n0:
o(g(n)) then consists only of functions that are constant 0 after some n or converge to 0. 2^g(n) also has a bounded subsequence, but 2^f(n) is constant 1 after some point (or converges to 1).
Case: there is no n0 with g(n) > 0 for all n > n0:
2^g(n) < 1 whenever g(n) < 0, so 2^g(n) has a subsequence bounded by 1, meaning o(2^g(n)) consists only of functions that are constant 0 after some n or converge to 0.
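To illustrate the bounded case numerically (my own example, assuming f, g > 0): take f(n) = 1/n and g(n) = 1. Then f = o(g), but g does not tend to infinity, and the ratio 2^f(n)/2^g(n) tends to 1/2 rather than 0.

```python
def ratio(n):
    f = 1.0 / n  # f(n) = 1/n, which is o(g(n))
    g = 1.0      # g(n) = 1, bounded, does not tend to infinity
    return 2.0 ** f / 2.0 ** g

# 2^f(n) / 2^g(n) -> 2^0 / 2^1 = 1/2, a nonzero limit,
# so 2^f(n) is NOT o(2^g(n)) here.
print(ratio(10 ** 6))
```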
Solution 2
Here's an answer. The result depends on the convergence properties of g(n).
First consider the related limit:
lim(n->inf) log_2 ((2^(f(n))) / (2^(g(n))))
= lim(n->inf) ( log_2(2^(f(n))) - log_2(2^(g(n))) )
= lim(n->inf) ( f(n) - g(n) )
= lim(n->inf) ( g(n)*(f(n)/g(n)) - g(n) )
= lim(n->inf) ( -g(n) ) = - lim(n->inf) g(n)
(the last step uses f(n)/g(n) -> 0; see the comments for a caveat about its validity).
... Now, to get this into the form of the original question in your post: it is mathematically correct to switch the limit and the log (log_2 is continuous on the positive reals), so we have:
lim(n->inf) log_2 ((2^(f(n))) / (2^(g(n))))
=
log_2 lim(n->inf) ((2^(f(n))) / (2^(g(n)))).
Now, moving on to answer the question. If it is true that
2^(f(n)) = o(2^(g(n))),
then in the limit, the right-hand side becomes
log_2(0) = -inf
... so in this scenario it must be true that
lim(n->inf) g(n) = inf.
This result does seem to make sense. Consider f(x) = exp(-x) and g(x) = 1 - exp(-x). Clearly, in this example f(x) = o(g(x)). However, 2^(f(x)) is not o(2^(g(x))), because
lim(x->inf) (2^exp(-x)) / (2^(1-exp(-x))) = 1/2.
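A numeric check of this first example (a sketch; the evaluation point x = 50 is arbitrary): the ratio 2^f(x)/2^g(x) = 2^(2*exp(-x) - 1) settles at 1/2 instead of going to 0.

```python
import math

def ratio(x):
    f = math.exp(-x)        # f(x) = exp(-x)
    g = 1.0 - math.exp(-x)  # g(x) = 1 - exp(-x)
    return 2.0 ** f / 2.0 ** g

# The ratio approaches 2^(-1) = 0.5, a nonzero constant,
# confirming that 2^f(x) is not o(2^g(x)) despite f = o(g).
print(ratio(50.0))
```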
But given f(x) = exp(x), g(x) = exp(2x), where also f(x) = o(g(x)) and where lim(x->inf) g(x) = inf, meeting the above condition, we have
lim(x->inf) (2^exp(x)) / (2^exp(2x))
=
lim(x->inf) 1 / 2^(exp(x)*(exp(x) - 1)) = 0,
so 2^(f(x)) = o(2^(g(x))).
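And a check of the second example (again just a sketch; I work with log_2 of the ratio, since the ratio itself underflows almost immediately): log_2(2^exp(x) / 2^exp(2x)) = exp(x) - exp(2x), which heads to -inf, so the ratio itself goes to 0.

```python
import math

def log2_ratio(x):
    # log_2 of 2^exp(x) / 2^exp(2x) is exp(x) - exp(2x);
    # staying in log space avoids underflow for even modest x
    return math.exp(x) - math.exp(2.0 * x)

# The log-ratio drops toward -infinity, i.e. the ratio goes to 0,
# so 2^f(x) = o(2^g(x)) in this example.
for x in (1.0, 2.0, 3.0):
    print(x, log2_ratio(x))
```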
Mr_and_Mrs_D
Updated on June 04, 2022

Comments
-
Mr_and_Mrs_D almost 2 years: Notice that I am asking for little-o here (see similar question here) - for big-O it's clearly wrong - for little-o it feels right, but I can't seem to prove it...
EDIT: glad I raised a debate :) Assume f, g > 0 for simplicity.
-
huon about 12 years: I don't think the step from lim(g*f/g - g) to lim(-g) is valid for all f and g (not without some justification/explanation at least, because g can go to infinity faster than f/g goes to zero). And it is correct to swap the limit with the log (log is continuous for positive numbers, which is all you need). I agree with your conclusion (that it depends on specific properties of g).
-
Dan Nissenbaum about 12 years: Perhaps this proves it: for any eps, however small, we have f(n)/g(n) < eps for n > N and some N (assume the magnitude of the limit is intended). Therefore, lim(g*f/g - g) < lim(g*eps - g) = lim(g*(eps - 1)) = lim(-g). Do you think? (For the latter steps, it assumes g > 0, but it should be the same form if g < 0. Perhaps there is some special case if g = 0 in the limit.)
-
Dan Nissenbaum about 12 years: ... Addendum: it seems, actually, that for the latter steps, not only does it assume that g > 0, but it also washes over the sign of lim f/g, so perhaps this influences things as well.
-
Mr_and_Mrs_D about 12 years: Mistakes - please go through it again - not sure about the proof, but for instance f(x) = x, g(x) = 2x, f(x) != o(g(x)) - same with the other one. Consider f, g positive.
-
Mr_and_Mrs_D about 12 years: First line - you mean g(n) > 0 (for all n after some point)? This is unneeded if g -> inf as n -> inf (it follows from it). If g(n) does not (go to infinity), can you prove it or give a counterexample? Assume g > 0 for all n. Also your definition is wrong - o means that for every c > 0 there is an n0 such that f < c*g for every n >= n0 - notice the strict inequality. Assume g, f > 0.
-
Mr_and_Mrs_D about 12 years: And what about eps > 1? Also, in the line (2^eps)^g(n) <= eps*2^g(n) for all n > n3, do you mean eps2?
-
Dan Nissenbaum about 12 years: Thanks. It was the easy case so I wasn't careful. I intended exp(2x), not 2x. For the other case, it was intended that f(x) != o(g(x)), as noted.
-
Dan Nissenbaum about 12 years: Pardon me. For the other case it was intended that f(x) = o(g(x)) (equals sign), so I believe you're mistaken that it's not. (This case is f(x) = exp(-x), g(x) = 1 - exp(-x).)
-
Ral Zarek about 12 years: @Mr_and_Mrs_D I added a sketch of a proof for the other cases I could think of (if you can think of more, just drop a comment). Yes, the g(n) > 0 isn't really needed. On the > vs >=: corrected it. eps > 1 usually isn't a problem: if abs(f(n)) < abs(eps*g(n)) for an eps < 1, it also holds for any eps >= 1.
-
Mr_and_Mrs_D about 12 years: (1 - exp(-x))/exp(-x) -> inf, so yes, but I'm still not sure how the definitions go for non-positive functions. Consider only positive ones please.
-
Mr_and_Mrs_D about 12 years: "if abs(f(n)) < abs(eps*g(n)) for an eps < 1 it also is for any eps >= 1": indeed, but I need to find, for every eps > 0, a suitable n0 s.t. f < eps*g for n > n0! Your proof seems to suggest that if there is an eps < 1 s.t. f < eps*g (a mere f = O(g) for an eps < 1), then 2^f = o(2^g) - correct?
-
Mr_and_Mrs_D about 12 years: Also - on g -> inf: not needed. You are just using g > 0. g may very well be bounded - as long as f goes to zero, I guess. Edit: (2^eps2)^g(n) < eps2*2^g(n) for all n > n3 should read (2^eps)^g(n) < eps2*2^g(n) for all n > n3. At this point - review your question - are you really proving that a single eps1 < 1 is enough?
-
Mr_and_Mrs_D about 12 years: Also: "There is no n0 so g(n) > 0 for all n > n0" does not mean that g(n) is bounded. Consider g = exp(n) * abs(sin(n*pi/2)).
-
Ral Zarek about 12 years: @Mr_and_Mrs_D It's rather common to restrict to 0 < eps < 1 in a mathematical proof, as eps is usually considered to be very small (as close to 0 as possible). In this case: if there is an n0 for an eps < 1, that n0 also works for any eps >= 1.
-
Ral Zarek about 12 years: @Mr_and_Mrs_D There was a typo in the previous expression. All the eps mean any number > 0 (in one case, any number between 0 and 1). I probably should have used eps throughout the proof, but usually that confuses non-mathematicians.
-
Ral Zarek about 12 years: I didn't say g was bounded in the case "no n0 so g(n) > 0 for all n > n0"; I said 2^g(n) has a subsequence bounded by 1, which has pretty much the same effect for o. I clarified this now.
-
Dan Nissenbaum about 12 years: If all functions are positive, I believe the proof is straightforward as it stands. Are there incorrect steps?
-
Mr_and_Mrs_D about 12 years