Calculus Help

amsomerv

I figure most of you have taken calc and can give me a heads up on this problem.

Show that if p is greater than 1, q is less than infinity, (1/p) + (1/q) = 1, and a and b are greater than zero, then ab must be less than or equal to (a^p)/p + (b^q)/q.
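Written out in symbols, the statement I'm being asked to prove is:

\[
ab \;\le\; \frac{a^p}{p} + \frac{b^q}{q},
\qquad \text{where } \frac{1}{p} + \frac{1}{q} = 1,\quad p > 1,\quad a, b > 0.
\]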

Someone suggested using the arithmetic-geometric mean inequality to prove it, but I am not really sure how to apply this. I am very inexperienced with calculus and could use some advice.
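For reference, here is the weighted form of the arithmetic-geometric mean inequality (I'm assuming this is the version they meant):

\[
x^{\lambda} y^{\mu} \;\le\; \lambda x + \mu y
\qquad \text{for } x, y > 0 \text{ and weights } \lambda, \mu > 0 \text{ with } \lambda + \mu = 1,
\]

but I don't see what to substitute for x, y, and the weights to get the inequality above.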
Thanks
 
I'll try to work on this for a few minutes after dinner. My gut instinct would be to look at the intermediate value theorem.

btw - what class is this for (Calc I, Calc II, etc.)?
 
It is for calculus for life sciences, which is supposed to be an introductory-level calc class for pre-med majors. I am having a lot of trouble because I only took pre-calc in high school, while everyone else in the class took AP calc and is using this for an easy A. The professor doesn't really explain things - he's more the type who just likes to do problems that seem to have no connection to anything that later appears on tests or homework. So I have done most of my learning from the book or internet tutorials, but I really can't find anything that will help me know where to get started.
 
I'd agree with Cerberus about using the intermediate value theorem. Unfortunately I don't have my calc book any more and can't remember how to apply it! Damn useless engineers.
 
I believe you can use properties of convergence to assume the first equation is equal to 1. Then reduce the second to remove that and simplify the powers of p and q. Then I'm sure you can apply a basic theorem from whatever chapter you're in, and that should be that. Hope this helps some, but probably not.
 
Hey,

It seems more like discrete math than calculus, or it's really beginner's calculus. Anyway, it's really quite simple; you don't need to use the intermediate value theorem. (The intermediate value theorem is basically a theorem about continuous functions: if a continuous function takes on two values y1 and y2 at two points, it also takes on every value between y1 and y2 at some point between those two points.)
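Formally (using x1, x2 for the endpoints so they don't clash with the a and b in your problem):

\[
f \text{ continuous on } [x_1, x_2],\ y \text{ between } f(x_1) \text{ and } f(x_2)
\;\Longrightarrow\; f(c) = y \text{ for some } c \in [x_1, x_2].
\]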

Anyway, you just need to use basic reasoning...

Prove: [a^(q/(q-1))]/(q/(q-1)) + (b^q)/q >= ab

Given:

1/p + 1/q = 1

1/p = (q-1)/q

p = q/(q-1)

First of all, if q=1 you're screwed, since q/(q-1) is undefined, so ignore that possibility. Also, q can't be equal to 0, since q/(q-1) would then be 0, and you'd be dividing a^(q/(q-1)) (which would just be a^0 = 1) by zero, which is undefined.
Therefore, you need to prove this using three cases:
a) q less than zero,
b) q greater than zero and less than one,
c) q greater than one.

It's really simple from here, and I am sure you can do this part on your own. You may have to refer back to some theorems that you previously learned, because you'll have to write a formal proof in order to clearly show that [a^(q/(q-1))]/(q/(q-1)) + (b^q)/q >= ab.
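If it helps to build confidence before writing the formal proof, here is a quick numerical spot check (just a sanity check over random values, not a proof; the variable names and ranges are arbitrary choices of mine):

import random

# Spot-check a*b <= a**p/p + b**q/q for random a, b > 0 and p > 1,
# with q chosen as the conjugate exponent so that 1/p + 1/q = 1.
# This only samples points; it is a sanity check, not a proof.
random.seed(0)
for _ in range(100000):
    p = random.uniform(1.1, 10.0)     # p > 1
    q = p / (p - 1)                   # conjugate exponent: 1/p + 1/q = 1
    a = random.uniform(0.001, 100.0)  # a > 0
    b = random.uniform(0.001, 100.0)  # b > 0
    lhs = a * b
    rhs = a**p / p + b**q / q
    assert lhs <= rhs * (1 + 1e-9), (p, q, a, b, lhs, rhs)
print("no counterexamples found in 100,000 random trials")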

PM me if you need more help! 🙂
 