Maths

2008-09-21 8:06 pm
For a certain function f(x), f(a) + f(d) = f(c) + f(b) and a > b > c > d > 0.
Also f''(x) < 0 and f'(x) > 0 for all x > 0.
Prove that f[(a+d)/2] > f[(c+b)/2]. Please help.

Answers (1)

2008-09-22 12:38 am
✔ Best answer
Since f'(x) > 0 for all x > 0, the graph of f(x) is always sloping upwards.
As f''(x) < 0 for all x > 0, the graph of f'(x) is sloping downwards.
So, the slope of f(x) is positive but decreasing as x increases.
So, the graph of f(x) is similar to the graph of y = √x + C,
where C is an arbitrary constant.
The graph is shown on the link http://hk.geocities.com/cipker/graphfx.bmp
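As a quick sanity check (not part of the proof), the example curve y = √x can be verified symbolically to satisfy both conditions; this sketch uses SymPy and drops the constant C, which does not affect the derivatives:

```python
import sympy as sp

# Check that f(x) = sqrt(x) has f'(x) > 0 and f''(x) < 0 for all x > 0.
x = sp.symbols("x", positive=True)
f = sp.sqrt(x)

f1 = sp.diff(f, x)      # 1/(2*sqrt(x)): positive for x > 0
f2 = sp.diff(f, x, 2)   # -1/(4*x**(3/2)): negative for x > 0

assert f1.is_positive and f2.is_negative
```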

f(a) + f(d) = f(c) + f(b)
f(a) - f(b) = f(c) - f(d)
Let k = f(a) - f(b) = f(c) - f(d).
From the graph this suggests a - b > c - d, and the Mean Value Theorem makes it rigorous: k = f'(p)(a - b) for some p in (b, a), and k = f'(q)(c - d) for some q in (d, c). Since p > b > c > q and f'(x) is decreasing, f'(p) < f'(q), so a - b > c - d.
So, a + d > c + b
(a+d)/2 > (c+b)/2

As the graph of f(x) is sloping upwards for all x > 0,
f(n) > f(m) whenever n > m for any positive numbers n and m.
So,
f[(a+d)/2] > f[(c+b)/2]
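The whole argument can be checked numerically with the concrete choice f(x) = √x and hand-picked values a = 16, b = 9, c = 4, d = 1 (chosen so the hypothesis f(a) + f(d) = f(c) + f(b) holds exactly, since 4 + 1 = 2 + 3):

```python
import math

f = math.sqrt  # concrete f with f'(x) > 0 and f''(x) < 0 on x > 0

a, b, c, d = 16.0, 9.0, 4.0, 1.0
assert a > b > c > d > 0
assert math.isclose(f(a) + f(d), f(c) + f(b))  # 4 + 1 == 2 + 3

# MVT step: equal rises over a flatter stretch force a wider interval
assert a - b > c - d                            # 7 > 3

# Final inequality from the problem statement
assert f((a + d) / 2) > f((c + b) / 2)          # f(8.5) > f(6.5)
```

All assertions pass, matching each step of the proof above.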


Archived: 2021-04-12 15:02:47
Original link [permanently dead]:
https://hk.answers.yahoo.com/question/index?qid=20080921000051KK00844

View Wayback Machine backup