Prove that 1 - 3x + 6x² - 10x³ + ... does not converge for |x| >= 1?
Answers (3)
Start with the geometric series
1/(1 + x) = 1 - x + x^2 - x^3 + x^4 - x^5 + ..., convergent for |x| < 1
Differentiate both sides twice (term-by-term differentiation is valid inside the interval of convergence):
2/(1 + x)^3 = 2 - 6x + 12x^2 - 20x^3 + ..., convergent for |x| < 1
Divide both sides by 2:
1/(1 + x)^3 = 1 - 3x + 6x^2 - 10x^3 + ..., convergent for |x| < 1.
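As an aside, one way to sanity-check that expansion is with sympy's series function; the library choice and the truncation order 6 are illustrative, not part of the original answer:

# Quick check: the Taylor expansion of 1/(1+x)^3 about 0 should
# reproduce 1 - 3x + 6x^2 - 10x^3 + ...
import sympy as sp

x = sp.symbols('x')
print(sp.series(1 / (1 + x)**3, x, 0, 6))
# prints: 1 - 3*x + 6*x**2 - 10*x**3 + 15*x**4 - 21*x**5 + O(x**6)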
Term-by-term differentiation preserves the radius of convergence, and the geometric series we started from has radius of convergence exactly 1, so the series in question is guaranteed not to converge for |x| > 1; it remains to check the endpoints x = -1 and x = 1. In either case the series diverges by the nth term test, because the absolute values of the terms, {1, 3, 6, 10, ..., n(n+1)/2, ...}, do not converge to 0.
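If it helps, here is a minimal Python sketch of that endpoint check, using the coefficient formula n(n+1)/2 above; the range of n shown is an arbitrary choice:

# Terms of the series at the endpoints x = 1 and x = -1 (n starts at 1).
for n in range(1, 8):
    coeff = n * (n + 1) // 2                                   # 1, 3, 6, 10, 15, 21, 28
    term_at_plus_one = (-1)**(n + 1) * coeff                   # x = 1:  1, -3, 6, -10, ...
    term_at_minus_one = (-1)**(n + 1) * coeff * (-1)**(n - 1)  # x = -1: 1, 3, 6, 10, ...
    print(n, term_at_plus_one, term_at_minus_one)
# In both columns the magnitudes grow without bound, so the terms cannot
# tend to 0 and neither endpoint series converges.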
I hope this helps!
The n-th term is a(n) = (-1)^(n+1) * n(n+1)/2 * x^(n-1), so using the ratio test:
|a(n+1) / a(n)| = ((n+2)/n) |x| = (1 + 2/n) |x| --> |x| as n --> infinity.
Thus the series converges if |x| < 1 and diverges if |x| > 1. For |x| = 1 the series (assuming x is real) is either 1 - 3 + 6 - 10 + ... or 1 + 3 + 6 + 10 + ..., both of which diverge because their terms do not tend to 0.
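Here is a small Python sketch of that limit; the sample value x = 0.5 and the values of n are arbitrary choices for illustration:

# Ratio from the answer above: a(n) = (-1)^(n+1) * n(n+1)/2 * x^(n-1),
# so |a(n+1)/a(n)| = (1 + 2/n)*|x|, which tends to |x| as n grows.
def a(n, x):
    return (-1)**(n + 1) * n * (n + 1) / 2 * x**(n - 1)

for n in (1, 10, 100, 1000):
    print(n, abs(a(n + 1, 0.5) / a(n, 0.5)))   # approaches |x| = 0.5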
Archived: 2021-04-21 01:16:19
Original link [permanently dead]:
https://hk.answers.yahoo.com/question/index?qid=20150126082107AAkaJ4Q