
The only time I've come across something that converges is when oscillation occurs, for example,

$\pi = \frac{4}{1}-\frac{4}{3}+\frac{4}{5}-\frac{4}{7}+\cdots$

Is this the only way in which graphs, series etc. converge?

If you exclusively add (or exclusively subtract), then it couldn't converge, right? Because you'd just be making your way to $\infty$, albeit at a decreasing rate.

Tobi
    What about $\frac 13 = \frac 3{10}+\frac 3{100}+\frac 3{1000}+\cdots$? – lulu Mar 29 '17 at 00:30
  • We have [this](http://math.stackexchange.com/questions/8337/different-methods-to-compute-sum-limits-k-1-infty-frac1k2) and [this](http://math.stackexchange.com/questions/389146/proof-of-frac1e-pi1-frac3e3-pi1-frac5e5-pi1-ldots) and [this](http://math.stackexchange.com/questions/29450/self-contained-proof-that-sum-limits-n-1-infty-frac1np-converges-for) and pretty much [this entire page](http://math.stackexchange.com/questions/tagged/sequences-and-series?sort=votes&pageSize=50) if you care to read through all that. – Simply Beautiful Art Mar 29 '17 at 00:31
  • Oh. My knowledge of math is basic; Thank you. – Tobi Mar 29 '17 at 00:31
    The example I gave is pretty basic, it's just the usual decimal expansion $\frac 13 = .3333\dots$ – lulu Mar 29 '17 at 00:32

2 Answers


To address your last question directly, if you continue to add (strictly positive) numbers together, they can converge, but the terms need to go to zero sufficiently fast. Two very standard examples are $$\sum_{n=1}^\infty \frac{1}{n} = \frac{1}{1}+\frac{1}{2}+\frac{1}{3}+\cdots = \infty,$$ but $$\sum_{n=1}^\infty \frac{1}{n^2} = \frac{1}{1^2}+\frac{1}{2^2}+\frac{1}{3^2}+\cdots = \frac{\pi^2}{6}< \infty.$$ It is always necessary that the terms get smaller and smaller (that is, go to zero), but this is not enough to guarantee that the series converges.
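The contrast between these two series can be seen numerically. This is a small sketch (the helper `partial_sum` is just for illustration) comparing partial sums of $\sum 1/n$, which grows without bound, against $\sum 1/n^2$, which settles near $\pi^2/6$:

```python
import math

def partial_sum(term, n):
    """Sum the first n terms of a series whose k-th term is term(k)."""
    return sum(term(k) for k in range(1, n + 1))

harmonic = lambda k: 1 / k     # terms go to 0, yet the series diverges
basel = lambda k: 1 / k**2     # terms go to 0 fast enough: sum is pi^2/6

for n in (10, 1000, 100000):
    print(n, partial_sum(harmonic, n), partial_sum(basel, n))

# The harmonic partial sums keep growing (roughly like ln n),
# while the 1/n^2 partial sums approach pi^2/6 ≈ 1.6449...
print(math.pi**2 / 6)
```

Even after 100,000 terms the harmonic sum is still climbing (past 12), while the $1/n^2$ sum already agrees with $\pi^2/6$ to several decimal places.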

Problems of this nature are one of the central focuses of calculus.

David

A series is said to be "absolutely convergent" if $\sum\limits_{n=1}^{\infty} |a_n|$ converges. For this to happen, $a_n$ must approach $0$ fast enough that the tail of the series contributes arbitrarily little. The classic example is the infinite geometric series, e.g. $\sum\limits_{n=1}^{\infty} \frac{1}{2^n} = 1$.
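As a quick numerical check of the geometric example, the partial sums of $\sum 1/2^n$ are exactly $1 - 1/2^n$, so the tail halves with every term:

```python
# Partial sums of the geometric series sum_{n>=1} 1/2^n, which converges to 1.
# After n terms the partial sum equals 1 - 1/2^n, so the tail shrinks geometrically.
s = 0.0
for n in range(1, 51):
    s += 1 / 2**n
print(s)  # within floating-point rounding of 1
```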

If the terms of the series alternate between positive and negative then the series is an "alternating series."

If an alternating series fails to be absolutely convergent, it may still be "conditionally convergent": it converges provided $\lim\limits_{n\to\infty} a_n = 0$ and each successive term is closer to $0$ than the one before it, i.e. $n>m>N \implies |a_n|<|a_m|$.
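The $\pi$ series from the question is exactly such a case: the absolute values $4/(2k-1)$ sum to infinity, but they decrease monotonically to $0$, so the alternating series test guarantees convergence. A short sketch:

```python
import math

# The question's series: 4 - 4/3 + 4/5 - 4/7 + ... = pi.
# The terms 4/(2k-1) shrink monotonically to 0, so the alternating
# series test applies, even though sum |4/(2k-1)| diverges.
s = 0.0
for k in range(1, 200001):
    s += (-1) ** (k + 1) * 4 / (2 * k - 1)
print(s)  # close to pi
```

A useful bonus of the alternating series test: the error after $n$ terms is at most the first omitted term, here $4/(2n+1)$.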

Doug M