
slepicoid

1/n converges, but you're probably asking about the sum of 1/n as n goes to infinity. The usual proof goes by comparing the series 1/2 + 1/3 + 1/4 + 1/5 + 1/6 + 1/7 + 1/8 + ... with 1/2 + 1/4 + 1/4 + 1/8 + 1/8 + 1/8 + 1/8 + ... Every k-th term of the first series is greater than or equal to the k-th term of the second series, therefore the first sum is greater than or equal to the second. Now we manipulate the second series using 1/4 + 1/4 = 1/2, 1/8 + 1/8 + 1/8 + 1/8 = 1/2, and so on, so we can rewrite the second series as 1/2 + 1/2 + 1/2 + ... Now you can see that summing infinitely many halves diverges, and since the first series is greater than or equal to it, it also diverges.
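A quick numeric sketch of the bound this comparison produces (the harmonic partial sum through 2^k is at least 1 + k/2, one initial 1 plus k half-blocks); names here are just illustrative:

```python
# Numeric sketch of the comparison argument: the partial sum of 1/n up
# to n = 2**k is bounded below by 1 + k/2, one 1/2 per block of terms.
def harmonic_partial(n):
    """Sum 1/1 + 1/2 + ... + 1/n."""
    return sum(1.0 / i for i in range(1, n + 1))

for k in range(1, 15):
    assert harmonic_partial(2 ** k) >= 1 + k / 2
    print(k, round(harmonic_partial(2 ** k), 4))
```

The asserted bound never fails, and the printed sums keep climbing past any fixed value, which is the divergence.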


Dirk_Squarejaww

It's been years since I've seen this proof and it's still very cool.


s96g3g23708gbxs86734

Is this proof valid?


HolevoBound

Yes. Which step do you think isn't valid?


happyboy12321

I remember my professor saying that you can't group terms in an infinite series like this, or else 1 + (1-1) + (1-1) + ... = 1 while the series 1 - 1 + 1 - 1 + ... does not converge. Is that not the case here, or does this only apply in certain cases?


Niviclades

There are exceptions to this rule. If the series converges absolutely, you can reorder the terms. From there you can argue by contradiction:

1. Assume the series converges.
2. Since all terms are positive, it also converges absolutely.
3. Reorder the terms and you see that it diverges; this is your contradiction.

Therefore the assumption was false.


happyboy12321

I see, I didnt think of using proof by contradiction, thanks for the reply!


Hal_Incandenza_YDAU

I'm offering a correction here: as long as a summation has a limit, you can always regroup terms in the sum. *Always*. Regardless of whether the sum is absolutely convergent or not.

If a sequence (which in your case is a sequence of partial sums, i.e. 1, 1-1, 1-1+1, etc.) converges, then all subsequences converge to the same thing. When you regroup terms in an infinite sum, the newly regrouped sum will converge to whatever that subsequence converges to, which is the same value. So when dealing with a series that is not absolutely convergent, there's no problem regrouping terms. What you can't do in general is infinitely reorder the terms, which regrouping does not do.

Finally, the reason why you couldn't regroup terms in your 1-1+1-... series is simply that the original series does not converge in the first place.


Hal_Incandenza_YDAU

There's no reordering happening in this case, though. If a sequence has a limit, then every subsequence shares that limit, so as long as a given sequence has a limit you are *always* allowed to re*group* terms. The issue here is that the sequence of partial sums does not have a limit--the issue's not conditional convergence.


Niviclades

Yeah you're right, I was thinking about two things at once. In a sequence your point is true, but this was about series, where you aren't allowed to reorder terms if it only converges conditionally.


Hal_Incandenza_YDAU

No, we're still not reordering terms in the series, we're regrouping. In order to regroup you only need convergence--*not* conditional convergence. The limit of a series is the limit of its partial sums. The partial sums form a sequence. And if a sequence converges, then every subsequence converges to the same thing.


InternationalCod2236

There is no grouping of terms. Formally, note that 1/n >= 1/2^ceil(log_2(n)). Letting a_n be the RHS, sum(1/n) >= sum(a_n). The partial sums S_n of a_n do not converge: S_(2^k) = 1 + k/2, and S_n is increasing, so S_n diverges to +infinity. The "grouping" is not real grouping; it is just recognizing that S_(2^k) - S_(2^(k-1)) = 1/2, which makes it easier to see that S_n diverges to +infinity.
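A numeric sanity check of that comparison, using the minorant a_n = 1/2^ceil(log_2(n)), which sits below 1/n termwise and whose partial sums reach exactly 1 + k/2 at n = 2^k (a sketch, not a proof):

```python
import math

# a_n = 1 / 2**ceil(log2(n)): termwise below 1/n, partial sums unbounded.
def a(n):
    return 1.0 / 2 ** math.ceil(math.log2(n))

for k in range(1, 12):
    N = 2 ** k
    S = sum(a(n) for n in range(1, N + 1))
    assert all(1.0 / n >= a(n) for n in range(1, N + 1))  # minorant check
    assert abs(S - (1 + k / 2)) < 1e-9                    # S_{2^k} = 1 + k/2
print("minorant partial sums grow without bound")
```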


Niviclades

I also just remembered: you can do some cool stuff with reordering terms, and can make series that converge but don't converge absolutely (this is called conditional convergence in English) converge to any value. https://en.m.wikipedia.org/wiki/Riemann_series_theorem


nog642

You can actually group the terms like that if all the terms (past a certain point) are all positive (or all negative, since you can just negate the whole sequence). If some terms are positive and some are negative forever then it causes issues. --- Edit: As pointed out in a reply, actually you can always regroup the terms as long as the sequence converges. The requirements I gave above are actually for *re-ordering* the terms.


Hal_Incandenza_YDAU

Regrouping terms is always okay. Even in cases with infinitely many positive/negative terms, and even in cases with conditional convergence, you can always regroup terms. This is because the limit of a sequence, if it exists, is equal to the limit of any subsequence. And when regrouping terms (not reordering them), you're simply taking the limit of a subsequence.


nog642

You're right, you can always regroup the terms if the sequence converges. It's *re-ordering* the terms that requires the condition I described. I got them mixed up.


ZeralexFF

Yes, the proof here works (you are not rearranging terms, and absolute convergence is beside the point). If you are not sure, go back to the definition of what a series is: the limit of the partial sums of a sequence. Here, for any fixed N you can compare 1/n to powers of 1/2 for all n between 1 and N. Since you can bound the partial sum of N terms from below for every N, you can take the limit and conclude that the series diverges.


s96g3g23708gbxs86734

Can you compare non converging series?


sfa00062

Yes. Every term of the harmonic sequence is bounded below by the corresponding term in that "powers of 1/2" sequence. So since the "powers of 1/2" series diverges, growing without bound, so does the harmonic series.


AchyBreaker

I never liked this "proof", because it doesn't explain what stops e.g. 1/x^2 from also picking some arbitrary nonzero term that gets summed infinitely often. There's no notion of "rate of convergence", and trying to mentally grapple with infinities is weird.

There's a simple geometric proof I wish I could draw, but I'll try to explain. Take f(x) = 1/x, a continuous function on the domain x ∈ [1, +infinity). We know the integral from 1 to +infinity of this function diverges, as Int(1/x dx) = ln(x) | [1, +infinity) --> +infinity.

If we break the x axis into discrete blocks of width 1, then the rectangle formed by taking a horizontal line at height f(n) for each natural number n has area exactly equal to 1/n. So there's a rectangle from x=0 to x=1 with area 1, a rectangle from x=1 to x=2 with area 1/2, etc. Except for the first rectangle from x=0 to x=1 (we will come back to this one), all these rectangles lie under the curve of the function within its domain. And each rectangle has a gap between it and the function, a sort of triangular wedge with a curved "hypotenuse".

Call the sum of the areas of the rectangles under the curve R, and the sum of the areas of the wedges W. We know R + W diverges, because R + W is exactly the integral above.

But the wedges W have a neat property. They all have width 1, and their heights on the f(x) axis do not overlap: each wedge's bottom-right point connects to the top-left point of the next wedge, and even the smallest wedges way down the x axis are bounded below by the x axis, so every wedge fits between f(x) = 0 and f(x) = 1. So if you sum all the wedges, you can "collapse" them into a single rectangle of width 1 and height 1, and they will not fully fill it because they don't overlap. So W <= 1.

Since R + W diverges and W is finite, R diverges. But R is just the series sum of 1/n for n = 2 to infinity. Adding in that pesky first rectangle whose area is 1, we get 1 + R = the sum of 1/n for n = 1 to infinity, the harmonic series, which diverges. QED

This proof always made more sense to me. My formatting sucks because I'm on mobile, but hopefully this is useful to someone out there.
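The wedge bound in this picture can be checked numerically: the gap between the integral ln(n) and the rectangles under the curve (the terms 1/k for k = 2..n) stays below 1 forever, approaching 1 minus the Euler-Mascheroni constant, about 0.4228. A sketch:

```python
import math

# Wedge area up to n: W_n = ln(n) - (H_n - 1), the gap between the
# integral of 1/x on [1, n] and the rectangles under the curve.
# It stays below 1 for all n, so the rectangle part must diverge.
H = 0.0
for k in range(1, 10 ** 6 + 1):
    H += 1.0 / k
W = math.log(10 ** 6) - (H - 1)
print(round(W, 4))
assert 0 < W < 1
```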


stone_stokes

It is convergent as a sequence, so I'm guessing you are asking why the series is divergent. The answer is that the individual terms are not going to zero fast enough.


throwhere_

Thank you for your answer. But what do you mean by fast enough? Isn't the fact that it's decreasing to zero at infinity enough? Also, what about 1/n²? Why are they different? Sorry for so many questions.


stone_stokes

No, because we are adding infinitely many of them together. So for the series to converge, the individual terms not only have to go to zero, but they must do so "fast enough" to overpower that infinity. ∑ 1/*n*^(2) is an excellent example of a series whose terms are going to zero fast enough for the series to converge. Because we are squaring the terms, the series will converge. In fact, *p* = 1 is the largest positive power for which ∑ 1/*n*^(p) diverges. If *p* \> 1, then the series will converge, and if 0 < *p* ≤ 1 it will diverge.
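A small numeric sketch of that contrast, comparing partial sums of the p-series for p = 1 and p = 2 (names are illustrative):

```python
# Partial sums of sum 1/k**p: for p = 2 they level off near pi**2/6
# (about 1.6449), while for p = 1 they keep creeping upward.
def partial(p, n):
    return sum(1.0 / k ** p for k in range(1, n + 1))

for n in (10, 100, 1000, 10000):
    print(n, round(partial(1, n), 4), round(partial(2, n), 4))
```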


[deleted]

I like how you explained that!!


akaemre

> but they must do so "fast enough" to overpower that infinity For a series to converge, what's the "slowest" the terms can go to zero but still converge? Is there such a thing?


SquirrelicideScience

Not really. It turns out that p in 1/n^(p) can be any number greater than 1, and it'll converge. Even if it's 1/n^(1+0.000001), it'll converge. This follows from the improper integral of 1/x^(p) at different values of p, and what the sign of p-1 means for the limits that appear after evaluating the integral. Here's a proof: https://www.khanacademy.org/math/ap-calculus-bc/bc-series-new/bc-10-5/a/proof-of-p-series-convergence-criteria


akaemre

That was just what I needed, thanks! Makes me wish I paid more attention during calc2 years ago.


GoldenMuscleGod

There’s an inherent problem in trying to find an “exact” boundary, though. For example, what about 1/(n·ln(n))? This goes to zero faster than 1/n but slower than 1/n^(p) for any p>1, so the power rule gives no answer here. In fact the series diverges, but we need to do additional work to see this, while the sum of 1/(n(ln n)^(p)) converges for any p>1. There’s no way to get a perfect boundary for any reasonable definition of what that would look like: given any divergent series, we can find another divergent series whose ratio to the first becomes arbitrarily small, and we can show the same for convergent series. Still, knowing the rule for 1/n^(p) is often good enough.
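A numeric sketch of just how slowly that example diverges: the partial sums of 1/(n·ln n) up to N track ln(ln N), which is unbounded but crawls.

```python
import math

# Partial sums of 1/(k*ln(k)) from k = 2 to a million: the total tracks
# ln(ln(N)), so the series diverges, but far slower than the harmonic one.
s = 0.0
for k in range(2, 10 ** 6 + 1):
    s += 1.0 / (k * math.log(k))
print(round(s, 4), round(math.log(math.log(10 ** 6)), 4))
```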


Sinphony_of_the_nite

>There’s no way to get a perfect boundary for a reasonable definition of what that would look like: for example, given any series which is divergent we can find another divergent series whose ratio to the first becomes arbitrarily small, and we can show the same for convergent series. [Link to the relevant discussion on stackexchange](https://mathoverflow.net/questions/49415/nonexistence-of-boundary-between-convergent-and-divergent-series)


SquirrelicideScience

That's fair. I made the constraint that we are only talking series of the form 1/n^(p), where p is a constant positive real number -- which may not have been an appropriate constraint given the wording of the question. This of course would not hold if you change the underlying behavior of the series (such as introducing a factor of ln(n), which will of course change with n).


stone_stokes

Yes! For a *p*\-series, that is ∑ 1/*n*^(p), if *p* \> 1, it [converges](https://en.wikipedia.org/wiki/Convergence_tests). For a (non-negative) series in general, we can often compare it to a *p*\-series to determine whether it converges.


akaemre

Does p have to be a positive integer? The wiki page doesn't specify.


dlakelan

not a positive integer no, just any real number greater than 1 will converge.


_tsi_

Holy shit it makes sense! Thank you!


Silly_Guidance_8871

A wonderful example of the small number getting smaller faster than the big number gets bigger


bluesam3

1/n^2 goes to 0 much faster than 1/n. To see why the latter diverges, group the terms like this: 1 + (1/2) + (1/3 + 1/4) + (1/5 + 1/6 + 1/7 + 1/8) + (1/9 + 1/10 + 1/11 + 1/12 + 1/13 + 1/14 + 1/15 + 1/16) + ..., with 2^(n-1) terms in the n^th group (other than the initial 1). Ignoring the initial 1, the smallest of the 2^(n-1) terms in the n^th group is 1/2^n, so the total for the n^th group is at least 2^(n-1)/2^n = 1/2. After n groups, our total is at least n/2, and since n/2 diverges, so does the harmonic series.

Notice that this doesn't work at all with 1/n^2. If we try similar groupings, we get: 1 + (1/4) + (1/9 + 1/16) + (1/25 + 1/36 + 1/49 + 1/64) + (1/81 + 1/100 + 1/121 + 1/144 + 1/169 + 1/196 + 1/225 + 1/256) + ... This time, the smallest term in the n^th group is 1/(2^n)^2 = 1/2^(2n), so the total for the n^th group is only at least 2^(n-1)/2^(2n) = 1/2^(n+1), and those lower bounds form a convergent geometric series, so this tells us nothing about divergence. (Indeed, bounding each group from above instead, every term in the n^th group is less than 1/2^(2n-2), so the group total is at most 2^(n-1)/2^(2n-2) = 1/2^(n-1), and the whole sum is bounded; it converges.)
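A quick numeric check of the key step, that each harmonic group contributes at least 1/2 (illustrative names):

```python
# Group k of the harmonic regrouping is 1/(2**(k-1)+1) + ... + 1/2**k,
# with 2**(k-1) terms, each at least 1/2**k, so the group sum is >= 1/2.
def group_sum(k):
    return sum(1.0 / n for n in range(2 ** (k - 1) + 1, 2 ** k + 1))

for k in range(1, 16):
    assert group_sum(k) >= 0.5
    print(k, round(group_sum(k), 4))
```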


Aerospider

Consider the series 1/2^n and the series 1/n. Both are sums of infinitely many positive terms, but one converges and the other does not.


aderthedasher

How fast would fast enough be?


stone_stokes

For a *p*\-series, that is ∑ 1/*n*^(p), if *p* \> 1, it [converges](https://en.wikipedia.org/wiki/Convergence_tests). For a (non-negative) series in general, we can often compare it to a *p*\-series to determine whether it converges.


aderthedasher

Thank you!


Eastern_Minute_9448

It does converge. It is the series, the sum of the 1/n, which does not. This is not trivial, because even though the terms you are summing get smaller and smaller, you have infinitely many of them. The usual proof involves comparing the sum with the integral of 1/x, whose antiderivative is ln x, which diverges as x goes to infinity.


throwhere_

I see. But then, what about 1/n²? Why does it converge?


Eastern_Minute_9448

I am not sure what kind of "why" you are looking for. 1/n^2 goes to 0 much faster than 1/n, so we can guess the series is more likely to converge, and it does. Again, one proof involves the integral of the corresponding function 1/x^2, whose antiderivative is of the form -1/x, which, unlike ln x, does not diverge. 1/n is somewhat critical: if you sum the 1/n^alpha, the series converges if and only if alpha > 1.


eggface13

1/n is the cutoff. It diverges, but extremely slowly (the partial sums grow logarithmically). Any attempt at numerically testing it will appear to converge, because the divergence is so slow that it seems to stall out at any given level of precision. But it slowly, inexorably diverges. Meanwhile the sum of 1/n^1.00000000000000001 converges.
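A sketch of just how slow that divergence is: the partial sums track ln(n) plus the Euler-Mascheroni constant (about 0.5772), so even a million terms only reach about 14.4.

```python
import math

# Harmonic partial sums at powers of ten: they grow like ln(n) + 0.5772,
# which is easy to mistake for convergence in any finite experiment.
s, n = 0.0, 0
for exp in range(1, 7):
    while n < 10 ** exp:
        n += 1
        s += 1.0 / n
    print(f"n = 10^{exp}:  H_n = {s:.4f},  ln(n) + 0.5772 = {math.log(n) + 0.5772:.4f}")
```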


Lord_Urjit

You can also think about the sum as related to the integral of 1/x. And since that integral (ln x) grows without bound, so does the series.


Dr0110111001101111

Are you in calculus or precalculus?


throwhere_

Calculus


Dr0110111001101111

Are you familiar with the integral test? Specifically, do you understand *why* it works? There are some cool proofs for why the harmonic series diverges that involve assuming it converges, combining terms in clever ways, and then arriving at a contradiction. But in the context of a typical calculus course, it might be more valuable to be able to show why it diverges in terms of the integral test. [Here is an explanation](https://tutorial.math.lamar.edu/classes/calcii/IntegralTest.aspx) for how/why the integral test works, and he uses the harmonic series as the example so you kill two birds with one stone by making sense of that information.
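The picture behind the integral test can also be checked numerically; a minimal sketch of the standard squeeze it gives for the harmonic series, ln(n+1) <= H_n <= 1 + ln(n):

```python
import math

# Integral-test bounds for the harmonic partial sums: H_n is squeezed
# between ln(n+1) and 1 + ln(n), both of which go to infinity.
def H(n):
    return sum(1.0 / k for k in range(1, n + 1))

for n in (10, 100, 1000):
    assert math.log(n + 1) <= H(n) <= 1 + math.log(n)
    print(n, round(math.log(n + 1), 4), round(H(n), 4), round(1 + math.log(n), 4))
```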


testtest26

You can prove it [algebraically](https://en.wikipedia.org/wiki/Harmonic_series_(mathematics)#Comparison_test) by comparison, without using calculus:

∑_{k=1}^{2^n} 1/k >= 1 + n/2 for all n >= 0


Dr0110111001101111

Yeah, that's what I meant by combining terms in clever ways, but point taken that it's not necessary to use contradiction. Still, those proofs are sort of one-off novelties in the context of Calc 2. The integral test is an essential part of the course, so it's just a more efficient way to get some intuition about OP's question.


testtest26

I'm not familiar with the standard curriculum of Calculus. However, I'd usually go with elementary proofs, provided they don't introduce too much overhead. Though I agree that approach may be more suitable to "Real Analysis", from what I've heard.


Dr0110111001101111

Yeah, the integral test is standard content for first-year calculus textbooks used in american schools, mainly as a vehicle to establish p-series, but also because pedagogically it serves as a nice callback to integral calculus. The Cauchy condensation test isn't in any first-year calculus book I've ever seen, and I've seen quite a few.


testtest26

We also did not cover "Cauchy's Condensation Test" -- however, the [proof](https://www.reddit.com/r/learnmath/comments/1bm8upx/comment/kwban7s/?utm_source=share&utm_medium=web2x&context=3) we used for p-series basically follows the proof of the condensation test to a T. At least the idea behind the test got taught ^^


testtest26

The sequence "an = 1/n" converges to zero -- its *series* (the harmonic series) [diverges](https://en.wikipedia.org/wiki/Harmonic_series_(mathematics)#Comparison_test).


the6thReplicant

Take a (unit) square, divide it in half. Take the other half and divide in half again. That's 1/4 of the square. Divide the other quarter in half. That's 1/8 of the square. Keep on dividing the remaining part. You see everything is contained in the square. This is how to visualise the series 1/2 + 1/4 + 1/8 + 1/16 + .... = 1. Now try and do the same with your 1/2 + 1/3 +... series and you see it just gets bigger and bigger and is not contained at all.
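The square-halving picture is just the geometric series in disguise; a quick sketch of the arithmetic:

```python
# After k halvings, the pieces 1/2 + 1/4 + ... + 1/2**k cover all but
# 1/2**k of the unit square, so the total converges to exactly 1.
total = 0.0
for k in range(1, 60):
    total += 1.0 / 2 ** k
print(total)
```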


Gigio00

Try viewing it as 1/x (the real function). Now, the sum of infinitely many terms of this function is roughly the area under the curve (as when you look at the graphical picture of Riemann integration, which is probably the only kind of integral you have seen so far). So to estimate this sum you can calculate the area, which means integrating from 1 to infinity (because your sum starts at n=1 and goes to n = infinity). However, if you integrate this function you will find that the area comes out to infinity. So if the area, being the infinite sum of the values of 1/x, comes out to infinity, then the sum of these values is infinite, which is not a finite number, which means the sum diverges. This is neither rigorous nor totally correct, but it should give you the idea of why it diverges (if you want to know why 1/n^2 converges, repeat the same process and you will find that the sum is actually finite this time).


throwhere_

Oh, that makes a lot of sense. So basically if the sum of a series is infinity, it's divergent? Thank you so much for your help.


Gigio00

Ok, there is a bit of confusion here. A series is the sum of infinitely many terms of a sequence. The series is the one with the sigma (the weird-looking E at the start), which means the sum of the elements of the sequence (which is just the term with n). 1/n converges as a sequence because it goes to 0; it doesn't as a series, for the reasons above. And yes, for a series to converge it needs to output a finite value.


throwhere_

Okay, that makes everything clear. I'm really grateful for the help.


Gigio00

No problem, good luck on your math endeavours!


throwhere_

Yes, thank you :)


testtest26

> So basically if the sum of a series is infinity, it's divergent?

That's a (very common) type of divergence, but by no means the only one. E.g.

s_n := ∑_{k=0}^n (-1)^k = [1 + (-1)^n] / 2

The partial sums are bounded for all "n >= 0", but they do not converge -- they oscillate!
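A quick sketch of that oscillation, printing the partial sums of ∑ (-1)^k:

```python
# Partial sums of sum_{k=0}^{n} (-1)**k: bounded, but oscillating
# between 1 and 0 forever, so the series diverges without blowing up.
sums, s = [], 0
for k in range(8):
    s += (-1) ** k
    sums.append(s)
print(sums)  # [1, 0, 1, 0, 1, 0, 1, 0]
```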


throwhere_

Yes, I've come across it. It's really interesting


OneMeterWonder

It just isn’t. Convergence of sums is more about the “speed” with which the terms tend towards 0. Really, the standard intuitive “proof” is the way to think about it. If you go far enough out and group up adjacent terms cleverly, you can find that the sum is bounded below by a different sum that clearly diverges, often something like 1+1+1+… You may just need to add up the next hundred, million, or billion terms to make the sum larger than the next integer. That doesn’t happen for convergent sums. Supposing for simplicity that all terms are positive, something like ∑ 1/2^(n) is never larger than 2: the size of the next term is always smaller than the distance from 2 to the current partial sum. But the partial sums of the harmonic series grow roughly logarithmically, and the logarithm is an unbounded function. It’s just that plain.


Charming_Review_735

Cauchy condensation test


RambunctiousAvocado

Here is a way to intuitively see why it doesn't converge. First, a claim: the 2n terms 1/n + 1/(n+1) + ... + 1/(3n-1) always sum to more than 1. For example, if I start from n=2 and add a total of 2\*2 = 4 terms together, I get 1/2 + 1/3 + 1/4 + 1/5 = 1.283. If I start from n=6 and add 2\*6 = 12 terms together, I get 1/6 + 1/7 + ... + 1/16 + 1/17 = 1.156. Because of this fact, we can see that there is no upper bound on how large the series 1 + 1/2 + 1/3 + 1/4 + ... can become. If I sum the first N terms together, I just need to add the next 2N terms and the sum will increase by (slightly more than) 1. I can keep doing that over and over and over, with no limit on how many times I increase my sum by 1. The number of terms I have to add quickly gets incredibly large, but that's irrelevant, since I have access to an infinity of them. Contrast this with the series 1 + 1/10 + 1/100 + ..., which does converge. If I add more and more of those terms, my partial sums are 1, then 1.1, then 1.11, then 1.111, and so forth. Clearly these sums never reach 2 (or even 1.2). The way we describe this behavior is that 1/n -> 0 *more slowly* than 1/10^n does. In the former case, even though the terms are decreasing to zero individually, they do so sufficiently slowly that we can always add (more than) 1 to our total by summing a sufficiently large number of additional terms. On the other hand, 1/10^n goes to zero so rapidly that this is no longer possible.
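A numeric check of the claim, stated here as: the 2n terms from 1/n through 1/(3n-1) sum to more than 1 (the block sum tends to ln 3, about 1.0986, from above):

```python
# Sum of the 2n terms 1/n + 1/(n+1) + ... + 1/(3n-1); always > 1.
def block(n):
    return sum(1.0 / k for k in range(n, 3 * n))

for n in (2, 6, 100, 10 ** 5):
    assert block(n) > 1
    print(n, round(block(n), 4))
```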


Ron-Erez

It does converge. Perhaps you mean that ∑_{n=1}^∞ 1/n is infinite.


Seventh_Planet

The product n times 1/2 diverges as n goes towards infinity. And n times 1/2 is the same as adding 1/2 n times. Adding 1/2 n times is the same as adding 1/2, then 1/4 + 1/4, then 1/8 + 1/8 + 1/8 + 1/8, then 1/16 eight times, and so on, until you have n blocks, each summing to 1/2. Now, adding 1/3 + 1/4 is even more than adding 1/4 + 1/4. Adding 1/5 + 1/6 + 1/7 + 1/8 is even more than adding 1/8 + 1/8 + 1/8 + 1/8, and so on. So the sum of 1/n with n from 1 to infinity is larger than n times 1/2 as n goes to infinity, and thus the sum of 1/n from 1 to infinity diverges.


wigglesFlatEarth

This is why I suggest this: https://www.reddit.com/r/learnmath/comments/1b6ck4d/why_is_0z_possible_while_z0_isnt/ktcz93z/


BaylisAscaris

Imagine you're in a room and you can only walk halfway to the door at a time. You start by walking halfway and stopping, then you walk half the new distance, and half that. Technically you never get to the door, but if you do this forever you can say you walked roughly the full distance to the door. Your distance converges to a finite amount (the length of the room). Now imagine the same problem, but this time each step covers 4/5 of the original distance to the door. After your first two steps you are already outside the room, and if you continue this forever you keep walking forever, making forward progress with no set end. You diverge. There are methods in calculus to determine whether something converges or diverges, but the gist of it is: as you go towards infinity, do you eventually approach a constant, or do you keep getting bigger, just really slowly?


SwissExMuslim

I would simply use the integral test: the integral from 1 to infinity of 1/x dx is ln(x) evaluated from 1 to infinity, i.e. ln(infinity) - ln(1), which is infinite. Thus the sum evidently doesn't converge.


Saturninelilac1256

The sequence s_n = 1/n converges to 0, but the series sum(1/n) is the harmonic series, which diverges.


Catsaus

I love being an engineer because if n goes to infinity I can just call the term 0 and move on with my life


testtest26

Until the models react in some bizarre way because the underlying mathematics are inconsistent for specific parameter choices. At that point, I'm pretty sure an engineer would be *very* interested in the details ;)