This article is within the scope of WikiProject Mathematics, a collaborative effort to improve the coverage of mathematics on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
This article is within the scope of WikiProject Spoken Wikipedia, a collaborative effort to improve the coverage of articles that are spoken on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks.
This page has archives. Sections older than 365 days may be automatically archived by Lowercase sigmabot III when more than 10 sections are present.
Questionable example
The article states that f(x) = x²-1/x-1 is undefined at x=0. I would disagree, since it can easily be simplified to f(x) = x+1. It seems the same as arguing that x²/x would be undefined at 0 (or actually any g(x)*x/x). I can see how the example is convenient in other ways, because the formula is simple, but I would propose to either replace it by sin(x)/x, or at least note that the statement "f(x) is not defined for x=0" is debatable and that the example was chosen for its simplicity. - Jay
84.171.79.63 (talk) 19:11, 28 June 2014 (UTC)
To clarify... the function in the article is f(x) = (x²-1)/(x-1) (rather than f(x) = x²-1/x-1 ) and the article states that it is not defined at x=1 (rather than at x=0).
Meters (talk) 22:45, 7 July 2014 (UTC)
I still don't see the IP's point. Just because (x² − 1)/(x − 1) can be simplified to x + 1 doesn't mean that it is simplified. And our article "sinc" does specify that sinc(x) = sin(x)/x when x ≠ 0. One could make a better case that sin(x)/x isn't defined at x = 0, but that isn't quite correct either when we work on the extended real line. — Arthur Rubin (talk) 16:08, 9 July 2014 (UTC)
I think the IP's point might be that in practice, other than to come up (in a textbook section about limits) with a function with a specific value excluded, no one would ever define a function like (x² − 1)/(x − 1). A less trivial and perhaps somehow "better" example would be one with a hole at x = 1 that cannot be trivially simplified. - DVdm (talk) 16:53, 9 July 2014 (UTC)
I agree that the function in the example is bad. The function simply doesn't have a pole at x=1. It is perfectly well defined in the vicinity of x=1.
95.192.5.53 (talk) 11:36, 20 December 2015 (UTC)
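To make the point under discussion concrete, here is a minimal Python sketch (the helper name `f` is mine, not from the article) showing that (x² − 1)/(x − 1) agrees with x + 1 away from x = 1, yet evaluating at x = 1 itself fails:

```python
# f(x) = (x**2 - 1)/(x - 1) equals x + 1 everywhere except x = 1,
# where the expression becomes 0/0; the limit as x -> 1 is 2.
def f(x):
    return (x**2 - 1) / (x - 1)

# Approaching x = 1 from either side gives values tending to 2 ...
for h in (0.1, 0.01, 0.001):
    print(f(1 + h), f(1 - h))

# ... but evaluating at x = 1 itself divides 0 by 0.
try:
    f(1)
except ZeroDivisionError:
    print("f(1) is undefined")
```

Whether one calls this "undefined" or a removable discontinuity is exactly the terminological question raised above.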
Limit of a sequence
I think this sentence is not correct
On the other hand, a limit L of a function f(x) as x goes to infinity, if it exists, is the same as the limit of any arbitrary sequence an that approaches L, and where an is never equal to L.
I mean, it is technically correct, but it is an unneeded tautology. Of course the limit L of a function f(x) is the same as the limit of any sequence that approaches L, by definition :D. -- ԱշոտՏՆՂ (talk) 07:00, 24 August 2019 (UTC)
Most of the elements of a true statement were in that claim, but as written it was not entirely correct, and what was correct was opaque. I have reworded the paragraph and cited a reliable source. Hopefully it is easier to understand now. — Anita5192 (talk) 19:46, 25 August 2019 (UTC)
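For reference, the standard sequential (Heine) criterion for a limit at infinity — a well-known fact, stated here independently of whatever wording the article now uses — reads:

```latex
\lim_{x \to \infty} f(x) = L
\iff
\text{for every sequence } (a_n) \text{ with } a_n \to \infty,\quad
\lim_{n \to \infty} f(a_n) = L.
```

Note that the sequences (aₙ) tend to infinity, not to L; the original sentence conflated the two, which is presumably what made it read as a tautology.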
This section is quite messy: obviously copy-pasted from reference "[8]", but with insufficient understanding.
The first paragraph claims to "formally define convergence", but uses the word "convergence" in the definition. What it actually defines is [a special case of] "convergence of order α", not convergence in general, and convergence of order α does not require that limit to exist, only the corresponding lim sup.
Then it introduces a function f, but one has to guess that the author now considers the sequence p(n+1) = f(p(n)), which is written nowhere.
Then it speaks of "linear convergence" without ever having defined it (actually, α = 1).
Then it says "the series converges", but means the sequence, not a series.
Then, "if it is found that there is something better than linear..." (meaning convergence of order > 1), but it is totally obscure how one might find whether there is something better.
Then it speaks of quadratic convergence without ever defining it (actually, α = 2).
It also forgets that sequences may converge with order between 1 and 2, the most frequent example being the secant method (or regula falsi), which has convergence of order α = (√5 + 1)/2 ≈ 1.618, and so on.
Such copy-pasting without understanding should better be avoided. This material also belongs in a more specialized article, such as rate of convergence (maybe the best match? move it there and replace it with a link?), or recurrent sequences, dynamical systems, iterative methods, or ... — MFH: Talk 00:06, 28 March 2022 (UTC)
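As a rough numerical illustration of the orders discussed above, here is a Python sketch (my own hypothetical `estimate_order` helper, not taken from the article or from reference [8]) that estimates α for a linearly and a quadratically convergent iteration towards √2:

```python
import math

def estimate_order(errors):
    """Estimate the convergence order alpha from the last three errors,
    using alpha ~= log(e_{n+1}/e_n) / log(e_n/e_{n-1})."""
    e0, e1, e2 = errors[-3:]
    return math.log(e2 / e1) / math.log(e1 / e0)

root = math.sqrt(2)  # the limit both iterations approach

# Linearly convergent iteration (order alpha = 1): a damped update
# x -> x - 0.1*(x^2 - 2), a contraction near sqrt(2).
x, lin_errors = 1.5, []
for _ in range(20):
    x = x - 0.1 * (x * x - 2)
    lin_errors.append(abs(x - root))

# Quadratically convergent iteration (order alpha = 2):
# Newton's method for x^2 - 2, i.e. x -> (x + 2/x)/2.
x, newton_errors = 1.5, []
for _ in range(3):
    x = (x + 2 / x) / 2
    newton_errors.append(abs(x - root))

print(round(estimate_order(lin_errors), 2))     # close to 1
print(round(estimate_order(newton_errors), 2))  # close to 2
```

The same estimator applied to secant-method iterates would give a value near (√5 + 1)/2, which is the in-between case the section omits.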
Essential? not.
Limits are *not* essential for differential or integral calculus! As far as I know, everyone is taught (in their calculus education) that "taking a limit" is one of two ways to approach the subject, the other being infinitesimals. The lead is wrong. I also note that while the article claims the importance of limits in calculus, it proceeds to ignore those applications. Why? This needs to be fixed. 174.131.48.89 (talk) 04:41, 23 August 2022 (UTC)