1.2

Defining Limits and Using Limit Notation

The limit of a function describes the value that the function approaches as the input approaches a specific point.

Limits: 10–12% of exam
Context

What this topic is and why it exists

Defining limits is about understanding what value a function approaches as the input approaches a certain point.
The notation 'lim x→c f(x) = L' means that as x gets arbitrarily close to c (without necessarily equaling c), f(x) gets arbitrarily close to L.
This concept allows calculus to measure change at an instant, rather than over an interval, which is the core of what makes calculus powerful.
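To see what "change at an instant" means numerically, here is a minimal sketch (the function f(x) = x² and the point x = 3 are chosen for illustration, not taken from this page): the average rate of change over a shrinking interval [3, 3 + h] approaches a single value, the instantaneous rate, as h shrinks toward 0.

```python
# Illustration (assumed example, not from the source): average rate of
# change of f(x) = x^2 over [3, 3 + h] for shrinking h. The values
# approach 6, the instantaneous rate of change at x = 3.
def f(x):
    return x ** 2

for h in [0.1, 0.01, 0.001, 0.0001]:
    avg_rate = (f(3 + h) - f(3)) / h
    print(f"h = {h:>7}: average rate = {avg_rate:.6f}")
```

No single h gives the instantaneous rate; the limit of the ratio as h → 0 does, which is exactly the idea the notation captures.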
Evaluating a limit is not just plugging numbers into a formula; it is describing how the function behaves near a point.
A common mistake is thinking that if you can't directly substitute x = c into f(x) to get L, the limit doesn't exist.
However, limits are about the approach, not the exact value at the point.
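A quick numerical sketch of that distinction (the function here is an assumed example, not from this page): f(x) = (x² − 1)/(x − 1) is undefined at x = 1, because substituting gives 0/0, yet the values of f near x = 1 clearly approach 2 from both sides.

```python
# Illustration (assumed example): direct substitution at x = 1 fails
# (division by zero), but the limit as x -> 1 still exists and equals 2.
def f(x):
    return (x ** 2 - 1) / (x - 1)

for x in [0.9, 0.99, 0.999, 1.001, 1.01, 1.1]:
    print(f"f({x}) = {f(x):.6f}")  # values cluster around 2
```

The table of values approaches 2 from both the left and the right, so lim x→1 f(x) = 2 even though f(1) is undefined.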
Misunderstanding this can lead to errors in continuity analysis and later in derivatives and integrals.
Limits are foundational; every derivative and integral you encounter will rely on this concept.
Get this wrong, and everything else crumbles.