The limit is one of the key concepts of calculus. Calculus has two major subfields: differential calculus and integral calculus. Differential calculus is concerned with rates of change and with finding tangent lines to curves. Integral calculus is concerned with computing the total area under a curve and measuring the cumulative effect of a process of continuous change. Both of these concepts are defined in terms of limits [1]. In fact, nearly every notion of calculus is a limit in one way or another, and it is the concept of limit that distinguishes calculus from algebra.
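To make this concrete, both central objects of calculus can be written as limits: the derivative as the limit of difference quotients, and the definite integral as the limit of Riemann sums (shown here, as one standard formulation, with right endpoints of a uniform partition):

```latex
f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}

\int_a^b f(x)\,dx = \lim_{n \to \infty} \sum_{i=1}^{n} f\!\left(a + i\,\Delta x\right)\Delta x,
\qquad \Delta x = \frac{b-a}{n}
```

The first limit captures the instantaneous rate of change mentioned above; the second captures the total area under the curve as the sum of ever-thinner rectangles.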
The concept of limit was used implicitly by ancient Greek scholars. However, it was Isaac Newton in the 17th century who explicitly spoke of limits. In the 18th century, the French mathematician Jean le Rond d'Alembert (1717–1783) used the notion of a limit and attempted to define it. Finally, another French mathematician, Augustin-Louis Cauchy (1789–1857), formulated a satisfactory mathematical definition of the limit, which is still used today.
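The definition that grew out of Cauchy's work, in the modern ε–δ form later made fully precise by Karl Weierstrass, reads:

```latex
\lim_{x \to a} f(x) = L
\iff
\forall \varepsilon > 0 \;\, \exists \delta > 0 :
\quad 0 < |x - a| < \delta \;\Rightarrow\; |f(x) - L| < \varepsilon
```

Informally: the values of $f(x)$ can be made as close to $L$ as desired by taking $x$ sufficiently close to (but different from) $a$.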
[1] In the 1960s an alternative formulation of calculus, called non-standard analysis, was introduced. Non-standard analysis gives a rigorous foundation to the concept of infinitesimals, which was used only vaguely by Gottfried Leibniz (1646–1716), Leonhard Euler (1707–1783), and many others in the early days of calculus. For more information, see the Wikipedia article on non-standard calculus.