Generality of algebra

In the history of mathematics, the generality of algebra was a phrase used by Augustin-Louis Cauchy to describe a method of argument that was used in the 18th century by mathematicians such as Leonhard Euler and Joseph-Louis Lagrange,[1] particularly in manipulating infinite series. According to Koetsier,[2] the generality of algebra principle assumed, roughly, that the algebraic rules that hold for a certain class of expressions can be extended to hold more generally on a larger class of objects, even if the rules are no longer obviously valid. As a consequence, 18th-century mathematicians believed that they could derive meaningful results by applying the usual rules of algebra and calculus that hold for finite expansions, even when manipulating infinite expansions.

In works such as Cours d'Analyse, Cauchy rejected the use of "generality of algebra" methods and sought a more rigorous foundation for mathematical analysis.

Example

An example[2] is Euler's derivation of the series

$$\frac{\pi - x}{2} = \sin x + \frac{\sin 2x}{2} + \frac{\sin 3x}{3} + \cdots \qquad (1)$$

for $0 < x < \pi$. He first evaluated the identity

$$\frac{1 - r\cos x}{1 - 2r\cos x + r^{2}} = 1 + r\cos x + r^{2}\cos 2x + r^{3}\cos 3x + \cdots \qquad (2)$$

at $r = 1$ to obtain

$$\frac{1}{2} = 1 + \cos x + \cos 2x + \cos 3x + \cdots \qquad (3)$$
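
For orientation (the article does not spell this step out, but it is the standard source of the identity): for $|r| < 1$, (2) is the real part of a convergent geometric series,

$$\sum_{n=0}^{\infty} r^{n} e^{inx} = \frac{1}{1 - re^{ix}}, \qquad \operatorname{Re}\frac{1}{1 - re^{ix}} = \frac{1 - r\cos x}{1 - 2r\cos x + r^{2}},$$

so Euler's substitution $r = 1$ evaluates the identity precisely where its hypothesis $|r| < 1$ fails.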

The infinite series on the right hand side of (3) diverges for all real . But nevertheless integrating this term-by-term gives (1), an identity which is known to be true by Fourier analysis.[example needed]
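
The behaviour is easy to check numerically. The following sketch (a minimal illustration, not part of the original article; the function names and the sample point $x = 1$ are arbitrary choices) sums the series in (1) directly and compares the two sides of (2) for $r$ just below $1$:

    import math

    def sine_series(x, n_terms=200_000):
        # Partial sum of the right-hand side of (1): sin x + (sin 2x)/2 + ...
        return sum(math.sin(k * x) / k for k in range(1, n_terms + 1))

    def cosine_series(x, r, n_terms=20_000):
        # Partial sum of the right-hand side of (2): 1 + r cos x + r^2 cos 2x + ...
        return sum(r ** k * math.cos(k * x) for k in range(n_terms + 1))

    x = 1.0  # an arbitrary sample point in (0, pi)

    # (1): the sine series converges, slowly, to (pi - x)/2.
    print((math.pi - x) / 2)  # 1.0707963...
    print(sine_series(x))     # agrees to several decimal places

    # (2): for r < 1 the series converges to the closed form, and the
    # closed form tends to 1/2 as r -> 1-, although the series itself
    # diverges at r = 1 (the substitution Euler made).
    for r in (0.9, 0.99, 0.999):
        closed_form = (1 - r * math.cos(x)) / (1 - 2 * r * math.cos(x) + r ** 2)
        print(r, closed_form, cosine_series(x, r))

In modern terms, the divergent series in (3) is Abel summable to $\tfrac{1}{2}$, since the closed form in (2) tends to $\tfrac{1}{2}$ as $r \to 1^{-}$; this is one way of making Euler's manipulation precise.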

See also[edit]

References

  1. Jahnke, Hans Niels (2003), A history of analysis, American Mathematical Society, p. 131, ISBN 978-0-8218-2623-2.
  2. Koetsier, Teun (1991), Lakatos' philosophy of mathematics: A historical approach, North-Holland, pp. 206–210.