All living things are composed of biological cells. Cells contain hereditary information and derive from preexisting cells. All cells are similar in chemical composition in organisms of a similar species. Animals, plants, and fungi consist of eukaryotic cells, which have an outer membrane and a nucleus that contains DNA. Prokaryotic cells, simple cells found in bacteria, lack a nucleus. Animal cells obtain energy via food; plant cells obtain energy from the sun by the process of photosynthesis.
“It is the cells which create and maintain in us … our will to live and survive.”
Albert Claude, Nobel lecture (1974)
The idea of the biological cell emerged in 1665, when English naturalist Robert Hooke (1635–1703) was looking at thin slices of cork bark through a compound microscope. Hooke noticed what he called “small rooms” butting up against one another in a pattern that reminded him of an aerial view of monks’ chambers at an abbey. In Micrographia (1665), his work related to his microscopic observations, Hooke named them “cells,” after the Latin word cellula, meaning “small room.”
Hooke’s investigation led to the development of cell theory, which evolved from the combined efforts of microscopists such as Dutchman Antonie Philips van Leeuwenhoek, who discovered blood cells in 1684. Scottish botanist Robert Brown observed the first nucleus in plant cells in 1833. By 1839, German physiologist Theodor Schwann and German biologist Matthias Schleiden recognized that cells are the elementary particles of organisms in plants and animals, and that some organisms are unicellular and others multicellular. German pathologist Rudolf Virchow confirmed the principle of cell division, essential in growth and reproduction, in 1855.
Cell theory signified an important conceptual advance in biology, laying the groundwork for major scientific breakthroughs in the twentieth century, such as the discovery of DNA by James Watson and Francis Crick in 1953. CK
1668
Decimal System
John Wilkins
The first proposal for universal measurements, based on tens
A French artwork from 1799, demonstrating metric measurements, including the liter, kilogram, meter, stere (unit of volume), Franc (decimalized currency), and are (unit of area).
The decimal system is a numeral system that employs the number ten as the base and uses ten different numerals: the digits from one to nine inclusive, and zero. A decimal point, signified by a dot—as in 0.45, for example—is used to represent decimal fractions.
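To illustrate the positional principle (a modern worked example, not drawn from Wilkins's essay), every digit contributes a value fixed by a power of ten determined by its position:

```latex
% Positional base-ten notation: each digit times a power of ten.
345.67 = 3\times 10^{2} + 4\times 10^{1} + 5\times 10^{0} + 6\times 10^{-1} + 7\times 10^{-2}
```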
“ … to which purpose, it were most desirable to find out some natural Standard, or universal Measure, which hath been esteemed by Learned men as one of the desiderata in Philosophy.”
John Wilkins, An Essay towards a Real Character, and a Philosophical Language (1668)
Many ancient cultures calculated using numerals based on ten (the number of fingers of both hands), but the outline for modern European decimal notation was introduced in 1668 by an English scientist and natural philosopher, Bishop John Wilkins (1614–72), when the Royal Society of London published his book An Essay towards a Real Character, and a Philosophical Language. Wilkins proposed an innovative plan for a system that was based on a single “universal measure” and was designed to measure all things. Adopting an international standard would help “all those different Nations who traffick together” by facilitating commercial trade. Wilkins outlined what was required for such a system and advocated that it should be based on accurately measuring the Earth or time. He devised a decimal system with a standard unit that was almost one meter long, and that was to be used with decimal multiples and submultiples. He also suggested decimal units to measure area, volume, and weight.
Wilkins's proposal was debated for decades. The first practical application of his idea came in the eighteenth century, when it became politically desirable to adopt a common system of weights and measures. France introduced the metric system in 1799, and by the early nineteenth century it had been adopted by the international scientific community.
The modern metric system, known as the International System of Units (SI, from the French Le Système Internationale d’unités), was established in 1960. Practical and easy to use, the SI contains some of the elements that Wilkins proposed and is the most widely used system of measurement in the world. CK
1669
Phlogiston Theory
Johann Joachim Becher
The theory that burning required the release of the hypothetical element phlogiston
An illustration from the 1738 edition of Physica subterranea by Johann Joachim Becher.
In the late seventeenth and mid-eighteenth centuries, chemical theorists used the phlogiston theory to explain burning and rusting. They believed that every combustible substance contained the element phlogiston, and the liberation of this hypothetical elastic fluid caused burning. The ash or residue that remains after burning was believed to be a dephlogisticated substance.
German alchemist and physician Johann Joachim Becher (1635–82) initiated the theory of combustion that came to be associated with phlogiston. In his book Physica subterranea, published in 1669, he suggested that instead of the classical elements of earth, air, water, and fire, bodies were made from three forms of earth: terra lapidea (vitreous), terra mercurialis (mercurial), and terra pinguis (fatty). He thought combustible substances were rich in terra pinguis, a combustible earth liberated when a substance burned. For him, wood was a combination of terra pinguis and wood ashes.
“We are not able to ascertain the weight of phlogiston, or … the oxygenous principle.”
Joseph Priestley, theologian
Becher’s theories of combustion influenced German chemist Georg Ernst Stahl (1660–1734), who gave the hypothetical substance terra pinguis the name “phlogiston,” derived from the Greek word phlogizein, meaning “set alight.” He believed there was a link between the processes of burning and rusting, and in this he was right: each, we now know, depends on a chemical reaction with oxygen known as oxidation. However, problems arose for the theory because rusty iron weighs more than unrusted iron, while ash weighs less than the original burned object. Phlogiston was finally disproved by French chemist Antoine-Laurent Lavoisier (1743–94). CK
1670
Pascal’s Wager
Blaise Pascal
An argument for belief in God based on the practical benefit for the believer
Pascal’s Wager is a logical argument to show that the potential benefits of believing in the Christian God outweigh the potential risks of not doing so. The argument appears in the posthumously published Pensées (1670) of French mathematician, scientist, and philosopher Blaise Pascal (1623–62). Pascal assumes that either the Christian God exists or does not, and that each person can decide whether to believe in him or not. If God does not exist, either believing in him or not believing in him will result in the same outcome, though the believer may have a happier life than the nonbeliever because the believer can gain comfort from the religion. If God does exist, then believing in him grants the possibility for eternal life, while not believing in him will result in eternal damnation.
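Read in modern terms, the wager is an informal decision matrix. The sketch below, with purely illustrative payoff values chosen for this example (they are not Pascal's), shows how the comparison of expected outcomes might be set out:

```python
# A decision-matrix reading of Pascal's Wager (illustrative, hypothetical values).
# math.inf stands in for the infinite reward of salvation; the finite numbers
# for worldly costs and comforts are arbitrary placeholders, not Pascal's.
from math import inf

# payoff[choice][state of the world]
payoff = {
    "believe":     {"God exists": inf,  "God does not exist": 0},
    "not believe": {"God exists": -inf, "God does not exist": 1},
}

def expected_value(choice, p_god_exists=0.5):
    """Weight each outcome by an assumed probability that God exists."""
    return (p_god_exists * payoff[choice]["God exists"]
            + (1 - p_god_exists) * payoff[choice]["God does not exist"])

for choice in payoff:
    print(choice, expected_value(choice))
# For any nonzero probability that God exists, "believe" has infinite expected
# value, which is the heart of Pascal's argument.
```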
“If you gain, you gain all … Wager then, without hesitation, that He exists.”
Blaise Pascal
Previous thinkers, such as St. Thomas Aquinas and René Descartes, had attempted to prove God’s existence through rational, logical proofs, but Pascal took another tack. His focus on the best possible outcome, though it echoed some classical writings, eschewed the previous metaphysical justifications for Christianity, and instead adopted a practical measurement based on the potential outcomes.
Pascal’s Wager is both a logical argument and an exercise in what would come to be known as decision theory. It makes no attempt at convincing potential believers of the existence of God. Instead, the wager offers a pragmatic approach. Pascal’s argument paves the way for other belief systems, such as existentialism, which focus less on metaphysical or spiritual questions in lieu of potential practical benefits. MT
1673
Mathematical Function
Gottfried Wilhelm Leibniz
The mathematic concept that a function is a relation between two variables
In mathematics, a function is a relationship between values. Each of its input values gives back one output value. Often it is denoted as f(x), where x is the value given to it. For example, where the number x relates to its square x², the output of function f corresponds to input x and is written f(x). So if the input variable, or argument, is -4 then the output is 16, written as f(-4) = 16. The function concept is fundamental in modern mathematics and essential in science, where it is used for formulating physical relationships.
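In present-day notation (a minimal sketch, not Leibniz's own symbolism), the squaring function of the example above can be written and evaluated directly:

```python
# A function maps each input to exactly one output; here f(x) = x squared.
def f(x):
    return x ** 2

print(f(-4))  # 16, matching the example f(-4) = 16 above
print(f(3))   # 9
```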
Italian physicist, mathematician, and astronomer Galileo Galilei (1564–1642) was the first to articulate the dependency of one quantity on another in mathematics. However, it was German mathematician and philosopher Gottfried Wilhelm Leibniz (1646–1716) who introduced the word “function” into mathematics. In his manuscript Methodus tangentium inversa, seu de functionibus (The Inverse Method of Tangents, or On Functions, 1673), Leibniz used “function” to describe any quantity varying from point to point on a curve, such as the length of the tangent.
“ … other kinds of lines which, in a given figure, perform some function.”
Gottfried Wilhelm Leibniz
In 1692 and 1694, Leibniz published articles in which he applied the name “function” to parts of straight lines. In 1694, in a piece for the scientific journal Acta Eruditorum (Acts of the Scholars), Swiss mathematician Jacob Bernoulli (1654–1705) used the word “function” in the same sense. The notation f(x) was introduced in 1734 by Swiss mathematician and physicist Leonhard Euler (1707–83), and German mathematician Peter Dirichlet (1805–59) formulated the modern definition of “function” in 1837. CK
1674
Microbiology
Antonie van Leeuwenhoek
A branch of science that focuses on the study of microorganisms
A twentieth-century portrait of Antonie van Leeuwenhoek, the “father of microbiology,” by Ned M. Seidler.
Microbiology is a science that involves the study of tiny life-forms known as “microorganisms” or “microbes,” including archaea, algae, bacteria, molds, protozoa, viruses, and yeasts. Microbiology has grown to include bacteriology, immunology, mycology, parasitology, and virology. It has multiple applications and is employed in the study of genetics and disease, as well as in industry.
In 1674 Dutch scientist Antonie van Leeuwenhoek (1632–1723) was the first to see bacteria and protozoa when he noticed “animalcules,” or minute organisms, while closely observing water through a single-lens microscope of his own design. Van Leeuwenhoek was apparently inspired to take up microscopy after seeing a copy of English scientist Robert Hooke’s Micrographia (1665), in which the term “cell” was first coined to describe what came to be recognized as the basic unit of all living things.
“I see some of [the organisms] open their mouths and move the organs or parts.”
Antonie van Leeuwenhoek
In 1676 van Leeuwenhoek wrote to the Royal Society in London to announce his discovery of “animalcules.” Initially, the society members doubted him, but, in 1680, the Royal Society verified van Leeuwenhoek’s discoveries and appointed him as a fellow. He went on to discover red blood cells and then spermatozoa, coming to the radical conclusion that fertilization occurs when a spermatozoon penetrates an egg.
The discovery of microorganisms was revolutionary, later proving particularly important with regard to their relationship with decay and disease. The work of French chemist and microbiologist Louis Pasteur (1822–95) in this field led to the germ theory of disease and the development of vaccines and pasteurization. CK
1675
Evangelicalism
Philipp Jakob Spener
A Protestant movement upholding the sole authority of the Bible in matters of doctrine
A copper engraving from 1683 of Philipp Jakob Spener, by German engraver Philipp Kilian. Spener became interested in reforming Lutheran practice while studying in Strassburg.
Evangelicalism is a Christian movement that emphasizes the piety of the individual, and their relationship with God and the Savior. Evangelicals believe that the individual is saved by faith in the death of Christ, which atoned for humanity’s sin, and that humanity is sinful because of the fall of Adam in the Garden of Eden. Practitioners need to be “born again” by a process of personal conversion that saves them from eternal damnation through redemption, after which they are promised heavenly salvation. Evangelicals de-emphasize ritual: they view good works and the sacraments as being merely symbolic, and do not believe that ordination imparts any supernatural gifts. They look to the Bible as the sole authority in matters of theological doctrine.
“What they take to be faith is by no means that true faith which is awakened through the Word of God, by the illumination, witness, and sealing of the Holy Spirit, but is a human fancy.”
Philipp Jakob Spener, Pia Desideria (Pious Desires, 1675)
Evangelicalism grew out of the ideas of German Lutheran reformer Philipp Jakob Spener (1635–1705). His influential book, Pia Desideria (Pious Desires, 1675), attacked the clergy, advocated biblical study for both individuals and groups, and urged the cultivation of inner piety. His friend, German Lutheran August Hermann Francke (1663–1727), laid practical foundations of the movement by organizing a collegium philobiblicum (assembly of Bible lovers), devoted to scholarly studies of the scriptures. Francke’s ideas were spread via his students at the University of Halle.
The term “Evangelicalism” came into general use in England during the time that a series of revivals under the founder of Methodism, John Wesley (1703–91), and the itinerant English evangelical George Whitefield (1715–70) took place. In North America, the revival was spearheaded by U.S. theologian Jonathan Edwards (1703–58). By the early nineteenth century, Evangelical Protestantism was the most popular expression of Christianity in the United States; in the United Kingdom, evangelicals were strongly associated with missionary work and social reform. CK
c. 1675
Feminist Biblical Exegesis
Various
Critical analysis of the Judeo-Christian scriptures from a feminist perspective
A restored thirteenth-century stained glass of the Virgin Mary, from the Church of St. Mary Magdalene in Chewton Mendip, United Kingdom.
Feminist biblical exegesis is concerned with the representation of women in the Bible. It often challenges long-accepted interpretations of the texts put forward by male scholars working in patriarchal societies. Feminist criticism of the Bible examines the social construction of gender and what that means for the depiction of female figures in the Bible. It attempts to reinterpret problematic verses, and to address the issue of misogynistic narratives.
“If all Men are born free, how is it that all Women are born Slaves?”
Mary Astell, Some Reflections on Marriage (1700)
The roots of feminist biblical exegesis lie in the seventeenth century, in the writings of Protestant female theological writers, such as Quakerism founder Margaret Fell (1614–1702), Philadelphian prophetesses Jane Lead (1624–1704) and Ann Bathurst (c. 1638–c. 1704), Quaker Elizabeth Bathurst (d. 1690), writer Mary Astell (1668–1731), and visionary M. Marsin (a. 1694–1701). They advocated the equality of the sexes in marriage, society, and religion, arguing that this radical universalism had been instituted by God at creation. For centuries the Virgin Mary had been considered a counterfigure to Eve and an instrument of salvation, and these writers revived the idea as they advocated the rights of women and sought scriptural precedent for female preachers.
Mary Astell’s ideas regarding women, and the religious education of women in particular, were ridiculed by contemporaries, but she has become known as the “first English feminist.” And as the feminist movement grew from the late nineteenth century, so feminist biblical exegesis became more popular. In 1895 and 1898, Elizabeth Cady Stanton (1815–1902) and a committee of twenty-six women published the bestselling but controversial two-part The Woman’s Bible, partly as a response to women’s exclusion from biblical scholarship and partly as a challenge to the traditional religious orthodoxy that woman should be subservient to man. CK
1677
Pantheism
Baruch Spinoza
The view that God and the universe are one, infinite, eternal, and knowable to mankind
Paradisical Landscape with the Creation of Man (1594), painted by Jan Brueghel the Elder.
The classic statement of pantheism, by philosopher Baruch Spinoza (1632–77), is Ethica, ordine geometrico demonstrata (Ethics, Demonstrated in a Geometrical Manner, 1677), although the term “pantheism” itself was not coined until after Spinoza’s death. In the Ethics, Spinoza argues, “Whatsoever is, is in God, and without God nothing can be, or be conceived … God is the indwelling and not the transient cause of all things.” To paraphrase, God is the only substance, the only thing that exists in itself. Everything else exists only in God.
Spinoza rejects the idea that God is a person; it is a mistake, he argues, to think that God acts freely, plans the fate of the world, or is moved by prayer. He also rejects the idea that God is external to and distinct from the universe, famously using the phrase Deus, sive Natura (God, or Nature). Both of these positions are characteristic of later versions of pantheism and have also been detected in precursors of Spinoza, such as the Stoic philosophers of ancient Greece and Rome, and in non-Western traditions, especially philosophical Taoism. Pantheism is often also associated with a religious reverence for the natural world, although this is not true in Spinoza’s case.
Historically, pantheism has been controversial—often regarded as tantamount to atheism—but also influential. The high-water mark for pantheism was in the late eighteenth and nineteenth centuries, when it attracted thinkers such as English poet and philosopher Samuel Taylor Coleridge (1772–1834), U.S. poet and essayist Ralph Waldo Emerson (1803–82), and German philosopher G. W. F. Hegel (1770–1831). Part of pantheism’s appeal then was that it seemed to provide a middle route between theism and atheism—in effect it promised a religion free of superstition, and a science that still made room for spirituality. Less in the public eye today, it continues to influence philosophical and religious thought. GB
1678
Wave Theory of Light
Christiaan Huygens
The concept that light is emitted in all directions as a series of waves
A diagram showing the principle of the interference of light, published by Thomas Young in 1806.
The wave theory of light refraction and reflection, formulated in 1676 and 1677 by Dutch mathematician, astronomer, physicist, and horologist Christiaan Huygens (1629–95), was based on a new concept of light as wavelike. The theory postulates that the velocity of light in any substance is inversely proportional to its refractive index.
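Expressed in modern notation (a later formulation rather than Huygens's own), the relationship reads:

```latex
% Speed of light v in a medium of refractive index n,
% where c is the speed of light in a vacuum.
v = \frac{c}{n}
```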
In 1678 Huygens completed his Traité de la lumière (Treatise on Light). He read portions of the treatise to the French Royal Académie des sciences (Academy of Sciences) in 1678 and published it in 1690. Huygens posited that the more light was bent, or refracted, by a substance, or ether, the slower it would move while traversing that substance. By supposing that a substance consisted of minute, uniform elastic particles compressed close together, Huygens was able to explain both refraction and reflection.
However, Huygens’s wave theory of light was different from that proposed by the English physicist, mathematician, astronomer, natural philosopher, alchemist, and theologian Sir Isaac Newton (1642–1727) in Opticks (1704), his book about optics and light refraction. Newton’s theory was premised on a proposal by Pierre Gassendi (1592–1655) that light traveled as a shower of particles, referred to as “corpuscles.” Because Newton was a great scientist and had many zealous supporters, Huygens’s theory was dismissed and neglected until the nineteenth century.
Huygens’s theory was vindicated by experiments conducted by English scientist Thomas Young (1773–1829) in 1801, and French engineer Augustin Fresnel (1788–1827) in 1816. The Huygens-Fresnel principle is named after these two men, and it is a recognized basis for understanding and predicting the wave propagation of light. When technology advanced sufficiently to measure the speed of light accurately, Huygens’s theory was proved correct. CK
c. 1684
Infinitesimal Calculus
Sir Isaac Newton and Gottfried Wilhelm Leibniz
A mathematical method for calculating motion and change
A seventeenth-century portrait of Isaac Newton. Newton’s Principia (1687) was one of the most important works in the history of modern science.
Infinitesimal calculus is a mathematical method used to find the slope of curves, areas under curves, minima and maxima, and other geometric and analytic values. It consists of differential calculus and integral calculus, which provide the techniques of differentiation and integration respectively. Infinitesimal calculus has numerous applications in astronomy, physics, electricity, acoustics, and even economics.
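A modern illustration of the differential technique (using today's notation rather than Newton's fluxions or Leibniz's differentials) is the derivative of the squaring function, obtained as the limit of a ratio of vanishing increments:

```latex
% Derivative of x^2 as the limit of a ratio of vanishing increments.
\frac{d}{dx}\,x^{2}
  = \lim_{h \to 0} \frac{(x+h)^{2} - x^{2}}{h}
  = \lim_{h \to 0} \,(2x + h)
  = 2x
```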
“And what are these Fluxions? The Velocities of evanescent Increments? And what are these same evanescent Increments? They are neither finite Quantities nor Quantities infinitely small, nor yet nothing.”
George Berkeley, The Analyst (1734)
The invention of the techniques of infinitesimal calculus is attributed to two men who came up with the idea independently: English physicist and mathematician Isaac Newton (1643–1727), and German mathematician and philosopher Gottfried Wilhelm Leibniz (1646–1716). Leibniz published his research in the journal Acta Eruditorum in 1684 and Newton in his treatise Principia in 1687, though Newton had actually begun work on the theory first. Their new mathematical system attempted to calculate motion and change. However, the two variants described change differently: for Newton, change was a variable quantity over time; for Leibniz, it was the difference ranging over a sequence of infinitely close values.
Although the utility of infinitesimal calculus in explaining physical phenomena was recognized, the fact that the method relied on infinitely small quantities in its calculations caused disquiet. In 1734 the Anglo-Irish Anglican bishop, philosopher, and scientist George Berkeley (1685–1753) published a pamphlet, The Analyst; or, A Discourse Addressed to an Infidel Mathematician, which pointed out flaws in its logical foundations. German-Jewish mathematician Abraham Robinson (1918–74) addressed the flaws in 1960 by developing nonstandard analysis, a rigorous mathematical system in which infinitesimal and infinite numbers, known as “hyperreal numbers,” are incorporated into mathematics and used to simplify approximation arguments. CK
1686
Identity of Indiscernibles
Gottfried Wilhelm Leibniz
The concept that separate objects or entities cannot have all their properties in common
An anonymous portrait of Gottfried Wilhelm Leibniz from 1710. Leibniz worked in several fields, including mathematics, philosophy, politics, and logic.
Known as Leibniz’s Law, “the identity of indiscernibles” is a principle of analytic ontology. It states that it is impossible for two numerically distinct objects to have all of the same properties. This is typically understood to mean that no two objects have exactly the same properties: no two distinct things resemble each other exactly. Conversely, the principle of “the indiscernibility of identicals” (which is also known as Leibniz’s Law) asserts that if A is identical to B, then every property that A has is a property of B, and vice versa.
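In the notation of modern second-order logic (a later formalization, not Leibniz's own wording), the two principles are commonly rendered as follows:

```latex
% Identity of indiscernibles: if x and y share every property F, they are one and the same.
\forall F\,\bigl(F(x) \leftrightarrow F(y)\bigr) \rightarrow x = y
% Indiscernibility of identicals: if x and y are identical, they share every property.
x = y \rightarrow \forall F\,\bigl(F(x) \leftrightarrow F(y)\bigr)
```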
“To suppose two things indiscernible, is to suppose the same thing under two Names.”
Gottfried Wilhelm Leibniz, A Collection of Papers, Which passed between the late Learned Mr. Leibnitz, and Dr. Clarke, In the Years 1715 and 1716 (1717)
The first explicit formulation of the principle was outlined in 1686 by German mathematician and philosopher Gottfried Wilhelm Leibniz (1646–1716) in his Discours de métaphysique (Discourse on Metaphysics), published in the nineteenth century. He used the principle in his arguments regarding various metaphysical doctrines, including the impossibility of Newtonian absolute space. In a well-known correspondence with English philosopher and cleric Samuel Clarke (1675–1729), Leibniz advocated his theory of relational space, in which space is composed of relations between objects, so implying that space cannot exist in the absence of matter. Clarke, himself a supporter of Isaac Newton, published the letters in 1717, the year after Leibniz died, under the title A Collection of Papers, Which passed between the late Learned Mr. Leibnitz, and Dr. Clarke, In the Years 1715 and 1716. The publication subsequently became one of the most widely read philosophical books in the eighteenth century.
The identity of indiscernibles is recognized as proposing solutions to philosophical inquiries regarding the nature of space and time. It has become generally accepted as a metaphysical principle, although philosophers have continued to debate its validity. CK
1687
Motion as the Natural State of Things
Sir Isaac Newton
The theory that everything in the universe is always moving
The title page of a first edition of Sir Isaac Newton’s Philosophiae Naturalis Principia Mathematica (Mathematical Principles of Natural Philosophy), published in 1687.
Motion, at its simplest, is defined as a change in the arrangement of a physical system. The claim that everything in the universe is moving seems to contradict empirical evidence, yet it is a claim that has warranted investigation by some of humankind’s greatest minds, from Aristotle to Albert Einstein. Motion, it turns out, despite its ubiquity, is a complex, often inscrutable, phenomenon.
“Time is defined so that motion looks simple.”
Misner, Thorne, and Wheeler, Gravitation (1973)
Aristotle (384–322 BCE) began the systematic exploration of physical motion, but it was not until the publication by Sir Isaac Newton (1643–1727) of Philosophiae Naturalis Principia Mathematica (Mathematical Principles of Natural Philosophy) in 1687 that observations about motion were codified into “laws.” In the preface to this work, Newton connects motion to all natural phenomena: “We offer this work as mathematical principles of philosophy [of rational mechanics]. For all the difficulty of philosophy seems to consist in this—from the phenomena of motions to investigate the forces of Nature, and then from these forces to demonstrate the other phenomena.”
Newton goes on to establish that humans are ill equipped to recognize any object truly at rest: inertia prevents them from feeling the motion of a mass to which they are connected, and they possess no unchanging frame of reference that would allow them to see that they themselves are moving. In making such claims, Newton argues that motion is the “natural state of things.”
From Newton’s foundation of motion as the natural state of things, classical mechanics, describing the motion of large bodies (planets and humans), was developed, as was quantum mechanics, describing the motion of atomic and subatomic bodies (neutrons, protons, electrons, and quarks). Because motion applies to all matter and energy, the implications of its further exploration cannot be overstated. MK
1687
Gravity
Sir Isaac Newton
The force that attracts a body toward any other physical body having mass
An engraving from c. 1880, offering an artist’s impression of Sir Isaac Newton thinking about gravity after seeing an apple fall in the orchard at his home.
According to the Law of Universal Gravity, formulated by English physicist and mathematician Sir Isaac Newton (1643–1727), every two material objects attract each other with a force proportional to the product of their masses, and inversely proportional to the square of the distance between them. Gravity is a natural phenomenon: it keeps the Earth in orbit around the sun, the moon in orbit around the Earth, and accounts for other, equally significant, phenomena, such as the rise and fall of sea levels to create tides.
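In modern notation (which Newton himself did not use; the gravitational constant was measured only much later), the law is commonly written:

```latex
% Newton's law of universal gravitation:
% F is the attractive force, m_1 and m_2 the two masses,
% r the distance between them, and G the gravitational constant.
F = G\,\frac{m_{1}\,m_{2}}{r^{2}}
```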
“Rational Mechanics will be the science of motions resulting from any forces whatsoever, and of the forces required to produce any motions, accurately proposed and demonstrated.”
Sir Isaac Newton, Principia Mathematica (1687)
There have been theories about gravitation since ancient times. Italian physicist, mathematician, and astronomer Galileo Galilei (1564–1642) was among the first to examine the process of free and restricted fall. However, it was Newton who showed how a universal force, gravity, applied to all objects in all parts of the universe. Newton’s Law of Universal Gravity, expressed in his Philosophiae Naturalis Principia Mathematica (Mathematical Principles of Natural Philosophy, 1687), overturned previous classical theories and heralded the modern quantitative science of gravitation.
The applications of Newton’s Law have been many and diverse. Newton himself used it in Principia Mathematica to demonstrate that comets move in parabolic orbits under the gravitational attraction of the sun. In the nineteenth century, astronomers used the Law to predict the existence of the planet Neptune. Years later, however, Newton’s theory was shown to have discrepancies when it failed to account for the orbit of Mercury.
In 1915, German theoretical physicist Albert Einstein (1879–1955) resolved the discrepancies with his General Theory of Relativity, which provides a unified description of gravity as a geometric property of space and time, or spacetime. Nevertheless, Newton’s theory unaltered remains perfectly adequate as a means of describing gravity for most applications. CK
1689
Tabula Rasa
John Locke
People are born with minds that hold no preconceived notions
The Latin term tabula rasa refers to a blank slate, or literally a tablet from which all markings have been erased. In philosophy, “tabula rasa” expresses the idea that humans have an uninformed mind at birth and acquire beliefs only through experience, sensation, and reflection. While the mind is capable of learning and developing its own thoughts, or creating more complex thoughts from simpler ones, no thoughts are inherent. All people, in this respect, are born equal.
In 1689, John Locke (1632–1704) published An Essay Concerning Human Understanding, in which he stated that the mind is like slate scraped free of all markings, a tabula rasa. Prior to Locke, thinkers such as Aristotle wrote about similar concepts as early as 350 BCE. Locke himself was influenced by the work of Arabic writer and philosopher Ibn Tufail (c. 1105–85), who wrote a story of a child isolated on an island who developed from the state of a blank slate into an adult. Tabula rasa informed the empiricist school of philosophy, one that posits all knowledge derives from experience.
“Let us suppose the mind to be, as we say, white paper, void of all characters.”
John Locke, An Essay … (1689)
The idea of a person existing as a blank sheet of paper waiting to be filled with ideas is a powerful one. The assumption that knowledge is acquired, and that people are free to create their own beliefs and make their own choices, allows for the justification of social phenomena, such as legal systems that hold people accountable for their actions, and political structures that recognize legal equality regardless of status. Yet we also know that genetic and environmental factors influence our ideas, and what we acquire versus what we are born with is still debated today. MT
1689
Empiricism
John Locke
The notion that what is perceived by the senses forms the foundation of knowledge
Allegories of Taste, Hearing, and Touch (1404) by Jan Brueghel the Elder.
In contrast with the notion that knowledge begins as tradition and innate ideas, empiricism stresses the role of experience and evidence, especially sensory experience, in forming ideas. Empiricists argue that tradition and innate ideas arise from previous sensory experience, and that perceptions of the physical world are supported by experience. For example, if the sky is presented visually to an individual as blue, that individual registers the experience and appearance, while forming the belief that the sky is blue. However, this view of the world encounters problems with knowledge that cannot be attained in such a way, for example mathematical knowledge.
English philosopher John Locke (1632–1704) believed that the mind begins as a tabula rasa (blank slate) and that humans do not have innate ideas. This idea was influenced by the Greek philosopher Aristotle (384–322 BCE), and expounded in Locke’s An Essay Concerning Human Understanding (1689). Locke’s empiricism was opposed to the rationalist (Cartesian) philosophy of French philosopher and mathematician René Descartes (1596–1650).
“It matters not what men’s fancies are, it is the knowledge of things that is … prized.”
John Locke, An Essay … (1689)
Locke’s essay was criticized by contemporary philosophers. However, the work is regarded as one of the foundations of empiricism in modern philosophy and laid the groundwork for subsequent British empiricists, notably David Hume (1711–76). Eventually, empiricism eclipsed Cartesianism, although in the 1920s logical empiricists attempted to synthesize the theories of British empiricists with the scientific methodology of logic and mathematics. CK
1689
Religious Tolerance
John Locke
No one should be denied equal rights on account of their religion
Sir Godfrey Kneller’s portrait of John Locke, 1697. Kneller was the leading portraitist in England during the late seventeenth and early eighteenth centuries.
The idea of religious tolerance is to allow religious freedom—in civil terms, to leave the adherents of a particular religion unmolested in private and in public. In a political sense, it means granting equal rights to individuals regardless of their religious beliefs.
“No man can be a Christian … without that faith which works … by love.”
John Locke, A Letter Concerning Toleration (1689)
In 1689, English philosopher and physician John Locke (1632–1704) advocated religious tolerance in his Epistola de Tolerantia (A Letter Concerning Toleration). He wrote the letter, addressed to an anonymous “Honoured Sir,” while in exile in Holland, which was a secular state that permitted religious differences. The recipient of Locke’s letter was his friend, the Dutch theologian Philipp van Limborch (1633–1712), who published it. At that time, there were fears that Roman Catholicism might take over England. Locke was involved in helping draft the English Bill of Rights of 1689, but it did not go as far as he wanted regarding religious tolerance. The same year, Parliament passed the Toleration Act, which granted freedom of worship to Nonconformists, such as Baptists and Congregationalists, but not to Catholics and Unitarians. Locke suggested that religious tolerance might resolve the problems experienced by both government and religious leaders, and that there should be a separation between church and state.
Locke’s letter caused a controversy among members of the Anglican High Church. Clergyman and writer Thomas Long thought that the letter was part of a Jesuit plot aimed at enabling the Catholic Church to achieve dominance by causing chaos. There followed a protracted published correspondence between Locke and clergyman and academic Jonas Proast (c. 1640–1710), who asserted that the state had the right to use force to make dissenters reflect on the merits of Anglicanism. Locke’s ideas came to form the basis of modern views on the toleration of religious differences. CK
1689
Primary and Secondary Qualities
John Locke
Our perception of objects depends partly on them and partly on our own senses
The title page to John Locke’s An Essay Concerning Human Understanding (1689). The first explicit formulation of an empiricist philosophy, the Essay has been hugely influential.
By making changes to our senses (eyesight, hearing, smell, taste, and touch), different ideas are produced in our minds. For example, a mug previously seen as white will look yellowish to a person who has developed jaundice; and if a person were to exchange eardrums for a bat’s echo-location system, the mental images of objects that it would produce would be quite unfamiliar. And yet, we do not typically believe that objects change just because the instruments used to perceive them change. These examples suggest that, contrary to what we naturally think, much of what we perceive depends on our sensory organs and not on the way the world actually is.
“ …[T]he Ideas of primary Qualities of Bodies, are Resemblances of them, and their Patterns do really exist in the Bodies themselves; but the Ideas, produced in us by these Secondary Qualities, have no resemblance to them at all.”
John Locke, An Essay Concerning Human Understanding (1689)
In An Essay Concerning Human Understanding (1689), the English philosopher and physician John Locke (1632–1704) names the properties of objects (such as shape, sound, volume, and scent) as “qualities,” with each having the “power to produce ideas in our minds.” Locke distinguishes two types of qualities: those that do not depend on our senses (number, shape, volume, velocity) and those that do (color, sound, scent, taste, texture). The former he calls “primary qualities,” by which he means powers in objects to produce ideas in our minds, and the latter he calls “secondary qualities,” by which he means powers in ourselves to produce ideas in our minds. This distinction allows Locke to explain variations in sense perceptions among perceivers, and in the same perceiver under different circumstances. It also enables him to explain discrepancies in memories.
The distinction between primary and secondary qualities is part of Locke’s empiricist epistemology (study of knowledge) and is central to what became known as the Representation Theory of Perception. Vestiges of Locke’s epistemology, now thought to be discredited in parts, still play a role in contemporary psychological explanations of perception. JW
1689
Nature of Personal Identity
John Locke
The only constant in personal identity is continually renewed consciousness
The English philosopher and physician John Locke (1632–1704) was the first thinker to theorize that personal identity, or the self, depends on consciousness rather than material substance or the soul. Locke argued that a person’s consciousness of their present thoughts and actions is what conceives the self, which extends to past consciousness using memory. Although consciousness can be lost by forgetfulness, the soul or substance remains the same.
Locke explored the theory in An Essay Concerning Human Understanding (1689). He asserted that personal identity cannot be founded in substance because a person’s body changes throughout life while the person remains the same. To illustrate this, he used the example of a man who has had his finger cut off. The man is not conscious of anything that occurs to the cut-off finger, so his consciousness does not lie in substance. Locke argued that no two things of the same kind can exist at the same time, so personal identity is founded on the repeated act of consciousness. He also argued that only an individual can know their own consciousness; others cannot judge that person because they can never know the person’s consciousness.
“[Identity lies] not in the identity of substance, but … in the identity of consciousness …”
John Locke, An Essay … (1689)
Locke’s theory was revolutionary at the time. Ever since, philosophers have attempted to theorize what constitutes the self. Scottish philosopher David Hume (1711–76) went on to examine the issue in A Treatise of Human Nature (1739), arguing that the notion of self is a fiction. In modern times, Canadian academic James Giles has proposed a no-self theory that eliminates the concept of personal identity altogether. CK
c. 1690
Liberalism
Various
The notion that society is best left to look after itself, with minimal state intervention
A View of the House of Commons, engraved by B. Cole in the eighteenth century.
Humans have long believed that a single person, or small group of people, has the insight necessary to make the lives of people in a society better than they can accomplish for themselves. In many instances this belief led to large numbers living in poverty, or even being exterminated for the sake of the “general welfare.” After the Enlightenment, however, a liberal tradition (from the Latin liber, or “free”) took shape from the philosophical views of John Locke (1632–1704), Jean-Jacques Rousseau (1712–78), and Adam Smith (1723–90).
Liberalism is a description of the role of government in a society, and is contrasted with socialism and monarchy. The idea is that if individuals are free to pursue their interests—with government supplying only protection from fraud and coercion—they will efficiently provide many of the things everyone wants. This “invisible hand” theory of production and its implied freedom of expression are credited with the sharp increase in quality of life in the West since the Industrial Revolution (1760–1840).
“No one ought to harm another in his life, health, liberty, or possessions.”
John Locke, Second Treatise of Civil Government (1690)
Anti-liberalists argue that people will pursue things that are not objectively valuable, such as fast food, pornography, drugs, and money, and some conclude that significant government intervention is necessary. Liberals such as Ludwig von Mises (1881–1973) disagree, saying that there are no objective “judgments of value, but [only] the valuations actually manifested by people in buying or not buying.” Liberalism runs against anyone who attempts to enlist the power of government to promote values, to raise taxes, or to prohibit “harmful” products, such as cigarettes and alcohol. JW
1695
Parallelism
Gottfried Wilhelm Leibniz
The notion that mental events and physical events are perfectly coordinated by God
A detail from the Triptych of St. Aignan in the Treasury at Chartres Cathedral, showing the Hands of God.
In the philosophy of mind, parallelism is the theory that events in the world occur because of the action of God. The mind and the body are understood as two systems running in parallel, and not affecting each other causally. If a person has an intention (in the mind) that he or she acts upon (with the body), according to the theory of parallelism it is God who causes that event, an action that is appropriate to the intention, to occur.
Parallelism is most associated with the German mathematician and philosopher Gottfried Wilhelm Leibniz (1646–1716), who outlined the theory in a series of works: Specimen dynamicum (A specimen of dynamics, 1695), Essais de Théodicée sur la bonté de Dieu, la liberté de l’homme et l’origine du mal (Essays of Theodicy on the Goodness of God, the Freedom of Man and the Origin of Evil, 1710), and Monadologia (Monadology, 1714). Leibniz asserted that there exists a perfect correlation between mind and body that was ensured by God at the beginning of time in a “pre-established harmony.” Consequently, nothing in the universe influences anything else. The theory is in keeping with his doctrine of monadology, in which the world is made up of incorporeal mindlike entities that he called “monads.” Leibniz believed that God created the mind (a single monad) and the body (a collection of monads) in perfect harmony. Thus, their mental and physical states would always correspond in an appropriate fashion.
The theory of parallelism never became popular among philosophers. It has encountered criticism because it conflicts with the empirical procedures of modern science (experience seems to provide ample evidence that the mind and body affect each other). However, some religious believers uphold the theory because it affords their God—who is concerned with the happiness of his subjects—the honor they believe is due to Him as the ruler of the universe. CK
c. 1700
Voodoo
African slaves in Saint-Domingue
An occult combination of the ideas of shamanistic religion and Catholicism
A twentieth-century painting of a Voodoo ritual.
Voodoo is a Caribbean religion mixing the shamanistic practices of West African Vodun with some elements of Roman Catholicism. Vodun, as practiced in West Africa, has a large pantheon of gods or spirits, including the remote Creator-God (Bondye), the actor-gods (loa), and the ancestral spirits (vudu). As with all shamanistic traditions, Vodun is concerned with both honoring beneficial spirits (who can be communicated with via the shaman, or bokor) and suppressing or appeasing the harmful spirits (who are conjured up by the evil shaman, or witch doctor).
When European slavers, in this case the French, came to West Africa and hauled away large groups of Vodun practitioners at the end of the seventeenth century, these captives were obliged to practice Catholicism, as stipulated in King Louis XIV’s Code Noir (1685), which defined the conditions of French slaves. As a result, Vodun practitioners in Saint-Domingue began to call themselves Catholics and adopted many Catholic rituals while privately continuing to practice a modified form of Vodun, called Voodoo.
Voodoo identifies Bondye with the biblical Yahweh, but devotes most of its worship to the loa actor-gods who are a combination of lesser African deities and Catholic saints. In keeping with its shamanistic roots, Voodoo emphasizes communication with beneficial spirits. This, among other ways, can be achieved when the shaman, through music, dance, and drugs, becomes possessed by the desired spirit.
Voodoo has been influential for two reasons. First, although not a pure African religion, it is probably the most widely known African religion. And second, it introduced to the world of pop culture things such as Voodoo dolls (dolls in the likeness of a human which, when activated, are thought to affect the actual human) and zombies (“the living corpses” that are raised up by the spirit-possessed shaman). AB
c. 1700
Biblical Textual Criticism
Various theologians
The study of Judeo-Christian manuscripts in order to determine the authentic text of the Bible
Cover of the oldest complete codex of the Hebrew Bible, dating from c. 1010. The Hebrew Bible is organized into three main sections: the Torah, the Nevi’im, and the Ketuvim.
Early Jews often destroyed or buried their religious texts, and early Christians usually wrote theirs on papyrus, a material that is easily eroded, meaning that little of the original material for the Bible remains. The earliest copies of the Old and New Testaments date to the fourth century, when parchment was introduced. As copies were made by generations of scribes, errors were introduced. Some were involuntary mistakes, and some were perhaps the result of attempts to elucidate a passage that the person transcribing did not understand; others were intentional, maybe to favor an opinion or doctrine.
“[Textual criticism] is also quite rightly employed in the case of the Sacred Books.”
Pope Pius XII, Inspired by the Holy Spirit (1943)
Biblical textual criticism attempts to restore the Bible as near as possible to the original text by studying ancient Judeo-Christian manuscripts, such as the Dead Sea Scrolls (c. 150 BCE–70 CE) and the Codex Vaticanus (c. 325–350), and reconstructing the text in the form in which it originally left the hands of its author. Where there are several variants, textual critics opt for the one that best agrees with the context and most closely conforms to the style of the author. Biblical textual criticism was developed around the start of the eighteenth century with the work of theologians John Mill (c. 1645–1707), Johann Albrecht Bengel (1687–1752), Johann Jakob Wettstein (1693–1754), Johann Salomo Semler (1725–91), and Johann Jakob Griesbach (1745–1812).
The Catholic Church endorsed biblical textual criticism in 1943 when Pope Pius XII called for new translations of the Bible from the original languages as the basis for all Catholic vernacular translations, rather than merely the Latin Vulgate of St. Jerome. This initiated a period of Catholic study regarding the authorship, dating, and genealogy of Judeo-Christian manuscripts. As the number of texts to be examined increases with new discoveries of manuscripts, so the debate on what constitutes a definitive text continues. CK
1704
Newton’s Color Wheel
Sir Isaac Newton
A device developed by Newton to demonstrate that light is composed of seven colors
In these diagrams from Opticks (1704), Isaac Newton illustrated the color wheel (center right) and showed how a prism refracts white light into seven spectral hues.
The Newton color wheel is a disk of seven colored segments: red, orange, yellow, green, blue, indigo, and violet. When the disk is spun quickly, the colors blur together; the eye cannot distinguish the individual bands of color and perceives white. The wheel is named after its inventor, English physicist Isaac Newton (1642–1727), and first appeared in his well-known work Opticks (1704). He discovered that when a ray of white light is bent, or refracted, through a prism, it separates into a continuous gradation of seven colors, or spectral hues.
Newton created a two-dimensional circular model merging the red and violet colors at each end of the band of colors to form a hue circle, or color wheel. The size of each segment differed according to his calculations of its wavelength and of its corresponding width in the spectrum. When mixing pigments of opposite colors on the wheel—the complementary colors—Newton found that “some faint anonymous color” resulted. He was unable to mix pigments of opposite hues on the wheel to make white because pigments are based on subtractive color, unlike light, which is additive color. He surmised that complementary colors cancel out each other’s hue.
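A rough modern way to see the additive/subtractive distinction (a simplified sketch using RGB color values, not anything Newton wrote) is that complementary colors of light add to white, whereas mixing complementary pigments removes light and tends toward black or gray:

```python
# Additive mixing of light (a crude RGB model): complementary colors sum to white.
def add_light(c1, c2):
    return tuple(min(a + b, 255) for a, b in zip(c1, c2))

# Subtractive mixing of pigments (an idealized multiply model): each pigment
# absorbs light, so mixing complements leaves little or nothing reflected.
def mix_pigments(c1, c2):
    return tuple((a * b) // 255 for a, b in zip(c1, c2))

red, cyan = (255, 0, 0), (0, 255, 255)
print(add_light(red, cyan))     # (255, 255, 255): white light
print(mix_pigments(red, cyan))  # (0, 0, 0): black in this idealized model;
                                # real pigments give a muddy gray or brown
```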
“All colors are the friends of their neighbors and the lovers of their opposites.”
Marc Chagall, artist
His concept was demonstrated more thoroughly in the nineteenth century by physicists and color theorists, such as Ogden Rood (1831–1902). Rood’s book Modern Chromatics (1879) includes the Rood color wheel. He emphasized that artists needed a knowledge of complementary hues in order to be able to reveal applied colors in their natural brilliance. A version of Newton’s color wheel without the indigo hue was adopted by painters to describe complementary colors that cancel each other’s hue to produce an achromatic mixture of white, gray, or black. Newton’s color wheel was used in the study of mixing color. Knowledge of how complementary colors work together led to art movements such as Pointillism in c. 1886. CK
1706
Pi
William Jones
A mathematical constant that is the ratio of a circle’s circumference to its diameter
The mathematical constant pi has been represented by the Greek letter “π” since the mid-eighteenth century, but it is sometimes still written as “pi.” Pi is irrational, meaning it is not equal to the ratio of any two whole numbers. It is equal to approximately 3.14159, and an approximation such as 22/7 is often used for everyday calculations. Pi is used for mathematical problems involving the lengths of arcs, curves, and solid volumes. It is also employed in physics and engineering.
Pi has been used by mathematicians since ancient times: the Babylonians used 3.125 to approximate pi in c. 2000 BCE. By the seventeenth century, new methods of mathematical analysis provided better ways of calculating pi, and British mathematicians William Oughtred (1575–1660), Isaac Barrow (1630–77), and David Gregory (1659–1708) all used π as a symbol to describe the circumference of a circle. In 1706, Welsh mathematician William Jones (1675–1749)—in his Synopsis Palmariorum Matheseos (A New Introduction to the Mathematics)—defined π as the ratio of a circle’s circumference to its diameter. Swiss mathematician and physicist Leonhard Euler (1707–83) adopted the symbol in around 1736, and π finally came into common usage.
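One classical product of such methods of analysis is the Gregory-Leibniz series, pi/4 = 1 - 1/3 + 1/5 - 1/7 + ...; the short sketch below approximates pi with it (slowly, but simply):

```python
# Approximating pi with the Gregory-Leibniz series: pi/4 = 1 - 1/3 + 1/5 - 1/7 + ...
def approximate_pi(terms):
    total = 0.0
    for k in range(terms):
        total += (-1) ** k / (2 * k + 1)
    return 4 * total

print(approximate_pi(1_000_000))  # approx. 3.14159..., slowly approaching pi
print(22 / 7)                     # 3.142857..., the common everyday approximation
```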
“The digits of pi march to infinity in a predestined yet unfathomable code.”
Richard Preston, The New Yorker (1992)
In 1748, Euler’s book Introductio in analysin infinitorum (Introduction to the Analysis of Infinities) included an important equation known as “Euler’s Identity,” which incorporated π and related it to the chief symbols in mathematics at that time. In 1910, Indian mathematician Srinivasa Ramanujan (1887–1920) developed ways of calculating pi, and, in 1934, the pi symbol π was adopted internationally. CK
1710
To Be Is To Be Perceived
George Berkeley
The concept that something exists only if it can be perceived by a perceiver
In philosophy, “to be is to be perceived” is the idea that there are only two elements involved in perception: the perceiver and what is perceived. Such a view discounts material objects, advocating that only the ideas that people perceive directly are real.
The phrase esse est percipi (aut percipere)—to be is to be perceived (or to perceive)—was coined by Anglo-Irish philosopher Bishop George Berkeley (1685–1753). Known for his immaterialism—the apparent denial of the reality of any external world—he was unconvinced by the philosophical ideas of René Descartes (1596–1650) and John Locke (1632–1704), regarding what he saw as their representationalist theories on perception. The two men distinguished between the material and the ideas by which people perceive it, which Berkeley thought led to skepticism and atheism.
Berkeley wanted to show that the world exists even if no one is looking at it, because for him the world was a collection of ideas perceived by God’s mind. He attacked representationalism in his Treatise Concerning the Principles of Human Knowledge (1710), attempting to refute Locke’s belief that general terms signify abstract ideas. For Berkeley, the mind knows ideas, not objects. There are three types of ideas: sensation, thought, and imagination. When various ideas are associated, they are considered to be one thing, which is given a name to signify it. Berkeley did not deny the existence of an ordinary object, such as a chair, which he asserted is perceived by visual ideas regarding form and shape, tangible ideas regarding texture, and so on. He advocated that there is a physical world containing such ordinary objects, rather than a material world. The physical world is dependent on the mind because it consists of ideas that exist because they are perceived. Today, Berkeley is regarded as the father of idealism because he saw reality as a mental construction. CK
1710
Optimism
Gottfried Wilhelm Leibniz
The theory that we live in the best of all possible worlds, which includes a belief that the universe is improving and that good will ultimately triumph over evil
The title page of Gottfried Wilhelm Leibniz’s Essays of Theodicy on the Goodness of God, the Freedom of Man and the Origin of Evil (1710).
In philosophy, the concept of optimism arose in the eighteenth century. Philosophical optimists assert that we live in the best of all possible worlds created by God, who consistently chooses the best for a good reason, and does so freely. The theory is that there must be sufficient reason for everything in the world. In this worldview, Adam sinned of his own free will in the Garden of Eden by eating the apple, but God knew that Adam would sin, and so the rest of the world was built around the consequences of that sinful action. Although humans are imperfect and predisposed to evil, they are still able to identify true good and so correct their errors. Evil is thus necessary to bring out the goodness in humanity.
“If there were no best among all possible worlds, God would not have created one.”
Gottfried Wilhelm Leibniz, Essays of Theodicy on the Goodness of God, the Freedom of Man and the Origin of Evil (1710)
Optimism is most associated with the German mathematician and philosopher Gottfried Wilhelm Leibniz (1646–1716). He coined the phrase “the best of all possible worlds” in his work Essais de Théodicée sur la bonté de Dieu, la liberté de l’homme et l’origine du mal (Essays of Theodicy on the Goodness of God, the Freedom of Man and the Origin of Evil, 1710), in which he attempted to explain the problem of evil and how it does not conflict with God’s goodness. Some atheists had argued that, because evil is incompatible with God, the existence of evil is proof that God does not exist. Leibniz was trying to demonstrate that the presence of evil in the world was still compatible with an omniscient, omnipotent, and good God.
Leibniz’s idea of optimism inspired French writer Voltaire (1694–1778) to write Candide, ou l’Optimisme (Candide: or, Optimism, 1759). The novel satirizes Leibniz and tells the story of a young man, Candide, who is indoctrinated with Leibnizian optimism by his mentor, Doctor Pangloss. However, Candide experiences great hardship. The novel gave rise to the term “Panglossian” to describe someone who is overly optimistic. CK
1712
Steam Power
Thomas Newcomen
Harnessing the properties of steam to power mechanical devices
The Newcomen engine at Farme Colliery, Rutherglen, Scotland, built in the 1730s and photographed in c. 1908.
The notion that steam, produced by heating water, might be channeled to provide driving force was first demonstrated by Hero of Alexandria in the first century CE. His device, called the aeolipile, was a hollow globe that spun by force of escaping steam jets. The use of steam, fed under pressure into the confined space of a cylinder to expand the air within it and force a piston to move, was no more than a sophisticated development of the idea of the aeolipile. The difference was that the aeolipile was little more than an interesting toy, whereas the steam-powered piston would become the mighty driving force of the Industrial Revolution.
In 1698, English engineer Thomas Savery (c. 1650–1715) built a steam-powered pump to extract water from mine shafts. The device, which became known as the “Miner’s Friend,” not only used the expansion of steam to force forward a piston, which in turn pumped water from the mine, but also condensed the steam in the cylinder with internal cold-water sprinklers to create a vacuum. As well as drawing back the piston, the vacuum was harnessed to extract additional water from the mine via a valve in the cylinder. However, the rudimentary pressurized boiler was liable to explode.
“Those who admire modern civilization usually identify it with the steam engine.”
George Bernard Shaw, Man and Superman (1903)
The first truly successful engine for driving a pump and removing water from mines was the atmospheric steam engine, invented by English ironmonger Thomas Newcomen (1664–1729). This created a vacuum inside a cylinder that pulled down a piston; a lever transferred that force to a pump shaft that descended into the mine. Newcomen installed his first atmospheric steam engine at a mine in Staffordshire in 1712. CK
1714
Monadology
Gottfried Wilhelm Leibniz
The view that reality is made up of monads, or hypothetical, intelligent, whole entities
Monadology is the name for any system sharing the concept of a monad. In philosophy, monads are indestructible, soul-like, self-sufficient substances that reflect the world’s order. They make up the universe but lack spatial extension and are immaterial. A monad cannot be split because it is a unified whole without parts; in the material world atoms can be split into smaller particles because they are formed of parts.
Although ancient Greek philosophers used the term “monad,” it was popularized in the eighteenth century by German philosopher and mathematician Gottfried Wilhelm Leibniz (1646–1716). He wanted to reconcile the theories of the philosophers Baruch Spinoza (1632–77) and René Descartes (1596–1650) regarding the nature of matter. Leibniz began using the term “monad” in 1675 and by 1690 conceived monadism. He published his doctrine in his work Monadologia (Monadology, 1714). Leibniz asserted that “all matter is interlinked and each body feels the effects of everything that happens in the universe.” He argued that a living thing is a body that has a monad as its soul, and that objects in the material world are collections of monads. Leibniz believed that the universe works metaphysically, referencing God, rather than materially, referencing nature.
“The monad … is nothing but a simple substance, which enters into compounds.”
Gottfried Wilhelm Leibniz, Monadology (1714)
Various philosophers have used Leibniz’s concept of the monad to describe an unseen force. In modern times, the intellectual structure of monadology has been applied to physics because it provides a model for subjectivity and the role of the perceiver. It has led to the rise of quantum monadology as scientists investigate the nature of consciousness and time. CK
1715
Rococo
France
A playful architectural, literary, and artistic style
The Rococo interior (1743–72) of the Basilica of the Fourteen Holy Helpers in Bavaria, Germany.
Rococo is a lighthearted style of art, architecture, literature, landscape design, and music that emerged in France after the death of King Louis XIV (1638–1715). It achieved its greatest expression in the decorative arts and architecture, reaching its peak in the 1770s. Rococo was a reaction to the pomp and grandeur of Baroque, and the word “Rococo” is a fusion of the French word rocaille (a style of ornamentation with pebbles and shells) and barocco (the Italian for Baroque). Originally, Rococo was a pejorative term, coined when the style was going out of fashion in favor of Neoclassicism.
“The serpentine line … waving and winding … leads the eye in a pleasing manner …”
William Hogarth, The Analysis of Beauty (1753)
As the French court distanced itself from Louis XIV’s excesses, court artists developed the playful style that became known as Rococo. In France, Rococo’s leading exponents were the artists Jean-Antoine Watteau (1684–1721), François Boucher (1703–70), and Jean-Honoré Fragonard (1732–1806). Published engravings of their work caused interest in Rococo to spread through Europe, influencing painters such as the Italian Giambattista Tiepolo (1696–1770) and Englishman Thomas Gainsborough (1727–88). Architects introduced Rococo’s sinuous S-shaped curves into interiors of delicate white stucco adornments. In literature, also, the narrative structure of novels such as The Life and Opinions of Tristram Shandy, Gentleman (1759–67), by Laurence Sterne (1713–68), adopted Rococo’s free, expressive style. In the early 1760s, Rococo was criticized by leading figures in France for its superficial content and romantic subject matter. Rococo became associated with the degeneracy of the ancien régime in France, and it fell out of fashion across Europe. CK
1717
Freemasonry
England
A secret fraternal organization with roots in deism
Freemasonry is a secret fraternal order with a structure based on ancient religious orders and chivalric brotherhoods. The order is organized into Grand Lodges, or Orients, in charge of constituent Lodges. Officers in a Lodge are titled worshipful master, senior and junior warden, and tiler. Ordinary freemasons can be an apprentice, fellow of the craft, or a master mason. Members are adult males, who are required to believe in the existence of a supreme being and the immortality of the soul. Freemasonry bears many of the trappings and teachings of a religion, and masons have “to obey the moral law.” Masons recognize each other using secret signs, words, and handshakes.
The term “freemason” dates to the fourteenth century and means a mason of superior skill. Later it came to describe someone who enjoyed the privileges of belonging to a trade guild. The modern term “freemasonry” came into use in the mid-eighteenth century to refer to the secret society that dates to the constitution of the Grand Lodge of England, which was founded in London on June 24, 1717.
“A Mason is to be a peaceable subject to the civil powers, wherever he resides or works …”
The Constitutions of the Free Masons (1723)
By the 1730s, freemasonry had spread to North America (the earliest known Grand Lodge was in Pennsylvania) and to Europe (the Grand Orient de France was established in 1733). Freemasonry now exists in various forms throughout the world. Where freemasonry is influential, there has been criticism that members systematically prefer other members for job appointments, to the detriment of civic equality and public interest. Freemasonry has also been criticized by religious institutions for its deistic aspects. CK
1725
Stages of History
Giambattista Vico
Human society develops in cycles with gains inevitably giving way to corruption
The theory of the stages of history was formed by Italian political philosopher, rhetorician, and historian Giambattista Vico (1668–1744), and was outlined in his book Scienza Nuova (The New Science, 1725). The idea is that human history passes through connected stages, together identified by a pattern of growth and decay. Vico suggests that society goes through a “bestial” stage in which it is ruled by a belief in the supernatural; then an “age of heroes,” in which society is divided into ruling and underling classes among which defensive alliances are formed; and finally, an “age of men,” in which there is class conflict. The lower class achieves equal rights, but only as the result of corruption, and so society may then fall back into an earlier stage of the cycle. At this point, the process starts again.
“The nature of peoples is first crude, then severe, then benign [and] finally dissolute.”
Giambattista Vico
A Catholic, Vico believed that providence had a guiding hand in events, righting matters when necessary to help humanity to survive the successive cycles. Vico’s stance was against the Cartesian school of thought that prevailed at the time, which he believed failed to take into account the importance of language, myth, and history in interpreting the past.
Vico’s book had little impact at the time of publication but was later read by German Romantic theorists, including Johann Wolfgang von Goethe (1749–1832), who were drawn to his vision of human history. The New Science went on to inspire various thinkers and artists, notably German economist and historian Karl Marx (1818–83), whose economic interpretation of history was influenced by it, although Marx differed in his view of the benefits of religion. CK
1725
Mechanical Programmability
Basile Bouchon
The automation of complicated instructions for machines
A model of the mechanically programmed loom designed by Jean-Baptiste Falcon in 1728.
In many complicated industrial operations, setting up the machines involved can take as much time as the work the machines then perform. With mechanical programmability, the means of controlling the actions of numerous interacting components, and quickly changing their operations, may be set up in advance. For the worker, activating mechanical programs is far easier than setting up machine operations from scratch.
Mechanical programmability is said to have begun in the ninth century, with the invention of a musical organ, powered by water, that automatically played interchangeable cylinders with pins (similar to those of musical boxes). However, the automation of mechanical processes arose in the eighteenth century. In 1725, French textile worker Basile Bouchon invented what is considered to be the first industrial application of a semiautomated machine. He worked at a silk center in Lyon and devised a way of controlling a loom with a perforated paper loop that established the pattern to be reproduced in the cloth. His invention automated the setting-up process of a drawloom, whereby operators lifted the warp threads using cords.
“Punched card accounting was the first serious attempt to convert data …”
George A. Fierheller, industry leader
Unfortunately, the number of needles used was insufficient to allow larger designs to be woven. Also, one tear in the perforated paper loop made the loop unusable. However, in 1728, Bouchon’s assistant, Jean-Baptiste Falcon, improved the design; he expanded the number of cords that could be handled by arranging the holes in rows, and replaced the paper roll with rectangular perforated cardboard cards joined in a loop, which made it easier to change the program. CK
1731
Euler’s Number (e)
Leonhard Euler
A mathematical constant that lies at the foundation of the natural logarithm
Euler’s Number, also known as the number e, is a mathematical constant, approximately equal to 2.71828, which is the base of the natural logarithm. A logarithm turns complicated sums involving multiplication into simpler ones involving addition, and complicated sums involving division into simpler ones involving subtraction. The number e is important in mathematics and has numerous applications in probability theory, derangements, and asymptotics. The exponential function is used, for example, to calculate compound interest, and the number e is used to perform differential and integral calculus.
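Both properties can be checked numerically. The short Python sketch below is illustrative only, with the amounts and variable names invented for the example: it shows e emerging from ever more frequent compounding, and a natural logarithm turning a product into a sum.

import math

# Compounding 1 unit at 100 percent interest, n times a year, gives (1 + 1/n)**n;
# as n grows, the result approaches Euler's number e, roughly 2.71828.
for n in (1, 12, 365, 1_000_000):
    print(n, (1 + 1 / n) ** n)
print("math.e =", math.e)

# A natural logarithm turns multiplication into addition:
# log(a * b) equals log(a) + log(b), so a product can be found by adding logs.
a, b = 37.0, 59.0
print(math.log(a * b), math.log(a) + math.log(b))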
Swiss mathematician and physicist Leonhard Euler (1707–83) introduced the number e as the basis for natural logarithms in a letter he wrote in 1731. The first time the number e was used as a constant in a published work was in 1736 in Euler’s Mechanica. He later linked the number’s logarithmic function with its exponential function in his book Introductio in analysin infinitorum (Introduction to the Analysis of Infinities, 1748), in which he advanced the use of infinitesimals and infinite quantities in mathematical analysis. Euler’s appreciation of the importance of introducing uniform analytic methods into mechanics helped mathematicians to solve problems in a clear way.
“Both mechanics and analysis are … augmented more than just a little.”
Leonhard Euler, Mechanica (1736)
In 2010, Japanese systems engineer and academic Shigeru Kondo (b. 1955) computed the number e to 1,000,000,000,000 decimal digits. The computation took Kondo 224 hours—more than nine days—and he then took another 219 hours to verify it. CK
1733
Bell Curve
Abraham de Moivre
A form of statistical representation that identifies the norm, or golden mean
The “bell curve” is a means of portraying probability distribution; its main purpose is to show how most features follow natural patterns, hovering around an average occurrence and declining exponentially as they move away from the norm. This, when plotted on a graph, results in a bell shape, referred to as the “de Moivre distribution,” named after Abraham de Moivre (1667–1754), who proposed its formula in 1733.
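How quickly the curve falls away from its central value can be seen from the density function itself. The brief Python sketch below is a minimal illustration, not tied to any particular data set; it evaluates a standard bell-shaped (normal) density at a few distances from the mean.

import math

def bell_curve(x, mean=0.0, sd=1.0):
    # Normal ("bell curve") density: highest at the mean, falling off
    # symmetrically and exponentially on either side of it.
    return math.exp(-((x - mean) ** 2) / (2 * sd ** 2)) / (sd * math.sqrt(2 * math.pi))

# Values one, two, three, and four standard deviations from the mean show
# how rapidly outliers become improbable under this model.
for x in (0, 1, 2, 3, 4):
    print(x, round(bell_curve(x), 6))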
A century later, in 1833, German mathematician Carl Friedrich Gauss (1777–1855) theorized that, statistically, probabilities would congregate about a norm. His theory was seized upon by ideologists, who wanted to show that harmonies gathered in neat and proper order around a measured, golden mean; they longed for a verifiable ideal to which social, political, and economic aims could aspire. The post-Enlightenment influence of Gauss’s idea was huge: large, unpredictable deviations (outliers) are rare and came to be more or less dismissed as enthusiasm for the smooth curve of the bell grew. In a world that humanity had begun to manage by means of description, it offered a significant improvement in the ordering of knowledge.
“The bell curve … is more than a technical description. It shapes our thought.”
Ulrich Beck, World at Risk (2009)
Sadly, outliers do, sometimes, throw predictability askew, as seen in the economic crash of 2008. Divergence from the mean does not necessarily imply error. What ought to occur is no predictor of what will occur, and mediocrity is not the goal of natural or social evolution: divergence is normal. Unpredictable change is what we have to live with, and, in this respect, outside mathematical theory, the bell curve has been more of a liability than a blessing to understanding. LW
1736
Graph Theory
Leonhard Euler
An abstract representation of mathematical objects with linked related coordinates
Graph theory is a branch of mathematics concerned with networks of points connected by lines—it does not relate to graphs used to represent statistical data, such as bar graphs or pie charts. The field of graph theory originated in 1736, when the Swiss mathematician Leonhard Euler (1707–83) published his solution to the Königsberg bridge problem in the Russian journal Commentarii Academiae Scientiarum Imperialis Petropolitanae. The problem, an old puzzle, asked whether it was possible to find a route through the city of Königsberg that entailed crossing each of its seven bridges only once. By considering the problem abstractly and using points and connecting lines to represent the land and bridges, Euler proved by means of a graph that it was impossible.
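Euler’s reasoning reduces to counting how many bridges meet each land mass: a walk that crosses every bridge exactly once is possible only if zero or two land masses are touched by an odd number of bridges. The Python sketch below is a simplified illustration using the historical seven bridges; the single-letter labels for the four land masses are arbitrary.

from collections import Counter

# The four land masses of Königsberg (labeled A to D here) and the seven
# bridges joining them, written as simple pairs of endpoints.
bridges = [("A", "B"), ("A", "B"), ("A", "C"), ("A", "C"),
           ("A", "D"), ("B", "D"), ("C", "D")]

# Count how many bridges touch each land mass (its "degree" in the graph).
degree = Counter()
for u, v in bridges:
    degree[u] += 1
    degree[v] += 1

# Euler's criterion: a route using every bridge exactly once exists only
# if zero or two land masses have odd degree. In Königsberg all four do.
odd = [land for land, d in degree.items() if d % 2 == 1]
print("Land masses with an odd number of bridges:", odd)
print("Route possible:", len(odd) in (0, 2))  # False, as Euler proved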
“The origins of graph theory are humble, even frivolous …”
Graph Theory 1736–1936 (1976)
Graphs provided a simple means of representing the relationships between objects, and often the act of presenting a problem pictorially in this way could be all that was needed to find a solution. The term “graph” itself was coined in the nineteenth century by British mathematician James Joseph Sylvester (1814–97), who was interested in diagrams representing molecules.
Graph theory has since evolved into an extensive and popular branch of mathematics, with applications in computer science and the physical, biological, and social sciences. In 1936, exactly 200 years after Euler solved the problem of the Königsberg bridges, Hungarian mathematician Dénes König published the first ever book on graph theory, Theorie der endlichen und unendlichen Graphen (Theories of Finite and Infinite Graphs). JF
1739
Is / Ought Fallacy
David Hume
A moral imperative (“ought”) cannot derive from a statement of fact (“is”)
The is/ought fallacy occurs when a conclusion reached expresses what ought to be, but is inferred from premises that state only what is so. Thus, can “is statements,” the kind typically expressing facts, imply “ought statements,” the type associated with morality? Most argue that the two are not interchangeable: in the statement A ought to be B, therefore C, C cannot be logically verified from the “ought” that connects A and B. Alice asks the Cheshire Cat, “Which way ought I to go from here?”, but that is a bit odd since directions are factual things—they are “is” statements (the correct way), not moral prescriptions (the right way or good way). Things we ought to do are often culturally prescribed morals or duties, and therefore have no business in the realm of logic or science.
“If it was so, it might be; and if it were so, it would be; but as it isn’t, it ain’t. That’s logic.”
Lewis Carroll, Alice’s Adventures in Wonderland (1865)
The fallacy was articulated by David Hume (1711–76) and is often also called Hume’s Law, or Hume’s Guillotine. In A Treatise of Human Nature (1739), Hume noted the significant difference between factual or descriptive statements (the rose is red) and prescriptive or normative statements (the rose ought to be pretty), and warned that we must exercise caution with such inferences when not told how the “ought” is derived from the “is” of the statement.
Hume’s point was that no ethical conclusion could be validly inferred from any set of factual premises. He was alerting people to the slippery tongues of religious fanatics and con artists. So, if anything, Hume taught us to be on guard with language—to listen carefully to what others say, beware of shenanigans, and be ourselves careful to mean what we say. KBJ
1744
Spiritualism
Emanuel Swedenborg
The belief that spirits of the dead can communicate with the living
A musical instrument is seen supposedly rising into the air at a seance in c. 1920. Many images showing “proof” of ghostly happenings were created using clever tricks of photography.
Belief in the possibility of communicating with the departed has been observed in various cultures for centuries. The belief system or religion known as spiritualism postulates that the spirits of the dead reside in a spirit world and can communicate with the living. Spiritualists seeking to make contact with the dead usually enlist the help of a medium, a person believed to possess a special ability to contact spirits directly. Spiritualists and mediums hold formal communication sessions, called seances, to speak with spirits.
“By spirits man has communication with the world of spirits, and by angels, with heaven. Without communication … man can by no means live.”
Emanuel Swedenborg, Arcana Coelestia (1749–56)
Modern spiritualism stems from the work of Swedish scientist, Christian mystic, philosopher, and theologian Emanuel Swedenborg (1688–1772). In 1744, he had his first vision of Christ and so began his spiritual awakening. Swedenborg went on to have dreams and visions, during which he spoke to spirits and visited the spirit world. Swedenborg believed he was chosen to reveal the spiritual meaning of the Bible, which he attempted to do in a series of theological works, the first of which is Arcana Coelestia (Heavenly Arcana, written between 1749 and 1756).
After Swedenborg died, the first Swedenborgian societies appeared, founded by followers dedicated to the study of his teachings. Although Swedenborg did not advise people to seek contact with spirits, many people were inspired to do so by his writings. The practice of seeking contact with the dead through the aid of a medium spread. By the mid-nineteenth century, spiritualism was popular in the United States, France, and Britain, and the National Spiritualist Association was founded in 1893. Such was spiritualism’s popularity that the Vatican condemned spiritualistic practices in 1898, claiming that attempts to contact the dead were blasphemous and related to the occult.
Spiritualist churches now exist worldwide. Services are generally conducted by a medium, and spiritualists meet to seek their spirit guide. CK
1747
The Man-Machine
Julien Offray de La Mettrie
An extreme materialist view of human beings as having only a body and no soul
A portrait of French philosopher Julien Offray de La Mettrie (1757), engraved in copper by Georg Friedrich Schmidt after Maurice Quentin de Latour.
The idea of humans as machines arose in the eighteenth century. French philosopher and mathematician René Descartes (1596–1650) had suggested that the human body has material properties and works like a machine but that the mind, or soul, was not material and does not follow the laws of nature. The idea of the man-machine rejected this Cartesian view, regarding the dualism of body and mind, and denied the existence of the soul as a substance separate from matter.
“We think we are, and in fact we are, good men, only as we are gay or brave; everything depends on the way our machine is running.”
Julien Offray de La Mettrie, Man as Machine (1748)
The concept of the man-machine was put forward by French physician and philosopher Julien Offray de La Mettrie (1709–51) in his materialist manifesto L’homme machine (Man as Machine), first published anonymously in Holland in 1747 but dated 1748. La Mettrie’s argument was informed by his clinical expertise and he approached philosophy from a medical standpoint. His provocative work challenged religious orthodoxy by dismissing the idea of humans having an animal or human soul. La Mettrie argued that they do not have free will but rather are automatons whose actions are determined by bodily states. He attacked the hypothesis of monadism proposed by German philosopher and mathematician Gottfried Wilhelm Leibniz (1646–1716) and his supporters as “incomprehensible,” writing: “They have spiritualized matter rather than materializing the soul. How can we define a being whose nature is utterly unknown to us?”
La Mettrie’s book caused a scandal and copies were burned in public. His philosophy was considered too radical by his contemporaries, and even other French philosophers turned against him, denouncing him as a lunatic. La Mettrie was forced to seek asylum in Berlin at the court of the Prussian King Frederick II, where he spent the rest of his life. La Mettrie’s philosophy has undergone a resurgence of interest since the twentieth century with the advent of neuroscience and the study of cognitive systems. CK
1747
Thalassotherapy
Richard Russell
The notion that sea bathing is beneficial in treating physical and mental conditions
A male patient undergoes thalassotherapy for arthritis in 1949.
Thalassotherapy is a therapy based on the belief that bathing in the sea is beneficial to health. Practitioners believe that the trace elements of calcium, iodide, magnesium, potassium, and sodium found in seawater are absorbed through the skin. The minerals are thought to boost blood and lymph circulation, thus accelerating the metabolism and eliminating toxins. Thalassotherapy is carried out both in the sea and in spas that use hot seawater; sometimes marine mud or seaweed is applied. The therapy is used to treat arthritis, depression, eczema, muscular pain, psoriasis, and rheumatism.
Sea bathing dates back to ancient times, but English physician Richard Russell (1687–1759) was the first to document the medical benefits of seawater. A resident of Lewes in Sussex, England, he went to Brighton to test his theories in 1747. Three years later, he published a dissertation in Latin, De tabe glandulari sive de usu aquae marinae in morbis glandularum, which advocated the use of seawater to cure enlarged lymphatic glands. This appeared in English in 1752 as “Glandular Diseases, or a Dissertation on the Use of Sea Water in the Affections of the Glands: Particularly the Scurvy, Jaundice, King’s-evil, Leprosy, and the Glandular Consumption.”
“I have seen many patients relieved, and some cured, by the … use of sea water.”
Richard Russell, Glandular Diseases (1752)
Russell’s seawater cure was so popular that it initiated the development of seaside resorts, with wealthy Londoners flocking to towns on England’s south coast, such as Brighton. The practice spread to the continent, and a French physician, Joseph La Bonnardière (1829–87), invented the term “thalassotherapy” in 1865, drawing on the Greek words thalassa (sea) and therapeia (treatment). CK
1748
Separation of Powers
Montesquieu
The division of a government’s powers into branches, each with its own responsibilities
How does a society, through its government, protect political liberty? For Charles-Louis de Secondat, Baron de Montesquieu (1689–1755), the answer lay in dividing its powers to prevent a concentration of authority that would lead to encroachments on freedom. In his treatise De l’Esprit des lois (The Spirit of Laws, 1748), Montesquieu coined the phrase “separation of powers” to describe a political system wherein three different government branches—the legislative, the judicial, and the executive—each had their own function. The legislative branch creates laws and provides for funding, the executive implements the government’s policies, and the judicial branch presides over conflicts. Montesquieu believed that each should have the ability to restrain the others.
“Liberty [is] intolerable to … nations who have not been accustomed to enjoy it.”
Charles-Louis de Secondat, Baron de Montesquieu
A French nobleman and lawyer, Montesquieu carefully examined both historical and contemporary governments, looking at how they operated, what they lacked, and what they were able to produce. Today, there is no system of democracy that exists without some type of separation of powers. While governments around the world separate powers in different manners, and have various systems of checks and balances to restrain each branch, the notion that governmental authority must be split to ensure personal liberty and safety is a foundational one in modern political theory. The separation of powers became a cornerstone principle in the creation of the United States of America, and, although not all states use the three-branch system, the idea is almost universally present in modern political structures. MT
1748
Verificationism
David Hume
The concept that we can discuss only what we can measure
The concept of verificationism holds that it must be possible to determine whether a statement or idea is true or false, and that a question must be answerable for it to deserve consideration. When the concept originated is uncertain, but it was evident in the writings of the Scottish philosopher David Hume (1711–76), and later in those of Immanuel Kant, Gottfried Wilhelm Leibniz, Ludwig Wittgenstein, and even Albert Einstein. Hume suggested that for concepts to be accepted, they must first be verified by sensory experiences. In his Enquiry Concerning Human Understanding (1748), he considers the verifiability of divinity and metaphysics. He writes: “Does it contain any experimental reasoning concerning matter of fact, and existence? No. Commit it, then, to the flames.”
“The philosopher … is not directly concerned with … physical properties.”
A. J. Ayer, philosopher
Verificationism claims that we are not born with any “innate” knowledge, and that mankind must acquire knowledge through direct observation. In his heavily criticized but widely read work Language, Truth and Logic (1936), the philosopher A. J. Ayer (1910–89) suggested the use of verificationism to sift serious statements from gibberish. According to Ayer, “A proposition is verifiable in the strong sense if and only if its truth could be conclusively established by experience.”
Verificationism was taken up by logical positivism in an attempt to render aesthetics, ethics, metaphysical inquiry, and even religious belief meaningless. If God cannot be verified, why bother to think of Him? But in the end, verificationism was doomed by a paradox of its own making: it turned out to be unverifiable itself, being neither analytically nor empirically testable. BS
c. 1750
Deobandi
Shah Waliullah
A movement advocating that Muslims’ first loyalty is to their religion
The campus of Darul Uloom Islamic University, located in the north Indian town of Deoband.
“Deobandi” is a term used to describe a revivalist movement in Sunni Islam. The movement is named for the city of Deoband in India, which is home to the leading madrassa (Muslim theological school) in the country, Darul Uloom (House of Learning).
Deobandi was inspired by the mid-eighteenth-century teachings of Indian sociologist, historian, and Islamic reformer Shah Waliullah (1703–62). At a time when Muslim power was waning, Waliullah worked to revive Islamic rule. His approach in explaining Islamic social theory originated in the Koran. He wrote about numerous Islamic topics, including hadith (the study of traditions), fiqh (jurisprudence), tafsir (Koranic exegesis), kalam (scholastic theology), and falsafah (philosophy). His ideas of Islamic economics and society were seen as revolutionary at the time. Some academics regard Waliullah as the first person to conceive the idea of Pan-Islam because he advocated a common system of jurisprudence, the establishment of national governments with just rulers, and an internationalism in keeping with what he regarded as the purpose of the sharia of the Islamic prophet Muhammad.
“I am a student of the Qur’an without any intermediary.”
Shah Waliullah
Inspired by Waliullah’s teachings, the Darul Uloom madrassa was founded in 1867, nine years after the First War of Indian Independence and at the height of British colonial rule in the Delhi region of northern India. Many leaders of the Taliban, which seized power in Afghanistan in 1996 after the Soviet withdrawal, were educated in Deobandi madrassas, and some commentators regard Deobandi as having given rise to the Taliban movement in Afghanistan. In 2008, Deobandi leaders held a conference denouncing terrorism. CK
c. 1750
Romanticism
Johann Georg Hamann
An artistic movement that saw human nature as wild, emotional, and individual
The Wayfarer Above a Sea of Fog (c. 1818), by Caspar David Friedrich, captures the Romantic love of wild nature.
Romanticism in the arts in Europe and North America began in the mid-eighteenth century and peaked in the early nineteenth century. The movement emphasized individualism, sublime nature, heightened emotions, creative power, the supernatural, and the imagination, and celebrated the inner world and genius of the artist. The movement’s longing for wild, untamed nature came partly as a response to the arrival of the Industrial Revolution (1760–1840), and the widespread anxiety caused by social dislocation and increasing mechanization.
In artistic terms the movement was a reaction against the order, balance, and idealized harmony that characterized the Enlightenment and Neoclassicism in the eighteenth century. The German literary movement Sturm und Drang (Storm and Stress) emphasized human passion, the mystical, and wild nature, and its thinking was profoundly influenced by German philosopher Johann Georg Hamann (1730–88) and his critique of Enlightenment rationalism.
“Our frame is hidden from us, because we … are wrought in the depths of the earth.”
Johann Georg Hamann, Socratic Memorabilia (1759)
Romanticism spread throughout Europe and North America in literature, art, architecture, and music. Among prominent Romantic writers are the German writer Johann Wolfgang von Goethe and the English poets William Blake and William Wordsworth. Romantic artists include Swiss painter Fuseli, German painter Caspar David Friedrich, French painter Eugène Delacroix, U.S. painter Thomas Cole, and English painter J.M.W. Turner. Romantic composers include Frédéric Chopin from Poland, Hungarian Franz Liszt, and Italian Giuseppe Verdi. CK
c. 1750
Classical Music
Joseph Haydn
A European musical movement inspired by ancient Greek and Roman ideals
The classical period of music in Europe took place between about 1750 and 1820. The music is characterized by its use of homophony, a single melodic structure supported by accompaniment, and multiple changes of key and tempo in a single piece. The best-known composers of the period include Joseph Haydn (1732–1809), Wolfgang Amadeus Mozart (1756–91), and Ludwig van Beethoven (1770–1827).
When the great Baroque composer Johann Sebastian Bach died in 1750, Haydn, widely recognized as the first classical composer, was eighteen years old. Many European artists had begun to embrace the ideals of Neoclassicism, a movement espousing a newfound respect for the works of the ancient Greek and Roman artists. Classical composers created music that reflected the ideals of symmetry, harmony, and balance that they found in classical art. The movement eschewed Baroque and Rococo displays of musical virtuosity and grandiose compositions for simpler, more elegant works. The classical period saw the development of sonata form, and also a rise in the prominence of forms such as the concerto and the symphony.
“I was cut off from the world … and I was forced to become original.”
Joseph Haydn
Beethoven is often credited as the last classical composer (or the first of the Romantic era). Today, people often use the term “classical” loosely to refer to Western art music from many periods, but the music from the true classical period of 1750 to 1820 is perhaps the most popular form. Classical era works are a staple of modern orchestras, symphony performances are rigorously studied, and the music features regularly in movies, television programs, and commercials. MT
1753
Biblical Source Criticism
Jean Astruc
The study of ancient Judeo-Christian manuscripts to evaluate biblical texts
Sometimes called biblical documentary criticism or biblical literary criticism, biblical source criticism involves studying Judeo-Christian manuscripts to determine whether books and passages in the Bible stem from one author or multiple sources. When investigating a source, scholars try to establish its date, where it was written, why it was written, and whether it was redacted (changed for a specific purpose), in order to arrive at the document’s original form.
Source criticism began with a book by French scholar Jean Astruc (1684–1766), Conjectures on the Original Documents that Moses Appears to Have Used in Composing the Book of Genesis (1753), in which Astruc investigated the mosaic of manuscript sources that led to the Book of Genesis. His methods were developed by German scholar Julius Wellhausen (1844–1918), who is known for his Prolegomena to the History of Israel (1883), which argues that the Torah, or Pentateuch, originated from redactions of four texts dating centuries after Moses, who was traditionally credited as the author.
“In the beginning was the Word, and the Word was with God, and the Word was God.”
The Bible, John 1:1
Thanks to these authors, it is now accepted that the Pentateuch comes from four sources: Yahwist, written in c. 950 BCE in southern Israel; Elohist, written in c. 850 BCE in northern Israel; Deuteronomist, written in c. 600 BCE in Jerusalem during a period of religious reform; and Priestly, written in c. 500 BCE by Jewish priests in exile in Babylon. The same methodology, applied to the New Testament, led to the discovery that Mark was the first Synoptic Gospel written. Matthew and Luke both depend on Mark’s Gospel, along with the lost collection of sayings Q (quelle, German for “source”). CK
1755
Compatibilism
David Hume
The argument that determinism and free will need not rule each other out
Also known as soft determinism, compatibilism holds that for humankind determinism and a form of freedom can exist at the same time, without being logically contradictory or inconsistent. Determinism is the view that every event—including human thought, behavior, decision making, and action—has a cause that not only explains it, but also has been determined to occur by previous events. So, you really had no choice but to read this book—a series of prior events and your own desires and motivations deterministically brought you to the point where you are now, reading this book. Compatibilists accept determinism, but also claim that people are free in the sense that, when they do act on their determined motivations, there are no external impediments blocking them from performing or carrying out the act. So, that you are actually able to read this book without something or someone preventing you from reading it means that you are free.
One of the most influential proponents of this view was David Hume (1711–76), particularly in Of Suicide (1755), though earlier forms of the concept had been expressed by the Stoics and Thomas Hobbes (1588–1679). Arthur Schopenhauer (1788–1860), whose philosophical work was largely focused on motives and will, later wrote that a person is often free to act on a motive if nothing is impeding the act, but the nature or source of that motive is determined and not in the control of the subject. So, while you may be free to walk into a restaurant because the door is open, the hungry feeling itself or even the timing of it is not in your control—it is your biology.
Some have argued that human genetics demonstrates compatibilism since a great deal of our anatomy, physiology, and psychology is determined by our genes. Determination made me tell you that, but at least I was free to act on that determination. KBJ
1755
Critique of Inequality
Jean-Jacques Rousseau
The argument that inequality stems from living in societies and from unjust laws that are created by these societies
A cartoon from the French Revolution (1787–99) titled “The Peasant weighed down by the nobility and the clergy,” depicting the inequality in society that was faced by many.
When the Genevan philosopher Jean-Jacques Rousseau (1712–78) was asked, “What is the origin of the inequality among men, and is it justified by natural law?” he responded with what many consider a philosophical masterpiece, his Discourse on the Origin of Inequality (1755). In it, he argued that “natural man” is good, but failures in the social contract governing the relationship between people lead to inequality.
“You are undone if you once forget that the fruits of the earth belong to us all.”
Jean-Jacques Rousseau, Discourse on the Origin of Inequality (1755)
Rousseau founded his critique on the assumption that people in their natural state are free from jealousy, but the exposure to other people’s abilities and achievements that comes from living in society causes jealousy and vice to emerge. Society, then, makes people want to set themselves above others. The concept of property, too, poses a significant problem, because a government and laws are then needed to protect people’s possessions. Government thus ends up further promoting inequality, because it is mostly of advantage to the rich.
Rousseau argued that a new social order was needed to banish inequality. He went on to write a sequel to Discourse on the Origin of Inequality, called The Social Contract (1762), in which he made a case for thinking about society as an artificial person governed by a collective will. This, Rousseau asserted, would enable men to receive independence from “actual law,” which protects the societal status quo, and enjoy “true law,” which confers political freedom upon an individual. Current “actual law” gives people no choice but to submit all of their rights to a society that only divides them further from one another. Rousseau’s ideal of a government founded upon the consent of the governed underlies political philosophies such as Marxism and Socialism, as well as, in modified form, representative democracies such as that of the United States. MK
1756
Artificial Preservation of Food
William Cullen
The development of technology dedicated to arresting the natural spoilage of food
A supplier awaits a telephone call requesting a delivery of ice for a refrigerator or ice box, 1915.
While people had been fermenting and drying food since time immemorial, it was not until Scottish physician and agriculturalist William Cullen (1710–90) proposed a device to refrigerate foods, thereby slowing spoilage, that a man-made object alone worked against the laws of nature to preserve food. Refrigeration did not become a common method of food storage until almost 200 years after Cullen first demonstrated it in 1756, but its development was a clear victory in the battle against microorganisms that rob food of its looks, nutritional content, and edibility.
Cullen’s first refrigeration system was made by using a pump to create a partial vacuum over a container of diethyl ether, which, when boiled, absorbed heat from the air around it. This, in turn, created cool air and even produced ice. Perhaps because of the complicated nature of the system, people did not see a practical use for refrigeration at first, but efforts continued to create artificial cooling systems. Benjamin Franklin (1706–90), among many other scientists, persisted in conducting experiments to that end. At first, only large food-producing operations, such as slaughterhouses and breweries, used refrigeration, but then trucking companies began to install cooling systems in the vehicles that they used for transporting perishable items, and by the 1920s refrigerators were available as consumer products.
Refrigeration makes possible the shipping of food to geographic locations where that food cannot be grown or processed, thereby expanding the potential variety and nutritional content of the populace’s diet. Coupled with improved health care, a balanced diet allows for a better quality, longer lifespan—and this would be difficult to accomplish without the refrigerator. In addition, refrigerators facilitate longer working hours, since food can be purchased or prepared ahead of time and reheated as needed. MK
1759
Cultivate Your Garden
Voltaire
Philosophical disillusionment may be countered by work and friendships
A first edition of Candide (1759) by French writer Voltaire, held at the Taylor Institute in Oxford, United Kingdom.
To “cultivate your garden” is, according to the satirical novel Candide (1759) by Voltaire (1694–1778), the secret to happiness. The book’s main character, Candide, rejects the relentless optimism of his tutor, Doctor Pangloss, who argues that “all is for the best in this best of all possible worlds.” Candide and his companions discover otherwise when they venture into the world, which is filled with disappointment and misfortune, much of it modeled on real events, such as the Seven Years’ War (1756–63) and the Lisbon earthquake of 1755. Yet, peace and happiness are found when Candide and his friends settle on a farm and cultivate a garden.
Voltaire’s inspiration for the novel was his own dissatisfaction with the metaphysical optimism that pervaded the philosophical landscape of his time. He especially takes issue with German philosopher Gottfried Wilhelm Leibniz (1646–1716) and his rosy outlook on a world that Voltaire recognized was full of tragedy, misery, and folly. Through Candide, Voltaire suggests that, instead of finding solace in abstract hopes, people ought to “cultivate their gardens,” or embrace a pragmatic worldview built on intentional work and bolstered by strong human relationships.
Candide and its irreverent philosophy, as well as its cuttingly humorous tone, threatened institutions and philosophers long after its publication, and the novel has been banned hundreds of times over hundreds of years for being “filthy” and “obscene.” Even so, it has been canonized in Western literature, and its black humor and portrayal of people trapped in meaningless existences have influenced the work of anti-establishment authors such as Joseph Heller, Thomas Pynchon, Kurt Vonnegut, and Samuel Beckett. Candide’s rejection of metaphysical meaning, when combined with its exaggerated portrayal of folly, makes it the forerunner of theatrical works such as Beckett’s Waiting for Godot (1952). MK
1762
Man Is Born Free
Jean-Jacques Rousseau
The concept that freedom is the natural state of humankind, but that freedom is lost through the corrupting and alienating effects of an overly materialistic society
This frontispiece of one of the 1762 editions of Jean-Jacques Rousseau’s Social Contract also features the work’s subtitle, “Principles of Political Right.”
The first words of the hugely influential Social Contract (1762) by Genevan philosopher Jean-Jacques Rousseau (1712–78) are haunting: “Man is born free, and everywhere he is in chains. Those who think themselves the masters of others are indeed greater slaves than they.” In a “natural” state, Rousseau argues, people are happy, healthy, altruistic, and good. When society is imposed upon them, though, the effects are corrupting and alienating; a man is pitted against his neighbor, a woman against her friend, as all compete for power through prestige and material wealth.
“Nature never deceives us; it is we who deceive ourselves.”
Jean-Jacques Rousseau
Rousseau’s work is founded on a philosophy of anthropology, arguing that, at early stages of societal development, human societies are at their moral best. He asserts that “uncorrupted morals” can be observed in human cultures that are less technologically and socially sophisticated than the eighteenth-century Western world. He uses as an example Caribbean culture, suggesting that, even though they live in a hot climate, which “always seems to inflame the passions,” Caribbean people are able to conduct themselves with admirable morality, largely because they are devoid of the societal and material trappings of more technologically and philosophically “advanced” cultures.
Even though he was reacting against philosophers such as Thomas Hobbes (1588–1679), who said that a man “in a state of nature … has no idea of goodness, he must be naturally wicked; that he is vicious because he does not know virtue,” Rousseau created an equally reductive image, that of a “noble savage,” which would persist alongside his conception of people being born free of the restrictions society eventually imposes upon them. The latter notion was foundational in the creation of the United States Declaration of Independence of 1776, and also in the eventual abolition of slavery throughout the world. MK
1763
Bayes’ Theorem
Thomas Bayes
A mathematical formula for calculating the probability of an event recurring
Bayes’ Theorem, a probability theory and mathematical formula, is credited to English mathematician Thomas Bayes (1701–61); it was published posthumously in An Essay towards Solving a Problem in the Doctrine of Chances (1763). The theorem is used to calculate the probability that something is true or will be true given a prior set of circumstances. For example, let us say that a man wakes up in the morning and the moment after the sun rises he hears a rooster crow. If this happens once or twice he might simply take note of it; but if it happens repeatedly for twenty-five or thirty-five days, or even months or years, it is highly likely that he will form a link between the events, concluding that the rooster crowing the moment after the sun rises is highly probable and very likely to carry on happening in the future.
The formula looks like this: Pr(A|B) = Pr(B|A) × Pr(A) / Pr(B). On the left side of the equation is the conditional probability, or what we want to know—the probability of event A (the rooster crowing) if event B (the sun rising) happens or is true. The part of the formula on the right of the equation gives us the tools for finding the conditional probability. It involves assigning numerical values to the three components; basically, it gives numerical weight to how commonly or often A occurs, B occurs, and A and B occur together. The more often A and B occur together, relative to how often B occurs on its own, the more probable it is that A occurs, given B.
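As a concrete check of the formula, the short Python sketch below plugs in invented probabilities (the numbers are purely illustrative and are not drawn from the rooster example) and recovers Pr(A|B) from Pr(B|A), Pr(A), and Pr(B).

# Hypothetical values, chosen only to illustrate the arithmetic.
p_a = 0.01               # Pr(A): prior probability that A is true
p_b_given_a = 0.9        # Pr(B|A): probability of observing B when A is true
p_b_given_not_a = 0.05   # Pr(B|not A): probability of B when A is false

# Pr(B), found by the law of total probability.
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Bayes' Theorem: Pr(A|B) = Pr(B|A) * Pr(A) / Pr(B)
p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_a_given_b, 4))  # about 0.1538: observing B makes A far more probable than before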
Bayes’ Theorem is important because constantly trying to calculate conditional probabilities is a natural response on the part of humankind to existing in a largely ungovernable material world. People wake up with a very strong expectation that daylight will appear, that gravity is still in place, that their cars will start, and other similar inductive inferences, and they form these beliefs because of consistent, repeated occurrences and relationships. KBJ
1764
Neoclassicism
Johann Joachim Winckelmann
An artistic movement that returned to the classical style of ancient Greece and Rome
The discovery of the Roman cities of Herculaneum in 1738 and Pompeii in 1748 helped revive an interest in all things classical at a time when the elaborate Rococo style was starting its downward spiral out of fashion. One work, The History of Art in Antiquity (1764), by German scholar Johann Joachim Winckelmann (1717–68), was especially influential in convincing artists to work in the manner of the Greeks, which he believed would give their art an archetypal purity of form based in geometric proportion and symmetry.
Classical mythology and the Homeric epics were popular subjects, and rendering the dress and decorative features correctly was an important part of this style in painting, sculpture, and printmaking. Artists copied designs from pattern books that were themselves renderings of newly discovered antiquities. The austerity of the Roman style suited the revolutionary themes prevalent in France during this period and many artists, most notably the painter Jacques-Louis David (1748–1825), married the imagined strength of ancient Republican virtues with a reformer’s uncompromising zeal. A strong line quality, the imitation of drapery and poses found in Greek and Roman sculpture, and a flattening of color and formality of composition are all hallmarks of this style.
“The only way for us to become great … lies in the imitation of the Greeks.”
Johann Joachim Winckelmann
Simultaneously there was also a strong trend toward Romanticism in this period, and while the two movements had differing goals and stylistic features, they did coexist. Some artists even produced works in both genres depending on the subject or the patron for whom they worked. PBr
1767
Solar Power
Horace-Bénédict de Saussure
Harnessing the power of the sun to provide energy
The sun has shone on the Earth since the planet first formed, freely expending its immense power. It was frustrating for humankind that there was no way to harness it, but that situation changed in 1767, when the energy of the sun was captured for the first time.
Horace-Bénédict de Saussure (1740–99) was a Swiss physicist, geologist, and early Alpine explorer. In 1767, after several false starts, he managed to create a solar oven—the first device in the Western world to use the energy of the sun to create heat. It consisted of an insulated box covered by three layers of glass that absorbed and therefore trapped thermal radiation from the sun. The highest temperature he obtained in the device was 230° F (110° C), a temperature that did not vary greatly when the box was carried down from Mount Crammont’s summit to the Plains of Cournier, 4,852 feet (1,479 m) below. Carrying the oven down the mountain provided proof that the external air temperature, low at the peak but high on the plain, played no significant part in the result.
“I’d put my money on the sun and solar energy. What a source of power!”
Thomas Edison to Henry Ford, 1931
Today, extensive arrays of solar panels convert solar radiation directly into electricity: absorbed light frees electrons within semiconductor cells, and these are drawn toward positively charged electrodes, a process known as the photovoltaic effect. Other systems use polished lenses or mirrors to focus sunlight into concentrated beams whose heat is used to generate electric current, typically by raising steam to drive a turbine. As the world’s finite energy resources are depleted, the relatively infinite power of the sun is now coming into its own. SA
1771
Sign Language for the Deaf
Charles-Michel de l’Épée
Using a vocabulary of gestures rather than sounds to communicate
A postcard depicting the alphabet in sign language, from c. 1930.
It is unknown when the first sign languages for the hard of hearing developed, but Charles-Michel de l’Épée (1712–89), who founded the world’s first free school for the deaf in pre-Revolutionary France in 1771, is often regarded as a seminal figure. While he did not invent a completely original sign language, he learned the signs that his students used; these belonged to what is termed “Old French Sign Language,” a collection of signs that, in his time, was used by about 200 Parisians. De l’Épée emphasized their use in instruction, and attempted to systematize a sign language based on them. Moreover, he made his successful instructional methods widely available, encouraging his peers to focus on spreading the use of sign languages; indeed, one of his students cofounded the first school for the deaf in the United States and laid the foundation of American Sign Language.
“As long as we have deaf people on earth, we will have signs.”
George Veditz, advocate for deaf rights
Before sign languages for the deaf were developed, signs, although commonly used, were usually improvisatory and relatively unorganized. In certain communities where deafness was prevalent, “village sign languages” developed: a well-known example was identified in Martha’s Vineyard, Massachusetts, from the early eighteenth to the mid-twentieth century. However, without a universal form of communication, deaf people were typically unable to receive much education, to travel beyond their communities, and to communicate outside their families. Even now, sign language is not universally adopted in the education of the deaf, and various controversies about its use persist. GB
1772
Abolitionism
England
The belief that no part of society has the right to enslave human beings
A detail of the celebratory oil painting Proclamation of the Abolition of Slavery in the French Colonies, April 23, 1848, made in the following year by François Auguste Biard.
Any aim to end the enslavement of one group of human beings by another is referred to as abolitionism, although the term is generally associated with the effort made to end slavery of Africans in Europe and in North and South America during the seventeenth- and eighteenth-century Enlightenment era.
“The state of slavery is of such a nature that it is incapable of being introduced on any reasons, moral or political, but only by positive law, which preserves its force long after the reasons, occasions, and time itself from whence it was created, is erased from memory.”
Lord Mansfield, Somersett’s Case (1772)
In England, the most telling legal event was Somersett’s Case (1772), in which it was ruled that slavery was unsupported by law in England and Wales. Commitments to abolition waxed and waned during the nineteenth century, but the principles of equality and egalitarianism held sway long enough to convince, or perhaps force, governments in the Americas to follow suit over time. Indeed, during its various conflicts with the United States, the British government encouraged slaves to abandon their owners by offering them emancipation and passage out of the colonies, and many slaves took the British up on their offer of free transport and political freedom. Slavery established the ideological framework for the American Civil War (1861–65), with pro-slavery southern states pitted against the abolitionists of the north; slavery in the United States was only ended in 1865 by the passage of the Thirteenth Amendment to the Constitution. The last nation to abolish slavery was Mauritania in 1981.
Abolitionism offered unqualified commitment to equity in the treatment of all human beings. The idea behind the abolitionist program, that all humans are to be treated as equal, has led to the subordination of the aristocratic, meritocratic ethos by the egalitarian ethos of contemporary liberalism, and to the widespread recognition of the inherent dignity of all human beings. Abolitionism proved to be one of the primary catalysts for self-reflective questioning of the imperialist attitude of the great empires of the nineteenth and twentieth centuries, which led to acceptance of the principle of national self-determination for former colonies. JS
1774
Animal Magnetism
Franz Anton Mesmer
The theory that well-balanced magnetic fluids or fields are the key to human health
Franz Anton Mesmer calls upon the power of the moon to produce an animal magnetic effect in a patient, in a satirical engraving titled “Mesmer and His Tub,” published in 1784.
German physician Franz Anton Mesmer (1734–1815) had a theory that all things had magnetic fluid in them, with a natural energetic transference occurring between living and nonliving things in the universe, the energy in question being largely derived from the stars. Mesmer held that the magnetic fluids or fields in humans could be manipulated for healing or other purposes. Imbalances in the invisible magnetic fluid were held to cause disease, which could be psychological (such as anxiety or phobias) or physical (including spasms, seizures, and pain).
“The action and virtue of animal magnetism, thus characterized, may be communicated to other animate or inanimate bodies.”
Franz Anton Mesmer
The term “animal magnetism” refers to the universal principle that everything contains this magnetic fluid, and also to the therapeutic system by which people alter the state of their magnetic fluids, rebalancing or even transferring the magnetic fluid, either through touch or by holding hands passively over the body.
In 1774, Mesmer produced an “artificial tide” in a patient. She was given a preparation containing iron, and magnets were attached all over her body. A mysterious fluid seemed to stream through her body, and her symptoms were relieved for several hours. But Mesmer’s best-known patient was eighteen-year-old Maria Theresia Paradis, who had been blind since the age of four. During 1776 and 1777, Mesmer treated her with magnets for nearly a year, and her sight seemed to be restored. However, after some time had passed, her blindness returned. A committee at the University of Vienna investigated Mesmer and declared him a charlatan, saying that rather than curing her he simply made her believe she was cured.
Mesmer’s animal magnetism and practices drew attention to the fact that psychological treatment undoubtedly had a direct influence on the body and its ailments. Treatment with magnets is still being practiced in continental Europe today, and in the United States you can buy the “ionized” Q-Ray bracelet. KBJ
c. 1775
Trade Union
Europe
An organization created to improve conditions in the workplace
While trade unions are thought to have roots in the guilds of medieval Europe, they reached popularity around 1775, not long after the beginning of the Industrial Revolution in Europe. At this time, women, children, farm helpers, and immigrants were joining the workforce in large numbers, and workers were beginning to organize spontaneous protests and strikes concerning their treatment and conditions of employment in mills, mines, and factories.
Trade unions had mixed success in representing their members in the eighteenth century. Employers had to choose between giving in to union demands that might involve costs to them, or losing production. Typically their response to demands from unskilled workers was dismissive, but skilled workers fared better because they were less easily replaced. Even so, both trade unions and membership of them were generally illegal or unsupported by the law in the eighteenth and nineteenth centuries, and members even faced the threat of execution in some countries. Reform came slowly. The labor of children under the age of ten was not outlawed until the Factory Act of 1878, and it was not until the late nineteenth century that some trade unions were legitimized and began to acquire political power.
“Labor unions are the leading force for democratization and progress.”
Noam Chomsky, linguist and historian
Trade unions changed our views of both human rights in the workplace and the meaning of labor. Previously, workers were often abused, treated like dispensable property, and paid wages too low to support a family. The consumer has benefited, too, in being able to learn the real cost or value of goods. KBJ
1776
Nationalism
United States
The feeling of loyalty to a nation, often resulting in the belief that it is superior
Eugène Delacroix’s Liberty Leading the People (1830) depicts the French nation triumphing over the monarchy.
Nationalism, often associated with patriotism, is based on an identification with, and loyalty and devotion to, the body of individuals who comprise a nation. People are also described as nationalist if they consider their nation to be the best and seek to promote the culture and interests of their national community to the detriment of other nations or groups. Enlisting in the army to fight for the freedoms of your fellow citizens, or just simply thinking that you live in the best place on earth, are forms of nationalism. It is essentially an ideology, but one that varies from right to left of the political spectrum.
While nationalism has been practiced for centuries by many different cultures, its modern form can be traced to the United States Declaration of Independence in 1776, the American and French revolutions in the eighteenth century, and the unification movements of Germany and Italy in the nineteenth century—all of which signify people unifying for self-determination and independence.
“To wish for the greatness of one’s fatherland is to wish evil to one’s neighbors.”
Voltaire, Philosophical Dictionary (1764)
In negative terms, nationalism restricts the movement of people and money, and the territorial rights to certain resources. It has also played a role in many wars. In the case of Adolf Hitler and the Nazis, nationalism was associated with genocide justified by claims of racial, cultural, and genetic superiority.
World War II (1939–45) radically changed the political landscape in many ways, not least in the recognition that pride in one’s nation needs to be kept in check. As the Bible says in Proverbs 16:18, “Pride goes before destruction, and a haughty spirit before a fall.” KBJ
1776
The Pursuit of Happiness
Thomas Jefferson
All people have an undeniable right to pursue a life of happiness
The U.S. Declaration of Independence, signed in 1776 by Thomas Jefferson and others.
In the second sentence of the United States Declaration of Independence, written by Thomas Jefferson (1743–1826) in 1776 and signed on July 4 of the same year, the pursuit of happiness—along with life and liberty—is identified as an unalienable right to which all people are entitled. The declaration listed grievances of the U.S. colonists against King George III of Great Britain, detailing in part how his actions had deprived the colonists of their human rights—including their right to pursue lives of happiness.
In 1689, English philosopher John Locke (1632–1704) had published Two Treatises of Government, in which he wrote that governments exist under an umbrella of natural law, a law that guarantees everyone a right to life, liberty, and estate. When Jefferson wrote the Declaration of Independence, he used Locke’s idea, along with others, to convey the notion that a society, and the government that rules it, must protect those inherent rights. Yet when he wrote the Declaration of Independence, Jefferson chose to use the word “happiness” instead of “estate.”
“ … these United Colonies are, and of Right ought to be Free and Independent States …”
The Declaration of Independence (1776)
Jefferson believed in government as a means of ensuring that people are free, happy, and able to pursue their own personal desires. The concept of individualized rights, and that of a government whose primary purpose is to protect those rights instead of itself, is at the heart of democracies around the world. The Declaration of Independence, although not a law in itself, was an expression of shared values established in a moral framework, one that the citizens of the United States strived to live up to. MT
1776
The Invisible Hand
Adam Smith
Human affairs are guided by a natural force toward optimal outcomes
A phrase coined by Scottish moral philosopher and economist Adam Smith (1723–90) in The Wealth of Nations (1776), the “invisible hand” is thought by some to be a metaphor for the self-regulating behavior of the marketplace, and by others to be the hand of God or natural human moral sense. In all cases, it is a “natural force” of social freedom and economic justice that guides participants in free market capitalism to trade resources in a mutually beneficial manner.
When people come together in a society, there is often a struggle before they find ways to adapt, cooperate, and thrive. However, individual freedom and natural self-interest need not produce chaos if the natural force of the “invisible hand” guides human economic and social commerce; the result may be order and harmony. In a free market, people trade resources, some scarce, and this enables individuals to be better off than if each produced for himself alone. As people bargain and trade, the resources of their society naturally become part of the ends and purposes they value highly. Thus, with the “invisible hand” guiding events, there is no need for regulations to ensure that each participant sees benefit.
“It is not from the benevolence of the butcher … that we expect our dinner.”
Adam Smith, The Wealth of Nations (1776)
Smith was radical in regarding social and economic order as organic products of human nature and freedom. The prospering marketplace had no need for control by kings or governments, but would grow best with open competition and free exchange. In this way, Smith’s idea of the “invisible hand” and the writings that pertained to it actually referred to human social psychology as much as to economics. KBJ
1778
Biblical Redaction Criticism
Hermann Reimarus
The view that Jesus’s sayings must be distinguished from those of the apostles
Biblical redaction criticism is an area of Bible study that examines how its authors and editors processed and assembled their final literary output, all with a view to discovering what those individuals hoped to achieve.
Modern Biblical redaction criticism began with the German philosopher Hermann Reimarus (1694–1768) and his analysis of the historical figure of Jesus in Von dem Zwecke Jesu und seiner Jünger (On the Intention of Jesus and His Disciples, 1778), which was published a decade after his death. Reimarus distinguished what the gospel writers said about Jesus from what Jesus said himself. For example, Reimarus suggested that the disciples fabricated the story of the Resurrection.
Reimarus’s work caused an uproar on its publication, but it encouraged others to make a critical study of the New Testament. German theologian David Strauss (1808–74) caused controversy with his Das Leben Jesu kritisch bearbeitet (The Life of Jesus, Critically Examined, 1835), which asserted that the gospels had been altered and that the miracles mentioned were mythical in character. Strauss also suggested that the writers of the gospels of Matthew and Luke had used the gospel of Mark as a source.
“The apostles were … teachers and consequently present their own views.”
Hermann Reimarus, On the Intention of Jesus … (1778)
German theologian Wilhelm Wrede (1859–1906) attempted to demonstrate the historical unreliability of the gospel of Mark, and also the influence that the apostle Paul had on Christianity and the idea of salvation. Redaction criticism has since been applied to other areas of scripture, but detractors assert that its methods cast doubt on the Bible both as an inspired work and as a trustworthy historical document. CK
1778
Totem Poles
Pacific Northwest Native Americans
The symbolic display of cultural information through large carvings
Totem poles are read from bottom to top, and can depict people, animals, and spirits.
The word “totem” is derived from the Ojibwe word odoodem, meaning “his kinship group.” Native American totem poles are ambiguous structures, vertical carvings with intricate designs whose meaning is often unclear. While the age of many totem poles is uncertain, the Western red cedar most often used in their construction barely lasts for a hundred years. European explorers first encountered them in 1778, but their history predates that. They may be the product of a long history of monumental carving, starting perhaps with house posts and funerary memorials and then enlarging into symbols of family or clan prestige. Because they were produced by societies without written records, no documentation of their age exists, but it is clear that these structures were proud symbols of their creators.
“Some [are] reminders of quarrels, murders, debts, and other unpleasant occurrences.”
Ishmael Reed, From Totems to Hip-Hop (2002)
The importation of iron and steel tools from Europe and Asia during the 1800s enabled totem makers to carve far larger and more complex poles. The Haida people of the islands of Haida Gwaii, off the coast of British Columbia, probably originated these carvings, which then spread down the coast into what is now Washington State. (Totem poles do not exist in the American southwest or northern Alaska, as no tall trees can grow in these inhospitably hot and cold climates.)
The designs of totem poles vary according to the intent of their creators. Some recount legends and family histories; others refer to cultural beliefs or shamanic powers. Certain poles were mortuary structures; others were “shame poles,” on which an individual who had failed in some way had their likeness carved upside down. They were never objects of worship. SA
Late Modern
1780–1899
The Gare St. Lazare (detail, 1877), by Claude Monet, offers an Impressionist view of the steam engine, the driving force behind the Industrial Revolution.
Ideas that emerged during this period regularly assumed a material form in the parts and processes of the Industrial Revolution. Karl Marx and Friedrich Engels responded to this industrialization—and its attendant capitalism—with their own revolutionary ideas concerning socialism and communism. Charles Darwin’s ideas about evolution provided a natural explanation for the existence and diversity of living things, and Auguste Comte’s ideas about naturalism and positivism offered a natural explanation for the workings of the whole universe, leading Friedrich Nietzsche to argue in 1882 that “God is dead.” By the dawn of the twentieth century, the technological, political, scientific, and religious landscape looked very different indeed.
1781
Kant’s Copernican Turn
Immanuel Kant
Reality should conform to the categories of the mind, not the reverse
“It always remains a scandal of philosophy and universal human reason that the existence of things outside us … should have to be assumed merely on faith, and if it occurs to anyone to doubt it, we should be unable to answer him with a satisfactory proof.” So stated German philosopher Immanuel Kant (1724–1804) in his Critique of Pure Reason (1781), overturning decades of Enlightenment philosophy. Traditional philosophy had failed, said Kant, because it could not produce an argument that there is a world that is external to us.
Kant lived and died in what was then Königsberg in East Prussia and is now Kaliningrad in Russia. In his entire life, he famously never traveled more than 10 miles (16 km) from the city. He became a scholar, working as a private tutor in order to support his philosophical investigations. Kant argued that we shape our experience of things through our mind but we never have direct experience of these things. He was trying to end philosophical speculations whereby objects outside our experience were used to support what he saw as useless theories. He called this massive change a “Copernican Revolution in reverse.” Copernicus (1473–1543) had overturned astronomy by proposing that the Earth moved round the sun and was not the static center of our universe, but Kant proposed a revolution that placed the mind at the center of all things.
“[Let us proceed] by assuming that the objects must conform to our cognition.”
Immanuel Kant, Critique of Pure Reason (1781)
Kant’s Copernican Turn, as the idea has been termed, has had a long-lasting effect on philosophy, influencing nineteenth-century philosophers such as G. W. F. Hegel, and twentieth-century phenomenologists such as Edmund Husserl and Martin Heidegger. SA
1781
Transcendental Idealism
Immanuel Kant
Knowledge of the world begins with perception, not the senses
Transcendental idealism was a philosophical doctrine of the late eighteenth century that naturally grew out of the German Enlightenment and German Idealism. Its best-known and initial systematic treatment was carried out by Immanuel Kant (1724–1804) in his Critique of Pure Reason (1781), but transcendental idealism went on to influence all major areas of philosophy: logic, ethics, social and political philosophy, and aesthetics.
In this context, the term “transcendental” means independent of experience, and “idealism” means dependent on the existence of ideas and the mind; the doctrine investigates the knowledge of objects that are “in me but not me.” The transcendental idealist believes that what we discover about objects depends on how they appear to us as perceivers. In this way, Kant secured the priority of the mind over the external world and at the same time preserved the validity of scientific investigation.
“Experience [does not] teach us that something … cannot be otherwise.”
Immanuel Kant, Critique of Pure Reason (1781)
Kant was trying to reconcile elements of empiricism and rationalism to correct the errors that prevented metaphysics from possibly becoming a real science. He felt that a synthesis of the two could account for both the sense data received for the acquisition of knowledge and the need for the mind to be pre-equipped to process such data, produce judgments, and detect relationships of necessity and causality, as well as reasoning of a higher order. Kant’s transcendental idealism is really an early form of cognitive psychology, and its investigation into perception and reason changed the way we view the relationship between the mind and the world. KBJ
1781
Analytic-Synthetic Distinction
Immanuel Kant
Judgments require close analysis of the content of propositions
In philosophy, the analytic-synthetic distinction is a conceptual distinction between two kinds of propositions or judgments. A proposition is analytic when the predicate concept is contained in the subject concept (as in the law of identity, A=A) or when it is true by definition (“all triangles have three sides”). A proposition is synthetic when the predicate concept is not contained in the subject concept, but lies beyond it (“all bachelors are unhappy”), or the truth of the proposition is known by the meaning of the words and something about the world around us (“car exhaust contributes to smog”).
“ … [predicate] B, though connected with concept A, lies quite outside it.”
Immanuel Kant, Critique of Pure Reason (1781)
German philosopher Immanuel Kant (1724–1804) first introduced the terms “analytic” and “synthetic” in his well-known work Critique of Pure Reason (1781). He saw this as a logical and semantic distinction applying to judgments: what makes a proposition true or false? Analytic propositions do not expand our concept of the subject; they do not add to our knowledge. They are considered true by virtue of the law of noncontradiction: the judgment that “a triangle has four sides” is false because “four sides” cannot be predicated of a “triangle” (the prefix “tri-” already denotes three). A synthetic proposition is known to be true by its connection with some intuition or prior knowledge, and so its truth or falsity is not so easy to determine: a maple tree can be both leafy and not leafy, at different times, without contradiction. Kant’s analytic-synthetic distinction changed the way we view acts of judgment, treating them as complex conscious cognitions whose exercise is a central function of the human mind. KBJ
1782
Latin Square
Leonhard Euler
A matrix of symbols in which each symbol appears once in each row and each column
Familiar to aficionados of Sudoku puzzles, the Latin square was first mentioned in a paper, Recherches sur une nouvelle espèce de quarrés magiques (Investigations on a New Type of Magic Square, 1782), by the Swiss mathematician and physicist Leonhard Euler (1707–83).
A Latin square is an n x n matrix composed using n different symbols, each of which occurs once in each row and once in each column. Here is an example built from a set of three symbols:
α β γ
β γ α
γ α β
For every added symbol used to generate the matrix, there is a substantial increase in the total number of different matrices that may be constructed. Given a set containing one symbol, only one matrix can be constructed; given a set containing two symbols, two matrices can be constructed. A set of three symbols, however, can yield twelve matrices, such as the one above, and a set of nine symbols can yield around five and a half octillion unique matrices, including those used for the popular Sudoku puzzles.
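The small totals quoted above can be checked directly by exhaustive search. The following sketch is not from the source; it is a minimal brute-force check in Python (feasible only for very small n) that confirms there are 1, 2, and 12 Latin squares of orders one, two, and three.

from itertools import product

def is_latin_square(grid, n):
    """True if every row and every column contains each of the n symbols exactly once."""
    symbols = set(range(n))
    rows_ok = all(set(row) == symbols for row in grid)
    cols_ok = all({grid[r][c] for r in range(n)} == symbols for c in range(n))
    return rows_ok and cols_ok

def count_latin_squares(n):
    """Brute-force count of n x n Latin squares; practical only for n up to about 4."""
    count = 0
    for cells in product(range(n), repeat=n * n):
        grid = [cells[i * n:(i + 1) * n] for i in range(n)]
        if is_latin_square(grid, n):
            count += 1
    return count

for n in (1, 2, 3):
    print(n, count_latin_squares(n))  # prints 1, 2, and 12, matching the totals above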
“Euler introduced Latin and Greek letters to help him analyze magic squares …”
Leonard Euler: Life, Work and Legacy (2007)
Latin squares are used in the design of scientific experiments. A scientist testing five versions of an experimental drug can divide his test subjects into five groups and randomly pair one version to each group for testing during the first phase of the experiment. The Latin square can then be used to schedule four more phases so that each version of the drug is tested upon each group of test subjects without duplication. Latin squares also have several applications in the field of telecommunications. DM
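As a hedged illustration of the experimental-design use just described (the drug names, group labels, and five-phase setup below are invented, not taken from the source), a simple cyclic construction yields a 5 x 5 Latin square whose rows can be read as phases, columns as subject groups, and entries as drug versions, so every group receives every version exactly once and no version repeats within a phase.

def cyclic_latin_square(n):
    """Build an n x n Latin square by cyclically shifting the first row."""
    return [[(row + col) % n for col in range(n)] for row in range(n)]

# Hypothetical trial: five drug versions (A-E), five subject groups, five phases.
versions = ["A", "B", "C", "D", "E"]
square = cyclic_latin_square(5)

for phase, row in enumerate(square, start=1):
    assignments = ", ".join(f"group {g + 1}: {versions[v]}" for g, v in enumerate(row))
    print(f"phase {phase} -> {assignments}")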
1784
Enlightenment
Immanuel Kant
The free use of intellect to solve problems without recourse to authority
This painting (1768) by Joseph Wright of Derby conveys the Enlightenment fascination with science.
In 1784, German philosopher Immanuel Kant (1724–1804) published the essay, “Answering the Question: What is Enlightenment?” in Berlin Monthly. He was writing toward the end of the Age of Enlightenment, and his essay is the definitive commentary on the meaning of that period of intellectual history.
Kant argues that enlightenment is an intellectual coming of age in which humankind liberates itself from the chains of ignorance. The motto of the Age of Enlightenment, sapere aude, or “dare to be wise,” indicates a willingness to challenge the paternalistic authority of both monarchic government and the church. Kant compares enlightenment to the state of a minor who has finally reached adulthood and become emancipated from parental authority by embracing autonomy (the ability to make free and rational decisions). He argues that the ignorance of humankind is self-imposed: like an immature adult who is reluctant to leave the comfort of his parents’ home, many people cling to authority because of intellectual laziness. If we could educate all the members of society to judge on the basis of reason instead of dogma, people could be transformed into free-thinking individuals, which would bring an end to despotism and oppression.
Although a political revolution might offer a temporary reprieve from tyranny, Kant claims that it is only the cultivation of our minds that can bring about lasting change in society. Through the public use of reason, we can put forward and critique new ideas and thereby make intellectual progress.
Kant’s understanding of enlightenment has been highly influential in political theory concerning the importance of intellectual freedom in both academia and civil rights. This includes the separation of church and state, since citizens should not be subject to the “spiritual despotism” of a paternalistic government. JM
1784
Black Hole
John Michell
Entities in space that are so dense that no light can escape their gravitational pull
A computerized image of a black hole, surrounded by its white accretion disk of superheated material.
In his work on the effect of gravity on light (1784), English geologist John Michell (1724–93) was the first to suggest the existence of black holes, referring to them as “dark stars.” Karl Schwarzschild (1873–1916) is also credited with developing the concept. However, U.S. physicist John Wheeler (1911–2008) coined the term around 1968, and his vast amount of research pioneered the modern study of black holes.
A black hole is a place in outer space where the force of gravity is so extreme that nothing inside it can escape, not even a ray of light. The reference to a hole suggests that the entity is empty, but the complete opposite is true. In fact, a black hole consists of a great amount of matter packed into a very small space. What may be termed a hole is created by the extreme force of gravity sucking all surrounding matter toward that extremely dense center.
Because light cannot escape from black holes, they are invisible. To find one, astronomers need a telescope with special tools that analyze the behavior and patterns of stars; those stars closest to a black hole act differently and thus betray the black hole’s presence.
Black holes exist in various sizes, but they all have a large mass. Some are as small as an atom and have the mass of a large mountain; there are “stellar” black holes that have a mass twenty times that of the sun; and the largest are called “supermassive,” with a mass of more than a million suns combined. There is strong evidence that most large galaxies have a supermassive black hole at their center; at the center of the Milky Way lies Sagittarius A*, with a mass of about 4 million suns.
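The entry does not state the underlying formula, but the masses quoted above can be turned into sizes with the standard Schwarzschild radius relation, r = 2GM/c², the radius within which light cannot escape a mass M. A minimal sketch, using rounded physical constants and the mass figures mentioned above:

# Rounded SI constants
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8            # speed of light, m/s
SOLAR_MASS = 1.989e30  # kg

def schwarzschild_radius(mass_kg):
    """Radius of the event horizon of a non-rotating mass: r = 2 G M / c^2."""
    return 2 * G * mass_kg / C**2

print(f"Sun compressed to a black hole: {schwarzschild_radius(SOLAR_MASS) / 1e3:.1f} km")        # about 3 km
print(f"Sagittarius A* (~4 million suns): "
      f"{schwarzschild_radius(4e6 * SOLAR_MASS) / 1e9:.0f} million km")                          # about 12 million km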
Researching black holes has resulted in a better understanding of the real universe. And if they were present at the Big Bang, they would have impacted the expansion rate of the universe and the abundance of elements found in it, and so they might hold the key to understanding how the universe began. KBJ
1785
First Categorical Imperative
Immanuel Kant
The moral principle that one’s behavior should accord with universalizable maxims
A print of a portrait of Immanuel Kant, published in London in 1812.
German philosopher Immanuel Kant (1724–1804) founded deontology, or “duty ethics,” a moral theory grounded in pure reason rather than virtues, self-interest, or compassion. In his Groundwork of the Metaphysics of Morals (1785), Kant argues that we can determine our moral duties through an appeal to the categorical imperative. Here, “categorical” means unconditional, binding on all rational beings regardless of their desires, while “imperative” means command; thus a “categorical imperative” refers to a universal moral law.
Kant’s categorical imperative has three formulations. The first formulation encourages a person to act as if the maxim (rule) for their action were willed as a universal law for all rational beings. For example, if a person is considering whether or not to tell a lie, they must evaluate two maxims: “lie whenever you like” and “always tell the truth.” They must then universalize these laws to see if everyone can consistently follow them. If they universalize the liar’s maxim, they get a world of liars in which nobody would believe their lie, so there is a contradiction between the world they need to exist for the lie to work (a world of trusting, truth tellers in which they are the only liar) and the world that would exist if everyone lied (a world of dishonest skeptics). Truth telling can be universalized without contradiction, so there must be a universal moral duty to tell the truth. Moral laws must be reversible: if truth telling is a maxim that all people must follow, then I must follow it too.
“Act only in accordance with that maxim [you could accept as] a universal law.”
Immanuel Kant, Groundwork … (1785)
Since its inception, Kantian deontology has been one of the most dominant theories in ethics. The categorical imperative is routinely used in applied ethics to test the fairness of laws and policies. JM
1785
Second Categorical Imperative
Immanuel Kant
Treat people as ends in themselves, not merely as a means to your own ends
Immanuel Kant (1724–1804), the founder of the moral theory of deontology, was dissatisfied with prevailing theories of ethics that emphasized the agent’s character, moral sentiment, and ego. He sought to give ethics universality and precision through an appeal to the categorical imperative, which refers to the moral duties that all rational beings ought to respect.
The second formulation of Kant’s categorical imperative states that a person should always treat rational beings as ends in themselves, never as a means to the end of their own satisfaction. Rational beings view themselves as ends, as beings with intrinsic value, which is to say that a person has moral worth independent of whether or not other people find them instrumentally useful. It is wrong for a person to use a rational being simply as a means, an object whose sole purpose is to make another person happy. This formulation is often referred to as “respect for persons” because, when we view persons as ends rather than means, we respect their status as fellow rational beings. If someone wants to use a person instrumentally, they must first obtain their consent. For example, forcing someone to paint someone else’s house would be wrong, but paying someone to do so is just.
“So act that you use humanity … as an end and never merely as a means.”
Immanuel Kant, Groundwork … (1785)
Respect for persons has become a cornerstone of applied ethics, directly influencing key notions such as respect for autonomy (the ability to make free and rational decisions about a person’s own life) and voluntary informed consent (the idea that a person must be fully informed of risks and benefits before entering into a contract or authorizing a medical procedure). JM
1785
Uniformitarianism
James Hutton
An assumption about the natural laws and processes that create Earth’s landscape
James Hutton (1726–97) was a Scottish doctor, naturalist, and geologist. In a paper, Theory of the Earth, presented to the Royal Society of Edinburgh in 1785, he asserted that “the solid parts of the present land appear, in general, to have been composed of the productions of the sea, and of other materials now found upon these shores.” He proposed that land formation took place under the sea as sediments accumulated on the seabed. The land was then lifted up above the sea, tilted, and eroded, and then returned below the sea again where further layers of deposits were added. This cycle of formation, elevation, erosion, and submersion was repeated countless times over countless years. Hutton found evidence for this theory in rock unconformities—breaks in the geological record—in the Cairngorm Mountains and along the Berwick coast in Scotland.
“[Geological forces] never acted with different degrees of energy [from now].”
James Hutton, Theory of the Earth (1785)
For many centuries, how and when Earth was formed, rocks were created, and the landscape shaped were all unknowns. Was Earth as created in Genesis, or was it much older? Such questions remained unanswered until Hutton proposed his theory. The ideas took time to be established but were confirmed in the multivolume Principles of Geology, published from 1830 to 1833 by Charles Lyell (1797–1875). Lyell stated that Earth was shaped entirely by slow-moving forces that act over a long period of time and continue to act to this day. Reviewing this book, the English polymath William Whewell (1794–1866) coined the term “uniformitarianism” to describe this process, a process in which “the present is the key to the past.” SA
1785
Geological Deep Time
James Hutton
The concept that Earth’s history extends into an “abyss of time”
Grand Canyon, Arizona, where the Colorado River has exposed 2 billion years of geological history.
The concept of deep time originated with two men of the Scottish Enlightenment, doctor and geologist James Hutton (1726–97) and mathematician John Playfair (1748–1819). Hutton’s Theory of the Earth (written and presented to the Royal Society of Edinburgh in 1785, then published in the society’s Transactions in 1788) introduced deep time, and there are observations of it in the works of Charles Lyell and Charles Darwin.
The span of human history is a tiny blip when compared to the Earth’s 4.54 billion-year geological timescale, and to recognize that fact is to internalize the notion of deep time. What must be recognized is the difference between relative age (that measured by relationship) and numerical age (that measured by date). Traditionally, the concept of deep time has been applied in geology and paleontology, but it is also useful in attempts to discover the age of the universe.
“The result … is that we find no vestige of a beginning—no prospect of an end.”
James Hutton, Theory of the Earth (1785)
Deep time is measured using the geologic timescale, a system of chronological measurement relating to rock layers and stratification. The time units used are the supereon, eon, era, period, epoch, and age. Special fossils found in rocks, called index fossils, are essential for creating relative timescales. They are called index fossils because they are found only in rocks of a limited timespan, at a certain sedimentary level, and so they help in dating other things found in that same sedimentary layer. With the discovery of radioactivity, geologists could determine the age of minerals from the time they crystallized in rocks. The geologic timescale is itself not fixed; it is revised constantly as more data are uncovered and dates are recalibrated. KBJ
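The text only notes that radioactivity made such dating possible; as a hedged illustration (not drawn from the source), the standard decay relation t = (t_half / ln 2) · ln(1 + D/P) gives a mineral’s age from the ratio of daughter to parent isotope it contains, assuming no daughter atoms were present when it crystallized. The uranium-238 half-life and the ratios below are illustrative round figures.

import math

def radiometric_age(daughter_to_parent_ratio, half_life_years):
    """Age from t = (half_life / ln 2) * ln(1 + D/P), assuming no initial daughter isotope."""
    return half_life_years / math.log(2) * math.log(1 + daughter_to_parent_ratio)

HALF_LIFE_U238 = 4.47e9  # years; uranium-238 decaying (via its series) to lead-206

for ratio in (0.1, 0.5, 1.0):
    age = radiometric_age(ratio, HALF_LIFE_U238)
    print(f"Pb-206/U-238 = {ratio}: age ~ {age / 1e9:.2f} billion years")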
1787
Panopticon
Jeremy Bentham
A design for a prison that enabled around-the-clock surveillance of inmates
The Presidio Modelo was a “model prison” of Panopticon design, built in Cuba between 1926 and 1928.
The panopticon was a proposed model prison designed in 1787 by English utilitarian philosopher and legal reform theorist Jeremy Bentham (1748–1832). The design element that made it highly novel was that it allowed around-the-clock surveillance of the prisoners. The word “panopticon” means “all-seeing,” and the ones who could “see all” were the inspectors who conducted the surveillance, never the inmates of the prison. Prisoners could not tell at any given time whether they were being watched, and it was this insecurity and anguish that would be a crucial instrument of discipline. Bentham also saw his design as applicable for hospitals, schools, sanitariums, workhouses, and lunatic asylums.
At the center of Bentham’s design was a tower structure surmounted by an inspection house. Here, staff, guards, and managers were stationed to scrutinize the cells set into the surrounding perimeter. Bentham believed that the model prison would be more cost-effective than others because it could be run by fewer staff—just the thought of being watched would be enough to keep prisoners behaving well. Additionally, the prison could provide its own income: inmates sentenced to hard labor could walk on a treadmill or turn a capstan to power looms or other machines.
The panopticon prison never came to fruition in Bentham’s lifetime, although similar structures were built in the nineteenth century. However, the principles behind his idea had a lasting impact. Much later, French philosopher Michel Foucault (1926–84) wrote that it was a metaphor for modern disciplinary society, one that observes and normalizes people in a form of social quarantine. Bentham’s design created a conscious, visible, and ever-present mark of power, in which heavy locks, bars, and chains were no longer necessary for domination—the prisoners’ paranoia and psychological angst kept them under control. KBJ
1789
Utilitarianism
Jeremy Bentham
What is morally good is that which promotes the greatest good for the greatest number
A portrait of Jeremy Bentham, painted in c. 1829 by Henry William Pickersgill.
According to utilitarianism, it is the outcome of any action that determines whether it is a moral or immoral act. Any act that promotes the greatest good for the greatest number of people is considered moral, regardless of whether it conforms to any other notion of morality, ethics, or religious doctrine.
Prior to the development of utilitarianism during the seventeenth- and eighteenth-century Enlightenment, thinkers had recognized the value of human happiness and used it as a measuring stick of ethics and morality. However, it was not until the publication in 1789 of An Introduction to the Principles of Morals and Legislation by English utilitarian philosopher Jeremy Bentham (1748–1832) that utilitarianism fully bloomed as an ethical theory. For Bentham, maximizing human pleasure, or happiness, and minimizing pain could be achieved through mathematical calculations that were applicable in any ethical judgment.
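Bentham never reduced his “felicific calculus” to a single formula, so the following is only a hedged sketch of how such an outcome-based calculation might look: the dimensions (intensity, duration, certainty, nearness) are Bentham’s, but the numeric scales, example values, and scenario are invented here purely for illustration.

# Illustrative only: score each affected person's pleasure or pain along several
# of Bentham's dimensions, then sum across everyone affected by the action.

def hedonic_value(intensity, duration, certainty, nearness):
    """Crude value of one experience: positive for pleasure, negative for pain."""
    return intensity * duration * certainty * nearness

def total_utility(experiences):
    """Sum the values of every pleasure and pain the action produces."""
    return sum(hedonic_value(*e) for e in experiences)

# Hypothetical choice: a public park (mild, lasting pleasure for many)
# versus a private club (strong pleasure for a few, mild pain for the rest).
park = [(2, 5, 0.9, 1.0)] * 100
club = [(8, 5, 0.9, 1.0)] * 5 + [(-1, 5, 0.8, 1.0)] * 95

print("park:", total_utility(park))  # the higher total is preferred on this measure
print("club:", total_utility(club))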
However, it was in Bentham’s godson, John Stuart Mill (1806–73), that utilitarianism found its most ardent voice. For Mill, some pleasures were qualitatively superior to others. Eschewing Bentham’s purely quantitative assessment, Mill believed in a spectrum of pleasures, and he made it clear that simply maximizing base pleasures was not the ultimate goal of utilitarianism.
Utilitarianism has had a wide-reaching impact on any number of human endeavors beyond moral philosophizing. In political circles, utilitarian arguments are commonly employed to support or refute governmental policies and actions, and economists have applied utilitarian principles widely in developing methods for maximizing prosperity. Although the school of thought is not without its critics, its introduction as an ethical theory demonstrated that morality need not be based upon principles handed down from authority, but rather upon the measurable outcomes that actions produce. MT
1789
Declaration of Rights
Jérôme Champion de Cicé
An argument that there are universal human rights that confer equality before the law
The Declaration of the Rights of Man and the Citizen (1789), painted by Jean-Jacques-François Le Barbier.
On August 26, 1789, the French Constituent Assembly adopted the last article of the Declaration of the Rights of Man and the Citizen, a key moment of the French Revolution (1787–99) and a crucial first step toward writing the constitution for post-Revolutionary France. It was authored by clergyman Jérôme Champion de Cicé (1735–1810).
The document featured seventeen articles containing provisions about individuals and the nation. It abolished feudal rights, clearly defining individual and collective rights as universal: life, liberty, property, security, and resistance to oppression. These rights are valid at all times, everywhere, and are natural rather than bestowed. The Declaration also recognized equality before the law, and this effectively eliminated any special rights or exceptions awarded to the nobility and clergy. It restricted the powers of the monarchy, overturning the divine right of kings, and stated that all citizens had the right to participate in the legislative process, with the government containing only elected officials. However, political rights pertained only to “active” citizens: those at least twenty-five years of age, who paid taxes equal to three days of work, and who owned property. This meant that only white men had these rights; women, children, and foreigners did not.
“Men are born free and remain free and equal in rights.”
Jérôme Champion de Cicé, Declaration, Article 1 (1789)
The Declaration inspired similar rights-based documents of liberal democracy in many European and Latin American countries. It changed the way we view the relationship between the state and its citizens, and brought about global legal reform. KBJ
1789
Chemical Elements
Antoine Lavoisier
The distinct, indivisible substances that make up the material world
A chemical element is a pure substance that cannot be broken down to simpler substances or changed into another substance by chemical means. Elements are the chemical building blocks for matter; each chemical element has one type of atom, and each is distinguished by its atomic number (the number of protons in its nucleus).
Talk of elements started with the ancient Greeks. Aristotle wrote that they were entities that composed other things but were themselves indivisible. For the Greeks, there were four elements: earth, air, fire, and water. The fundamental nature of a chemical element was recognized in 1661 by the English chemist Robert Boyle (1627–91), but it was not until 1789, with the publication of the Elementary Treatise of Chemistry by French chemist Antoine Lavoisier (1743–94), that the first true modern list of elements existed. Lavoisier’s list was created following a careful study of decomposition and recombination reactions, and contained just thirty-three elements. By 1869, when Russian chemist Dmitri Mendeleev (1834–1907) created the periodic table to summarize the properties of elements and to illustrate chemical behavior and recurring trends, the total number of known elements had increased to sixty-six.
“I now mean by elements … certain primitive … or perfectly unmingled bodies.”
Robert Boyle, chemist
To date, 118 elements have been identified. Their discovery has led to a more profound understanding of our physical world in its composition, mechanics, and evolution. All objects contain chemical elements, and more than thirty have key functions in keeping plants, animals, and people alive and healthy—they are indeed the foundation of life itself. KBJ
c. 1790
Feminism
Olympe de Gouges/Mary Wollstonecraft
A movement arguing that women should possess the same legal and social rights as men
A watercolor portrait of French revolutionist and feminist Olympe de Gouges (1784). In 1791 de Gouges published her Declaration of the Rights of Woman and the Female Citizen.
The term “feminism” refers to a series of social and political movements that have sought to address various longstanding prejudices concerning the rights of women, their perception by society, and their access to social resources. The movement has thus petitioned, in various ways, for the right to equal intellectual, economic, and legal status. Feminism has undergone many paradigm shifts, and even now the meaning of the term continues to metamorphose.
“I love my man as my fellow; but his scepter, real, or usurped, extends not to me, unless the reason of an individual demands my homage; and even then the submission is to reason, and not to man.”
Mary Wollstonecraft
At least since humanity’s move from mobile, hunter-gatherer societies to settled, agrarian ones, the interests of men have generally been given priority in society, especially by religions. However, sparked by the French Revolution in the 1790s, women began to articulate the view that there was a strong case for their equal treatment.
During the “pamphlet wars” of the 1790s (a series of exchanges between reformers and those seeking to resist the application of “French principles” to Britain), two women, Olympe de Gouges (1748–93) in France, and Mary Wollstonecraft (1759–97) in England, published challenges to the view that women were second-class citizens. While they accepted some inequalities that later waves of feminism would reject out of hand, they opened up a public debate that remains of critical importance today.
The ideas of feminism have challenged and changed many aspects of society, from female political involvement to reproductive rights. Feminism has been adopted in different ways by groups as radically diverse as Muslims and Native Americans, but their common theme is the desire to question the structure of societies and assumptions about what women are, and what they ought to be. Nevertheless, equality for women is still a distant vision for the vast majority. LW
c. 1790
Dandyism
Europe
The notion of looking superb as being more important than being superb
Celebrity status used to be connected to power, and those in power would dress in ways that made their status unmistakable. In Europe, powerful aristocrats would seek to maintain their position within restricted social circles by wearing the right haute couture and displaying the right manners, but generally they did not seek wider attention. From the 1790s onward, however, the arrival of new bourgeois wealth brought a wave of men able to boast nothing more in the way of status than a disposable income. Eager for the attention of the wealthy upper class, and with the leisure to focus on themselves, they sought the spotlight with dress and manners designed to catch the eye. Such well-dressed and well-groomed men of the late eighteenth and early nineteenth centuries discovered that dandyism could facilitate their entry into aristocratic society.
The dandy gained a foothold in England and Scotland, then France and the United States, after the wars of the late eighteenth century created conditions that favored a wider culture newly accessible to aristocratic-looking men. Perhaps the best-known dandy was George “Beau” Brummell (1778–1840), a middle-class young man who learned the nicer points of sartorial splendor at Eton College, Oxford University, and later the prestigious Tenth Royal Hussars. He became an intimate of the Prince Regent, later King George IV (1762–1830), until they quarreled in 1813.
The dandy would avoid the traditional aristocratic burden of responsibility for society while displaying an aloof concern for looking good. If there was serious purpose behind the show, it was to couple disdain for emerging democratic culture with an air of nihilism, or aesthetic hedonism. Dandyism later became an entryway into society for men of little wealth but great artistic talent, such as writer and poet Oscar Wilde (1854–1900). JSh
1790
The Third Critique
Immanuel Kant
The theory that experiencing the sublime creates tension between mental faculties
The Critique of the Power of Judgment, also known as The Third Critique, was published in 1790 by German philosopher Immanuel Kant (1724–1804). An influential work on aesthetics, it contains a greatly expanded treatment of the sublime. For Kant, the sublime is experience of absolute grandeur, pertaining to the supersensible and moral nature of humans. It is one of two types of aesthetic experience, the other being experience of the beautiful. Speaking of the sublime, he distinguishes two kinds: the mathematical (such as the immeasurable powers of God) and the dynamical (for example, the powerful force of a tsunami).
“Attempting to imagine what it cannot, [the mind] has pain in the failure …”
Immanuel Kant, The Third Critique (1790)
Experience of the sublime is that of formlessness, limitlessness, and incomprehensibility; the sublime resists representation. When confronted with it, our mind cannot find a way to organize it, to make sense of it. But instead of simply frustrating or upsetting us, this inability brings about alternating feelings of displeasure and pleasure. These feelings are the result of psychological tension between the mind’s faculties of imagination, understanding, and reason. The displeasure stems from the failure to grasp something through the senses and understand it, and the pleasure is an enthusiasm for the supersensible flight of fancy. The faculty of reason contains ideas of absolute freedom and totality, so it is the only one capable of making sense of the experience. The Third Critique altered cognitive aesthetics. Kant’s notion of the sublime changed the way we think about aesthetic experiences, the feelings we have when faced by art or nature, and the judgments at which we arrive. KBJ
1790
Against Revolution
Edmund Burke
The notion that gradual reform is better than revolution for bringing about change
Published in 1790 by Irish political theorist and philosopher Edmund Burke (1729–97), Reflections on the Revolution in France remains to this day the best-known intellectual attack on the French Revolution. The work provoked a huge response, most notably from English-American political activist Thomas Paine (1737–1809), who initiated an exchange of pamphlets.
In the work, Burke called for military defeat of revolutionary France and the reinstatement of the old aristocratic society. While he disliked the divine right of kings and the idea that people had no right to challenge an oppressive government, he argued for tradition, social status, prejudice in some forms, and private property—each citizen should have a clear idea of where they belong in their nation’s social hierarchy. Burke favored gradual constitutional reforms rather than revolution, and emphasized how mob rule could threaten individual rights, freedoms, and society as a whole. He argued that the French based their revolution on overly abstract principles; structuring government on abstract ideas such as liberty and equality rather than effective command and order could lead to abuse and tyranny. Many of Burke’s predictions came true; he foresaw a popular army general becoming master of the assembly, and, two years after Burke died, Napoleon seized the assembly and created a military dictatorship that was corrupt and violent.
“To give freedom is still more easy … it only requires to let go the rein.”
Edmund Burke, Reflections on the Revolution … (1790)
Reflections expresses classical liberal and conservative political views. Right or wrong, the book offers an excellent analysis of how a revolution can murder itself by its own principles. KBJ
1791
Presumption of Innocence
William Garrow
The legal principle that an accused person is innocent until proven guilty
Of all the legal apparatus that surrounds a criminal trial, one aspect stands out above all others. When the accused stands in the dock, he or she is presumed innocent until proven guilty. It is up to the prosecution to make the case that the defendant is guilty; it is not up to the defendant to have to prove his or her innocence. This age-old principle dates back to Roman times, although it was first named by an English barrister, William Garrow (1760–1840), in the eighteenth century.
William Garrow became a barrister in 1783. He specialized as a criminal defense counsel, later becoming attorney general for England and Wales in 1813 and a judge in 1817. Quite early in his career, in 1791, he coined the phrase “innocent until proven guilty” and insisted that the prosecution case must be properly tested in court. The principle itself had origins in the sixth-century Digest, a compendium of Roman law compiled for Emperor Justinian, which stated, “The burden of proof lies with the one who declares, not the one who denies,” drawing on the earlier work of the third-century Roman jurist Julius Paulus. For the principle to work, three related rules apply: in respect of the facts, the state as prosecution bears the entire burden of proof; the defendant has no burden of proof and does not even have to testify, call witnesses, or present any evidence; and the judge and jury are not allowed to draw any negative inferences from the fact that the defendant has been charged and is in court.
The presumption of innocence has international appeal. It appears in Islamic Sharia law, and also in European common law and the civil law systems of countries such as Italy and France. Many countries have explicitly included it in their constitutions and legal codes. The course of justice would be radically different without it. SA
1792
Equal Education for Women
Mary Wollstonecraft
The argument that women will achieve equality with men, both in the workplace and in their marriages, only by receiving an education appropriate to their station in society
A portrait of Mary Wollstonecraft painted by John Keenan in c. 1793. Mary Wollstonecraft’s daughter, Mary Shelley (née Mary Wollstonecraft Godwin) went on to write Frankenstein (1818).
In A Vindication of the Rights of Woman, published in 1792, writer and philosopher Mary Wollstonecraft (1759–97) argues that women should have the same rights as men to an education suited to their place in society. Wollstonecraft believed that educating women would not only further the interests of women, but would also strengthen marriages, home life, and society as a whole. She argued that stable marriages occur when spouses are equals and share the marriage as equal partners. She also wrote that men and women have a sexual nature, and that the strength of a marriage depends on both partners remaining faithful.
“Virtue can only flourish amongst equals.”
Mary Wollstonecraft
Wollstonecraft wrote A Vindication of the Rights of Woman in England during the time of the French Revolution (1787–99), and many of her opinions were radical for the time. Although she wrote about a wide variety of subjects, ranging from child education to politics and history, A Vindication of the Rights of Woman is widely seen as one of the first works of what would later become feminism. She wrote the book to argue against the notions that women were destined only to be wives to their husbands and were suited only for limited domestic purposes. Hers was a decidedly radical opinion for the time, as many people, especially men, viewed women as little more than chattels.
A Vindication of the Rights of Woman was well received in its time, but revelations about Wollstonecraft’s illegitimate daughter, unorthodox personal life, and suicide attempts caused the work to be viewed with suspicion for more than a hundred years. It was not until the twentieth century that Wollstonecraft’s life and work became better understood. It served as inspiration for many feminists, including Virginia Woolf (1882–1941), even though feminists today do not regard education on its own as sufficient to provide equality between the sexes. MT
1794
The Age of Reason
Thomas Paine
A critique of institutionalized religion that advocated deism instead
In 1794, 1795, and 1807, the English-American political activist Thomas Paine (1737–1809) published a three-part pamphlet series, The Age of Reason; Being an Investigation of True and Fabulous Theology. It was well received in the United States but not in Britain, where there was growing fear of radical political views following the French Revolution (1787–99). Paine wrote the first two parts while he was imprisoned in France during the Reign of Terror (1793–94). The views it contained were not new to the educated, but his engaging, satirical, “no holds barred” style enabled him to reach a far larger audience. The low cost of his pamphlets also helped.
A deist manifesto, The Age of Reason gives the central tenets of deism as, first, God created the world but does not constantly intervene or interact with us directly, and, second, the existence of God can be proved on the basis of reason and observing the natural world. Paine, rejecting the possibility of miracles, argued that nature was the only form of divine revelation, since it was obvious that God had established a uniform and eternal order throughout his creation.
“Give to every other human being every right that you claim for yourself.”
Thomas Paine, The Age of Reason (1794)
Paine also rejected Christianity, calling it a human invention, and criticized attempts by what he viewed as a corrupt Christian Church to gain political power. He also denied that the Bible was the word of God, referring to it as just an ordinary piece of literature. Paine’s attempts to spread deism and attacks on the Christian Church made him many enemies. But his pamphlets were designed to bring politics to the people in a clear and simple style, and this changed political discourse. KBJ
1794
The Morality of Terror
Maximilien de Robespierre
Persecution by the state is justifiable if directed toward virtuous ends
An illustration of Robespierre—the original proponent of the Terror—being led to the guillotine in 1794.
On February 5, 1794, during the infamous Reign of Terror in France (1793–94), Maximilien de Robespierre (1758–94) wrote Report on the Principles of Political Morality, an impassioned defense of the Revolution. He argued that the Republic could be saved from its enemies by the virtue of its citizens, and that terror was virtuous and fully justified because it helped maintain the principles of the Revolution and the Republic. The violence occurring during the Terror was necessary in order to achieve higher political goals. In the event, 16,594 people were executed in France during the Terror, with the ironic inclusion of Robespierre himself.
Robespierre believed that terror had a deep moral purpose, beyond winning any war. He envisioned a society in which people sought the happiness of their fellow citizens—the peaceful enjoyment of liberty, equality, and natural justice—rather than material benefits. Declaring that “the blade of the law shall hover over the guilty,” he vowed to use the political power he had gained during the Terror to hound out and eliminate all opponents of the Revolution and create what he called a “Republic of Virtue.” In reality, terror seems to have been resorted to out of the fear and weakness of the controlling Jacobin party.
“Virtue … without terror is destructive; terror … without virtue is impotent.”
Robespierre, Report on the Principles … (1794)
For the first time in history, the use of terror was an official, fully endorsed government policy. Robespierre’s Report is a horrifying example of how a noble idea, such as virtue, can be twisted out of all recognition to become a logical justification for tyranny and violence. It is a perfect example of how genuinely good intentions can result in evil. KBJ
1795
Gradualism
James Hutton
The theory that changes occur, or ought to occur, slowly and incrementally
The Grand Canyon du Verdon, a 13-mile (21 km) long gorge in Alpes-de-Haute-Provence in southern France.
Scottish geologist and farmer James Hutton (1726–97) is credited with originating the idea of gradualism in 1795, and his notion of gradual change revolutionized the sciences of geology and biology. It has also influenced the fields of politics and linguistics.
In natural sciences, gradualism is a theory that change is the cumulative product of slow but steady processes. In geology, Hutton argued that landforms such as canyons, or even layers of soil, were the result of slow, steady changes over very long periods of time. The theory contrasts with that of catastrophism, which held that the changes originated in short-lived events.
In biology, naturalist Charles Darwin (1809–82) embraced gradualism in his theory of evolution, in which small variations arise over time and, when helpful, are preserved by natural selection. Individual organisms possessing a small variation that suits them slightly better to their environment are more likely to survive and reproduce than those without it. Generation after generation, the individuals carrying the helpful trait leave more descendants while those without it die out, with the result that the population of that organism changes gradually over time.
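Darwin’s argument is essentially quantitative: a very small reproductive advantage, compounded over many generations, is enough to transform a population. The short Python sketch below illustrates this with made-up numbers; the size of the advantage and the starting frequency of the trait are assumptions chosen purely for illustration, not measurements of any real species.

def trait_frequency(generations, advantage=0.02, start_freq=0.01):
    # Frequency of a slightly helpful trait after a given number of generations.
    # 'advantage' is the assumed extra reproductive success the trait confers.
    freq = start_freq
    for _ in range(generations):
        carriers = freq * (1 + advantage)   # carriers leave slightly more offspring
        non_carriers = 1 - freq
        freq = carriers / (carriers + non_carriers)
    return freq

for gens in (50, 200, 500):
    print(gens, "generations:", round(trait_frequency(gens), 3))

With a 2 percent advantage the trait stays rare for dozens of generations and then spreads through most of the population: change that is imperceptible in any one generation but decisive over long spans of time.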
“This is no time … to take the tranquilizing drug of gradualism.”
Martin Luther King Jr., “I Have a Dream” speech (1963)
In politics and society, gradualism holds that incremental changes to policy, law, and practice are preferable to the violent or abrupt changes brought about by revolutions and uprisings. Political gradualism contrasts with revolutionary radicalism, which urges swift and sweeping change. In the United States, gradualism was proposed in the 1950s as a route to eliminating racial segregation, but many reformers felt this was merely a way of avoiding the issue. KBJ
1796
Doppelganger
Jean Paul
The notion of a coexisting identical being, double, or negative version of a person
The word “doppelganger” was adopted by German writer Johann Paul Richter (1763–1825, pseudonym Jean Paul), who used it in his work Siebenkäs (1796–97). The term, which began appearing in English around 1851, literally means “double walker,” or, more simply, a double of a living person.
In German folklore, a doppelganger is a sinister or evil paranormal entity, not a ghost or spirit of the dead but a kind of apparition that is the negative or opposite of its human counterpart. Doppelganger-like figures also appear in Egyptian, Norse, and Finnish mythologies. In folklore, seeing your double was an omen of imminent death or other terrible misfortune.
A doppelganger is also a literary device referring to a character in a story that is a copy or impersonator of another character. Soap operas classically use this device in their plots. These “fake” doubles often have intentions and emotions different to those of their “real” counterparts and cause great psychological anxiety for the person they haunt. Today, a literary doppelganger is less likely to have evil intentions; it may simply be a look-alike of someone, the cause of a moment’s “double take” on the part of an observer.
“ … the ‘double’ was originally an insurance against destruction to the ego …”
Sigmund Freud, “The ‘Uncanny’” (1919)
The idea of a doppelganger has greatly impacted literature, movies, and television, providing some of the greatest plots, twists, and characters. It has influenced Dr. Jekyll and Mr. Hyde (1886), Twin Peaks (1990–91), Fight Club (1999), Mulholland Drive (2001), and Black Swan (2010). The notion is easily abused, and can lead at times to cliché, but when used imaginatively it can underpin intense, cliff-hanging drama. KBJ
1796
Gambler’s Fallacy
Pierre-Simon Laplace
The false assumption that probability is affected by past events
If a fair coin lands heads five times in a row, the Gambler’s Fallacy suggests that the next toss will be tails because tails is “due”; the chance of tails coming up on the next toss is therefore seen as greater than half. This is bad reasoning: the results of previous tosses have no bearing on future tosses.
The fallacy is also known as the Monte Carlo Fallacy because of an incident that happened there at a roulette table in 1913, when black fell twenty-six times in a row. Such a long run is rare, but it is no less possible than any other particular sequence of red and black. Needless to say, gamblers at that table lost millions that day because they reasoned, incorrectly, that red was due to come up next, or next after that. The fallacy also appears in the erroneous belief that gambling is an inherently fair process in which any losses incurred will inevitably be corrected by a winning streak.
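The independence of successive tosses is easy to check empirically. The Python sketch below, a minimal simulation with an arbitrary seed, looks only at tosses that immediately follow a run of five heads and reports how often they come up tails; the answer hovers around one half, not above it.

import random

random.seed(0)  # arbitrary seed so the example is repeatable

def tails_rate_after_streak(num_flips=1_000_000, streak_len=5):
    # Among tosses that immediately follow five heads in a row,
    # what fraction land tails?
    flips = [random.choice("HT") for _ in range(num_flips)]
    follow_ups = [
        flips[i]
        for i in range(streak_len, num_flips)
        if flips[i - streak_len:i] == ["H"] * streak_len
    ]
    return sum(1 for f in follow_ups if f == "T") / len(follow_ups)

print(round(tails_rate_after_streak(), 3))  # prints a value close to 0.5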
“[Gambler’s fallacy relies on the] idea that essentially chance events have memories.”
J. D. Mullen and B. M. Roth, Decision Making (1991)
French mathematician Pierre-Simon Laplace (1749–1827) first described this fallacious reasoning in his Philosophical Essay on Probabilities (1796), in which he wrote about expectant fathers trying to predict the probability of having sons. The men assumed that the ratio of boys to girls born each month should even out at fifty-fifty, and so reasoned that high male birth rates in neighboring villages made births in their own village more likely to be female. Likewise, fathers who already had several children of the same sex believed that the next child was “due” to be of the opposite sex.
The Gambler’s Fallacy warns that there is no Lady Luck or “invisible hand” in charge of your game. Walk away with your money and dignity now. KBJ
1796
Homeopathy
Samuel Hahnemann
The theory that tiny amounts of harmful substances prompt the body to cure itself
Willmar Schwabe’s “Improved chemist’s shop with 134 remedies in bottles, A, B and D” (1889).
German physician Samuel Hahnemann (1755–1843) was so disturbed by the use of bloodletting, leeching, and purging in the medicine of his day that he gave up his own practice and became a researcher. In 1796 he identified the founding principle of homeopathy: that a substance which produces symptoms in a healthy person will cure similar symptoms in a person who is sick. Hahnemann refined his theory with two other principles: first, that the less of a substance you use, the more potent it becomes, and, second, that a sickness is always unique to the person who is suffering it.
The idea of homeopathy closely echoes the Hippocratic idea that “like cures like.” Hahnemann came across a version of this “law of similars” in 1790 while translating William Cullen’s A Treatise of Materia Medica (1789) into German. Homeopaths claim that the principle is the same as that underlying vaccination: just as a vaccine provokes a reaction from the individual’s immune system that protects that person from the actual disease in the future, so the homeopathic remedy, it is said, provokes the body into healing itself.
“The physician’s high and only mission is to restore the sick to health …”
Samuel Hahnemann, Organon of Medicine (1833)
It is hardly surprising that homeopathy was embraced in the early nineteenth century: medical procedures, just as Hahnemann claimed, often did more harm than good. Today, homeopathy’s impact is evidenced by the existence of a multimillion-dollar industry in alternative medicine. And the persistent idea that there is more to human healing than conventional medicine allows remains a challenge: are we more than the sum of our parts, or does it simply make us feel good to think we are? LW
1797
Assembly Line
Eli Whitney
Manufacturing items in a sequential manner, using interchangeable components
Workers on a moving assembly line at the Ford Motor Company Highland Park Plant in c. 1913.
In 1797, the U.S. government solicited contracts from private firms to produce 40,000 Model 1795 .69 caliber flintlock muskets. U.S. inventor Eli Whitney (1765–1825) proposed to supply 10,000 muskets over two years by assembling machined parts that conformed precisely to a model or jig. Prior to the mechanization of the assembly line, craftsmen made unique products by hand. The interchangeable parts produced by Whitney’s unconventional method would allow unskilled workmen to assemble the muskets at a faster, and thus cheaper, rate than was traditionally possible.
Although Whitney was probably the first to exploit the assembly line concept for business purposes, other theorists and inventors had done seminal work. In his book The Wealth of Nations (1776), Scottish philosopher and economist Adam Smith had discussed the idea of division of labor for the manufacture of pins; and French gunsmith Honoré Blanc, influenced by the French artillerists’ Gribeauval system of cannon and shell standardization, had used gauges and filing jigs to achieve engineering tolerances in the manufacture of interchangeable musket parts. Whitney’s assembly line not only speeded up manufacture but also made it easier to replace parts of damaged or defective muskets.
“The tools … shall fashion the work and give to every part its just proportion.”
Eli Whitney
Modern mass-production assembly lines rely upon the judicious arrangement of machines, equipment, and workers to achieve a continuous flow of work pieces. All movements of materials are simplified to eliminate cross flow or backtracking. In the early twentieth century, the Ford Motor Company adopted the assembly line to mass-produce the Model T. BC
1798
Malthusian Law
Thomas Malthus
The theory that population growth will naturally ensure the persistence of poverty
British political economist Thomas Malthus (1766–1834) doubted that human progress was as inevitable as the utopians of his era proclaimed. In An Essay on the Principle of Population (1798), he explained that, while progress required more material wealth for everyone, he could not see how agricultural production could keep pace with rapid population growth.
Malthus observed that agriculture could grow arithmetically, so that 100 units of food today could become (say) 150 units in forty years, and 200 units in eighty years. Assuming no depletion of soil and water, doubling total food in two generations would be an impressive technological feat. However, if the typical family has four children, then a generation of 100 people is replaced by a generation of 200, followed (geometrically) by 400 people in around eighty years. Thus, 100 units of food for 100 people today eventually translates to 200 units of food for 400 people, so the later individuals get half as much food as before.
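The comparison can be restated as a short calculation. The Python sketch below uses the passage’s own illustrative figures, treating a generation as roughly forty years, with food gaining fifty units per step while the population doubles; the numbers are Malthus’s hypothetical ones, not historical data.

def food(steps, start=100, increment=50):
    # Arithmetic growth: +50 units of food per forty-year step.
    return start + increment * steps

def population(steps, start=100):
    # Geometric growth: the population doubles each step.
    return start * 2 ** steps

for step in range(4):
    f, p = food(step), population(step)
    print(f"after {step * 40:3d} years: {f} units of food for {p} people"
          f" = {f / p:.2f} units each")

The ration per person falls from 1.0 to 0.5 within eighty years and keeps falling, which is the whole of Malthus’s point.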
“Population, when unchecked … increases in a geometrical ratio.”
Thomas Malthus
Since food is never distributed equally, Malthusian law predicts much malnourishment and death from famine. The law explains why, despite increases in food production of more than 50 percent over the past two generations, 2 billion people today go hungry.
Malthus’s work inspired the evolutionary theory of Charles Darwin (1809–82), that competition for food among an oversupply of animals would naturally select the unfit for elimination, causing a species to evolve over time. Malthusian law has not been refuted, but humanity could evade its implications through family planning and birth control. JSh
c. 1800
Phrenology
Franz Joseph Gall
The belief that a person’s mental attributes may be determined by assessing their skull
A phrenology illustration from the People’s Cyclopedia of Universal Knowledge (1883).
Developed around 1800 by German physiologist and neuroanatomist Franz Joseph Gall (1758–1828), phrenology was originally called cranioscopy; today it is considered a pseudoscience. Drawing on the science of character, theories of the brain and mind, and faculty psychology (in which different faculties of the brain perform different tasks), phrenology sought to link a person’s character with the shape of their skull.
Phrenology was extremely popular in England in the early nineteenth century, with some employers even asking a phrenologist to determine whether a prospective employee was an honest and hardworking person. However, by the 1850s the “science” had lost credibility as it had become apparent that few practicing it could deal with contradictory evidence.
“I never knew I had an inventive talent until Phrenology told me so.”
Thomas Edison, inventor
The logic of Gall’s phrenology ran as follows: first, the brain is an organ of the mind and it controls propensities, feelings, and faculties; second, the mind is composed of many distinct, innate faculties (including moral and intellectual ones); third, because these faculties are all distinct, they must each have a different place or organ in the brain; fourth, the size of an organ is a measure of its power, and its actions depend on its organization; fifth, the shape of the brain is determined by the development of its various organs. Thus, by “reading” the dents, dips, and protrusions of the skull, particular character traits, intellectual abilities, and natural propensities may be identified and quantified.
Gall’s ideas influenced psychiatry as well as today’s neuropsychology, which studies how the structure and function of the brain relate to mental processes and behavior. KBJ
1802
Separation of Church and State
Thomas Jefferson
The view that the church should keep out of the state’s business, and vice versa
The closeness of relationship between the political apparatus of a nation state and its organized religious body or bodies is one that varies from nation to nation, depending on the law system in place and the populace’s prevalent views on religion in society. The degree of separation of church and state is typically detailed in a country’s constitution.
From ancient Greece through medieval times, church and state were generally intertwined. Crimes were described as offenses against both the state and the gods; kings held additional priestly titles or claimed their thrones by divine right; and Catholic popes could depose kings and dictate to their governments. However, during the Protestant Reformation, Martin Luther (1483–1546) advocated the “doctrine of two kingdoms,” which divided God’s rule into a worldly kingdom governed by laws and a spiritual kingdom governed by grace. The doctrine is regarded as one of the earliest conceptions of a separation of church and state.
“The government of the United States is not … founded on the Christian religion.”
George Washington, U.S. president 1789–97
In North America, the most influential separationist was U.S. founding father Thomas Jefferson (1743–1826), who in an 1802 letter to the Danbury Baptist Association described the First Amendment as building “a wall of separation between Church & State.” The First Amendment to the United States Constitution (1791) asserted that the government must respect an individual’s freedom of religion and must not create laws establishing a national religion. This separation of church and state changed the way that people perceived the role of religion in society, especially concerning government, law, morality, and individual freedom. Historically, the separation of church and state has benefited the advancement of science, since many scientific discoveries run counter to religious belief. KBJ
1802
The Watchmaker Analogy
William Paley
As a watch implies a watchmaker, the natural world implies the existence of God
According to the watchmaker analogy, feats of nature such as Mount Everest are evidence of God.
Western conceptions of God are shaped by a variety of human experiences, such as the powerful sense of awe inspired by a glowing sunset, a towering mountain range, or even the complexity of a mammalian cell when viewed under a microscope. Such experiences have led many thinkers to believe that the objects and events that inspire these feelings are intentional, that is, they have been designed. Since design implies a personal agent, these experiences seem, to many, grounds for believing that an extra-natural intelligent being crafted the cosmos. This inference is known as the “teleological (goal-directed) argument” for God’s existence, which can be traced to Xenophon’s Memorabilia (c. 371 BCE) and Plato’s Timaeus (c. 360 BCE).
British philosopher and Anglican clergyman William Paley (1743–1805) formulated a well-known version of this argument, known as “The Watchmaker Analogy,” in his Natural Theology (1802). Paley contrasts the experience of finding an ordinary stone with finding a watch, and then asks whether it is reasonable to regard their possible origins as being the same. He answers with a resounding “no”; everything about the watch points to the existence of an intelligent craftsman, whereas the features of the stone suggest nothing at all about its origins. This distinction between design and chance leads Paley to conclude that atheism is as absurd as the idea that the watch is the product of chance: he asserts that “every indication of contrivance, every manifestation of design, which existed in the watch, exists in the works of nature.”
Arguments such as this one continue to play an important role in discussions of the limits of science and its relationship with religion. The contemporary Intelligent Design movement that began with Phillip E. Johnson’s Darwin on Trial (1991) and Michael J. Behe’s Darwin’s Black Box (1996) is a development in the tradition that Paley helped to advance. JW
1809
Heavier-Than-Air Flight
George Cayley
The theory that machines might be able to fly by utilizing lift and thrust
A photograph from 1894 (colored later), showing aviation pioneer Otto Lilienthal flying a hang glider.
Every bird demonstrates its capability of heavier-than-air flight by, in part, employing a lifting body—its wings—to create air pressure differentials. When a wing travels through the air, the air pressure above it is less than the air pressure below it; as a consequence, the air below pushes the wing higher, enabling whatever is attached to the wing, or wings, to fly. The concept of heavier-than-air flight is no more than that: to soar like a bird, attached to wings that lift.
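In modern aerodynamics this pressure-differential account is summarized by the standard lift equation, L = ½ρv²SC_L, which relates lift to air density, airspeed, wing area, and a lift coefficient. The short Python sketch below applies it with assumed, bird-scale figures chosen only for illustration; Cayley reasoned empirically, long before the equation took this form.

def lift_newtons(airspeed, wing_area, lift_coefficient, air_density=1.225):
    # Standard lift equation: L = 0.5 * rho * v^2 * S * C_L
    # (air density in kg/m^3, airspeed in m/s, wing area in m^2).
    return 0.5 * air_density * airspeed ** 2 * wing_area * lift_coefficient

# Assumed figures for a large bird or a small glider wing:
lift = lift_newtons(airspeed=10.0, wing_area=0.5, lift_coefficient=1.0)
print(f"{lift:.1f} N of lift, enough to support about {lift / 9.81:.1f} kg")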
From an early nineteenth-century perspective, rigid-frame kites had been invented in China as early as 2800 BCE, while human flight had been possible only since 1783, as a passenger floating under a hot-air balloon. Early thinkers, such as Leonardo da Vinci (1452–1519), had studied the flight of birds, and had even designed craft that might replicate their abilities. However, no real progress was made toward practically achieving heavier-than-air flight until English engineer George Cayley (1773–1857) began studying the necessary aerodynamics in 1791, when he started experimenting with gliders. Cayley built a working model of a glider and designed a glider to carry a person, as well as publishing a treatise, On Aerial Navigation, in 1809. Humankind finally achieved heavier-than-air flight in 1853, when Cayley’s coachman completed the first manned glider flight in a full-scale model of Cayley’s design. Fifty years later, in December 1903, Orville (1871–1948) and Wilbur (1867–1912) Wright became legendary overnight when they flew the first powered, heavier-than-air flying machine.