The most thoroughgoing analysis of Plato’s principle applied to art can be found in the work of Roman Ingarden (1893–1970). With respect to music, for example, he identifies the various notes and the temporal sequence of each note as defined by the composer—in other words, the form of the music—with “the musical work,” and not, as some might have it, with the performance or written score. The performance, says Ingarden, is guided by the form as it is communicated by the score; moreover, the desired psychological and physiological responses of the audience are contingent upon the temporal arrangement of the notes. After all, one can play all the notes contained in Beethoven’s Symphony No. 5, but unless they are played in a particular order and are separated by very specific time intervals, the performance will not evoke the same emotion or convey the same meaning—indeed, it will not be Beethoven’s Symphony No. 5.

Contemporary formalists such as Clement Greenberg (1909–94) embrace Plato’s principle, practicing and espousing art that emphasizes the creation and depiction of form over the representation of readily identifiable subjects. DM

c. 360 BCE

Space

Plato

The entity in which things come to be and that, with time, is a precondition of all thinking and understanding

A diagram of three-dimensional Euclidean space. Every point is determined by three coordinates, giving its position along each of the three axes.

According to the treatise Timaeus (c. 360 BCE) by Plato (c. 424–c. 348 BCE), space is a receptacle in which all things come into being. It exists eternally, and is not sensed per se but is intuited by “a kind of bastard reasoning.” However, our conventional understanding of space as an extension in three dimensions derives from the Greek mathematician Euclid (fl. c. 300 BCE), who wrote a generation after Plato. With the addition of “Cartesian coordinates” (the system that locates a point in space using a set of three numbers), Euclid’s conception of space provides the framework for the application of algebra and calculus to classical physics.
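Euclid’s three-dimensional space, equipped with Cartesian coordinates, can be sketched in a few lines of code; the points below are invented purely for illustration.

```python
import math

def distance(p, q):
    """Euclidean distance between two points given as (x, y, z) tuples."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# A point 3 units along x, 4 along y, and 12 along z lies 13 units from the origin.
print(distance((0.0, 0.0, 0.0), (3.0, 4.0, 12.0)))  # → 13.0
```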

“The third type [of existing things] … space … exists always and cannot be destroyed.”

Plato, Timaeus (c. 360 BCE)

The German philosopher Immanuel Kant (1724–1804) suggested that space is one of two “a priori conditions of the understanding.” That is to say that understanding, and thinking in general, presupposes the existence of space, which, along with time, makes all other concepts and propositions possible. Logical propositions are sequential, and those sequences “unfold” in space. For this reason, space cannot be a concept derived from experience; instead, it is presupposed by all cognitive activity and makes thinking possible.

In the twentieth century Albert Einstein (1879–1955) introduced further conceptually challenging and powerful innovations to our understanding of space. His special theory of relativity couples space with time into a single continuum, spacetime, in which motion through space affects a traveler’s passage through time: a clock moving rapidly relative to an observer runs slow from that observer’s standpoint. Although a human would notice this effect only at speeds approaching that of light, it can be quantified and has been measured with highly precise atomic clocks. The theory’s famous formula, E = mc², expresses a related consequence: the equivalence of mass and energy. General relativity, on the other hand, suggests that space possesses shape and that the shape of space is dictated by the presence of mass. DM
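The coupling of motion through space with the passage of time can be sketched using the Lorentz factor of special relativity—a standard formula, though not one the text spells out; the one-hour journey is an invented example.

```python
import math

C = 299_792_458.0  # speed of light in meters per second

def dilated_time(proper_time, speed):
    """Time measured by a stationary observer while a clock moving at
    `speed` records `proper_time` (special-relativistic time dilation)."""
    gamma = 1.0 / math.sqrt(1.0 - (speed / C) ** 2)
    return proper_time * gamma

# A traveler at 80% of light speed experiences 1 hour;
# a stationary observer measures gamma = 5/3 ≈ 1.67 hours.
print(dilated_time(1.0, 0.8 * C))
```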

c. 360 BCE

Moral Absolutism

Plato

Moral and ethical principles are universal, and all actions are either right or wrong

What makes someone good? What makes our actions right or wrong? For a moral absolutist, there exists a standard of morality that is universally applicable to all human activity. This standard dictates what a person should do in any situation, and any deviation from that is an immoral act. For the moral absolutist, a clear moral code is the measuring stick that determines the moral value of any thought or action. A person’s intentions and beliefs, as well as intended or unintended consequences, are irrelevant.

Moral absolutes have probably existed since prehistory. Those found in the Jewish Torah are thought to date from the sixth to fourth centuries BCE. Greek philosopher Plato (c. 424–c. 348 BCE), in his dialogue Theaetetus (c. 360 BCE), considered the problem of moral relativism: namely, that a relativist cannot claim that there is no universal truth because doing so would require believing at least one universal truth. This criticism of relativism laid the logical foundation for the case for moral absolutism. German philosopher Immanuel Kant (1724–1804) believed that morality was based on absolute rights and ethical principles or duties; lying, for example, was always immoral for Kant.

“Act only … [if] you can at the same time will that it should become a universal law.”

Immanuel Kant, philosopher

Moral absolutism was, and is, a way that many people approach questions of morality and ethics. With a moral absolute, judging a person’s actions as right or wrong is very often a simple proposition. Yet these concrete dictums can lead, and have led, to terrible consequences, and their practical effects lead many to question the nature of morality itself, and what it means to be a moral person. MT

c. 350 BCE

Geocentrism

Ancient Greece

The theory that the universe revolves around the Earth

The world according to Aristotle, from De Philosophia Mundi (twelfth century) by William de Conches.

Geocentrism, also referred to as the geocentric model or the Ptolemaic model of the universe, holds that the planet Earth is the center of the universe and all celestial bodies orbit or surround it. The observable movements of the moon, the planets, and the stars can all be explained by the existence of celestial spheres that govern their movements. The geocentric model of the universe remained the dominant theory about how the universe operated until the sixteenth century.

Early Greek astronomers believed that the Earth lay at the center of the universe. In the fourth century BCE, philosophers Plato and Aristotle contributed to this model by proposing that the heavenly bodies were affixed in celestial or stellar spheres that rotated around the similarly spherical Earth. In the second century CE, the astronomer Claudius Ptolemy built upon this model and developed a system that explained planetary movement based on the idea that multiple spheres govern the movement of heavenly bodies.

“[The Earth] alone remained immovable, whilst all things revolve round it.”

Pliny the Elder, naturalist and natural philosopher

Today the geocentric model of the universe seems almost ludicrous, but it held influence for a long time because it fit so well with observable data. Early astronomers did not have the luxury of telescopes or instruments that could make accurate measurements; the geocentric model not only explained the motions of the heavens with accuracy, but it also catered to the notion that humanity was the center of the universe. The subsequent heliocentric (sun-centered) model was far more accurate, but geocentrism nevertheless provided early thinkers with a rational explanation of how and why the universe operated. MT

c. 350 BCE

The Fifth Element

Aristotle

An unobserved element, ether, is one of five elements that make up the universe

The opening page of Aristotle’s cosmological treatise On the Heavens (c. 350 BCE).

The fifth element, known as aether or ether, was one of the five basic elements of nature in the ancient Greek world. These fundamental elements of fire, water, earth, air, and ether were the foundation of the universe.

To the ancient Greeks, the universe existed as a system of celestial spheres in which the Earth was stationary, located in the center, with all the planets, stars, and other celestial bodies surrounding it in spheres that governed their movements. In his dialogue Timaeus (c. 360 BCE), Plato (c. 424–c. 348 BCE) wrote about the elements and their different types, identifying ether as the brightest part of the air. Aristotle (384–322 BCE), in his work On the Heavens (c. 350 BCE), later hypothesized that ether was a fifth element in the spaces between the celestial spheres.

“There is something besides the bodies nearby and around us …”

Aristotle, On the Heavens (c. 350 BCE)

After Aristotle, the classical explanation of the five elements as the fundamental constituents of nature dominated Western scientific thought until the Renaissance. Yet ether, the fifth element, was different. Scientists could directly observe each of the other four elements in nature, while ether remained hidden. Aristotle had to make observations about the world and then hypothesize an element that had yet to be observed. Modern science long ago dispelled the idea of the classical elements, but the fact that ether did not exist, and was never observed, did not diminish the importance of the concept. Aristotle’s postulation that an as-yet unobserved particle explained other observed phenomena anticipated the scientific notion that a hypothesis should be able to predict behavior or observation, a concept central to modern science. MT

c. 350 BCE

Hypothesis

Aristotle

An explanation for a phenomenon that can be tested by further investigation

A hypothesis is a type of claim used to explain a phenomenon. As Aristotle (384–322 BCE) uses the word, a hypothesis is the claim in an explanation that states that something is or is not the case, and that we are obliged to demonstrate. Aristotle contrasts hypotheses with “illegitimate postulates,” which are claims that are assumed to be true and are used without demonstration, and “definitions,” which express the meaning or referent of a term or phrase but do not assert whether there are any instances of the meaning or referent.

Consider an example: Why might an adolescent have acne? In order to investigate, we may form a number of hypotheses: for example, that acne is caused by eating too much sugar, by not cleaning the skin well, or by undergoing puberty. In order to see which hypothesis best explains our phenomenon, we must subject each to a series of tests—that is, we must identify certain events that we would observe if the hypothesis were true and then check for them. For example, if our adolescent’s acne is due to eating too much sugar, then removing sugar from his diet should cause a marked decrease in acne.

“The great tragedy of science—the slaying of a beautiful hypothesis by an ugly fact.”

Thomas Huxley, biologist

The concept of hypothesis came into modern use through the scientific work of Francis Bacon (1561–1626) and Isaac Newton (1642–1727), although neither expressed much respect for it. Bacon regarded hypotheses as unscientific starting points, and Newton explicitly rejected their use. Nevertheless, in practice, they both made important use of hypotheses, and contemporary scientists recognize them as an essential component in the scientific process. JW

c. 350 BCE

The Chicken and Egg Conundrum

Aristotle

The age-old puzzle: if chickens come from eggs and vice versa, how do you establish which of the two existed first?

First-century marble bust of Aristotle; it is a copy of a Greek bronze sculpted by Lysippus in the fourth century BCE. Lysippus is said to have created 1,500 bronzes, none of which have survived.

When a hen lays a fertilized egg, that hen will keep the egg warm until it hatches a chick. That chick will then grow up to become a hen and lay other eggs, repeating the process as part of an ongoing cycle. But when did this process start? Which came first: the chicken or the egg? This infamous question identifies a problem of causality, a paradox in which neither chicken nor egg can exist without the other, yet there must have been a moment when one of them came first.

“How wonderful that we have met with a paradox. Now we have some hope of making progress.”

Niels Bohr, physicist

What existed at the beginning? How did objects, the world, animals, and humans come to be? These are the basic questions that lie at the heart of the chicken and egg conundrum. When the ancient Greek philosopher Aristotle (384–322 BCE) asked the question, he believed that both must have always been in existence. Over the centuries the question remained a challenge to philosophers, though it became less important after English naturalist Charles Darwin (1809–82) introduced the theory of evolution by natural selection and explained the development of any organism as a process of slow progress over time.

In 2010, British researchers released the results of a study that, they claimed, conclusively proved that the chicken came first. While the solution was not universally accepted, and others claim that the egg existed prior to the chicken, the question’s importance is not solely one of biological history. The chicken and the egg conundrum prompts us to consider beginnings, and how they relate to our experiences. Some theologians have answered the question by saying that the creation of the universe necessarily means that the chicken came first. Other traditions hold that time does not have a clear beginning and end, and the idea of what came first is nonsensical because all things have existed for eternity. MT

c. 350 BCE

Spontaneous Generation

Aristotle

Certain organisms can blossom into life spontaneously from nonliving matter

Some of the earliest Western thinkers believed that life forms first emerged from a “primordial slime” that was either activated by the sun and air or fertilized by seeds transmitted by them. Aristotle (384–322 BCE) was the first figure to argue that some life forms emerge spontaneously by means of an elemental principle similar to a seed.

Aristotle stated that reproduction involves an active principle, represented by the male’s semen, which acts upon a passive principle, represented by the material substrate of the potential organism provided by the female. The form of the organism is transmitted by the semen to the material, giving rise to the new organism. In spontaneous generation, however, the passive substrate is a mixture of seawater and earth, not a living organism. The active principle is provided by pneuma, or “vital heat,” found in all things in varying levels. This pneuma stimulates a sort of fermentation that eventually gives rise to the new organism.

According to Aristotle, only certain organisms are produced in this way—namely, certain fish, oysters, and eels. These organisms provided Aristotle with empirical evidence that spontaneous generation occurs: that oysters are generated spontaneously, for example, is suggested by the fact that they do not multiply during transportation. Furthermore, the “eggs” that they contain never hatch offspring. Eels, according to Aristotle’s observations, did not possess the organs required for procreation, nor did they produce eggs.

Speculation concerning the possibility of spontaneous generation gave rise to a number of biological theories, including that of the homunculus (little man). The theory persisted throughout the Middle Ages and was not refuted—compellingly, if not definitively—until the nineteenth century, by Louis Pasteur. DM

c. 350 BCE

The Third Man Problem

Aristotle

Aristotle’s critique of Plato’s Theory of Forms

The ancient Greek philosopher Aristotle (384–322 BCE) was both the greatest student and greatest critic of Plato (c. 424–c. 348 BCE). His biggest criticism of Plato’s philosophy is encapsulated in the “Third Man Problem.”

Plato distinguishes between the essential nature (sometimes referred to as the Form or Idea) of any given thing and the thing itself. He claims that the Idea of a thing is unchanging and exists independently from the thing itself. For example, if a person knows something about canines, they know something about what makes a canine a canine and not, technically speaking, something about a specific dog. Among other reasons, Plato defends this metaphysical claim on the basis that individual things are constantly changing and so we can never truly know anything about them.

“Of the ways in which we prove that the Forms exist, none is convincing.”

Aristotle, Metaphysics (c. 350 BCE)

Plato himself acknowledged a certain logical inconsistency in the notion that Ideas exist separately from things themselves: if a man and the Idea of Man are alike, there must be some further thing—a “third man,” the Idea of the Idea—in virtue of which they are alike. For the same reason we might postulate a fourth Idea, and a fifth, and so on without end. Aristotle argues that this infinite regress is grounds to reject Plato’s claim. He argues that the essential nature of a thing exists in the individual and, at the same time, in all the other members of the species to which the individual belongs.

This disparity of views is at the heart of one of the longest running and most multifaceted debates in Western philosophy: the idealism/realism debate. At stake is nothing short of the foundations of science and the nature of knowledge. DM

c. 350 BCE

Perfection

Aristotle

The concept of something that is completely flawless or complete

Perfection, in the sense of being flawless, is derived from discussions by Aristotle (384–322 BCE) of privation, or deficiency. Aristotle stated that “a doctor and a musician are ‘perfect’ when they have no deficiency in respect of the form of their peculiar excellence.” In other words, a “perfect” specimen is flawless in every way with respect to its performance of its profession or its embodiment of its species. This, however, is just one sense of a concept that is key to Aristotle’s philosophy.

The word “perfect” is a translation of the Greek teleion, a derivative of the polysemous word telos. In this context, the relevant meaning of telos is “end,” or “goal.” With this in mind, the English translation “perfect” can be understood to encapsulate the idea of being complete, of having fulfilled a goal. This was important for Aristotle because, as a matter of principle, he believed that all things exist for a reason—that is, they have some telos—and that all things naturally strive toward the fulfillment of their telos. Therefore, perfection, for Aristotle, is something all things strive for, be they a blade of grass or a human being.

“Have no fear of perfection—you’ll never reach it.”

Salvador Dalí, artist

In biology, Aristotle employs this notion to explain (in part) the various stages of an organism’s development—each is a step toward the fulfillment of its telos. In cosmology, however, Aristotle employs the idea very generally, suggesting that the telos of all heavy bodies invariably drives them toward a state of rest around a cosmic center point. That all heavy bodies fall to Earth is evidence that this center point is, in fact, Earth. In this way, perfection is a concept wholly entangled with geocentrism. DM

c. 350 BCE

The Scientific Method

Aristotle

The development of a system for pursuing scientific enquiry

A diagram from Roger Bacon’s Opus Majus (c. 1268), illustrating his scientific studies of the eye.

In Posterior Analytics (c. 350 BCE), Aristotle (384–322 BCE) became the first thinker to attempt an analysis and systematization of the proper procedure for conducting science. For him, science was the search for universal truths concerning the causes of things. He believed that these causes are revealed not through experimentation and empirical observation, but through the rigorous application of sound deductive reasoning.

Experimentation of the sort commonly associated with modern science emerged in the East in the early eleventh century, in the works of Alhazen, Al-Biruni, and Avicenna. Their notion of a cyclical scientific process of observation, hypothesis, and experimentation was transmitted to the Western world via philosopher Roger Bacon in the thirteenth century, whose Opus Majus offered an explanation and critique of what he extolled as a new approach to science (and, ultimately, theology).

“The scientific method is a potentiation of common sense.”

Peter Medawar, biologist

Philosopher of science Karl Popper enhanced the scientific method in the twentieth century, when he introduced the notion of falsifiability. According to Popper, properly scientific hypotheses are falsifiable—that is, clear and logically feasible conditions can be articulated under which the hypothesis might be false. This allows the scientist to conduct experiments aimed at bringing about these conditions, thereby either disproving or strengthening the hypothesis. Popper also argued that a condition for the acceptance of a scientific theory must be its ability to produce testable predictions that are not also predicted by another theory. For example, the greatest criticism of string theory is that it has so far failed to produce testable predictions. DM

c. 350 BCE

The Law of Noncontradiction

Aristotle

Something cannot both exist and not exist at the same time

The title page of Aristotle’s Metaphysics (c. 350 BCE), the first major work of metaphysical philosophy.

The rules or “laws” of logic prescribe the acceptable means of manipulating semantic elements in a system. Perhaps the oldest and best-known rule of logic is the Law of Noncontradiction: nothing can both be and not be. It originated with Aristotle (384–322 BCE) as a metaphysical principle, expressed in his Metaphysics (c. 350 BCE): “The same attribute cannot at the same time belong and not belong to the same subject and in the same respect.” As such, the Law sets a boundary for every type of investigation. However, contemporary logicians restrict its application to propositions (for any proposition, p, it is not the case that both p and not-p are true at the same time in the same way).
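The propositional form of the Law—that p and not-p are never both true—can be verified mechanically by checking every assignment of truth values; a minimal sketch:

```python
from itertools import product

def is_tautology(formula, variables=1):
    """True if `formula` holds under every assignment of truth values."""
    return all(formula(*values) for values in product([True, False], repeat=variables))

# The Law of Noncontradiction: it is never the case that both p and not-p.
noncontradiction = lambda p: not (p and not p)
print(is_tautology(noncontradiction))  # → True
```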

The universal applicability of the Law has been challenged by a minority of scholars since the late 1800s. For example, Friedrich Engels argued that we find contradictions in nature: “Even simple mechanical change of position can only come about through a body being at one and the same moment of time both in one place and in another place.” Similarly, Graham Priest argued that intuitions believed to support the Law fail in well-known cases, such as the Liar Paradox (if “This claim is false” is true, then it is also false; if it is false, then it is also true).

“Contradictory propositions are not true simultaneously.”

Aristotle, Metaphysics (c. 350 BCE)

However, for the majority of scholars, these criticisms serve only to clarify and strengthen the classical view of the Law’s fundamental role in reasoning. For example, in response to the Liar Paradox, some argue that the claim’s entailing a contradiction proves that it conveys no coherent meaning. It is as meaningless as “that is a round square.” JW

c. 350 BCE

Indeterminism

Aristotle

Certain happenings are not caused deterministically by prior events

In the study of causation, there is a debate about how to characterize events that stand in a cause-and-effect relationship. On one hand, laws and prior events may exert complete control over the effects produced by a cause. For example, when one billiard ball strikes another, the angle and velocity of the striking ball, along with gravity, inertia, and the smoothness of the surface, fully determine where the struck ball goes. “Determinism” is the view that all events bear this relation. On the other hand, laws and prior events may exert less than complete control over the effects produced by a cause. For example, for any set of radioactive isotopes, there is no way to predict the exact order in which members will decay, but we are quite sure that all will eventually decay. Physicists attribute this unpredictability to randomness in the quantum law that governs the process. “Indeterminism” is the view that some events are indeterministic.
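The radioactive-decay example can be illustrated with a toy simulation; the number of atoms and the per-step decay probability below are arbitrary assumptions, and the seeded pseudo-random generator merely stands in for genuine quantum randomness.

```python
import random

def decay_order(num_atoms, decay_prob, rng):
    """Simulate identical atoms with the same per-step decay probability;
    return the (unpredictable) order in which they decay."""
    remaining = set(range(num_atoms))
    order = []
    while remaining:
        decayed = {atom for atom in remaining if rng.random() < decay_prob}
        order.extend(sorted(decayed))
        remaining -= decayed
    return order

# Different seeds give different decay orders, yet every atom decays in both runs.
print(decay_order(5, 0.3, random.Random(1)))
print(decay_order(5, 0.3, random.Random(2)))
```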

“Nor is there any definite cause for an accident, but only chance.”

Aristotle, Metaphysics (c. 350 BCE)

“Indeterminism” is a recent term, but the idea is old. Aristotle allowed for the possibility of accidents in nature, and Epicurus and Leucippus argued that atoms could act unpredictably. The Stoics later rejected this indeterminism in favor of determinism (or possibly fatalism). Nevertheless, the question of whether any causation is indeterministic remains significant. In physics, there is evidence that some quantum events are indeterministic, yet in every other area of physics events seem to be deterministic. In philosophy, there is disagreement as to whether morally responsible agency requires indeterministic causation. As yet, there is no widely accepted resolution to these problems. JW

c. 350 BCE

Rest as the Natural State of Things

Aristotle

All objects gravitate toward a state of rest that correlates to the element of which they are composed

A fifteenth-century French illustration depicts Christ holding a globe containing the four elements of Aristotle’s Physics (c. 350 BCE) at rest: beneath the heavens, fire, air, water, and earth.

Before the theory of gravity and Newton’s laws of motion, science was dominated by the notion that everything in existence is composed of certain fundamental elements. Ancient Greek philosopher Empedocles (c. 490–430 BCE), for example, believed that everything in existence was composed of earth, air, fire, and water, and offered an elaborate account of how these elements mixed and separated to form the planets, stars, and other denizens of the universe. Central to this account were the distinct qualities of each element. Each element, said Empedocles, tended to occupy a position in the universe relative to its density and weight: heavier, denser elements tended to settle in the lowest regions of the cosmos, leaving the lighter, “thinner” elements in the higher regions.

“The downward movement of a mass of gold or lead … is quicker in proportion to its size.”

Aristotle, On the Heavens (c. 350 BCE)

Aristotle (384–322 BCE) generalized the underlying principle of this account when he argued that rest is the natural state of all things. According to Aristotle, all objects tend toward a state of rest that corresponds to the natural state of their predominant element. If, for example, a thing is primarily composed of earth, it will naturally fall toward Earth, where it will come to rest. If it is predominantly composed of fire, it will tend toward a state of rest in the heavens. Following this reasoning, Aristotle argued that heavy bodies fall at a faster rate than lighter bodies, a conclusion that was not definitively refuted until the experiments of Galileo in the sixteenth century.

Historically, the most significant extrapolation of the principle of rest being the natural state of things was Aristotle’s conclusion that Earth is the unmoving center of the universe. This arose from the persistent observation that heavy bodies fall toward Earth. Thanks to Claudius Ptolemy in the second century CE, geocentrism became the dominant worldview until well into the seventeenth century. DM

c. 350 BCE

Categorical Logic

Aristotle

The first systematic science of the logic of categories

For Aristotle (384–322 BCE), the analysis of the mechanisms of logic at work in language was part (the first part, in fact) of any thoroughgoing science. In his text Prior Analytics (c. 350 BCE), the philosopher set out a system of logic that would dominate science for approximately 2,000 years.

Aristotle’s logic is a “categorical” system of logic because its deductions concern categories of things, the characteristics their members possess or do not possess, or the members themselves. The fundamental instrument of Aristotle’s system of logic is the syllogism. In its simplest form, a syllogism is an argument composed of three categorical statements, one of which posits a specific thing that must be true given the truth of the other two general statements. The classic example of a syllogism states: “All men are mortal. Socrates is a man. Therefore, Socrates is mortal.” Aristotle himself described a syllogism as a “discourse in which, certain things being stated, something other than what is stated follows of necessity from their being so.”
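The classic syllogism can be modeled with sets, where the major premise becomes a subset relation; the populations below are invented for illustration.

```python
# "All men are mortal": the set of men is a subset of the set of mortals.
men = {"Socrates", "Plato", "Aristotle"}
mortals = men | {"Bucephalus"}  # mortals include all men, and others besides

assert men <= mortals         # major premise: all men are mortal
assert "Socrates" in men      # minor premise: Socrates is a man
print("Socrates" in mortals)  # → True: the conclusion follows of necessity
```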

A rival system of logical analysis was developed by the Greek Stoic philosopher Chrysippus (c. 280–c. 206 BCE) in the third century BCE; however, it was Aristotle’s ideas that remained dominant. Aristotle’s system of logic was at the center of monastic debates concerning the nature of God, and it also enabled monks to determine how many angels could dance on the head of a pin. In biology, this system of categorical logic influenced Aristotle’s own initial efforts to define the various species (categories) of living organisms; this influence continued in subsequent efforts to expand and refine the definition of species. It was not until the late nineteenth century that German mathematician Gottlob Frege reinvigorated propositional logic of the sort promoted by the ancient Stoics, incorporating it into his new system of predicate logic. DM

c. 350 BCE

Equal Temperament

Aristoxenus

A musical tuning system that uses equal intervals in the scale

Attributed to the Greek philosopher Aristoxenus (fl. c. 350 BCE), equal temperament (ET) is today the most common musical scale. It is used for the tuning of pianos, guitars, and other instruments that employ a fixed scale. The defining characteristic of ET is that it divides the octave into equal parts.

ET’s predecessors—such as just intonation or Pythagorean tuning—fix each note at a whole-number ratio of a reference frequency. Just intonation, for example, fixes D at 9/8 times the frequency of C. If the frequency of C is 262 hertz, then D, at 294.75 hertz, is 32.75 hertz higher than C. However, according to just intonation, F falls at 4/3 times the frequency of C (approximately 349.33 hertz), and G at 3/2 times C (393 hertz). Thus the difference between F and G, an interval that is nominally the same as that between C and D, is approximately 43.7 hertz. This discrepancy between the sizes of nominally identical intervals does not occur in ET.
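The arithmetic above can be checked directly (taking C = 262 hertz, as in the example):

```python
C = 262.0  # hertz, as in the example

D = C * 9 / 8  # 294.75 Hz
F = C * 4 / 3  # ~349.33 Hz
G = C * 3 / 2  # 393.0 Hz

print(D - C)  # → 32.75 — the C-to-D whole tone
print(G - F)  # ~43.67 — the F-to-G whole tone: a noticeably wider gap in hertz
```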

“Harpists spend 90% of their lives tuning their harps and 10% playing out of tune.”

Igor Stravinsky, composer

In ET the frequency of each note is precisely the twelfth root of two—2^(1/12), or approximately 1.05946—times higher than that of the preceding note. This ratio ensures that the interval between each note in the scale is exactly the same. The advantage of this is that compositions can be transposed between keys without having to substitute intervals in the original key with different-sized intervals in the new key. The only way to avoid this while using other temperaments is to retune the instruments. Despite its ancient origins, ET has only recently entered into common usage following advancements in technology that have allowed us to accurately measure audio frequency. DM
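A sketch of ET frequencies, assuming the modern reference pitch A4 = 440 hertz (a convention the text does not mention):

```python
SEMITONE = 2 ** (1 / 12)  # ≈ 1.05946

def et_frequency(reference_hz, semitones_above):
    """Frequency of the note a given number of ET semitones above a reference."""
    return reference_hz * SEMITONE ** semitones_above

print(et_frequency(440.0, 12))  # twelve semitones: an octave, ~880 Hz
print(et_frequency(440.0, 7))   # seven semitones: a fifth, ~659.26 Hz (660 Hz in just intonation)
```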

c. 350 BCE

Hylomorphism

Aristotle

A conceptual framework for the analysis of any given thing or process

According to the theory of hylomorphism of Aristotle (384–322 BCE), every substance (that is, any existing thing) is composed of matter and form. The matter of a thing can be understood as the stuff a thing is made of. However, more than that, it represents everything the substance could be—its “potentiality,” as Aristotle described it. The form of a thing, on the other hand, is its shape and more—it represents everything the substance actually is, its “actuality.” Like his teacher and mentor Plato (c. 424–c. 348 BCE), Aristotle believed that the form of a thing gives it meaning and identity.

Hylomorphism is central to Aristotle’s explanations of the various processes unfolding in the universe. Reproduction, for example, occurs when the animal form, carried in semen, is imposed upon suitable material, provided by the female, thereby producing a new substance, the infant animal. For Aristotle, a comprehensive understanding of any given substance or phenomenon requires a thorough appreciation of its form and matter, and its actuality and potentiality, and the nuances and complexities of their unity.

“By the matter I mean … the bronze, by the form I mean the arrangement of the figure.”

Aristotle, Metaphysics (c. 350 BCE)

Despite its abstract nature, hylomorphism was extremely influential: it was the underlying conceptual foundation of all sciences from the Middle Ages until the late eighteenth century, when it was replaced by atomism. It has also been employed in contemporary attempts to untangle the persistent enigmas of consciousness. Aristotle’s application of hylomorphism to consciousness is referenced by U.S. philosopher Hilary Putnam in his early work on functionalism and continues to influence contemporary philosophy of mind. DM

c. 350 BCE

Types of Friendship

Aristotle

All friendships fall within one of three categories of increasing perfection

A detail from the School of Athens (1510–11) fresco by Raphael, featuring Aristotle and Plato.

In general there are, according to Aristotle (384–322 BCE), three different types of things that people like and three corresponding types of friendships. People like things that are useful to them, and so there are friendships based on utility. People like pleasure, and so there are friendships based on pleasure. Finally, some things are inherently likable because they are inherently good, and likewise, there are friendships between good people based solely upon their virtuous characters.

Friendships based on utility arise most often when people become associated for the sake of a mutual benefit. These are the sorts of friendships that exist between business associates or politicians. They are also, says Aristotle, the weakest sorts of friendships because “those who are friends for the sake of utility part when the advantage is at an end.” Friendships based on pleasure are most common among the young. They are tenuous, however, because people’s pleasures change as they get older.

“Good men will be friends for their own sake, that is, in virtue of their goodness.”

Aristotle, Nicomachean Ethics (c. 350 BCE)

Perfect friendships are those based on the virtuous character of the participants. Aristotle claims that “perfect friendship is the friendship of men who are good, and alike in virtue.” Such friendships arise because it is inherently pleasurable to share the company of good people. They are perfect because they are the most enduring and because the benefits are the same for both participants. Unfortunately, perfect friendships are likely only among the very old. This is because, as Aristotle explains, it takes many years of experience before one’s virtues are refined sufficiently to enter into a perfect friendship. DM

c. 350 BCE

Fallacy

Aristotle

An argument that may be persuasive but contains an error of logic or language

A fallacy is an error in reasoning, but reasoning can be erroneous in a number of ways, so there is no definitive type of fallacy. Aristotle (384–322 BCE) was the first to gather and explain the most common types of errors in reasoning, such as equivocation, begging the question, and false cause. In the subsequent centuries of philosophical debate, new categories of fallacies were identified, and the philosophers William of Ockham (c. 1287–1347) and John Buridan (c. 1300–after 1358) compiled an extensive number of fallacy types, giving them Latin names such as argumentum ad populum (appeal to the people) and argumentum ad baculum (appeal to the stick, or force).

There are now more than 200 named fallacies, commonly divided between formal and informal. Formal fallacies are mistakes in the logical form of an argument, independent of its semantic content. For example, in the non-fallacious form called Modus Ponens, the consequent is deduced from a conditional premise (“if P then Q”) and its antecedent (“P”), a deduction that is valid regardless of content. In the related formal fallacy called “affirming the consequent,” the antecedent (“P”) is instead inferred from the same conditional premise and its consequent (“Q”). This inference is invalid: the conclusion can be false even when both premises are true, since Q may hold for reasons other than P.
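The contrast between the valid form and the fallacy can be verified mechanically by enumerating truth values (a minimal sketch; the function names are invented for illustration):

```python
from itertools import product

def valid(premises, conclusion):
    """An argument form is valid iff no truth assignment makes
    every premise true while the conclusion is false."""
    for p, q in product([True, False], repeat=2):
        if all(prem(p, q) for prem in premises) and not conclusion(p, q):
            return False  # found a counterexample
    return True

implies = lambda a, b: (not a) or b  # material conditional "if a then b"

# Modus Ponens: from "if P then Q" and "P", infer "Q" -- valid.
print(valid([lambda p, q: implies(p, q), lambda p, q: p],
            lambda p, q: q))  # True

# Affirming the consequent: from "if P then Q" and "Q", infer "P" -- invalid.
print(valid([lambda p, q: implies(p, q), lambda p, q: q],
            lambda p, q: p))  # False
```

The counterexample the checker finds is P false and Q true: both premises hold, yet the conclusion fails.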

“ … some reasonings are genuine, while others seem to be so but are not …”

Aristotle, On Sophistical Refutations (c. 350 BCE)

An informal fallacy occurs when the content or organization of the premises of an argument constitutes an error in reasoning, as when an arguer changes the subject (red herring) or appeals to an inappropriate authority (argumentum ad verecundiam). JW

c. 350 BCE

Occam’s Razor

Unknown

The simplest explanation is usually the correct explanation

“When you have two competing theories that make exactly the same predictions, the simpler one is the better.” This is the Principle of Parsimony, the axiom that it is pointless to do with more what can be done with less; it is closely related to the Principle of Plurality, which holds that plurality should never be posited without necessity. The idea that the simplest explanation is usually the correct explanation has always been a guiding principle whenever humanity has been faced with a choice, problem, or dilemma. We know a dog barks because we can hear it. We know a grapefruit is sour because we can taste it. And that principle has a name: we call it Occam’s razor.

Occam’s razor gives precedence to the notion of simplicity and holds that simplicity is equal to perfection. The principle is named after an English philosopher and Franciscan monk, William of Occam (c. 1287–1347), although there is no evidence that he ever used the phrase in any of his extensive writings, despite clearly being predisposed to the concept.

In fact, the principle of Occam’s razor was recognized long before Occam’s time, not least in ancient Greece. Aristotle (384–322 BCE) wrote, “The more perfect a nature is, the fewer means it requires for its operation.” The principle remained significant. Austrian physicist Ernst Mach (1838–1916), who studied the mechanics of projectiles moving at supersonic speeds, said that scientists should use the simplest methods possible in their research. Now, the principle that one should never make more assumptions than are absolutely required underlies all scientific and theoretical modeling. It helps to shake off variables that muddy the waters of enquiry and threaten to introduce ambiguities and inconsistencies. Occam’s razor has become part of science’s everyday intellectual furniture, its most basic of tools. BS

c. 350 BCE

Homunculus

Aristotle

The theory that living creatures, including humankind, begin life as miniature versions of their adult forms, only having to grow in size to reach maturity

An illustration by Nicolaas von Hartsoeker (c. 1700) depicts an example of the homunculi that his microscopic researches led him to believe existed in the heads of human sperm.

The term “homunculus” is derived from the Latin homo, meaning “human being” or “person,” and the diminutive suffix -culus. Although literally defined as “little person,” the term usually means a fully formed organism of microscopic proportions.

“ … it becomes thenceforth a true living infant, having all the members of a child that is born from a woman, but much smaller. This we call a homunculus …”

Paracelsus, De Natura Rerum (1537)

Homunculi figured prominently in preformationist theories of the development of individual organisms. Generally speaking, preformationism tells us that living organisms begin life as fully formed, but miniature, creatures whose maturity involves little more than growth. Preformationism is often defined in contrast to epigenetic theories, which explain the maturation of an organism as a process through which its infant form changes and develops into its adult form.

Aristotle (384–322 BCE) produced an explanation of animal development that was epigenetic in nature, but he is often associated with the homunculus because of the dominant role he attributes to the male contribution in the reproductive process. Aristotle’s epigeneticism prevailed for almost 2,000 years, but then Nicolaas von Hartsoeker (1656–1725), using an early microscope, concluded that there were miniature men inside human sperm, which he called homunculi. Von Hartsoeker’s well-known image of a homunculus occupying the head of a sperm became the banner for the “Spermist” form of preformationism that gained popularity in the late seventeenth century.

Perhaps the most extreme form of preformationism was espoused by Philippus Aureolus Theophrastus Bombastus von Hohenheim (1493–1541), better known as Paracelsus. In his De Natura Rerum (The Nature of Things, 1537), Paracelsus suggested that, by allowing a man’s semen to putrefy in the uterus of a horse and later feeding it with human blood, the result would be a human infant. This being the case, he argued, females are entirely unnecessary in human reproduction. DM

c. 350 BCE

The Liar’s Paradox

Aristotle

A paradoxical proposal that reveals the limitations of universal accounts of truth

The Liar’s Paradox is a well-known thought experiment in philosophy that challenges the adequacy of standard bivalent (two-valued) logical systems. The classic “liar’s sentence” is: “This sentence is false.” The problem is that if we assume that all meaningful claims are either true or false, and there is no middle value, then this sentence, since it seems meaningful, must be true or false. But if it is true, then, according to its content, it is false, and this is contradictory. On the other hand, if it is false, then, according to its content, it is true, which is also contradictory. Thus, it is neither true nor false.
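In a bivalent system the liar’s sentence demands a classical truth value v satisfying v = not-v, and a one-line check confirms that no such value exists (an illustrative sketch, not a resolution of the paradox):

```python
# The liar's sentence asserts its own falsity: its truth value v must
# satisfy v == (not v). Enumerate both classical values and keep any
# that are consistent with this demand.
consistent = [v for v in (True, False) if v == (not v)]
print(consistent)  # [] -- neither True nor False works
```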

Although there is no widely accepted solution to this paradox, philosophers have expended considerable ink in response. Some suggest that our standard account of truth must be mistaken. This account, traced back to Aristotle (384–322 BCE), has it that a claim (P) is true if and only if there is some state of affairs expressed by P; for example, “the cat is on the mat” is true if and only if the cat is on the mat.

“A man says that he is lying. Is what he says true or false?”

Eubulides of Miletus, philosopher

The logician Alfred Tarski (1901–83) argues that the Liar’s Paradox shows that this universal account of truth cannot be included in the language it governs. Other philosophers, such as Graham Priest, argue that it shows that bivalent systems are inadequate, and that we must appeal to multivalued logics to solve the problem. Still others, including Arthur Prior, Jon Barwise, and John Etchemendy, attempt to resolve it within the boundaries of both the classical account of truth and bivalent logical systems, arguing that the “liar’s sentence” is simply false. The Liar’s Paradox is still sparking important developments in logic. JW

c. 350 BCE

The Four Causes

Aristotle

The idea that all things may be defined in four different but complementary ways

A solar or lunar eclipse—as depicted in this manuscript (1410)—illustrates Aristotle’s second cause.

Aristotle (384–322 BCE) maintained that we can define a given thing in four different ways, corresponding to four different “explanatory factors,” or “causes.” The first cause is the material cause, which has to do with the contents or ingredients of the thing in question. For example, the material cause of a window might be a wooden frame and glass pane; or the material cause of peanut butter might be peanuts, salt, and sugar.

The second cause is the efficient or genetic cause, which has to do with the agency that brings about the thing in question. For example, a solar eclipse is an event caused by the moon coming between the sun and the earth and blocking the sun’s light from some part of the earth, or, more simply, pregnancy is caused by sexual activity or artificial insemination.

“We do not have knowledge of a thing until we have grasped its … cause.”

Aristotle, Physics (c. 350 BCE)

The third cause is the formal or essential cause, which has to do with naming the genus and species to which the thing in question belongs (this is arguably the most important, or at least most precise, definition). For example, Aristotle says human beings are “rational animals,” where “animal” is the immediate genus, and “rational” the species differentiated from “nonrational.”

The fourth or final cause has to do with the purpose, goal, or destiny of the thing in question. For example, the eye is for seeing, or, according to the Westminster Confession of Faith (1646), man was made “to glorify God and enjoy Him forever.” While Aristotle was the first to clarify these four causes, the concepts themselves are so fundamental to life in general (and science in particular) that their importance cannot be overemphasized. AB

c. 350 BCE

Hasty Generalization

Aristotle

The notion that generalizations based on unrepresentative samples may be false

A hasty, or false, generalization, secundum quid in Latin, is a fallacy in which an arguer draws an inference about a population of objects or events on the basis of an unrepresentative sample. For example, imagine meeting three people upon visiting a new college, all of whom are female. If, on the basis of these meetings, you draw the conclusion that everyone on campus that day is female, you would be making a hasty generalization because it is not clear whether your sample is representative of the college’s population.

A sample may be unrepresentative in two ways: it may be too small or it may be biased. In the college example, if there were only twelve people on campus that day, the sample size of 25 percent may be sufficient. But if the population on campus were much larger, say 3,000, it would not. In addition, if you chose your sample randomly, your sample would be unbiased. But if you happened upon a sorority meeting, the sample would be biased toward females.

“Given a thimbleful of facts we rush to make generalizations as large as a tub.”

Gordon W. Allport, The Nature of Prejudice (1954)

An early form of this fallacy can be traced to Aristotle’s Prior Analytics (c. 350 BCE), and a discussion is found in William of Ockham’s Summa Logicae (c. 1323). The fallacy plays an important cautionary role in the sciences, which rely heavily on generalizations. For example, medical researchers draw inferences about the effectiveness of potential treatments from samples of people who need the treatment. Researchers try to conduct a series of studies to avoid an overly small sample, and introduce controls (such as a placebo, random selection, and diet) into the experimental process to avoid bias and interference. JW

c. 350 BCE

Teleology

Aristotle

The theory that all aspects of the universe were created to fulfill an ultimate purpose

The Greek word telos means “purpose,” and its contemporary meaning is attributed to Aristotle (384–322 BCE). Teleology is the study of final causes, and the final cause of an event is its telos—that for which other events are caused. Until recently, philosophers associated purpose with the direction of events by a creative mind, and there is considerable debate among ancient and contemporary thinkers about whether a purposing mind exists objectively.

For example, ancient atomists deny that nature exhibits purpose, and Socrates is disappointed when he learns that atomists’ fully material explanation of movement does not include a “mind or any other principle of order.” Aristotle similarly associates telos with a mind, citing the artist’s mind as the cause that motivates and directs all other causes, and telos came to have particular influence through his biology and ethics. In the former, Aristotle categorizes animals according to their most distinctive features and notes that, since nothing generates itself and whatever is in a state of potentiality is nearer or farther from its realization, there must be an organizing principle outside of the organism directing it to its intended actuality. Aristotle extends this essentialism into his ethics, identifying the human telos with virtue.

“Reason forms the starting point, alike in the works of art and … nature.”

Aristotle, On the Parts of Animals (c. 350 BCE)

Aristotle’s teleology influenced medieval Islamic and Christian thought, especially that of al-Kindi and Thomas Aquinas, both of whom refer to the appearance of purpose in nature in justifying God’s existence. Aristotelian teleology remained popular in biology until the arrival of Charles Darwin. JW

c. 350 BCE

Nature Abhors a Vacuum

Aristotle

The motions of objects mean that a vacuum is impossible

A central tenet of the physics of Aristotle (384–322 BCE) is that the laws of nature do not permit a void—a space containing absolutely nothing, or a vacuum. This principle came to be referred to as Aristotle’s horror vacui (fear of emptiness) and is encapsulated in the phrase “Nature abhors a vacuum.”

According to Aristotle, the very notion of a void was nonsensical, because the term “void” seemed to be defined as an indefinable nothing. Semantics aside, Aristotle observed that all things in motion eventually come to a halt. Were there a void, said Aristotle, a thing in motion would remain so forever. Furthermore, in a void, objects would not be compelled to move or fall in any particular direction. This, of course, did not correspond to Aristotle’s observations of the world, and so he concluded that the universe must be filled and that the motions of things are determined, in part, by their relative densities.

“Just as every body is in place, so, too, every place has a body in it.”

Aristotle, Physics (c. 350 BCE)

For these reasons (and others) Aristotle taught that the universe was filled with a medium that he called ether, which, together with the traditional four elements identified by Empedocles before him, made up the five elements of the universe. This notion persisted through the Middle Ages and into the seventeenth century, and it was among the chief reasons why atomism, which embraces the notion of a void, gained little traction. Seventeenth-century natural philosophers developed a more sophisticated (though still vague) conception of this medium, which came to be known as the aether. However, in 1643, the Italian physicist Evangelista Torricelli demonstrated fairly conclusively that a vacuum can indeed exist. DM

c. 330 BCE

Cosmopolis

Alexander the Great

The concept of many different peoples living together in the same city

The Greek word cosmopolis means “universe city” and refers to a large, important city that is inhabited by people from many different ethnicities.

After the collapse of Classical Greece and the rise of Alexander the Great (356–323 BCE), there occurred a shift from using the word polis (city-state) to cosmopolis. The cities that formed during Alexander’s reign were large, multicultural commercial centers. In no way could they support the original Classical Greek model of city life, in which each individual had a direct role in the politics, economy, social welfare, and spiritual wellbeing of the city. However, the collapse of Alexander’s empire after his death marked the rise of Hellenistic Greece, and what is called the Hellenization of the Mediterranean world. Diverse cultures that were once foreign to the Greeks now took on Greek-like qualities. More than ever, the polis came to be replaced by the cosmopolis.

“Different people, different beliefs, different yearnings … different dreams.”

Jimmy Carter, U.S. president 1977–81

This shift is also clear in philosophy. The Classical Greek philosophers Socrates and Plato had focused on the intimate and important relationship between the citizen and the city, but Hellenistic philosophers, such as Epicurus and Antisthenes, spoke of virtues that should go beyond the city walls to all persons. People now felt a kinship and moral responsibility to everyone, not only their fellow citizens, but at the price of some loss of identity and pride in their particular city.

“Cosmopolis” may now refer to many things: a novel (2003) by Don DeLillo, David Cronenberg’s movie adaptation of that novel in 2012, a novel (1892) by Paul Bourget, a city in Washington in the United States, and a musical work by Elias Breeskin, to name a few. KBJ

c. 323 BCE

Hellenism

Alexander the Great

An art style more dynamic and ornate than that of the Classical Greek period

A marble statue of the Greek goddess of victory Nike, dating back to the second century BCE.

The Hellenistic period (323–31 BCE) existed between that of Classical Greece and the rise of the Roman Empire. Hellenistic art and architecture is distinct from both Classical Greek art and Roman art. Hellenistic works are more dynamic, broader, and more ornate than the works of Classical Greek artists.

The beginning of the Hellenistic period dates from the death of Alexander the Great (356–323 BCE), who had carved out one of the ancient world’s largest empires. The extent of Alexander’s conquest had meant that many parts of the Classical world were exposed to Greek ideas about philosophy, science, art, and architecture. The Classical school of art had been characterized by simple and realistic portrayals, but the new Hellenistic art was more expressive, often depicting extreme emotions, dramatic settings, and drastic movements. Meanwhile, Hellenistic architecture became more elaborate; dramatic friezes and Corinthian columns featured in buildings that took advantage of larger spaces in order to create a sense of wonder, spectacle, or grandeur.

“A taste for the small and exquisite was combined with a love of the … grandiose …”

H. Honour & J. Fleming, A World History of Art (1984)

Where Classical Greek art accurately depicted human anatomy and form, Hellenistic art left a legacy of broader possibilities for artists. After the Greek world was absorbed by Roman expansion, culminating in the battle of Actium in 31 BCE, Roman artists continued the tradition of Hellenistic works and their grander depictions, leaving behind a dramatic legacy. When artists and architects of the Renaissance revisited the Classical period, they often turned to Hellenistic examples and themes for inspiration. MT

c. 300 BCE

Eternal Return

Zeno of Citium

The notion of a cyclical universe in which every moment in time is revisited endlessly

If time and space are infinite, the chance is also infinite that the world will exist again exactly as it is. Although the idea of recreation within the universe dates to even earlier religious and philosophical teachings, the ancient Greek philosopher Zeno of Citium (c. 334–c. 263 BCE) was the first to propose the idea that the universe goes through regular cycles of creation and destruction, and that fate determines whether the universe will infinitely repeat the same course within these cycles.

The idea of an eternal return is the mathematical and philosophical explanation of what must occur in an infinite universe with infinite variation. An infinite universe means that the particular variations that produced the reality in which we live will also produce an infinite series of realities, and occasionally these realities will contain everything exactly as it has already occurred. Most religious and philosophical examples of the argument for “eternal return” also include the idea of fate. That is, there is no other way that the universe can exist other than how it exists in the present because it will eternally return to this present.

“Everything goes, everything comes back; eternally rolls the wheel of being.”

Friedrich Nietzsche, philosopher

The philosophy of Friedrich Nietzsche (1844–1900) was greatly influenced by the experience of freedom and fate rooted in the idea of the eternal return. The idea is also the basis for several theories in physics about the cycles of the universe. These include the theory that the Big Bang was one of an infinite series of big bangs that continuously create universes. The idea of an eternal return is also consistent with theoretical models of a “multiverse,” in which an infinite number of universes and dimensions exist at the same time. TD

c. 300 BCE

Waterpower

Ancient Greece

Using the energy of water as a power source for machines of all kinds

The Laxey Wheel, 72.5 feet (22.1 m) in diameter, was built in 1854 on the Isle of Man, Britain, to pump water out of nearby mine shafts. It is the largest working waterwheel in the world.

Gravitational pull forces water to seek the lowest available point; gravity always causes water to flow. Waterpower, or hydropower, is the harnessing of water’s kinetic energy to perform tasks. Water’s natural movement has been used to power everything from electrical power stations and sawmills to clocks and Japanese kinetic garden ornaments.

“Water is the driving force of all nature.”

Leonardo da Vinci, artist and inventor

Like all forms of life, humanity has always needed to consume water to survive, although being able to control and direct it is a relatively recent advancement. As early as 7000 BCE, ancient Egyptians learned to fertilize their fields by creating dikes along the Nile river to trap the yearly floodwaters and cause nutritious sediments to settle on their land. However, it was not until the rise of Hellenistic Greece that people found a way to harness water as a source of mechanical power; sometime between the third and first centuries BCE, Hellenic people developed the first waterwheels. These wheels captured the power of flowing water, using it to turn a shaft that in turn caused a millstone to rotate against another stationary one to grind grain.

Hydropower became very important during the early days of the Industrial Revolution (1760–1840) in Britain. In the textile industry, for example, watermills were built to twist thread, drive looms, and finish textiles; in iron manufacturing, water drove trip hammers and powered forges. In the twentieth century, water was harnessed to create hydro-electricity.

Whether water has existed in the landscape in sudden, flooding excess or deadly scarcity, humanity has always depended on its presence. However, when humanity first had the idea of using water’s natural properties as a source of mechanical power, the status of one of the great natural forces of the world, one that bestows both hardships and blessings, was changed. Water was transformed into a powerful lever, a force that humanity could direct to do its will. MT

c. 300 BCE

A Priori and A Posteriori Knowledge

Ancient Greece

The debate about whether knowledge is obtained by experience or by reason

A portrait of the Greek mathematician Euclid. His well-known work, the Elements (c. 300 BCE), encompassed aspects of mathematics, arithmetic, geometry, and music theory.

Philosophy and science are rife with questions about the sources of knowledge. Empiricists argue that experience is the sole source of knowledge. Knowledge obtained by experience is called a posteriori (Latin for “from what comes after,” implying after, or because of, experience). Rationalists argue that some of our knowledge is obtained non-experientially, that is, by pure reason alone. Knowledge obtained non-experientially is called a priori (Latin for “from what comes before,” implying before, or independent of, experience).

“Any necessary truth, whether a priori or a posteriori, could not have turned out otherwise.”

Saul Kripke, philosopher and logician

The origin of these terms is controversial and includes Medieval Latin translations of Euclid’s Elements (c. 300 BCE) and the fourteenth-century writings of Albert of Saxony. But there is little doubt that these ideas trace back to Ancient Greek thinkers Plato (c. 424–c. 348 BCE) and Euclid (third century BCE). The first comprehensive description and defense of a priori knowledge was offered by Immanuel Kant (1724–1804) in the eighteenth century. It is Kant’s discussion of the subject in his Critique of Pure Reason (1781) that provides the framework for the contemporary debate.

It seems clear that much of what we understand about reality comes through experience. And yet, while our experiences are contingent (they could have been otherwise) and local (we have experienced very little of reality), we seem to know things that hold necessarily and universally true. For example, claims such as “2 + 2 = 4,” “there are no married bachelors,” and “there are no round squares” seem true irrespective of what we have experienced, and thus the source of this knowledge, it would seem, cannot be experiential. Dissenters contend that this is the wrong conclusion and that there is sufficient empirical evidence for explaining the uniqueness of these claims. As yet, there is no widespread agreement in this debate, and it continues to motivate new research in philosophy, psychology, and mathematics. JW

c. 300 BCE

Anti-Semitism

Manetho

Hostility or discrimination toward Jews as a religious, ethnic, or racial group

Detail from the Arch of Titus, Rome, Italy, built to celebrate a victory in the First Roman–Jewish War (66–73 CE).

German agitator Wilhelm Marr (1819–1904) coined the term “anti-Semitism” in his political diatribe The Way to Victory of Germanicism over Judaism (1879). The term suggests derogatory attitudes toward Semitic peoples in general, but it particularly denotes hatred of, or discrimination against, Jewish peoples. Manifestations of anti-Semitism are littered throughout history.

The third-century BCE historian and priest Manetho is credited with disseminating Egyptian anti-Semitism throughout ancient Greece. Manetho held that the Jews were enemies of the human race and that it was necessary to remove them from human society. The Roman Emperor Tiberius banned Judaism and expelled Jews from Rome. Constantine I imposed numerous prohibitions and regulations on Jewish religious practices and outlawed conversion to Judaism. Hostility toward Jews persisted into the Middle Ages, especially during the Christian Crusades (1095–1291).

“I am a Jew. Hath not a Jew eyes? … If you prick us, do we not bleed?”

William Shakespeare, The Merchant of Venice (1598)

The most extreme expression of anti-Semitism in history was arguably during the rise of the Fascist movement in the 1920s and 1930s. Jewish culture and religion were denigrated, and Jews were accused of rebellions and anti-government conspiracies, and of sabotaging their respective nations. It culminated in the extermination of an estimated six million Jewish people during what is now referred to as the Holocaust. Anti-Semitism persists today in a variety of forms: anti-Jewish sentiment often emerges out of the Israeli–Palestinian conflict, and is also spouted by conspiracy theorists, religiously motivated politicians, and hate-mongers in general. DM

c. 300 BCE

Skepticism

Pyrrho of Elis

A denial of the possibility of certainty in knowledge

A central question in the history of ideas is: What, if anything, can we know? One disconcerting answer is that it is unclear that we can know anything at all. This answer characterizes “skepticism,” from the Greek skepsis (to inquire). Skepticism originated with the Greek philosopher Pyrrho of Elis (c. 360–c. 272 BCE), whose ideas were passed on by his assistant Timon and made famous by the Roman philosopher Sextus Empiricus (c. 160–210 CE), and later the French philosopher Michel de Montaigne (1533–92).

Sextus Empiricus claimed that skepticism is a “mental attitude, which opposes appearances to judgments.” By framing skepticism as an attitude of doubting, rather than as a position or view, he avoided the criticism that skepticism is incoherent (since it would be absurd to say, “We know that we know nothing”). By “opposes appearances to judgments,” he meant that skeptics set appearances (the way reality seems) in opposition to judgments (beliefs about the way reality is), arguing that for any argument that a claim is true, there is an equally powerful reason to doubt it.

“Skepticism is the chastity of the intellect, and it is shameful to surrender it too soon.”

George Santayana, philosopher

This type of skepticism, known as “Pyrrhonism,” was strongly challenged in the seventeenth and eighteenth centuries. René Descartes argued compellingly that there was at least one thing he knew—that he existed. In addition, Isaac Newton and John Locke offered powerful reasons for thinking that practical discoveries in physics are more likely to be true than any skeptical alternatives. Nevertheless, problems with Cartesian epistemology and the fall of Newtonian physics mean that skepticism still influences philosophy today. JW

c. 300 BCE

The Elements

Euclid

The greatest ancient mathematical treatise, Euclid’s Elements is a collection of definitions, axioms, theorems, and proofs that has informed all logical and scientific endeavor

The title page of an early translation of Euclid’s Elements (c. 300 BCE), printed by John Day of London in 1570–71. The mathematical and geometric treatise consists of thirteen books.

Little is known with certainty about the life of the ancient Greek scholar Euclid (fl. c. 300 BCE). Hundreds of years after him, in the fifth century CE, the Greek philosopher Proclus wrote that Euclid taught at Alexandria when Ptolemy I Soter reigned over Egypt, meaning any time between 322 BCE and 285 BCE. Historians believe that he was older than Archimedes. However, despite this paucity of knowledge about him, Euclid’s memory lives on thanks to the written works he left behind, most notably the thirteen books that comprise the Elements (c. 300 BCE). This work earned Euclid the moniker “the father of geometry,” and arguably exercised an influence upon the human mind greater than that of any other work except the Bible.

“The discovery of Euclidean geometry, this rule had a deep philosophical and religious significance for many people because it showed that human thinking could get at part of the ultimate truth of reality.”

John Barrow, professor of applied mathematics and theoretical physics

A variety of mathematical subjects are covered in the Elements: Book V investigates ratios and proportions, Books VII to IX deal with number theory (Book IX is well known for its proof that there are infinitely many primes), and Books XI to XIII focus on three-dimensional figures. However, the treatise is perhaps best remembered for its investigations into geometry (Books I to IV). Euclid’s five postulates effectively read as a constitution for the laws of geometry: a framework to describe the real world.

The first Latin translation of Euclid’s Elements was by the English philosopher Adelard of Bath in about 1120. It was among the first works printed with the newly invented printing press, and as universities began to multiply, it became the ultimate textbook in Europe. It remained the key text on geometry until 1899, when the German mathematician David Hilbert wrote his acclaimed Foundations of Geometry. Moreover, geometry was synonymous with Euclidean geometry until the discovery of non-Euclidean geometries in the first half of the nineteenth century. Effectively, through the Elements, Euclid’s ideas ruled for more than 2,000 years. JH

c. 300 BCE

Mathematical Proof

Euclid

The central way in which claims in mathematics are justified

Mathematical proof, traditionally, is the logical derivation of claims (theorems) from axioms (claims assumed to be true or taken for granted), and definitions of the terms occurring in the axioms. Although there were mathematicians who proved theorems before him, Euclid (fl. c. 300 BCE) is credited as the first to present his proofs in a systematic form. In his seminal work, the Elements (c. 300 BCE), theorems are proven on the basis of definitions and axioms. Euclid’s use of deductive logic to solve mathematical problems underpinned mathematics for more than 2,000 years.

What is the point of mathematical proof? Axioms and definitions are assumed to have various desirable qualities—to be a priori (that is, knowable independently of experience), certain, and necessary—which logical derivation is regarded as preserving. So theorems, when correctly proven, are also a priori, certain, and necessary. Mathematical proof thus enables the mathematician to erect vast structures on the foundations of the axioms and definitions, confident that they will topple only if the axioms and definitions are poorly expressed or constructed.
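The pattern just described, in which theorems are obtained from assumptions by logical steps alone, can be made concrete in a modern proof assistant. The following Lean sketch is purely illustrative (the theorem name and hypotheses are invented for the example): from two assumed implications it derives a third.

```lean
-- Given the "axioms" h₁ and h₂, the "theorem" p → r follows purely by
-- composing the assumptions; no appeal to experience is made anywhere.
theorem chain {p q r : Prop} (h₁ : p → q) (h₂ : q → r) : p → r :=
  fun hp => h₂ (h₁ hp)
```

If either assumption is withdrawn, the derivation collapses, mirroring the point that theorems stand or fall with their axioms and definitions.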

However, controversy about mathematical proof abounds, as three well-known examples illustrate. First, although Euclid thought that one of his axioms of geometry, the Parallel Postulate, was necessary, alternative geometries later rejected it. Second, the philosophy of mathematics called intuitionism holds a different view of what counts as a valid logical derivation and thus of what counts as a mathematical proof. Third, in 1976, a computer aided in producing a proof of the four-color theorem in graph theory: because the proof is too lengthy to be checked in its entirety, it is controversial whether it constitutes a proper proof. Still, mathematical proof, as traditionally conceived, continues to be at the center of mathematics. GB

c. 300 BCE

Prime Numbers

Euclid

Positive integers that have exactly two positive integer factors

Prime numbers are often known as the building blocks of mathematics. A number is considered prime when it is greater than one and only divisible by one and itself, for example 2, 3, 5, 7, 11, and so on. All other numbers greater than one are called composite numbers.

It is not known when humans first recognized the existence of primes. The Rhind Papyrus from 1650 BCE offers hints that the ancient Egyptians might have had some knowledge of them. This scroll contains much of what we know about Egyptian mathematics, including examples of unit fractions, many of which seem concerned with prime numbers. However, the earliest surviving records that reveal an explicit understanding of prime numbers are from ancient Greece. In his series of thirteen books, the Elements (c. 300 BCE), the scholar Euclid proposed various key facts about prime numbers, including the fundamental theorem of arithmetic (in Book VII) and the first known proof that there are infinitely many primes (in Book IX). This latter discovery has been hailed as the moment that mathematics became an analytic subject.
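Both of these results are easy to illustrate in modern terms. The Python sketch below is an illustration, not anything found in Euclid: it tests primality by trial division, then replays Euclid's argument in miniature by multiplying the primes from 2 to 13 and adding 1, giving 30,031, a number divisible by none of those primes but having the prime factors 59 and 509 outside the list.

```python
# Trial-division primality test: a number n > 1 is prime if no
# integer from 2 up to the square root of n divides it evenly.
def is_prime(n):
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

primes = [n for n in range(2, 30) if is_prime(n)]
print(primes)  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]

# Euclid's argument in miniature: multiply a finite list of primes,
# add 1, and the result has a prime factor outside the list.
product_plus_one = 2 * 3 * 5 * 7 * 11 * 13 + 1  # 30031 = 59 * 509
print(is_prime(product_plus_one), product_plus_one % 59)  # False 0
```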

“Primes are the atoms of arithmetic—the hydrogen and oxygen … of numbers.”

Marcus du Sautoy, professor of mathematics

Thousands of years after Euclid, and despite the focus of some of history’s greatest intellects, one aspect of prime numbers remains an infuriating puzzle: they have no obvious pattern and there is no efficient way to find them. However, since the late twentieth century and with the help of computers, prime numbers with millions of digits have been discovered. This might not have mattered much to the world beyond mathematics, but for the fact that cryptographers now use them to create virtually unbreakable codes. JH

c. 300 BCE

Squaring the Circle

Euclid

The idea of constructing a square that has exactly the same area as a given circle

The phrase “squaring the circle” refers to attempting an impossible task. The phrase has its roots in the three classical problems of ancient geometry, namely, the doubling of the cube, the trisection of an angle, and the squaring of the circle. All three eventually proved to be impossible: in 1837 Pierre Wantzel proved the impossibility of both doubling the cube and trisecting an arbitrary angle, and in 1882 Ferdinand Lindemann proved the impossibility of squaring the circle.

The problems originate in the geometric axioms of Euclid (fl. c. 300 BCE), according to which it is possible to construct certain geometric shapes of specified proportions using only a straightedge and a compass. With specific regard to the squaring of the circle, Euclidean geometry tells us that it is possible to construct a square exactly twice the area of any given square. Furthermore, it is possible to construct a square of exactly the same area as any given polygon. Following Euclid, Archimedes (c. 290–c. 212 BCE) proved it possible to square any segment bounded by a parabola and a chord. It then seemed likely that constructing a square the precise area of a given circle would be possible. The search for the correct procedure occupied numerous geometers in the first and second centuries CE.

“To square the circle and who cannot find, For all his thought, the principle he needs.”

Dante, The Divine Comedy (c. 1308–21)

Were it possible to square the circle, it would have to be possible to derive pi using only a straightedge and a compass. However, in 1882 Lindemann proved pi to be a “transcendental” number—one that is not the root of any polynomial with rational coefficients, and so cannot be derived algebraically. It therefore cannot be constructed by such means. DM
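The logical chain behind the 1882 impossibility result can be set out in outline (modern notation, not in the original sources):

```latex
\begin{align*}
\text{constructible by straightedge and compass} &\implies \text{algebraic},\\
\text{circle squarable} &\implies \sqrt{\pi}\ \text{constructible} \implies \pi\ \text{algebraic},\\
\text{Lindemann (1882):}\ \pi\ \text{transcendental} &\implies \text{the circle cannot be squared.}
\end{align*}
```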

c. 300 BCE

Alchemy

Hellenistic Egypt

The belief that chemicals might be manipulated to obtain perfected forms

Illustration demonstrating coagulation, the seventh and final operation of alchemy.

Alchemy was a pursuit that mixed philosophy, religion, and chemistry in an attempt to manipulate physical objects and properties. Alchemists were proto-scientists who merged the study of the physical world with spiritual and metaphysical principles. They pursued a range of disparate goals, some of the best known being the creation of a universal solvent, development of an elixir of life, and the transformation of less valuable metals into precious gold.

Alchemy probably began in Hellenistic Egypt (c. 300 BCE), when the Greek philosophical tradition merged with Egyptian advances in metallurgy, glassmaking, and other crafts. Indian and Chinese thinkers also developed alchemy independently as an attempt to improve health and lengthen lifespan. In Europe, alchemists gained prominence during medieval and Renaissance times, attracting many well-known scientific thinkers. Even Isaac Newton (1642–1726), the pivotal figure of modern physics, pursued alchemy for decades as he attempted to discover some way to transmute chemicals into different forms.

“Medicine rests upon four pillars: philosophy, astronomy, alchemy, and ethics.”

Paracelsus, physician and alchemist

Alchemy faded when it failed to meet modern demands for quantification and skeptical analysis, but its legacy of probing the unknown lingers. Alchemists were part scientist, part physician, part philosopher, and part cleric, exploring mysteries in an attempt to both discover knowledge and gain wisdom. Modern science is largely free of the metaphysical concerns that the alchemists considered paramount, but the desire to explore the unknown and quantify the process rests at the heart of every new discovery. MT

c. 300 BCE

The Problem of Evil

Epicurus

If God is omnipotent and omnibenevolent, why does evil exist?

A pencil drawing, Justice and Divine Vengeance Chasing Murder (1804), by Pierre-Paul Prud’hon.

For many believers, God is not only all powerful and responsible for the creation of the universe, but also all loving and all knowing. Yet the world is still full of cruelty, human suffering, and evil. How could a being with God’s powers allow such suffering to take place?

Questions about the nature of the divine, and how a God or gods can be explained in light of reality, have been around for millennia. The ancient Greek philosopher Epicurus (341–270 BCE) has been credited with the first formulation of the problem (called the Riddle of Epicurus), though no easy solution has been found since. In some religious traditions, such as in ancient Mesopotamia, the destructive elements in the world were attributed to conflicts between gods who each held certain powers. Within the monotheistic religions, multiple solutions have been proposed over the years: many Enlightenment philosophers of the seventeenth and eighteenth centuries, such as Gottfried Leibniz, David Hume, and Immanuel Kant, offered their own responses to the dilemma.

“Is God willing to prevent evil, but not able? … Is he able, but not willing?”

Epicurus

The question of evil is a problem that still plagues many theologians and lay believers alike. At the problem’s heart is not only a question about the nature of the divine, but also one about the limitations of human understanding. Yet for all the proposed solutions, the problem remains, turning a believer’s thoughts not only inward toward reconciliation, but also outward, probing the very nature of the divine. For the believer, it is a question that may never be resolved satisfactorily, while for many nonbelievers it is further proof that such a being is nonexistent. MT

c. 300 BCE

Freedom to Choose Otherwise

Epicurus

The notion that moral responsibility implies at least two possible courses of action

Philosophers discuss a variety of types of freedom, but the one that captures the most attention is the freedom necessary for moral responsibility: for a person to be morally responsible for a particular course of action, it must have been possible for that person to have chosen an alternative course. This possibility is known as “alternate possibilities freedom,” or “the freedom to choose otherwise,” and stands in contrast to the view that, even if determinism were true, it would be possible to be morally responsible for particular actions. The latter view is known as “compatibilism” because it holds moral responsibility to be compatible with determinism.

“To deny … free will is to assert that what a man does do and what he can do coincide.”

Peter van Inwagen, analytic philosopher

Although the question of whether the freedom to choose otherwise is necessary for moral responsibility traces back to Epicurus (341–270 BCE), or at least Lucretius (c. 99–c. 55 BCE), the most significant advancements occurred in the eighteenth and twentieth centuries. Isaac Newton’s mechanistic physics forced modern philosophers to face the possibility that our actions are exhaustively determined by natural processes, and that the freedom to do otherwise is an illusion. Unwilling to relinquish the idea of moral responsibility, philosophers such as David Hume and Immanuel Kant defended versions of compatibilism. But difficulties with compatibilism, highlighted by Roderick Chisholm in 1964 and Peter van Inwagen in 1975, led to renewed interest in the view that moral responsibility is incompatible with determinism and requires the freedom to choose otherwise. Nevertheless, in 1969 the philosopher Harry Frankfurt constructed a counterexample to this view, and it continues to influence the debate. JW

c. 300 BCE

Yoga Vasistha

Valmiki

A Hindu scripture that sums up the nature of life: the world of materialism and the body is a dream from which we must wake up in order to pursue spiritual enlightenment

A miniature painting from the School of Raja Sansar Chand (c. 1780) depicts the Hindu sage Bharadvaja and his pupils in an illustration of Valmiki’s epic poem, the Yoga Vasistha.

One of the most exhaustive texts in Sanskrit, the Yoga Vasistha is a detailed account of a conversation between the young Sri Rama and his teacher, Vasistha Maharshi, much of which is told in parables. There are 32,000 two-line verses—one verse for every question the pupil was said to have asked his master—which ranks it, at 64,000 lines, second in length only to the voluminous Mahabharata (c. 400). Even Krishna himself, in the Bhagavad Gita (c. 100 CE), spoke a mere 700 verses. Legend says that anyone who manages to finish reading the Yoga Vasistha will have their spiritual growth significantly hastened. Authorship of the work is generally attributed to the well-known Sanskrit poet and sage Valmiki, who lived around 300 BCE.

“The tree of meditation casts a cool shade in which all desires and cravings come to an end and all the burning distress ceases.”

Valmiki, Yoga Vasistha (c. 300 BCE)

Also known as the Knowledge Vasistha, Maha Ramayana, or Vasistha Ramayana, the text is divided into six sections: dispassion or indifference, longing for liberation, creation, existence, quiescence of mind, and liberation. At its core it is a warning of how illusory our concept of the world really is, that we are in a “dream,” and how achieving enlightenment depends upon effectively waking from that dream and then learning to put away worldly desires—to become indifferent to the material things around us and so free ourselves to pursue our own individual growth and spiritual awareness. The Yoga Vasistha is always at pains to remind us that everything we see and know, being as it exists within our “dream,” is false. The goal is always to awaken. And when we awake, if we eliminate all desire and our mind then enters its “no-mind” state, what is attained is the blissful Moksha, the pure extinction of all worldly thought and the complete liberation of the self. More than a treatise on enlightenment, the Yoga Vasistha also contains advice on politics, gambling, and even how to sow deception. It incorporates elements of Jainism, yoga, and Mahayana Buddhism. BS

c. 300 BCE

The Fifth (Parallel) Postulate

Euclid

The axiom of geometry written by Euclid that was unprovable

The Elements of Euclid (fl. c. 300 BCE) lays down the laws of geometry in five postulates. The first four are relatively intuitive; the fifth (parallel) postulate is not. It states: “If a straight line falling on two straight lines makes the interior angles on the same side less than two right angles, the two straight lines, if produced indefinitely, will meet on that side on which the angles are less than the two right angles.” For more than 2,000 years, this postulate eluded all attempts to prove it as a theorem. Although it was not as intuitively obvious as the first four postulates, many mathematicians believed that it could nevertheless be proved from them.

However, in 1829, mathematicians stopped trying to prove the fifth postulate, concluding that it was unprovable and should be discarded, and began to explore geometries that did not contain it. The Hungarian mathematician János Bolyai (1802–60) watched his father wrestle with Euclid’s fifth axiom. He too became hooked, so much so that his father begged him to give it up. However, Bolyai concluded that a proof was impossible and set about developing a new geometry that ignored it. When he published his geometry in 1831, the great mathematician of the time, Carl Friedrich Gauss (1777–1855) wrote, “To praise it would amount to praising myself. For the entire content … coincides almost exactly with my own meditations.” Moreover, in 1848 Bolyai discovered that a Russian named Nikolay Ivanovich Lobachevsky had also published virtually the same geometry as early as 1829.

These insights amounted to a radical new way of describing the shape of the physical universe. Gone was the flatness of a Euclidean world in which the parallel postulate holds true; this geometry created a strange, non-Euclidean, curved, spacetime world: as Bolyai realized, “a new, another world out of nothing.” JH
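The contrast can be illustrated numerically. In the Python sketch below (an illustration using Girard's theorem for spheres, not anything from Bolyai or Lobachevsky), a triangle drawn on a sphere has angles summing to more than two right angles, and the excess over 180 degrees measures its area; on a flat Euclidean plane, where the parallel postulate holds, the excess is always zero.

```python
import math

# Girard's theorem for a triangle on a sphere of radius R:
# area = R^2 * (sum of angles - pi). On a flat Euclidean plane the
# angle sum is exactly pi, so the "excess" (and this area term) is zero.
def spherical_triangle_area(a, b, c, radius=1.0):
    return radius ** 2 * ((a + b + c) - math.pi)

# A triangle covering one octant of a unit sphere (the pole plus two
# equator points 90 degrees apart) has three right angles: 270 degrees.
octant = spherical_triangle_area(math.pi / 2, math.pi / 2, math.pi / 2)
print(octant)  # pi/2, one eighth of the sphere's total area of 4*pi
```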

c. 300 BCE

Matrix

Ancient China

Rectangular diagrams that speed up the chore of making advanced calculations

A matrix is any rectangular array of numbers, symbols, or mathematical expressions arranged in rows and columns. The earliest recorded matrix—in a Chinese text, known in English as The Nine Chapters on the Mathematical Art, dating from some time between 300 BCE and 200 CE—was used to solve simultaneous equations. The word—which literally means “womb” and is related to mater (mother)—was introduced into common English usage in the nineteenth century by Cambridge University mathematician James Sylvester (1814–97), who demonstrated that any matrix could give rise to smaller determinants (or “minors”) through the removal of some of the original’s elements or entries.

Sylvester’s colleague Arthur Cayley (1821–95) then increased the practical applications of matrices by demonstrating that they form an algebraic system of their own. Significantly, this system does not always follow the normal laws of arithmetic: some rules (such as those of association and distribution) still apply, but others, such as the commutative law (that numbers may swap position in a calculation), do not in general.
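The failure of the commutative law is easy to demonstrate with small matrices. The sketch below uses plain Python lists; the matmul helper is invented for the illustration:

```python
# Multiply two matrices stored as nested lists: entry (i, j) of the
# product is the sum of A's row i multiplied entrywise by B's column j.
def matmul(A, B):
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

A = [[1, 2],
     [3, 4]]
B = [[0, 1],
     [1, 0]]

# The order of the factors changes the result: AB swaps A's columns,
# while BA swaps A's rows, so AB != BA.
print(matmul(A, B))  # [[2, 1], [4, 3]]
print(matmul(B, A))  # [[3, 4], [1, 2]]
```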

“Matrix computations are built upon a hierarchy of linear algebraic operations.”

G. H. Golub & C. F. Van Loan, Matrix Computations (2013)

The simplest matrices function as nothing more than shorthand notations for mathematical calculations. Advanced matrices, however, have a wide range of applications, not only in mathematics but also in economics, engineering, physics, and statistics, in which they expedite a range of calculations. Their latest use is in computer graphics, where they have made it possible to represent rotations and other transformations of images. GL

c. 250 BCE

Elixir of Life

Ancient China

A potion that promises an eternal existence on Earth for those who drink it

A sculpture of Amitayus (the Buddhist deity of longevity) holding a vase containing the elixir of long life.

The elixir of life is a hypothetical substance that, if consumed, holds the promise of immortal life. The origins of this idea can be identified in many ancient cultures. In particular, the idea of a potion that could extend life indefinitely was prevalent in East Asia from at least the third century BCE, and Emperor Qin Shi Huang (259–210 BCE) is known, in later life, to have ordered 1,000 men and women to go in search of the elixir. Several emperors of this era are known to have died from drinking substances returned to them from such expeditions. Similarly, the idea of an elixir of life, or Amrita, is described in ancient Hindu texts, and in ancient India the search for it never ceased.

The task of discovering or creating the elusive elixir was assigned to alchemists, and the idea that melted metals, particularly gold and mercury, would provide the basic substance of an elixir was taken up in both Eastern and Western cultures. The enduring nature of these metals was believed to confer itself upon those who consumed them, but the consumption of such metals had highly toxic and fatal effects.

“The water I give him will become in him a spring of water welling up to eternal life.”

The Bible, John 4:14

In more recent history, the search for immortality has in general been replaced by a more modest striving for prolonged and healthier life. However, contemporary science has not given up the search entirely. In the twenty-first century, research into microbial culture and probiotics is at the forefront of investigations into the possibility of eternal existence. Thus, the elixir of life may yet reveal itself to humanity’s future generations, who would surely question whether the elixir is something we truly desire. LWa

c. 250 BCE

Propositional Logic

Chrysippus

A revolutionary system of logic that unlocked our understanding of language

Propositional logic is a system of logic used to characterize the form and function of propositions within language. The first comprehensive system of propositional logic emerged in ancient Athens with Greek philosopher and logician Chrysippus (c. 280–206 BCE). This revolutionary branch of logic concerns the operation of whole propositions or statements, and the relationships between them. Accordingly, within propositional logic a statement, such as “Athens is the capital of Greece,” is treated as an indivisible whole. This can then be combined with other statements in order to create a more complex statement such as “Athens is the capital of Greece and the sun is shining in Athens.”

The system of propositional logic contrasts with the traditional syllogistic logic employed by Aristotle a century earlier, which focused on the operation of individual terms within statements. However, it was not until the mid-nineteenth century that the development of symbolic logic paved the way for a modern axiomatization of propositional logic. Thus, the first formulation of contemporary propositional logic can be attributed to the philosopher and logician Gottlob Frege (1848–1925).

“Logic, I should maintain, must no more admit a unicorn than zoology can.”

Bertrand Russell, philosopher

Perhaps the most striking aspect of Frege’s groundbreaking work in propositional logic was his claim that it represented a method of systematic inquiry even more fundamental than that of mathematics. This bold idea has since been the source of significant controversy in the study of logic and has given rise to key developments, including the notions of truth values and truth tables that operate on the premise that all statements must be either true or false. LWa
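A truth table simply enumerates every assignment of true and false to the component propositions and evaluates the compound statement under each. A minimal Python sketch (the propositions p and q stand for whole statements, as in the Athens examples above):

```python
from itertools import product

# Truth table for two connectives applied to whole propositions p and q;
# "p implies q" is rendered by its classical definition, (not p) or q.
rows = [(p, q, p and q, (not p) or q)
        for p, q in product([True, False], repeat=2)]

print(" p     q    | p and q | p -> q")
for p, q, conj, impl in rows:
    print(f"{p!s:5} {q!s:5} | {conj!s:7} | {impl!s:5}")
```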

c. 250 BCE

The Archimedes Principle

Archimedes of Syracuse

An explanation for the buoyancy of an object in water

Archimedes (c. 290–212 BCE) is arguably the best-known inventor of ancient Greece and perhaps one of the greatest mathematicians of all time. He is remembered for myriad inventions: from a device for raising water called the Archimedes Screw through to catapults that defended his home of Syracuse against invading Romans. The story of how the principle that bears his name was discovered is equally well known.

The Archimedes Principle dictates that a body submerged in a liquid is buoyed up by a force that equals the weight of the liquid it displaces. If the body weighs more than the weight of the water it displaces, it will sink; if it weighs less, it will float. These observations, immortalized in the two volumes of On Floating Bodies (c. 250 BCE), remain the first known investigations into the laws of buoyancy, making Archimedes the father of the science of hydrostatics.
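Stated numerically, the principle says that the buoyant force equals the density of the fluid multiplied by the displaced volume and by gravitational acceleration. The Python sketch below applies the sink-or-float rule from the text; the figures (fresh water at 1,000 kg per cubic meter) are assumptions for the illustration:

```python
WATER_DENSITY = 1000.0  # kg per cubic meter, fresh water (assumed)
G = 9.81                # gravitational acceleration, m/s^2

# Archimedes' principle: the upward force on a submerged body equals
# the weight of the fluid it displaces.
def buoyant_force(displaced_volume_m3, fluid_density=WATER_DENSITY):
    return fluid_density * displaced_volume_m3 * G

# A fully submerged body rises (floats) if its own weight is less than
# the weight of the displaced fluid, and sinks if it is greater.
def floats(mass_kg, volume_m3, fluid_density=WATER_DENSITY):
    return mass_kg * G < buoyant_force(volume_m3, fluid_density)

print(floats(mass_kg=700.0, volume_m3=1.0))   # True: less dense than water
print(floats(mass_kg=1300.0, volume_m3=1.0))  # False: denser, so it sinks
```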

“There, as he was sitting in the bath … He saw at once a way of solving the problem.”

Vitruvius, On Architecture (first century BCE)

According to legend, Archimedes stumbled on his principle while doing some detective work for Hieron, the king of Syracuse, who suspected that a goldsmith had stolen some of the gold intended for his crown. The Roman writer and architect Vitruvius recounted in his work On Architecture (first century BCE) how Archimedes took the crown to the public baths and “as he was sitting down in the tub, he noticed that the amount of water which flowed over the tub was equal to the amount by which his body was immersed.” Running, dripping wet, down the streets of Syracuse, he cried out “Eureka!” (I’ve found it!). But Galileo and others have questioned the accuracy of Vitruvius’s account, which was written 200 years after the actual event. JM

c. 150 BCE

Trigonometry

Hipparchus

The branch of mathematics that deals with the functions of angles

The Pyramids of Giza, Egypt, were built in c. 2500 BCE using a primitive form of trigonometry.

Trigonometry—a word derived from the Greek trigonon (triangle) and metron (measure)—began as the study of geometrical angles and the information that we may infer from them. As it developed, it became essential in astronomy, land surveying, mapmaking, and building design (notably Egypt’s pyramids); later it became vital in technologies as varied as radar and atomic energy.

The earliest known practitioner of trigonometry was the Greek astronomer and mathematician Hipparchus (c. 190–c. 120 BCE), who developed trigonometric tables primarily in the service of his work in astronomy. Prior to Hipparchus, trigonometry was not a recognized branch of mathematics: Pythagoras (c. 570–c. 495 BCE), for example, had spoken of arithmetic, geometry, harmonics, and astronomy. The earliest surviving work on trigonometry, and also the only surviving comprehensive ancient treatise on astronomy, is the Almagest (c. 150 CE), by the astronomer Claudius Ptolemy (c. 90–c. 168 CE), a Roman citizen who lived and worked in Egypt.

“Mathematical formulas … are wiser than we are, wiser even than their discoverers.”

Heinrich Hertz, physicist

In the sixteenth century the emergence of symbolic algebra and the invention of analytic geometry gave trigonometry a vast new range of applications: it became essential in the construction of accurate clocks, navigational equipment, and high-grade musical instruments. In a crucial development, Galileo Galilei used trigonometry to demonstrate that a projectile falling under the force of gravity moves uniformly in the horizontal direction even as it accelerates vertically, tracing a parabolic path. This finding was instrumental in the creation of a new science—ballistics—which made it possible to calculate the range of projectiles (originally cannonballs). GL
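The role of trigonometry in ballistics can be seen in the modern range formula for a projectile launched at speed v and elevation angle theta, ignoring air resistance: R = v^2 sin(2*theta) / g. The Python sketch below is an illustration with invented values, not Galileo's own calculation:

```python
import math

# Range of a projectile over level ground, ignoring air resistance.
# The factor sin(2*theta) is what makes 45 degrees the optimal angle.
def projectile_range(v, theta_deg, g=9.81):
    return v ** 2 * math.sin(2 * math.radians(theta_deg)) / g

r30 = projectile_range(100.0, 30.0)
r45 = projectile_range(100.0, 45.0)
r60 = projectile_range(100.0, 60.0)
print(round(r45, 1))  # roughly 1019.4 meters at the optimal 45 degrees
print(r45 > r30 and r45 > r60)  # True: 45 degrees maximizes the range
```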

c. 150 BCE

Yoga Sūtras of Patañjali

Patañjali

A collection of ancient texts providing the foundation for the practice of Raja Yoga, in which oneness with universal consciousness is achieved through disciplined meditation

A nineteenth-century miniature painting from Jodhpur, India, depicts a figure practicing pranayama, or “extension of the life force,” the yogic discipline of breath control.

Although the practice of yoga predates the author and compiler Patañjali (fl. c. 150 BCE), it is he who is credited with collecting together what has become the canonical text of Raja (Royal) Yoga. Originally written in Sanskrit, the text contains 196 sūtras (rules) organized into four padas (chapters) that communicate the theoretical foundations of the discipline.

“The restraint of the modifications of the mind-stuff is Yoga.”

Yoga Sūtras of Patañjali (c. 150 BCE)

Each sūtra is a short aphorism stating one, or part, of the philosophical tenets of Raja Yoga. As we are told in the sūtras, yoga involves training a person’s mind through meditation and contemplation to overcome that which is disturbing and unsettling to it. Yogic training, according to the sūtras, is divided into eight limbs, each of which prescribes disciplines that must be adopted in various aspects of life in order to achieve moksha (liberation), the ultimate goal of yoga.

Moksha occurs when the practitioner of yoga is freed from their sense of self. Patañjali claims that union or integration of the self with the Supreme is the result of the subject restraining the fluctuations of their ego, controlling cognition, and finally annihilating the ego. That is, the practitioner ceases to identify themself as a singular individual and instead identifies with a universal consciousness.

Yoga is one of the six orthodox schools of Hindu philosophy, and, as such, its philosophies and practices are ancient. Indeed, various concepts and cognate uses of the term “yoga” were foundational to many Eastern religions, including Buddhism and Jainism. Although the yoga systems share common roots, there are two predominant schools: the Raja Yoga described in the Yoga Sūtras of Patañjali, and the Hatha Yoga taught by Yogi Swatmarama (fifteenth to sixteenth century CE). It is this latter form of yoga, with its greater emphasis on the body’s role in meditation, that is more commonly taught in health clubs and yoga studios today. DM

c. 131 BCE

News Sheet

Ancient Rome

Distributing information rapidly to many people at the same time

Before the invention of writing, the only way to spread news was orally, from person to person or from a speaker to a crowd. It was not until the advent of the Roman Republic that what might be called the first proto-newspaper arose: the Acta Diurna. First appearing in about 131 BCE, the Acta Diurna—Latin for “daily acts”—were public notices about important events in the Republic. The Romans carved news items on stone tablets or metal sheets and posted copies in locations where the public could gather and view them; those able to read would share the information with those of their fellows who could not.

It was not until after the invention of the printing press by Johannes Gutenberg of Mainz, Germany, in 1450 that the first modern newspaper emerged. Published by Johan Carolus in Strasbourg, Germany (now France), in 1605, the Relation aller Fürnemmen und gedenckwürdigen Historien (Account of All Distinguished and Commemorable News) was a book-sized, weekly publication. Soon after its appearance, numerous other newspapers arose in Europe, later spreading to various parts of the world. The first magazine appeared more than a century later, in London in 1731. Named The Gentleman’s Magazine, it marked the first use of that term for a publication of varied content. Newsprint remained the prime way in which most people obtained news until it was partly superseded by radio, television, and, later, the Internet and social media.

Newspapers serve to answer our need for new information, for stories about what is going on in the world other than that which we can see for ourselves. They teach, entertain, satisfy our curiosity, and indulge our desire for gossip. The newspaper was also the earliest means of spreading information quickly, cheaply, and efficiently to large numbers of people: it was therefore the first mass medium. MT

c. 100 BCE

Bodhisattva

Mahayana Buddhism

Buddhists who dedicate their lives to helping all beings achieve enlightenment

The idea of the bodhisattva developed with the rise of Mahayana Buddhism during the first century BCE. Mahayana means “greater vehicle,” and the tradition considers itself superior to Theravada Buddhism because of doctrines such as the bodhisattva ideal.

Bodhisattva means “awakening-being” and refers to Buddhists who have dedicated themselves to helping all beings achieve enlightenment. The Theravada School asserts that the goal of Buddhism is to become an arhat, an enlightened being who has attained nirvana and can exit samsara (the cycle of rebirth). The Mahayana concept of the bodhisattva takes the arhat a step further, asserting that after attaining enlightenment, one can choose to remain in the cycle of rebirth to teach others until all sentient beings have become enlightened. This is motivated by bodhicitta, a universal sense of compassion for all beings that is awakened when one realizes enlightenment. Bodhisattvas practice an ethic that involves the “exchange of self and other” (paratmaparivartana) in which they place the welfare of other sentient beings above their own. This is the ultimate expression of anatman (no-self) since it eradicates the distinction between a person’s own good and the good of other beings. This sometimes requires them to make use of upaya (skillful means), which are actions that might violate conventional moral precepts but nonetheless promote enlightenment.

“I have made the vow to save all beings. All beings I must set free.”

The Buddha

The Bodhisattva ideal pervades all Mahayana traditions. Serious practitioners take the bodhisattva vow, swearing to become Buddhas so that they can bring all sentient beings to enlightenment. JM

c. 100 BCE

Memento Mori

Ancient Rome

A Latin phrase that reminds us all of the end that inevitably awaits us

A seventeenth-century memento mori painting by Philippe de Champaigne.

Memento mori is a Latin phrase that is usually rendered in English as “Remember: you will die.” It most probably originated as a proverb, but according to one popular folk story it was first used in the first century BCE by a Roman slave who, having seen his master ride in triumph through Rome after a military victory, used the expression to remind his master of his mortality.

Whatever the truth of that tale, the term was taken up enthusiastically by the Christian religion, which emphasized divine judgment and the transience of all earthly things. Memento mori provided the religion with a moralistic counterpoint to another well-known Latin expression: nunc est bibendum (“now is the time for drinking”—in other words, do not defer gratification). Memento mori encapsulates the outlook of many people who believe that our actions in this life will be rewarded or punished in the next.

By extension, memento mori has come to be used as a generic label for funerary architecture that, in addition to commemorating the deceased, also reminds the living of the transience of earthly existence. Examples of this include relief sculptures on tombs depicting human skulls or angels snuffing out candles. The term is also used for the admonitory captions that may accompany such details: the entrance to a chapel in Évora, Portugal, for example, has above it the inscription “Nós ossos que aqui estamos pelos vossos esperamos” (We, the bones that are here, await yours).

In painting, there is a genre of still life that is referred to as vanitas, Latin for “vanity.” Such compositions include symbols of mortality, most commonly the skull, and were intended as sobering daily reminders of mortality. There is also a long and strong tradition of memento mori works in literature, particularly poems. Among the best known of these in English is “Elegy Written in a Country Church Yard,” written in the eighteenth century by Thomas Gray. GL

c. 100 BCE

Societas Publicanorum

Ancient Rome

The first example of a private corporation with publicly traded shares

A first-century marble relief from Pompeii, Italy, that was used as the sign to a coppersmith’s shop.

The British East India Company, founded as a joint-stock company in 1707 from earlier incarnations dating back to 1600 and the time of Elizabeth I, is cited as the earliest predecessor of the modern publicly traded corporation. However, recent scholarship suggests that the corporation has roots in the first century BCE, in Roman institutions known as societas publicanorum.

The more or less uniform geographical expansion of Roman political power allowed for the creation of large markets aided by certain technological advances. For example, advances in mining technology increased the rate and volume of mineral extraction, and agricultural innovations increased food production. With that increased production came higher revenues from taxation, greater usage of public property, and stronger demand for public works in general. In order to cope with this demand, the Roman authorities effectively outsourced certain administrative responsibilities via contract to a new kind of political institution, the societas publicanorum.

Roman law permitted private individuals to form associations (societas) comprising various partners (socii) for the purpose of bidding on and fulfilling such contracts. These associations were permitted to seek external financing by selling shares (partes) in the association. In an arrangement similar to that offered by modern corporations, investors did not share the contractual obligations or liabilities of the association itself. And, just like shares in modern corporations, partes could be bought, sold, and traded, which made them a more attractive proposition.

The historical significance of societas publicanorum cannot be overstated; not only are they the prototypes for the dominant vehicle for business in modern times, they are also likely the earliest models of private provision of public services—in other words, privatization—itself a common practice almost 2,000 years later. DM

c. 27 BCE

Universal Language

Ancient Rome

The concept of a single language that can be understood and spoken by everyone in the world

A Latin engraving on the tombstone of the Roman cavalry soldier Publius Cornelius, who participated in the expedition of Aelius Gallus to Yemen in 26–25 BCE.

The idea of a “universal language” refers to either a hypothetical or historical language spoken by most or all of the world’s population. Some mythological or religious traditions posit a time when all people spoke the same language, but there is no evidence for this hypothesis. Latin was one of the first languages to achieve a semblance of universality, thanks to the expansion of the Roman Empire (established in 27 BCE). Latin became (in effect) the official language of government, business, religion, education, and law across much of Western Europe, and remained so until the sixteenth century (due in part to the dominance of Latin in the Roman Catholic Church).

“It would be very difficult to form … this language … but very easy to understand it …”

Gottfried Leibniz, mathematician and philosopher

With the decay of the Western Roman Empire in the fifth century, regions that it had unified began to fall away from central control, and the Latin dialects used in everyday speech gradually diverged into distinct languages (the Romance languages). As trade increased despite the decline of everyday Latin, so did the desire for an international trade language, along with attempts to construct a universal language in general. Gottfried Leibniz (1646–1716) was one of several thinkers to make efforts toward what he called a “universally characteristic language.” Leibniz was impressed by the way that the Egyptian and Chinese languages used graphic expressions for concepts, and proposed a universal system of writing composed of symbols. Inspired by Leibniz, Giuseppe Peano (1858–1932) developed the Interlingua language in 1903, based on a simplified form of Latin.

The pursuit of a universal language continues today. For example, existing constructed languages—most famously Esperanto but also the less well-known Ido—may still give grounds for a future universal language. Both were designed to be international auxiliary languages, or universal second languages. JE

c. 50 CE

The Second Coming

Early Christians

The notion that Jesus will return and announce himself once more on Earth

It is a central belief of the Christian faith that one day Jesus Christ, the Son of God, will return and once again make himself known to the peoples of the world. Furthermore, it is believed that Christ’s “second coming” will bring heaven on Earth. The belief stems from a number of canonical biblical prophecies that, taken together, support the interpretation that Christ will return.

There is no expectation that the world’s transition toward heaven will be smooth, however. The period immediately before Christ’s return is anticipated to be marked by the emergence of false prophets and the Antichrist, with Christ and the Antichrist having to battle for the future of humanity. It is evident from the Gospels and the works of Paul (believed to have been written sometime around 50 CE) that Christ’s return was anticipated to be an imminent event. The epoch of the return of Christ is expected to be characterized by the destruction of existing worldly empires, followed by the establishment of a kingdom lasting 1,000 years. The majority of Christian theologians in history have believed that the reappearance of Jesus may occur at any moment, and have therefore advised that Christians should always be ready for it.

In modern times, the idea of the second coming of Christ has proven influential in works of fiction, as has the figure of the Antichrist who opposes him. It inspired the supernatural horror movie series The Omen—initially released as a novel by David Seltzer in 1976—and also Stephen King’s novel The Stand (1978). Individuals claiming to be the second coming have included Sun Myung Moon (of the Unification Church), Jim Jones (instigator of the Jonestown mass suicide in 1978), David Koresh (behind the Waco, Texas, mass suicide in 1993), and Marshall Applewhite (cause of the Heaven’s Gate mass suicide in 1997). JE

c. 50 CE

Grace

Early Christians

The belief that we need God’s help to live a good life and earn a place in Heaven

Grace, strongly developed in early Christian thought, is the idea that we need God’s help in living a good life—which is defined as following God’s will—in order that after death we may return to God in Heaven. Grace, in its specifically metaphysical meaning, is understood to be a divine act bestowed from God as a manifestation of His love and kindness. In Christian thought, God’s ultimate act of grace is considered to be the sacrifice of His Son, Jesus, for the salvation of humankind.

“This grace … conducts all … to the state of the highest perfection and glory.”

Augustine of Hippo, theologian

The life of grace is taken to be characterized by divine favor, and is marked by dedicated active service to God and obedience to His commandments. As Christian doctrine evolved, grace specifically came to mean salvation from sin, with grace being an active means of healing the effects of evil in the lives of believers. The notion of grace is associated with the Christian belief in original sin—that Adam’s sin is visited on all humanity, with the result that all of humankind has “fallen” from grace and would remain in that state without the intervention of, and human belief in, Jesus as the Son of God, a manifestation of the one true God on Earth. It is because the salvation of humanity could not occur without Jesus’s self-sacrifice that his act has taken on the meaning of the ultimate act of divine grace in Christian theology. The concept of grace as an undeserved divine action, closely aligned with acceptance of Jesus as the Son of God and a life dedicated to serving God and spreading Christianity, has been a central element of Christianity for centuries, and can perhaps be taken as the central doctrinal element of the Christian faith. JE

c. 50 CE

Plainchant

Early Christians

The idea of unaccompanied singing in unison of Christian liturgies

A carving of a man playing the third tone of plainchant on a cithara, c. 1118–20, from Cluny Abbey, France.

Although the music itself dates from the first century CE, the so-called Apostolic Age, the term “plainchant,” describing unaccompanied singing of Christian liturgies, appeared in the thirteenth century and is derived from the Latin cantus planus (plain song). The term is often used in a more restricted sense, however, referring to the Roman Catholic Church’s sung liturgy, Gregorian Chant, which is misattributed to Pope Gregory I (540–604). The plainchant repertoire was refined, particularly following the Council of Trent (1545–63), when it was standardized in order to create a common liturgical practice. Centuries later, the Second Vatican Council of 1963 replaced Latin with vernacular languages, and introduced new liturgical music.

The standardization of the Roman Catholic Church’s liturgical materials as an aid to expansion was instrumental in the development of musical notation, although this was not recognized as an important consequence in early medieval sources. Dating from the ninth century, the early notation of chant, with neumes representing notes or groups of notes, would gradually develop into the highly precise system of symbols for pitch, rhythm, and dynamics used today. Notation was crucial to the expansion of composition from a single melodic line to polyphony, and in medieval times plainchant melodies were sometimes elaborated into complex polyphonic compositions.

While plainchant has been of enormous importance for the church through the centuries, and remains so, it reached a general audience during the mid-1990s with the release of the album Chant by The Benedictine Monks of Santo Domingo de Silos, which sold three million copies in the United States. The popularity of the plainchant-like works of Hildegard von Bingen (1098–1179) has also helped plainchant to become a vehicle for new-age aesthetics, and to generate an interest in other historical female composers. PB

c. 55 CE

Three Theological Virtues

Early Christians

Three positive character qualities associated with salvation

An artwork by William Blake (c. 1827) for Dante’s The Divine Comedy, which features the three theological virtues.

The Greek philosopher Aristotle (384–322 BCE) defined virtue as “a positive trained disposition.” The ancients believed in four cardinal virtues, namely wisdom, justice, temperance, and courage, but held that because all four are aimed at the good of humankind, they are never wholly separable from one another. Using Paul’s First Epistle to the Corinthians 13:13 (written c. 55 CE) as their proof text, the Church Fathers added to this list three more virtues, ones especially connected with the nature of God and the perfection of the cardinal virtues. These were faith, hope, and love.

Believers would say that faith has to do with holding on to what they know to be right, despite emotions to the contrary. Faith is the emotive thrust that keeps them trusting in God—who they have good reason to believe exists and is good—even when He feels absent. The virtue of faith helps them to act rationally, even when it is hard to do so.

Hope is the rational desire to be with God and attain true happiness therein. Insofar as reason helps believers to discern God’s existence and good character, hope is the proper desire to want to be with God. Hope—a rational desire that is grounded in rational possibilities—may be contrasted with “wish,” which is no more than an irrational desire for some impossible future happiness.

Love—that is, agape love—disposes believers to sacrifice themselves for others in a manner that perfects justice, but never contradicts it. Justice requires a person to treat others correctly—to provide for the basic needs of one’s family, for example—but love goes beyond this; it might, for example, oblige a father to lay down his very life for his family. The three theological virtues have played an enormous role in Western, especially Christian, thinking about rationality and character development, and their importance shows no sign of abating. AB

c. 70 CE

Terrorism

Jewish Zealots

The pursuit of religious, political, or ideological goals through creation of fear

Herod’s Palace at Masada, Israel, where Jewish Zealots committed suicide after defeat by the Romans in 73 CE.

Terrorism, the idea of deliberately creating terror by targeting attacks on people and/or buildings for a variety of purposes, originated in around 70 CE in the practices of the Jewish Sicarii—known for their use of a short sword (sica)—who killed wealthy Jewish collaborators during the period of Roman rule over the Jewish people. Later examples of terrorist groups include the Assassins, a Shiite group in eleventh- and twelfth-century Persia, who killed in order to oppose efforts by Sunni practitioners to suppress their religious beliefs; and, in eleventh-century India, a group called the Thugs, who killed to nourish the goddess Kali with blood, threatening Indian society until the colonial era.

“How can you have a war on terrorism when war itself is terrorism?”

Howard Zinn, historian and social activist

The concept of state terrorism is more controversial: scholars disagree over whether the label “terrorism” is appropriate for actions or campaigns coordinated by nation-states pursuing particular goals under conditions in which only guerrilla warfare is possible. Assuming it can be so called, state terrorism is also an important topic of analysis. It characterizes many actions taken by right-wing dictatorships (such as Germany under Hitler or Indonesia under Suharto) and by left-wing totalitarian states (such as the U.S.S.R. under Stalin or China during the Cultural Revolution under Mao). Contemporary “liberal-democratic” capitalist states have also been implicated, sometimes indirectly (as in U.S. support for right-wing client-state dictatorships such as Pinochet’s Chile, or Guatemala after the 1954 coup) and sometimes directly (as in the contemporary U.S. tactic of extrajudicial killings by drone strike). JE

c. 77 CE

Encyclopedia

Pliny the Elder

A comprehensive collection of knowledge, often found in book form

Roman naturalist and writer Gaius Plinius Secundus, or Pliny the Elder (23–79 CE), compiled the world’s first encyclopedia some time around 77 CE when he wrote Naturalis Historia, or Natural History. This ambitious work was an attempt to bring together knowledge on every subject in the natural world, from zoology and astronomy to classical art. While Pliny’s work was as comprehensive as he could make it, it was organized into chapters that addressed individual fields of study, not into individual entries on specific topics. European, Muslim, and Chinese scholars created more encyclopedias in the centuries that followed, but it was not until the early eighteenth century that the first modern versions of the encyclopedia emerged. Numerous encyclopedias or compendiums appeared that, like Pliny’s work, contained information on myriad topics and subject areas, while at the same time organizing that knowledge into an easier-to-use format. Individually tailored entries, accompanied by drawings, diagrams, and cross-references that linked them to related material, became the standard for all encyclopedias.

“To me the charm of an encyclopedia is that it knows—and I needn’t.”

Francis Yeats-Brown, British army officer and writer

What is the sum total of human knowledge? This is the core question that encyclopedias seek to answer. Whether they cover a particular discipline or try to capture the breadth of understanding, encyclopedias offer the reader the promise of being able to know the answer to any question simply by reaching for the proper volume. As technology advanced and book searches became replaced by keystrokes, the ability to have all knowledge at the tips of your fingers became almost instantaneously attainable. MT

c. 99 CE

Social Welfare

Emperor Trajan

Services provided by a state’s government to benefit its citizens

A silk painting from c. 1178, showing lohans (enlightened disciples of the Buddha) bestowing alms.

Social welfare is the idea that the government of a state is responsible for providing a minimal level of well-being for its citizens. While charity is an ethical and religious principle intended to guide behavior, social welfare is a political principle promoting a broader social responsibility to help those in need. The first widespread social welfare programs were instituted in the Roman Empire, and the Emperor Trajan (53–117 CE) established the first and most extensive of these.

Justifications for social welfare programs have included religious beliefs, ethical requirements, and political calculations. Most major world religions teach the importance of care for the poor, and historically governments able to provide a level of subsistence for their populations have been better able to avoid some of the problems of poverty that lead to social instability. The Islamic Caliph Umar (579–644) identified a public aspect to the moral obligation or Pillar of Islam requiring charity, and established taxes to pay for public services. In another well-known historical example of social welfare, the Song Dynasty in China (960–1279) implemented government assistance in the form of healthcare and public housing.

“Power has only one duty—to secure the social welfare of the People.”

Benjamin Disraeli, British prime minister 1874–80

Today, every democracy in the world implements some form of social welfare program, and most citizens of those countries believe that it is important to provide resources for those unable to provide for themselves. Such programs are often controversial, however, provoking debates about who deserves assistance, how much assistance should be given, and what percentage of public resources should be used for them. TD

c. 100 CE

Perfect Number

Nicomachus

A natural number equal to the sum of all its divisors except itself

The concept of a “perfect number,” a positive integer equal to the sum of its proper divisors (those divisors less than the number itself), dates back to the Neo-Pythagorean philosopher and mathematician Nicomachus of Gerasa (c. 60–c. 120 CE), who authored the Introduction to Arithmetic, the standard text on arithmetic for 1,000 years. The first perfect number is six (1 + 2 + 3), and the next three are 28 (1 + 2 + 4 + 7 + 14), 496, and 8,128. All four were known to Nicomachus, who identified 8,128 as a perfect number as early as 100 CE. Nicomachus classified all numbers as “deficient,” “perfect,” or “superabundant,” depending on whether the sum of their proper divisors was less than, equal to, or greater than the number itself.
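Nicomachus’s threefold classification is easy to state algorithmically. The following Python sketch (an illustration, not from the source) compares each number with the sum of its proper divisors:

```python
def proper_divisor_sum(n):
    """Sum of the divisors of n that are less than n itself."""
    if n < 2:
        return 0
    total = 1  # 1 divides every n > 1
    d = 2
    while d * d <= n:
        if n % d == 0:
            total += d
            if d != n // d:  # avoid counting a square root twice
                total += n // d
        d += 1
    return total

def classify(n):
    """Nicomachus's classification: 'deficient', 'perfect', or 'superabundant'."""
    s = proper_divisor_sum(n)
    if s < n:
        return "deficient"
    if s == n:
        return "perfect"
    return "superabundant"

# The four perfect numbers known to Nicomachus:
print([n for n in range(2, 10000) if classify(n) == "perfect"])  # [6, 28, 496, 8128]
```

Pairing each divisor d with its cofactor n // d keeps the search below the square root of n, so even a naive scan recovers Nicomachus’s four examples almost instantly.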

“[Nature seems] to have been determined and ordered in accordance with number.”

Nicomachus, Introduction to Arithmetic (c. 100 CE)

The study of numbers was of fundamental (and commonly spiritual) importance to Pythagorean and Neo-Pythagorean thinkers—Nicomachus himself wrote a two-volume work titled The Theology of Numbers on the mystic properties of numbers (only fragments of which survive). This tradition of study led to a number of early mathematical discoveries, including that of the first four perfect numbers.

Perfect numbers have proven elusive; as of 2007, only forty-four had been found. Every perfect number identified so far is even, and each fits the form given by the Euclid–Euler theorem: 2^(n−1) × (2^n − 1) is perfect whenever 2^n − 1 is a Mersenne prime (a prime of the form 2^n − 1, named for the French monk Father Marin Mersenne, 1588–1648). JE
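The Euclid–Euler correspondence can likewise be sketched in a few lines of Python (again an illustration, not from the source; trial-division primality stands in for the much faster Lucas–Lehmer test used in real Mersenne-prime searches):

```python
def is_prime(p):
    """Simple trial-division primality test (adequate for small p)."""
    if p < 2:
        return False
    i = 2
    while i * i <= p:
        if p % i == 0:
            return False
        i += 1
    return True

def even_perfect_numbers(max_exponent):
    """Euclid-Euler: 2**(n-1) * (2**n - 1) is perfect exactly when
    2**n - 1 is a Mersenne prime."""
    return [2**(n - 1) * (2**n - 1)
            for n in range(2, max_exponent + 1)
            if is_prime(2**n - 1)]

print(even_perfect_numbers(13))  # [6, 28, 496, 8128, 33550336]
```

Note that a prime exponent n does not guarantee a Mersenne prime (2^11 − 1 = 2,047 = 23 × 89), which is why each candidate must be tested individually.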

c. 125

Gnosticism

Valentinus

The belief that gnosis (esoteric or intuitive knowledge) can lead to the soul’s salvation

Gnosticism, from the Greek gnosis (knowledge), is a contemporary term that refers to a number of schools of spiritual thought, each emphasizing the attainment of revealed knowledge of the divine as the means for spiritual salvation and transcendence of the physical world. While gnostic tendencies may be found in a number of world religions, the term “gnosticism” is often used to refer to a set of “gnostic” Christian schools in the second to third century CE, the period of early Christianity before orthodoxy had become enforced. A key leader of the Gnostic movement was Valentinus (c. 100–c. 160 CE), who founded an influential sect in the second century in Rome.

“ … God invisibly cooperates with what has been modeled to lend it credence.”

Valentinus

Valentinian gnosticism presupposes a central being, Bythos, from whom emanate three pairs of aeons, or beings, that represent cosmological opposites (such as male and female); from these three pairs emanate others, making an overall total of thirty aeons. All the aeons together constitute the realm of spiritual being (the pleroma). But the last aeon, Sophia, sinned through having an offspring, Achamoth, who created a rival world (kenoma, Greek for “vacuum”); a rival imitator creator “deity,” the Demiurge (identified with the God of the Old Testament), generates the physical universe. Gnostic Christianity, though relating to both the Old Testament and the figure of Christ, incorporates elements of pagan gnosticism and Platonism. Knowledge of Valentinus and gnosticism increased with the discovery in 1945 of the Nag Hammadi library, a number of preserved early gnostic Christian texts, discovered buried in Nag Hammadi in upper Egypt. JE

c. 150

Easter

Melito of Sardis

A period of celebration commemorating the death and resurrection of Jesus Christ

Christ Carrying the Cross (c. 1500) by Hieronymus Bosch.

The Easter feast, celebrating the death and the resurrection of Jesus Christ, is among the most important Christian feasts. Easter was first mentioned in a mid-second century Paschal homily believed to be written by Melito of Sardis (d. c. 180) for reading aloud on the morning of Pascha, an earlier name for the feast. Originally, Easter was observed with Jewish Passover, but after the first Council of Nicaea in 325 it was declared that Easter should be observed on Sunday, held to be the day of the resurrection of Christ. The date was movable, being the first Sunday after the first full moon after the spring equinox.
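The movable-date rule is computed in practice by the ecclesiastical calculation known as the computus, which uses a tabular “ecclesiastical full moon” rather than astronomical observation. A Python sketch of the modern (post-1582) Gregorian computus, in the anonymous Meeus/Jones/Butcher formulation, follows; it is offered as an illustration and is not part of the original rule fixed at Nicaea:

```python
def gregorian_easter(year):
    """Date of Easter Sunday in the Gregorian calendar, via the
    anonymous Meeus/Jones/Butcher computus. Returns (month, day)."""
    a = year % 19                        # position in the 19-year Metonic cycle
    b, c = divmod(year, 100)             # century and year-within-century
    d, e = divmod(b, 4)                  # century leap-year corrections
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30   # epact: age of the ecclesiastical moon
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7 # days to the following Sunday
    m = (a + 11 * h + 22 * l) // 451
    month, day = divmod(h + l - 7 * m + 114, 31)
    return month, day + 1

print(gregorian_easter(2024))  # (3, 31): March 31, 2024
print(gregorian_easter(2025))  # (4, 20): April 20, 2025
```

The arithmetic encodes the Metonic lunar cycle and the Gregorian leap-year corrections purely with integer operations, which is why the church could tabulate Easter centuries in advance.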

Two of the most common symbols of Easter are the egg and the rabbit. The egg symbolizes new life breaking through the hard shell, which represents the seeming finality of death. This symbolic interpretation of the egg likely predated Christianity, but was adapted to represent Christ’s return from death and his coming forth from the tomb. The Easter rabbit, too, is likely to hark back to cultures predating Christianity, for which the appearance of the rabbit in the landscape symbolized the coming of spring (the animal’s renowned fertility reinforcing the association). The rabbit has been adopted by many Christian cultures, but has not taken on any specific Christian meaning comparable to that of the egg.

“Easter is the demonstration of God that life is essentially spiritual and timeless.”

Rev. Charles M. Crowe, pastor

In Germany the egg and rabbit Easter symbols were united in the notion of an egg-laying hare, one that, after importation into the United States in the eighteenth century, came to lay the chocolate eggs now loved by children the world over. JE

c. 180

Free Rein Defense

Irenaeus

The Christian argument that the existence of evil is consistent with natural laws

A portrait (c. 202) of St. Irenaeus, bishop of Lyons. His main work, Against Heresies (c. 180)—and, indeed, all his other writings—was devoted to refuting gnosticism.

For theologians, the concept of evil is a challenge to the propositions that God is omniscient, omnipotent, and omnibenevolent. The problem of evil (responses to which are known as theodicies) is this: given God’s omniscience, God must know that evil exists in the world. This being the case, God must be either incapable of preventing evil, and therefore not omnipotent, or permissive of evil, and therefore not omnibenevolent. Stated plainly, the existence of evil is a problem because it challenges the conventional monotheistic concept of God.

“For when strength was made perfect in weakness, it exhibited the benignity of God, and His most admirable power.”

Irenaeus, Against Heresies (c. 180)

Two distinct approaches to this problem have emerged: one inspired by Augustine of Hippo (354–430), the other inspired by Irenaeus, bishop of Lugdunum, now Lyons (c. 140–200). The Free Rein Defense, or FRD, is most closely aligned to the Irenaean approach. Irenaean theodicy upholds the utility of evil in humankind’s development as moral agents. The FRD proposes that the universe must operate in accordance with certain natural laws (such as the laws of classical physics) because, were this not the case, cause and effect would break down; it would be impossible to predict (even vaguely) the effect of any given cause; and effective action would be impossible. In other words, if the laws that govern the universe were not given free rein to unfold as they may, human beings, along with every other denizen of the universe, would be little more than God’s puppets, and concepts such as free will, intention, piety, sin, and even good and evil would be nonsensical. For this reason, the defense continues, God allows the complex chains of causes and effects to unfold unimpeded. As a consequence, however, the natural laws ultimately and inevitably give rise to evil and suffering. Today the FRD and its underlying principles have come under severe attack, especially from a secular culture informed by existential nihilism, which insists that life’s meaning must be created by humankind. DM

c. 180

Creatio Ex Nihilo

Irenaeus

The belief that an all-powerful god created the universe out of nothing

The Ancient of Days, an illustration by William Blake for his book Europe: A Prophecy (1794). The book was engraved on eighteen plates, and survives in just nine known copies.

Standing in contrast to proposals that the universe was created out of preexisting chaos, the Christian doctrine of creatio ex nihilo—that God directly created the universe from nothing—originated at the time in early Christian history when no established church doctrine had yet been consolidated. The doctrine found an able expositor and defender in Irenaeus, bishop of Lugdunum, now Lyons (c. 140–200).

“God the Creator … created all things, since He is the only God … alone containing all things, and Himself commanding all things into existence.”

Irenaeus, Against Heresies (c. 180)

Irenaeus developed the doctrine of creatio ex nihilo specifically to counter popular gnostic teachings that the material world was not the direct work of God, but instead was the work of a lesser god who fashioned it from preexisting matter. The gnostic idea of the world followed the Platonic view, which itself set up a dualism between ultimate reality (that of God) and the empirical world—the latter, according to the gnostics, being inferior, illusionary, and an obstacle and distraction that prevents us from attaining knowledge about or inhabitation of the “real world.”

The gnostics, following Platonic dualism, viewed the empirical world as evil, essentially because it stood between the believer and God. In contrast, Irenaeus upheld that, first, the created world is real, rather than a false veneer hiding the real, hidden world; second, it is created directly by the true God rather than being the work of a lesser, subordinate deity; and, third, it is created from nothing rather than from preexisting and chaotic, imperfect matter. Irenaeus, then, fundamentally attributes all that exists to a single, perfect entity, and in so doing attempts to eradicate the notion that creation can be evil. The notion of creatio ex nihilo underwrites Christian cosmology, setting the stage for numerous other Christian beliefs about creation, while simultaneously underwriting the Judeo-Christian metaphysical view that there is only one God, who is fundamentally superior to and beyond all else in creation. JE

c. 180

Original Sin

Irenaeus

The concept that Adam’s sin with Eve in the Garden of Eden has tainted all of humankind

The left-hand panel of a diptych (c. 1479) by Hugo van der Goes, showing the fall and redemption of Man.

The notion of “original sin,” hereditary guilt passed from Adam throughout the lineage of humankind, is not explicitly developed in the Old Testament, though it is consistent with a number of passages throughout Genesis and the Old Testament in general. Such a notion was initially developed in the works of Irenaeus (c. 140–200), who taught that evil came into the material world through Adam’s sin. Irenaeus developed this doctrine in order to combat the gnostic sects in their belief that the world of matter was both inherently evil and the product of a lesser deity, rather than the creation of the true, good God (whose creation contains evil only as a consequence of humankind’s disobedience).

“Original sin … appears to be a hereditary depravity and corruption of our nature …”

John Calvin, theologian

While the doctrine of original sin was initially developed by Irenaeus, its precise articulation came from Tertullian (c. 155–220), St. Cyprian (200–258), and St. Ambrose (339–397). They taught what might be called the core of the doctrine of original sin, namely that the sin that originated in Adam is passed down through the human generations. St. Augustine (354–430) and St. Anselm (c. 1033–1109) advanced the doctrine, which became canonically central to orthodox Catholic and Christian faith. “Original sin” provides a justification for the existence of the Church (the Church would be needed to promote the true way in a fallen world) in addition to a motivating factor for joining it (humankind needs the intervention of the Church to find salvation). The doctrine of original sin, long in development, is a fundamental element of the Christian faith and has proven key to securing the importance of the Church in the lives of believers. JE

c. 180

Vitalism

Galen

The belief that some special “living” quality exists beyond the parts of animate life

Vitalism is a broadly metaphysical doctrine that conceives of a living being as being distinguished from nonliving beings by virtue of some “essence” that is particular to life and which exists above and beyond the sum of that living being’s inanimate parts.

The Greek medical practitioner Galen (c. 130–c. 210) held that spirit (pneuma) was the essential principle of life and took three forms—animal spirit (pneuma psychicon), which occurred in the brain; vital spirit (pneuma zoticon), which occurred in the heart; and natural spirit (pneuma physicon), which resided in the liver. Galen’s ideas remained influential for as long as 1,400 years after his death.

An earlier form of vitalism had been proposed by Aristotle (384–322 BCE), and his works On the Soul and On the Generation of Animals (both c. 350 BCE) became canonical vitalist treatises. Aristotle held that the soul (the psyche) is what attributes organizational unity and purposeful activity to a living entity. Later thinkers to offer a more advanced version of the theory included the philosopher and biologist Hans Driesch (1867–1941). Driesch also articulated a philosophy highlighting what he took to be an essential, autonomous, and nonspatial psychoid or mindlike essence behind living things, referring to his experiments with sea urchin embryos, in which separated cells developed into whole organisms.

“After death … I will no longer move, no longer sense, nor speak, nor feel, nor care.”

Aristotle, philosopher

Galen was responsible for particular conceptual errors that persisted long after him, and vitalism as a theory of life has few adherents within contemporary biology. Nevertheless, there is no question that both Galen and the vitalist tradition had an enormous influence on Western thinking. JE

c. 180

Bloodletting

Galen

The theory that the sick may be healed by relieving their bodies of excess blood

An illustration of bloodletting and skin disease treatment from a medical treatise of 1519.

Bloodletting is the practice in medicine of withdrawing blood from patients to prevent or cure illness or disease. While the ancient Egyptians and Greeks were among the first to practice medical bloodletting, it was the Greek physician Galen (c. 130–c. 210) who produced a systematic bloodletting treatment in the second century—a development that would result in much loss of life over the following centuries.

In accordance with the Hippocratic school of medicine, Galen regarded health as a matter of balancing the four humors—blood, phlegm, yellow bile, and black bile. Of these, he was particularly interested in blood, which he demonstrated to flow through the arteries of living animals. He believed that illness was often due to a superabundance of blood and therefore recommended bloodletting by venesection (opening a vein by incision or puncture) to remedy it, in part because venesection is a controllable procedure. He offered guidelines about when, where, and how to bleed, depending on the patient’s condition and the disease’s course. Later methods of bloodletting included blood cupping (cutting the skin and then using heated cups to withdraw the blood by means of suction as they cool) and the use of leeches (especially the species Hirudo medicinalis), for which sucking blood from host mammals is a natural behavior.

Convinced of his findings, Galen eventually completed three books expounding his views on bloodletting. His systematization of Greek medicine was so powerful and persuasive that his views dominated Western medicine for the next 1,500 years. It was only in the Renaissance that scholars, such as Andreas Vesalius, began to challenge his authority. Nevertheless, the practice of bloodletting continued, amid increasing controversy, until the end of the nineteenth century. Except for a few rare conditions, it is now generally regarded as ineffective or harmful. GB

c. 200

The Holy Trinity

Tertullian

The belief that God consists of three persons combined in one substance

The Trinity, a sixteenth-century oil painting by Jacopo Robusti Tintoretto.

Until the first half of the first century CE, Jewish monotheism typically understood God to have one “character” or “personality.” However, the appearance of Jesus in the first century CE challenged this, as by performing a number of acts (such as forgiving sins) that the ancient Jews believed were the prerogative of God alone, Jesus seemed to claim for himself some kind of identity with God. Yet despite making these claims, Jesus, at the same time, also clearly understood his identity to be different from that of God the Father. The nature of God’s identity was complicated even further when Jesus said that the Father would send “the Holy Spirit”—apparently another aspect of God—to aid Christians in their spiritual growth.

During the first four centuries after Jesus’s death, Christians argued about how, precisely, the claims of God’s unity and yet plurality could be reconciled. The early Christians were deeply concerned with rational intelligibility, and while they could accept that the exercise of reason might point to a mystery that is understood rationally only by God, they could not accept logical contradiction. Various “heretical” ideas were suggested, including modalism (which claims that the Father, Son, and Holy Spirit are mere modes or appearances of the one God) and Arianism (which claims that the Son was created by the Father). However, one of the early church fathers, Tertullian (c. 160–c. 225), was the first to speak of God as being one “substance” and of the Father, Son, and Holy Spirit as distinct “persons” or “personalities” of that substance.

A century later, at the Council of Nicaea, the rough formula of “three persons in one substance” became the accepted orthodox key to understanding the unity and plurality of the Godhead. Given the pervasiveness of Christianity, the idea that God is tri-personal is now the dominant conception of monotheism across the globe. AB

c. 200

Metanoia

Tertullian

A change of mind powerful enough to alter a person’s outlook and behavior

The Repentant Peter (c. 1600–05) by El Greco (Domenico Theotocopuli).

The term “metanoia” is derived from the Greek prefix meta—meaning “over,” “after,” or “with”—and nous, meaning “intellect” or “mind.” Translating literally, metanoia means a change of one’s mind or purpose. The term is generally used in two different contexts, both of which retain this literal meaning. In the Bible, the term is most often translated as “repent,” and the term also figures prominently in the psychological works of Carl Jung (1875–1961).

The Christian scholar Tertullian (c. 160–c. 225) argued that, in the context of Christian theology, metanoia is best translated as “change of mind.” In this specific context, the change of mind may be taken to refer to the change from nonbeliever to believer. Furthermore, this particular kind of change of mind is expected to entail a wholesale change in the person’s behavior and disposition; the person who experiences metanoia is expected not only to embrace a pious attitude but also to act accordingly. Hence the word “repent” refers to renunciation of sin in both thought and act.

“Do you not realize that the kindness of God is meant to lead you to repentance?”

The Bible, Romans 2:4

In Jungian psychology, metanoia occurs during the so-called “middle-life” phase of human personal development. According to Jung, late in the third decade of life, people experience an urgent need to reassess their values, and this leads them to a phase of intense introversion. Jung calls this process “metanoia” because it involves the comprehensive breaking-down and rebuilding of each person’s worldview and value system. The process of metanoia, says Jung, results in greater self-realization. Today, it underlies therapy that seeks emotional healing by inducing breakdown. DM

c. 200

Fideism

Tertullian

The view that faith in God is sensible, even if reason cannot confirm His existence

The relation of faith and reason has long plagued Christian belief, given the dual intellectual heritage of Western civilization in both Athens, home to the Western origins of reason, and Jerusalem, home to the Western origins of faith. Fideism, the philosophy that faith is (relatively) independent of reason—that reason is not necessary to justify belief in God—is commonly held to extend as far back as Tertullian (c. 160–c. 225), who emphasized a theme of Paul’s First Letter to the Corinthians, that the truth of Christianity can only be discovered through revelation.

“Who then will blame Christians for not being able to give reasons for their beliefs?”

Blaise Pascal, Christian philosopher

Fideism has influenced, to varying degrees, significant figures in the history of philosophy, including Blaise Pascal (1623–62), Søren Kierkegaard (1813–55), William James (1842–1910), and Ludwig Wittgenstein (1889–1951). Pascal’s defense of his faith includes a critique of the use of apologist arguments in attempts to get others to attain belief. Kierkegaard argued that belief requires a committed leap of faith, which itself is distinct from evidence or reasons. James, a U.S. pragmatist who also studied psychology and religion, put forward a set of criteria under which it is rational to believe without proof. Wittgenstein interpreted religion as a self-contained and expressive occurrence with its own internal logic, and within its own self-referential criteria; consequently it is independent from reason and external critique. While the extent to which the four philosophers in question, and Tertullian himself, are actually fideists is debatable, fideism itself has long provided a common defense of faith in the absence of evidence of God. JE

c. 200

Midrash

Rabbi Judah ha-Nasi

The considered and shared interpretation of biblical stories

A fourteenth-century illuminated Jewish manuscript from France, showing a child being given a lesson in the teachings of the Torah.

In the Jewish tradition, “Midrash” is a method of interpreting biblical stories for the purposes of clarification, completion, and resolving complex or contradictory passages. From the time of the Torah’s origins, rabbis were required to communicate stories from it in a way that was comprehensible to people and useful for them. These teachings were communicated primarily through an oral tradition until c. 200, when Rabbi Judah ha-Nasi (135–219) recorded a set of interpretations known as the Mishna. Commentaries on the Mishna by other rabbis over the following three centuries were compiled as the Gemara; these, coupled with the Mishna, comprise the Talmud.

“At times the truth shines so brilliantly that we perceive it as clear as day. Our nature and habit then draw a veil over our perception, and we return to a darkness almost as dense as before.”

Maimonides, rabbi and philosopher

Midrash is required when communicating biblical stories because the lessons of the stories are not always immediately apparent. Some biblical principles may seem to contradict other biblical principles, or conflict with contemporary legal or moral teachings from other sources. So, in order that believers can understand the meaning of the text, it is necessary that they find a teacher who is able to explain its content by making connections, clarifying lessons, and relating the text to a broader moral and literary context.

These biblical explanations tend to be transmitted orally, but the oral tradition of interpretations may also be collected and recorded as a supplementary text for understanding the primary biblical source. Accordingly, the term “Midrash” may be used to refer to the entire compilation of teachings about the Bible.

The existence of Midrash admits to the necessity of interpretation when it comes to religious texts. In the Jewish tradition, educated teachers who are able to apply expertise in language and tradition give these interpretations and record them for others. These interpretations themselves become texts to be studied, and promote the idea that religious texts are a part of an ongoing conversation among believers. TD

c. 240

Manichaeism

Mani

A dualistic religious philosophy dividing the world between good and evil principles

A Manichaean miniature showing Manichaean priests. The approximately 1,000-year-old fragment was found during an archaeological expedition in Idikut Shahri, Turkestan.

The prophet Mani (c. 216–276), born in southern Babylonia, was the founder of Manichaeism, one of the most influential religions in the ancient world. Central to the religion is the belief that good and evil cannot have originated from the same source. Evil and good are perceived as independent of each other, and humanity can only escape evil by recognizing this dual reality and following the precepts of the good.

“Those who follow me … and put their hope in God Ohrmizd … are the ones that are saved and find salvation from this cycle of rebirths and attain eternal redemption.”

Mani, An Apocryphal Epistle to Mar Ammo

The realm of evil is interpreted to be the same as the realm of matter, which itself is in direct opposition to the realm of God. Existence commenced with the two primal principles of darkness and light, each in its own realm; history began when darkness waged war on light. Manichaeism may be distinguished from many forms of gnosticism, in which matter is also seen as evil, by its belief that dualism is an inherent part of the nature of existence, with darkness and evil having an independent existence from the good.

For a while Manichaeism constituted a serious challenge to Christianity. Following the death of Mani, the doctrine spread through Syria into the West, and eastward into central Asia. Its message, that the good must be actively pursued and the tendencies of darkness must be opposed, inspired its followers to reject the world of matter, which meant the physical body, too. The message presented by Manichaean missionaries was quite uncharacteristic of and distinct from the many forms of gnosticism with which it coexisted.

While Manichaeism was ultimately to be supplanted by Christianity, Mani’s worldview and teachings survived as a subject of interest to thinkers such as the French Protestant philosopher Pierre Bayle (1647–1706), Scottish empiricist philosopher David Hume (1711–76), and French essayist and philosopher Voltaire (1694–1778), marking it as a significant philosophical system in Western philosophy. JE

c. 240

Christian Universalism

Origen

The belief in the eventual salvation of all souls, and that Hell is only temporary

Is it God’s plan to achieve salvation for all humanity, or is the opportunity for salvation only accepted by some? The early Christian theologian Origen (185–254) believed that instructive suffering in the afterlife eventually induces all souls, even the Devil, to accept Jesus and God. This universalism was later declared heretical, and the Church (both Catholic and Protestant) taught that the faithless should expect eternal suffering in Hell. However, some seventeenth-century German Anabaptists and English Ranters rediscovered universalism, and in 1820 Friedrich Schleiermacher (1768–1834) became the first modern theologian to defend the doctrine.

Universalism also flourished in the United States, with the founding of the Universalist Church of America in 1793. Major figures in the movement included John Murray (1741–1815), who spread the doctrine amid opposition from orthodox Christians who believed it would lead to immorality, and Hosea Ballou (1771–1852), who stressed the importance of reason in religious thinking and argued that punishment for sin is restricted to earthly life only.

“Once [eternal damnation] exists, [eternal bliss] can exist no longer.”

Friedrich Schleiermacher, theologian

Typical arguments for universalism proceed from God’s love for humanity and His righteousness against evil. If God’s love for humanity prevails, eternal torment is impossible; only suffering that serves some ultimate good makes sense. Heavenly bliss is lessened by awareness of hellish torment. God’s redemptive love also implies that He will eventually persuade every soul to choose faith freely. And God’s perfect goodness could not accept permanent evil, for how can God ultimately prevail as long as Hell persists? JSh

c. 268

Porphyrian Tree

Porphyry

A classification system that categorized all knowledge in a tiered diagram

The Porphyrian tree, also called the arbor porphyriana or the scala praedicamentalis, is an ancient classification system that locates any concept within a five-tiered hierarchy of existence. At each tier in the hierarchy, a concept can be one of two things, such as material or immaterial, sensitive or insensitive, or rational or irrational. When applied to an idea, the Porphyrian tree creates a visual depiction of the concept in three vertical columns. The central column depicts each of the five tiers, with the columns on either side representing the two possible choices. Its vertical appearance with appendages, or branches, is somewhat treelike, hence the name.

In his work the Categories, ancient Greek philosopher Aristotle (384–322 BCE) created a categorization system that fit all human ideas, emotions, or knowledge into one of ten categories. In about 268, Neoplatonic philosopher Porphyry (c. 234–c. 305) wrote his Introduction, or Isagoge, in which he reorganized those categories into his eponymous binary categorization system. After sixth-century philosopher Boethius translated Porphyry’s work into Latin, it quickly gained a place as the standard logic text in Western culture. The Introduction became necessary reading for anyone studying philosophy, and did not fall out of favor until the late nineteenth century. It was translated into many languages and became one of the best-known works of philosophy ever written.

With the Porphyrian tree, humanity had a way to categorize every idea into a single applicable visual depiction. The tree created a taxonomy of thought, grouping similar concepts while quickly revealing any differences between them. Though no longer in use, the grouping and categorization of phenomena into a treelike visual format is used in everything from logic to grammar and zoology. MT
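The tiered, binary descent described above can be illustrated as a simple data structure. The sketch below is a modern simplification, not Porphyry's own wording: the tier labels and differentiae are the ones traditionally given for the chain that ends in the species "human."

```python
# A minimal sketch of the Porphyrian tree as a binary taxonomy.
# Each tier pairs a genus with its two possible differentiae; choosing
# one differentia at each tier traces a single path down the tree.
TIERS = [
    ("substance", ("material", "immaterial")),
    ("body", ("animate", "inanimate")),
    ("living body", ("sensitive", "insensitive")),
    ("animal", ("rational", "irrational")),
]

def classify(choices):
    """Walk down the tiers, taking one of the two differentiae at each,
    and return the resulting chain of genus descriptions."""
    path = []
    for (genus, options), choice in zip(TIERS, choices):
        if choice not in options:
            raise ValueError(f"{choice!r} is not a differentia of {genus!r}")
        path.append(f"{choice} {genus}")
    return path

# The classical descent terminating in the species "human":
print(classify(["material", "animate", "sensitive", "rational"]))
# → ['material substance', 'animate body', 'sensitive living body', 'rational animal']
```

Reading the output top to bottom reproduces the tree's central column: each entry is the genus of the tier below it, narrowed by one binary choice, until only the species remains.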

c. 300

Santa Claus

St. Nicholas

A legendary figure who is the traditional patron of Christmas, well known for being a bringer of gifts for children

A seventeenth-century oil painting of St. Nicholas, from the Greek School. St. Nicholas is the patron saint of Greece, and of children, sailors, unmarried girls, merchants, and pawnbrokers.

The figure of Santa Claus, the benevolent old fellow who travels around the world on Christmas Eve night giving presents to good children (and, if their parents are to be believed, coal to bad ones), stems from the legend surrounding St. Nicholas, Roman Catholic bishop of Myra (270–346) in Lycia (now Turkey). Precise details of the saint’s life are not known, but he had a reputation for generosity and kindness. He is believed to have provided dowries of gold for three poor girls, and to have restored to life three children who had been cut up by a butcher and put in a tub of brine. Over time, the legend of the dowries, combined with elements of the Norse god Odin and Nordic folktales of a magician who punished naughty children and rewarded good children with presents, led to the characterization of St. Nicholas as the figure that we know today. The saint’s name was corrupted into Santa Claus in English-speaking countries.

“Alas! How dreary would be the world if there was no Santa Claus!”

Francis P. Church, publisher and editor

Santa is invariably pictured as a large, old, and white-bearded man, wearing a red coat with a white collar and cuffs, with a matching red hat and black belt and boots; this depiction of him originated with images drawn by cartoonist Thomas Nast for Harper’s Weekly in 1863. A tradition that began in 1822 with Clement Clarke Moore’s poem, “A Visit from St. Nicholas,” has Santa Claus living in the North Pole (the region believed to have been inhabited by Odin). Santa is also said to live in the company of elves (in Norse mythology, elves are ancestral spirits), who spend the year preceding Christmas Eve making toys.

The popularity of Santa Claus in the celebration of Christmas shows no sign of abating. He is the subject of numerous Christmas films, songs, and Christmas-themed commercials. On the Internet, the North American Aerospace Defense Command makes a point of “tracking” Santa annually. JE

354

Christmas

Ancient Rome

The Christian commemoration of the birth of Jesus Christ

A miniature depiction of the Nativity, showing the baby Jesus, Mary, Joseph, and an angel, is enclosed by an initial letter “G” in an Italian choral manuscript of the fourteenth century.

The birth of Jesus Christ has been celebrated in December since the second century CE, but it was not until 354 that Bishop Liberius of Rome finally declared December 25 to be the event’s official date. “Christmas,” the English term for the celebration, means “Christ’s Mass,” the mass for the celebration of Christ’s birth.

“The supernatural birth of Christ, his miracles, his resurrection, and ascension remain eternal truths, whatever doubts may be cast on their reality as historical facts.”

David Friedrich Strauss, theologian

There are two main theories regarding its celebration specifically on December 25: first, that the date of Christmas was set in opposition to the Roman Feast of the Invincible Sun, which was celebrated on the 25th, the original date of the winter solstice; and second, that the birth date of Christ was calculated to be exactly nine months after March 25, the date on which early Christians believed he was conceived. Whatever its origin, the observance on December 25 had spread throughout the majority of Christendom by the end of the fourth century. Christmas became a public holiday in Rome in the early sixth century on the order of Emperor Justinian.

Most Protestant sects retained the celebration after the sixteenth-century Protestant Reformation, although Puritans tried to stop the celebration of Christmas in the seventeenth century. The restoration of the English monarchy in 1660 led to a reformed, more secular celebration.

The Christmas tree is a relatively recent innovation, having originated in Germany as late as the sixteenth century when fir trees were decorated with fruits, candles, and tinsel. Christmas greeting cards originated in the nineteenth century in England, while Christmas presents may have originated with a New Year exchange of gifts in pagan Rome. The figure of Santa Claus was an amalgamation of features of a Christian saint, Nicholas of Myra, with elements of the Germanic god Thor, whose home is in the polar regions. Christmas is one of the most important celebrations in the Christian world, and its effect on the Western cultural calendar has been profound. JE

397

Canonization of the Bible

Christian Church Fathers

The official sanctioning of the writings of the Old and New Testaments

An early twentieth-century German engraving shows Christ holding Genesis, the first book of the Old Testament, and Apocalypsis (Revelation), the last book of the New Testament.

The first 200 years of Christianity’s development were marked by a determined attempt to develop an orthodox canon of texts. The Roman Catholic church fathers were obliged to consider many competing and often contradictory schools of thought, all of them forcefully claiming their interpretations of various scriptures to be the truth. The collection of texts that now makes up the New Testament was determined over the course of a number of councils (notably, the Councils, or Synods, of Carthage in the third, fourth, and fifth centuries). Significantly, the philosopher and theologian Augustine of Hippo (354–430) had an active hand in the development of what was to become the officially sanctioned Christian canon. The official biblical canon was declared at the Council of Carthage held in 397, and inevitably its decisions to include certain scriptures and exclude others were considered highly controversial.

“The whole canon of the Scriptures, however, in which we say that consideration is to be applied, is contained in these books … because of the authority which is deservedly accredited to them.”

St. Augustine of Hippo, Christian Instruction (c. 397 CE)

Despite declaring the official canon, the Roman Catholic Church did not draw permanent official canonical boundaries until the Protestant Reformation in the sixteenth century. The German monk and theologian Martin Luther (1483–1546) published a distinctive reordering of the canon, which notably relegated the books Hebrews, James, Jude, and Revelation to the end of the New Testament and questioned their authority. These texts were demoted because Luther held that they contradicted essential Protestant doctrines, including sola gratia (the belief that salvation occurs through divine grace alone) and sola fide (the belief that salvation comes through faith alone, rather than through good works). The creation of a biblical canon was simultaneously a key element of the unification and consolidation of the power of orthodox Christianity and a key mechanism by which heterodox scriptures, traditions, and movements were excluded. JE

397

The Confessions

Augustine of Hippo

A foundational work in the field of Christian apologetics

The frontispiece to an eleventh-century edition of Confessions by St. Augustine, first published in 397.

Written around 397, the Confessions of Augustine of Hippo (354–430) is a foundational work of Christian apologetics; it is also perhaps the world’s first autobiography, being an essentially confessional account of an individual’s return to God from a life of (largely sexual) sin. Augustine’s perspective, in which Neo-Platonic philosophies and ideals are merged into an early Christian framework, is that Christ, being God made man, is the bridge between sinful humanity and a perfect God.

Much of the Confessions is a personal story of conversion, with the final books, or parts, containing philosophical reflections on topics within philosophy and theology. For Augustine, “sin” is equated with the pursuit of temporal pleasures, while redemption and a return toward God can be accomplished through the free exercise of the will in pursuing “eternal” goods, such as wisdom and obedience to God. Later in his life, Augustine increasingly emphasized the role of God’s grace, accessible to humanity through the agency of Christ, in increasing the capacity of the will to choose God and refuse sinful temptations.

“But my sin was this, that I looked for pleasure, beauty, and truth not in Him …”

Augustine of Hippo, Confessions (397)

The Confessions was deeply influential in Western theology and philosophy, as well as in the development of what became canonical Christian beliefs and interpretations. The work has also had a profound effect on common orthodox Christian beliefs, particularly the association of temporal pleasures with sin, and of virtue with a rejection of temporal (carnal) pleasures—exemplified by the clerical celibacy required by some branches of Roman Catholicism. JE

c. 400

Just War

Augustine of Hippo

Waging war is morally justified if it meets specific moral, religious, or political criteria

Is war ever moral, just, or ethically defensible? Rules and limitations on warfare are as old as warfare itself, but it was the philosopher and theologian Augustine of Hippo (354–430) who paved the conceptual way for countless future conflicts when he identified the notion of a “just” or morally justifiable war.

While earlier thinkers, such as Aristotle and Cicero, also wrote about the morality and rules of warfare, Augustine was the first person in the Christian tradition to present the concept of a “just war.” Augustine believed that those acting on the authority of God could justly engage in a war, even though it would naturally lead to killing, violence, and destruction that otherwise would be immoral. In accepting the concept of just war, Augustine was taking a moral position that was in stark opposition to the pacifist teachings of many Church leaders.

“A just war is in the long run far better for a man’s soul than … prosperous peace.”

Theodore Roosevelt, U.S. president 1901–09

Augustine argued that legitimate authorities could morally engage in warfare in some situations, but the task of identifying the specific conditions of a just war was left to later thinkers, such as St. Thomas Aquinas (1225–74). The conditions originally included acting from a legitimate authority, acting with the right intention, and having a just cause, though they have since been expanded upon and refined by others.

After Augustine, leaders no longer had to choose between the normative restrictions of pacifism and the morally neutral approach of realism. The concept of just war implied that organized, intentional, widespread conflict—with all its horrors—could be a force for good, and waging it a moral act. MT

c. 400

Against Original Sin

Pelagius

The unorthodox view that Christian believers are untainted by Adam’s sin

The Christian doctrine of original sin holds that Adam’s sin in the Garden of Eden is passed down through human generations, and thus human beings are inherently sinful and need divine grace to regain God’s favor. However, the Christian monk Pelagius (354–420) taught that human will is entirely free and therefore as capable of choosing good as evil. Divine grace, for Pelagius, is thus merely an aid in choosing goodness, and is not necessary for salvation, which can be accomplished by will alone.

Pelagius’s arguments led him to a series of radical conclusions. He argued that Adam’s sin was personal, and that it would be unjust of God to hold all humanity accountable for it. Since all are born without sin, baptism has no purpose, and all children who die without the Sacrament go to heaven. Moreover, Jesus’s sacrifice, the cornerstone of Christian belief, had no substantive effect; his function was merely to set an example of how to live.

“That we are able to do good is of God, but that we actually do it is of ourselves.”

Pelagius

Pelagius’s arguments against original sin effectively challenged the core of what came to be orthodox Christianity. Orthodox apologists decried Pelagius’s work once it became known, and he was put on trial for heresy at the Synod of Diospolis in 415. However, he denied or offered orthodox interpretations of his works and was found innocent of heresy. After a series of further political controversies, the Emperor Honorius expelled Pelagian leaders from Rome in 418. Pelagius’s works provided a substantive challenge to orthodox Christian beliefs, but the challenge was resolved primarily by the coercive power of the church, and so is largely forgotten today. JE

c. 400

Kama Sutra

Vātsyāyana

A Hindu manual offering advice on life, love, and lovemaking

Erotic carvings at the Temple of Khajuraho, India, built between 950 and 1050.

The Kama Sutra is an ancient Sanskrit text that details teachings on the subject of seeking pleasure in life, and predominantly features recommendations for intimate and sexual pleasure. The text, compiled in the early fifth century by the Hindu philosopher Vātsyāyana, is intended to provide advice for living a good life, finding and practicing love, and making the self and a partner happy.

According to traditional Hindu teachings, life has three main goals: living virtuously (dharma), attaining prosperity (artha), and realizing pleasure (kama). The Kama Sutra is explicitly aimed at the third of these goals, even though the first two are considered more important for living well. Written as a manual, the Kama Sutra details the ways in which the goal of realizing pleasure should be pursued, with the highest pleasures in life coming from sexual relationships and practices. The Kama Sutra is also a manual for pursuing love more broadly, and contains advice on marriage, the duties of spouses, and how to interact with past lovers.

“As variety is necessary in love, so love is to be produced by means of variety.”

Vātsyāyana, Kama Sutra (c. 400)

Sexual liberation is often considered a modern idea, but the Kama Sutra proves that openly thinking and writing about sex is as old as recorded history. The Kama Sutra promotes the idea that pleasure is a fundamentally good human experience, and provides permission and advice for making pleasure better. At the intersection of religious tradition and sexual urges, the Kama Sutra encourages us to be mindful of the pleasure that we should be bringing to others and to ourselves. TD

c. 400

Intellect and Will

Augustine of Hippo

Two qualities that offer the means toward redemption from sin

Portrait of St. Augustine of Hippo, by Cecco del Caravaggio (active c. 1610–20).

The philosopher and theologian Augustine of Hippo (354–430) developed an influential theory of intellect and the will largely in response to the apparent contradiction in Christian belief in a God who simultaneously created and governs the world yet who is not responsible for evil. Augustine proposed that intellect and will are simultaneously the source of “sin” and the means toward redemption. For Augustine, the will is perpetually free and thus capable of the choice of either sin or redemption in Christ; while human nature encompasses sensory capabilities and desires, it also contains intellect and will.

For Augustine, evil involves the pursuit of temporal things, such as carnal pleasure, instead of eternal things, such as wisdom and reunion with God through Christ. The senses perceive elements of the external world and translate data to the intellect, which identifies its potential options; the free will then makes its choice. Augustine’s later works place greater emphasis on the need for “grace” to aid the will in choosing against sin.

“Mind is mind and hand is body. The mind orders the mind to will.”

Augustine of Hippo, Confessions (397)

Augustine’s views on the nature of humanity and the place of intellect and free will were of critical importance until the Enlightenment in the seventeenth and eighteenth centuries. During that period, instead of humanity being characterized by three elements—a divine yet fallen soul (which sets humanity above and apart from nature), reason capable of knowing truth, and free will—humanity was understood as being fully natural, with only limited reason (and thus intellect), and without free will (although the degree of lack of freedom was subject to argument). JE

c. 400

The Free Will Defense

Augustine of Hippo

God allows evil because to prevent it would deny humanity the means to act freely

One argument against God’s existence is the “problem of evil,” where “evil” is understood to refer to suffering and injustice. If a being is all-powerful, it can eliminate any evil that exists; if it is all-knowing, it will know about any evil that exists or is about to exist; and if it is all-good, it will, presumably, want to eliminate all the evil it can. Since there is evil in the world, the argument concludes, no such being can exist. This argument had special significance for early Christians attempting to defend their fledgling religion from a variety of intellectual competitors, including Manicheans and Gnostics. Augustine of Hippo (354–430) offered perhaps the earliest rebuttal with his “Free Will Defense.”

According to Augustine, evil has two sources: free individuals, who voluntarily choose to harm others, and God, who justly punishes sinners. He rejects the claim that God would want to eliminate all evil because it assumes that God has no good reason for allowing creatures to act freely. If God wants creatures to respond freely to His overtures of grace, it would be impossible to prevent them from acting against His wishes, since doing so would override their freedom. Furthermore, in accordance with Augustine’s understanding of original sin, all humans voluntarily sin against God, and thereby deserve punishment (because God is just).

“[The will’s] failings are justly punished, being not necessary, but voluntary.”

Augustine of Hippo, City of God (early fifth century)

The Free Will Defense belongs to a family of arguments known as theodicy, which attempt to absolve God of wrongdoing for allowing evil. The efficacy of the defense continues to be contested by philosophers such as Alvin Plantinga, William Rowe, and Stephen Wykstra, and the discussion remains popular. JW

The Middle Ages

500–1449

A Catalan atlas from c. 1375, showing merchants travelling to Cathay. It was around this period that merchants first began organizing themselves into guilds.

Historians once referred to much of this period in Western civilization as the Dark Ages, a moniker that was applied not only because of the apparent economic and cultural stagnation during this time, but also because of the perceived dearth of ideas. Historians today agree that numerous innovative ideas emerged from the Middle Ages and, in fact, many of them were crucial for developments in areas such as mathematics, cosmology, the philosophy of the person, and natural rights. Can God create a boulder too heavy for Himself to lift? The answer to this question, posed by Pseudo-Dionysius the Areopagite in c. 500, is incredibly important to the Judeo-Islamic-Christian idea of God—and debate over it still occurs today.

c. 500

Paradox of Omnipotence

Pseudo-Dionysius the Areopagite

Could God create a boulder that is too heavy for Him to lift Himself?

Marble statue of Atlas kneeling with the celestial sphere on his shoulder, known as “Farnese Atlas.” It is a Roman copy of the second century CE, after a Greek original.

The “paradox of omnipotence” is a family of verbal paradoxes intended to establish the boundaries of the concept of an omnipotent being, or God. The general form of the paradox proposes that, if a being can perform any act (a trait that is itself part of the definition of omnipotence), then that being should be able to create a task it cannot achieve, from which it follows that it cannot perform all possible tasks. Conversely, if such a being cannot create such a task, then there is already a task it cannot perform: the creation of that task. Either way, the being is not omnipotent.

“For, the denial of Himself, is a falling from truth … and the falling from the truth is a falling from the existent … Almighty God cannot fall from the existent.”

Pseudo-Dionysius

The first form of the argument is traced to Pseudo-Dionysius the Areopagite (fl. c. 500), a Christian theologian so named because his work was at first erroneously attributed to Dionysius the Areopagite, an Athenian mentioned in the Bible who was converted by a speech of St. Paul. Pseudo-Dionysius the Areopagite was primarily concerned with the question of whether it was possible or not for God to “deny himself.”

A popular version of the omnipotence paradox is often referred to as the “paradox of the stone,” which may be expressed as, “Could God create a boulder that is too heavy for Him to lift?” Answers to such formulations of the paradox tend to center on how “omnipotence” itself should be defined. For example, “omnipotence” may not mean that God can do anything whatsoever, but rather that God can do anything possible within God’s own nature. God’s power is thus limited by an inability to perform logical absurdities, such as making a square circle. When two concepts are logically incompatible, it is by definition not possible even for God to combine them. As a logical impossibility, the creation of a boulder too heavy for its omnipotent creator to lift thus falls outside the definition of omnipotence. The paradox of the stone is among the most popular expressions of the paradox of omnipotence, and one of the most ancient critiques of the Judeo-Christian conception of God. JE

c. 550

Human Flight

Emperor Kao Yang

The search for a method to enable humans to fly

Drawing of a flying machine (c. 1490–1516) by Leonardo da Vinci. Da Vinci designed a number of such devices, but there is no evidence that he ever attempted to build any of them.

Ancient myths and legends are full of tales of humans having the ability to fly, and the impulse to do so has likely been a part of human history since people first looked at the birds and imagined having the power of flight. The earliest known record of a human taking flight comes from China, where Emperor Kao Yang (r. 550–559) is said to have sentenced captive enemies to death by strapping them to large kites and sending them aloft; their relatives were allowed to control the ropes and at least one such unfortunate is said to have survived. Prior to that, Kao Yang would have bamboo wings strapped to the arms of captives and have them thrown from the tops of towers; all are said to have perished. By the time Marco Polo reached China in the thirteenth century, a fairly common practice before a ship set sail was to send a man aloft on a kite and assess his flight; if the man shot straight up into the sky, it was deemed that the voyage would be successful.

“Once you have tasted flight, you will forever walk the earth with your eyes turned skyward, for there you have been, and there you will always long to return.”

Leonardo da Vinci, artist and inventor

By 1250 the English philosopher Roger Bacon had described a theoretical mechanical flying machine, and the Italian inventor and polymath Leonardo da Vinci went on to create several flying-machine designs. Then, in 1783, two Frenchmen, François Laurent d’Arlandes and Jean-François Pilâtre de Rozier, made the first manned flight in a hot-air balloon. A mere 120 years later, in 1903, Orville and Wilbur Wright became the first to fly a powered aircraft.

The idea of flight, even when it was still only that, has captivated humanity like no other. Ancient tales of flying carpets, winged dragons, and soaring creatures of all types testify to the strength of the fantasy, and the realization of the dream did little to lessen its appeal. Aircraft-borne flight is now a commonplace occurrence, a method of travel readily available and capable of bridging all corners of the globe, yet being able to fly unaided remains a fantasy capable of spurring inspiration and daydreams. MT

c. 600

Ladder of Divine Ascent

John Climacus

The method by which monks and hermits can achieve holiness

A twelfth-century illustration of “The Heavenly Ladder,” contained in an instruction book for monks.

Traditional Byzantine churches are often decorated with depictions of the “Soul-saving and Heavenward Ladder,” otherwise known as the “Ladder of Divine Ascent.” The artworks are associated with a literary work authored by John Climacus, a seventh-century monk at the monastery on Mount Sinai in Egypt. His book, The Ladder of Divine Ascent, describes thirty stages of spiritual development that lead the believer to theosis or salvation, the goal of spiritual struggle.

The thirty stages of spiritual development are symbolized by a ladder, and the thirty chapters of Climacus’s book each correspond to a step on the ladder. Groups of steps represent either virtues that must be attained or vices that must be avoided, to be replaced by virtues, if the striver is to be successful.

“Do not be surprised that you fall … do not give up, but stand your ground.”

John Climacus

Climacus’s work was intended for his fellow monks and holy men, and its thirty steps were intended to be followed one after the other in sequence. His readers are cautioned to be vigilant—at every step there are evil powers seeking to pull the striver from the ladder. Upon reaching step thirty at the top of the ladder, which is titled Faith, Hope, and Charity, the monk can expect to receive from Christ the crown of glory.

While there are traces of the ideas of earlier ascetic writers in Climacus’s work, the views he expresses are ultimately his own. His proposal that salvation may be reached by climbing a ladder while fending off agents of evil was greatly influential on later Greek ascetic writers, and was popular in Slavic countries for centuries. The Ladder of Divine Ascent remains one of the classics of ascetic Christian literature. JE

c. 610

Jihad

Muhammad

A struggle undertaken by Muslims to uphold the cause of God

The term “jihad” is an Arabic word meaning “to endeavor, to strive, to struggle,” and is generally used to mark an effort toward a commendable aim. It occurs forty-one times in the Koran, which was revealed to Muhammad from c. 610. In the context of religion, jihad can mean the struggle against personal sin or evil inclinations, efforts to improve society morally, or the act of spreading Islam in general. Jihad can be peaceful (spreading Islam through writing, for example) or through force (the “jihad of the sword”). In mystical Islam, the more peaceful modes of jihad are emphasized; inner spiritual struggle is the “greater jihad,” while the jihad of the sword is considered the “lesser jihad.”

“To fight against the infidels is Jihad; but to fight against your evil self is greater Jihad.”

Abu Bakr, companion of Muhammad

The term “jihad” is thus not synonymous with the concept of a “holy war,” but denotes a much wider range of goals and the means by which those ends are pursued. However, about two-thirds of the instances of the term jahada in the Koran do reference warfare. The ultimate aim of this warfare is subordination of unbelievers to Islam, understood in the political way of extending Islamic geopolitical rule over the globe, not in the forced conversion of conquered peoples. Moreover, Koranic passages on jihad suggest that the command to fight unbelievers was not unconditional, but contingent upon being provoked by them or threatened by aggression. An understanding of the complexity of jihad as a religious imperative is essential to understanding the responses of practitioners of Islam to the expansion of capitalism into the Middle East, and that of Western state powers all over the globe. JE

c. 610

The Five Pillars of Islam

Muhammad

Five basic duties that should be carried out by every Muslim

A detail of an Alhambra vase (c. 1300) decorated with the hand of Fatima; the fingers represent the Five Pillars.

The Five Pillars of Islam are attributed to Muhammad (c. 570–c. 632), who is said to have received them from the angel Gabriel. Recorded after Muhammad’s death, they are accorded the status of hadith (a saying or act that Muhammad is confirmed to have said or done).

“Verily, the prayer is enjoined on the believers at fixed hours.”

The Koran, An Nisa 4:103

The first and most important pillar is the shahada, or Islamic creed. All Muslims are required to profess, “There is no god but God, and Muhammad is his prophet.” The creed ideologically separates Muslims from non-Muslims, firstly by separating monotheists from non-monotheists and secondly by separating monotheists who believe Muhammad was God’s prophet from those who do not (such as Jews and Christians). The second pillar is the salat, or Islamic prayer. Probably appropriating the notion from Persian Zoroastrianism, Islam requires all believers to pray five times a day (dawn, noon, afternoon, evening, and night). The third pillar is zakat, or the practice of alms-giving. Well-off Muslims are expected to give 2.5 percent of their wealth to the poor, particularly poor Muslims (zakat is often built into taxation systems in Islamic-controlled countries). The fourth pillar is sawm, or fasting, especially during the month of Ramadan (abstaining from food and drink from dawn till dusk). The fifth and final pillar is hajj, which is pilgrimage to Mecca—to be undertaken at least once by every Muslim who is able-bodied and wealthy enough. Although some sects of Islam, such as the Twelvers and Ismailis, have additional duties required of true Muslims, all Muslims agree on the Five Pillars, which guarantees their influence. AB

c. 618

Kōan

Chan Buddhism, Tang Dynasty

The notion that study of the paradoxical can promote enlightenment

The term kōan literally refers to a “public record” of the teachings used by Zen masters to help awaken their disciples. Kōan study began with the gongan (“public case”) developed in the Chinese Chan school of Buddhism sometime during the Tang dynasty (618–907). When Chan was transplanted to Japan in the twelfth century and became Zen, gongan became known as kōan.

“We’re lost where the mind can’t find us, utterly lost.”

Ikkyu, Zen Buddhist monk and poet

Kōan typically take the form of puzzling questions, statements, or tales that serve as objects of meditation. Examples such as “What is the sound of one hand clapping?” or “If you meet the Buddha on the road, kill him!” may appear nonsensical or paradoxical. One kōan goes so far as to relay the story of the Zen master Nansen chopping a living cat in half to resolve a dispute between two monks who were arguing over the animal. Ultimately, a kōan is designed to expose the flaws in a student’s objective, discursive methods of reasoning, thereby pushing the student to a new mental schema characterized by a heightened awareness that is closer to enlightenment. Rinzai Zen master Hakuin argued that the purpose of a kōan is to awaken a “great doubt” (daigi) in a student about the paradigm he or she uses to interpret the world, which ultimately results in a “great death” (daishi) in which the old beliefs fall away and the kōan is resolved by a new level of understanding. Kōan study is thus a type of metacognition, since it pushes one to be mindful of the way one thinks about and experiences the world. Kōan study was once practiced universally in Zen schools, although the Rinzai sect now emphasizes it to a much greater degree than the Sōtō school. JM

622

Islamic State

Muhammad

The concept of a divinely guided Islamic government with Allah as head of state

The idea that a state can be organized and governed along purely Islamic principles originated with Muhammad (c. 570–c. 632) himself, who in 622 established the first khilāfa or caliphate (an Islamic religious community headed by a religious leader). An Islamic state is ideological in nature, and those citizens governed by it are divided into two distinct groups: those who believe in and follow Islam, and those who do not. Today, states following the khilāfa model include Mauritania since 1958, Iran since its Islamic Revolution in 1979, Pakistan, Afghanistan, and Saudi Arabia. Not all Muslims, however, approve of Islamic states, with some clerics arguing that they violate one of the guiding principles of Islam: that divinity and politics should never become entwined. Such clerics are in the minority, however; most Muslims hold the view that it is sinful not to believe in the Islamic state.

“Not to accept Sharia … such [an Islamic] society breaks its contract with God.”

Abul Ala Mawdudi, theologian and Muslim revivalist

The drafting and enacting of government legislation within an Islamic state rests solely with those who adhere to Islam. Muslims may not copy polytheists (those with a belief in more than one god) in either dress or behavior, or support them in any way that may give them prominence over Muslims. Islamic states derive their legitimacy from Allah Himself, who is sovereign over all and whose law of Sharia is supreme. Sharia provides the Islamic state with the framework it requires to develop a civil society and the organizations and institutions that comprise it, and to shape the common objectives of government: administration, law enforcement, defense, and the material wellbeing of its citizens. BS

c. 710

Shintōism

Japan

A religion founded on the idea that all things have an inherent spirituality

A sixteenth-century illustration of a Shinto shrine located in the city of Ise, Japan.

Shintōism, the indigenous religion of Japan, was first documented in the historical records Nihon Shoki and Kojiki in the eighth century CE. Its beliefs and practices were formalized during the Nara and Heian periods (710–1185) and are still popular throughout Japan.

Shintō literally means “way of the kami,” the latter term referring to the divine spirits said to inhabit Japan. Kami can manifest as natural forces, animistic presences in holy sites, or highly developed living human beings. Because all things in nature, including people, were created by the gods Izanagi and Izanami, they manifest an inherent divinity that can be cultivated and appreciated. The emperors of Japan trace their lineage back to the first in their line, Jimmu Tennō, who was descended directly from the sun-goddess Amaterasu. Shintō shrines are typically constructed to commemorate a departed person or a natural feature that exhibits kami. Ethically, Shintō practitioners strive to purify themselves by cultivating virtues such as michi (integrity), harai (purity), makoto (sincerity), wa (harmony), akai (cheerfulness), kansha (thankfulness), kenshin (spiritual surrender), and meiyo (honor). Rites are performed at shrines to purge practitioners of kegare (uncleanness) and tsumi (iniquity).

“A major part of the world’s goodness lies in its often unspeakable beauty.”

Yukitaka Yamamoto, shrine guardian priest

Most Japanese citizens consider themselves to be Shintō, even though only a minority regularly practice the religion. The Jinja Honchō oversees more than 80,000 shrines in Japan, the holiest of which is Ise Jingū. Shintō beliefs are so pervasive that they have influenced every aspect of Japanese culture, art, and philosophy, including Japanese Buddhism. JM

c. 726

Iconoclasm

Emperor Leo III

The belief that images hinder relationships with God, so they should be destroyed

A twelfth-century monastery in Cappadocia, Turkey, where Christians in favor of icons took refuge.

Iconoclasm can be described as the willful destruction of works of art, particularly those representing aspects of religious belief or personages associated with it. The destruction is motivated by the idea that such images block a direct relationship between the faithful and God because they are liable to become objects of adoration in themselves.

There exist a number of simplifications and misunderstandings regarding iconoclasm. It may be regarded as only an act of pure destruction or vandalism (such as the actual smashing of images), or destruction grounded in hatred for or fear of such images, or as the monolithic expression of the will of all believers of a religion. Iconoclasm is more complex and multifaceted than these depictions would allow. In Christianity, biblical prohibitions against worshipping false idols and false gods have given massive support to iconoclastic movements throughout history, especially in territories conquered by Christian armies.

The Byzantine Iconoclasm refers to a period of the Byzantine Empire during which the emperor, Leo III the Isaurian (c. 685–741), banned religious images; his edict resulted in a large-scale campaign in which religious images and artifacts were widely destroyed. The First Iconoclasm, initiated by Leo III, lasted from 726 to 787. After a lull, a later emperor, Leo V the Armenian (775–820), alarmed that God appeared to be signaling his displeasure with the empire by allowing continuing military failure, instituted the Second Iconoclasm, which lasted from 814 to 842.

Iconoclasm has persisted through the ages and within different cultures. Perhaps the most notorious example of recent times occurred in 2001, when members of the Taliban in Afghanistan used explosives to destroy the two Buddhas of Bamiyan (sixth and seventh century), which were the largest standing statues of Buddha in the world at that time. JE

c. 750

Perpetual Motion

Bavaria

A movement that could continue indefinitely without an external source of energy

A design for a magnetic perpetual motion machine (c. 1670) by John Wilkins, the bishop of Chester.

Laws of physics dictate that a “perpetual motion machine” (one able to produce at least as much energy or useful work as it consumes) is an impossibility. Even so, throughout history numerous attempts have been made to create a machine capable of perpetual motion: motion that would continue effectively forever without external power or energy. As far as we know, all have failed.

The first such attempt was the Bavarian “magic wheel,” invented in the eighth century and consisting of a spinning wheel that was powered magnetically. The magic wheel was supposed to be capable of spinning indefinitely, but it could not overcome the opposing force of friction.
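The magic wheel’s failure illustrates the general physical objection to all such devices. As a rough sketch (an illustration added here, not part of the original account), the energy balance of any real machine subject to friction can be written as:

```latex
% Energy balance over one cycle of a real machine:
% the work available at the output equals the work put in
% minus the (always positive) loss to friction.
W_{\text{out}} = W_{\text{in}} - W_{\text{friction}}, \qquad W_{\text{friction}} > 0
% Hence W_out < W_in. With no external input (W_in = 0), a spinning
% wheel's kinetic energy E_k = \tfrac{1}{2} I \omega^2 can only
% decrease, so the wheel must eventually come to rest.
```

Because the friction term can be reduced but never eliminated in practice, every such design loses energy on each cycle and winds down.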

Other attempts to create a machine capable of perpetual motion were equally unsuccessful, but they spurred scientific inquiry that in a few cases led to inventions still significant today. The mathematician Blaise Pascal (1623–62), for example, tried to create a perpetual motion machine, and by accident hit upon what would become the roulette wheel.

In 1775, finally exasperated by so many failed attempts, the Parisian Royal Academy of Sciences declared that it would admit no further proposals for perpetual motion machines. Strangely enough, this official declaration of failure produced a significant rise in attempts to create a perpetual motion machine. The nineteenth-century invention of electric generators was to have a similar invigorating effect.

Of the attempts still seen today, some are based on the model of the “overbalanced wheel,” first drawn in 1235 by the artist Villard de Honnecourt. In hopes of overcoming the force of gravity, such designs attempt to attain perpetual motion through the carefully controlled movement of interacting, shifting weights intended to spin the wheel indefinitely. JE

c. 800

Feudalism

Europe

A system for structuring society, based on land ownership by lords

Feudalism was the economic system that prevailed in Europe from the ninth century until the fifteenth century, characterized primarily by the ownership of land by lords, who also exercised governmental powers. Landless peasants, known as serfs, were permitted to work the land for themselves, for which “privilege” their productive “surplus” was coercively extracted and paid to the lords, either as rent or in the form of forced labor on the lords’ private lands. The lords, in turn, were obligated to provide protection from marauders, to dispense justice whenever a serf made a claim, and to guarantee serfs access to common lands. The lords were often themselves subject to a king or queen, to whom they owed loyalty; in the event of war they were obliged to provide serfs for military service.

“Landlords, like all other men, love to reap where they never sowed.”

Adam Smith, The Wealth of Nations (1776)

The term “feudalism” is derived from the term “fief,” which was land “given” under certain conditions (a series of obligations, privileges, and duties) by a lord to their vassal in return for the vassal’s loyalty and acceptance of obligations to the lord. A number of factors contributed toward the decline of feudalism, including the Black Death in the mid-fourteenth century (which severely decreased the population of Europe), peasant revolts, the development of an increasingly powerful merchant class, and the increasing ability of kings to enlist mercenary armies rather than rely on their lords for conscripts. Feudalism, with its relatively uncomplicated and easily understood relations between kings, lords, and serfs, has greatly influenced the fantasy genre in novels, movies and television, and video games. JE

c. 800

Bushidō

Japanese warrior class

A set of ideals, rules, and ethical principles followed by the samurai in Japan

A portrait of the seventeenth-century warrior Yoshio Oishi, who was held as an example of an ideal samurai.

In Japanese, the word bushidō translates as “the way of the warrior.” This code of conduct guided the Japanese warrior class, the samurai, who, from about 800, provided armed support for rich landowners excluded from the imperial court by the Fujiwara clan. The ideals of Bushidō informed not only the samurais’ actions in combat, but also their day-to-day lives. Bushidō taught the samurai how to cultivate their martial spirit and become proficient with various weapons, and held that a samurai’s personal honor demanded he be ready to lay down his life in loyalty to his feudal lord.

However, it was not until the ascendance of the Kamakura shogunate (1192–1333) that Bushidō became a driving force behind the actions of leading Japanese historical figures. By the time of the Tokugawa period (1600–1868), the tenets of Bushidō had been formally recognized and codified into law. The samurai class was abolished during the Meiji Restoration of the mid to late nineteenth century, but the Bushidō code lived on, albeit in a different form.
