“The way of the samurai is found in death …. Be determined and advance.”
Yamamoto Tsunetomo, samurai
By the time of World War II, the samurai class no longer existed, but its principles of loyalty, duty, honor, and sacrifice had strong adherents in the Japanese military. In the postwar years, many of the tenets of Bushidō remained firmly rooted in Japanese culture and society as the nation made its transition into a major industrial and commercial power. The Bushidō principles expounded by the samurai of old remain strong in contemporary Japanese society and can be seen in many aspects of Japanese life, not least in martial arts such as judo and karate. MT
c. 800
Romanesque Architecture
Europe
An architectural style that draws inspiration from the buildings of ancient Rome, characterized by semicircular arches
The Palatine Chapel in Aachen Cathedral, Germany, built in c. 800. The chapel holds the remains of Charlemagne and was the site of coronations for 600 years.
The idea that architecture should once again draw inspiration from the style of the Romans, with its round-arched windows and doorways, dates from around the year 800 and the palace of Emperor Charlemagne (742–814) at Aachen, Germany. Today, the only part of that palace still to be seen is the Palatine Chapel, now incorporated into Aachen Cathedral. The chapel also features the barrel and groin vaults characteristic of Roman buildings and exemplifies the deliberate adoption of Roman forms by Carolingian architects.
“After about the first millennium, Italy was the cradle of Romanesque architecture, which spread throughout Europe, much of it extending the structural daring with minimal visual elaboration.”
Harry Seidler, architect
When Romanesque architecture became the major medieval style of Western European architecture between 950 and 1150, its characteristics were drawn from the proto-Romanesque structures that had dominated the preceding two centuries. Architectural historians consider the years from 950 to 1050 as a period in which the style was still developing, with the monuments most characteristic of Romanesque architecture emerging between 1050 and 1150. Romanesque architecture was deeply influenced by feudalism and monasticism, which together provided the grounds for stability within medieval culture.
Romanesque buildings varied widely while sharing common characteristics, with architects in different regions finding different solutions to problems related to structure. Historians, taking account of complicated regional variations and the characterization of the proto-Romanesque period, continue to debate the exact start of the period of Romanesque architecture, but the style certainly concluded in the twelfth century with the transition into the Gothic period.
Much later, during the Gothic Revival of the nineteenth century, Romanesque architecture was also to receive a partial rebirth. For example, London’s Natural History Museum, designed by Alfred Waterhouse, is a Romanesque revival structure. JE
c. 803
Restaurant
Austrian monks
The concept of an establishment designed solely for serving food
The idea of an establishment dedicated to serving food dates back as far as ancient Rome and Song dynasty China, while many historians credit eighteenth-century France with originating the truly modern restaurant (the name derives from the French word restaurer, meaning “to restore”). The oldest true restaurant in Europe, however, is said to be the Stiftskeller St. Peter, which is housed within the walls of St. Peter’s Archabbey in Salzburg, Austria. Mentioned in a document as far back as 803, it is believed to be the oldest continuously operating restaurant in the world. Today, diners there may enjoy their food to the accompaniment of daily live performances of the music of Mozart.
The predecessors of what we would call restaurants, establishments in which food, alcohol, and other beverages could be sold, tended to be inns (mostly for the benefit of travelers rather than local people) and street kitchens. The earliest French “restaurants” were highly regulated places selling what were called restaurants, meat-based consommés specifically intended to “restore” an individual’s strength. The first French restaurant of this type was likely that of a Parisian seller of bouillon (broth) named Boulanger, who established his restaurant in 1765. Such places developed into the beginnings of the modern restaurant industry during the French Revolution (1789–99), when the guilds’ tight grip on the supply of certain foods was loosened and a larger consumer base for restaurants emerged.
More than 200 years later, the restaurant has become a staple of human culture. With a rise in affluence comes a desire to pass to others the chores of buying and preparing food, and in a world increasingly fascinated by fine dining it is unlikely that the idea of the restaurant will ever fall out of fashion. JE
831
Transubstantiation
Paschasius Radbertus
The belief that Holy Communion wafers and wine change into Christ’s body and blood
The Eucharist—a ceremony that functions as a re-enactment of the Last Supper, in which participants drink a sip of consecrated wine and eat a small piece of consecrated bread—was central to Christianity from the religion’s beginnings. The concept of transubstantiation, however—the belief that the bread and wine taken during the Eucharist change miraculously into the body and blood of Jesus Christ at the moment of consecration by the priest—did not appear until the ninth century.
In 831 the Benedictine abbot Paschasius Radbertus (c. 785–860) wrote a treatise titled “On the Body and Blood of the Lord.” In it, he asserted that “the substance of bread and wine is efficaciously changed within into the flesh and blood of Christ, in such a way that after the consecration the true flesh and blood of Christ is truly believed [to be present].” Radbertus’s view was initially met with some resistance—notably from Ratramnus, a monk from the same abbey who wrote his own work titled “On the Body and Blood of the Lord” (c. 850)—but by 1079 it had been declared an official doctrine by a council held in Rome under Pope Gregory VII. The term “transubstantiation” was coined around the same time by Hildebert de Lavardin (c. 1055–1133), archbishop of Tours.
“Jesus took bread, and blessed it … and said, Take, eat; this is my body.”
The Bible, Matthew 26:26
As a doctrine of faith, transubstantiation is fascinating for both its creativity and its undeniable oddity (to modern sensibilities at least). The sacrament of the Eucharist is designed to foster connectedness with God; through transubstantiation, God effectively becomes present in the mouths of all believers. JS
c. 850
Beowulf
England
The first European literary work not to be composed in Greek or Latin
The first page of the sole surviving medieval manuscript of Beowulf (c. 850).
The story of Beowulf is believed to have originated in the fifth century, existing first as an oral tale before being transcribed into a written work sometime between the eighth and eleventh centuries. The story made its way to the modern world through a single surviving manuscript that was likely written hundreds of years after the story was first told. It attracted little attention until the nineteenth century, when initial interest in the work focused on its historical insights into the lives of the early Anglo-Saxon people. The story’s merit as a work of English literature was recognized only after the author J. R. R. Tolkien (1892–1973) published a paper on its importance in 1936.
Many of the most heralded works of English literature, such as those by Geoffrey Chaucer, William Shakespeare, and John Keats, came and went before anyone took note of Beowulf’s existence, much less gave it enough thought for it to have any impact at all. However, the epic poem not only gives insights into pre-Norman Conquest English culture, but is also the first European literary work not composed in Greek or Latin. Beowulf sheds light on the heroic ideals of the Germanic Anglo-Saxon people, and its themes have come to influence poets, artists, and writers around the world.
“Let him … win glory before death … that is best at last for the departed warrior.”
Beowulf (c. 850)
Beowulf tells the tale of a Scandinavian warrior, the eponymous hero, who defends a Danish king’s home from a monstrous intruder. After defeating the monster and its mother, Beowulf returns home to become a king over his own people, later dying after fighting a dragon. It is written in Old English, the Germanic language from which modern English descends. MT
c. 850
Gunpowder
China
A man-made explosive consisting of potassium nitrate, carbon, and sulfur
Although the exact date of gunpowder’s origination remains a mystery, by the year 850 Chinese alchemists had identified the basic formula for the explosive. Discovered by accident during a search for the elixir of life, gunpowder was composed of a combination of potassium nitrate, carbon, and sulfur. Gunpowder explodes when exposed to a suitable heat source, releasing a tremendous amount of heat in addition to rapidly expanding gases, and those explosive properties were harnessed in the development of pyrotechnics, rock blasting, and firearms.
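The chemistry can be sketched with a simplified, idealized equation (a standard textbook summary, not the alchemists’ own formulation):

$$2\,\mathrm{KNO_3} + \mathrm{S} + 3\,\mathrm{C} \longrightarrow \mathrm{K_2S} + \mathrm{N_2} + 3\,\mathrm{CO_2}$$

The nitrogen and carbon dioxide on the right are the rapidly expanding gases; real black powder burns far less cleanly, leaving a mixture of solid residues as smoke and fouling.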
Gunpowder was being used in battle in China by 904, and forces of the Song dynasty (960–1279) later deployed gunpowder weapons against the invading Mongols. It appeared in the Middle East sometime in the middle of the thirteenth century, and its use spread to Europe shortly thereafter. By the time the Ottomans were using siege guns at Constantinople in 1453, gunpowder and the weapons based upon it had become an integral part of the practice of warfare.
“By chance they discovered a strange mixture that burned and exploded …”
Clive Ponting, Gunpowder: An Explosive History (2006)
The influence of gunpowder over the course of human history has been overwhelming. Although modern weapons rely on propellants far removed from the original formulation, the explosive powder changed warfare forever. Its destructive characteristics influenced everything from the development and shape of military fortifications to naval architecture and civil engineering. The invention also transformed the idea of what a warrior or soldier should be. No longer did a person have to be bigger, stronger, or more courageous than an enemy. Instead, all that was needed was a firearm, a good eye, and the ability to pull a trigger. MT
c. 900
Polyphony
Europe
A type of music that consists of two or more independent melodic voices
In the Western tradition, polyphony is usually distinguished from counterpoint or contrapuntal music, as exemplified by Bach’s fugues. The formal style of polyphony that originated in about 900 was largely associated with religious themes, but it is likely that polyphony as an informal style of music-making long predates such highly stylized forms of European cultural expression. “Polyphony” is a term with a variety of uses in the context of music and music theory. It can refer to the capacity of certain instruments, such as some keyboards, to play more than one note at a time, but it more often refers to a musical presentation involving two or more distinct but simultaneously sounded tones or chanted melodies.
“ … polyphony was for the most part confined to special occasions …”
Studies in the Performance of Late Medieval Music (1983)
The layering and texturing of melody can be seen as representative of the human condition insofar as the tapestry of human life and its meaning is difficult to communicate as single strands. Polyphony maps onto the human mind in a way similar to the phenomenon of motivation: something that is complex and fragmentary forms a coherent whole that insists on being heard. Polyphony can be seen as a contribution to pluralism, to the appreciation of valuable complementary differences.
The joy and fervor with which practitioners frequently approach music are undoubtedly increased when they add a polyphonic instrument to their repertoire. Polyphony engages the senses, and its complex tonal presentation certainly brings incomparable aesthetic benefits for the listener. JS
c. 1000
Vaccination
China
The administration of a safe material to promote immunity to a harmful organism
It was in Asia that humankind first realized that the likelihood of a person developing a dangerous disease could be reduced through a form of exposure to the disease that was safer than direct infection. In India, crushed, dried pustules of victims of smallpox were fed to children to help them develop antibodies to the disease. In China, around 1000, medical practitioners blew powdered smallpox pustule crusts into the nostrils of patients, which was a more effective means of encouraging the development of immunity. However, as long as actual smallpox pustules were used, there was a high risk of patients falling prey to the disease rather than developing immunity to it.
The practice of inoculation spread to Africa, Turkey, Europe, and the Americas, but it was English physician and scientist Edward Jenner (1749–1823) who pioneered a safe smallpox vaccination. In 1796 he carried out an experiment on an eight-year-old boy, James Phipps, inserting pus taken from a cowpox pustule into an incision on the boy’s arm. Jenner then proved that Phipps was immune to smallpox. A year later, Jenner submitted his findings to the Royal Society but was told he required more proof. He continued his experiments and self-published his paper in 1798. He was met with ridicule, but eventually the practice became widespread.
In 1885 French chemist and microbiologist Louis Pasteur (1822–95) developed a rabies vaccine. Vaccine research and development then progressed rapidly right through to the mid-twentieth century, leading to vaccines for chickenpox, diphtheria, tetanus, anthrax, cholera, plague, typhoid, tuberculosis, mumps, measles, polio, influenza, hepatitis A and B, and meningitis. At the same time, legislation began to be introduced to regulate vaccine production and enforce vaccination. The eventual eradication of smallpox was a direct result of this systematic program. CK
c. 1000
Guild
Europe
The banding together of merchants or craftspeople to present a united front against outside competitors who might threaten their livelihoods
An illustration from 1602, showing the insignia of various guilds in Umbria, Italy. Such emblems were often incorporated into the signs of the shops that plied the relevant trade.
The guild is commonly thought of as a medieval and early modern European institution, although similar organizations in different cultures have also been described as guilds. A guild is usually a self-organized group of merchants or crafters that works to promote the economic and professional interests of members of that group. During the Middle Ages (from the fifth until the fifteenth century), guilds developed gradually from religious organizations that conducted charitable and social activities, so it is difficult to date them precisely. In England, merchant guilds were active from the beginning of the eleventh century, and craft guilds from before the end of the twelfth century.
“The spirit of the medieval Western guild is most simply expressed in the proposition, guild policy is livelihood policy.”
Max Weber, General Economic History (1927)
Both merchant and craft guilds were characterized by attempts to establish monopolies. Merchant guilds held the exclusive rights to trade within a town and controlled mercantile infrastructure, such as weights and measures; they often became the governing body of their towns. Craft guilds attempted both to monopolize their craft—requiring craftspeople to join the guild, establishing tariffs against outside competition, and so forth—and to regulate its practitioners. The hierarchy of apprentice, journeyman, and master is characteristic of the medieval craft guild. But such guilds also established quality standards, labor practices, and price schedules, working to ensure a good standard of living for their members.
The guild system began to decline in the sixteenth century due to the emergence of new economic systems. Attempts to revive it, such as the guild socialism movement of the early twentieth century, have not been successful. The system persists to a degree today in specialized and traditional crafts, in professions such as law, medicine, screenwriting, and realty, and in ceremonial groups such as the livery companies of London. GB
c. 1000
Courtly Love
Europe
A rethinking of the ideas of Christian love, marriage, and virtue
An illustration for Giovanni Boccaccio’s epic poem The Teseida (1340–41), showing the scribe dedicating the work to an unknown young woman.
Whereas the religious coloring of the Middle Ages necessitated a strict commitment to nonadulterous relationships and marital fidelity in all aspects of life, the Age of Chivalry, commencing around 1000 in the High Middle Ages, saw a distinct change in attitudes among nobles toward marriage and affection. The chivalric code embraced extramarital affairs that were highly ritualized and entirely secret. This transformed relationships between men and women of noble birth, creating conditions for intimate relationships that were impossible within the obligatory and often stifling roles assigned to married persons.
“[Courtly love is] love of a highly specialized sort, whose characteristics may be enumerated as Humility, Courtesy, Adultery, and the Religion of Love.”
C. S. Lewis, The Allegory of Love (1936)
Between the eleventh and thirteenth centuries, the primary theme of courtly love, known to the French as fin amour, was made famous by troubadours, or traveling lyric poets. These poets, who were sometimes musicians also, helped to record the principles of courtly love for posterity, and even shape them contemporaneously, raising the idea of courtly relationship to sublime heights.
Courtly love is probably most responsible for the detachment of love from marriage as an institution, a social fact that has made its way into secular culture and transformed modern sensibilities about the nature of the relationship between married couples. Husband and wife are no longer expected simply to bow to the accepted social conventions about obedience and raising a family, all of which can take place in arranged marriages. Instead, romantic love, in which each lover commits in a worshipful way to the other in a manner historically associated with religious conviction, has become the focus and expectation of intimate relationships between a man and a woman. The new intimacy between couples ushered in by courtly love served as a foundational element of the modern distinction between public and private spheres of interest. JS
1047
Consubstantiation
Berengarius of Tours
The belief that the body and blood of Christ coexist with blessed bread and wine
A detail from an altarfront in Torslunde, Denmark, dating to 1561, showing Martin Luther offering communion to worshippers at a baptism.
The doctrine of consubstantiation is a view of Holy Communion in which the body and blood of Jesus Christ coexist with the bread and wine. All are present, nothing has been lost; Jesus’s body and blood, and the bread and wine, are one. It is a difficult theology to entertain, particularly as there is no biblical text to support it. But neither was there a clear biblical foundation for the Catholic concept of transubstantiation—that the bread and wine change their metaphysical makeup when blessed at the Mass and become, in their entirety, the actual body and blood of Christ. Transubstantiation, however, had been the Church’s teaching for centuries, and the theology of the Mass was at the very center of the Catholic universe. There could be no coexisting of doctrines there.
“The concept of consubstantiation attempts to express in a tangible way (for the believer) the notion of the infinite in the finite without reducing the two terms to a common denominator.”
Brian Schroeder, Altared Ground (1996)
In 1047, Berengarius of Tours (c. 999–1088), archdeacon of Angers, proposed consubstantiation in an attempt to explain why the bread and wine seem to remain bread and wine even after they have been consecrated by the priest. According to Berengarius, “the consecrated Bread, retaining its substance, is the Body of Christ, that is, not losing anything which it was, but assuming something which it was not.” In saying “not losing anything which it was,” Berengarius was clearly rejecting the long-held teaching of transubstantiation, and his interpretation was condemned at church councils in 1050, 1059, 1078, and 1079. He clung to his proposal with passion, however, calling Pope St. Leo IX ignorant for refusing to accept it. Only his powerful connections prevented his being burned as a heretic.
The idea of consubstantiation nevertheless persisted in the Church, and it was given fresh impetus centuries later by the monk and theologian Martin Luther (1483–1546), despite being considered fundamentally heretical in churches of both Catholic and Protestant persuasion. JS
c. 1100
Gothic Ideal
Abbot Suger
An architectural style that emphasized height and soaring lines
In architecture, sculpture, painting, illuminated manuscripts, and the decorative arts, the Gothic ideal reigned supreme in Europe from the twelfth to the sixteenth century. But the late medieval “gothic” style was originally so named pejoratively by Neoclassicists who looked back to the Greco-Roman era for their ideal. The architectural motifs first found in the Royal Abbey Church of Saint-Denis, just north of Paris, rebuilt in c. 1140 under Abbot Suger (1081–1151), and in the great cathedrals, including pointed arches, flying buttresses, ribbed vaults, and stained-glass windows, were seen as barbaric and excessive when compared to the clean lines of an ancient Greek temple such as the Parthenon.
“The principle of the Gothic architecture is infinity made imaginable.”
Samuel Taylor Coleridge, poet and philosopher
In the eighteenth century the Gothic Revival style appeared in Britain and continental Europe, where it was seen as appropriate for both government buildings, such as the British Houses of Parliament, and homes, most famously Strawberry Hill, just outside London. Gothic motifs were popular in architecture, furniture, ceramics, and jewelry. At the same time, the “gothic” novel appeared as an outgrowth of the Romantic movement. Often set in medieval castles and populated with heroes and damsels in distress, these works of horror, suspense, and mystery featured supernatural plot points and characters, usually as villains. This genre is exemplified by English author Mary Shelley’s Frankenstein; or, The Modern Prometheus (1818). American Gothic, such as Washington Irving’s The Legend of Sleepy Hollow (1820), and Southern Gothic, such as the contemporary Southern Vampire Mysteries series by Charlaine Harris, are further variations on the theme. PBr
c. 1100
Flying Buttress
France
An architectural support that revolutionized architecture
Bourges Cathedral in France, built between c. 1200 and c. 1300, is a masterpiece of Gothic art.
The flying buttress is an arched, flanking support system that came of age in the Gothic era, from the twelfth century, most notably in high-roofed churches. The buttress is referred to as “flying” because it is in part unsupported, springing in the form of an arch from its masonry block support to the wall. High roofs created extreme forces capable of pushing the walls outward unless they were redirected toward the ground. Without flying buttresses countering these forces, the high walls would collapse. While the Cathedral of Notre-Dame in Paris is a well-known example of a church featuring flying buttresses, prototypes of the general engineering principles of the flying buttress are found much earlier, in some Roman and even Byzantine architecture.
“While the daybreak was whitening the flying buttresses [he saw] a figure walking.”
Victor Hugo, The Hunchback of Notre-Dame (1831)
One advantage of employing flying buttresses was that fewer materials were needed to create walls, and heavier and far larger stained-glass windows could be introduced into churches, thereby providing greater beauty upon which the eyes of worshippers could gaze, indeed drawing the eyes of worshippers upward toward the heavens in more immediate contemplation of God. The buttresses themselves became not mere practical supports but objects upon which sculptors could introduce ever more intricate designs and figures. The flying buttress greatly expanded the potential for societies with limited, rudimentary engineering to create vast structures. Architects were empowered to urge believers to greater religious conviction and love of the infinite power of the Creator than they ever could have inspired with buildings of lesser scale. JS
c. 1100
The Ontological Argument for God
St. Anselm
An argument put forward by a Christian cleric to provide logical proof for the existence of God
A copper engraving of the Benedictine monk St. Anselm, made in 1584 by the French Franciscan priest and explorer André de Thévet. The colorization was added later.
The original and most rigorous formulation of the ontological argument for the existence of God is credited to St. Anselm (1033–1109). In Anselm’s argument, the existence of God is taken to be a necessary conclusion based on examining the concept of God.
“God cannot be conceived not to exist. God is that, than which nothing greater can be conceived. That which can be conceived not to exist is not God.”
St. Anselm, Proslogion (1077–78)
One formulation of the argument goes as follows. Premise 1 (P1): God is that being than which none greater can be conceived. Premise 2 (P2): It is possible that God exists only in the mind. Premise 3 (P3): If God existed only in the mind, then a being greater than God could be conceived, namely, one that also exists in reality. P3 contradicts P1; and since P3 rests on P2, while P1 is more certain than P2, it is P2 that must be rejected. It may be concluded that God exists both in the mind and in reality.
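Set out schematically (an illustrative formalization rather than Anselm’s own wording, with $g$ standing for “that than which nothing greater can be conceived” and $E(x)$ for “x exists in reality”), the reductio runs:

$$
\begin{aligned}
&\text{1. Assume } \neg E(g)\text{, that is, } g \text{ exists in the mind alone.}\\
&\text{2. A being conceived just like } g \text{ but also existing in reality would be greater than } g.\\
&\text{3. So something greater than } g \text{ could be conceived, contradicting the definition of } g.\\
&\text{4. Therefore } E(g)\text{: } g \text{ exists in reality as well as in the mind.}
\end{aligned}
$$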
This argument has been subjected to a number of sophisticated treatments, both for and against, by philosophers from a variety of traditions. In Meditations on First Philosophy (1641), René Descartes offered a version of the ontological proof in his effort to secure human knowledge from the skeptics. Immanuel Kant famously argued that the argument is not properly formulated because existence is not a predicate. Most recently, Alvin Plantinga offered an updated version of the argument, although it is less compelling than St. Anselm’s original.
One of the most interesting things about the ontological argument for the existence of God is just how flimsy it is. That the move from analysis of a concept to assertion of ontological commitment should ever have been thought convincing is itself testimony to the desperate zeal with which the human mind approaches metaphysical issues such as God’s existence. Yet the argument also indicates how creative the human mind can be when confronted with a problem as heady as trying to prove the existence of a transcendent, higher power. JS
c. 1100
Scholasticism
St. Anselm
A method of learning that places a strong emphasis on conceptual analysis
The theological rigor of the mostly monastic thinkers in the first half of the Middle Ages gave way in the second half to Christian academic intellectuals who deployed the methods of dialectical reasoning and conceptual analysis used by Greek philosophers to address problems now associated with philosophy proper. These academicians are known as scholastics, or “school men,” because they taught at the European universities.
The scholastics first worked to reconcile the positions of various respected Christian thinkers with ancient philosophers such as Aristotle and Plato, before moving to the defense and explication of orthodox Christian positions against a world of plural viewpoints and opinions. Individuals such as St. Anselm (1033–1109), St. Thomas Aquinas (1225–74), Duns Scotus (c. 1266–1308), and William of Ockham (c. 1287–1347) adopted a set of metaphysical and religious premises against which to view and consider all philosophical claims. These scholastics concerned themselves largely with problems such as freedom of will, nominalism and realism, and the existence of God.
The philosophical approach and body of work of the scholastics remain a rich source of ideas for those who reflect on ontology and the nature of reality. The scholastics provided Western philosophy with a set of provocative theses, against which many modern thinkers have rebelled, producing notable philosophical works in their own right. For example, much of the work of René Descartes can be viewed as a reaction to the methods and conclusions of the scholastics, and explicitly so. But perhaps the most profound contribution the scholastics made to intellectual history was a concerted effort at reviving the conceptual analysis found in ancient philosophy, and in some cases they even surpassed the ancients in their efforts. JS
c. 1150
Wabi-sabi
Fujiwara no Shunzei
There is beauty in the impermanence and imperfection of things
Wabi-sabi is a Japanese aesthetic paradigm that is prevalent throughout all forms of Japanese art. It is a compound term that consists of two related concepts: wabi and sabi. Though wabi-sabi gained prominence with the flourishing of the tea ceremony in the early Tokugawa period (1603–1867), its origins can be traced back to the earliest use of sabi by the twelfth-century court poet Fujiwara no Shunzei (1114–1204).
By itself, wabi refers to the idea that beauty is found in simplicity. Understatement is always more powerful than overstatement, as is evident in the subtlety of color and lines, and in the use of empty space in Japanese architecture and art. Minor imperfections actually make an object more beautiful than a flawless specimen. Sabi includes the aesthetic of desolation (particularly in landscapes) and of an aged, worn appearance in handcrafted objects. This epitomizes not only the Zen notion of emptiness, but also the idea of impermanence. Objects with a rustic patina are considered to have matured with experience. The concept that combines the two, wabi-sabi, celebrates the incomplete and transient nature of things. The sixteenth-century Kizaemon tea bowl is one of the great examples of this aesthetic: its worn, asymmetrical appearance heightens the appeal of its desolate charm. Classic works of Japanese architecture, such as the Silver Pavilion or the dry landscape garden of Ryoanji, both in Kyoto, illustrate the minimalistic beauty of wabi-sabi.
“Loneliness—The essential color of a beauty not to be defined.”
Jakuren, Buddhist priest and poet
Wabi-sabi is the single most important concept in Japanese aesthetics. Heavily influenced by Zen, it governs all Japanese art forms. JM
c. 1181
Holy Grail
Chrétien de Troyes
A mystical cup that was thought by Christians to have healing properties
A handcolored etching of the Holy Grail, from a series of illustrations for Richard Wagner’s opera Parsifal (1882).
The Holy Grail is first mentioned in the Arthurian romance Perceval, Le Conte du Graal (c. 1181) by Chrétien de Troyes (c. 1135–c. 1183). The Grail itself is simply a beautifully decorated chalice, or cup, used to hold the Mass wafer, which Catholics receive as the literal, transubstantiated body of Christ. In the story, the wafer sustains the injured Fisher King, who lives by this bread alone. In its earliest conception, therefore, the Holy Grail is best thought of as a romantic medieval appropriation of the Eucharist, which brings health to those who partake of it.
The thirteenth-century poet Robert de Boron added to the Grail legend by describing it as the combination of the chalice Jesus used at the Last Supper and the blood of Jesus that Joseph of Arimathea saved during the crucifixion. In this way, Joseph of Arimathea became the first of the Grail guardians, and it was his task to keep the Grail safe until it could help in healing the faithful. In later Arthurian romances, the “Grail Quest” is undertaken by King Arthur’s knights as a means to help restore Camelot—the near paradisiacal kingdom on Earth—which is being torn apart by sin.
“Then looked they and saw a man come out of the holy vessel …”
Sir Thomas Malory, Le Morte d’Arthur (1485)
Although the Holy Grail has gradually become more than a simple metaphor for the Eucharist, it still retains the strong Christian notion that Jesus’s sacrifice makes possible redemption not only as the healing of moral brokenness (the forgiveness of sins) but also as the healing of nonmoral brokenness (the restoration of broken bodies, dying lands, and so on). The legend of the Holy Grail depicts humanity’s quest for redemption, but also hints at what that redemption might look like. AB
1191
Zen
Myōan Eisai
The concept that enlightenment may be realized through quiet meditation
Zen is a religious and philosophical tradition established by Myōan Eisai (1141–1215), who studied Chan Buddhism in China and founded Japan’s first Zen temple in 1191. The Chan School traces its own origins to Bodhidharma, the legendary Indian monk who brought Mahayana Buddhism to China and is associated with the founding of the Shaolin temple. Mahayana Buddhism began to incorporate elements of Daoism, which led to the simplified, experience-driven approach of first Chan, and then Zen.
“Zen … turns one’s humdrum life … into one of art, full of genuine inner creativity.”
D. T. Suzuki, author and lecturer
Like Indian Mahayana Buddhism, Zen asserts that suffering in the world comes as a result of our ignorant attachment to false ideals, particularly the concept of a permanent self. The true nature of reality is engi, or interdependent arising, in which everything is part of a dynamic, interrelated web of being. All things are impermanent and nothing exists apart from the natural and social context in which it is embedded. Through meditative practices, a person can experience the truth of engi and gain satori (enlightenment), which is characterized by mushin, a state of “no-mind” that perceives things as they truly are without abstraction. Zen training involves the cultivation of two main virtues: chie (wisdom about the true nature of reality) and jihi (compassion for all sentient beings). The two most dominant schools of Zen are Sōtō, which focuses upon seated meditation, and Rinzai, which emphasizes the contemplation of kōans, or paradoxical riddles. The cultivation of mushin results in a type of hyperpraxia in which a person’s performance of any task is greatly enhanced, and many artists since the samurai era have studied Zen to augment their abilities. JM
c. 1200
Theistic Satanism
Europe
The view of Satan as a godlike being that is deserving of worship as a hero
The Witches’ Sabbath (1797) by Francisco de Goya. The goat in the center is the Devil in its beastlike form.
Atheistic Satanists—such as Anton LaVey (1930–97)—claim that Satan is not a real being but rather a symbol of absolute individualism. In contrast, proponents of theistic Satanism agree with the Abrahamic religions that Satan is a real being; however, unlike those religions, they believe that he is worthy of worship.
The origins of Satanism are hard to trace, but the concept is believed to have first appeared in Europe around the twelfth and thirteenth centuries. Part of the difficulty in dating the practice is that many people in history have been accused of forms of Satanism—such as those put on trial for witchcraft during the sixteenth and seventeenth centuries—but it is generally thought unlikely that many of them actually participated in Devil worship.
“Better to reign in Hell than serve in Heaven!”
John Milton, Paradise Lost (1667)
Satanism became a popular theme in literature from the eighteenth century onward, and in 1891 Joris-Karl Huysmans offered a detailed description of the Black Mass in his novel Là-bas (Down There). Traditionally, the Black Mass is a central rite of Satanism, and the ceremony is usually said to combine an inverted form of the Christian Eucharist with elements of magic.
Today, theistic Satanism is generally understood to be an explicit Satanist religion in which believers enter into pacts with demons, celebrate the Black Mass, offer animal sacrifices, and so on. This explicit Satanist religion is multidenominational with widely varying characteristics, but all agree that Satan is right to rebel against God, who is understood—partly in response to poets such as John Milton and theologians such as John Calvin—to be an arbitrary tyrant. AB
c. 1200
Robin Hood
England
The legendary outlaw who robbed from the rich to give to the poor
According to popular legends, Robin Hood was a skilled archer who, along with his “merry men” (including Little John and Friar Tuck), lived in Sherwood Forest. Together, they carried out robberies of the rich and then gave their plunder to the poor. Although the government at the time generally condemned Robin and his men as treasonous criminals (centuries later Guy Fawkes was spoken of as a “Robin Hood”), storytellers and poets have portrayed him as a heroic outlaw.
We do not know for certain whether a man named Robin Hood ever lived, although the evidence to suggest as much is compelling. Some of the earliest references to him appear in English legal documents from the first half of the thirteenth century, leading many to believe that Robin Hood was a historical figure, and indeed an outlaw. He must have been a remarkable outlaw, because legends quickly grew up around him and his exploits. Although never consistently told, these legends were immortalized in songs and poetry by well-known English writers, such as William Langland and William Shakespeare.
“He was a good outlawe, and dyde pore men moch god.”
A Gest of Robyn Hode (c. 1500)
The legend and legacy of Robin Hood are not so much about politics—a peasant revolt against a corrupt government—as they are about ethics. Robin Hood is a legendary figure because he appeared to embody an ethical dilemma: a person who committed an injustice (robbing) to commit an act of love (giving to the poor). However, today he is admired worldwide as a figure who did not unjustly take (rob), but rather justly took (reclaimed) for the weak what was unjustly taken from them in the first place by the strong. AB
1202
Fibonacci Numbers
Leonardo Bigollo (Fibonacci)
A number sequence in which each number equals the sum of the previous two
Fibonacci numbers are a sequence of numbers in which the third number, and every number after that, is the sum of the previous two numbers (0, 1, 1, 2, 3, 5, 8, 13, 21, 34 …). The sequence can begin with 0 and 1, with the third number 1, the fourth 2, and so on; or 1 and 1, where the third number is 2, the fourth is 3, and so forth.
The sequence was named after the Italian mathematician Leonardo Pisano Bigollo (c. 1170–1240), also known as Fibonacci, who was one of the finest mathematicians of the Middle Ages. He introduced the sequence to the Western world in his Liber Abaci (Book of Calculation) in 1202, although it had appeared much earlier in Indian Sanskrit traditions dating back as far as 200 BCE. In Liber Abaci, Fibonacci illustrated the sequence with a growing community of rabbits, using it to calculate how many pairs could be produced over a year if a person began with a single pair, assuming that each pair matures at the age of one month and thereafter gives birth every month to a new male–female pair, each of which then begins its own reproductive journey.
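As a minimal sketch (modern code offered purely as an illustration; the helper names are mine, and the rabbit assumptions are those of the retelling above), the recurrence and the puzzle can be written in a few lines of Python:

```python
def fibonacci(n):
    """Return the first n Fibonacci numbers, starting 0, 1, 1, 2, ..."""
    sequence = []
    a, b = 0, 1
    for _ in range(n):
        sequence.append(a)
        a, b = b, a + b  # each new term is the sum of the previous two
    return sequence

def rabbit_pairs(months):
    """Total rabbit pairs after a given number of months.

    Assumptions: start with one newborn pair; a pair matures in one
    month and thereafter produces a new male-female pair every month.
    """
    juveniles, adults = 1, 0
    for _ in range(months):
        # Juveniles mature into adults; every adult pair breeds a new juvenile pair.
        juveniles, adults = adults, adults + juveniles
    return juveniles + adults

print(fibonacci(10))    # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
print(rabbit_pairs(12)) # 233 -- the monthly totals retrace the sequence itself
```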
“There are no numbers in mathematics as ubiquitous as the Fibonacci numbers.”
The Fabulous Fibonacci Numbers (2007)
The sequence can also be seen in biology, with the number of petals on daisies aligning with Fibonacci numbers: black-eyed daisies have thirteen petals, Shasta daisies twenty-one, and field daisies thirty-four. If you ever want to draw a mathematically correct pine cone, just draw five spirals going one way and eight going the other. And have you ever wondered why it is so difficult to find a four-leafed clover? Because four is not a Fibonacci number. BS
1215
Trial by Jury
England
A legal proceeding in which the verdict is determined by members of the community
A tile from the campanile of Florence Cathedral, c. 1300, showing Phoroneus, personification of the law.
Although there were trials by jury in ancient Athens and ancient Rome, current legal systems that use trial by jury usually descend from or are influenced by the English common law, which traces a codified right to trial by jury to the provisions of the Magna Carta, an agreement forced on King John of England in 1215 by his barons. Article 39 of the Magna Carta provided that “no free man” could be imprisoned, outlawed, or exiled without “the lawful judgment of his peers.”
Trial by jury is a legal procedure in which members of a jury, or jurors, are responsible for deciding the facts of the case: for example, whether the defendant is guilty in a criminal trial. Jurisdictions vary in their use of trial by jury; it is generally more common in criminal cases than in civil cases.
“A court is only as sound as its jury, and a jury is only as sound as the men who make it up.”
Harper Lee, To Kill a Mockingbird (1960)
Jurors are typically selected at random from a pool of citizens, so the jury is presumed to be impartial. Those who support trial by jury argue that it provides a check on the power of the government, allows the voice of the community to be heard in the legal system, and demonstrates that the power of the state is under the people’s control. Critics, however, object that juries can be—and have been—swayed by personal feeling or community prejudice. They also object that juries are not capable of understanding scientific and statistical evidence, both of which are increasingly deployed in court. The chief modern alternative to trial by jury is found in the system of civil law, in which a judge, or a panel of judges, is responsible for deciding the facts of the case (sometimes in collaboration with a jury). GB
1215
Magna Carta
England
A charter limiting the king of England’s powers and specifying citizens’ rights
The Magna Carta—Latin for “great charter”—is a document signed in June of 1215 by King John of England (1166–1216). The charter limited the power of the English king and guaranteed, in part, that citizens would not be subject to the whims of the sovereign, but must be dealt with under the terms of English law that had existed prior to the Norman conquest in 1066.
King John’s rule in early thirteenth-century England had not been going well by the time his barons forced him to sign the Magna Carta and give up some of his powers. Under the feudal system, the barons had an obligation to pay the king taxes and to support him by providing troops during times of conflict. When the king lost his territories in France and started raising taxes without consulting the barons, they, along with Archbishop Stephen Langton, rebelled and forced the king to sign the Magna Carta after capturing London.
“To no one will we sell, to no one will we refuse or delay right or justice.”
Magna Carta (1215)
By limiting the power of the English throne and recognizing citizens’ rights, the Magna Carta became one of the first steps toward the establishment of English constitutional law. English settlers later took that history of constitutional law with them when they moved to New England, and the principles found in the Magna Carta were instrumental in both the writing of the Declaration of Independence (1776) and the formation of the U.S. Constitution (1787). Even though most of the individual rights guaranteed by the Magna Carta were later repealed, and the powers of English sovereigns during the Middle Ages were, for all practical purposes, total, its symbolic importance has played a guiding role in the development of modern democracy. MT
1225
Theosophy
Unknown
A system of esoteric philosophy concerning the nature of divinity
In common use as a synonym for theology as long ago as the third century CE, it was not until the thirteenth century that theosophy was categorized as being something very different from mainline theology and classical philosophy. In his Summa Philosophiae of 1225, the author, scientist, and Bishop of Lincoln Robert Grosseteste (1175–1253) described theosophists as “authors inspired by holy books”—a very different calling from that of theologians, who were charged with the task of teaching spiritual truths.
Theosophy comes from the Greek theosophia, meaning “divine wisdom.” It encourages a person to question established beliefs and to think critically, and it is committed to a set of religious and philosophical ideas that it strives to show are common to all religions. Its goal for the individual is enlightenment and self-actualization through reincarnation and karma. It is not, however, a standalone religion, and it does not have a religious-style hierarchy. There are no clergy, and informal study sessions take the place of formal congregational gatherings.
The Theosophical Society was founded in New York City in 1875, and began combining Eastern religious traditions with Western esoteric beliefs in an attempt to achieve what its cofounder—Helena Blavatsky (1831–91), the Russian-born scholar of ancient wisdom literature—termed a “synthesis” of beliefs, called the Perennial Religion. Critics, however, claimed that all Blavatsky had learned she had simply read from books, not acquired personally from teachers and monks as she professed.
Theosophy today continues to foster understanding between competing faiths, and above all asks for selfless service from its adherents. As theosophical author Charles Webster Leadbeater once said: “In this path growth is achieved only by him who desires it not for his own sake, but for the sake of service.” BS
1247
Mental Asylum
Europe
An institution dedicated to offering care and treatment for those suffering from mental illness
An engraving by William Hogarth, of the final image in his eight-painting series A Rake’s Progress (1735). This concluding scene is set in Bethlem Royal Hospital mental asylum.
Also known as the psychiatric institution, lunatic asylum, mental hospital, madhouse, or, colloquially, the nuthouse, a mental asylum is a place where people receive care and treatment for an acute or chronic mental illness. Perhaps the world’s oldest existing mental asylum, Bethlem Royal Hospital in London, England, has been operating for more than 600 years. It was founded by Christians in 1247 as a simple shelter for homeless people, but gradually its activities came to be focused specifically on helping the “mad.”
“We do not have to visit a madhouse to find disordered minds …”
Johann Wolfgang von Goethe, writer
While places to house and protect the insane existed for centuries after that, a significant transition from lunatic asylum to mental hospital coincided with the rise of institutionalized psychiatry in the nineteenth century. Prior to this, madness was treated as a purely domestic problem—it was up to families and their parish authorities to care for the mentally ill. Private madhouses did exist for those who were considered extremely violent or who could not be cared for by relatives, but such places were affordable only to the wealthy. Consequently, the insane were often sent to workhouses or correctional institutions. Public and charitable asylums began to appear in the seventeenth and eighteenth centuries in England, but they were few and far between, and often at full capacity as a result.
With the rise of institutionalized psychiatry came national funding for hospitals, accessibility for all social classes, and mandatory moral treatment of all patients. The nineteenth century saw the development of treatments that no longer relied on restraints or coercive methods, and in the twentieth century came the rise of physical and drug therapies. Today, rather than locking away the mentally ill, asylums offer treatments to enable the ill to learn to function better within society. KBJ
1268
Eyeglasses
Unknown
Objects worn over the eyes that use lenses to correct deficiencies in the wearer’s vision
Virgil With Spectacles, a painting by Tom Rink. Mostly worn by monks and scholars, early forms of eyeglasses were held in front of the eyes or balanced on the nose.
People have been using magnification devices to assist with seeing things since at least Roman times. Seneca the Younger (c. 4 BCE–65 CE) allegedly used a glass globe filled with water in an attempt to magnify text in his later years. By 1000, reading stones, similar to what we know as magnifying glasses, were in use, though they were unwieldy and not wearable. Although it is not exactly clear who invented them, eyeglasses existed by at least 1268, when English philosopher Roger Bacon (c. 1220–94) first wrote about them.
“I always think about what it means to wear eyeglasses … I think about all the people before eyeglasses were invented. It must have been weird because everyone was seeing in different ways according to how bad their eyes were.”
Andy Warhol, artist
Eyeglasses offered a practical, portable means of correcting vision. Early spectacles were made from quartz lenses, but later advances in glassmaking allowed inventors to pair the optics with lighter frames that could be more easily worn. It was not until 1604, however, that an explanation for why eyeglasses worked was put forward. In a treatise on optics, German mathematician Johannes Kepler explained that vision was a process of refraction within the eye, and that people with blurred vision had an optical defect which meant that light rays were focused either in front of or just behind their retina—a problem that lenses corrected by redirecting the light. By the 1780s, Benjamin Franklin, a printer by trade, had created bifocals, allowing those with more than one vision problem to rely on a single set of glasses.
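Kepler’s insight can be restated in modern terms with the standard thin-lens relation (a textbook illustration, not anything stated in his treatise). Opticians rate lenses by their power in diopters, the reciprocal of the focal length in meters; for a nearsighted eye that can focus nothing beyond a far point at distance $d$, a corrective lens must refocus distant objects at that far point:

$$P = \frac{1}{f} = -\frac{1}{d}$$

So, for example, an eye with a far point of 0.5 m calls for a lens of roughly −2 diopters, while a farsighted eye needs a positive power instead.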
The earliest eyeglasses were used primarily to enable people to continue reading after they developed hyperopia, or farsightedness, as they aged. Early scholars likely relied on eyeglasses to maintain their ability to read and write and to contribute to the development of new ideas. As eyeglasses became more common and cheaper to produce, they allowed millions of people to engage in pursuits that their vision problems would otherwise have prevented. Moreover, they became one of the first widely available medical devices that used technology to overcome physical problems. MT
1274
Doctrine of Double Effect
St. Thomas Aquinas
A set of ethical criteria for carrying out an act that would usually be morally wrong
An old manuscript of the Summa Theologica (1265–74) by St. Thomas Aquinas. Aquinas’s work was intended as a manual for beginners in theology.
Also referred to as the principle of double effect, the doctrine of double effect is a conceptual distinction made by Catholic theologians in the Middle Ages to explain and justify a very specific aspect of the rules of engagement of war, although it has since been used in analyzing a variety of contemporary moral problems, including euthanasia and abortion.
“Nothing hinders one act from having two effects, only one of which is intended …”
St. Thomas Aquinas, Summa Theologica (1265–74)
The Christian theologian and philosopher St. Thomas Aquinas (1225–74) is credited with introducing the doctrine of double effect into Catholic thinking in his unfinished work Summa Theologica (1265–74). Aquinas argued that killing an aggressor is permissible if a person does not intend to kill the aggressor and intends only to defend themselves. This, in part, established the primacy of self-defense as the only justification for going to war. A more generalized interpretation of the doctrine is that it permits acting in such a way as to bring about foreseeable but unintended harms, provided that there is a greater intended good as the end result. In such instances, the harmful effect is seen as something like a side effect of the action (the primary aim being the good result that is to come in the end), although agents are still required to minimize the foreseeable harms where and when possible.
The doctrine of double effect helped to establish the framework for justified warfare among agreeing nations in the West for centuries. More importantly, it serves as a crucial non-absolutist position whereby agents must scrutinize and minimize potential harms that could result from their actions, giving additional autonomy to people to make decisions in situations that are not amenable to simple solutions. In short, the doctrine of double effect recognizes and reflects the inadequacy of simple, formulaic rules for determining justified actions in complex situations. JS
1274
Five Proofs of God’s Existence
St. Thomas Aquinas
Intellectual arguments for the existence of God, based on reason and observation
St. Thomas Aquinas receiving the Holy Spirit (in the shape of a dove), by Andrea di Bartolo (c. 1368–1428).
Of the many attempts to prove the existence of God, arguably the best are the five proofs, or five ways, that were offered by St. Thomas Aquinas (1225–74) in his unfinished work Summa Theologica (1265–74). They are: motion (motion cannot begin on its own and so there must have been a first mover); causation (the sequence of causes that created the universe must have a first cause); contingency of creation (all things depend upon other things for their creation, so there must be a being that caused the initial creation); degrees of perfection (there are differing degrees of beauty in the universe, so there must be a perfect standard—God—to which all things are compared); and intelligent design (the universe follows laws that appear to have order, which implies the existence of a Great Designer).
“It is better to … deliver … contemplated truths than merely to contemplate.”
St. Thomas Aquinas, Summa Theologica (1265–74)
Technically more of an attempt to clarify the ways in which people conceptualize the creator as described in Christianity, the five proofs are generally seen as earnest arguments for the reality of God. Thus, like many other philosophical claims, the five proofs have taken on a significance different to the one intended by their author. After all, Aquinas, being a Roman Catholic, already knew of God’s existence through faith.
The adequacy, or inadequacy, of the five proofs brings up the provocative issue of the relationship between the divine and rationality. Indeed, the five proofs highlight the strained relationship between that which is known a posteriori (through experience) and that which is known a priori (through reason), which then calls into question the priority of philosophical inquiry over scientific inquiry, and vice versa. JS
1274
God and Causation
St. Thomas Aquinas
The view that God does not work directly in the world, but through secondary causes
In his unfinished work Summa Theologica (1265–74), philosopher, priest, and theologian St. Thomas Aquinas (1225–74) refers to God as the “Primary Cause” of all of creation, which God then sustains through his presence. The inhabitants of God’s creation—including humankind—are his “Secondary Causes.” The idea of “causation” is not always as linear as the example of creator followed by creation suggests. The “chicken and the egg” causality dilemma (which came first, the chicken or the egg?) means different things to different people. A literal reading of Genesis makes it clear that the chicken (God) came first; but in evolution it is the egg that first appeared.
“All intermediate causes are inferior in power to the first cause …”
St. Thomas Aquinas, Summa Theologica (1265–74)
According to René Descartes (1596–1650), a primary cause is able to “cause itself” and is not dependent upon anything before it for its existence. For Aquinas, creation was the radical “causing” of the universe—it was not a change to the universe, or to space or time; it was not an altering of existing materials. If anything had already existed to aid in or be added to the causing of the universe, then God would not have been the maker of it. As the initiator of the first, primary cause, God is responsible for the means by which all subsequent secondary causes are enabled and sustained. These secondary causes are truly causal, and are variable and arbitrary according to the whims and vagaries of their agents, whether they are humans, or the laws of nature, or the mechanics of physics. For Aquinas, humans cause their own actions and God influences the actions of humans, and neither impinges upon the freedom of the other. BS
c. 1280
Mechanical Clock
Unknown
A mechanical device that enabled the accurate measurement of time
The mechanical clock—a self-sufficient device that is able to measure time without requiring an external power source once it has been set up for a cycle of its limited period of operation—appeared in Europe in the thirteenth century, although no examples have survived to offer a precise date. However, from around 1280, increasingly frequent mentions of mechanical clocks in church records indicate that religious communities, in particular, were beginning to rely on mechanical clocks to keep them to their schedule of meals, services, and private prayer.
Since prehistoric times, humankind has devised a variety of ways to measure the passage of time, including the sundial, hourglass, and water clock. However, it was not until the invention of a mechanical balancing mechanism, known as an escapement, that mechanical clocks became a practical proposition. Chinese engineer Su Song (1020–1101) produced an escapement in the eleventh century, but because his clock relied on water movement for its power it could not be defined as a completely mechanical clock.
“By putting forward the hands of the clock you shall not advance the hour.”
Victor Hugo, poet, novelist, and dramatist
Early mechanical clocks regulated the motion of their gears not with pendulums—which were not applied to clocks until the mid-seventeenth century—but with a verge escapement and an oscillating weighted bar known as a foliot. They may have been inaccurate by as much as half an hour per day; by contrast, it would take a modern atomic clock 138 million years to lose or gain a second. Even so, the mechanical clock commodified time by encapsulating it into a measurable, divisible product. Clocks gave scientists the ability to measure time in breathtakingly brief moments, and entire industries and economies formed that rested on their progress. MT
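The pendulum’s later success rested on a piece of simple physics, sketched here in modern notation rather than anything a medieval clockmaker would have written: for small swings, the period of a pendulum depends almost entirely on its length L, according to T = 2π√(L/g), where g is the acceleration due to gravity. Because the period is nearly independent of how forcefully the mechanism drives the swing, a pendulum of fixed length beats at an almost constant rate, which is why pendulum clocks from the 1650s onward could cut errors from minutes per day to mere seconds.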
c. 1280
Kabbalah
Jewish religion
A Jewish tradition of mystical interpretation of the Torah
Sun and Kabbalistic symbols, from the seventeenth-century Greek codex Astrologicae Fabulae et Cabalisticae.
Kabbalah (meaning roughly “tradition”) is the dominant form of Jewish mystical theology. Although its devotees trace Kabbalah to Moses, or even to Adam, scholars largely agree that it developed in the eleventh and twelfth centuries. In around 1280, the Spanish rabbi Moses de León (1250–1305) published the Zohar, which became the foundational work of Kabbalah. The Zohar underwent a reinterpretation in the sixteenth century by Isaac Luria (1534–72), and Luria’s Kabbalism is sometimes distinguished from that of the Zohar.
Kabbalah distinguishes between God as he is in himself and as he is manifested. As he is in himself, God is beyond comprehension. God becomes manifest by means of a process of emanation, through ten powers or potencies known as the Sefirot. It is through the Sefirot that God reveals himself and through which he sustains the universe’s existence. Elements in the human psyche correspond to the Sefirot, and therefore moral or immoral behavior affects the harmony or disharmony of the universe. Humans are thus central to the system of the universe. Kabbalah also provides esoteric readings of the Torah and surrounding texts.
“The kabbalah that appeared more than 800 years ago … is still present …”
Joseph Dan, Kabbalah: A Very Short Introduction (2006)
Within Judaism itself, Kabbalah’s popularity varies. It was tainted by the fall of a false messiah in the seventeenth century, and in the nineteenth century it was often rejected as importing pagan teachings or as superstitious nonsense. But certain sects revere it as sacred, and elements of Kabbalah have become widely adopted in Judaism. Outside Judaism, Kabbalah sometimes attracts curious non-Jews, especially when incorporated in mystical or New Age traditions. GB
c. 1400
Nunchi
Korea
The subtle art and ability of listening to and gauging others’ moods
Nunchi is a Korean term that describes the ability to listen to and understand the mood of others, to see into their kibun (state of mind) and respond accordingly. In the West, this ability is seen as little more than “good instincts,” but in a society in which people are taught from childhood not to make their true feelings known, nunchi is an indispensable aid in communicating with those around you. The origins of the concept lie in Confucianism, a revised version of which, known as Neo-Confucianism, became prominent in Korea from the early fifteenth century.
Best translated as “the sense of eye,” nunchi is an important tool in navigating Korea’s high-context culture, in which words and word choices are seen as less important in conveying complex messages than they are in the West. In Korea, communication is less verbally explicit. For example, when a Korean asks somebody “Are you thirsty?” what they are more likely to mean is “I am thirsty, would you like to have a drink?” And so the proper response, if one’s nunchi is attuned, is not to say yes or no—that might upset the other person’s kibun—but simply to reply “What would you like to drink?” Nunchi is a kind of sixth sense, the art of seeking out visual and verbal clues in order to decipher them and thus understand what is really being said.
“Nunchi is used to discover another’s unspoken ‘hidden agenda’ …”
Theresa Youn-ja Shim, Changing Korea (2008)
Those who are not Korean need not despair. Anyone can develop nunchi by observing keenly, by pausing before answering a question, by talking to someone who is Korean, or simply by going to Korea and immersing themselves in the verbal and visual subtleties of everyday Korean life. JMa
1405
The Book of the City of Ladies
Christine de Pizan
An early defense of the virtues and achievements of women
An illustration from a French edition (c. 1411–12) of Christine de Pizan’s The Book of the City of Ladies (1405).
The Book of the City of Ladies (1405) is generally considered the first feminist book written by a Western woman. Its author, Christine de Pizan (1364–c. 1430), was born in Venice but spent most of her life in France, where she wrote in excess of forty literary works. Widowed at the age of twenty-five, she raised her three children on the income her writings provided. She did not write from a position of privilege.
In City of Ladies, Pizan creates a mythical city in which she installs three allegorical foremothers named Justice, Rectitude, and Reason, who preside over a populace where women are appreciated and respected. The author then begins a dialogue with the three women who, together, lift her out of the “slough of despond” into which the rampant misogyny of her time has placed her. The book was in part a response to the misogynistic writings of popular French male author Jean de Meun (c. 1240–c. 1305), who depicted women as vicious and immoral. However, Pizan’s treatise was far more than defensive in nature. She took great care to demonstrate the positive effect that women had on mediating the affairs of their troublesome menfolk, and encouraged women to use rhetoric as a means to assert themselves and to resolve social and family conflicts.
“Neither the loftiness nor the lowliness of a person lies in the body according to the sex.”
Christine de Pizan, The Book of the City of Ladies (1405)
The women in Pizan’s city are strong and articulate. They are scholars, prophets, inventors, painters, strong wives, and proud daughters, and together they offer up a portrait of womanhood that goes a long way toward correcting the skewed and paternalistic views of many male historians. They also provide a rare window into medieval womanhood. BS
1435
Linear Perspective in Painting
Leon Battista Alberti
A system that enabled artists to create the illusion of space and distance on a flat surface
An illustration and accompanying description from On Painting (1435) by Leon Battista Alberti, demonstrating his technique for creating perspective in an artwork.
Linear perspective occurs when, in a two-dimensional image such as a painting, converging lines meet at a single vanishing point, with all objects in the scene becoming smaller and smaller the farther away they are in the background of the image, according to a set scale. This idea was first written about by the Italian architect, author, and priest Leon Battista Alberti (1404–72) in his book On Painting (1435), generally considered to be the world’s first modern treatise on how to accurately represent three-dimensional images on a two-dimensional surface, such as a wall or canvas.
“Beauty—the adjustment of all parts proportionately so that one cannot add or subtract or change without impairing the harmony of the whole.”
Leon Battista Alberti
The problem of perspective had been looming for a while. After hundreds of years of overtly religious art devoted mostly to heavenly figures, artists in the Renaissance changed their focus to the world around them. Buildings, towns, and everyday objects began to appear in pictures, and it suddenly became important for an artist who wanted to depict the world realistically to work out how to paint landscapes and streetscapes so that they resembled what the viewer actually saw.
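In modern terms (the notation is ours, not Alberti’s), the rule at work is one of similar triangles: for an eye at distance d from the picture plane, an object of height y at distance D from the eye is drawn on the plane with height y′ = y × d / D, so an object twice as far away is drawn half as tall. Applied at regular intervals of depth, this single rule produces exactly the steady diminution “according to a set scale” that linear perspective requires.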
Most artists working in Europe after 1435 were aware of Alberti’s seminal work. Alberti began with a “stage” area in the picture’s foreground, then drew a kind of receding grid on which all the other elements in the painting would be arranged, using the viewer’s height as a guide. This was not the first time a creative artist had identified and tried to make sense of perspective, however. The architect Filippo Brunelleschi (1377–1446) had demonstrated a solution some years earlier, but the particulars of his approach were never written down. Alberti’s grid showed a proper understanding of perspective and how to achieve it. His work had a profound impact on Renaissance painting (c. 1450–1600), and his principles still form the basis of linear perspective today. BS
c. 1440
Printing Press
Johannes Gutenberg
A machine that allowed for easy reproduction of printed materials
An engraving (1754) of German printer Johannes Gutenberg in his workshop, checking a page while printing his first Bible on his newly invented printing press.
Invented in the 1440s, the printing press of Johannes Gutenberg (1398–1468) is widely regarded as one of the most important and influential inventions in history. Gutenberg’s printing press used customizable forms to transfer the same text or images to different pieces of paper or other material. The press allowed people to create identical copies of books and other written materials quickly, cheaply, and efficiently.
“It shall scatter the darkness of ignorance, and cause a light … to shine amongst men.”
Johannes Gutenberg
While Chinese inventor Pi Sheng (990–1051) is credited with inventing movable type in the eleventh century, it was Gutenberg who refined the process of using block printing and movable type to mechanize printing and enable mass production. Using his knowledge and experience as a professional goldsmith, he developed a method of creating individual lead letters that could be assembled quickly into blocks of text, then melted down for recasting when sufficient copies had been made. When combined with a screw-type press, paper that by then was readily available, and a remarkable ease of use, Gutenberg’s movable type transformed the means by which ideas were spread in the community.
Prior to the press, written materials had to be copied individually by hand in a painstaking, laborious process. After its introduction, even bound books could be mass-produced. (The first book printed by Gutenberg was the Bible.) Within a few decades of its appearance, the printing press was being used in every country in Western Europe, and the number of books being produced each year skyrocketed. It is estimated that in the fifteenth century fewer than 25 million books were made in Europe. By the sixteenth century that number had ballooned to more than 200 million. The printing press’s widespread adoption not only led to a surge in literacy rates, but also made it easier for scientists, philosophers, and other thinkers to share and spread their ideas. MT
Early Modern
1450–1779
The School of Athens (1509–11), a fresco by Renaissance artist Raphael, makes clever use of perspective to depict the ancient Greek philosopher Plato and his student Aristotle.
This was an exciting time in history, when the globe was first circumnavigated and the wonders of nature were beginning to be understood from a scientific perspective. Significant philosophical ideas from this period include those found in Niccolò Machiavelli’s political treatise The Prince, René Descartes’s statement that “I think, therefore I am,” and John Locke’s argument that the mind at birth is a blank slate. Important practical ideas include those behind the internal combustion engine and mechanical programming, the precursor to computer programming. Galileo turned all of Western science—and history—on its head with the idea that the natural state of something is to be in motion, ending the nearly 2,000-year reign of Aristotelian physics.
c. 1450
Homophony
Europe
A musical texture in which multiple parts move together in harmony
Homophony in music is a texture in which two or more parts move in harmony (based primarily on chords). Homophony is distinct from polyphony, which involves combinations of melodies that are relatively independent. Homophony is typically characterized by one part (often the highest) predominating, with little rhythmic difference between parts. In polyphony, by contrast, rhythmic distinctiveness will reinforce the autonomy of the melody.
Music of the Middle Ages (from 400 to about 1450) began with the development of monophony, which is essentially music with a single part or melodic line. This was manifested in the sacred music of the Roman Catholic Church, characterized primarily by vocal chants that were sung without accompaniment and in unison. The gradual development of counterpoint led to the later integration of polyphony into music of this period. Homophony developed after this, in the Renaissance (c. 1450–1600), and involved pieces that could be performed by both singers and different instruments. As a term, however, homophony did not appear in English until its use by composer and music historian Charles Burney (1726–1814) in his General History of Music (1776).
“What most people mean … [by] harmony … [is] the texture is homophonic.”
Worlds of Music (2006), gen. ed. Jeff Todd Titon
Since the middle of the Baroque period (c. 1600–1760), music theorists have considered four voices in homophonic arrangement as the basic texture of Western music. The rise of homophony also led to the development of new melodic forms such as the sonata, which was popular in the Classical (c. 1730–1820) and Romantic (c. 1815–1910) periods. JE
c. 1450
Renaissance Music
Europe
A stylistic period in music history, lasting from roughly 1450 to 1600
A Concert of Three Female Figures (c. 1520), by the artist known as Master of the Female Half-Lengths.
The Renaissance is an era in art and intellectual history that spans from c. 1450 to c. 1600. Traditional accounts of the period emphasize the “discovery of man,” a renewed interest in antiquity and science, and a reaction to the perceived notion of the “barbaric” medieval culture.
In contrast to practitioners of the visual arts and architecture, it was not possible for musicians of the period to be inspired by actual music from antiquity. Instead, antiquity’s influence came through theoretical writings by Plato and others on the relationship between music and text: music’s role was to emphasize the meaning of the text, making a direct impact on the soul. Thus, vocal music was privileged over instrumental during the period. A stylistic consequence was that word-painting—letting the melodic patterns directly illustrate the emotion or imitate the action of a text—became a prevalent stylistic device both in secular and sacred music.
“The Renaissance was … the green end of one of civilization’s hardest winters.”
John Fowles, The French Lieutenant’s Woman (1969)
Thanks to the development of music printing during the sixteenth century, music could be more widely distributed. Instrumental music developed in a specialized fashion, both with performers devoting their time to one instrument and with instrumental ensembles consisting of similar instruments of different sizes. This practice carries on today in highly specialized orchestras that include homogeneous instrument sections. The polyphonic compositional practices of composers in Italy and Northern Europe that shaped the Renaissance’s highly complex vocal works, such as those of Giovanni Pierluigi da Palestrina and Josquin des Prez, are still taught to music students at universities. PB
1452
Public Library
Malatesta Novello
A collection of books that can be accessed freely by the general public
The reading room in the Biblioteca Malatestiana in Cesena, Italy, the world’s first public library. The interior of the library has not changed since it first opened in 1452.
A public library requires not only books and manuscripts on shelves, but also a public body to own them and a literate population to read them. This combination of circumstances first occurred in Renaissance Italy, although libraries themselves are a far older idea.
“I have always imagined that Paradise will be a kind of library.”
Jorge Luis Borges, writer
The world’s first libraries, containing archives of clay tablets, were founded in Sumerian temples in around 2600 BCE. The Egyptians built a major reference library at Alexandria in around 300 BCE, while rich Romans established their own private libraries in the first century BCE. Christian and Buddhist monks kept rooms of manuscripts, while reference libraries first appeared in Islamic cities during the ninth century. These libraries were known as “Halls of Science” and they were endowed by Islamic sects to promote their beliefs and to disseminate secular knowledge. All these libraries were privately owned and were used only by the few enthusiasts and scholars able to read. It was not until 1452 that the world’s first publicly owned library opened in Cesena in central Italy. Commissioned by Malatesta Novello (1418–56), lord of Cesena, and known as the Biblioteca Malatestiana, the library belonged to the city of Cesena. It contained 340 manuscripts on subjects as varied as the classics, medicine, and science, which the literate public could read on one of fifty-eight desks.
The idea of a library funded by public monies and open to the general public soon spread throughout Europe, with the royal National Library of France, founded in 1368, opening its doors to the public in 1692. Public libraries were established in U.S. cities during the nineteenth century, while an act of parliament in 1850 allowed British towns and cities to set up free public libraries paid for by local taxes. Public libraries became an important feature of local communities, giving anyone with the motivation the opportunity to improve their knowledge. SA
1486
Oration on the Dignity of Man
Giovanni Pico della Mirandola
A speech that embodied the key concepts of the Renaissance
A detail of Adoration of the Magi (c. 1475–76), by Florentine painter Sandro Botticelli, shows Giovanni Pico della Mirandola (in red hat) with the scholar Agnolo Poliziano (holding sword).
Often referred to as the “Manifesto of the Renaissance,” the “Oration on the Dignity of Man” was a speech written (but never given) in 1486 by Giovanni Pico della Mirandola (1463–94), just twenty-three years of age and already one of the greatest philosophers and humanists of his day. Mirandola abandoned the study of canon law in 1480 in preference for philosophy after the death of his mother. The Oration was intended as an introduction to his 900 Conclusions, a selection of theological theses drawn from numerous sources.
“This is what Moses commands us … admonishing, urging, and exhorting us to prepare ourselves, while we may, by means of philosophy, a road to future heavenly glory.”
Mirandola, “Oration on the Dignity of Man” (1486)
The speech was “pure Renaissance,” embodying all of the precepts that made the Renaissance (c.1450–1600) what it was—a belief in human primacy and humankind’s ability to reshape the environment, invent anything, or be anything. If a man could properly cultivate all that is rational he would, Mirandola said, “reveal himself a heavenly being.” Mirandola emphasized humankind’s dominance over creation, and how humans had been endowed with gifts from God that distinguish them from all other creatures. He saw humans as the summit of God’s creativity, but did not lord Christianity over other faiths, citing several intellectuals and philosophers from other religions to support his idea that all humans are equally capable of pondering the mysteries of existence. He also encouraged his listeners to follow their own paths and not be intimidated by church doctrines or hierarchies, and offered to pay the travel expenses for any scholar prepared to travel to Rome and debate his ideas with him in a public forum.
Following the publication of 900 Conclusions, a number of Mirandola’s theses were denounced as heretical by Pope Innocent VIII and Mirandola was briefly imprisoned. He intended the oration to be a precursor to an authoritative compendium on the intellectual, practical, and philosophical achievements of humankind, but the book was never completed due to his untimely death. BS
1499
Sikhism
Guru Nanak
A religion that combines action—doing good deeds—with belief in God
A sixteenth-century Indian miniature painting depicts Guru Nanak listening to a sitar player.
Sikhism was founded in 1499 in the Punjab region (in present-day Pakistan) by Guru Nanak (1469–1539), who, according to tradition, became interested in spiritual things at the age of five. However, another twenty-five years were to pass before he experienced an extraordinary, life-changing vision. In 1499, three days after his clothes were found by a stream and he was assumed by his family to have drowned, he miraculously reappeared, but refused to speak of what had happened to him. The next day, he declared: “There is neither Hindu nor Mussulman [Muslim], so whose path shall I follow? I shall follow God’s path. God is neither Hindu nor Mussulman and the path which I follow is God’s.” With these words, Sikhism was born.
The Punjabi word Sikh translates as “learner,” and early followers of the tradition were those in search of spiritual guidance. After the death of Guru Nanak, Sikhism was led by a succession of nine other gurus, all of whom are believed to have been inhabited by the same spirit. Upon the death of the tenth, Guru Gobind Singh (1666–1708), this spirit was transferred to the sacred scripture of Sikhism, the Guru Granth Sahib (The Granth as the Guru). From that point on, the Guru Granth Sahib was considered to be the sole guru.
Sikhism is a monotheistic religion. Adherents believe that there is only one God, without form and without gender. All people have direct access to God, and all are equal. We should live good, honest, and pure lives within our communities. Ritualistic observances have no relevance—they are empty gestures—and there is no place for superstitious traditions or practices. We spend our lives in an endless cycle of birth, life, and rebirth, and the quality of our lives is determined by how we have lived previously—the law of karma. Today, there are around 20 million Sikhs in the world, the majority of whom live in the Indian state of Punjab. BS
c. 1500
Coffeehouse
Ottoman Empire
A public space for political debate, stories, gossip, and games—and for drinking coffee
A Turkish miniature from 1582, showing a coffee wagon in a carnival procession.
The coffeehouse was originally established to sell coffee to the public, but it quickly became more than that, turning into a place for friends and neighbors to meet and discuss the issues of the day. Samuel Johnson defined it in his Dictionary (1755) as: “A house of entertainment where coffee is sold, and guests are supplied with newspapers.” Consuming coffee thus became a social experience, and the site where it was served became an important gathering place.
Coffeehouses were established as soon as coffee was introduced to the Ottoman Empire in the early sixteenth century, and later spread to Europe in the seventeenth century. The merchants and entrepreneurs who introduced coffee served it in public shops, and these shops offered places for customers to stay while they drank their purchase. These social spaces became a venue for the free exchange of ideas, where people could discuss politics, share stories, and play games. Authorities recognized the freedom of discussion offered by these new public spaces as dangerous and frequently took actions to suppress it, including in Mecca in the early sixteenth century and London in the mid-seventeenth century.
Nothing quite like the coffeehouse had ever existed before, and its provision of a space where social and business life could be mixed in pleasant surroundings proved immensely popular. The coffeehouse became an influential public forum, with a legacy that touched everything from the stock market and philosophy to science and literature. It was also seen as a great “social leveler,” as it was open to all men, regardless of social status, as long as they could afford the price of a cup of coffee. With modern coffeeshop chains, such as Starbucks, now ubiquitous and an estimated 1.6 billion cups of coffee drunk worldwide every day, the popularity of the coffeehouse shows no sign of abating. TD
1511
In Praise of Folly
Desiderius Erasmus
A satirical essay about self-importance and lack of spirituality becomes one of the most influential texts of the Renaissance
A portrait of Desiderius Erasmus, painted in 1517 by the Flemish artist Quentin Massys. Massys also later produced a medal featuring Erasmus’s likeness.
In 1509, the humanist scholar Desiderius Erasmus of Rotterdam (1469–1536) wrote an essay he called In Praise of Folly (Moriae Encomium). After its publication in 1511, it became one of the most popular and influential texts of the Renaissance (c. 1450–1600): read by popes, widely translated, and vigorously discussed. Erasmus believed it to be of only minor importance, and he filled it with wordplay and inside jokes aimed at his friend Sir Thomas More. However, this ostensibly light, amusing satire would go on to add fuel to the fire of the Protestant Reformation in the sixteenth century.
“Clap your hands, live, and drink lustily, my most excellent disciples of Folly.”
Desiderius Erasmus, In Praise of Folly (1511)
The essay personifies the character of Folly as a goddess, the offspring of a nymph, Freshness, and the god of wealth, Plutus. As readers might expect from an entity attended by Inebriation, Ignorance, Misoponia (laziness), Anoia (madness), and Lethe (dead sleep), Folly has a high opinion of herself, declaring that life would be dull and distasteful without her. Folly eventually moves from praising her degenerate friends to satirizing the excesses, superstitions, and needlessly obscure practices of the Roman Catholic clergy. The piece ends with an unadorned statement of Christian ideals.
Erasmus was a Renaissance humanist and also a faithful Catholic, and In Praise of Folly reflects the gradual shift in Western Europe away from a medieval worldview centered on the Roman Catholic Church to an investment in the spiritual and intellectual self-sufficiency of the individual. An earlier work by Erasmus had addressed the same themes of religious obscurantism and excess, but In Praise of Folly’s ironic, playful tone communicated his message more flavorfully to an increasingly literate public. Though Erasmus distanced himself from the politics of the Protestant Reformation, his work promoted and spread the ideals of a humanist Europe throughout the Western world. MK
1513
The Prince
Niccolò Machiavelli
A political philosophy viewing politics as the cynical pursuit of power
The first truly modern political scientist, Niccolò Machiavelli (1469–1527), wrote his slim text The Prince (Il Principe) in 1513, although it was not printed until 1532, five years after his death. It was a guide on acquiring power, creating a state, and then keeping that state. In it, he drew from his experiences as a foreign secretary for the government of Florence, and also from the teachings of history, to postulate that politics follows its own rules. Scandalized by the seeming moral depravity of such a system, readers of The Prince quickly made his name synonymous with political strategies characterized by cunning, bad faith, and duplicity. All the same, Machiavelli’s work would usher in an attitude of political realism that facilitated all subsequent descriptions of national and international power.
The Prince emphasizes Machiavelli’s belief that realism, not abstract moral idealism, is the necessary approach to political matters. He recognizes that most political leaders are not inherently extraordinarily virtuous in a Christian sense, but instead have a particular aspiration for glory. In order to take advantage of this temperament and to develop his full potential, then, a political leader ought to take risks, even if those risks include remaking the “mores and orders” that define a social order. Machiavelli’s bold departure from the political philosophy of the day created an eager audience for his book, and it almost immediately began influencing world political practices.
In the generations after him, political realists of the stature of Francis Bacon, David Hume, John Locke, and Adam Smith all cited Machiavelli’s approach as a direct influence on their thinking, as did founding fathers of the American Revolution Thomas Jefferson, John Adams, and Benjamin Franklin. Machiavelli’s work does not always lead to positive political revolution, however; twentieth-century mobsters John Gotti and Roy DeMeo have deemed The Prince a “Mafia Bible.” MK
1513
The State
Niccolò Machiavelli
The supreme political power in a certain geographic area
Though nations have existed throughout history, the modern idea of the state is largely attributed to Niccolò Machiavelli (1469–1527), who, in his 1513 work The Prince, spoke of lo stato (the state) as the only organization that not only wields ultimate authority, but also whose interests trump all others, even those of individuals, the Church, or morality itself. Later political theorists, such as Max Weber (1864–1920), would refine the idea of the state as a government that only used its violent authority legally, or for legitimate or acceptable purposes. Today, almost every corner of the Earth is claimed as belonging to a particular state, including the surrounding coastal waters and airspace. States can be governed through any political form, from dictatorships to republican democracies.
“L’État, c’est moi.” (The state, it is I.)
King Louis XIV of France
The state does not exist in a physical form, yet its power is ever present, permeating every part of daily life. The state has the power to control you, yet you cannot touch it. It has the ability to wage war, though it never gives an order. It has the power to detain, incarcerate, and execute without being able to lift a finger because it has no digits. Though Machiavelli’s work is often criticized for its support of questionable or immoral tactics to further the state’s interests, the idea of a final, ultimate power is the basic concept upon which all modern governments, and nations, are built. Behind all the balances, limitations, and checks on government authority, the final power in any nation always rests with the state and those who pursue its interests, and, ultimately, their ability to use violence. MT
1516
Utopia
Sir Thomas More
A seemingly perfect society, usually constructed in a deliberate manner
Woodcut on the title page of the first edition of Sir Thomas More’s Utopia (1516).
The word “utopia” first appeared in a book of the same name published in 1516 by Sir Thomas More (1478–1535). It was created from the Greek for “not” and “place”; its literal translation is “nowhere.” More, a Renaissance humanist, philosopher, statesman, and advisor to King Henry VIII, originally titled the book Libellus … de optimo reipublicae statu, deque nova insula Utopia (Concerning the Highest State of the Republic and the New Island Utopia). In it he argues for the creation of a society ruled by reason, communal power, and property, with the caveat that this kind of world is not easily achieved.
While Sir Thomas More coined the word, the concept of utopia predates him. Many people—some in earnest, some in jest, and some in speculation—have written accounts of what a “perfect” society would look like. Old and new examples of utopian societies are found in Plato’s The Republic (360 BCE), Francis Bacon’s New Atlantis (1624), and H. G. Wells’s A Modern Utopia (1905). Utopias work better on paper than they do in practice, and few of the political and religious communities that have attempted to observe utopian ideals have succeeded. Frequently, once the author or leader of a utopian society dies, the structure of the society crumbles.
“Nations will be happy, when … kings become philosophers.”
Sir Thomas More, Utopia (1516)
Alongside utopia exists the concept of dystopia, an idea that has been more prevalent in popular culture than utopia in the past two centuries. Well-known accounts of dystopias can be found in Aldous Huxley’s Brave New World (1932) and Nineteen Eighty-Four by George Orwell (1949). Dystopian societies are utopias gone wrong because of a distortion of one of the well-intended founding precepts of the utopia. MK
1517
Anti-clericalism
Martin Luther
The belief that churchmen stand in the way of both God and the exercise of reason
Anti-clerical movements were and remain deeply concerned with preserving religion’s relevance. The sixteenth-century Protestant Reformation, arguably a pivotal moment in the political, social, artistic, and religious development of the Western world, is a vivid example of how anti-clericalism was able to reform culture and religion simultaneously.
The protests against the Roman Catholic Church that Martin Luther (1483–1546) nailed to the door of Wittenberg’s All Saints’ Church in 1517, the Ninety-five Theses on the Power and Efficacy of Indulgences, attacked Catholicism’s doctrines, rituals, and ecclesiastical structure. Luther gave special attention to clergy members who convinced congregations that without the direct intervention of a priest, and the payment that inevitably followed such intervention, a parishioner would be unable to communicate with God, much less get into Heaven. Luther’s articulation of this concern fueled the work of Enlightenment philosophers such as Voltaire who, though uninterested in reforming the prevailing religious practices, argued the efficacy of reason above all else. Churchmen, then, since they did not typically cultivate faculties of reason, impeded societal progress.
“Every man must do two things alone … his own believing and his own dying.”
Martin Luther
Anti-clericalism and its proponents cleared a path for humanism, science, and reason to flourish, which in turn encouraged attitudes of religious tolerance. Humanist thinking situated the power to access God within each individual, rather than in the office of a priest, which weakened organized religion’s hold on secular and personal affairs. MK
1517
Reformation Art
Martin Luther
A critical attitude to religious art that appeared as part of the Reformation
The Judgement of Paris (c. 1528) by Lucas Cranach the Elder. The painting shows Paris awarding a golden apple (here transformed into a glass orb) to the fairest of three goddesses.
The Roman Catholic Church’s love of religious iconography was challenged in 1517 when Martin Luther (1483–1546) posted his Ninety-five Theses on the Power and Efficacy of Indulgences on the door of All Saints’ Church, Wittenberg. Luther abhorred what he felt was the idolatry inherent in the highly decorated and elaborate style of art found in religious architecture, paintings, and sculpture, and his list of reforms included wording against preachers “laying up earthly treasures.” Early Reformation art was to include visual satires of the papacy, mass-produced on the new Gutenberg printing presses and distributed throughout Europe. (Luther himself used the technology to print his illustrated tracts.)
“They are not temporal treasures, for the latter are not lightly spent, but rather gathered by many of the preachers.”
Martin Luther, Ninety-five Theses (1517)
Following Luther, the Northern Protestants sought new subject matter and modes of expression in the arts. Roman Catholic subjects, such as the martyrdom of St. Lawrence (who was slowly roasted to death on a grill), were replaced by subjects such as Christ teaching or blessing children, as seen in the works of German painter Lucas Cranach the Elder (1472–1553). These artworks offered a more realistic depiction of biblical scenes, generally portraying the characters as ordinary people in real settings, rather than idealizing them and using symbolism. There was a widespread stripping away of ornamentation and depictions of saints in the churches. In Germany and elsewhere, the release of former papal land holdings to local barons led to a decline in religious art; instead, the barons commissioned portraits of themselves and their possessions, as well as landscapes and still lifes.
During the Counter-Reformation (1545–1648), initiated by the Roman Catholic Church to combat Protestantism, Catholic artists such as El Greco returned to depicting subjects such as the suffering of martyrs and confession and penance—sacraments discounted by the Protestants—for display before the faithful. PBr
c. 1520
Mannerism
Italy
An art style characterized by distortion of elements such as scale and perspective
Vertumnus—Rudolf II (c. 1590) by Giuseppe Arcimboldo, showing Holy Roman Emperor Rudolph II as Vertumnus, the Roman god of seasons who presided over gardens and orchards.
The High Renaissance, from the late fifteenth century to the early sixteenth century, was a period of unequaled beauty in the world of art, but it began to wither with the deaths of Leonardo da Vinci in 1519 and Raphael in 1520, and the sacking of Rome in 1527. Across Europe the momentum of the High Renaissance seemed to stall. Art, of course, would continue, but it was the era of the “Late Renaissance” now, the decades between the death of Raphael and the beginning of the Baroque period in c. 1600. It was in this period that Mannerism, as it would come to be called four centuries later, made its appearance.
“Mannerism … was an art of intellectual contortionism; urban artists measured themselves against the daunting example of Michelangelo and found shelter in caricature and exaggeration.”
The New Yorker (1987)
Mannerism—with its harmonious religious themes, subdued naturalism, and emphasis on complexity—developed in either Florence or Rome, depending upon which accounts you choose to believe. For the first time, artists were able to draw on archaeological excavations to portray classical civilizations accurately, instead of relying on their own imaginations. However, imaginations certainly were unleashed after decades of conformity. Colors were more mixed and vibrant, disconcerting themes combining Christianity and mythology appeared, and the composition of nudes departed from long-held ideas on what constitutes correct posture.
Mannerism lasted about eighty years—much longer than the High Renaissance period it replaced. Notable Mannerists during this time included Giorgio Vasari, Daniele da Volterra, Francesco Salviati, Domenico Beccafumi, Federico Zuccari, Pellegrino Tibaldi, and Bronzino. Typical of the bizarre nature of Mannerism was the work of Giuseppe Arcimboldo (c. 1527–93), whose portrait of Rudolph II showed a man composed entirely of fruit, flowers, and vegetables. Some would later claim him a forerunner of Surrealism. Most admirers of “high” art, however, were pleased to see Mannerism pass into history. BS
1525
Anabaptist Faith
Conrad Grebel
The belief that baptism should only be for people old enough to choose it
Anabaptism is characterized by its adherence to adult, in preference to infant, baptism. Originally a radical offshoot of Protestantism, its tenets are observed today in Amish, Brethren, Hutterite, and Mennonite communities. British and American Puritan Baptists are not direct descendants of the Anabaptist movement, but they were influenced by its presence. The ideas of the Anabaptists tended to exert a strong influence on whatever social order was around them.
The Reformation movement was in full swing in Switzerland when a group of dissatisfied reformers united under an influential merchant and councilman, Conrad Grebel (c. 1498–1526). They performed the first adult baptism, outside Zürich, in 1525. They believed, like a number of other reformers, that infants cannot be punished for any sin until they can reasonably grasp concepts of good and evil, and that they therefore ought not to be baptized until they are able to come to the sacrament of their own free will. With this belief as a touchstone, Anabaptists attempted to find a place to settle in Europe, but because of the vehemence of their proselytizing they were ousted from town after town. Many early Anabaptist leaders died while in prison.
“That is the best baptism that leaves the man cleanest inside.”
H. Ward Beecher, Proverbs from Plymouth Pulpit (1887)
Anabaptists did not only believe in adult baptism and civil intransigence. They also insisted that church and state should be kept separate from one another, and that coercive measures used in maintaining order, such as physical violence, ought to be forsworn. These tenets continue to guide the spiritual descendants of the European Anabaptists today and are traceable in the U.S. Constitution (1787) and Bill of Rights (1791). MK
1534
Church of England
King Henry VIII of England
The officially established Christian church in England
A portrait of Henry VIII aged forty-nine, painted in 1540 by Hans Holbein the Younger.
After the second-century arrival of Christianity to British shores, Roman Catholicism was the only state-endorsed religion in the British Isles. The agitation of the Protestant Reformation and a king resentful of the papacy’s limits on his power, however, exchanged Roman Catholicism for a brand of Christianity that is tied intimately to the place it is practiced: the Church of England.
As Martin Luther and his fellow Protestants argued against Roman Catholic excess and exclusivity on the continent, King Henry VIII of England (1491–1547) sought an annulment of his marriage to Catherine of Aragon but found himself repeatedly refused by the reigning pope, Clement VII. Incensed, Henry forced a number of acts through the English Parliament in 1534, formally separating the English Church from Rome and making the English monarch its head. In the years that followed, the Church of England alternated between near-Catholic and Protestant beliefs and practices because of the changeable nature of the monarchy.
“The King … shall be … the only supreme head in earth of the Church of England.”
Act of Supremacy (1534)
While the Church of England has retained many of its Catholic trappings, it developed a number of unique texts that reverberate in the language and popular imagination to this day, including the Book of Common Prayer, which contains phrases such as “till death do us part” and “ashes to ashes, dust to dust.” The Church of England’s music became a staple of the Western canon, inspiring the works of seminal composers such as Edward Elgar, Ralph Vaughan Williams, Gustav Holst, Benjamin Britten, and Leonard Bernstein. MK
1536
Calvinism
John Calvin
A Protestant belief that God’s grace and human salvation are predetermined
The teachings of John Calvin (1509–64), a key figure of the Protestant Reformation, are held as some of the most important of the Reformed tradition in Protestantism. In 1536 Calvin first published his book Institutes of the Christian Religion, a work that, along with his other teachings, helped to shape the Reformed movement away from many of the positions expounded by Martin Luther. Though he derived much of his work from the example set by Luther, the churches that adhered to his teachings became known as Calvinist.
Calvinism centers on five religious doctrines. The first tenet, “total depravity,” is that humanity is sinful and without hope of salvation outside of the intervention of God. The second tenet, “unconditional election,” holds that a sovereign God chose certain people to receive salvation even before the creation of the universe, and that his choice rests on nothing those people merit or do. The third, “irresistible grace,” says that those whom God has elected cannot finally refuse the grace that saves them. Fourth, “limited atonement,” claims that Christ’s death was meant only for those elected to receive salvation. Fifth, “perseverance of the saints,” holds that anyone whom God has saved will always remain in that state and none will lose their salvation.
“God preordained … a part of the human race … to eternal salvation …”
John Calvin
The impact of Calvinism was seen mostly in Scotland, Ireland, England, France, and colonial North America. Many early settlers to the United States were Calvinists, such as the English Puritans, French Huguenots, and Scotch-Irish Presbyterians. Calvinist ideals, such as the belief in a strong work ethic, courage, and adherence to capitalistic free markets, have shaped beliefs far beyond those who occupy the pews of their churches. MT
1543
Heliocentrism
Nicolaus Copernicus
The theory that Earth is not the center of the universe
An illustration of the Copernican system, from Andreas Cellarius’s Harmonia Macrocosmica (1660).
Heliocentrism, the model of the cosmos in which the sun is the central point around which all other bodies revolve, was proposed in the third century BCE by Greek astronomer Aristarchus of Samos. However, his theory never gained favor because the positions of the stars remained static, which they would not, the logic went, if the Earth were continuously changing its position. Claudius Ptolemy offered a solution to this inconsistency in the second century CE by arguing that the Earth was the stationary center of the universe. This geocentric model held sway for the next 1,400 years.
In 1543, Nicolaus Copernicus (1473–1543) of Poland published De Revolutionibus Orbium Coelestium Libri VI (Six Books Concerning the Revolutions of the Heavenly Orbs), which argued in favor of the heliocentric system. Copernicus was a Roman Catholic cleric and he carefully downplayed any heretical overtones in his argument; the volumes remained unpublished until the year of his death, and they were prefaced by a disclaimer stating that the theories were useful for computation, even if they proved untrue. As a result, his work went largely unnoticed for almost one hundred years.
“At rest, however, in the middle of everything, is the sun.”
Nicolaus Copernicus
Italian physicist and astronomer Galileo Galilei (1564–1642) revived the argument for heliocentrism when observations through his telescope suggested that the Earth did indeed revolve around the sun. Galileo’s views were deemed heretical, and in 1616 he was forbidden by the Vatican to “hold or defend” his heliocentric model. The “Galileo Affair,” though, had the unforeseen result of spreading the idea of heliocentrism, and it rapidly became accepted as scientific truth. MK
c. 1550
Capoeira
Brazil
Brazilian martial art that combines elements of music and dance
Men Perform Capoeira, or the Dance of War, an engraving after Johann Moritz Rugendas (1835).
The rich culture of Brazil has given rise to many different arts, but none is more loaded in meaning or mired in human misery than capoeira. The likely origin of the name, from a native Tupi word referring to areas of low vegetation in the Brazilian interior, gives some indication as to its history.
Portugal colonized Brazil in the early sixteenth century, importing slaves from Africa after 1534 to overcome the shortage of native workers to harvest and process sugarcane. The slaves lived in large farms known as engenhos, where conditions were harsh and inhumane. Some managed to escape, and in their hope of survival developed a way of living and a culture that was the start of capoeira. Collecting in remote settlements known as quilombos, where they were out of the reach of colonial troops, the escaped slaves developed capoeira from a survival technique into an unarmed martial art that helped to keep them free.
During the 1700s, when slaves were brought into the cities, some quilombo dwellers moved with them, bringing the culture of capoeira. The colonial government tried to suppress it, arresting capoeira practitioners who found work as bodyguards, hitmen, and mercenaries. In 1890 the new republican government of Brazil banned the practice outright.
In 1932, Mestre Bimba (1899–1974), a fighter from Salvador in the north, founded a capoeira school, renaming the skill Luta Regional Bahia (Regional Fight from Bahia) in order to evade the ban. Capoeira was taught to the cultural elite and lost its criminal links before being removed from the penal code in 1940.
Today, capoeira is a worldwide phenomenon, a Brazilian martial art form that symbolizes resistance to oppression. Theatrical and acrobatic, capoeira still contains subtle and disguised elements of its savage colonial origins. Trickery is paramount, and a capoeirista never takes his eyes off his opponent. SA
1556
Unitarianism
Peter Gonesius
The denial of a triune Godhead and the identification of Christ as a prophet only
A nineteenth-century engraving by J. Smith of a Unitarian Chapel in Liverpool, England.
Unitarianism is a religious movement that takes its name from its foundational tenet of understanding God as one person alone instead of the three beings of the Father, the Son, and the Holy Spirit, coexisting consubstantially as one. Most Christian denominations, then, do not fall in line with Unitarian principles, since they maintain that Jesus Christ is a manifestation of God rather than, as Unitarians believe, simply a prophet. Unitarianism also, quite significantly, promotes reason as a means of interpreting scripture and religion.
Peter Gonesius (c. 1525–73), a Polish student, first spoke out against the Doctrine of the Trinity, the three divine beings coexisting as one, in 1556. His agitation sparked a nine-year debate that culminated in the creation of the Polish Brethren, Unitarianism’s forebears. When members of the Polish Brethren were ordered to convert back to Roman Catholicism or leave Poland, they dispersed to Transylvania and Holland, where they adopted the title “Unitarian.” After the Enlightenment’s elevation of reason, Unitarianism caught on in England and the United States, too. Unitarian churches sprang up in major U.S. and British cities, and Unitarians occupied influential seats in major universities such as Harvard, where they challenged the prevailing Puritan theology. Unitarians continue to interrogate the religious universe, and in the twentieth century they adopted many of the underpinning ideas of religious humanism, which is religious belief centered on human needs, abilities, and interests.
Unitarian rejection of the Trinity is accompanied by a liberal perspective on God, tempered by forays into science, philosophy, and reason. Unitarians maintain that science and religion do not, in fact, contradict one another. Instead, science and religion ought to be seen as complementary, illuminating one another’s otherwise inscrutable facets. MK
1557
Equals Sign
Robert Recorde
The shorthand use of two parallel lines that revolutionized mathematics
In 1557, Welshman Robert Recorde (c. 1510–58) grew weary of writing “is equal to” in his treatise on advanced mathematics and introduction to algebra, The Whetstone of Witte. The shorthand symbol that he devised is the two parallel lines familiar to those with the most rudimentary of mathematical knowledge: the equals sign. This symbol, and the concept of equality of two discrete expressions that it represents, makes otherwise abstract mathematical ideas clear and the discovery of unknown quantities in algebra possible.
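A minimal worked illustration (the example is ours, not Recorde’s): given 3x + 4 = 19, subtracting 4 from both sides leaves 3x = 15, and dividing both sides by 3 leaves x = 5. Each step preserves the equality asserted by the two parallel lines, and that preservation is what makes the unknown discoverable.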
The ancient Egyptian Rhind Papyrus (c. 1650 BCE) contains the first recorded linear equation, but scholars are uncertain if the ancient Egyptians’ concepts of equivalence and balance were the same as those developed by mathematicians such as Recorde in the sixteenth century. In any case, the equals sign took many years to find a common place in mathematical texts; seventeenth-century mathematicians were partial to æ, representing the Latin aequalis (equal).
“To avoide the tedious repetition of … is equalle to: I will … use, a paire of paralleles.”
Robert Recorde
The state of being equal, indicated by the presence of the equals sign, is an indispensable concept in basic mathematics and algebra. In fact, an equation could be argued to be the most basic notion in all of mathematical thinking: figures on either side of an equals sign are of the same value. Both sides of the equation can be simultaneously manipulated (by dividing, subtracting, or adding, for example) with the intent of “solving” the unknowns in the equation. Most laws of physics and economics upon which daily life revolves are expressed most cleanly through equations, such as Newton’s law of universal gravitation. MK
c. 1570
Baroque Architecture
Italy
A late sixteenth-century architectural style of dynamism, fluid curves, and ornament
The Baldacchino in Saint Peter’s Basilica, Rome, designed by Gian Lorenzo Bernini in 1633.
Baroque architecture arose in Italy in around 1570 as an element of the Counter-Reformation (1545–1648). Seeking to win back congregations depleted by decades of Protestant reform, the Roman Catholic Church appealed to the faithful by creating churches that invoked a sense of awe.
Baroque buildings are known for their flamboyant appearance, typified by graceful curves, wild scrolls, and oval forms. Often, their exteriors sport ornate wrought ironwork, snaking columns, and elaborate stone carvings. The interiors are opulent, with large, colorful frescoes, gilded wooden fittings and statuary, planes of stucco sculptures, and faux marble finishes. The careful use of lighting, along with illusory effects such as trompe l’oeil, gives them an almost theatrical appearance.
“[The Baroque era] is one of the architectural peak periods in Western civilization.”
Harry Seidler, architect
In 1656 Pope Alexander VII commissioned the Italian architect and sculptor Gian Lorenzo Bernini (1598–1680) to renovate St. Peter’s Square in Rome. Bernini worked on the project for eleven years. He resolved a spatial problem caused by the erroneous alignment of preexisting architectural elements by creating an oval piazza surrounded by a colonnade of 284 marble columns, crowned by 140 statues of the saints. Other Italian architects, such as Francesco Borromini (1599–1667) and Guarino Guarini (1624–83), created Baroque structures of similar elegance and grandeur.
The Baroque style was introduced to Latin America by colonizers of the New World, and Jesuit missionaries came to favor the style. Latin American Baroque architecture became even more extravagant and boldly ornamented than its European antecedent. CK
1572
Imaginary Numbers
Rafael Bombelli
Numbers with a square that is less than or equal to zero
Imaginary numbers are not actually imaginary at all, but are numbers whose square is less than or equal to zero (the “imaginary” in the term is a vestige of the time when mathematicians had not yet defined a number system to accommodate them). Despite the speculative connotation, imaginary numbers are significant in understanding real-world quantities and phenomena. Scholars have argued that fractions, negative numbers, and “zero” once seemed just as unreal as imaginary numbers, and yet all are represented in everyday life.
Greek mathematician and engineer Heron of Alexandria (c. 10–70 CE) is credited with the first recorded encounter with imaginary numbers, but it was Rafael Bombelli (c. 1526–72) who first codified and thoroughly described their properties in his treatise L’Algebra (1572), setting out rules for calculating with the square root of minus one (the now-standard symbol i for that quantity came later, introduced by Leonhard Euler in the eighteenth century). Subsequently, mathematicians and theorists seized on imaginary numbers in understanding physical phenomena, such as magnetic fields, electric circuits, and the origins of the universe.
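The cubic traditionally associated with L’Algebra shows, in modern notation, why such numbers earned their keep even when the answer being sought is real. The equation x³ = 15x + 4 plainly has the solution x = 4, yet Cardano’s formula expresses that root as x = ∛(2 + 11i) + ∛(2 − 11i). Bombelli’s rules resolve the apparent paradox: since (2 + i)³ = 2 + 11i and (2 − i)³ = 2 − 11i, the two cube roots are simply 2 + i and 2 − i, their imaginary parts cancel, and the real answer x = 4 emerges.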
“The imaginary number is a fine and wonderful resource of the human spirit …”
Gottfried Leibniz, mathematician and philosopher
Mathematicians admit that imaginary numbers are difficult to understand. However, if one thinks of imaginary numbers not as quantities in the way that real numbers are, but as expressions of relationships, and considers them an “upgrade” to conventional number systems, the possibilities expand for what imaginary numbers can do. Quantum mechanics, for example, relies upon a dimension that can be imparted only by imaginary numbers, so understanding them is crucial to comprehending the origins of the universe. MK
1581
Ballet
Balthasar de Beaujoyeulx
A formalized, dramatic form of dance performed to music in a decorated setting
Two Dancers on a Stage (c. 1874) by Edgar Degas. This is one of many images Degas painted of the ballet.
The word “ballet” is a French adaptation of the Italian word balletto, used to describe theatrical dances held at court during the early fifteenth century, directed by dancing masters such as Domenico da Piacenza (c. 1400–c. 1470). When, in 1533, Catherine de’ Medici (1519–89) married the future King of France, Henry II (1519–59), she took choreographer and musician Balthasar de Beaujoyeulx (d. c. 1587) with her to France, where he staged entertainments for the French court. In 1581, he staged the Ballet Comique de la Reine (The Dramatic Ballet of the Queen) at the Salle du Petit-Bourbon, adjacent to the Louvre in Paris, and it came to be recognized as the first ballet de cour. The five-hour-long performance depicted the ancient Greek myth of Circe with dance, music, and verse, using choreography to tell the story. A year later, a book was made containing engravings portraying the spectacle, and copies were given to Europe’s aristocracy: ballet became a way to publicize royal power, culture, and wealth.
“[The Ballet Comique was] a political, philosophical, and ethical mirror of its day.”
Carol Lee, Ballet in Western Culture (2002)
Ballet developed under the patronage of King Louis XIV (1638–1715), who in 1661 established the first professional ballet school, the Académie Royale de Danse, which later gave rise to the Paris Opera Ballet. By 1700, terms such as chassé and pirouette were in use to describe ballet movements, and ballet companies had been founded throughout Europe. It was not until the nineteenth century that en pointe technique—in which dancers dance on the tips of their feet, wearing specially reinforced shoes to distribute their body weight away from their toes—became integral to ballet aesthetics. CK
1584
Infinity of Inhabited Worlds
Giordano Bruno
The theory that, because the universe is infinite and expanding, logically the universe must contain an infinite number of inhabited worlds
A bronze relief from 1887, by the Italian sculptor Ettore Ferrari, showing Giordano Bruno being burned at the stake by the Roman Inquisition for his heretical theories.
The theory concerning the infinity of inhabited worlds posits that the universe is infinite, and therefore the number of worlds occupying it is also infinite. Philosopher, astronomer, and mathematician Giordano Bruno (1548–1600) published this theory in 1584 to the shock of a society deeply committed to the Aristotelian and Ptolemaic account of the structure of the universe.
“The universe is then one, infinite, immobile … It is not capable of comprehension and therefore is endless and limitless, and to that extent infinite and indeterminable, and consequently immobile.”
Giordano Bruno
Bruno’s heretical ideas about cosmology were not limited to the physical realm. Had they been restricted to the material world, his departure from established beliefs might not have compelled the Roman Catholic Church to burn him at the stake. An excommunicated priest, Bruno was intrigued by possible ways to understand reality more intimately, and was therefore interested in all aspects of science and thought. He lectured widely on Copernicus’s then-controversial theory of a heliocentric solar system, and speculated that the observable cosmos was not static but infinite and always expanding. In his lectures and writings, he was critical of the physics of Aristotle and promoted a heretical relationship between philosophy and religion. All this left no place in his thinking for Christian tenets of divine creation and the Last Judgment.
Form and matter are “one,” says Bruno, anticipating the direction of modern science, but greatly upsetting the church fathers, who jailed him and put him on trial for the “heretical” content of his publications and lectures. Despite Bruno’s protests that his claims were intended to be philosophical rather than religious, the Catholic Inquisitors pressed him for a retraction. The philosopher refused and was subsequently burned alive. Bruno’s ideas persist, however, in his unwillingness to limit the universe to a geometric structure and his advocacy of an indivisible “one” that unites all matter; both are forerunners of quantum mechanics. MK
1588
Ballroom Dancing
Jehan Tabourot
A style of formal dancing, originally practiced only by members of high society
Ballroom dancing was once a form of social dancing for privileged people, as distinct from the folk dances of the poor. However, it has long outgrown that distinction. The term itself derives from a “ball” or formal dance, from the Latin ballare, to dance. In 1588, the French cleric Jehan Tabourot (1519–95), writing under the name Thoinot Arbeau, published Orchésographie, a study of formal French dances including the slow pavane, five-step galliard, and solemn basse, or low dance. He provided information about dance etiquette and how dancers and musicians should interact, as well as musical examples in which the dance steps were placed next to individual notes, a major innovation in dance notation.
In 1661, Louis XIV of France (1638–1715) established the Académie Royale de Danse, which laid down specific rules for every dance. New dances were added to the acceptable canon, including the minuet, which had emerged at court around 1650; Louis XIV himself danced the minuet in public, signaling his approval of the style.
In 1819, Carl Maria von Weber (1786–1826) wrote Invitation to the Dance, a piano piece he described as a “rondeau brillante.” Although it was a concert work, it celebrated the waltz, a dance that had shocked society by bringing male and female dancers into close contact.
Modern competitive ballroom dancing is classified into five International Standard dances—the waltz, tango, Viennese waltz, foxtrot, and quickstep—and five International Latin dances—the samba, cha-cha-cha, rumba, paso doble, and jive. In North America, nine Smooth or Rhythm dances are preferred: the quickstep, samba, paso doble, and jive are replaced by East Coast swing, the bolero, and the mambo. What was once a diversion of high society is now mass entertainment, a staple on our television screens, the stuff of films, and an enjoyable and sociable pastime for millions. SA
1596
Continental Drift
Abraham Ortelius
The proposal that the continents, once joined, move across the Earth’s surface
In the third edition of his book Thesaurus Geographicus, published in 1596, Dutch mapmaker Abraham Ortelius (1527–98) presented a remarkable new theory. Aided by the growing sophistication of world maps showing continents with coastlines that seemed to mirror each other, he suggested that the continent of the Americas had, as a result of earthquakes and other seismic activity, been “torn away from Europe and Africa.” However, the study of landforms and how they are shaped was still very much an emerging science at the end of the sixteenth century, which meant that Ortelius’s theories could not be debated thoroughly.
It would be another 316 years before the term “continental drift” would enter the lexicon of science. The term was coined by the German meteorologist Alfred Wegener (1880–1930) as part of his “reinvention” and expansion of a two-part theory. First, millions of years ago, an ancient land, Pangaea (Greek for “all lands”), began to break apart into two individual, continent-sized landmasses, which he called Laurasia and Gondwanaland; second, they in turn broke up into the continents we have today.
“The vestiges of the rupture reveal themselves [in] a map of the world.”
Abraham Ortelius
Wegener’s conclusions were not well received by his fellow academics, almost all of whom still held to the belief that the continents had always been fixed. His paper, which contained no explanation, much less physical evidence, of how the continents could possibly have moved so far, did little to alter opinion. A convincing mechanism arrived only decades after his death, with the development of the theory of plate tectonics in the 1960s. BS
1597
Divine Right of Kings
King James I of England
The belief that royalty was sanctioned to rule by God Himself
The theory of the divine right of kings posits that because God has conferred the right to rule on a monarch, that individual cannot be subject to the authority of any other earthbound entity, including the will of the people he or she governs, other aristocrats, governmental bodies, or even the mightily powerful church. God alone grants a monarch the right to wear a crown, and He alone has the ability to judge, so human attempts to censure or depose a monarch are not only treasonous, but sacrilegious.
This idea emerged in medieval Europe, and because it is a reinterpretation of ancient Roman law, it is difficult to pin down precisely who first encoded it. However, by 1597, James I, then king of Scotland and soon to become king of England, had written The True Law of Free Monarchies, in which he argues for an absolutist monarchy. It is no coincidence that James I was also Protestant and, like many proponents of the divine right of kings, looking to further elude the influence of the Roman Catholic Church. In the decades that followed, however, political thinkers—among them John Milton and, later, John Locke—protested the tyranny of monarchs’ absolute power, contending that to confer such immense political strength on a single person was simply to replace one oppressor, the Roman Catholic Church, with another, the king.
“I will govern according to the common weal, but not … to the common will.”
King James I of England
The debates over the divine right of kings ushered in considerations not only of the separation of church and state but, as Thomas Jefferson famously wrote in the United States Declaration of Independence (1776), the philosophy that “all men are created equal.” MK
1598
Opera
Jacopo Peri
A sung, staged performance of a drama, often with orchestral accompaniment
In the Loge (1879) by Mary Stevenson Cassatt shows a woman watching a performance in Paris.
Opera began as a recreation of Greek antique drama. The Florentine Camerata—a group of intellectuals, musicians, and poets in Florence at the end of the sixteenth century—started the art form that would rapidly spread through Europe. The first known opera was Dafne (1598) by Jacopo Peri (1561–1633), set to a libretto based on Ovid’s Metamorphoses. The remaining fragments of this work for a small ensemble reveal the art form as we know it today: arias sung by soloists, choruses, and sections of narrative consisting of sung recitatives. The first major opera that still remains in the repertoire is Claudio Monteverdi’s L’Orfeo (1607).
Composers through the centuries have found ways to adapt the genre to contemporary musical styles and topics. Opera never gave up mythical and allegorical subjects—adaptations of Shakespeare’s dramas by Henry Purcell and others, Nordic myths in Wagner’s Ring Cycle, or cosmic allegories in Stockhausen’s seven-opera cycle Licht—while realism never made strong gains. Even a work such as John Adams’s Nixon in China can be seen in allegorical terms, illustrating how myths can be created out of contemporary events.
“If music … is an imitation of history, opera … is an imitation of human willfulness …”
W. H. Auden, The Dyer’s Hand and Other Essays (1962)
Traditional opera was aimed at the upper classes and courts, but popular forms developed (operetta in the nineteenth century and musicals in the twentieth) and these have held solid cultural positions. Like opera, operettas and musicals have dealt with myths such as Orpheus in the Underworld, fiction such as Victor Hugo’s Les Misérables (1862), and contemporary topics, such as AIDS (Rent, 1994), the Cold War (Chess, 1984), and the sexual revolution (Hair, 1967). PB
1600
Electricity
William Gilbert
A fundamental form of energy that results from the interaction of charged particles
An illustration from De Magnete (1600) by William Gilbert depicts how a magnetized needle pushed through a ball of cork and submerged in water will point to the magnetic pole.
English physician, physicist, and natural philosopher William Gilbert (1544–1603) coined the term “electricity” to describe phenomena arising from the presence and flow of an electric charge. In his book De Magnete, Magneticisque Corporibus, et de Magno Magnete Tellure (On the Magnet and Magnetic Bodies, and on the Great Magnet the Earth), published in 1600, Gilbert formed the term from the Latin electricus (“like amber”), itself derived from the Greek elektron, meaning amber, as the ancients were known to have produced an electrical phenomenon by rubbing a piece of amber. What those ancients, and Gilbert after them, had observed was caused by charges producing electromagnetic fields that have an effect on other charges. The electric charge, in turn, arises from properties and interactions of subatomic particles.
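What Gilbert could describe only qualitatively was later made quantitative. Coulomb’s law, for example, formulated in the eighteenth century, gives the force between two charges $q_1$ and $q_2$ separated by a distance $r$:

$$F = k_e\,\frac{q_1 q_2}{r^2}$$

where $k_e$ is Coulomb’s constant; like charges repel, and unlike charges attract.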
“Is it a fact—or have I dreamt it—that, by means of electricity, the world of matter has become a great nerve, vibrating thousands of miles in a breathless point of time?”
Nathaniel Hawthorne, The House of the Seven Gables (1851)
Observations of naturally occurring electricity are recorded in texts of ancient Egypt, Arabia, Greece, and Rome, but quantifying and harnessing the power of electricity did not commence until the seventeenth and eighteenth centuries. By 1821, Michael Faraday had designed the electric motor, and in 1827 Georg Ohm successfully analyzed the properties of the electrical circuit. What followed was a burst of scientific discussion and clever innovation.
Pioneering inventors and scientists, such as Nikola Tesla, Thomas Edison, Joseph Swan, George Westinghouse, Alexander Graham Bell, and Lord Kelvin, adapted scientific discoveries to the tasks and necessities of daily life. Advances in medicine, electric lighting, stoves, home computers, washing machines, radios, televisions, and electrically driven transport were all the result of their labors. Without the demand for electric products, and the convenience of life supported by electricity, the Second Industrial Revolution (c. 1860–1914) would never have occurred, and life as we live it (and light it) would be very different. MK
c. 1600
Baroque Music
Europe
A stylistic period in music history, originating in Italy and lasting 150 years
A portrait of George Frideric Handel by T. Hudson, painted in 1749. Handel’s best-known work is his oratorio Messiah, which was composed in 1741.
The term “Baroque” is thought to derive from the Portuguese barroco, a deformed pearl, implying an imbalance and imperfection in form and style, in contrast to the well-balanced and reserved ideals of antiquity. The first use of the term in reference to the arts is believed to have been in a derogatory review of Rameau’s opera Hippolyte et Aricie (1733). More generally, however, “Baroque” came to describe the lavish visual art style of c. 1600 until 1750—exemplified by Peter Paul Rubens, Gian Lorenzo Bernini, and Rembrandt—in which both the emotional and the rational response to an artwork were emphasized.
“Always sad where it might be tender, this singular brand of music was Baroque …”
Anonymous, Mercure de France (May 1734)
In music, the Baroque period coincided with the introduction of tonality, the major and minor modes used today, and with a simplification of musical texture through monody, featuring one melodic line with basso continuo—that is, chordal accompaniment and a prominent bassline. The Baroque also gave birth to the genre of opera. During the eighteenth century, the Baroque style became more complex, polyphonic, and ornamented in various genres, including opera, sacred music, and chamber music, culminating in the mature works of George Frideric Handel and Johann Sebastian Bach.
Baroque music, especially the works of Bach, Handel, and Vivaldi, remains prominent on today’s concert programs, but the era had another important impact. It initiated the “common practice” period, which lasted from roughly 1600 to 1900 and was defined by shared basic notions of harmonic and rhythmic syntax, despite the different aesthetic ideals behind many of the compositions. Several common-practice stylistic traits remain evident in many twentieth- and twenty-first-century genres, such as musical theater, jazz, and rock. Even the basic ensemble of harmonic accompaniment, keyboard, and a bass instrument is still (with the addition of drums) dominant in popular music. PB
c. 1600
Metaphysical Poetry
Europe
Poetry as a serious meditation in the form of a witty extended metaphor
An anonymous portrait of the metaphysical poet John Donne, from c. 1595. Donne is often considered to be the greatest love poet in the English language.
“Metaphysical” was originally a pejorative term used by English critics John Dryden (1631–1700) and Samuel Johnson (1709–84) to describe a broad school of seventeenth-century British and Continental poets who wrote in a lyric style, using psychological analysis and paradox, and whose juxtaposition of unrelated concepts was intended to shock readers and force them to consider carefully the meaning behind the poem. Their use of the “conceit,” an extended metaphor, allowed them to construct elaborate and intricate poems on topics ranging from sexuality to contemporary politics.
“[Donne] affects the Metaphysics … in his amorous verses, where nature only should reign; and perplexes the minds of the fair sex with nice speculations of philosophy, when he should engage their hearts.”
John Dryden, critic
The most influential Metaphysical poet, Englishman John Donne (1572–1631), wrote his nineteen Holy Sonnets, including “Death, Be Not Proud,” after his beloved wife died in childbirth. His spiritual meditations on the meaning of death and the place of God and love in human life include important psychological insights into how grieving may be comforted. Another important poet in the style is Englishman Andrew Marvell (1621–78), whose poem, “To His Coy Mistress,” is a carpe diem conceit on the futility of remaining virginal in light of the shortness of human life: “The grave’s a fine and private place, But none I think do there embrace.” Other notable Metaphysical poets include Henry Vaughan, John Cleveland, and Abraham Cowley, as well as, to a lesser extent, George Herbert and Richard Crashaw.
Twentieth-century scholars, such as T. S. Eliot in his essay “The Metaphysical Poets” (1921), looked back favorably on the Metaphysical poets for their ingenuity and clever use of language and humor. Their subjects were very different from the traditionally slight ones of the period. Instead, they used reason to examine complex political, religious, scientific, and ethical questions. Today, their works are minutely analyzed and vaunted as models for modern poetry. PBr
1604
Laws of Falling Bodies
Galileo Galilei
The theory that falling objects move under the influence of gravity alone
A sketch showing Galileo conducting his experiments from the Leaning Tower of Pisa. Although the story is famous, this was not actually the method that he used to prove his theory.
In the fourth century BCE, Aristotle maintained that an object falls with a speed proportionate to its weight. This idea was accepted until the sixteenth century, when in 1576 Italian mathematician Giuseppe Moletti reported that bodies of the same material but of different weight arrived at the earth at the same time. In 1586, Flemish mathematician Simon Stevin demonstrated that two objects of different weight fall with exactly the same acceleration. Then, in 1597, Italian philosopher Jacopo Mazzoni observed large and small fragments descending at the same rate.
“In a medium totally devoid of resistance all bodies would fall with the same speed.”
Galileo Galilei
In an apocryphal tale, Italian physicist and astronomer Galileo Galilei (1564–1642) dropped iron balls of unequal weight from the Leaning Tower of Pisa. In fact, Galileo determined the rate at which bodies accelerate as they fall by rolling balls down a sloping board in 1604. Galileo is credited with the definitive experiments on falling bodies because the contributions of others were not scientific; they did not measure time as Galileo did, and they did not use mathematics to establish their theories.
The fact that a lump of lead will fall faster than a leaf seemingly contradicts the rule that all bodies fall at the same rate. However, the two objects fall at different rates because of air resistance. This was demonstrated by U.S. astronaut David Scott (b. 1932) in an experiment on the moon, which has no atmosphere; a hammer and a feather were dropped from the same height and both struck the surface of the moon simultaneously.
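In the absence of air resistance, the height fallen from rest is h = ½gt², so the fall time depends only on the height, never on the mass. A minimal sketch in Python (assuming g ≈ 9.81 m/s² and, purely as an illustration, a 57 m drop, roughly the height of the Tower of Pisa):

```python
import math

G = 9.81  # gravitational acceleration at Earth's surface, in m/s^2

def fall_time(height_m: float) -> float:
    """Time, in seconds, for a body dropped from rest to fall height_m
    meters in a vacuum: from h = (1/2) g t^2, t = sqrt(2h / g)."""
    return math.sqrt(2 * height_m / G)

# The mass of the falling body never enters the calculation.
print(round(fall_time(57.0), 2))  # ~3.41 seconds
```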
Galileo also used experimental observation and mathematical reasoning to explain one-dimensional motion with constant acceleration, the acceleration due to gravity, the behavior of projectiles, the speed of light, the nature of infinity, the physics of music, and the strength of materials. His theories were instrumental in paving the way for the laws of motion and gravity formulated by Sir Isaac Newton (1642–1726). BC
1605
International Law
Hugo Grotius
A body of rules subscribed to by all nations for peaceful resolution of disputes
International law comprises the rules widely accepted and agreed upon for regulating the interactions of nations. Dutch philosopher and jurist Hugo Grotius (1583–1645) established its foundations on philosophies of natural justice. As nations were increasingly brought into contact with one another by improved transportation, thinkers wanted to establish a structure for dealing with disputes that would be understandable and acceptable to all the peoples involved.
Grotius’s involvement in encoding standards of international justice began with a legal case arising from the seizure by Dutch merchants of a Portuguese vessel in the Singapore Strait in 1603. The Netherlands and Portugal were at war, and a Dutch merchant, without his government’s or his company’s permission, had taken advantage by capturing the Portuguese cargo, which he then distributed to his company’s shareholders. The Portuguese were incensed, as were many of the Mennonite (and thus pacifist) Dutch shareholders. The scandal, and the ethical and legal debate that followed, inspired Grotius’s seminal tract On the Right of Capture, completed in 1605 but only published centuries later, in 1864.
“Insofar as international law is observed, it provides us with stability and order.”
J. William Fulbright, U.S. senator
Eventually, entities such as the European Court of Human Rights and the International Criminal Court were formed to adjudicate violations of international law. Importantly, the nations subject to the International Criminal Court and the European Court of Human Rights also consent to be subject to their governance. In this way, the sovereignty of the nation itself is protected while still allowing for accountability to the rest of the world. MK
1605
Don Quixote
Miguel de Cervantes
The first modern novel, and one of the most rewarding fictions of all time
A nineteenth-century oil painting of Don Quixote and his squire Sancho, by Alexandre Gabriel Decamps.
Don Quixote is one of the most widely read classics of Western literature. It was published in two volumes, in 1605 and 1615, by Spanish novelist, poet, and playwright Miguel de Cervantes Saavedra (1547–1616), who wrote it as a comic satire of the chivalric romances that were in vogue in his country at the time. The novel describes the misadventures of its protagonist, Alonso Quixano—an aging minor nobleman who takes the nobler name of Don Quixote—as he sets out on his horse, Rosinante, with his peasant squire, Sancho Panza, on a knight errant’s heroic quest.
Don Quixote is regarded as the first modern novel because, for the first time, the protagonists’ evolving characterizations, rather than their actions, are of principal interest. Earlier romances simply related events, but Cervantes looked beyond narrative to explore what could be learned from juxtaposing his characters’ personalities, especially those of the idealistic Don Quixote and the cynical, world-weary Sancho Panza. Their dialogue produces comedy and tragedy in equal measure, and yet it is never obvious what Cervantes wants his readers to think, or be persuaded to conclude, about the real world.
“And maddest of all, to see life as it is and not as it should be.”
Miguel de Cervantes, Don Quixote (1605)
The character of Don Quixote has become an archetype for the pursuit of idealistic goals; the word “quixotic” immortalizes this characterization. The phrase “tilting at windmills” has its origins in an episode of the novel in which the Don attacks a row of windmills, believing them to be gigantic demonic knights; the phrase has come to mean a tendency to pursue unattainable objectives. BC
1607
Rosicrucianism
Christian Rosenkreuz
A secret society dedicated to using arcane knowledge for humankind’s benefit
Frontispiece of Collegium Fama Fraternitatis (1618), a manifesto of the Rosicrucians, by Theophilus Schweighardt.
Rosicrucianism refers to the study or membership of a secret society called the Rosicrucian Order, or the Order of the Rose Cross. According to the order’s mythology, it was founded in Germany by the perhaps fictional Christian Rosenkreuz. Its philosophy is based on “esoteric truths of the ancient past” that offer insight into both the physical world and the spiritual realm. Other secret societies, including Freemasonry, took inspiration from Rosicrucianism.
The first manifesto outlining the Rosicrucian Order was anonymous and appeared in 1607; another surfaced in 1616. Both documents align the order with the Protestant movement and promote a “universal reformation of mankind” while relating a fantastical tale of the 106-year-old Christian Rosenkreuz and his trip to the Middle East to gain wisdom. Upon returning to Germany, Rosenkreuz discovers that he is unable to disclose any of his secrets to European leaders, so he collects a small group of friends—no more than eight, it is said—and founds the order. The members, all doctors and sworn bachelors, must swear to heal the sick without payment, to maintain the secrecy of the fellowship, and to replace themselves before they die.
“Summa Scientia Nihil Scire—The height of knowledge is to know nothing.”
Christian Rosenkreuz
Intellectual Europeans were intrigued by Rosicrucianism’s mystical and alchemical elements. The order gave rise to the precursor to the Royal Society, comprising scientists who held regular meetings to share empirical knowledge they had gained through experimentation. In addition, esoteric fraternities, such as the Freemasons, were formed throughout Europe and the United States to strengthen social alliances. MK
1608
Telescope
The Netherlands
The notion of an instrument capable of revealing the universe to humanity
The world’s first telescope was constructed in the Netherlands in 1608. The news of its invention reached Italian physicist and astronomer Galileo Galilei (1564–1642), and in order to further his study of the heavens he built one of his own the following year; he produced another of superior design the year after that. Although Galileo’s telescopes were crude by twenty-first-century standards, they enabled him to make observations of the valleys and mountains of the moon, the four moons of Jupiter, and the phases of the planet Venus, none of which had been examined before. Sir Isaac Newton (1642–1726) followed Galileo’s example with his reflecting telescope of 1668, and telescopes powerful enough to scrutinize the Milky Way galaxy were in circulation by the eighteenth century.
“Measure what is measurable, and make measurable what is not so.”
Galileo Galilei
The pantheon of telescopes now includes not only reflecting telescopes but also refracting telescopes, plus versions called Schmidt telescopes that incorporate both reflecting and refracting technology. Multimirror telescopes are used to make observations deep into the universe, while solar telescopes are designed specifically for investigations of the sun. Telescopes are themselves no longer earthbound; with the added capabilities of camera and broadcast technology, scientists have been able to send telescopes aboard spacecraft to capture images that would not be accessible from Earth. The huge Hubble Space Telescope, launched into orbit around Earth in 1990, is also able to take images of cosmic objects. The images, unlike those of earthbound telescopes, are not subject to distortion by Earth’s atmosphere. MK
1614
Logarithm
John Napier
A mathematical concept that reduces calculation times and accelerates scientific progress
The title page of the 1614 edition of Mirifici Logarithmorum Canonis Descriptio. The book discussed theorems in spherical trigonometry and introduced natural logarithms.
In Mirifici Logarithmorum Canonis Descriptio (Description of the Wonderful Rule of Logarithms), published in 1614, Scottish mathematician John Napier (1550–1617) introduced a concept that was to explode the possibilities of research in physics, mathematics, geography, and astronomy: the logarithm.
“[The logarithm is] an admirable artifice which, by reducing to a few days the labor of many months, doubles the life of the astronomer, and spares him the errors and disgust inseparable from long calculations.”
Pierre-Simon Laplace, mathematician
Napier’s name for his new mathematical device reveals its purpose: logos in Greek can be understood as “proportion,” and arithmos as “number.” Logarithms, then, are numbers that indicate ratios. Logarithms allow mathematicians, scientists, engineers, and navigators to perform mathematical calculations more efficiently; instead of completing manifold monotonous multiplication calculations, a researcher could employ logarithmic tables and slide rules to solve a mathematical question quickly. Mathematicians such as Nicholas Mercator, John Speidell, and Leonhard Euler expanded upon Napier’s concepts to develop the logarithmic form and rules that are still employed throughout the world in the present day.
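The laborsaving property at work is the identity that converts multiplication into addition (and division into subtraction):

$$\log(ab) = \log a + \log b, \qquad \log\!\left(\frac{a}{b}\right) = \log a - \log b$$

To multiply two numbers, a calculator of Napier’s era needed only to look up their logarithms in a table, add the two values, and read off the antilogarithm of the sum.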
The applications of the logarithm are far-reaching and critical. Logarithms simplify mathematics in fields such as astronomy, surveying, navigation, and physics, and so scientific discovery and innovation advanced more quickly than they might have without their aid. Researchers and practitioners continue to express ideas in logarithmic scales. For example, musical tones and intervals are expressed in logarithms, and Hick’s Law, which proposes a logarithmic relation between the time people take in making a choice and the number of choices they are presented with, is founded upon logarithms. Even though they might not be conscious of the exact calculations, humans use logarithmic reasoning in considering risk assessment: which issues they ought to be concerned with, and which are relative long shots. Logarithms also appear in studies of entropy, statistics, and information theory. MK
1616
False Tidal Theory
Galileo Galilei
The theory that the Earth’s daily rotation and annual orbit around the sun cause the tides
A portrait of Galileo Galilei by the Russian artist Ivan Petrovich Keler-Viliandi, painted in 1858. Galileo originally studied medicine before moving to philosophy and mathematics.
False Tidal Theory posits that the Earth’s “dual motion,” its once-daily spin on its axis combined with its annual journey around the sun, is responsible for bodies of water rising and falling in the form of tides. Eminent scientist Galileo Galilei (1564–1642) first proposed this theory in 1616 in an attempt to marry mathematics, astronomy, and physics, at the same time proving the Copernican model of a solar system in which the Earth and other bodies orbit the sun.
“[Galileo’s] aim was to substitute for a petrified and barren system of ideas the unbiased and strenuous quest for a deeper and more consistent comprehension of the physical and astronomical facts.”
Albert Einstein, theoretical physicist
Galileo’s inspiration for this theory of causation came when he was riding a barge carrying fresh water from Padua to Venice, Italy. Whenever the barge switched direction or changed its speed, he noted, the freshwater in the barrels sloshed in response. He then posited that, even though we cannot perceive it, the Earth is moving at different speeds and directions because it spins on its axis and around the sun. Other leading scientists of the day, including Johannes Kepler (1571–1630), argued for a tidal theory based on the gravitational effects of the moon, but Galileo pointed out that they had no empirical evidence for this, and that such ideas seemed suspiciously occult. When he eventually encoded his tidal theory in Dialogue Concerning the Two Chief World Systems (1632), Roman Catholic inquisitors judged Galileo’s defense of the Copernican system, including the tidal evidence, to be blasphemous, and he was placed under house arrest and his book banned from publication.
Kepler’s theory won the day, and nowadays we all readily attribute the rise and fall of the tides to the pull of the moon. However, Galileo’s dogged loyalty to his mistaken theory inspired successive scientists—Albert Einstein among them—to continue seeking a relationship between mathematics, astronomy, and physics, arguably leading to advances in quantum mechanics. While Galileo’s conclusion was wrong, his method opened up new possibilities. MK
1620
Naturalism
Francis Bacon
The belief that the universe’s workings may be understood by studying natural causes
Naturalism is the belief that we can acquire knowledge of how the world works by studying natural phenomena, not supernatural causes. Everything in the universe, from the existence of life to the motions of the planets and interactions between objects, is said to be governed and ruled by natural laws that humanity can investigate and understand. Naturalism thus holds both that only natural phenomena exist and that knowledge can be obtained only by natural means.
Questions about how the universe came to be, and why events happen as they do, are likely as old as humanity itself. Thinkers such as Thales of Miletus proposed naturalistic solutions to such fundamental questions as early as the sixth century BCE. During the Renaissance (c. 1450–1600), naturalistic explanations became more prominent. In 1620, English philosopher Francis Bacon (1561–1626) published Novum Organum Scientiarum (New Instrument of Science), in which he proposed a method of learning called inductive reasoning, whereby conclusions are drawn from observed data rather than deduced from presumed principles. Inductive reasoning, and the investigative method on which it is based, became essential to scientific inquiry.
“The subtlety of nature is greater … than the subtlety of the senses and understanding …”
Francis Bacon, Novum Organum Scientiarum (1620)
The natural world is, for the most part, one that is knowable, measurable, quantifiable, and predictable. Naturalism presumes that the world as we see it is what it is. In contrast, belief in supernatural phenomena stands in the way of understanding the world; humanity cannot exert control over supernatural phenomena or influence them, and, even worse, can offer no explanation or reason for their workings. MT
1621
Thanksgiving
U.S. Pilgrims
A traditional celebration to mark an auspicious event
The First Thanksgiving by Jean Leon Gerome Ferris, painted between c. 1912 and c. 1915.
Religious celebrations of gratitude took place among many settlers in the Americas in the 1600s. In the United States, the traditional celebration of Thanksgiving on the fourth Thursday in November is associated with the Pilgrim settlers of the Plymouth colony in present-day Massachusetts. The most common account of the first Thanksgiving links the celebration to 1621, when the Pilgrims joined with indigenous people to give thanks for a particularly good harvest after a difficult year within the settlement. Several other settlements in the Americas around this time also lay claim to having celebrated early Thanksgivings.
The idea of the Thanksgiving event had its origins in England during the Protestant Reformation, when reformers were anxious to replace Catholic public holidays with feast days of their own. A tradition began of celebrating fortuitous events with a special thanksgiving meal; conversely, adverse events were marked by a day of fasting. It was hoped that giving thanks to God might bring further good fortune, while fasting might prevent additional disasters.
“I do therefore invite my fellow citizens … to … observe the last Thursday of November.”
Abraham Lincoln, U.S. president 1861–65
Even though several of the symbols and traditions of Thanksgiving are taken from the story of the Pilgrims at the Plymouth colony, the holiday is now a celebration of a spirit of gratefulness rather than a commemoration of a particular day or event. As a religious celebration, Thanksgiving is intended to remind those who celebrate it of God as the provider of all good things. Thanksgiving in the United States is also celebrated with a secular appreciation of the work ethic and perseverance of the early U.S. colonists. TD
1622
Propaganda
Pope Gregory XV
The distribution by missionaries of Roman Catholic information or ideas, presented in ways surreptitiously designed to change people’s thinking or behavior
Portrait of Pope Gregory XV and Ludovico Ludovisi (seventeenth century) by Domenichino (Domenico Zampieri). Ludovisi, the pope’s nephew, was made cardinal to assist his aging uncle.
Few words in the English language are used as pejoratively as “propaganda.” The word conjures up black arts of misinformation and manipulation, with a perverted intention to mislead and deceive. Yet the original purpose of propaganda was more benign, even though its existence was viewed in a hostile way right from the beginning.
“Propaganda is a much maligned and often misunderstood word. The layman uses it to mean something inferior or even despicable. The word propaganda always has a bitter after-taste.”
Josef Goebbels, Nazi Minister for Propaganda and Enlightenment
The Protestant Reformation after 1517 seriously weakened the Roman Catholic Church. In the course of reforming Catholic practices and stamping out dissent, Pope Gregory XV (1554–1623) decided that his Church needed a single body to fight Protestantism. On June 22, 1622 he set up the Sacra Congregatio de Propaganda Fide (the Sacred Congregation for the Propagation of the Faith). This new body trained missionaries—or propagandists—to revive the Catholic faith in Europe and strengthen it in the European colonies across the Atlantic. Because the missionaries wanted people to accept the Church’s doctrines voluntarily, no force was permitted. Accordingly, the missionaries would resort to covert or hidden ways to persuade people to change their religious views. Not surprisingly, propaganda, as an instrument of the Catholic Church, was disliked by its Protestant opponents, and hostility to its overall concept has continued to the present day.
The word “propaganda” began as a term for any organization that set out to spread a particular doctrine or set of beliefs. It soon came to describe the doctrine itself, and, after that, the techniques used to change opinions in order to spread the doctrine. Propaganda can take many forms, both covert and overt, but in reality it is neither good nor bad in itself. Propaganda is merely a process of manipulation of a person’s behavior or views for someone else’s benefit. The only way to judge propaganda, therefore, is to determine whether or not it is successful. SA
1622
Slide Rule
William Oughtred
A manual analog computer designed to speed up lengthy calculations
It was the introduction of the logarithm by John Napier (1550–1617) in 1614 that made the slide rule possible. Like the logarithm, the slide rule is used to simplify tedious mathematical operations. The first adjustable logarithmic slide rule, a circular design, was created in 1622 by British mathematician William Oughtred (1574–1660). A typical slide rule today looks much as it did in Oughtred’s time: either circular or linear, with scales for mathematical computations: multiplication, division, roots, logarithms, and trigonometry.
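The slide rule multiplies by adding two lengths proportional to the logarithms of the factors. The idea can be sketched in a few lines of Python; the rounding to three significant figures is an assumption standing in for the limited precision of reading an engraved scale by eye:

```python
import math

def slide_rule_multiply(a: float, b: float) -> float:
    """Multiply by adding logarithmic 'lengths,' as a slide rule does,
    then keep three significant figures to mimic reading the scale."""
    exact = 10 ** (math.log10(a) + math.log10(b))
    return float(f"{exact:.3g}")

print(slide_rule_multiply(2.34, 5.67))  # 13.3 (exact product: 13.2678)
```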
Slide rules continued to undergo development as technology improved and demand for detailed calculations increased. For example, improvements were made by manufacturer Matthew Boulton (1728–1809) and by engineer James Watt (1736–1819) for the purpose of designing steam engines, and in 1814 the physician Peter Roget (1779–1869), the Roget of thesaurus fame, invented a slide rule for calculating powers and roots of numbers. In certain fields, such as aviation, customized slide rules were required to facilitate specialized mathematical operations. The slide rule so greatly simplified what had previously taken a great deal of time to calculate longhand that it is now referred to as an analog computer.
The advent of the scientific calculator made the slide rule obsolete, but slide rules offer the advantage of not relying on electricity or batteries, and they display all the operations of a calculation alongside the result. Calculating on slide rules is slower than on calculators or computers, and results are reliable only to about three significant digits, which can lead to serious error in work demanding greater precision. All the same, aviators and sailors continue to carry slide rules in case of instrument failure. The existence of the online International Slide Rule Museum attests to the ongoing significance and utility of the instrument. MK
1624
Deism
Lord Herbert of Cherbury
Belief in the existence of a creator who does not intervene in the universe
Deism is a religious philosophy that eschews the supernatural, religious dogma, and the idea of revelatory religious texts that are inerrant (totally free from error of any kind). Instead, the deist believes that a human’s ability to reason is proof in itself of a divine creator, though one that may merely govern the created universe without interacting with it.
Many of the tenets of deism existed in the classical world, and deistic principles were present in a variety of European cultures prior to the seventeenth century. In England, the term “deist” first appeared in print in The Anatomy of Melancholy (1621) by Robert Burton. English philosopher Lord Herbert of Cherbury (1583–1648) is often cited as one of the first proponents of English deism after his publication of De Veritate in 1624. The philosophy attracted many influential thinkers of the Enlightenment, including Jean-Jacques Rousseau, Voltaire, and a host of the founding fathers of the United States, such as Benjamin Franklin, Thomas Paine, Thomas Jefferson, and George Washington.
“Religion is a matter which lies solely between man and his God.”
Thomas Jefferson, U.S. founding father
Classical deism declined after thinkers such as John Locke and David Hume began attacking the underlying foundations of the belief system, and by the nineteenth century few people claimed to be deists. The lack of emphasis on a personal relationship with the divine led many believers to divert to other religious movements, while others turned to atheism. The idea, however, of a moderate position between the extremes of atheism and dogmatic religion is still influential, allowing many to believe in a divine creator while accepting antitheistic scientific notions. MT
1636
Contact Lens
René Descartes
The theory of using a water-filled vial resting on the eye to correct faulty vision
An illustration of how vision works, from a 1692 edition of René Descartes’s Opera Philosophica (1662).
In 1508 Leonardo da Vinci (1452–1519) composed a journal, Codex of the Eye, Manual D, in which he described how the perception of the eye alters when opened in water. Da Vinci was not interested in how to correct faulty vision, however; what concerned him was how the eye alters its optics to maintain clarity of vision when viewing an object at varying distances. For the artist, the eye was key to everything, but it never occurred to him, apparently, that a lens could be placed over the eye to correct faulty vision.
This idea would have to wait another 128 years. In 1636, French thinker and father of modern philosophy René Descartes (1596–1650) pondered whether, if a person were to fill a glass tube with water and then place that tube over the eye’s cornea, it might have the effect of correcting that person’s vision. Descartes’s idea was innovative but impractical, as the glass tube he proposed would be too thick to allow blinking.
“I take out … a double convex lens …. My eye immediately becomes presbyopic.”
Thomas Young, On Mechanisms of the Eye (1800)
In 1801 English scientist and founder of physiological optics Thomas Young (1773–1829) described astigmatism and constructed a rudimentary set of contact lenses following Descartes’s design principle. However, it would not be until 1887 that the first functioning set of contact lenses was made and fitted, in Zurich, Switzerland, by the German-born ophthalmologist Adolf Fick (1852–1937). Made from heavy blown glass, Fick’s lenses ranged in diameter from 19 mm to 21 mm and were shaped after first taking casts of rabbit eyes and those of deceased humans. Instead of resting directly on the cornea, these lenses were placed on the eye around it. BS
1637
Epistemological Turn
René Descartes
A philosophical change of approach, from “What do we know?” to “How do we know it?”
What is called the “epistemological turn” was a philosophical change in focus and point of departure that occurred in the seventeenth century with the emergence of modern philosophy. Questions about what a person can know to exist (epistemology) began to be seen as more fundamental than those concerning what does exist (ontology). The approach that had been dominant throughout philosophy’s medieval period then appeared to be “putting the cart before the horse,” since thinkers had taken for granted the existence of certain things—God, angels, and the soul—without seriously investigating why they believed in them.
“[The world …] Must vanish on the instant if the mind but change its theme.”
William Butler Yeats, “Blood and the Moon” (1928)
The epistemological turn started with rationalist philosopher René Descartes (1596–1650) and his procedure of doubting everything that is not known with absolute certainty—first described in works such as Discourse on the Method (1637)—and developed in the work of philosophers such as George Berkeley (1685–1753), David Hume (1711–76), and especially Immanuel Kant (1724–1804). The knower is now the original point of departure for the investigation, and also secures the objectivity of possible objects. That is not to say that the apple on the tree is only real because I see it; rather, it is to say that the apple is red and appears on the tree because my mind can take that sensory data and assemble that picture for me. One important result of the epistemological turn has been that people must now examine their beliefs about themselves, the world, and reality—as well as the justification for those beliefs—before making existential claims about such things. KBJ
1637
Rationalism
René Descartes
The belief that the nature of the world is best discovered by the exercise of reason
Rationalism, from the Latin ratio, meaning “reason,” is a philosophical viewpoint that sees reason as playing the main role in obtaining knowledge; we come to understand the world through our powers of reasoning and logic. The core canons of rationalism are three: we have a priori knowledge, that is, ideas independent of experience; we have innate ideas present at birth; and there exist laws of logical necessity, meaning that there are ideas that cannot be thought otherwise. Rationalism is often contrasted with empiricism, which sees experience as the origin of all knowledge.
Rationalism dominated the seventeenth century, and is often described as beginning with Galileo Galilei (1564–1642) and ending with Gottfried Leibniz (1646–1716). The theory was set out most fully in Discourse on the Method (1637) by philosopher René Descartes (1596–1650). Many of its core ideas were foundational for the Enlightenment of the eighteenth century, and for many philosophies today. The birth and rise of rationalism occurred alongside several important historical events and discoveries that ushered in the modern world. These include the decay of the medieval church system, the separation of church and state in many countries as a result of oppression, the discovery of the New World through navigation using the magnetic compass, the speed and span of conquest made possible by gunpowder and guns, and the implementation of the scientific method.
“I rejected as false all the reasons which I had formerly accepted as demonstrative.”
René Descartes, Discourse on the Method (1637)
Rationalism demonstrated to humankind that it was born with the tools and abilities to solve the mysteries of the universe. With or without God, the world was humankind’s to know. KBJ
1637
Substance Dualism
René Descartes
The idea that soul is distinct from body, and mental substances from bodily ones
In metaphysics, a “substance” usually refers to that which stands under, or grounds, something else. Typically, a substance is the owner and unifier of its properties, both essential and accidental, and it has within itself an impulse to develop or actualize its capacities or potential properties. Substances are often considered nonphysical living things as they are able to maintain exact similarity throughout change (unlike “property things,” such as cars or piles of sand).
Aristotle (384–322 BCE) and St. Thomas Aquinas (1225–74) identified substances with souls, maintaining that substances such as God, humans, animals, and trees all have souls, though the properties in each soul type vary. For example, God has an uncreated, eternal, rational soul/substance; a human has a created, conditionally immortal, rational soul/substance; an animal has a created, mortal, sentient soul/substance; and a tree has a created, mortal, vegetative soul/substance. Most religions and philosophies subscribe to some form of substance dualism, positing a real distinction between substance (soul or spirit) and physicality or matter.
“[Body and mind] may be made to exist in separation … by the omnipotence of God.”
René Descartes, Discourse on the Method (1637)
Aristotle taught that the soul forms the body it possesses, making the interaction between the two entities fairly intelligible. Yet, rationalist philosopher René Descartes (1596–1650), in Discourse on the Method (1637), taught that there are only mental substances (rational souls) and physical substances (matter), and that these share no such intimate connection. This has resulted in the so-called “mind/body problem,” which continues to plague philosophy students today. AB
1637
The Mind / Body Problem
René Descartes
The philosophical problem of the nature of the mind and of how it communicates with the body
An illustration from De Homine Figuris (Treatise of Man) by René Descartes, published posthumously in 1662. It is regarded as the first textbook of physiology.
What is the relationship between mental and physical events? How does the mind, an incorporeal and nonextended substance, communicate with the body, which is a corporeal and extended thing? These are the central questions of the mind/body problem, one of the oldest metaphysical problems in philosophy.
“ … it is certain that I, [that is, my mind, by which I am what I am], is entirely and truly distinct from my body, and may exist without it.”
René Descartes, Discourse on the Method (1637)
The most popular version of the mind/body problem appears in Discourse on the Method (1637), by rationalist philosopher René Descartes (1596–1650), under the label of Cartesian Dualism; however, it has roots in the philosophies of ancient Greece (Plato and Aristotle), and of medieval scholars such as Augustine, for whom the notion of soul takes the place of mind. The problem persists in modern philosophies of mind.
Descartes believed that the nonmaterial mind inhabited and expressed itself in a mechanically operated body, like a ghost in a machine. He was concerned with exactly how two unlike things could communicate, and to answer this he identified an interaction point in the brain: the pineal gland. Interactions are bi-directional in nature: sense perception and physical sensations are felt by the body and relayed to the mind, but awareness of and reactions to these things are supplied by the mind. So, “I think” and “I am” are products of the conscious mind, while “Ouch, I stubbed my toe!” and “I am cold” are supplied by the body, conveyed through the pineal gland. But can physical things be cleanly divided from mental things? Not really. Other attempted solutions to, or ways to avoid, this problem include various forms of dualism, materialism, monism, and epiphenomenalism.
Inquiries prompted by the mind/body problem have led to psychology, physiology, neurobiology, and neuroscience. The problem informs how we conceive of our embodied selves and our corresponding concepts of freedom, identity, and selfhood. KBJ
1637
Masters of Nature
René Descartes
The argument that the purpose of science is to enable humans to control nature
According to both Francis Bacon (1561–1626) and René Descartes (1596–1650), the purpose of natural philosophy—science—is to allow humans to gain power over nature and harness it for their needs. To be a master, or possessor, of nature (the universe) is to view science as a practical enterprise for human life and prosperity. Descartes’s Discourse on the Method (1637) was intended to provide humans with the knowledge needed to effectively attack, alter, and control nature.
Prior to the seventeenth century, this kind of thinking would have been seen as absurd and even impious because the study of science was expected to encourage contemplation; science was a form of spiritual discipline. By contemplating science, a person arrived at a sense of the higher moral order and purpose in the world, and achieved a sense of union with creation, attributed first to the Greek gods and later to the Christian God. Humans were only one element of the design, not the master of it all. However, Bacon and Descartes saw this approach as unfruitful, since science was producing nothing that improved the conditions for humans. To focus purely on moral purposes created a sterile, unproductive, and distorted kind of knowledge. They concluded that this notion of the universe as a moral framework should be set aside to make room for the human quest for efficient domination of nature.
Theirs was a victory of power and knowledge over mystery and wonder. With nature understood as a soulless machine whose causes and laws could be fully understood and utilized by humans, science could become a practical discipline that assisted humans in making the world a better place. Humans could be moral agents when using scientific discoveries for the betterment of all, as they did with medicine or meteorology, and at the same time be masters of their domain rather than its uncomprehending slaves. KBJ
1637
Fermat’s Last Theorem
Pierre de Fermat
A math problem that defied the world’s finest minds for more than 300 years
Fermat’s Last Theorem, also referred to as Fermat’s Conjecture, postulates that no three positive integers a, b, and c can satisfy the equation $a^n + b^n = c^n$ for any integer value of n greater than two. Before it was finally solved after 358 years, the Last Theorem was the world’s most difficult math problem, with some of the greatest mathematicians attempting a solution.
In 1637, Pierre de Fermat (c. 1607–65), a French lawyer and amateur mathematician, scribbled in his copy of Diophantus’s Arithmetica (third century) that he had “a truly marvelous proof” that $a^n + b^n = c^n$ has no solution in positive integers for n greater than two, but that “this margin is too narrow to contain” it. While Fermat never wrote out a complete proof, he did leave a proof for the special case of n = 4. This meant that subsequent mathematicians were left to prove the theorem for the cases in which n is an odd prime. Proofs for three, five, and seven were published in the next 200 years, but it was not until the mid-nineteenth century that Ernst Kummer (1810–93) proved the theorem for all regular primes. The theorem for every exponent n would not be settled until 1995.
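The theorem’s content, though certainly not its proof, can be illustrated with a brute-force search; the exponent and search limit below are arbitrary choices for illustration, and the theorem guarantees that the search comes back empty:

```python
from itertools import combinations_with_replacement

def counterexamples(n: int, limit: int):
    """Yield positive integers a <= b < limit with a**n + b**n equal to
    some c**n; Fermat's Last Theorem says none exist for n > 2."""
    powers = {c ** n: c for c in range(1, 2 * limit)}
    for a, b in combinations_with_replacement(range(1, limit), 2):
        if a ** n + b ** n in powers:
            yield a, b, powers[a ** n + b ** n]

print(list(counterexamples(3, 200)))  # prints []: no solutions among small cubes
```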
“Pierre de Fermat created the most profound riddle in the history of mathematics.”
Simon Singh, Fermat’s Last Theorem (1997)
In their struggle to solve Fermat’s Last Theorem, mathematicians were forced to expand upon existing mathematical structures; algebraic number theory was thereby advanced and, eventually, the modularity theorem proved. British mathematician Andrew Wiles (b. 1953) at last provided a proof of Fermat’s Last Theorem, to great acclaim, in 1995, and the impact of his feat reverberated through popular culture and academic circles alike. References to Fermat’s Last Theorem have appeared in the TV show Star Trek: The Next Generation (1987–94) and in Stieg Larsson’s novel The Girl Who Played with Fire (2006). MK
1641
Evil Genius
René Descartes
Everything we think we know was created by an evil genius and is, in reality, false
A watercolor by William Blake (1808), showing Satan training the rebel angels.
Suppose, posited French rationalist philosopher René Descartes (1596–1650), that everything you have ever perceived, all your knowledge, memories, observations, and sensory experiences, were false, and that your entire existence had been contrived by a powerful, malevolent being known as the “evil genius.” This evil genius created everything you know simply to deceive you and to convince you of things that are not true.
Descartes wrote about the evil genius in his work Meditations on First Philosophy (1641), in which he attempts to show that science and God’s existence can not only rationally coexist, but also provide a rational basis for knowledge itself. The idea of the evil genius allowed him to doubt everything he believed he knew, and thus to rid himself of all prejudices and historical assumptions imparted to him from the past. Facing the possibility of such systematic and pervasive deception, the work of a powerful, malevolent spirit, what could Descartes say he truly knew?
“If you would be a real seeker after truth … doubt, as far as possible, all things.”
René Descartes, Meditations on First Philosophy (1641)
Descartes’s hypothesis of the evil deceiver laid the foundation for what would become known eponymously as Cartesian skepticism, or Cartesian doubt. This universal doubt became a cornerstone of modern philosophy, with its focus on using logic and reason independently of the influence of the material world. Instead of studying the classical philosophic texts of the Greeks and Romans in order to understand the world, Cartesian philosophers employed their own reasoning capabilities to test the limits of knowledge and experience in search of truth. MT
1641
“I Think, Therefore I Am”
René Descartes
We can be certain of our existence once we understand that we are capable of thinking
In the quest to determine what is real, true, and good, we can begin with the basic knowledge that we exist. We reach this conclusion because we are able to understand our own thoughts, and know that in order to think them we must exist.
In his work Meditations on First Philosophy (1641), rationalist philosopher René Descartes (1596–1650) asked what, if anything, we can ever really know. If we assume that all our sensory perceptions, and the beliefs based upon them, are flawed and potentially erroneous, how can we come to any conclusion about the world? For Descartes, the answer lay with thought itself. In order to have any thoughts at all, a person must exist. Even if all our senses, memories, and everything we accept as true turn out to be false, the result of some universal deception, we can still be assured of our own existence because we are able to ask the question in the first place. In summary, Cogito ergo sum: “I think, therefore I am.”
“I am, then, in the strict sense, only a thing that thinks.”
René Descartes, Meditations on First Philosophy (1641)
Descartes is widely viewed as the foundational figure of modern Western philosophy. From his methodological doubt, he built a system of epistemology, the study of knowledge, that would shape philosophy for centuries to come. The major thinkers who came after him, among them David Hume, Baruch Spinoza, John Locke, and Immanuel Kant, all engaged with his work. His insistence on using reason on its own, free of the influence of perceptions, to obtain knowledge, and on the objectivity that this requires, had a lasting impact, both on those who agreed with his approach and on those who reacted against it. MT
1641
Isolationism
Tokugawa Shogunate
A policy by which a nation sets itself apart from the rest of the world
An engraving from c. 1901 showing a Dutch trader speaking to Ieyasu, founder of the Tokugawa shogunate.
Isolationism is a political doctrine designed to set a nation apart, diplomatically, militarily, economically, and culturally, from the world around it. A country with an isolationist government will protect its own economic base from foreign competition through tariffs and other forms of protectionism. It will often refuse to enter into trade and other economic agreements, will fundamentally oppose joining any political or military alliance, and will generally seek to maintain peace and security within its borders by avoiding foreign conflicts and disputes, primarily through policies of nonintervention. One of the most striking twentieth-century examples of an isolationist approach to foreign affairs may be seen in the determination of the United States government, at least initially, to stay out of the war in Europe in 1939 and 1940. But isolationism is by no means a modern political phenomenon; the history of nations wishing to remain apart from the world around them goes back many hundreds of years.
One of history’s most rigidly feudal and inward-looking regimes was the Tokugawa shogunate, also known as the Tokugawa bakufu, which ruled Japan from 1600 to 1868. From 1641 the shogunate enforced a policy known as kaikin (maritime restrictions), or sakoku, meaning “locked country.” The policy was isolationism at its most extreme: any Japanese man or woman who left the country, and any foreigner who dared enter it, risked the penalty of death. There were exceptions: limited trade with China continued, and a Dutch trading post (“factory”) at Nagasaki was allowed to operate under strict supervision. Exceptions aside, the policy remained in effect until the four U.S. warships of Commodore Matthew Perry’s “Black Ship” fleet arrived in Edo (Tokyo) Bay in July 1853. The shogunate then opened Japan to trade to avoid war with a technologically superior power. JS
1647
Quakerism
George Fox
The belief that God exists in all of us and finds expression in the conscience
A nineteenth-century lithograph by Edward Henry Wehnert, showing George Fox preaching.
Quakerism is a Protestant religious movement committed to belief in a direct relationship between the believer and God. Practitioners aspire to live in accordance with the “inward light.” Quakerism also goes by the names of the Society of Friends and the Friends Church, and it rejects codified ecclesiastical forms such as clergy and creeds. Quaker worship is characterized by its unstructured nature, and leaders of the movement encourage believers to focus on their private lives with the intention of achieving spiritual and emotional purity.
In the mid-seventeenth century, before the name “Quaker” came into use, small groups of English Protestants who did not identify with existing Puritan groups began to gather, organize, and evangelize. They were led by charismatic preachers, notably George Fox (1624–91), who began preaching in 1647 and created a Quaker membership structure that persisted for many generations. Numbers grew both in England and New England, and soon Quaker communities were flourishing along the U.S. East Coast, most notably in what is now Pennsylvania, where William Penn (1644–1718) attempted to govern the entire colony according to Quakerism’s principles of pacifism and religious tolerance. Although Penn did not turn out to be a strong governor, the influence of Quakerism remained politically potent in the following centuries, with Quakers on both sides of the Atlantic fighting to abolish the slave trade, advocating social justice and prison reform, and seeking pacifist resolutions to disputes.
Quaker influence persists in Western culture, especially in education and commerce. Colleges such as Bryn Mawr College, Cornell University, and Johns Hopkins University were founded by Quakers, and the Barclays and Lloyds banks were both established by Quaker families. There is also the Quaker Oats brand, although its name and imagery are not actually connected to the religious organization. MK
1651
Leviathan
Thomas Hobbes
A government with absolute authority is necessary to ensure humanity’s survival
The title page of Leviathan by Thomas Hobbes, first published in London in 1651.
According to Leviathan (1651) by English philosopher Thomas Hobbes (1588–1679), humanity is naturally selfish, egotistical, hedonistic, and bent on pursuing self-destructive interests. These natural tendencies lead inevitably to conflict and warfare, yet humanity also has the contradictory impulse to better itself and pursue happiness. In order to temper and balance these intrinsic, competing natures, Hobbes believed that an authority must exist, both to curb people’s natural destructive urges and to allow them to prosper. That “Leviathan” state, whether monarchy, democracy, or another form of government, must possess a monopoly on that from which ultimate authority derives: violence.
Hobbes published Leviathan after living as an exile in Paris during the tumultuous years of the English Civil War (1642–51). A royalist, he wrote the book largely to demonstrate his support for the monarchy. At the time, however, his views were seen by many as antithetical to the traditional notion that a monarch’s rule rests upon divine grant, rather than upon a social contract between the ruler and the ruled to keep peace in a civil society.
“The condition of Man … is a condition of Warre of every one against every one.”
Thomas Hobbes, Leviathan (1651)
While the notion that humanity is inherently destructive, and needs an absolute authority to govern it, is not universally agreed upon, the ideas that Hobbes expressed in Leviathan had a significant impact on many philosophers, politicians, economists, and social theorists. Utilitarianism similarly presumes humanity’s hedonism, while the works of Adam Smith and many other economists recognize the benefits of striking a balance between contradictory interests. MT
1651
Knowledge Is Power
Thomas Hobbes
Knowledge contributes to the material prosperity and peace of humankind
The idea that knowledge is power implies that, armed with information, an individual gains greater influence over others and events. Some would argue that all power ultimately derives from knowledge.
Although the phrase “knowledge is power” has been attributed to English philosopher and scientist Francis Bacon (1561–1626), it does not occur in that exact form in any of his works. It does appear, however, in De Homine (Concerning Man, 1658) by English philosopher Thomas Hobbes (1588–1679). Hobbes worked for Bacon as a young man, and the phrase may stem from their discussions.
Hobbes had outlined his political philosophy of knowledge as power in an earlier work, Leviathan (1651), subtitled The Matter, Form, and Power of a Commonwealth, Ecclesiastical and Civil. There he posited that history is knowledge of fact, while science is knowledge of consequences. A royalist sympathizer, Hobbes was writing against the backdrop of the English Civil War (1642–51), and although his works tackle abstract questions of political philosophy, his practical concern was to persuade the public to submit to sovereign power in order to maintain a commonwealth.
“Natural power is the eminence of the faculties of body or mind.”
Thomas Hobbes, Leviathan (1651)
Hobbes’s doctrine, that a legitimate state government is created when many people transfer power to one person or a few, who in return guarantee their safety and well-being, was both lauded and criticized. His detractors regarded his philosophy as secular, he faced accusations of heresy, and from 1666 he was forbidden to publish in England any works relating to human conduct. Nevertheless, Hobbes’s works became the foundation of social contract theory. CK
1653
Pascal’s Triangle
Blaise Pascal
A triangular array of binomial coefficients used to advance probability theory
In mathematics, what is known as Pascal’s triangle is a tabular presentation of numbers arranged in staggered rows. The triangular arrangement gives the coefficients in the expansion of a binomial expression such as $(x+y)^n$, and each number in a row is obtained by adding together the two entries diagonally above it.
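The additive rule is simple enough to capture in a few lines of code. The sketch below, in Python (a modern illustration, of course, not anything found in Pascal’s treatise), generates rows of the triangle by exactly the rule just described:

```python
def pascal_rows(count):
    """Yield the first `count` rows of Pascal's triangle."""
    row = [1]
    for _ in range(count):
        yield row
        # Each new entry is the sum of the two entries diagonally above;
        # padding with zeros supplies the 1s at both ends of the row.
        row = [a + b for a, b in zip([0] + row, row + [0])]

for r in pascal_rows(5):
    print(r)  # [1], [1, 1], [1, 2, 1], [1, 3, 3, 1], [1, 4, 6, 4, 1]
```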
The concept of Pascal’s triangle has its roots in Pythagorean arithmetic and Arabic algebra. Although it was described originally by Chinese mathematician Jia Xian (c. 1010–70), it is named after the French mathematician, physicist, and philosopher Blaise Pascal (1623–62), who was the first to recognize the importance of the patterns it contained. Pascal described the triangular table of numbers in his treatise Traité du triangle arithmétique (Treatise on the Arithmetical Triangle), written in 1653 and published posthumously in 1665. The triangle exhibits many patterns. For example, drawing parallel “shallow diagonals” and then adding together the numbers on each line results in the Fibonacci numbers 1, 1, 2, 3, 5, 8, 13, 21, and so on. (After the second number, 1, each number in the sequence is the sum of the two previous numbers.) Pascal used the triangle to solve problems in probability theory.
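The Fibonacci pattern can be checked numerically: the sum along the nth shallow diagonal is $\sum_{k} \binom{n-k}{k}$. The following brief Python check (an illustration added here, not part of the historical record) confirms the claim for the first eight diagonals:

```python
from math import comb  # binomial coefficients (Python 3.8+)

def shallow_diagonal_sum(n):
    """Add the entries of Pascal's triangle along the nth shallow diagonal."""
    return sum(comb(n - k, k) for k in range(n // 2 + 1))

print([shallow_diagonal_sum(n) for n in range(8)])
# [1, 1, 2, 3, 5, 8, 13, 21] -- the Fibonacci numbers
```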
Pascal’s triangle demonstrates various mathematical properties. Pascal’s work led to the discovery by English physicist and mathematician Isaac Newton (1643–1727) of the binomial theorem for fractional and negative indices, and contributed to the development of infinitesimal calculus by both Newton and German mathematician and philosopher Gottfried Wilhelm Leibniz (1646–1716). In the twentieth century, Polish mathematician Wacław Sierpiński (1882–1969) demonstrated that if all the positions in the triangle containing odd numbers are shaded black, and all those containing even numbers are shaded white, a geometric fractal resembling a mosaic pattern, known as the “Sierpiński gasket,” emerges. CK
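Sierpiński’s shading rule is easy to reproduce. The short Python sketch below (again a modern illustration, not Sierpiński’s own construction) prints “#” for odd entries and a space for even ones; the gasket’s nested triangles emerge within a few dozen rows:

```python
from math import comb  # binomial coefficients (Python 3.8+)

# Shade each entry of Pascal's triangle by parity: '#' if odd, ' ' if even.
for n in range(32):
    row = ''.join('#' if comb(n, k) % 2 else ' ' for k in range(n + 1))
    print(row.center(63))  # center rows so the triangle shape is visible
```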
1658
Orbis Pictus
John Amos Comenius
The first book to be written primarily for children
For centuries, children were regarded as small adults. They were trained for adult life and sent to work as soon as possible, although no doubt they enjoyed listening to folktales, fables, myths, and literature created for adults. Czech educator and writer John Amos Comenius (1592–1670) was the first person to successfully publish a book written specifically for children, the illustrated Orbis Pictus (Visible World), released in 1658. Resembling an encyclopedia, the work was designed to teach children Latin by means of pictures and short, memorable sentences in both Latin and the child’s mother tongue. Orbis Pictus was first published in Germany in Latin and German, and English, Italian, French, Czech, and Hungarian editions soon followed. For a century it was the most popular textbook in Europe. Orbis Pictus is the precursor of contemporary audiovisual aids to language learning.
In the late seventeenth century, English philosopher John Locke (1632–1704) argued that at birth the mind of a child was a tabula rasa (blank slate) waiting to be written upon by educators. His assertion changed the concept of childhood, giving rise to books of moral instruction aimed at children. English publisher John Newbery (1713–67) produced A Pretty Little Pocket Book in 1744, which was the first book written for children’s enjoyment and for education. By the nineteenth century, there existed fairy stories, novels, and poems that children could read purely for pleasure.
Today, there is debate regarding the definition of children’s literature, given that an adult may read and enjoy a children’s book, and, to a lesser extent, vice versa, as evidenced by numerous successful crossover works. Historians have also pointed out that a discernible children’s literature requires a recognizable childhood, in which case children’s literature dates from the eighteenth century, when the concept of “childhood” was recognized in philosophy. CK
1661
Chemical Substance
Robert Boyle
A material (in any state—solid, liquid, or gas) that has a definite chemical composition
The Greek philosopher Democritus (c. 460–c. 370 BCE) proposed that everything is made of invisible substances, indivisible “atoms” of different kinds, whose various combinations are responsible for all natural things. Using alchemical techniques, seventeenth-century scientists could isolate chemicals with stable properties, but these properties would change dramatically when the chemicals were mixed with others. Early chemists saw these chemical substances as the basic constituents of matter, but some philosophical scientists, including Robert Boyle (1627–91), suggested that the Greek atomists had been right: the basic chemical substances were made of even smaller atoms clinging together in certain combinations. Boyle’s book The Sceptical Chymist: or Chymico-Physical Doubts & Paradoxes (1661) inaugurated modern chemistry. Mathematical laws for chemistry were established after Antoine Lavoisier (1743–94) proposed the conservation of mass in 1783 and John Dalton (1766–1844) announced his theory of atomic weights in 1805. By the mid-nineteenth century, chemists had isolated dozens of “elements” that could not be further reduced by laboratory methods. The periodic table of the elements was proposed by Dmitri Mendeleev (1834–1907) in 1869.
“To liberate the chemical energy of coal, another substance is required … oxygen.”
Wilhelm Ostwald, chemist
A chemical substance is typically composed of atoms of two or more elements, combined in a definite ratio and bonded together to form a stable composition with specific properties. A chemical substance can be broken apart (by analysis) and rearranged into new compositions (by synthesis), and these processes involve the release or absorption of energy. JSh
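What a “definite ratio” means in practice can be shown with simple arithmetic (a modern illustration, not drawn from Boyle). Water always combines hydrogen and oxygen atoms in the ratio 2:1, and with atomic masses of roughly 1 and 16 its composition by mass is therefore fixed:

$$\frac{m_{\mathrm{O}}}{m_{\mathrm{H}}} = \frac{16}{2 \times 1} = 8$$

Every sample of pure water, whatever its source, is thus about 89 percent oxygen and 11 percent hydrogen by mass.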
1663
Magazine
Johann Rist
A regular publication that contains articles of common subject matter and interest
Illustrations from The Gentleman’s Magazine, February 1785.
The world’s first magazine is generally considered to be Erbauliche Monaths-Unterredungen (Edifying Monthly Discussions), published in German in 1663 by Johann Rist (1606–67), a theologian and poet from Hamburg. It lasted until 1668, by which time it had been joined by the French periodical Journal des Sçavans (Scholars’ Journal), the first academic journal. These early publications were specialist in nature, and generally contained summaries of developments in art, literature, philosophy, and science. This enabled knowledge of academic advances to spread much more quickly, particularly with regard to works published in unfamiliar languages. Moreover, the emphasis on new discoveries led to an intellectual shift away from established authorities and the classics.
“Ephemerality is the little magazine’s generic fate …”
Frederick C. Crews, The New York Review (1978)
It was not until 1731, however, that a general interest magazine, and the word itself, first appeared. Edward Cave (1691–1754) was the son of a cobbler and had been expelled from Rugby School for stealing from the headmaster. He conceived the idea of producing a regular periodical that would cover topics of interest to the general reader, and he called this innovation a “magazine,” a word for a military storehouse derived from the Arabic makhazin, meaning “storehouses.” Writing under the pen name Sylvanus Urban, Cave produced the first edition of The Gentleman’s Magazine in London in January 1731 as a monthly digest of news and commentary on topics as varied as commodity prices and Latin poetry. The new magazine was a great success and was published almost without interruption until 1922. SA
1665
Biological Cell
Robert Hooke
The microscopic unit that is the basic structural and functional component of all living organisms
An illustration of cork wood cells as seen under a microscope, from Robert Hooke’s book Micrographia (1665), believed to be the first major book on microscopy.