Without heavier-than-air flight, the world would seem a much larger place than it does today. Within fifty years of the Wright brothers’ flight, aircraft could fly faster than sound and travel across continents. A short time after that, aircraft were traveling into space. Military aircraft have changed warfare, while same-day intercontinental travel is now available to anyone sufficiently well-off to purchase a ticket. MT
1809
Lamarckian Evolution
Jean-Baptiste Lamarck
A theory concerning the evolution of living creatures that held that adaptive behaviors of parents resulted in physical characteristics in their offspring
According to Jean-Baptiste Lamarck’s theory of evolution, a given giraffe could, over a lifetime of straining to reach high branches, develop an elongated neck.
English naturalist Charles Darwin (1809–82) was not the first to use the term “evolution,” or to propose that today’s species are different from those living long ago. French naturalist Jean-Baptiste Lamarck (1744–1829), after contributing to biology and taxonomy, ambitiously suggested that vital forces within life guide it toward greater complexity, resulting in the evolutionary “progress” observable in higher life forms. Lamarck’s Philosophie Zoologique (1809) added another older idea, that useful acquired features of parents can be reproductively inherited by their offspring. Lamarck accounted for giraffes’ long necks by saying that the slightly stretched necks of some individuals were passed on to their offspring in the form of slightly longer necks; the offspring in turn would keep stretching their necks during their lifetimes before having their own longer-necked offspring; thus each generation had longer and longer necks.
“Lamarckian types of theory … are traditionally rejected—and rightly so—because no good evidence for them has ever been found.”
Richard Dawkins, biologist
The Lamarckian theory would mean that parents’ sperm and eggs are modified somehow by their achievements, but no evidence has supported this. In 1883 August Weismann (1834–1914) became the first Darwinian biologist to reject this theory of the inheritance of acquired characteristics, arguing that germ cells are produced in the body without any contribution from other somatic (body) cells formed during the parent’s lifetime. Yet, despite growing evidence favoring the modern evolutionary synthesis propelled by Weismann and Gregor Mendel (1822–84), scientists’ confidence in nonrandom evolutionary progress sustained a preference for Lamarck.
New research in epigenetics has revealed ways in which environmental factors can indirectly manipulate the timing and extent of genes’ activity. Epigenetics has aroused interest in neo-Lamarckian effects, but Lamarckian ideas of inheritance remain refuted. JSh
1810
Tonality
Alexandre-Étienne Choron
A way of organizing pitch that continues to dominate most forms of Western music
In 1810 French musicologist Alexandre-Étienne Choron (1771–1834) defined tonality in the way it is used today, as an organization of harmonic triads around a focal major or minor triad, the Tonic, with the Subdominant on the fourth scale degree, driving to the Dominant on the fifth, followed by a resolution to the Tonic. A variety of metaphors, such as “gravitational pull,” have subsequently been used to describe the perceptual forces at play in this arrangement.
The three chords of the Tonic, Subdominant, and Dominant provide a rudimentary harmonization of most tunes constructed from a diatonic scale. Since the early seventeenth century they have come to replace an older melodic/harmonic approach based on the twelve church modes. Today, the concept of tonality includes a complex syntax of chords surrounding a referential Tonic chord.
“Tonality” has been used in wider, less useful, senses, too, to describe pitch organizations around a referential pitch center, such as the scale system in North Indian classical music, and even to describe the organization of highly atonal music (such as “twelve-tone tonality,” the technique devised by the composer George Perle).
Notes that do not belong to a chord, or chords that do not belong to the key, create tension and require resolution; this was explored in the chromatic music of J. S. Bach in the eighteenth century and that of Richard Wagner in the nineteenth, culminating in Wagner’s operas Tristan und Isolde (1865) and Parsifal (1882). During the first decades of the twentieth century, a notion arose that the constant increase in complexity would result in a total breakdown of the system in favor of atonality. However, tonality continues to dominate music today, particularly in many genres of popular music, but also in neo-romantic music. PB
1810
Absolute Music
Germany
The notion that instrumental music has no purpose or content outside of itself
A German philosophical idea of the nineteenth century, “absolute” music refers to instrumental music that is “untainted” by words and extra-musical ideas; it is a pure art form. One of the first advocates for this concept was writer and composer E. T. A. Hoffmann (1776–1822), as expressed in his review of Beethoven’s Fifth Symphony in 1810. Richard Wagner (1813–83) later challenged this idea, arguing that poetry, dance, and music created a complete artwork, the Gesamtkunstwerk.
Defenders of absolute music, including German music critic Eduard Hanslick (1825–1904), pointed to the instrumental music of Johannes Brahms (1833–97) as the ideal. Hanslick argued that there is no strict causation between a particular musical phrase and its emotional effect, and that the same musical phrase set to diametrically different poetry could be considered equally appropriate. Music, in this view, has no purpose beyond itself. Hanslick never denied that a piece of music could engender extra-musical thoughts and feelings, but maintained that these emotions were not intrinsically linked to the music itself, only to the individual listener.
“Music cannot imitate nature: a musical storm always sounds like the wrath of Zeus.”
W. H. Auden, The Dyer’s Hand and Other Essays (1962)
The notion of absolute music represents a valid aesthetic problem: if music’s content is just “tonally moving forms,” as Hanslick termed it, why are there often agreements between different listeners as to the expressive properties of a composition? With the advent of silent cinema, music written to fit standardized dramatic situations—love scene, chase, agonized reflection—created a more or less fixed vocabulary of musical gestures. Perhaps there is a purpose for music after all: to provide support for moving images. PB
1816
Utopian Socialism
Henri de Saint-Simon
A vision of a just, happy society without poverty or the need for social controls
Occupations of women (c. 1830), as envisaged by the utopian socialist Saint-Simonian movement.
Utopian socialists describe a society with four features: it maximizes wellbeing for all; it satisfies standards of ethics and justice; it requires egalitarian cooperation; and all peoples can freely join. Utopian socialism emphasizes ethical cooperation and equal opportunity as the route to maximal wellbeing, and expects that all people will gladly join. If human nature is mainly good, then little civil control is needed in small-scale communalism, and freedom can be optimized. If human nature turns out to be a mixture of good and bad, then large-scale socialist government is recommended, and perhaps a single world government.
“The utopian socialists saw themselves as social scientists …”
Vincent Geoghegan, Utopianism and Marxism (2008)
The widespread adoption of Enlightenment ideas and mass democracy by 1800 encouraged consideration of ideal democratic societies able to eliminate feudalism and poverty, reduce wealth and caste distinctions, and rethink economic systems. Sociologist and political economist Henri de Saint-Simon (1760–1825) shocked Europe with L’Industrie (1816) and subsequent works, in which he proposed that government guarantee productive work to all who want it, along with the elimination of nonproductive classes, such as aristocrats, bureaucrats, and clergy. His collaborator, Auguste Comte (1798–1857), advanced scientific principles in social organization. Socialist Robert Owen followed, along with Charles Fourier and Pierre-Joseph Proudhon. Karl Marx and Friedrich Engels were indebted to this tradition, but their Communist Manifesto of 1848 identified utopian socialism as better than capitalism, yet less realistic than revolutionary communism. JSh
1819
Pessimism
Arthur Schopenhauer
Life consists of a doomed and ultimately meaningless striving for power
In their worldview, optimists expect higher values to prevail in the long run; realists or cynics expect few long-term successes; pessimists expect everything of value to disappoint and perish. All three apply objective ethical standards to praise or condemn reality. Forms of pessimism had existed as far back as ancient Greece, but it was not until the work of nineteenth-century German philosopher Arthur Schopenhauer (1788–1860) that modern pessimism was born.
In 1819 Schopenhauer published his theory that will, or the drive to power, is the ultimate reality. Inspired by ancient Hindu Upanishads and Buddhism, he perceived will to be behind everything that is biologically alive. Our ethical duty is to love and aid each other, since our achievement is worthy, but we also see that nothing lasts for long and death destroys each of us. The godless cosmos is nothing but ceaseless unsatisfiable striving that may be eternal but never amounts to anything. Schopenhauer goes further than Stoicism or Buddhism, counseling people to stop having children, embrace resignation, extinguish their will to live, and hasten death. Schopenhauer influenced dramatist Richard Wagner, philosophers Friedrich Nietzsche and Ludwig Wittgenstein, and author Jorge Luis Borges.
“It would be better if there were nothing … there is more pain than pleasure on earth.”
Arthur Schopenhauer
Contemporary atheists agree that the universe is meaningless and nothing good is eternal, yet they value life for its own sake, following scientific rationalists such as Auguste Comte and Herbert Spencer in expecting endless biological and technological progress. Few deny that this life is worthwhile. JSh
1821
Braille
Louis Braille
Using touch to read combinations of dots representing letters and numbers
The braille system of writing uses a code of sixty-three characters.
Louis Braille (1809–52), the inventor of the braille writing system for the visually impaired, became blind at a young age after suffering an accident. As a child at the Royal Institute for Blind Youth in Paris, Braille learned to read by running his hands over raised letters. In 1821, after learning about a military code known as night writing, one that employed raised dots to represent sounds to allow soldiers to read in the dark, Braille began to develop his system of using raised dots to represent letters instead of sounds.
Braille’s system substitutes tactile letters for visual ones. Today, braille can be written through the use of specially made typewriters that imprint paper with combinations of raised dots, or computers and printers that do the same; otherwise, users can emboss dots in paper by hand with the aid of a special stylus and slate. People who cannot see clearly use their fingers to touch the sequences of raised dots and interpret them as individual letters or numbers. Like other forms of writing, braille is a method of representing letters, rather than a language in itself. It can be used to represent any language, including musical notation.
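Because each braille cell is a six-dot grid, its sixty-three possible raised-dot combinations can be modeled as sets of dot positions. The short Python sketch below is purely illustrative (the dictionary name and helper function are inventions for this example), showing the first few letters of the standard braille alphabet.

# Dots are numbered 1-3 down the left column and 4-6 down the right.
# A character is represented by the set of dots that are raised.
BRAILLE_CELLS = {
    "a": {1},
    "b": {1, 2},
    "c": {1, 4},
    "d": {1, 4, 5},
    "e": {1, 5},
}

def to_cells(text):
    """Translate lowercase letters into their raised-dot sets."""
    return [BRAILLE_CELLS[ch] for ch in text if ch in BRAILLE_CELLS]

print(to_cells("abc"))  # [{1}, {1, 2}, {1, 4}]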
Prior to the invention of braille, visually impaired people had very few options for accessing literature. Conventional lettering could be raised for use as a tactile writing system, but creating new works of raised letters was unwieldy and complicated. With braille, reading and writing became much simpler for those with visual impairments, potentially allowing them to experience the entire written world.
After the introduction of braille, literacy rates among visually impaired people skyrocketed, especially after braille typewriters, and later printers, became readily available. Visually impaired people were even able to use computers once refreshable braille displays, which usually raise dots through holes in a flat surface, were introduced in the 1980s. MT
1822
Photography
Joseph-Nicéphore Niépce
The concept of capturing a permanent visual record of a fleeting moment in time
Niépce captured this view, one of the earliest surviving photographs, from his window in 1826.
The world’s first photograph was produced in 1822 by French inventor Joseph-Nicéphore Niépce (1765–1833). Using bitumen of Judea, a light-sensitive type of asphalt that hardens on exposure to light, he succeeded in obtaining a photographic copy of an engraving superimposed on glass. In 1826, using a camera, he took the world’s first permanently fixed photograph, a view from his workroom captured on a pewter plate. Light passing through the camera’s lens was converted into a two-dimensional, static, and durable image.
His collaborator, Louis-Jacques-Mandé Daguerre (1787–1851), perfected the stable daguerreotype in 1837. Ambrotypes (on glass plates) and tintypes (on metal plates) were replaced by the film photography of George Eastman (1854–1932) in the 1880s. Kodak introduced cheap color photography in the 1930s, which was followed by Polaroid’s instant photography (1960s), and digital cameras using charge-coupled devices (1980s).
As in painting, the choice of subject, perspective, context, framing, focus, and color tone all make photography a creative artistic medium. The positioning of human actors and the dramatic arrangement of action captured by the photographer permit a photograph to tell a compelling story, as in theater. The static scene can suggest a realism and a truth more objective than an eyewitness account. At the intersection of aesthetics, ethics, and epistemology, photography is the place where the beautiful, the good, and the true can temporarily unite. A beautiful photograph can lend goodness to its subject; a disturbing one can arouse a sense of injustice; and a justly taken image can be the truth. However, a photograph can mask its own agenda, and no other art form, perhaps excepting the novel, can serve ideology as effectively.
Today, photography is ubiquitous and inexpensive. However, with the advent of photo-editing computer software, the photograph can easily be made to lie. JSh
1823
Dark Night Sky Paradox
Heinrich Wilhelm Olbers
The question of why the universe’s billions of stars do not brighten the night sky
Olbers answered his paradox by suggesting that light from stars, such as these in the globular cluster Messier 10, is gradually absorbed while traveling through space.
Experience and common sense suggest that when night descends, the sky gets dark except for the moon and the bright pinpoints of light that we recognize as stars. However, this apparently straightforward phenomenon actually poses significant problems for scientific theories of an infinite universe. German physician and astronomer Heinrich Wilhelm Olbers (1758–1840) pointed out the paradoxical nature of night in 1823, arguing that if the universe really is endless and uniformly populated with stars, then, no matter from where the observation is being made, the night sky should be completely bright.
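The argument can be made concrete with a short calculation (a standard modern sketch rather than Olbers’s own wording). Consider a thin shell of stars of thickness \(dr\) at distance \(r\) from Earth. If stars of average luminosity \(L\) are spread uniformly with number density \(n\), the shell contains \(4\pi r^{2} n\,dr\) stars, while the light received from each falls off as \(1/4\pi r^{2}\), so the flux contributed by the shell,

\[ dF \;\propto\; \bigl(4\pi r^{2} n\,dr\bigr)\,\frac{L}{4\pi r^{2}} \;=\; nL\,dr, \]

does not depend on \(r\) at all. Summing over infinitely many such shells therefore predicts a sky as bright as the surface of a star, which is plainly not what we see.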
“Were the succession of stars endless, then the background of the sky would present us a uniform luminosity.”
Edgar Allan Poe, “Eureka” (1848)
The dark night sky paradox is almost always referred to as Olbers’ Paradox. However, British astronomer and cosmologist Edward Robert Harrison (1919–2007), in his widely accepted account of Olbers’ Paradox titled Darkness at Night: A Riddle of the Universe (1987), makes the case that Olbers was derivative in his description of the problem and actually contributed nothing of importance to the paradox bearing his name. Harrison asserts that German astronomer Johannes Kepler (1571–1630) advanced it far earlier, in 1610, in an argument against the theory of infinite stars in an infinite universe. Later work on this scientific conundrum was conducted in the eighteenth century by Edmond Halley (1656–1742)—after whom the comet was named—and Frenchman Jean-Philippe Loys de Chéseaux (1718–51), the latter being the first to state the paradox in its present form.
Regardless, mathematical physicist Lord Kelvin (1824–1907), in a barely noted paper of 1901, proposed a satisfactory resolution to the paradox: the light from stars in the far reaches of the universe, he argued, has not yet reached Earth, and that is why the sky is dark in between the stars that we observe. The darkness of the night sky is now also taken as evidence that the universe, rather than being static and eternal, is of finite age and constantly expanding. MK
1824
Greenhouse Effect
Joseph Fourier
The theory that the atmosphere traps infrared radiation from the Earth
A diagram demonstrating the absorption and reflection of solar radiation that is involved in the greenhouse effect, enabling the Earth to remain at a comfortable temperature.
The discovery of the “greenhouse effect” is attributed to the French physicist Joseph Fourier (1768–1830), who published his first article on the subject in 1824. Citing the researches of fellow physicist Horace-Bénédict de Saussure (1740–99) with “hot boxes”—miniature greenhouses covered with several panes of transparent glass to trap heat from the sun—Fourier concluded that atmospheric gases could trap heat in the same way as glass panes—a comparison that suggested the term greenhouse effect.
“Recent warming coincides with rapid growth of human-made greenhouse gases. The observed rapid warming gives urgency to discussions about how to slow greenhouse gas emissions.”
James Hansen, Earth and environmental sciences professor
Life on Earth depends on maintaining an average surface temperature comfortably between the boiling and freezing points of water. The planet is heated by absorbing solar radiation, but it also radiates that energy back toward space in the form of invisible infrared radiation. Every region of the planet would heat up during the day but rapidly cool to subzero temperatures at night were it not for gases in the atmosphere forming an insulating blanket that absorbs escaping heat and redirects it back to the planet’s surface. The so-called greenhouse effect is what keeps the planet’s temperature in a comfortable range.
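A rough energy-balance estimate (a standard illustration, not part of Fourier’s own account) shows how much this atmospheric blanket matters. Ignoring the atmosphere entirely and balancing absorbed sunlight against emitted infrared radiation gives

\[ \frac{S(1-A)}{4} = \sigma T^{4} \;\;\Rightarrow\;\; T = \left(\frac{S(1-A)}{4\sigma}\right)^{1/4} \approx 255\,\mathrm{K} \;\; (\text{about } -18\,^{\circ}\mathrm{C}), \]

where \(S \approx 1361\,\mathrm{W/m^{2}}\) is the solar constant, \(A \approx 0.3\) is the fraction of sunlight reflected back to space, and \(\sigma\) is the Stefan–Boltzmann constant. The roughly 33°C gap between this figure and the observed global average of about 15°C is the warming supplied by the greenhouse effect.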
The discovery of which gases in the atmosphere were responsible for this process was made by Irish physicist John Tyndall (1820–93) in 1859. Tyndall found that carbon dioxide and water were strong absorbers of radiant energy (unlike the main atmospheric gases, nitrogen and oxygen) and that even in relatively small quantities they could account for the heat trapped by the atmosphere. He also believed that changes in the various components of the atmosphere could result in changes to the Earth’s climate.
Today, Tyndall’s theory is at the center of fierce debate. Many scientists believe that human activity since the Industrial Revolution (1760–1840) has created an increase in “greenhouse gases,” causing not only higher average temperatures but also climate changes that could have severe consequences for life on Earth. GD
1824
Animal Rights
Lewis Gompertz
The belief that animals deserve at least some of the ethical rights of humans
A satirical artwork from 1921 that reverses the roles of humans and animals in vivisection.
Whether humankind owes a moral, ethical, or legal duty to animals has probably been discussed since people first began domesticating and using animals. Some early thinkers, such as the Greek philosopher Pythagoras (c. 570–c. 495 BCE), believed that animals deserve to be treated respectfully, and some religious traditions, such as Jainism, hold that humans should never engage in violence against any type of animal.
The first animal protection law was created in 1635 in Ireland, and subsequent laws in various nations prohibited various acts as cruel or inhumane. But the idea that animals have inherent rights did not appear until 1824, when English animal rights advocate Lewis Gompertz (1783/4–1861) published Moral Inquiries on the Situation of Man and of Brutes. Gompertz posited that animals, like humans, are entitled to liberty and to legally protected rights. More recently, in 1970, animal ethics philosopher and animal welfare campaigner Richard Ryder (b. 1940) coined the term “speciesism,” calling attention to the fact that human ill-treatment of animals does not differ from any other form of bigoted behavior that seeks to deprive others of rights. (Another form of speciesism would consist of human beings insisting that some species, lions for example, are entitled to more rights than, say, mice.)
“Arguments … cannot shatter this hard fact: in suffering the animals are our equals.”
Peter Singer, moral philosopher
Why should humans have rights and other animals not? That is the basic question at the heart of the idea of animal rights. For proponents and opponents alike, the idea of animal rights challenges our ideas of humanity, and why humans should, or should not, provide protection for other animals. MT
1824
Second Thermodynamic Law
Nicolas Léonard Sadi Carnot
Every physical reaction generates heat, and is therefore irreversible
French military engineer Nicolas Léonard Sadi Carnot (1796–1832) laid the foundations of the Second Law of Thermodynamics in his book Reflections on the Motive Power of Fire (1824). Carnot was postulating an ideal heat engine (one that converts thermal energy to mechanical work) to show how such a machine could achieve maximum efficiency. But he realized that no actual engine could ever attain the reversibility of motion of the ideal, since all actual processes inevitably generate heat, relative to the external environment. Carnot had recognized a fundamental paradox: the laws of physics are reversible, but this law was not.
English physicist James Joule (1818–89), fascinated by electricity and working to improve the efficiency of his industry, decided in 1834 that heat and mechanical work are essentially related. An exchange of letters with Belfast-born mathematician and physicist William Thomson (Lord Kelvin, 1824–1907) led the latter to coin the term “thermodynamics” in 1849. Meanwhile, German physicist Rudolf Clausius (1822–88), after reading Carnot, spent fifteen years formulating his Mechanical Theory of Heat. In 1865, he reformulated the second law: “The entropy of the universe tends to a maximum.”
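In modern notation (a later convention, not Carnot’s or Clausius’s own symbols), the two key results are usually written as

\[ \eta_{\max} = 1 - \frac{T_C}{T_H}, \qquad \Delta S_{\text{universe}} \geq 0, \]

where \(T_H\) and \(T_C\) are the absolute temperatures of the hot and cold reservoirs, \(\eta_{\max}\) is the greatest possible fraction of heat convertible into work, and \(S\) is Clausius’s entropy. Equality holds only for the perfectly reversible engine that, as Carnot realized, no real machine can ever be.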
“[Any theory against the second law will] collapse in deepest humiliation.”
A. Eddington, The Nature of the Physical World (1927)
Physicists still ponder the apparent paradox of the second law. Engineers continue to develop technologies in areas such as communications, electronics, and industrial development, trying to meet the apparently insatiable human hunger for energy and the comforts it affords. However, Carnot’s ideal heat engine remains as distant a prospect as ever. LW
1830
Mormonism
Joseph Smith, Jr.
A primitivist Christian ideology based on a revelation of its founder and on the Bible
Church of Jesus Christ Mormon Temple, Salt Lake City, Utah. Salt Lake City was founded by Mormons in 1847.
The founder of Mormonism, Joseph Smith, Jr. (1805–44), was a treasure hunter and diviner who saw visions. He revealed that, in the course of one of his revelations, an ancient prophet, Moroni, directed him to the burial place of two golden plates inscribed with the history of a previously unknown tribe of Native American Israelites. A transcription of the history of these people provided the basis of Mormonism’s most revered text, The Book of Mormon, which Smith published in 1830. The Book of Mormon, the Bible, and two other texts containing extensive contributions from Smith comprise Mormonism’s four standard works. Many passages in The Book of Mormon echo sections of the Old and New Testaments.
Smith’s revelation occurred in the 1820s, during the Protestant revival movement known as the Second Great Awakening, which was part of a general attempt to renew the Christian foundations of what was seen as an ailing or corrupt version of the faith. Smith’s family, already deeply influenced by the movement, provided a fertile base for his visions. Despite his lack of wealth or connections, Smith was charismatic and had a powerful ability to attract followers and convince them of the validity of his claims, however unlikely they might have seemed. Inevitably, he alienated some, and he may even have committed acts of fraud, but those who believed in him did so unreservedly. Mormonism became a cause of martyrs and pioneers. It taught a return to fundamental conservative principles: family, community, and hard work, with commitment, material and spiritual, to the Church of Latter Day Saints.
Mormonism is to Christianity what Christianity is to Judaism: it emerged out of the former religion but it has its own focus. As an ideological movement, it illustrates how the personality of one individual can invoke a response leading to an entirely new religion. LW
c. 1830
Germ Theory of Disease
Agostino Bassi
The theory that microscopic organisms are the cause of some diseases
A colored scanning electron micrograph of anthrax bacteria spores (Bacillus anthracis).
The germ theory of disease, now widely accepted in medicine, postulates that organisms invisible to the naked eye, known as microorganisms, can infect the human body and become the cause of diseases. At the time it was first suggested in the nineteenth century, the theory flew in the face of long-held claims by physicians that bad air, or “miasmas,” caused epidemics, or even that diseases were divine retribution for ungodly behavior.
In 1677 Anton van Leeuwenhoek (1632–1723) was experimenting with a microscope when he first discovered microorganisms. The breakthrough of associating these with infection and disease occurred in around 1830 when Italian entomologist Agostino Bassi (1773–1856) showed that muscardine, an infectious disease affecting silkworms, was caused by a living entity, a fungus later named Beauveria bassiana in recognition of his work. In 1844 Bassi asserted that microorganisms caused human diseases, too.
In the 1860s, French chemist and microbiologist Louis Pasteur (1822–95) confirmed Bassi’s theory that other microorganisms were linked to disease, also showing that microbes were responsible for decomposition. In 1876, German physician Robert Koch (1843–1910) proved that bacteria are the cause of diseases such as anthrax and tuberculosis.
While merely identifying the cause of a disease does not necessarily suggest a cure, the germ theory of disease transformed medicine from an art to a science. Discovering physical, identifiable, causes for specific diseases led to new understanding not only of how to treat illnesses, but also of how to prevent their spread. Germ theory led to a fuller understanding of the importance of clean water, care in personal hygiene, and hygienic food preparation. The development of pharmaceuticals targeted at specific harmful microorganisms would soon follow. MT
1830
Positivism
Auguste Comte
Only that which can be empirically verified or proved logically or mathematically exists
An engraved portrait of French philosopher and founder of positivism Auguste Comte, from c. 1840. Comte is also known as the founder of the science of sociology.
The word “positivism” is an English translation of Auguste Comte’s (1798–1857) French term positivisme. Developed from 1830 to 1845, positivisme was Comte’s reaction against contemporary German philosophy, which was highly speculative and metaphysical. He thought that the development of the physical sciences, with their emphasis on observation and experimentation, required that philosophy should base itself on a similar premise, rather than relying on ancient thought or the ideas of earlier philosophers. At the same time as Comte was developing positivisme in France, intellectuals such as John Stuart Mill (1806–73) and Herbert Spencer (1820–1903) were thinking along the same lines in Britain. Influences on the early positivists included the Enlightenment thinkers John Locke (1632–1704), David Hume (1711–76), and Immanuel Kant (1724–1804).
“By the very nature of the human mind, every branch of our knowledge is necessarily obliged to pass successively through three different theoretical states.”
Auguste Comte
Positivists maintain as a historical law that every science has three successive stages—the theological, the metaphysical, and the positive—and that the positive stage, which confines itself to the study of experimental facts, represents the perfection of human knowledge. A second and third wave of positivism in the late nineteenth and early twentieth centuries developed these ideas still further, with the movement splitting into two groups, one led by Paul-Emile Littré (1801–81), the other by Pierre Laffitte (1823–1903).
Positivism carried with it important implications for philosophical and religious thought. With its absolute insistence on empirical verification and the primacy of the senses, positivism rejected the idea of a personal God, replacing it with a humanist perspective that Comte and the early positivists promoted as a religion in its own right. It also spawned a number of new theoretical subject disciplines, including sociology, while new slants were provided on ancient disciplines, such as logic and the theory of knowledge. JF
1830
Transcendentalism and Self Reliance
Ralph Waldo Emerson
The purity of the individual is gained through identification with nature
Having first lectured on the individualistic belief system of Transcendentalism, Ralph Waldo Emerson presented his ideas in the essay “Nature,” published anonymously in 1836.
The Transcendentalism taught by U.S. essayist, lecturer, and poet Ralph Waldo Emerson (1803–82) consists of an open-minded experience of the natural world that goes beyond just being human to becoming an awareness of participation in existence, as expressed in the maxim, “I am a part or particle of God.” In 1830, Emerson gave a lecture containing his philosophy of Self Reliance—we must learn how best to provide ourselves with the essentials, and also learn what these are. Taken together, the two ideas promoted an examination of each strand of a person’s life; anything unnecessary was to be excised so that reality revealed itself. The philosophy applied as much to its form of expression as to the way of living it recommended.
“Not the sun or the summer alone, but every hour and season yields its tribute of delight; for every hour and change corresponds to and authorizes a different state of the mind, from breathless noon to grimmest midnight.”
Ralph Waldo Emerson, “Nature” (1836)
U.S. poet and philosopher Henry David Thoreau (1817–62), deeply influenced by his friend and mentor, extended Emerson’s philosophy through his practical experiment in living at Walden Pond, Massachusetts. Like Plato, Thoreau sought to show that the means by which human needs are met are often far from ideal.
In a sense, both Transcendentalism and Self Reliance were reactions to the onward march of pioneerism, the philosophy of “we shall overcome” directed at the land (and the indigenous peoples) of North America. Transcendentalism rejected this deep confidence in the materials and manners of so-called “civilization.” Instead, it sought to demonstrate how an appreciation of place was an appreciation of self, since the external and internal worlds were reflections of one another.
Emerson’s Transcendentalism was acknowledged by Friedrich Nietzsche in The Gay Science (1882), and it shaped the work of William James and John Dewey. With its emphasis on relying on one’s own hands and wits, and on paring back requirements to cover needs, Transcendentalism has recently been compared with Eastern thought and traditions that seek to link actions, abilities, and perspective in order to alter both the individual and the whole of existence. LW
1830
Inflation
Andrew Jackson
A steep increase in money circulation, then prices, in relation to the supply of goods
Children in the Weimar Republic play with banknotes made worthless by inflation (1923).
Inflation has been around ever since money and the value it represents were dissociated. The abolition of the Second Bank of the United States in 1830 by the seventh U.S. president, Andrew Jackson (1767–1845), led to a spiral of increased speculation that ended in the Panic of 1837. Yet the term “inflation” only started to appear during the American Civil War (1861–65), in reference to currency depreciation, when the rate in the Confederacy soared to 9,000 percent.
Inflation has two stages: money is printed in large amounts over a short period of time, and prices increase. Although the nominal value of the money stays the same, the amount each unit can buy decreases. Things cost more. Central banks attempt to keep inflation between 2 and 3 percent, but during a recession or depression, printing money can create the illusion of economic growth. In wartime, the pressure to conscript labor, increase the price of goods, and divert capital from civilian to military programs is even greater.
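One standard way of formalizing the link between money and prices (the quantity theory of money, a later development rather than anything dating from 1830) is the equation of exchange,

\[ MV = PQ, \]

where \(M\) is the money supply, \(V\) the velocity at which money circulates, \(P\) the price level, and \(Q\) the volume of goods and services traded. If \(M\) grows much faster than \(Q\) while \(V\) stays roughly constant, \(P\) must rise: the two stages described above.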
“Inflation had risen to the unimaginable figure of just over 100,000 percent …”
Jung Chang, Wild Swans (1991)
The delay between monetary inflation and price inflation can be deliberately extended: it lasted for two decades after World War II (1939–45) in the United States, when the country spent most of its hoard of gold. Except for profiteers, suppliers, and the rich, no one benefits during inflationary periods. They shatter the illusion that money is a secure form of asset protection. Political discomfort, then anger, then social unrest or even war often follow. Inflationary periods underline just how tenuous our hold on material security really is under systems that depend for their continuance on the idea of continuous growth. LW
1831
Hegelianism
Georg W. F. Hegel
A philosophical movement based on the work of German Idealist Georg W. F. Hegel
One of the German Idealists, Georg Wilhelm Friedrich Hegel (1770–1831) sparked a movement that was interpreted in radically different ways by those whom he inspired. Philosophy was “its own time, raised to the level of thought,” and Hegelianism both placed thought in its historical context and investigated the transcendental nature of reality. It held that contradictions demanded resolution at a higher level of intellectual complexity, with the infinite (or Absolute Spirit) characterized by “becoming,” rather than “being.” Attempts to reconcile this mysticism with a genealogy of political progress created deep tensions.
After Hegel died in 1831, interpreters of his work divided into three camps. The right read him as a defender of their religious and political conservatism and saw in his work an inevitability to history: the unfolding of events was a logical necessity. The left, in contrast, understood Hegelianism as a directive to social and cultural revolution. The center focused on Hegel’s theoretical significance, particularly in logic.
“Dialectics, logic, history, law, aesthetics … assumed a new aspect, thanks to … Hegel.”
George Plekhanov, The Meaning of Hegel (1891)
The right spawned schools of thought that advocated purely rational solutions and this, combined with criticism of Judaism as the paradigm of a “positive” (irrationally submissive) religion, paved the way for the spread of anti-Semitism. The left, as Karl Popper argued, spawned the legitimization of totalitarianism. The center influenced theories in fields as diverse as law and the study of beauty. Hegelianism had a historical, philosophical, social, and religious influence, but it was as a justification for political movements of every persuasion that it really came into its own. BC
1831
Think Tank
First Duke of Wellington
The notion of an independent, expert, policy-oriented research organization
Although the first known use of the term “think tank” was in 1959 in an article, “Facts and Impressions from the Groves of Academe,” in reference to the Center for Behavioral Sciences at Palo Alto, the concept originated much earlier. In 1831, Arthur Wellesley (1769–1852), First Duke of Wellington, initiated the founding of the Naval and Military Museum—thought to be the first think tank—which was renamed the United Service Institution in 1839. The institution was granted royal patronage in 1860, and, as a result of its promotion of informed debate, its influence began to be felt on British defense policy. The institute was expanded in the 1960s for independent study of national defense.
In the United States, businessman Robert S. Brookings (1850–1930) founded the Institute for Government Research in 1916. Reorganized as the Brookings Institution in 1927, its role was to bring nonpartisan expertise to policy questions of the day. In 1965 it became a major center for policy innovation in welfare, health care, education, housing, and taxation.
“Think tanks are increasingly prominent … in the policy processes of many countries.”
Think Tanks Across Nations (1998)
Ostensibly nonpartisan, think tanks often function as extensions of state power, gaining and losing influence with changes in government and shifts in ideological climate. Sometimes think tanks can function more independently, questioning and monitoring state strategies and structures. Think tanks conduct interdisciplinary research for governmental clients, while commercial projects include developing and testing new technologies and products. Funding sources include endowments, contracts, private donations, and sales of reports. BC
1831
Ready-made Clothing
George Opdyke
The innovation of mass-producing clothing to sell to the public
A photographic portrait of George Opdyke, taken between 1855 and 1865.
U.S. entrepreneur George Opdyke (1805–80) may or may not have invented the ready-made garment industry—the selling of finished factory or production line clothing in standard sizes—but he was certainly responsible for bringing it into mainstream society. The invention by Elias Howe (1819–67) of the power-driven sewing machine was still another fifteen years away, and Opdyke lived in an era of hand-sewn clothing, but neither factor stopped the Louisiana-based merchant from developing a small-scale production line of ready-made clothes in 1831. The South’s slave population required cheap, ready-to-wear clothing, and his store on Hudson Street, New Orleans, helped to provide it.
Opdyke had taken a riverboat down the Ohio and Mississippi rivers to New Orleans in 1827 after operating a dry-goods store in what was then the frontier trading post town of Cleveland. Upon arrival, he found that manufacturers were selling clothes at a profit of 100 percent, and he knew that he had found his vocation. He gathered around him a coterie of tailors and seamstresses and gave his new business a name: Opdyke Manufacturing. In his first year he made 6,000 dollars, and his future was assured.
“In early life [Opdyke] went to New Orleans and learned the trade of a tailor.”
Harper’s Weekly (December 21, 1861)
However, today’s dominance of ready-made clothing was not achieved instantly. In 1850, almost 80 percent of all clothing worn in the United States was still handmade, partly because tailors worldwide, facing the threat of automation and job losses, fought to keep their trade alive. Indeed, in 1841, a group of French tailors destroyed a factory of automated looms in Paris in a desperate effort to protect their livelihood. BS
1832
Motion Pictures
Simon von Stampfer and Joseph Antoine Ferdinand Plateau
The process of viewing a rapid succession of still images depicting a moving subject step by step, in such a way that the eye is tricked into seeing actual movement
Six stroboscopic disks, of the kind used in a Phenakistoscope. The disk was attached to a handle and then spun, creating the impression of a moving picture.
At least two men can claim to have been the first to invent the visual trick that is a “motion picture.” In 1832, Austrian inventor Simon von Stampfer (1792–1864) read how British physicist Michael Faraday (1791–1867) had experimented with rapidly rotating objects to create the illusion of movement. Impressed, he devised some experiments himself, and these led him to develop his version of a moving picture. The Stampfer Disk, presented to the public in December 1832, actually consisted of two disks, one with slits around its edge and the other with pictures showing stages of movement. When the slit disk turned in front of the picture disk, the pictures seemed to merge and join into the now-familiar sensation of seamless motion.
“The cinema is an invention without a future.”
Louis Lumière, filmmaker
However, Stampfer was not alone. In Belgium, in the same year, Joseph Antoine Ferdinand Plateau (1801–83), also inspired by Faraday, revealed an almost identical mechanism that he termed a Phenakistoscope. Plateau’s fascination with “persistence of vision” theory, the idea that the after-image on the retina persists for a short time, led him to experiment with staring at the sun. A decade later, Plateau was completely blind. Meanwhile, back in 1833, Stampfer was getting ready to receive imperial privilege for his discoveries.
More than sixty years passed before their motion picture devices developed into what we now call cinema. Connecting together single photographic frames, the French Lumière brothers, Auguste (1862–1954) and Louis (1864–1948), were the first, in 1895, to project moving photographic pictures to a paying audience of more than one. U.S. inventor Thomas Edison (1847–1931) produced the first commercially successful projector in 1896. Their inventions radically changed how we see the world, because now our ideas are almost totally subject to the way the world is presented to us. LW
1833
Knot Theory
Carl Friedrich Gauss
A means of describing the internal structure of knots in mathematical terms
In 1771, the French musician Alexandre-Théophile Vandermonde (1735–96) recognized that pinpointing the position of a one-dimensional, nonintersecting line in space would deepen mathematical understanding. But the world had to wait until 1833 for German mathematician Carl Friedrich Gauss (1777–1855) to develop a mathematical model that described the universal nature of loops that cannot be undone.
A mathematical knot occupies three-dimensional space, but it is formed from a single line that is always, in mathematical studies, joined to itself. Describing the position of this closed curve in numerical terms is a hugely complex problem: it requires a “geometry of position,” a notation to show how a looped and twisted three-dimensional object is located, both relative to itself and in relation to the space surrounding it. For this, Gauss defined the linking integral, which yields a whole number describing how many times one closed curve winds around another.
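In modern vector notation (a later formalization of Gauss’s idea), the linking integral for two closed curves \(\gamma_1\) and \(\gamma_2\) is written

\[ \mathrm{lk}(\gamma_1,\gamma_2) = \frac{1}{4\pi}\oint_{\gamma_1}\oint_{\gamma_2} \frac{(\mathbf{r}_1 - \mathbf{r}_2)\cdot(d\mathbf{r}_1 \times d\mathbf{r}_2)}{\lvert \mathbf{r}_1 - \mathbf{r}_2 \rvert^{3}}, \]

and its value is always a whole number: the number of times one curve winds around the other, which no amount of stretching or bending can change.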
People have been braiding and tying knots, using them to remember things and making patterns out of them, almost since the evolution of opposable thumbs. Developing a theory that described these twists and loops was a phenomenal challenge. Ultimately, knot theory is knowledge for its own sake, a kind of Zen appreciation of a koan (paradox) or problem, numerically defined. Perhaps, more than any other idea, it shows that sometimes solving a puzzle requires no further justification than its inherent merit. However, since its development, the theory has proved useful as a basis for understanding how DNA behaves and the arrangement and position of other polymers in space. It will also help in the development of quantum computers when our thirst for more powerful information processors outweighs the capacity of current silicon-based systems. LW
1833
Electrolysis
Michael Faraday
The generation of a nonspontaneous chemical reaction using electrical current
One of the most influential scientists of all time, Englishman Michael Faraday (1791–1867) was fascinated by electricity, magnetism, and the nature of energy and matter. In 1833, Faraday discovered how to reverse the natural process by which electromagnetic bonds are formed. He formulated two laws to illustrate how electrochemistry holds matter together, and described how to manipulate this pattern so that a new arrangement comes about.
Electrolysis requires a liquid medium, the electrolyte, through which a direct electrical current can be passed. The current splits the bonds that hold ions in their molecular positions at the anode, or positive end, of the reaction and causes them to enter into new molecular relationships at the cathode, or negative end. The ratio and rate of transfer depend on the nature of the solution, the amount of power used, and the material at the anode, but the process is universal.
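Faraday’s two laws of electrolysis are commonly combined into a single formula (the notation is modern, not Faraday’s own):

\[ m = \frac{Q}{F}\cdot\frac{M}{z}, \]

where \(m\) is the mass of substance liberated at an electrode, \(Q\) is the total electric charge passed through the electrolyte, \(F \approx 96{,}485\,\mathrm{C/mol}\) is the Faraday constant, \(M\) is the molar mass of the substance, and \(z\) is the number of electrons transferred per ion. Doubling the current or the time doubles the amount of material deposited.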
“Nothing is too wonderful to be true, if it be consistent with the laws of Nature.”
Michael Faraday
Controlling and reversing electrochemical reactions extended humanity’s sense of power over the environment. If we could manipulate such fundamental forces, we were not merely Earth’s inheritors, we were its masters. Being able to manipulate the very structure of matter at a molecular level enabled us to separate elements from naturally occurring compounds; this massively improved our ability to exploit natural sources of metals, and drove forward industrial and technological expansion. Electrolysis refined our ability to take charge of human progress. It also opened the eyes of the scientific world to the particular nature of matter and the forces that bond it together. LW
1834
Weber-Fechner Law
Ernst Weber and Gustav Fechner
The magnitude of a sensation is proportional to the intensity of the stimulus causing it
If you are in a dark room you can easily tell when someone lights a single candle, but if you are in a room with 1,000 burning candles you notice no difference in light if someone lights one more. The Weber-Fechner law explains why this is true. The law states that people only notice a difference in sensation when there is a proportionate increase in the intensity of the stimulus relative to the original intensity.
In 1834, German physiologist Ernst Weber (1795–1878) noted that a person holding a light object will easily notice when someone adds a small amount of weight to it, but someone holding a heavier object will not notice a small increase. He developed a law that stated there was a linear relationship between the level of stimuli and our perception of a just-noticeable difference in sensation. In 1860, Weber’s student Gustav Fechner (1801–87) improved on the law when he found that it was not a linear relationship, but rather a logarithmic one. As stimuli become more intense, it becomes harder and harder to notice a difference because your body requires a much greater amount of additional stimuli.
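In modern notation (introduced after Weber’s and Fechner’s original papers), the two results are usually written as

\[ \frac{\Delta I}{I} = k \quad \text{(Weber)}, \qquad S = c \ln\frac{I}{I_0} \quad \text{(Fechner)}, \]

where \(I\) is the intensity of the existing stimulus, \(\Delta I\) is the smallest noticeable change, \(S\) is the strength of the resulting sensation, \(I_0\) is the faintest detectable stimulus, and \(k\) and \(c\) are constants that differ from sense to sense. The logarithm captures Fechner’s refinement: equal ratios of stimulus, not equal increments, produce equal steps in sensation.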
“ … Weber’s law also underlies the octave tonal structure of music perception …”
György Buzsáki, Rhythms of the Brain (2006)
The Weber-Fechner law paved the way for psychophysics: the study of how sensations relate to stimuli, or how physical stimuli affect our perceptions. With the discovery of the law, we could measure how, and when, our senses caused us to perceive something new. We could also begin to answer the question that had troubled philosophers for so long: at what point does the physical process of sensation transform into the mental phenomena of experience and perception? In other words: where do mind and body meet? MT
1838
Supply and Demand
Antoine Cournot
The use of a combination of mathematics and economic principles to predict trends
The first person to understand and attempt to describe that most innate of all economic principles, the law of supply and demand, was not an economist but the French mathematician and philosopher Antoine Cournot (1801–77). Generations before it became commonplace to use mathematical models for the purpose of predicting trends and behavior in the marketplace, Cournot—in his misunderstood, largely overlooked, but nevertheless seminal work Researches into the Mathematical Principles of the Theory of Wealth (1838)—did something remarkable: he constructed the first formula for predicting how the rule of supply and demand might affect the price of an item or service.
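A minimal sketch of the kind of model Cournot made possible (an illustrative textbook example, not his actual equations) takes a straight-line demand curve and supply curve,

\[ Q_d = a - bP, \qquad Q_s = c + dP, \qquad Q_d = Q_s \;\Rightarrow\; P^{*} = \frac{a - c}{b + d}, \]

where \(P\) is the price, \(Q_d\) and \(Q_s\) are the quantities demanded and supplied, and \(a\), \(b\), \(c\), and \(d\) are positive constants. The equilibrium price \(P^{*}\) is the point at which the two curves cross, and a shift in either curve moves \(P^{*}\) in a direction the formula predicts.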
The world was slow to realize the importance of Cournot’s idea, and he became embittered and disillusioned. He rewrote Researches in 1863 to make it more “readable,” yet still it went unnoticed, which led to him becoming increasingly reclusive and melancholy. Remarkably, without ever having encountered Cournot’s work, the English economist William Jevons (1835–82), in his Theory of Political Economy (1871), presented ideas that Cournot had pioneered earlier. It was not until 1890, when the great neoclassical economist Alfred Marshall (1842–1924) expanded upon Cournot’s formula in his own monumental work, Principles of Economics, that it was brought into the mainstream and given the recognition it deserved.
Cournot used concepts such as function and probability to describe economic theories, and drew lines of supply and demand on graphs three decades before the practice became routine. It was a great pity that the significance of his breakthrough was largely unappreciated in his lifetime, for he had more or less invented the modern discipline of econometrics, the ability to measure economic theory and apply it to realistic, everyday situations. BS
1839
Cell Theory
Matthias Jakob Schleiden
All living matter—from microorganisms to mammals—is made up of cellular structures that interact through electrical and chemical activity
An illustration by Ernst Haeckel depicts stages in the embryonic development of vertebrates, from cell doubling (top row) to formation of separate germ layers (bottom right).
The invention and refinement of the microscope opened the cellular world to scientists hungry to discover more about how life was organized and of what it was composed. Dutchman Antonie van Leeuwenhoek (1632–1723) discovered microorganisms moving under his lens, while Robert Hooke (1635–1702), an English physicist who was also a distinguished microscopist, used the word “cells” to describe the divided units he observed in the structure of a piece of cork. However, the official formulation of cell theory is credited to two later individuals. First, in 1839, German botanist Matthias Jakob Schleiden (1804–81) suggested that every structural element of plants is composed of cells or their products. The following year, a similar conclusion was reached by German zoologist Theodor Schwann (1810–82) in relation to animals. All living existence had been proved to have a common denominator, and cell theory was born.
“The principal result of my investigation is that a uniform developmental principle controls the individual elementary units of all organisms.”
Theodor Schwann, zoologist
Schleiden saw that living organisms have a common architecture: they are all made up of basic organic units that, however much they vary in form, share the same essential organization and communicate systematically through electrical and chemical activity. This, the basis of cell theory, amounted to the first formal description of both the structure and the operation of universal features of life.
Cell theory led to the development of an entire discipline within biology dedicated to understanding how these units functioned and maintained themselves, and also how they divided or replicated, and exchanged information. The cell became recognized as the fundamental unit of life, and disease became defined as the altered functioning of cellular activity. As important a leap in scientific understanding as the discovery of DNA, cell theory opened up possibilities in the fields of evolutionary theory, medicine, and microbiology. LW
c. 1840
National Nativism
United States
The belief that new immigrants are a threat to their longer-established countrymen
An anti-immigration cartoon from 1888 depicts “a possible curiosity of the twentieth century. The last Yankee.”
Nativism is an ideological or political point of view in which the interests of an established society are zealously preferred and perpetuated over the interests of newly arrived immigrants. It is pervasive, universal, and persists to this day. On a national level, however, it first came to prominence in around 1840 in the United States as a response to successive waves of migrants arriving on the eastern shores of the New World from Europe; it intensified when 100,000 Irish Catholics fled their nation’s potato famine in 1847.
A controversial political party that emerged at this time was the Know-Nothing Party, made up of people who wanted to keep immigrants from setting foot in the New World and who would, failing that, do all they could to prevent them from participating in society once they did. The adherents to this ideology of irrational prejudices were given a name: nativists.
“Immigrants are … paupers, and diseased, and become a charge upon the Town …”
Cholera warning, Burlington, Vermont (1849)
Nativism had various guises: It was middle-class elitism that looked down on socially and intellectually “inferior” immigrants; it was a fear that immigrant votes would distort the body politic; and it was also a fear of competition in the workplace. But mostly it was a raft of anti-Catholic sentiments, embedded in a deeply Protestant nation and directed against an imagined papal plot, long expected to surface and attempt to subvert the new U.S. republic. A young Abraham Lincoln (1809–65) denounced the movement, saying he would prefer to live in Russia where “despotism is out in the open” rather than live in his own country where “all men are created equal except for Negroes, foreigners, and Catholics.” BS
1840
Property Is Theft
Pierre-Joseph Proudhon
The notion that any claim to ownership is incompatible with communal justice
From the Genevan philosopher Jean-Jacques Rousseau (1712–78) onward, socialists have maintained that the idea of private property ownership flagrantly flouts the natural rights of community members who, they argue, should enjoy the material benefits of the environment in which they live. Thus, any cultural norm that takes away those rights does a disservice to the community. French politician and philosopher Pierre-Joseph Proudhon (1809–65) went further: private ownership is not only a negative infringement, it is a positive crime.
Proudhon’s political outlook can be traced, to some degree, to his own impoverished background. After enduring a harrowing upbringing of poverty and long periods of unemployment as an adult, Proudhon was eventually awarded a bursary to support his studies in Paris. It was there that he developed the idea, in his first major work, What Is Property? (1840), that property is theft. Like Karl Marx, he foresaw possession and power becoming concentrated within an elite of diminishing size, until revolution redressed the balance. Unlike Marx, however, he saw ultimate redress as being the entire dissolution of the state. Historic disenfranchisement would be corrected, he believed, when human beings, freed by anarchy, could rely on their natural harmoniousness and work things out for themselves.
“[The individual] is always subordinate to the right … the community has over all.”
Pierre-Joseph Proudhon
Proudhon’s ideas are helpful in understanding the unfolding dichotomy in the world between libertarian or rights-based notions of unfettered ownership and the growing dissent expressed by disaffected members of society for whom such notions are merely excuses used by a wealthy elite to defend their power. LW
1840
Anarchism
Pierre-Joseph Proudhon
The belief that government infringes the rights of individuals or of the collective
The anarchist as he appears in Pierre-Joseph Proudhon and His Children (1865), a portrait by Gustave Courbet.
Anarchism—the belief that any form of political coercion, including government, is illegitimate and inefficient, and therefore should be opposed—emerged as a developed political philosophy only in the nineteenth century, with the consolidation of the modern state. Many consider French politician, philosopher, economist, and socialist Pierre-Joseph Proudhon (1809–65) to be the first self-described anarchist. In What Is Property? (1840), he urged the replacement of government with a social organization based on voluntary contractual agreement.
Appropriately, there is no party line in anarchism. Anarchists differ in their analysis of the wrongness of government: philosophical anarchists regard it as illegitimate; ideal anarchists regard it as inefficient compared to the anarchist alternative (individualism, communism, or something in between); and revolutionary anarchists regard it as so immoral and harmful that it deserves violent resistance and overthrow. These are not mutually exclusive attitudes, of course. Anarchists also differ in their views on the ideal system of society and distribution of property: communitarian anarchists hold various views—such as mutualism, collectivism, communism, and syndicalism—acknowledging the legitimacy and importance of voluntary, noncoercive, nonhierarchical social organization and rejecting, in varying degrees, the idea of private property, while libertarian anarchists insist on the sovereignty and primacy of the individual and private property. Variants abound: there are also green, Christian, and feminist anarchisms, to name a few.
Despite the continuing intellectual vitality of anarchism, its prospects for effecting a substantial political change are generally considered to have waned in the wake of World War I (1914–18) and the Russian Revolution of 1917. GB
1842
The Doppler Effect
Christian Doppler
The theory that movement measurably compresses or stretches energy waves
An image illustrating the different red shift values of Stephan’s Quartet group of galaxies and NGC 7320.
When an object that is emitting waves, such as sound waves, is moving, the wavelength ahead of the object is shortened relative to the wavelength the source actually emits. You might say that the forward motion crushes the waves together; at the same time, the waves behind the moving source are correspondingly stretched out. This change in wave pattern is what is meant by the Doppler effect, or the Doppler shift, and is familiar from the altering sound of a race car or police siren as it passes.
Austrian physicist Christian Doppler (1803–53) first described the effect in relation to astronomy in On the Colored Light of the Binary Stars and Some Other Stars of the Heavens (1842). Doppler was determined, after enduring an at times humiliating journey through academia, to demonstrate his ability to apply his genius to natural phenomena. He named the effect while seeking to explain differences in the colors of double stars. His principle was swiftly demonstrated in relation to any type of wave—sound, light, water—emanating from a moving source. It also applies to an observer moving relative to the medium through which the wave is transmitted, and to situations in which the medium itself is in motion.
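In modern notation (a sketch rather than Doppler’s own 1842 derivation), the classical relation for a wave of emitted frequency f_s traveling at speed v through a medium, with observer and source moving toward or away from each other at speeds v_o and v_s, is:

```latex
% A sketch in modern notation, not Doppler's original 1842 formulation.
% f_s: emitted frequency; v: wave speed in the medium;
% v_o, v_s: speeds of observer and source along the line joining them.
\[
  f_{\mathrm{obs}} \;=\; f_{s}\,\frac{v \pm v_{o}}{v \mp v_{s}}
\]
% Upper signs: observer and source approaching (frequency raised);
% lower signs: receding (frequency lowered).
```

The abrupt switch from the upper to the lower signs as a source sweeps past an observer is exactly the drop in pitch familiar from the passing siren described above.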
The Doppler effect now helps with predictions in such diverse fields as meteorology, navigation, and medical diagnosis. As Doppler himself predicted, once instruments had developed sufficiently to take the measurements, scientists could determine the directional motion of stars. This alerted them to “the red shift,” whereby light from a star, as observed from the Earth, shifts toward the red end of the spectrum (with a lower frequency or longer wavelength) if the Earth and star are receding from each other. In other words, they discovered that we are in an expanding universe. It follows that things were once very much closer. The Doppler effect thus contributed to a seismic shift in our understanding of the universe. LW
1843
Cartoon
John Leech
A lighthearted illustration with the purpose of provoking discussion
The word “cartoon” comes from the world of fine art and refers to a board used to sketch out rough ideas for paintings or sculptures before a final version is attempted. However, it was not until the establishment of modern print media that the cartoon as a piece of humorous art became properly recognized. All cartoons have developed from a tradition of individual pictures presented to amuse the viewer, certainly, but also to provoke a reaction in that viewer.
On July 15, 1843, the sketch “Substance and Shadow,” by English artist John Leech (1817–64), appeared in Britain’s Punch magazine. This marked the first use of the word “cartoon” to refer to a satirical representation of fictional characters whose figures and features depicted types (namely poverty and greed in Leech’s first offering). The first popular fictional cartoon character, designed simply to entertain, had appeared much earlier, in 1809, and this had presaged the enduring appeal of a cartoon with a storyline. A bony, elderly gentleman, Dr. Syntax, had a penchant for traveling and an unfortunate propensity for mishaps. His creator, Thomas Rowlandson (1756–1827), developed an entire fictional history for Dr. Syntax, generating a hunger for merchandise (prints and postcards), and sowing the seeds for much profitable franchising since.
The idea of the cartoon has been influential on two fronts. First, it has been popular as straightforward, humorous entertainment, generating the pantheon of characters that have become a mainstay of children’s television; second, the cartoon has had adult possibilities, for social and political commentary and nonlibellous satire, but also for adult sexual fantasy. From innocuous child icons to images of evil, cartoons now span and imitate the entire range of human possibility. Not being human, they have limitless abilities; being insensitive, they can be made to explore the heights and depths of human imagining. LW
1843
Commercial Air Travel
William Henson and John Stringfellow
Profiting through providing air transport for the general public
A lithograph of the Aerial Steam Carriage (1842); far from flying over pyramids, it never flew at all.
Italian artist and inventor Leonardo da Vinci (1452–1519) had made drawings of flying machines with flapping wings, and Frenchmen Jean-François Pilâtre de Rozier (1754–85) and François Laurent d’Arlandes (1742–1809) had made the first ascent in a hot-air balloon in 1783. However, it was British inventor William S. Henson (1812–88) and his friend and fellow engineer, John Stringfellow (1799–1883), who first investigated in earnest the possibility of a profitable flying business. From their observations of bird flight, they concluded that a fixed-wing design propelled by a sufficiently robust power plant offered a promising combination.
In 1843, a full sixty years before Wilbur (1867–1912) and Orville (1871–1948) Wright made their inaugural flight in such an aircraft in 1903, the pair set up the Aerial Transit Company with the idea of attracting the investment necessary to build their first prototype. Unfortunately, their extravagant advertising campaign scared off investors, and even the engineer Sir George Cayley (1773–1857), Henson’s mentor and inspiration, refused to back them unless they could demonstrate a working model of their Aerial Steam Carriage, patented in 1842. The aircraft never flew, the scheme collapsed, and Henson emigrated. But the public had been introduced to the possibility of exotic destinations being only a commercial flight away.
It is impossible to overestimate the impact of commercial air travel on the character of modern industrialized society. The holiday destinations of the world have changed visitors and hosts alike, and tourism has brought new possibilities for cross-cultural mobility, but the exploitation of lands and peoples has raised new questions. The industry never approached the heights of profitability that its early investors and inventors envisaged, but it undoubtedly changed the way we think of distance, and how we weigh the costs and consequences of our leisure pursuits. LW
1843
Ring Theory
William Rowan Hamilton
The study of “rings,” abstract algebraic structures in which addition and multiplication are defined, whether commutative (where the order of multiplication does not matter) or noncommutative
The largely self-taught Irish mathematician Sir William Rowan Hamilton, here photographed in 1857, had been made professor of astronomy at Trinity College, Dublin, in 1827.
Ring theory, in simple terms, is the study of mathematical sets in which addition and multiplication are possible. There are really two ring theories: “commutative” ring theory, in which the order of the elements added or multiplied does not affect the mathematical outcome, and the more complex “noncommutative” ring theory. The latter was developed from the theory of quaternions of Irish physicist, astronomer, and mathematician Sir William Rowan Hamilton (1805–65). Hamilton remarked, “ … there dawned on me the notion that we must admit, in some sense, a fourth dimension of space.”
“The mathematical quaternion partakes of both these elements; in technical language it may be said to be ‘time plus space,’ or ‘space plus time’: and in this sense it has, or at least involves a reference to, four dimensions.”
William Hamilton, quoted in R. P. Graves’s Life of Sir William Rowan Hamilton (1882)
The motivation behind the search for an integrated theory of rings lies in the depth of the problems they represent. In mathematics, rings are sets in which two operations, addition and multiplication, satisfy certain axioms. Study of the mathematical properties of such number systems is ancient, but by the seventeenth century, abstract numerical problems were attracting much interest. Fermat’s Last Theorem, the most fiendishly difficult of all, posed by Frenchman Pierre de Fermat (1601–65), was tackled by Richard Dedekind in the 1880s, and out of his work arose the first comprehensive theory of algebraic numbers. This theory, which included the idea that sets could be treated as integers, was the generalization that first allowed commutative ring theory to evolve. Meanwhile, Hamilton’s “quaternion,” discovered in 1843, had broken down the commutative property of multiplication and opened the way for noncommutative ring theory.
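As a brief illustrative sketch (in present-day notation and Python, neither of which is Hamilton’s own), representing quaternions as four-tuples shows how the Hamilton product breaks the commutativity of multiplication:

```python
# A minimal sketch: quaternions as (w, x, y, z) tuples under the Hamilton product,
# illustrating that multiplication is noncommutative (i*j = k but j*i = -k).

def qmul(a, b):
    """Hamilton product of two quaternions given as (w, x, y, z) tuples."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    )

i = (0, 1, 0, 0)
j = (0, 0, 1, 0)

print(qmul(i, j))  # (0, 0, 0, 1)  ->  i * j =  k
print(qmul(j, i))  # (0, 0, 0, -1) ->  j * i = -k
```

It is precisely this failure of the commutative law, harmless in ordinary arithmetic, that noncommutative ring theory takes as its starting point.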
Following their development as abstract problem solvers, both types of ring theory have helped to further scientific understanding, both in mathematics and, practically, in physics. The theories have altered how we conceive of the relationships between numbers, and this has had profound implications in our understanding of the structure of the universe. LW
1843
Enlightened Despot
Wilhelm Roscher
The notion of an absolute monarch influenced by the Enlightenment
Monarchs, from the eighteenth century onward, who pursued legal, social, and educational reforms inspired by the Age of Enlightenment became known as “enlightened despots.” The theory held that leaders thus inspired had the authority, sense of duty, and character to institute administrative, economic, and agricultural reform; to increase separation of the powers of the church and state; and to develop health and educational systems. On the other hand, reformation of the monarchical system itself was ruled out, and there was no move to disrupt the existing social order.
During the second half of the eighteenth century, European ideas about how to govern began to shift under the influence of the Enlightenment. These ideas, including strong arguments for increasing equality and religious toleration, and for involving wider sectors of the population in political decision making, threatened the status quo and, it was argued, the stability of the state. In response, German economist Wilhelm Roscher (1817–94) put forward his concept of enlightened despotism in 1843 (although his term, coined in 1847, was “enlightened absolutism”). Roscher saw in his idea the final stage of a tripartite process of monarchical emancipation. He argued that reform could take place without dismantling the state, and intellectual progress could take place through the existing order.
Today, the oxymoronic or seemingly contradictory idea that a monarch could be enlightened and a despot at the same time creates confusion, and to some extent the idea created just the same tension at its inception. Present-day dictatorial regimes use justifications that echo those of the first apologists. Their leaders and supporters may genuinely consider themselves to be social reformers, ruling in the interests of all, but their efforts are subject to human fallibility and the dictates of pragmatism. The despots were no different. LW
1843
Emergentism
John Stuart Mill
The concept that an entity might have a property possessed by none of its parts
British philosopher John Stuart Mill (1806–73) first introduced the theory of emergentism in his book System of Logic (1843). According to Mill, a property of something is “emergent” if it is in some sense more than the sum of its parts. Emergentism is often contrasted with reductionism, the view that the properties of a whole are nothing more than the properties of its simplest components. Emergentism offers a layered view of nature, in which new properties emerge with each new layer of physical complexity.
Mill described two ways in which causes can collaborate to produce an effect. Nonemergent properties can be predicted from our knowledge of each individual cause alone—for example, if one force propels an object to the north and another equal force propels it to the east, the resulting motion will be northeasterly. But with emergent properties, the conjunction of causes produces something that cannot be predicted from knowledge of how each cause operates individually. Mill cites the properties of chemical compounds, which may not be anticipated solely on the basis of their components’ properties.
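Mill’s nonemergent case can be made explicit with a small worked example in modern vector notation (not Mill’s own): two equal forces acting to the north and to the east compose entirely predictably.

```latex
% A worked sketch in modern notation: composition of two equal, perpendicular forces.
\[
  \vec{F}_{\mathrm{res}} = F\,\hat{\mathbf{n}} + F\,\hat{\mathbf{e}},
  \qquad
  \lvert \vec{F}_{\mathrm{res}} \rvert = \sqrt{F^{2} + F^{2}} = \sqrt{2}\,F ,
\]
% directed midway between north and east: the joint effect follows directly
% from knowledge of each cause taken separately.
```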
“The higher quality emerges from the lower level of existence …”
C. Lloyd Morgan, ethologist and psychologist
Some critics have argued that advances in chemistry have cast doubt on this argument. But the contemporary importance of emergentism lies more in its implications for our understanding of the mind than in accounting for chemical reactions. As a philosophy of mind, emergentism occupies a middle ground between materialism, which reduces the mind to brain activity, and dualism, which treats the mind as wholly independent of the brain. GD
1843
Leap of Faith
Søren Kierkegaard
The concept that, to attain faith, we must go beyond the limits of the rational
An angel halts Abraham, Kierkegaard’s “knight of faith,” in The Sacrifice of Isaac (1650), by Laurent de la Hyre.
The Danish existentialist philosopher Søren Kierkegaard (1813–55) first articulated the idea of the leap of faith in his pseudonymous books Fear and Trembling (1843), Philosophical Fragments (1844), and Concluding Unscientific Postscript to Philosophical Fragments (1846).
Kierkegaard argued that self-cultivation is a process in which a person leaps progressively between three different stages. In the first, aesthetic stage, a person is motivated by egoism and hedonism, but the transience of such self-gratification makes it ultimately unfulfilling, and many are prompted to look elsewhere for a sense of significance. The second, ethical stage is characterized by a concern for others that becomes progressively grounded in the universal principles of philosophical moral discourse. Although the social and rational basis of the ethical life is more satisfying than that of the aesthetic, it still produces anxiety and despair because a person can never fully live up to their duties. The final, religious stage involves a leap of faith in which a person suspends ethical and rational considerations to embrace God. Kierkegaard uses the biblical story of Abraham and Isaac to illustrate this leap: Abraham must set aside universal moral prohibitions against killing his son, embrace the will of God, and thus become a “knight of faith.” In the religious stage, a person’s morals are based upon imitatio Christi, or an emulation of the virtues of Christ.
Kierkegaard’s understanding of faith has had an outstanding impact on philosophy and theology, influencing thinkers such as Karl Barth, Karl Jaspers, Paul Tillich, and Dietrich Bonhoeffer. The term “leap of faith” has entered common parlance, but it is typically misused to indicate something done on the basis of blind faith. A person can only make a true leap of faith after passing through the aesthetic and ethical stages; they should not simply take everything on faith without first struggling to experience and understand the world. JM
1845
Manifest Destiny
John L. O’Sullivan
The belief that Americans had a God-given destiny to establish a great nation
In American Progress (c. 1872) by John Gast, Columbia presses westward with pioneers.
The phrase “manifest destiny” came to express a mid-nineteenth-century idea of the people of the United States of America that they had a divinely conferred duty to settle all the land lying between the country’s Atlantic and Pacific coasts. Settling the continent, according to those who believed in the notion, was the U.S. people’s anointed purpose, made manifest because of the exceptional nature of the people concerned, with their unique culture and political ideals of democracy and liberty.
In the mid- to late nineteenth century, the United States went through a period of rapid expansion as the nation extended beyond the borders of the Louisiana Purchase and into the far west. The nation had long been set on expansion, yet the actual phrase “manifest destiny” did not appear until the summer of 1845, when the editor of Democratic Review magazine, John L. O’Sullivan (1813–95), wrote an article discussing the nation’s right to annex Texas. O’Sullivan presented the idea that the United States was no common nation, but rather a nation with a special destiny given by the divine, one that would lead it to span the continent. The notion quickly gained popularity with politicians, public officials, and average citizens alike.
The idea of taming a wild land and making it habitable, and of building a new nation from the wilderness, captivated generations of Americans. The land was already inhabited, but the idea of manifest destiny was not intended to refer to long-established indigenous peoples. As settlers of European origin expanded westward, their idea of manifest destiny could be realized only through the destruction or relocation of aboriginal populations. Today, the idea of U.S. exceptionalism, that the nation is special among all others and has a duty to promote democratic ideals, is still a driving force in many aspects of U.S. politics and culture. MT
1845
Illusionism
Jean Eugène Robert-Houdin
Entertaining theater audiences by confounding their natural expectations with seemingly impossible tricks, from apparent mind reading to superhuman feats of endurance
A nineteenth-century poster advertising a performance by Jean Eugène Robert-Houdin in Paris. His professional career as an illusionist only lasted around eleven years.
Most historians agree that credit for first bringing illusionism, or stage magic, to a theater audience should go to Frenchman Jean Eugène Robert-Houdin (1805–71). There are other potential candidates—Scotsman John Henry Anderson (1814–74) among them—but, even if they preceded him, none could rival Robert-Houdin for originality and variety. His first performance, in 1845, was poorly given and unfavorably received, but he persevered until his illusions even attracted the attention of King Louis-Philippe; the king was given a private performance in 1847, just months before he was deposed by revolution in 1848.
“Magic is the only honest profession. A magician promises to deceive you, and he does.”
Karl Germain, magician and lawyer
One of Robert-Houdin’s illusions made it seem as though his son was balancing horizontally, unsupported, on one elbow. Robert-Houdin explained the feat to his audience as being the effect on the boy’s body of ether, a liquid whose properties were just beginning to be discovered. Robert-Houdin made full use of scientific discoveries still unfamiliar to his audience; for example, a child would be asked to lift a small box, but then the illusionist used electromagnetism to make it impossible for a man to do the same thing. Robert-Houdin’s illusions were so brilliant that in 1856 Napoleon III asked him to deflect potential rebellion in Algeria by outdoing the faux-magic of marabouts (Muslim religious teachers).
Illusionism transformed simple entertainment into a sophisticated demonstration of the apparently supernatural. Today, journeying from street to screen, it continuously renews the idea that the impossible can happen, while recognizing that escapism is a fundamental human hunger. People will happily pay to be baffled, shocked, and amused, and have their expectations challenged. Illusionism is popular because it awakes audiences to the unknown within the known, despite their natural skepticism. LW
1846
Might Makes Right
Adin Ballou
Being strong is the best way to secure what you want
The phrase “might makes right” contains three ideas, two of which oppose each other. The first idea is the use of a pithy sentence to summarize a concept. The second is the pejorative use of the phrase, implying that force, wielded unworthily by an authority over unwilling subjects, is unjustified. The third describes social, political, or even biological relations, and concludes from those descriptions that acts in the interest of dominance and power are justifiable.
The first person to use the phrase in English was the U.S. utopian anarchist Adin Ballou (1803–90), in his work Christian Non-Resistance: In All Its Important Bearings, Illustrated and Defended (1846). Ballou was conditioned by his perspective of history, his extreme pacifism, and his conviction that the road to happiness lay beyond any humanly imposed law. However, his dictum summed up a much older idea. The ancient Greek historian Thucydides (c. 460–c. 404 BCE) had observed that “the strong do what they can, and the weak suffer what they must,” but the sage had prefaced this with the comment, “Right, as the world goes, is only in question between equals in power.”
The neat phraseology of “might makes right” has become something of a meme, and pithy ways of wording ideas succinctly have been an increasing feature of human communication ever since it was coined. The idea itself influenced two separate positions, one diametrically opposed to Ballou’s meaning. Ballou emphasized (as did Thucydides) the injustice of imposed authority. Ballou put the phrase in inverted commas to illustrate that authorities who exercise their programs for governance by force have no legitimacy. The alternative interpretation takes the phrase as a statement of historical description and concludes, fallaciously, that it justifies certain political systems, such as totalitarianism. LW
1847
Hand-washing in Hospitals
Ignaz Semmelweis
Good personal hygiene in medical staff prevents the transmission of disease
In 1847, Hungarian obstetrician Professor Ignaz Semmelweis (1818–65) declared that there was a connection between the unhygienic habits of (male) clinicians and the high mortality rate among women they were attending. The official response to his idea illustrated just how difficult acceptance of a challenging observation can be when it threatens the self-image of an institution.
In mid-nineteenth-century Europe, far more women having babies in hospitals were dying of puerperal sepsis, a bacterial infection, when attended by hospital doctors than when not attended by them. Semmelweis focused on the hygiene habits of the doctors themselves. In contrast to midwives, the doctors refused to consider themselves as potential transmitters of disease, and this was reflected in their negligence regarding hand-washing. Semmelweis published his theory, but was not applauded. Instead, the Viennese medical establishment vilified him utterly, rejected his findings, and denied him reappointment to his post. He died, tragically, in an asylum.
“ … cadaveric material is the cause of … mortality in the First Obstetrical Clinic …”
Ignaz Semmelweis
Semmelweis’s idea has had both an obvious and an indirect effect on our thinking. Eventually, it did change behavior relating to medical hygiene, vastly reducing suffering and mortality in hospitals. But it also showed how a powerful profession could refuse point-blank to accept its own negative role in a situation. This reflected a much broader habitual response among those in power to findings that undermine or challenge their self-image. It is a social phenomenon that continues to demand attention today. LW
1847
Topology
Johann Benedict Listing
The study of the properties preserved in geometrically distorted shapes
Swiss genius Leonhard Euler (1707–83) was the first to consider the problem of the Königsberg Bridges as a geometric puzzle; his memoir on the subject effectively gave rise to topology. The problem—to cross seven bridges in a single journey, without recrossing any—was reduced by Euler to a graph of lines and points. This demonstrated that numbers and shapes were related in ways not previously considered: where distance, for example, was irrelevant, but quality was not. Topology is, therefore, the geometry of a process of reducing, distorting, or bending shapes whose fundamental geometric properties remain intact.
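A small sketch (in Python, purely for illustration) captures Euler’s reduction: label the four land masses A, B, C, and D, list the seven bridges as edges, and apply his criterion that a single walk crossing every edge exactly once requires a connected graph with zero or two vertices of odd degree.

```python
# A minimal sketch of Euler's argument, assuming the standard labeling of the
# four Königsberg land masses as A, B, C, D and the seven bridges as edges.
from collections import Counter

bridges = [("A", "C"), ("A", "C"), ("B", "C"), ("B", "C"),
           ("A", "D"), ("B", "D"), ("C", "D")]

degree = Counter()
for u, v in bridges:
    degree[u] += 1
    degree[v] += 1

odd_vertices = [node for node, d in degree.items() if d % 2 == 1]

# Euler's criterion: a single walk crossing every edge once needs 0 or 2 odd vertices.
print(dict(degree))                 # all four land masses have odd degree
print(len(odd_vertices) in (0, 2))  # False: no such walk exists
```

Only the pattern of connections matters here; the lengths and positions of the bridges are irrelevant, which is the qualitative character of the problem that topology takes as its subject.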
If Euler laid the foundation stone, the first to set the field’s descriptive boundaries was Czech-German Johann Benedict Listing (1802–82). His publication of the Vorstudien zur Topologie (Introductory Studies in Topology, 1847) laid out the first systematic treatment of the subject. German mathematicians August Möbius (1790–1868) and Bernhard Riemann (1826–66) were also important influences on the development of this field, but Listing described the Möbius strip (a continuous one-sided surface) four years before Möbius himself and so deserves credit as topology’s originator.
“If it’s just turning the crank it’s algebra, but if it’s got an idea in it, it’s topology.”
Solomon Lefschetz, mathematician
The initial importance of the discovery was that it showed that numbers and shapes could be related qualitatively, rather than only through the quantitative geometrical relations that ordinary geometry treats. This different kind of geometry weds shape to space through a new set of conditions, and it is in this respect that it is innovative, opening up dynamic possibilities, such as the discovery of new forms of matter. LW
1848
Communism
Karl Marx and Friedrich Engels
An economic system and political philosophy advocating social ownership
A Russian Communist poster from 1920, bearing the message “Knowledge breaks the chains of slavery.”
Communism is an ideology and a theory that sees the trajectory of human history as an inevitable progression from inequality to equality, via revolution. The goal of this progression is a classless, stateless society in which people willingly contribute toward the wellbeing of all. This contribution will involve holding both human and natural resources to be common goods. Distributing the products of work according to need will ensure that status, envy, greed, and even war pass away, and harmony will ensue.
Karl Marx (1818–83) and Friedrich Engels (1820–95) are jointly credited with authorship of The Communist Manifesto (1848), the pamphlet in which the theory was first outlined. Its publication sparked a massive political reaction, although the ideal of an egalitarian society with all resources held in common ownership has existed throughout recorded history. Heavily influenced by the philosophical work of Georg Wilhelm Friedrich Hegel (1770–1831), communism offered both an explanation of inequality and a battle plan.
“Communism may be summed up in one sentence: abolish all private property.”
Karl Marx, philosopher and economist
It is impossible to imagine how differently human history would have unfolded if communism had not radicalized political theory. The “Communist Threat” set ideological battle lines between liberal, industrialized, capitalist nations of the “First World” and authoritarian, semi-industrialized, socialist nations of the “Second World.” Some argue that it even crystallized its fiercest detractors toward fascism. Nevertheless, the ideology maintains a firm grip, not least because there will always be those who sense in the aims of communism an abiding hunger for a fairer world. LW
1848
Pre-Raphaelitism
England
An artistic movement that sought to emulate the style of Italian artists before Raphael
The Beloved (1865–66) by Dante Gabriel Rossetti, one of the founding members of the Pre-Raphaelite Brotherhood. The painting shows the brilliance of color favored by the group.
The term “Pre-Raphaelite” comes from the title of a group of artists, both literary and visual, who founded, in 1848, the Pre-Raphaelite Brotherhood. Led by Dante Gabriel Rossetti (1828–82), William Holman Hunt (1827–1910), and John Everett Millais (1829–96), its inspiration largely sprang from the ideas of English artist and critic John Ruskin (1819–1900). The original movement sought to challenge the artistic and literary conventions of the day, subverting the rules by using value criteria based on feeling rather than intellect. The results shocked and challenged the Victorian public’s perception of what art ought to be. The second movement counterpointed the first, since one of its aims was to revive an interest in traditional arts and crafts, which were primarily functionalist in nature.
“All great art is the work of the whole living creature, body and soul, and chiefly of the soul.”
John Ruskin, artist and critic
Ruskin’s writing questioned the entire pantheon of social norms and this spurred the Pre-Raphaelites to look beyond conventional Victorian society for their inspiration. They reacted against the “Mannerist” artistic genre and instead focused more on naturalistic accuracy, sometimes of fantastical subjects. Their work harked back to an earlier era, represented by the Italian Renaissance artist Raphael (1483–1520) and his predecessors. They considered the artistic process to include the entire activity of the person, and that an artist ought not to be limited to a single medium. Writers were encouraged to paint, for example, and liberal modes of self-expression, including sexual expression, were encouraged.
The themes of Pre-Raphaelitism were enormously more complex than this brief summary can convey and yet the group was, in a sense, the “hippie revolution” of its day. From the intuitionism and mysticism of W. B. Yeats (1865–1939) to the determined realism of Anthony Burgess (1917–93), art, literature, and the debates on norms and conventions have all been fired into life by the Pre-Raphaelites. LW
1849
Civil Disobedience
Henry David Thoreau
The active refusal by an individual to follow certain rules, laws, or policies of government
Henry David Thoreau, photographed in 1847. The activist spent a night in jail for civil disobedience in July 1846: he refused to pay his tax to a government that endorsed slavery.
Any person who believes a government is acting unjustly has a duty to oppose those actions by protesting and resisting them, according to U.S. writer Henry David Thoreau (1817–62). These acts of civil disobedience are designed to both educate others about the injustice and to bring about change. Civil disobedience stems from an individual’s belief that the actions of the state are immoral or unjustified.
“Disobedience is the true foundation of liberty. The obedient must be slaves.”
Henry David Thoreau
In 1849 Thoreau wrote “Resistance to Civil Government,” an essay later published as On the Duty of Civil Disobedience. In that work he coined the term “civil disobedience” to describe his refusal to pay taxes to support what he perceived as unjust U.S. government policy. In his study of history, Thoreau observed that many people who change society for the better are seen initially as enemies, and that those who follow their consciences are often at odds with society at the time, yet are later regarded as the most influential of reformers and heroes. Governments, even democracies, he argued, cannot be relied upon to promote what is good or what is right because they are primarily corrupt organizations. Thoreau believed that individuals have a duty to follow the dictates of their consciences and oppose unjust laws by actively refusing to obey, instead of merely voting, speaking against, or hoping for change.
The tenets of mass nonviolence that Thoreau outlined were instrumental in the successful use of civil disobedience in the twentieth century, in both Mahatma Gandhi’s (1869–1948) Indian independence movement and Martin Luther King, Jr.’s (1929–68) civil rights campaign. Prior to that, the Egyptian Revolution of 1919 was widely seen as the first successful use of civil disobedience by a population to achieve a political goal, and the tactics employed by these movements are commonly used today. MT
1850
Modern Olympic Games
William Penny Brookes
An international sports competition held every four years in different venues
Competitors prepare for the men’s 100 meters race at the 1896 Olympic Games in Athens.
English physician and botanist William Penny Brookes (1809–95) believed that extending opportunities for physical and intellectual betterment could do much to address the ill-effects of poverty. To this end, he developed a competition based on the ancient Olympics, and in Much Wenlock, England, in October 1850, he held the first “Wenlock Olympian Games.” This laid the foundation for a revival of the ancient Greek Olympiad, a four-yearly event in which international athletes from an ever-increasing range of summer and winter sports compete.
While Brookes provided the initial inspiration for the Games, it is French aristocrat Baron Pierre de Coubertin (1863–1937), himself invited by Brookes to attend a Wenlock Olympian Games in 1889, whom the International Olympic Committee recognizes as its founder. Coubertin later sought to play down the impact of the Wenlock Games on his own efforts to revive the Olympiad, although the photographs of his stay with Brookes attest to the influence of that visit. The first international games took place in Athens in 1896, with 245 participants from fourteen countries in nine sports and forty-three events.
While the philosophy of harmony that guided the original Greek competition still resonates in its modern equivalent, its main influence has been on how we think about who competes in sport. Brookes’s original idea of inclusivity regardless of class or status has been extended to include women, athletes of color, and those with disabilities. Unfortunately, the modern Olympiad’s reputation has been somewhat tarnished by accusations of bribery in the selection of a new host city for the event every four years, and by the prevalence of illegal performance-enhancing drug use, particularly since the introduction of a ban in 1968. While this has made some observers cynical, most embrace the ideals that the Olympics represents. LW
1850
AD&D Insurance
Franklin Health Assurance Company
A form of life insurance that covers death or dismemberment from an accident
The body of a construction worker fatally injured in a fall from scaffolding is brought home to his family.
Referred to in the industry as AD&D, accidental death and dismemberment insurance is a form of coverage that pays a predetermined sum of money to a named beneficiary in the event that the insured person loses his or her life, or a body part in an accident. In the contemporary context, AD&D insurance is most often secured by laborers or persons otherwise employed in a high-risk occupation. The insurance is a safeguard against lost wages or the insured’s inability to work due to an accident. AD&D was originally conceived, however, as protection against the perils of travel.
The first AD&D policy was issued in 1850 by the Franklin Health Assurance Company of Boston, Massachusetts. Franklin Health Assurance offered accident coverage to steamboat and railroad passengers against any accident that may occur on their voyage. Numerous firms followed suit and eventually broadened the scope of their coverage to insure against lost wages due to accidents of all sorts.
Over time the plethora of firms offering policies of this sort consolidated through attrition and, in the 1920s and 1930s, alternative means of administering health coverage emerged. In 1929 the precursor to the Blue Cross (a federation of health insurance organizations and companies) was developed at Baylor University in Dallas, Texas, as a means of providing insurance that would pay for hospital expenses for those enrolled in the plan. Since then the Blue Cross (and Blue Shield) system has grown into one of the primary vehicles for health care coverage in the United States. The success of the private accident and health insurance industry that grew up around AD&D insurance is arguably the reason why the United States government has long resisted intervention in health care, despite the establishment of public health care systems in other developed nations. DM
1850
De Morgan’s Laws
Augustus de Morgan
The principle in logic that “and” and “or” are dual
De Morgan’s laws are a pair of rules in propositional logic and Boolean algebra that are named for the British mathematician and logician Augustus De Morgan (1806–71), who first formalized them in 1850. Although the rules are fairly intuitive and may even seem trivial—indeed, logicians have used them since at least the fourth century BCE—De Morgan was the first to incorporate them into a system of formal logic.
De Morgan’s laws are a pair of rules of inference that allow us to turn conjunctions (“and” statements) into disjunctions (“or” statements) and vice versa, via negation. Essentially, they can be expressed as follows: (1) The negation of a conjunction is the disjunction of the negations and (2) The negation of a disjunction is the conjunction of the negations.
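A short sketch (in Python, which is of course not De Morgan’s notation) verifies both laws by exhaustively checking every combination of truth values:

```python
# A minimal sketch: checking De Morgan's laws over all truth-value combinations.
from itertools import product

for p, q in product([True, False], repeat=2):
    # Law 1: the negation of a conjunction is the disjunction of the negations.
    assert (not (p and q)) == ((not p) or (not q))
    # Law 2: the negation of a disjunction is the conjunction of the negations.
    assert (not (p or q)) == ((not p) and (not q))

print("De Morgan's laws hold for every assignment of truth values.")
```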
“ … logical truth depends upon the structure of the sentence …”
Augustus de Morgan, Formal Logic (1847)
Consider, for example, the following conjunction: “Pam is perky and Quincy is quick.” The negation of this conjunction is “It is not the case that Pam is perky and Quincy is quick.” De Morgan’s first law tells us that this statement is logically equivalent to “Either it’s not the case that Pam is perky or it’s not the case that Quincy is quick.” Similarly, consider the following disjunction: “Either Pam is perky or Quincy is quick.” The negation of this disjunction is “It is not the case that either Pam is perky or Quincy is quick,” which the second of De Morgan’s laws tells us is logically equivalent to “It’s not the case that Pam is perky and it’s not the case that Quincy is quick.” Today, the laws are used to simplify electric circuits and logical expressions that are used in computer programs. GD
c. 1850
Program Music
Franz Liszt
A musical work whose form and content derive from an extra-musical source
“Program music” is a phrase coined by composer Franz Liszt (1811–86) in the early 1850s, along with the term “symphonic poem.” Liszt believed that developments in harmony and orchestration during the nineteenth century necessitated a break with the formal patterns for organizing a composition that had been advanced a century earlier, as the development of content and form must go together—as he put it, “New wine demands new bottles.” He also believed that music could benefit from a relationship with the other arts: it could follow a “program” inspired by a story, play, or poem, rather than remain an abstract construction in sound.
Liszt was not the first to bring extra-musical aspects into music or to make radical changes to compositional practice; there are many earlier works that imitated or alluded to events or characters within a traditional form. Examples include Jean-Philippe Rameau’s The Hen (1728), Ludwig van Beethoven’s Symphony No. 6 (1808), and Hector Berlioz’s Symphonie Fantastique (1830). Liszt’s works were not directly representational, meaning that they did not mimic precise events through sound effects; rather, they were suggestive. As he described it, “Music embodies feeling without forcing it to contend and combine with thought, as it is forced in most arts, and especially in the art of words.” Liszt opened the door to individualized formal developments, rather than predefined models, a trend that continues to the present day.
During Liszt’s lifetime, controversies had already begun about whether music has a content outside of the sound itself—a perhaps unsolvable aesthetic discussion that remains relevant today. These debates are especially apparent in representational film music, in which the score is expected to amplify or mirror the narrative on screen, often through a preconceived and universal vocabulary of musical gestures. PB
c. 1850
Rational Dress
Elizabeth Smith Miller
A style of women’s dress that focused on comfort and practicality, characterized by the wearing of knickerbockers or bloomers in place of a skirt
An illustration from c. 1850 displays the new style of rational dress for women. The bloomer suit maintained Victorian decency while allowing women to move more freely.
In the 1850s women routinely wore up to 14 pounds (6.4 kg) of undergarments, and from the very beginnings of the emancipation movement in the United States and Britain women’s fashion was prominent alongside the struggle for better wages, property rights, education, and marriage reform. It was Elizabeth Smith Miller (1822–1911), the daughter of abolitionists Gerrit Smith and Ann Fitzhugh, who initiated the rebellion against restrictive clothing by wearing Turkish pantaloons—loose trousers gathered in around the ankles and worn under a knee-length skirt.
“I became so thoroughly disgusted with the long skirt, that the dissatisfaction—the growth of years—suddenly ripened into the decision that this shackle should no longer be endured. The resolution was at once put into practice.”
Elizabeth Smith Miller
When Amelia Bloomer (1818–94), Miller’s fellow suffragette and editor of the temperance magazine The Lily, saw the new dress code, she approved of it immediately. The style became popularized through her magazine, and the pants were later dubbed “bloomers.” Bloomers offered a practical alternative to tightly laced corsets and layers of skirts, but only a small percentage of the female population ever wore them. In 1856 came the more fashionable crinoline, which also liberated women from the abundance of petticoats, but it still proved a hazard to women as many were burned to death when their impossibly wide skirts brushed over open fireplaces. Outcries over the harm done by corsets and heavy skirts continued in the 1870s, but it was not until 1881 that the Rational Dress Society established itself in London, promising to oppose “any fashion in dress that either deforms the figure, impedes the movements of the body, or in any way tends to injure the health.”
A key development in the widespread acceptance of rational dress was the rise in popularity of the bicycle at the end of the nineteenth century. The design of early bicycles meant that they were all but impossible to ride in long skirts, and pants therefore became respectable for women as a form of “sporting” dress. BS
1850
You Are What You Eat
Ludwig Feuerbach
All human processes are the product of the food that we ingest
In 1850, Jacob Moleschott (1822–93), a Dutch dietician and physiologist, published The Theory of Food: For the People, a popular book on nutrition. Describing the physiological bases of hunger and thirst, the processes of digestion and assimilation, and the nutritional properties of various foods, it was widely praised for its eloquence and clarity. However, an even greater sensation than the book itself was a review of it written by Moleschott’s friend and former teacher, the German philosopher Ludwig Feuerbach (1804–72).
A materialist who regarded physical processes as the only reality, Feuerbach found support for his views in Moleschott’s depiction of human beings as bodily organisms produced and sustained through eating and drinking. Even our thoughts, beliefs, and emotions depend on the work of our digestive system. “No thought without phosphorus,” Moleschott wrote, explaining how the brain could not be formed without the ingestion of phosphorus-bearing fat. Feuerbach summarized Moleschott’s account of the origin of our mental life by saying, “Food becomes blood, blood becomes heart and brain, the stuff of thoughts and attitudes … Man ist, was man ißt.” This German pun loses some of its bite in translation, but it has nonetheless become a familiar saying—“you are what you eat.”
“Only sustenance is substance. Sustenance is the identity of spirit and nature.”
Ludwig Feuerbach
Feuerbach intended his quip to mean that there is literally nothing more to human beings than the matter we ingest and the transformations it undergoes. Since then, however, it has come to have the looser meaning that what we eat is a major factor in our state of mind and health, becoming a slogan for healthy eating. GD
c. 1850
Devolution
Benedict Morel
The notion that a species can evolve backward into a more “primitive” form
Devolution is the belief that an organism or a species can, over time, lose its biological complexity and “de-evolve” (or evolve backward) into a more primitive form of life. The theory was developed by the French psychiatrist Benedict Morel (1809–73), whose theory of “degeneration” in the 1850s represented his search for a biological explanation for the onset of mental illness. Morel wrote of a progressive degeneration, from neurosis to mental alienation and on to imbecility, then finally sterility. He also claimed that behavior such as the excessive consumption of alcohol or drug taking would result in the degeneration of the offspring of those involved, offspring that could, over generations if the pattern were repeated, begin to revert to a more “primitive” biological form.
Morel believed that species must inevitably evolve in a Darwinian sense, a thought that presupposes some kind of future hierarchical structure that organisms are destined to achieve. However, he also believed that species could regress. Biologists have, after all, found countless examples of decreasing complexity in the fossil record, particularly in the jaw bones of fish, mammals, and reptiles, although strictly speaking from a biologist’s perspective devolution cannot exist. All change, whether regressive or not, is evolutionary—it is always a progression forward, regardless of its resultant level of complexity. Ninety-nine percent of all species that have ever existed on Earth are now extinct, a fact that makes a mockery of the concept of teleology in nature—that species evolve because they tend to adapt to their changing environments and therefore survive.
Perhaps Morel was right. With so many species no longer with us, clearly the idea of ever-increasing complexity, and the ability to adapt and survive that should accompany it, is far from a sure thing. JF
1854
Chance Favors the Prepared Mind
Louis Pasteur
We are more likely to make the most of an opportunity if we have thought about or studied the issue beforehand
French chemist and microbiologist Louis Pasteur performs an experiment in his laboratory. Pasteur’s scientific accomplishments earned him France’s highest decoration, the Legion of Honor.
A very large number of scientific advances have been made as a result of chance: the antibiotic effect of penicillin, the low toxicity of warfarin, the anaesthetic properties of nitrous oxide, even the hallucinogenic effects of LSD; the list is endless. But many, if not all of these, would not have been recognized had not the scientists working on them already, through hard work and careful observation, been prepared to take advantage of them when the opportunity arose. Furthermore, in order to change the world and the way that people think, a scientist has to do more than simply achieve a breakthrough, whether by effort or good luck. He or she has to prove that the breakthrough is not a fluke, and to persuade fellow scientists to follow it up and ordinary people to believe in it.
“Preparation is essential. It brings the necessary ideas to mind at the same time, so that they can be combined in a novel way. When the combination happens, it leads to a sequence of insights and to the ‘Aha!’ experience of discovery.”
Mark Stefik and Barbara Stefik, Breakthrough (2004)
The English author Sir Horace Walpole (1717–97) coined the term “serendipity” in 1754 for those happy accidents that bring about major advances in individual prosperity, not to mention the medical and physical sciences. But it was French chemist and microbiologist Louis Pasteur (1822–95) who recognized that such accidents benefit only those who are ready for the opportunity that comes their way. In a lecture given at Lille University in 1854, Pasteur stated, “In the field of observation, chance favors only the prepared mind.” Preferring observation to theory, Pasteur made many important discoveries, including the process known as fermentation, by which bacteria (beneficial and harmful) grow in organic substances. Until his thorough study of yeasts revealed this, it was believed that bacteria arrived by spontaneous generation. It was only a step from this discovery to the long battle against harmful microbes that cause diseases such as typhoid, cholera, and tuberculosis. The special heating process that destroys microbes in milk was first carried out in 1862, and bears his name: pasteurization. JF
1854
The Simple Life
Henry David Thoreau
Simplifying one’s lifestyle as a means of improving spiritual wellbeing
A detail from Ferdinand Brütt’s painting of a summer’s day (1904) evokes Henry David Thoreau’s idea of “the simple life,” which for him involved living in natural surroundings.
Prescriptions for simple living date back to ancient times and are found in such traditions as Daoism, primitive Christianity, and ancient Greek and Roman philosophy. The eighteenth-century Swiss philosopher Jean-Jacques Rousseau (1712–78) also praised the life of rustic simplicity at a time when enthusiasm for sophisticated modern life was running high. The thinker most closely associated with the virtues of simple living, however, is the U.S. transcendentalist philosopher Henry David Thoreau (1817–62), whose two-year experiment with the simple life on the shores of Walden Pond, near Concord, Massachusetts, was memorialized in his classic book Walden, published in 1854. As a transcendentalist, Thoreau affirmed the inherent goodness of people and nature, but believed that human beings were at their best when they were most “self-reliant” and independent of the corrupting influence of society and its institutions.
“Be content with what you have; rejoice in the way things are. When you realize there is nothing lacking, the whole world belongs to you.”
Laozi, Daoist philosopher
For Thoreau, the simple life meant a retreat from the consumerism and materialism of the modern world in order to regain a lost closeness to nature and to foster self-reflection. For many people today, the practice of voluntary simplicity promises a reduction of worry and stress, a more eco-friendly lifestyle, and a better balance between work and leisure.
The first step in simple living is usually to decrease one’s possessions and consumption. Proponents of simple living recommend being content with the satisfaction of one’s needs, rather than devoting oneself to the pursuit of an ever-increasing catalog of wants. Many also choose to grow their own food. For those who want to take “baby steps” in the direction of a simpler life, “downsizing” entails a gradual shift of emphasis away from economic success and toward activities designed to bring greater personal fulfillment. The simple life, in this view, is the good life. GD
1854
Gradgrind’s Education
Charles Dickens
A fictional educational philosophy critiquing utilitarian attitudes toward education
An illustration from Charles Dickens’s novel shows the utilitarian educationist Thomas Gradgrind after finding his children, Louisa and Tom, who have sneaked off to Sleary’s circus.
In 1854, English novelist Charles Dickens (1812–70) published his tenth novel, Hard Times, a searing indictment of the moral, political, and social ideologies that he held responsible for perpetuating the hard times experienced by the poor in mid-nineteenth-century England. Set in the fictional industrial city of Coketown amid smokestacks and factories, Hard Times tells the story of Thomas Gradgrind, his family and associates, and others with whom his life intersects. Gradgrind, a retired merchant who has become a school headmaster, espouses a philosophy of rational, calculating self-interest and exclusive focus on cold facts and numbers. Events in the novel eventually prompt Gradgrind to renounce his philosophy, which is portrayed as an unintended source of misery for those around him.
“A man of realities. A man of facts and calculations. A man who proceeds upon the principle that two and two are four, and nothing over, and who is not to be talked into allowing for anything over.”
Charles Dickens, Hard Times (1854)
The failures of Gradgrind’s educational philosophy are meant to offer an object lesson in the folly of a “utilitarian” outlook that dismisses the worth of anything that cannot be economically quantified or measured, such as the experience of wonder and the exercise of the imagination. “With a rule and a pair of scales, and the multiplication table always in his pocket,” Gradgrind is said to be “ready to weigh and measure any parcel of human nature, and tell you exactly what it comes to. It is a mere question of figures, a case of simple arithmetic.” His single-minded focus on facts and figures at the expense of more imaginative pursuits has left him unable to appreciate that the most unquantifiable aspects of human existence, the lively “sentiments and affections” on which he heaps such disdain, are the ones that make life worth living.
Gradgrind’s name has entered the vernacular as a synonym for a soulless devotion to facts and figures and, in particular, for a pedagogical theory that favors the learning of facts to the neglect of cultivation of the arts and humanities. GD
1854
The Immaculate Conception
Pope Pius IX
A Catholic doctrine teaching that Mary, the mother of Christ, was born free of sin
A painting depicting the Immaculate Conception by José Antolinez (c. 1650–75).
According to Roman Catholicism, all humans are born with the stain of an original sin inherited from our first ancestors, whose reckless disobedience of God—first in the Garden of Eden when Adam and Eve ate fruit from the Tree of Knowledge of Good and Evil—inflicted lasting damage on human nature itself. This original sin is the reason we are weak in our ability to resist moral temptation and prone to vices, such as lust and greed. To overcome the effects of original sin and become eligible for eternal life with Christ, we need God’s sanctifying grace, which is normally conferred at baptism. The sole exception to this universally sinful human condition is the mother of Jesus, the Virgin Mary.
“Mary … a virgin whom grace has made inviolate, free of every stain of sin.”
St. Ambrose
Unique among human beings, Mary was conceived “immaculately,” that is, without the stain of original sin. The sanctifying grace that others receive at baptism was granted to her from the moment her soul was created. This dogma, known as the Immaculate Conception, has been an official tenet of the Church since it was first formally proclaimed by Pope Pius IX in 1854, although it had been a popular belief since the fifth century. Nonetheless, it had been a bone of contention among Catholic theologians, many insisting that sanctifying grace could be conferred only after conception. A decisive consideration in favor of the dogma, according to Pope Pius IX, however, was what he called “a profound sensus fidelium,” the consensus among the faithful as reflected in their devotion to Mary. The Immaculate Conception should not be confused with other Catholic dogmas: the virginal conception of Jesus and Mary’s perpetual (or lifelong) virginity. GD
1854
Entropy
Rudolf Clausius
A measure of the unavailability of a system’s thermal energy for conversion into work
Entropy is a concept in classical thermodynamics, which is the study of how heat is converted into usable energy. Entropy is the measure of how much of a system’s thermal energy (heat) is unavailable for doing useful work or how evenly that energy is distributed in the system. The more even the distribution, the less energy is available. German physicist Rudolf Clausius (1822–88) produced the first mathematical formulation of entropy in 1854. However, it was not until 1865 that he coined the term “entropy,” which comes from the Greek entropía, meaning “transformation content.”
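In modern notation (a sketch, not Clausius’s original 1854 formulation), the entropy change of a system that absorbs heat reversibly at absolute temperature T, together with the second-law statement for an isolated system, can be written:

```latex
% A sketch in modern notation of the Clausius definition of entropy change
% and the second-law inequality for an isolated system.
\[
  \Delta S = \int \frac{\delta Q_{\mathrm{rev}}}{T},
  \qquad
  \Delta S_{\mathrm{isolated}} \ge 0 .
\]
```

The lower the temperature at which a given quantity of heat is delivered, the larger the entropy change it produces.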
For an illustration of entropy, consider what happens when an ice cube is placed in a glass of water. Initially, the energy in the glass is very unevenly distributed, with the warmer water molecules possessing more energy than the colder ice. Consequently, the system is in a state of low entropy, with much of its energy available for the work of melting the ice. But once the system has achieved equilibrium, with the ice melted and thermal energy distributed randomly throughout the glass, the system is in a state of high entropy and its energy is unavailable for work.
“Entropy shakes its angry fist at you for being clever enough to organize the world.”
Brandon Sanderson, science fiction writer
This example also illustrates the second law of thermodynamics, which states that the entropy of any isolated system, one that exchanges neither matter nor energy with the outside world, always tends to increase. Without outside energy inputs, every system tends toward greater equilibrium, randomness, and disorder. Since the universe as a whole is an isolated system, it is steadily approaching a state of maximum entropy, at which point all its available energy will be spent. GD
1855
Biblical Psychological Criticism
Franz Delitzsch
A field of biblical criticism analyzing the psychological dimensions of Judeo-Christian scripture
Biblical psychological criticism is a field of study that applies psychological and psychoanalytical insight to the origins, authorship, content, translation, and interpretation of the Bible. Psychological criticism also aims to examine the history of the Bible’s personal and cultural effects. It attempts to psychoanalyze the authors of scripture and the characters mentioned in the biblical texts, and to clarify the writers’ intentions. Psychological criticism also inquires into the reader’s relationship with the scriptures: how that relationship shapes an individual’s personal world and becomes relevant in his or her life.
“When we approach scripture we come with the same baggage. It is not possible to come in any other way.”
Wayne G. Rollins, Jung and the Bible (1983)
Psychological inquiry into the scriptures began before the birth of modern psychology. Biblical psychological criticism started with A System of Biblical Psychology (1855) by German theologian Franz Delitzsch (1813–90). He examined works by writers such as Tertullian (c. 160–225), St. Augustine of Hippo (354–430), and St. Thomas Aquinas (1225–74) in a survey to assess what he defined as a “psychological literature” that stretched from the writers of the early Christian church to contemporary theologians. Delitzsch suggested that biblical psychology was a science and “one of the oldest sciences of the church.”
Delitzsch’s A System of Biblical Psychology was not received favorably at the time. However, as psychology evolved to mean more than the study of the soul, or psyche, and developed into a discipline that involved the study of the mind, so biblical psychological criticism also evolved with it, influenced by the work and writings of pioneering psychiatrists Sigmund Freud and Carl Jung on the conscious and unconscious activities of the soul. Biblical psychological criticism gained currency in the 1960s via the works of U.S. academics such as Frederick Charles Grant and Wayne G. Rollins, and German theologian Gerd Theissen. CK
1856
Limited Liability Corporation
British Government
A company that has a separate legal identity from the people who own and run it
The Houses of Parliament in London, England, where the Limited Liability Act was passed in 1855. This historic legislation met the need for larger amounts of capital investment in industry.
The abbreviation “Ltd.” (Limited), frequently seen after the name of a business, identifies it as a limited liability corporation, a particular type of company that has a legal identity separate from both its managers and shareholders. Limited liability sets a limit on how far investors or partners can be held liable for a company’s debts and unfulfilled obligations should the business fail. That limit is typically set at the value of their original investment, so that shareholders risk only what they have already put into the company. Consequently, if the company is successfully sued, only the company itself, not its owners or investors, will be liable to pay any resulting judgment. This feature distinguishes limited liability corporations from businesses with a sole proprietor and general partnerships, in which the owners are liable for all the company’s debts.
“Corporation: An ingenious device for obtaining profit without individual responsibility.”
Ambrose Bierce, journalist and satirist
In England, limited liability has been granted to some associations, such as monasteries and some trade guilds, since the fifteenth century. However, joint stock companies—businesses owned by shareholders—were not allowed to incorporate as limited liability corporations in the United Kingdom until Parliament passed the Limited Liability Act of 1855. The Joint Stock Companies Act of 1856 established a simple procedure whereby any group of seven people or more could register a limited liability corporation. Since then, this form of corporation has spread throughout the world and been adopted by the vast majority of businesses operating in market economies.
It is difficult to exaggerate the importance of this innovation, for it effectively made the company equivalent to a person under the law. By limiting shareholder liability, this new status helped companies to raise enough capital to form the large enterprises required by the emerging industrial economies and thereby enabled the growth of modern capitalism. GD
1858
The Great Controversy
Ellen G. White
The religious belief that all humanity is caught in a battle between good and evil
Throughout recorded history, religious teachers have depicted humanity as caught in a fierce battle between the forces of good and evil. The Seventh-day Adventist Church offers one perspective on this battle, derived from a reading of the Bible. In 1858, Ellen G. White, one of the founders of the Seventh-day Adventist Church, wrote The Great Controversy, recounting the history of “the great controversy between Christ and Satan,” from its beginnings before the creation of the world right up to its end-time when the world will be destroyed.
In this account, sin first came into existence when Satan, originally an angel created to enjoy fellowship with God in Heaven, became filled with pride and grew discontent with living under God’s laws. Aspiring to autonomy and equality with God, he led a rebellion that resulted in his expulsion from Heaven. The field of battle then shifted to Earth, where Satan infected Adam and Eve with his same pride, persuaded them to disobey God, and thereby wrested from them the dominion over the Earth that God had granted them. Establishing himself as the new “prince of this world,” he continued his rebellion from his new earthly base of operation. His eventual downfall was later assured, however, by the selfless sacrifice of Christ on the cross.
“ … that old serpent, called the Devil, and Satan … was cast out into the earth.”
The Bible, Revelation 12:9
For Seventh-day Adventists, this controversy involves God’s character, sovereignty, and law over the universe. Satan objects to God’s law, denouncing it as arbitrary and burdensome. His rebellion impugns God’s character, construing it as defective, and challenges God’s right to govern the universe. The battle is ongoing and touches every life. GD
1858
Gray’s Anatomy
Henry Gray
An inexpensive and accessible anatomy textbook for medical students
Engraving of the muscles of the neck from Gray’s Anatomy (1897).
The compilation of a single, accessible, and inexpensive work of reference about human anatomy that could be used by everyone seems an obvious idea, yet it was only comparatively recently that it came to fruition. The work of a single-minded and creative doctor, Gray’s Anatomy made its first appearance in 1858 and has remained the standard text ever since.
Henry Gray (1827–61) trained as a medical student at St. George’s Hospital in London. In 1853 he was appointed lecturer in anatomy at St. George’s Medical School. Two years later, Gray, who was only twenty-eight at the time, recognized that no single anatomy textbook existed that he could recommend to his students. He asked his colleague Henry Vandyke Carter (1831–97), a skilled medical illustrator, to help him compile such a book. They worked together for eighteen months before the finished text was published in 1858. Known then as Anatomy: Descriptive and Surgical, the book was 750 pages long and contained 363 figures. It was an immediate hit. Gray, however, did not live to see its long success; in 1861 he died of smallpox while treating his nephew.
“As a practical work … Gray’s Anatomy has always been recognized and appreciated.”
Preface to the twentieth edition of Gray’s Anatomy (1918)
Gray’s Anatomy was organized in a systemic way, with separate sections for the entire skeletal system, nervous system, and so on. Over the years, the book increased in length, so that by the thirty-eighth edition in 1995, it had 2,092 large-format pages. The contents were then reorganized for the thirty-ninth edition into regional anatomies according to where in the body the structures are located. The fortieth edition, in both print and online editions, appeared in 2008. SA
1859
Natural Selection
Charles Darwin
The gradual process by which populations evolve to have certain biological traits
Charles Darwin (pictured in 1902), who laid the foundations of evolutionary theory.
In his studies of the natural world, English scientist Charles Darwin (1809–82) observed that in any population of living organisms there is inevitable variation between individuals. When those variations allow an organism a better chance of surviving and reproducing, that organism, and its inherent traits, will be naturally selected to continue on into future generations due to its suitability for survival. Over time, the accumulated small traits naturally selected by the environment will lead to the evolution of new species.
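The logic of that process can be made concrete with a toy simulation; the sketch below is purely illustrative (the trait, the optimum, and all the numbers are arbitrary choices, not anything drawn from Darwin). Individuals vary in a heritable trait, those closer to an environmental optimum leave more offspring, and the population average shifts over the generations.

```python
import random

random.seed(0)

TARGET = 5.0        # trait value best suited to the (fixed) environment
POP_SIZE = 200
GENERATIONS = 200

def fitness(trait):
    # Closer to the environmental optimum means a better chance of reproducing.
    return 1.0 / (1.0 + abs(trait - TARGET))

def mean(pop):
    return sum(pop) / len(pop)

# Initial population: trait values scattered far from the optimum.
population = [random.uniform(0.0, 1.0) for _ in range(POP_SIZE)]
print("starting mean trait:", round(mean(population), 2))

for _ in range(GENERATIONS):
    # Parents are chosen in proportion to fitness; offspring inherit the
    # parental trait with a small random variation (heritable variation).
    weights = [fitness(t) for t in population]
    parents = random.choices(population, weights=weights, k=POP_SIZE)
    population = [p + random.gauss(0.0, 0.1) for p in parents]

print("final mean trait:", round(mean(population), 2))
# The mean moves toward the optimum generation by generation, even though
# no individual organism "aims" at it.
```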
In 1859 Darwin published On the Origin of Species, which caused a tectonic shift in the view of life on Earth. Darwin was not the first to observe that life changed over time, or to propose that environmental conditions could influence an organism’s ability to adapt. In fact, Alfred Russel Wallace (1823–1913), a contemporary of Darwin, independently arrived at many of the same conclusions. However, Darwin’s extensive research and evidence in support of his theory ensured his claim as the discoverer of evolution by natural selection.
“ … the origin of species—that mystery of mysteries, as it has been called …”
Charles Darwin, On the Origin of Species (1859)
Since its introduction, evolution by natural selection has become the bedrock theory of the life sciences. In the simple calculation of organisms surviving based on natural traits, it removed the notion of a prearranged natural world that had been widely assumed throughout Western history. The idea that all organisms—even humanity itself—evolved from more primitive forms had profound implications for both scientists and society at large. Even today there are those who vehemently oppose the idea of natural selection, largely because of the conclusions it entails. MT
1859
On Liberty
John Stuart Mill
Individual freedoms should only be limited to prevent harm to others
Philosopher and statesman John Stuart Mill (1806–73) published On Liberty in 1859 as part of his theory of utilitarianism. While Mill’s later Utilitarianism (1861–63) states that the right thing to do is what promotes the greatest good for the greatest number of people, On Liberty delineates the appropriate limitations of a government in enforcing this principle.
Mill argues that politics is necessarily a struggle between liberty (maximizing personal freedom) and authority (maximizing safety). Too much emphasis upon the former produces anarchy, while too much of the latter results in tyranny. The balance between these two extremes is struck by following the harm principle: liberty to pursue one’s own happiness is a fundamental good for all human beings and can only be infringed upon if the exercise of one’s liberty harms other persons. A state is not justified in making paternalistic laws that restrict citizens’ freedoms for their own good. For example, while the state can ban drink driving because it harms others, it should not outlaw alcohol simply because the drug might harm its user. If the state is to err, it should do so on the side of liberty rather than authority. Mill argues that three types of liberty should always be protected by a just state: (1) freedom of conscience, including beliefs and speech; (2) freedom of tastes and pursuits; and (3) the freedom to unite for any noninjurious purpose.
“Over one’s mind and over one’s body the individual is sovereign.”
John Stuart Mill
On Liberty is one of the most important treatises in the history of political philosophy. The harm principle is a cornerstone of liberal democracy and continues to be used by both lawmakers and political theorists. JM
1859
Last Universal Ancestor
Charles Darwin
The organism from which all organisms now living on Earth descend
The title page of the expanded fifth edition of Charles Darwin’s On the Origin of Species, published in 1869—exactly a decade after the first edition.
Contemporary evolutionary biology attests that all organisms on Earth share a common ancestor, known as the “last universal ancestor,” estimated to have lived 3.5 to 3.8 billion years ago. Charles Darwin (1809–82) first introduced the theory that all life descended from a single ancestor in his book On the Origin of Species (1859), basing his argument for common descent on evidence drawn from the presence of homologous structures in different species and embryonic development.
“There is a grandeur in this view of life, with its several powers, having been originally breathed into a few forms or into one.”
Charles Darwin, On the Origin of Species (1859)
In order to understand the idea of homologous structures, compare the arm of a human being, the forelimb (flipper) of a seal, and the wing of a bat. Although each has a very different function, they closely resemble one another in their relative position and assembly. These similarities suggest that these three structures all evolved from a single prototype belonging to a shared ancestor of the three organisms. Other homologies found across a wide range of species led Darwin to the conclusion that all or most animals shared a common descent. He also noticed that the embryos of disparate species shared a similar structure, making it impossible to distinguish them in the earliest stages of their development, thus indicating a common ancestry.
Although Darwin’s observations were limited by the science of his time and seemed at best only to support a common descent for all animals and, through a different independent lineage, for all plants, he believed that we could extrapolate to the much stronger conclusion of a universal ancestor for all living organisms, based on similarities “in their chemical composition, their germinal vesicles, their cellular structure, and their laws of growth and reproduction.” More recently, comparative analyses of the DNA of different species have offered new insights and confirmation of Darwin’s hypothesis of a universal common ancestor, which scientists now believe to have been a small, single-cell organism. GD
1859
Riemann Hypothesis
Bernhard Riemann
A theory explaining the apparently random pattern of prime numbers
A portrait of the German mathematician Bernhard Riemann, created in the 1860s. Riemann made significant contributions to analysis, number theory, and differential geometry.
In 1859, Bernhard Riemann (1826–66), an obscure German mathematician, introduced a bold hypothesis in his paper “On the Number of Prime Numbers Less Than a Given Magnitude.” As the title indicates, his hypothesis offers a formula for calculating how many prime numbers appear in any block of numbers. To this day, the Riemann Hypothesis has been neither proven nor refuted, despite a century and a half of painstaking research and a $1 million prize awaiting the first person to solve it.
“However timid and listless he may have appeared to casual observers, Riemann’s mathematics has the fearless sweep and energy of one of Napoleon’s campaigns.”
John Derbyshire, writer
A full explanation of the Riemann Hypothesis requires an account of the mathematical entities that it employs, called “the zeros of the Riemann zeta function,” of which the Riemann Hypothesis says, “All the non-trivial zeros of the Riemann zeta function have real part one half.” An adequate explanation of these mysterious “zeros” would involve a book-length exposition, but it is possible at least to get a general grasp of the mathematical problem that these entities are recruited to address.
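For readers who want the symbols, the following is the standard modern statement rather than Riemann’s own 1859 notation. The zeta function is defined by a series (equivalently, a product over the primes) where that series converges, and is then extended to the rest of the complex plane; the hypothesis concerns where its nontrivial zeros lie.

```latex
\zeta(s) \;=\; \sum_{n=1}^{\infty} \frac{1}{n^{s}}
        \;=\; \prod_{p\ \text{prime}} \frac{1}{1 - p^{-s}}
        \qquad (\operatorname{Re} s > 1),
```

extended by analytic continuation to the whole complex plane apart from a single pole at s = 1. The Riemann Hypothesis asserts that every nontrivial zero ρ of this function satisfies Re(ρ) = 1/2.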
The Riemann Hypothesis concerns the distribution of prime numbers. The set of primes is infinite—however high you count, there are always more up ahead. However, as one proceeds along the number line, the occurrence of primes becomes more and more infrequent. More prime numbers lie between 1 and 100 than between, say, 9,001 and 9,100. But there is an irregularity to how the primes thin out that complicates any attempt to calculate just how many primes to expect in any block of numbers. There already exists an established formula for the average density of primes, but it was Riemann who first proposed an exact formula for calculating the deviation from average density, too. Still unproven, the Riemann Hypothesis remains, in the words of author John Derbyshire, “the great white whale of mathematical research.” GD
1859
Sexual Selection
Charles Darwin
Animal species evolve certain traits in order to attract mates
The male peacock’s showy display of plumage is an example of traits arising from sexual selection.
If species evolve based on their ability to adapt to their environments, how can some animals have developed extreme—and seemingly nonadaptive—traits, such as vibrant feather displays or large and ungainly sets of antlers? The answer, according to Charles Darwin (1809–82), lies in sexual selection. Regardless of how suited to an environment an organism is, that organism will not be able to reproduce unless it can find a mate. If those potential mates only choose partners with certain qualities, then only those individuals that display such qualities will be able to reproduce, while the others will die without successfully producing offspring.
In his seminal book On the Origin of Species (1859), Darwin laid the groundwork for the understanding of evolution by natural selection, also explaining that an organism must necessarily be able to attract a mate in order to reproduce and pass on whatever beneficial characteristics it may have. He explained that individuals, typically males, of the same species commonly display characteristics or traits that only exist to allow the male to attract and mate with a female. This explains why animals develop traits that do not necessarily make them better adapted to their environment, and also why males of the same species compete with one another for the opportunity to mate with a female.
“Sexual selection is, therefore, less rigorous than natural selection.”
Charles Darwin, On the Origin of Species (1859)
With sexual selection Darwin showed that evolution is not simply a process of adaptation to the environment. Sexual and social pressures also have a great impact on how a species evolves, even to the point of making an individual organism less likely to survive. MT
c. 1860
Abduction
Charles Sanders Peirce
The ability to derive conclusions from information and observable facts
The U.S. philosopher and logician Charles Sanders Peirce (1839–1914) first began writing on what he called his abduction theory in the 1860s, and continued to expand and refine it over the next five decades. Abduction is a form of inference that uses information describing something to draw a hypothesis that offers a plausible explanation for what has been observed or has occurred. In the words of its originator: “Abduction is the process of forming an explanatory hypothesis. It is the only logical operation which introduces any new idea.”
Peirce believed that there were three different types of reasoning. There was deduction or necessary reasoning, deriving a conclusion B from A where B is a formal consequence of A. Induction was having good reason to believe a conclusion on the basis of a premise. Abduction, however, was a kind of guesswork, “very little hampered by the rules of logic”; an initial phase of inquiry in a situation where premises do not necessarily guarantee correct conclusions.
“Bad reasoning as well as good reasoning is possible …”
Charles Sanders Peirce
A classic example of abductive reasoning is this: you are driving home from work and notice a blue car is behind you. After making several turns the blue car continues to be behind you, and is still there after several more changes of direction. After doing a U-turn because you had forgotten something in your office that you need at home, you notice the same blue car is still behind you. Having already ruled out the possibility of coincidence, and in the absence of any more plausible explanations, you reluctantly conceive the hypothesis that the blue car is following you deliberately. JMa
1860
Industrial Design
Christopher Dresser
The combination of principles of design with industrialized manufacturing processes to create an entirely new aesthetic
A watering can designed by Christopher Dresser in 1876, manufactured by Richard Perry, Son & Company. Much of Dresser’s most influential work was produced around this time.
Although industrialization came with the transition to new manufacturing processes from about 1760, the idea of design had existed for centuries prior to this. Some would argue that the first ever industrial designer was Leonardo da Vinci, as evidenced in his “Book of Patterns of Machine Elements” in the fifteenth century. The word “design” itself, however, is not recorded in English until 1588, when it was defined as “a plan or scheme devised by a person for something that is to be realized …” It was the civil servant and inventor Henry Cole (1808–82) who made the case for functional design in 1849 with his publication of the short-lived Journal of Design, and also for a Great Exhibition in London to showcase the world’s manifold new industrial creations, made possible by innovations such as cast iron, industrial carpentry, and automated looms.
“One of Dresser’s great strengths as a designer was his ability to understand the properties of materials and the processes of production …”
Robert Edwards, art critic
The first modern, commercially successful attempt at combining art and technology to improve the aesthetics of everyday objects, however, belongs to the great nineteenth-century Scottish designer Christopher Dresser (1834–1904). After studying as a botanist and authoring three well-received books on the subject, he turned his attention to design in 1860 after failing to gain the chair of botany at the University of London. His success in applying design principles to consumer products, such as wallpapers, ceramics, stained glass, and metalware, was so immediate that he was quoted in 1871 as saying, “as an ornamentalist I have much the largest practice in the kingdom.”
Industrial design has had a great impact on everyday life; in its quest to design products for utility, comfort, and beauty, professional industrial design has given the world iconic objects ranging from the Barcelona chair and the KitchenAid mixer to the iPod. Design is now art, and art, design—form and function limited only by the boundaries of our imagination. BS
1860
Renaissance Man / Woman
Jacob Burckhardt
Individuals with wide-ranging talents and interests as shown by the Renaissance era
In modern societies, anyone with many diverse interests may be loosely described as a Renaissance man or woman. While the European Renaissance is well known as an era marked by the appearance of such men as Leonardo da Vinci, Copernicus, and Galileo, the period itself only became known as the Renaissance after French historian Jules Michelet described it as such in the mid-nineteenth century. The characterization of the Renaissance man was first explicitly described a few years later in a work published in 1860 by Swiss historian Jacob Burckhardt (1818–97), The Civilization of the Renaissance in Italy. Burckhardt described fifteenth-century Italy as particularly notable for the rise of the “many-sided” man, who was knowledgeable or adept at languages, natural history, and other scientific subjects while also involved in politics and the arts.
This idealized sense of the Renaissance man as a multitalented individual is retained in its modern meaning. Burckhardt notes that the Italian Renaissance was imbued with the ideals of humanism, which placed a high value on an individual developing their capacities as fully as possible. In keeping with these ideals, Burckhardt observes that upper-class women were often given the same education as men. Nevertheless, he concedes that, despite their education, few women were afforded the same opportunities to practice their talents and skills in the many fields open to men.
Our modern understanding of Renaissance men and women is greatly indebted to Burckhardt’s insight into the culture of individualism that encouraged creativity over conformity. As that individualism spread throughout Europe and has since become widely and firmly established in modern societies, many of the greatest achievements in the arts and sciences are owed to those who are rightfully called Renaissance men and women. TJ
c. 1860
Darwinism
Thomas Huxley
A movement in support of Charles Darwin’s theory of evolution
In the 1860s, naturalist Charles Darwin (1809–82) was busy developing his theory of evolution and searching out corroborative evidence for it. He had better things to do than defend his ideas from his opponents, and it was not his concern to pull his ideas together to form an overarching super-theory. Both tasks were undertaken by English biologist Thomas Huxley (1825–95), who dubbed himself “Darwin’s bulldog” for his advocacy of Darwin’s ideas. Indeed, in the lectures he gave in London in the 1860s, Huxley may well have extended the scope of Darwin’s ideas further than the biologist himself intended. In Huxley’s hands, Darwin’s work became a movement with a life of its own: Darwinism.
The Darwinist view that the theory of evolution had destroyed the idea of a divine creator encouraged the public perception that agnosticism, and later atheism, was the logical conclusion to be drawn from Darwin’s work. Darwin himself had delayed publication of On the Origin of Species (1859) in fear of such controversy, and the dispute over the theory of evolution’s implications became more entrenched and bitter as a result of Huxley’s championing of Darwin’s work.
“As for your doctrines, I am prepared to go to the Stake if requisite …”
Thomas Huxley, in a letter to Charles Darwin (1859)
Atheist scientists, such as the British biologist Richard Dawkins (b. 1941), have become well known in recent years for their intolerance of religion of all kinds, and their firm view that Darwinist ideas have made religious belief untenable. However, by no means all scientists agree. In the face of this debate, the U.S. National Academy of Sciences recommended in 1981 that religion and science should not be presented in the same context, to avoid misunderstanding. JF
1863
The Missing Link
Charles Lyell
A critical gap in the fossil record showing the evolutionary link between ape and man
Archaeopteryx lithographica could be considered the “missing link” between therapod dinosaurs and birds.
The British geologist, and close friend of Charles Darwin, Charles Lyell (1797–1875) popularized the term “missing link” in reference to hypothetical fossil remains that exhibit traits of both an ancestor and a descendant, and thus provide evidence of a clear evolutionary line of heredity. Today, the term—which is more of a popular one than a scientific one—is often used by critics claiming that these missing fossils suggest that evolution is an unsound theory.
In 1863, Lyell published Geological Evidences of the Antiquity of Man, the book that would introduce the concept of the “missing link.” As a geologist, Lyell had been investigating what he knew to be layers of sedimentary rock deposited at different times, and he had noticed that there was a distinct difference in the appearance of fossils found in adjacent sedimentary layers. He brought in the term “missing link” to explain that sudden unexplained transition. Coincidentally, 1863 was the year in which someone first used the phrase to criticize the theory of evolution. In that year, a Scottish doctor, John Crawford, said that in order for evolution to be true, there must be some fossil evidence to show how “man came from a monkey.”
In the evolutionary process, organisms evolve across generations and through long periods of time. The time intervals involved give rise to the expectation that animals existing as intermediary stages in between two related species will show shared traits of both. The idea of missing links in the fossil record has captivated people since its introduction, especially those who are troubled by evolution’s assertions. But as a scientific notion, the idea of the missing link is largely useless. All organisms that point the way toward the evolution of new species can be considered transitional, and thus each is a missing link in the vast chain of evolving species. Verification of the theory of evolution does not hinge on the identification of missing-link fossils. MT
1863
Bahá’i Faith
Bahá’u’lláh
A religious ideology stressing unity in God, science, and the human striving for peace
The Bahá’i Shrine and Gardens in Haifa, Israel, are the international headquarters for the Bahá’i Faith.
The Bahá’i Faith was founded in the nineteenth century. Its basis is unity in worship of a single god, and its vision is for harmony between all people, and, eventually, unity between all paths. The religion anticipates an eventual convergence of the paths of science and religion, sexual equality (although this does not imply acceptance of homosexuality), and increasing respect for the environment. Bahá’i stresses the unity of all creation, in addition to the oneness of God, of the human family, and of religion itself.
The Persian founder of the Bahá’í Faith, Bahá’u’lláh (1817–92), was born Mírzá Husayn-`Alí Núrí, the son of a vizier who, along with Bahá’u’lláh’s mother, died before he came of age. When the orphaned boy was old enough to enter government, he refused to do so, instead seeking more wisdom. The decision was to mean almost unimaginable suffering, although perhaps it was this that enabled Bahá’u’lláh, in the notorious dungeon of Síyáh-Chál in Tehran, to reach a state of consciousness that led him to found the faith. In 1863, he announced, in the Garden of Ridván, Baghdad, that he was the one about whom the Báb (the Gate, a manifestation of God) had prophesied. The foundation of the Bahá’i Faith began with Bahá’u’lláh spending twelve days in that garden, at the start of a long exile.
Enduring further long imprisonment, torture, and the loss of his beloved son, Bahá’u’lláh developed the clarity and determination to develop his ideas and write them out as teachings. The Bahá’i emphasis on equality, tolerance, integration rather than segregation, and freedom from prejudice opened the way to a humane and inclusive understanding of religion. Seeking common ground between reason and religion, economy and spirituality, Bahá’i challenged the inequalities and injustices of the social and political systems of Bahá’u’lláh’s day. The religion still challenges exclusive religious orthodoxies today. LW
1863
Gettysburg Address
Abraham Lincoln
A reaffirmation of a founding principle of the United States: that all humans are born equal
A painting of Abraham Lincoln giving his Gettysburg Address by J. L. G. Ferris (c. 1900), a U.S. artist best known for his series of seventy-eight scenes from U.S. history.
The Battle of Gettysburg took place during July 1–3, 1863, and resulted in the retreat of General Robert E. Lee’s Army of Northern Virginia from its incursion into Union territory. On November 19, months after the battle, President Abraham Lincoln (1809–65) attended a ceremony dedicating a national cemetery at the Gettysburg battlefield site. The Gettysburg Address is the speech he gave to the assembled crowd at the ceremony, and it is widely celebrated as one of the most important and influential political speeches in the history of the United States.
“ … we here highly resolve that these dead shall not have died in vain—that this nation, under God, shall have a new birth of freedom—and that government of the people, by the people, for the people, shall not perish from the earth.”
Abraham Lincoln
When President Lincoln delivered his address, he was second on the bill to Edward Everett (1794–1865), a famed orator who gave a two-hour-long speech to the assembled crowd. Lincoln’s speech was incomparably shorter, lasting no longer than two to three minutes, and encompassing about 250 words. Yet in that speech, the president reflected the ideals expressed in the Declaration of Independence (1776), the founding document of the American nation. His simple, eloquent expression of the notion that the nation was founded for equality, and for the good of all people, not once referred to slavery, the Confederacy, the Union, or any of the political issues of the day.
It is unclear what the reaction to Lincoln’s speech was at the time, and less than two years after giving it the president was dead and the civil war over. However, the impact of the Gettysburg Address lived on as a model of political rhetoric, oratorical simplicity, and political ideology. The speech turned the nation’s political attention toward the unifying ideal that all people are born equal—an ideal that is almost universally assumed today. The Gettysburg Address is credited as being largely responsible for the introduction of that ideal into U.S. political discourse, and it remains an important political reference point today. MT
1863
In Praise of Cosmetics
Charles Baudelaire
No woman is so beautiful that her beauty would not be enhanced by cosmetics
Charles Baudelaire, here photographed in 1862, believed that “everything beautiful and noble is the result of reason and calculation,” with nature being only brutal and instinctive.
For much of the history of humanity, the wearing of cosmetics by women has been viewed, in the West at least, as something associated with harlots and stage performers (with those two professions once being considered almost equally disreputable). As an early nineteenth-century song once asserted, it is nature itself that “embellishes beauty,” so what need would a virtuous woman have for makeup?
“I am perfectly happy for those whose owlish gravity prevents them from seeking beauty in its most minute manifestations to laugh at these reflections of mine …”
Charles Baudelaire, “In Praise of Cosmetics” (1863)
The French poet and essayist Charles Baudelaire (1821–67) was raised in this culture of “naturalized beauty” and never really questioned it in his early years. But then, in the 1860s, the man who coined the word “modernity” began to question what Romantic artists and writers referred to as the “supremacy of nature.” In his book, The Painter of Modern Life (1863), he turned his attention to the nature of beauty in the chapter titled “In Praise of Cosmetics.”
Baudelaire had always felt especially drawn to the opposite sex, and was conscious of how society’s notion of beauty was changing in an increasingly industrialized world. His essay on beauty was a little too whimsical to be taken absolutely seriously, but it was nonetheless a triumphant defense of the notion that makeup can make the beautiful even more beautiful. “External finery,” Baudelaire wrote, is “one of the signs of the primitive nobility of the human soul.” Every fashion is “charming,” and every woman is bound by “a kind of duty” to appear magical, to astonish, and to charm her fellows. Accordingly, nature could now be imaginatively surpassed by applying black eyeliner, which “gives the eye a more decisive appearance,” and rouge, which “sets fire to the cheekbone.”
Baudelaire’s emphasis on the beauty of artifice over nature marked a significant departure from the Romanticism of the first half of the century, reflecting the rise of decadence and Aestheticism, to many of whose practitioners he was a hero. BS
1864
Pasteurization
Louis Pasteur
Heating food kills harmful microorganisms that could otherwise cause illness
Louis Pasteur’s Portrait (1885), by Finnish artist Albert Edelfelt, depicts the microbiologist in his laboratory.
During his investigations into the microscopic world, chemist and microbiologist Louis Pasteur (1822–95) developed a process by which some foods could be heated—at a particular temperature and for a specified length of time—to destroy any potentially dangerous microorganisms. The process, eponymously named pasteurization, produces food free of many of the most common pathogens. Today, the pasteurization process is widely used around the world in the preparation of numerous beverages and foods, including milk, beer, wine, cheese, seafood, and yogurt.
Historical records show that heating wine to prevent it from spoiling was known in China by at least the twelfth century, and in Japan by the sixteenth. But it was only in 1856 that Pasteur discovered that, while microbiotic yeast turned juice into alcohol through the fermentation process, other microbes caused the alcohol to spoil. In 1864, Pasteur developed his process of heating liquids to specific temperatures for specific times in order to prevent spoilage.
“Gentlemen, it is the microbes who will have the last word.”
Louis Pasteur
Without pasteurization, many foods would have extremely short shelf-lives and would pose a significantly higher health risk to consumers. The introduction of pasteurization greatly reduced the number of cases of tuberculosis, diphtheria, scarlet fever, and other bacteriological diseases. It also led to more efficient food storage and transportation, and facilitated the mass production of safe, reliable foodstuffs. Modern pasteurization techniques, such as exposing food to ionizing radiation, are improvements on Pasteur’s methods, but his are still widely used today. MT
1864
Anarcho-syndicalism
France
A movement blending traditional anarchist sentiments with Marxist pragmatism
The emergence of anarcho-syndicalism is not easily attributed to a particular individual or place, but French politician and socialist Pierre-Joseph Proudhon (1809–65) was probably the first to put on paper its fundamental theories. Anarcho-syndicalism appeared with the anarchist movements that emerged from the International Workingmen’s Association of 1864 and the Paris Commune. It lies between traditional, social anarchists, who believe that the only way to end capitalism is through organizing the working class, and individual anarchists, who instinctively oppose all forms of organization and authority. Anarcho-syndicalists believe the state to be profoundly anti-proletarian, and see the working class as a kind of “elementary school for Socialism,” a necessary component in the establishing of an anarchist society; in this aspect the movement is aligned politically with Karl Marx.
“To be governed is to be watched, inspected, spied upon, directed …”
Pierre-Joseph Proudhon, politician and socialist
By the end of the nineteenth century, anarcho-syndicalist unions were established throughout Europe and in the United States and Argentina, and they continued to evolve. In France, adherents of what would become known as “revolutionary-syndicalism” wanted nothing less than the complete destruction of capitalism and all economic monopolies, although they were ambivalent about what political structures might come along to fill the void. In Spain, anarcho-syndicalists formed the National Confederation of Labor (CNT) in 1910, which aligned itself with the poor and the landless and by 1936 had in excess of a million members. Thus, anarcho-syndicalism found itself immersed in Spain’s “legitimate” politics. BS
1864
Survival of the Fittest
Herbert Spencer
Those who survive do so because they are adapted to their specific circumstances
The aye-aye (Daubentonia madagascariensis) is perfectly adapted for extricating food insects from wood.
In coining this memorable phrase, English philosopher and biologist Herbert Spencer (1820–1903) produced one of the most tenacious, and most misunderstood, buzzwords of the modern era. The word “fittest” is often popularly understood to mean “strongest” or “best” in an athletic sense, rather than following the fundamental observation of naturalist Charles Darwin (1809–82) that adaptation to changing circumstance is the key to evolution. Both Darwin and Spencer used the word to mean “best fitted” or “best equipped” to survive in local circumstances.
The idea of a struggle for existence was first mooted by British scholar Thomas Malthus (1776–1834) fifty years before Spencer and Darwin published their work. Spencer’s first essay, “The Development Hypothesis,” published in 1852, seven years before Darwin’s On the Origin of Species (1859), discusses the scientific principle that complex organisms are originally simple. In Principles of Biology (1864), having read Darwin’s work, Spencer suggested the phrase “survival of the fittest” as an alternative to “natural selection”; Darwin introduced it into the fifth edition of On the Origin of Species in 1869. Spencer also compared biological evolution with what he saw as a similar evolutionary trajectory in society, an idea that came to be known as Social Darwinism.
The phrase “survival of the fittest” was widely accepted, but in addition to popularizing Darwin’s evolutionary ideas, it fostered the philosophy that human weakness, both in individuals and societies, was a failing to be despised (although Spencer himself specifically recognized the importance of compassion in human relations). It also fed nineteenth-century racism by encouraging Europeans to see themselves as having evolved into a superior race. The phrase is still popularly used today to describe anything that can be related to Darwinian theories of natural selection. JF
1865
Mendelian Inheritance
Gregor Mendel
How hereditary characteristics are passed from parent organisms to their offspring
A diagram demonstrating Mendelian inheritance of color in Andalusian fowls.
The Mendelian theory of inheritance describes the way in which a characteristic can be passed from a parent to his or her children by means of genes. Each parent carries two versions (alleles) of the gene for a given inherited characteristic (such as blue eyes or small feet). When sexual reproduction occurs, each parent passes just one of these, selected at random, to the offspring. Alleles are said to be either “dominant” or “recessive.” The dominant allele will generally reproduce its characteristic in the offspring. Dominant alleles in a particular family may turn up in the offspring of almost every generation, while a recessive allele may be carried unused through many generations before it plays its role again, creating a “throwback” to a remote ancestor. This pattern of gene transmission explains how recognizable likenesses occur between family members, and between successive generations in a single family.
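The ratio these rules produce can be checked with a short simulation; the sketch below is illustrative only (the allele labels and the number of offspring are arbitrary), and it shows the familiar three-to-one pattern expected when two parents each carry one dominant and one recessive allele.

```python
import random

random.seed(0)

# Monohybrid cross: both parents carry one dominant allele "A" and one
# recessive allele "a". Each parent passes one allele, chosen at random,
# to every offspring.
PARENT = ("A", "a")

def offspring_genotype():
    return random.choice(PARENT) + random.choice(PARENT)

def phenotype(genotype):
    # The dominant allele expresses its characteristic whenever at least
    # one copy is present.
    return "dominant" if "A" in genotype else "recessive"

counts = {"dominant": 0, "recessive": 0}
for _ in range(10_000):
    counts[phenotype(offspring_genotype())] += 1

print(counts)  # expect roughly a 3:1 dominant-to-recessive ratio
```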
In the mid-nineteenth century, scientists were still uncertain of the mechanism by which hereditary characteristics were passed from one generation to the other. In general, people believed that all parental characteristics were melded together, so that two individual characteristics would become mixed or diluted, as observed in the skin color of people of mixed race. Silesian scientist Gregor Mendel (1822–84) discovered that, instead, the alleles of specific characteristics sort themselves independently of each other to produce the gametes (germ cells) of offspring. His work, presented in two separate lectures in 1865 to the Natural Science Society in Brünn, Moravia (now Brno, Czech Republic), was not, at first, recognized for the breakthrough it was. Instead, Mendel’s discoveries had to be “rediscovered” in the early twentieth century. Scientists have since worked out numerous modifications to Mendel’s theory of inheritance, but Mendel was the one who set the science of genetics in motion. JF
1865
Light
James Clerk Maxwell
The discovery that illumination is caused by electromagnetic radiation
The aurora borealis, light caused by collisions of energy-charged particles, is reflected by a Norwegian fiord.
Light is electromagnetic radiation; visible light is electromagnetic radiation that is visible to the human eye. As visual creatures, humans have always known light. Systematic investigation of light extends back to ancient Greece but accelerated with the emergence of modern science. In the 1860s, the Scottish physicist James Clerk Maxwell (1831–79) identified light with electromagnetic radiation. Maxwell’s identification was based on his theory of electromagnetism (published in 1865), which united the forces of electricity and magnetism. On its basis, he predicted the existence of electromagnetic radiation and calculated that its speed ought to be about the same as the measured speed of light. He concluded, “Light consists in the transverse undulations of the same medium [the ether] which is the cause of electric and magnetic phenomena.” By the late 1870s, Maxwell’s identification of light and electromagnetic radiation was widely accepted. The identification was important technologically, especially in enabling the use of nonvisible electromagnetic radiation, from radio waves, microwaves, and infrared light to ultraviolet light, x-rays, and gamma rays.
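A worked version of the key calculation, in modern SI notation rather than the units available to Maxwell: the speed of the predicted waves depends only on the electric and magnetic constants, and it comes out at the measured speed of light.

```latex
c = \frac{1}{\sqrt{\mu_0 \varepsilon_0}}
  = \frac{1}{\sqrt{(4\pi \times 10^{-7}\ \mathrm{H/m})\,(8.854 \times 10^{-12}\ \mathrm{F/m})}}
  \approx 3.00 \times 10^{8}\ \mathrm{m/s}.
```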
“ … the most fruitful [work] that physics has experienced since the time of Newton.”
Albert Einstein on Maxwell’s work (1931)
Later work extended Maxwell’s theory, but the idea of the ether was undermined by the Michelson–Morley experiment of 1887 and abandoned after Albert Einstein’s special theory of relativity appeared in 1905. The idea that light consisted of waves was complicated by the advent of quantum theory, which resolved the debate over whether light consisted of waves (as Maxwell thought) or of particles by showing that neither model alone was satisfactory. The result, emerging in the 1940s, was quantum electrodynamics, the current theory of light. GB
1865
Radio Waves
James Clerk Maxwell
A prediction of the lowest frequencies on the electromagnetic spectrum
Radio waves have the lowest frequency and longest wavelength on the electromagnetic spectrum. In the natural world, radio waves are emitted by stars, but they can be created artificially using radio transmitters. Radio waves vary in length from 1 millimeter to 19 miles (30 km). They were first predicted mathematically by Scottish theoretical physicist James Clerk Maxwell (1831–79) as part of his electromagnetic theory in 1865, and were demonstrated in the laboratory twenty years later by German physicist Heinrich Hertz (1857–94), who gave his name to the unit of frequency, the hertz (Hz). Radio waves fall between roughly 10 kilohertz (kHz) and 300 gigahertz (GHz) in frequency, corresponding to that range of wavelengths.
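The wavelength and frequency figures above are linked by a single relation, a routine check rather than anything peculiar to radio: wavelength multiplied by frequency equals the speed of light.

```latex
c = \lambda f \quad\Longrightarrow\quad
\lambda = 30\ \mathrm{km}:\ f = \frac{3 \times 10^{8}\ \mathrm{m/s}}{3 \times 10^{4}\ \mathrm{m}} = 10\ \mathrm{kHz};
\qquad
\lambda = 1\ \mathrm{mm}:\ f = \frac{3 \times 10^{8}\ \mathrm{m/s}}{10^{-3}\ \mathrm{m}} = 300\ \mathrm{GHz}.
```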
There are four main types of radio wave: long wave, used for long-distance broadcasting and navigation signals; medium wave, used for most ordinary radio broadcasts; VHF (very high frequency), used for FM radio, civilian aircraft, and taxi frequencies; and UHF (ultra high frequency), used for television transmissions, cell phones, and police and some military aircraft radios. Microwave radiation may also be included in the list, coming above UHF in frequency (and below it in wavelength).
“In the new era, thought itself will be transmitted by radio.”
Guglielmo Marconi, New York Times (October 11, 1931)
Radio waves were and remain the foundation of broadcasting and other types of communication technology, including television, radar, and cell phones, as well as radio itself. The higher frequencies can be used for re-transmission via satellite to users out of direct contact because of the Earth’s curvature. In addition to being widely employed for communications, radio waves are used for medical procedures, such as noninvasive surgical treatments and magnetic resonance imaging (MRI). JF
1865
Electromagnetic Theory
James Clerk Maxwell
A new understanding of electrical and magnetic fields as two aspects of a single continuum based on wavelength and frequency
Metal filings form a magnetic field pattern around either pole of a magnet. Magnetic fields force moving electrically charged particles in a circular or helical path.
Scottish theoretical physicist James Clerk Maxwell (1831–79) understood electromagnetism in terms of work produced earlier in the nineteenth century, by André-Marie Ampère (1775–1836) and Michael Faraday (1791–1867), on electric currents in relation to the magnetic field. In a key paper, “A Dynamical Theory of the Electromagnetic Field,” published in 1865 in Philosophical Transactions of the Royal Society, Maxwell showed mathematically that electricity and magnetism are two aspects of the same phenomenon, before publishing a landmark work based on his own and others’ discoveries, A Treatise on Electricity and Magnetism, in 1873. Here, he proposed that electromagnetism consisted of a spectrum, and predicted different frequencies of radiation, from high-frequency ultraviolet light down to long radio waves.
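Maxwell’s 1865 paper stated the theory as a set of some twenty equations; the compact four-equation vector form shown below is a later reformulation, largely due to Oliver Heaviside, but it carries the same physical content.

```latex
\nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0}, \qquad
\nabla \cdot \mathbf{B} = 0, \qquad
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad
\nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}.
```

In empty space, where the charge density ρ and the current J vanish, the last two equations combine into a wave equation whose propagation speed is 1/√(μ0ε0), the step that allowed Maxwell to identify light with electromagnetic waves.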
“The unification of electricity, magnetism, and light represented the crowning achievement of classical physics in the nineteenth century.”
Mauro Dardo, Nobel Laureates and Twentieth Century Physics (2004)
Experimental proof that Maxwell’s theory was more than just a clever piece of mathematics was provided a few years after Maxwell’s death by German physicist Heinrich Hertz (1857–94). However, Maxwell’s idea that a substance known as ether was involved in electromagnetic processes later fell out of favor as a result of Albert Einstein’s special theory of relativity, which ruled out ether as a necessary prerequisite for the behavior of electromagnetic radiation.
For the next fifty years, Maxwell’s electromagnetic theory dominated theoretical physics, along with Isaac Newton’s mechanics, and many practical applications were found for both electric currents and magnets. The theory was seminal to the work of Einstein, who recognized how it had changed scientific and popular perceptions of reality itself. He commented: “Before the electromagnetic theory was put forward, people conceived of physical reality as material points. After Maxwell, they conceived reality as represented by continuous fields, not mechanically explicable.” JF
1865
Speed Limit
Great Britain
A law limiting the maximum speed at which a vehicle can be driven
In 1865, the British government passed the Locomotive Act (also known as the Red Flag Act), which set a speed limit of 4 mph (6 km/h) for steam-powered vehicles in the countryside and 2 mph (3 km/h) in towns. One reason for its introduction was concern about damage that the large, heavy locomotives were causing to the roads; it was argued that by reducing their speed, their impact on the road would be lessened. The other main argument for a speed limit was public safety, which is why the Act also stipulated that all self-propelled vehicles must be preceded by a red flag held by a pedestrian, as a warning to horse-drawn vehicles and pedestrians. Unsurprisingly, the Act received a great deal of support from those with horse and railroad interests, and its implementation did much to hinder the early development of road transport in Great Britain.
The first comparable U.S. law was introduced in Connecticut in 1901, with a limit of 12 mph (19 km/h). With the arrival of the internal combustion engine, the red flag was dispensed with and speed limits were raised, first to 14 mph (23 km/h), then to 20 mph (32 km/h). In 1934, after a short unregulated period, a standard limit of 30 mph (48 km/h) was introduced into U.K. urban areas.
This urban limit is now the norm for most countries. In Europe the Italian autostrada has a maximum limit of 81 mph (130 km/h), the same as the advisory speed limit on the German autobahn. Limits in Japan are low in comparison, with the urban limit at 25 mph (40 km/h), rising to 50 mph (80 km/h) on expressways and 62 mph (100 km/h) on some highways. U.S. speed limits are set by state authorities and vary from one territory to another.
The introduction of speed limits proved to be the first step in a whole raft of legislation governing road safety. Acceptance of restrictions marked a new realization that consideration for other road users should be a matter of law, not just moral obligation. JF
1866
Recapitulation Theory
Ernst Haeckel
A theory that embryos literally mimic evolutionary stages as they develop
The phrase “ontogeny recapitulates phylogeny,” coined in 1866 by the German physician-turned-naturalist Ernst Haeckel (1834–1919), was popularized by biologists who believed that ontogeny, the development of a human being from embryo to adulthood, involved a recapitulation (run-through) of phylogeny, the entire history of humanity’s evolutionary development. To paraphrase, ontogeny is the growth, development, and changing shape of the embryo, and phylogeny the evolutionary development of the embryo’s species.
Haeckel believed that every organism’s ontogeny was a reflection of its phylogeny: that a chick embryo, for example, when in its early stages of development, resembled that of a fish, complete with gills, fishlike tail, and all of a fish’s associated characteristics. During further development it changed again, each time reflecting the evolutionary process that resulted in chicks: from a fish it altered to become a reptile, and eventually it became a chick. Humans, too, according to Haeckel, begin in the womb as fish before changing into a reptile, then in our specific case a mammal, before finally beginning to take on human form. Every stage in the development of an individual person resembles a fully-formed adult that appeared at some point in its own evolutionary history.
“In the case of the … libido, the phylogenetic origin is … obvious”
Sigmund Freud, psychoanalyst
Studies in experimental morphology conclusively show that no such correspondence between species has ever existed. Haeckel’s efforts to have his theories accepted were not helped when it was revealed that he had altered drawings to emphasize similarities in embryos that could not possibly have existed. JMa
1867
Maxwell’s Demon
James Clerk Maxwell
An experiment designed to contradict the second law of thermodynamics
In 1867, Scottish theoretical physicist James Clerk Maxwell (1831–79) created a thought experiment to show that it would be philosophically possible to contradict the second law of thermodynamics, which says that closed systems (those isolated from outside stimuli) always tend toward equilibrium (maximum entropy), so that when a hot liquid and a cold liquid are mixed, for example, the result will be a lukewarm liquid in which all molecules are at the same temperature.
Imagine a box, divided in two by a wall, that contains a mixture of “hot” (fast-moving) and “cold” (slow-moving) gas molecules. An imaginary creature (nicknamed Maxwell’s demon) crouches by a door in the wall, opening and closing it to ensure that cold molecules end up on one side of the wall and hot molecules on the other. A heat engine could be run by allowing the hot molecules to run through into the cold side. Similarly, work could be extracted by getting all the molecules into one side of the box, then operating a turbine in the doorway, through which the gas would flow from the “full” to the “empty” side of the box.
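The demon’s sorting can be mimicked in a few lines of code. The sketch below is purely illustrative: the molecule count, the speed threshold, and the speed distribution are arbitrary choices, and it models only the bookkeeping of the thought experiment, not the physics of a real gas.

```python
import random

random.seed(1)

N = 1000
THRESHOLD = 1.0  # speeds above this count as "hot," below as "cold"

# A well-mixed box: each molecule gets a random speed and starts on a
# random side of the partition.
molecules = [{"speed": random.expovariate(1.0),
              "side": random.choice(["left", "right"])} for _ in range(N)]

def demon_step(m):
    # The demon opens the door only for molecules that improve the sorting:
    # fast molecules are admitted to the left, slow molecules to the right.
    if m["speed"] > THRESHOLD and m["side"] == "right":
        m["side"] = "left"
    elif m["speed"] <= THRESHOLD and m["side"] == "left":
        m["side"] = "right"

def mean_speed(side):
    speeds = [m["speed"] for m in molecules if m["side"] == side]
    return sum(speeds) / len(speeds) if speeds else 0.0

print("before:", round(mean_speed("left"), 2), round(mean_speed("right"), 2))
for m in molecules:
    demon_step(m)
print("after: ", round(mean_speed("left"), 2), round(mean_speed("right"), 2))
# The left side ends up hotter than the right without any work being done
# on the gas itself, which is the apparent paradox Maxwell was probing.
```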
“The Second Law of Thermodynamics has only a statistical certainty.”
Letter from Maxwell to Peter Guthrie Tait, physicist
Recently, Maxwell’s demon has acquired applications in information theory. In a Japanese experiment, a particle was moved to a higher energy state by observing the particle’s path and encouraging it in that direction rather than back toward a low-energy state. The possibilities of this are exciting, even if we are still a long way from generating significant amounts of energy using this method. The experiment suggests that philosophical reflection can be as important as scientific empiricism. JF
1868
Traffic Light
J. P. Knight
A road signal for directing vehicular traffic by means of colored lights
Traffic lights being manufactured in Shreveport, Louisiana, 1947.
The first use of lights to control traffic occurred in 1868 in a London street, where a revolving gas lamp with red and green glass was set up to help police constables direct horse-drawn traffic on a busy junction. The lamp, the concept of railroad engineer John Peake Knight (1828–86), exploded in January 1869, injuring its attendant constable, and the idea came to nothing.
In Detroit in 1920, early in the age of the automobile, William L. Potts (1883–1947), a police officer, invented a system of lights much closer to the traffic lights we know today. His idea was to adapt the red, green, and amber lights used on railroads to control traffic on the highway, particularly at urban intersections. Within a year, the Detroit authorities had installed fifteen. Around the same time, the Ohio inventor Garrett A. Morgan (1877–1963) came up with a semaphore-type traffic signal in Cleveland. His patent for the device was bought by General Electric Corporation and used as the basis for their traffic light monopoly.
“[Early traffic signals] were commonly assisted by gongs, whistles, and bells.”
M. G. Lay, Ways of the World (1992)
Today most traffic lights use the code of red for “stop,” green for “go,” and amber for “proceed with caution.” In many countries, “running the red” is a criminal offense, but in others traffic lights are not seen as legally binding and are widely ignored. Urban myths regarding traffic lights include the untrue idea that in China red means “go,” as it would be politically incorrect for it to mean “stop.” The traffic light concept has even found applications outside of its original road safety purpose: the European Union has introduced a code system for food nutritional values based on the traffic light colors. JF
1869
Nature Versus Nurture
Francis Galton
The question of whether characteristics are inherited (nature) or fostered (nurture)
An illustration depicting some of the factors at play in the nature versus nurture debate.
English polymath Francis Galton (1822–1911) was born into a rich and influential family that included naturalist Charles Darwin (1809–82), his cousin. He initially studied mathematics at Cambridge University but became interested in psychology, along with anthropology, geography, statistics, and many other subjects.
In one study, Hereditary Genius (1869), he considered the implications of his cousin’s theories for sociology and psychology. He favored the position that all characteristics, including intelligence, are inherited through natural selection, though he later came to believe that the nurturing environment had an important influence. His work also led him to develop the pseudoscience of eugenics.
Much of the important evidence in the nature versus nurture debate has come from the study of twins, including both nonidentical (fraternal or dizygotic) twins (who, when raised together, possess different natures but share the same nurture) and identical or monozygotic twins (who, when separated at birth or very soon after, experience different nurture but possess the same initial natural inheritance). The results of such studies have highlighted some remarkable instances of natural inheritance, such as the development of Type 2 diabetes in separated identical twins at almost the same time in midlife, and have also cataloged the psychological effects of a variety of environmental factors.
Today, the debate initiated by Galton is still very much alive. At one extreme, Nativists such as John Bowlby and Noam Chomsky believe that most or even all psychological characteristics, including those that develop later in life, are governed by the body’s genetic code. On the Empiricist side of the argument, theorists such as Albert Bandura and B. F. Skinner see the human mind at birth as resembling a blank slate, onto which character is engraved by later experiences. JF
1869
Periodic Table of the Elements
Dmitri Mendeleev
A chart of the relationship between the elements and their atomic numbers
The standard modern periodic table, color-coded according to the date of each element’s discovery.
During the nineteenth century, scientists worked out, by observing the properties of elements, that the elements were related to one another in some way. Dmitri Mendeleev (1834–1907), professor of general chemistry at the University of St. Petersburg, was the first to propose a systematic relationship between the elements based on their atomic weights. In the course of writing The Principles of Chemistry (1869), he kept record cards showing each known element’s atomic weight and properties. As he explained later, “This soon convinced me that the properties of elements are in periodic dependence upon their atomic weights.” Mendeleev illustrated his insights with a table in which he boldly left gaps for elements that he believed existed but had not yet been discovered.
Mendeleev’s periodic table was initially met with a degree of skepticism, but in due course scientists filled in the gaps he had left with newly discovered elements whose properties he had predicted from comparison with those already known.
Early in the twentieth century, chemists also recognized that an element’s atomic number—the number of protons in its nucleus (and therefore the number of electrons in a neutral atom)—is more significant than its atomic weight (which includes both protons and neutrons) when it comes to deducing an element’s properties. Modern versions of the periodic table therefore list similar elements vertically, transposing the rows and columns of Mendeleev’s original chart.
The periodic table revolutionized how chemists thought about the relationships between elements. It convinced scientists that the way elements behaved was not simply random, and encouraged them to seek out new elements to fill the identified gaps. The table also highlighted the importance of the size of an atom’s nucleus, at a time when the internal structure of the atom, thought to be the smallest particle of matter, was only just beginning to be studied. JF
1870
Propaganda by Deed
Mikhail Bakunin
The promotion of a political agenda by using physical violence against enemies
A portrait of Russian revolutionary and anarchist Mikhail Bakunin, taken c. 1865 by French photographer Nadar (Gaspard-Félix Tournachon).
Propaganda by deed (sometimes translated from French as “propaganda of the deed”) may mean an action involving terrorism as a means of self-expression, or the use of extremist terror as a tactic for mobilizing political support. The phrase was coined by the French anarchist Paul Brousse (1844–1912) in the article “Propagande par le fait” (Propaganda by Deed), published in the August 1877 issue of Bulletin de la Fédération Jurassienne (Bulletin of the Jura Federation), which Brousse himself edited.
“We must spread our principles, not with words but with deeds, … the most popular, the most potent, … the most irresistible form of propaganda.”
Mikhail Bakunin, Letters to a Frenchman on the Present Crisis (1870)
However, the idea encapsulated by the phrase was not new. Mikhail Bakunin (1814–76), a much-traveled Russian anarchist and revolutionary, who was based in Switzerland at the same time as Brousse, had advocated such an approach in his own writings seven years earlier. (It could even be said that Maximilien de Robespierre and the Committee of Public Safety used the massacre of the French aristocracy under the guillotine as a kind of action propaganda.)
The highly influential notion of propaganda by deed has driven terrorist acts of all kinds and for all sorts of causes, including political assassinations in Europe between 1880 and 1914, guerrilla warfare in Ireland during the Easter Rising of 1916, Chinese communism under Mao Zedong in the 1940s, black and liberal South African resistance to apartheid in the 1950s and 1960s, IRA bombings in Ulster during the Troubles, and, more recently, the twenty-first-century atrocities perpetrated by al-Qaeda. The essential qualification for such action propaganda is that the terrorist act is carried out publicly, advertising and glorifying its authors, rather than in secret, with the perpetrators hiding their involvement and hoping to escape detection, as would be expected of other criminal behavior. What is terrifying—or, to its perpetrators, attractive—about such propaganda is precisely its originality: the unexpectedness of its pride in organized violence. JF
1870
Aryanism
George William Cox
The notion of a blue-eyed, blond-haired super-race destined to unite and rule the world
Images like this one of a mother and her baby in the German countryside were used by the Nazis during the Third Reich as examples of the perfect Aryan family.
The concept of Aryanism—not to be confused with Arianism, the early Christian heresy—was first developed by British historian George William Cox (1827–1902) in his book The Mythology of the Aryan Nations, published in London in 1870. The term “Aryan” was derived from an Indo-European root word meaning “noble.” Aryans were thought to be the descendants of a noble super-race of pure racial origin from an imagined rural golden age, whose destiny was to unite and rule the world. The Aryanist movement still exists on the political right wing today.
“Aryanism … [was] oriented toward a lost perfection and implied an ill-defined hope of the restoration of that unity within modernity.”
Christopher M. Hutton, Race and the Third Reich (2005)
Aryanism underwent further development in Nazi Germany in the 1930s, when it was used to justify the regime’s oppression of Jews and East Europeans. According to Hitler, Aryans were to be identified with the blue-eyed and blond-haired people typically found in northern Europe, including Germany. Their destiny was to gain hegemony over non-Aryans, who would be treated as subject peoples. Ironically, this notion actually hindered Hitler’s initial war strategy in 1939 and 1940, when he made a number of peace overtures to the British, whom he treated as fellow Aryans. Only after these were firmly rejected by Winston Churchill in 1940 did Hitler concentrate on Aryanism within the Greater Germany of the Third Reich.
The concept of a super-race influenced European thinking for much of the later nineteenth and early twentieth centuries and spawned a number of more or less horrific social and political experiments. These included Nazi efforts to “improve” European genetic inheritance by both fostering “Aryan” elements through the Lebensborn program of selective parental assistance and rooting out perceived racial and mental defects via the pseudoscience of eugenics. The logical conclusion of Aryanism was the Holocaust, which demonstrated to members of every race both the dangers and the moral turpitude of the concept. JF
1870
Papal Infallibility
First Vatican Council
A Roman Catholic doctrine that God protects the pope from error whenever, in the role of mouthpiece of the Church, he speaks about faith or morality
An oil painting of Pope Pius VII giving an audience in the Vatican’s Sistine Chapel (1814), by Jean-Auguste-Dominique Ingres; the pope was, in fact, being held prisoner by Napoleon.
The question of papal infallibility has caused much debate among both Roman Catholic and non-Catholic Christians since the First Vatican Council voted for the measure by a vast majority (433 to two) in 1870. The declaration was probably occasioned by the Church’s need to combat liberal, secular movements that were sweeping Europe at that time.
“Should anyone … reject this definition [of papal infallibility]: let him be anathema.”
First Vatican Council (1870)
Papal infallibility combines acceptance of the authority of Christ over faith and moral practice with the belief that Christ’s authority was given to St. Peter, as leader of the Church, and subsequently handed on to his successors in the papacy via a process known as apostolic succession. A further logical element is that it is seen as necessary for the pope to be infallible, since he might otherwise lead the Church into error, with catastrophic consequences. The Vatican Council did not expect popes to use the provision to promulgate new doctrines, only to confirm those that were part and parcel of the faith. It viewed papal infallibility as having been part of the fabric of Roman Christianity since 519 when the bishop of Rome was first accepted as guardian of apostolic truth.
Contrary to popular myth, papal infallibility has not, in fact, been used very often. In only two instances—the doctrine of the immaculate conception of Mary (1854) and the doctrine of the corporeal assumption of Mary (1950)—has it been formally invoked by the pope speaking ex cathedra, that is, when acting as pastor and teacher to all Christians. However, the notion created a deep and enduring divide between Roman Catholics and all other Christians. For example, conservative Protestants believe that only the Bible is free of error, while human interpretations of the Bible are not. Accordingly, they cannot accept that the pope could ever be infallible. The issue has hindered efforts to bring about Christian unity. JF
1870
Antidisestablishmentarianism
Great Britain
Opposition to the removal of state support and status from the Anglican Church
In England in the sixteenth century, as part of Henry VIII’s Reformation, the Anglican Church became an established church with the monarch at its head. For centuries after that, a succession of reformers questioned establishmentarianism, the recognition of the Anglican Church as a national institution.
However, in the nineteenth century, the British government proposed to disestablish the church, giving it the kind of independent status enjoyed by churches in other European countries and in the United States. Disestablishment of the (Anglican) Church of Ireland took place in 1871, and the Church in Wales was eventually disestablished in 1920, but popular (antidisestablishmentarian) opposition to similar proposals ensured that in England the Anglican Church continued to be, and remains today, an established church. Although the principle of a secular state, in which religion and government are clearly separated, has become the norm in most non-Islamic countries, the debate about whether or not the Anglican Church should be disestablished is still very much alive. Many agnostics, atheists, and Anglicans support disestablishment, but others cling instinctively to the traditional link between church and state.
The word itself is somewhat quaint and is now used mainly in jest. Its formation is unusual: antidisestablishmentarianism is built up by stacking several prefixes and suffixes onto a single root, a process much more common in German than in English. It is also one of the longest words in the English language, if not the longest, and was recognized as such as early as 1923. The opposition movement it describes is obviously a very contrary sort of body, existing in opposition to something that is itself in opposition to something else. The word is often used for any body of opinion that seems to resist new ideas for the sake of it, in the spirit of the original natural conservatism that spawned it. JF
1872
The Birth of Tragedy
Friedrich Nietzsche
A theory of drama holding that the tragedy of ancient Greece was the highest form of art
German philosopher Friedrich Nietzsche (1844–1900) published his theory of drama, The Birth of Tragedy from the Spirit of Music, in 1872, when he was professor of classical philology at Basel in Switzerland. Nietzsche was impressed by the ancient Greeks’ ability to transcend pessimism and derive pleasure and meaning from their own performances of tragedy. The Greeks believed that watching human angst and suffering played out on stage provoked enjoyment and an uplifted spirit in the viewer. Tragedy emerged in Athens in the sixth century BCE and was enacted in open-air theaters during the better part of a day.
In Greek mythology, Apollo and Dionysus are both sons of Zeus. In The Birth of Tragedy, Nietzsche discusses how tragedy was born from a fusion of views of life, both Apollonian (culture and reason) and Dionysian (wine, ecstasy, and intoxication). In the contemporary world this has become a struggle between civilized and primitive man, between collectivism and individualism. Nietzsche is clearly on the side of the primitive, of the individual, and he dismissively describes Greek art in pre-Dionysus times as naive. The Greeks themselves never considered Apollo and Dionysus to be rivals.
“The satyr chorus of the dithyramb is the saving deed of Greek art …”
Friedrich Nietzsche, The Birth of Tragedy (1872)
In Nietzsche’s account, the protagonist of tragedy tries to make sense of his reasoned, Apollonian lifestyle in the face of a chorus of Dionysus-led exhortations. For Nietzsche, Dionysian man magnifies man; he is the precursor of the “Superman,” an ideal, fully-realized human. Nietzsche hoped that his book would make the viewer of tragedy want to revive his Dionysian nature, and reconnect with his “Primordial Unity.” BS
1872
Pragmatism
Charles Sanders Peirce
An idea or proposition is likely to be true if it conveys practical benefit
Today, philosophy and empirical science can be seen as two distinct worlds: those of theory and practice. In the late nineteenth century, a group of U.S. thinkers sought to merge these worlds by developing the school of pragmatism. These thinkers regarded an idea as worthwhile if people gained some practical benefit from it. Thus, the actual meaning of any idea rests upon those practical benefits, and any idea that fails to produce such effects deserves to be discarded.
In 1872, a group of Harvard graduates decided to form the Metaphysical Club, an informal philosophical discussion group of about a dozen members. The group lasted less than a year but is widely credited with originating the pragmatic school of philosophy. William James (1842–1910), one of the members, is thought to have introduced the idea to the world in a lecture of 1898. However, James insisted the term originated with Charles Sanders Peirce (1839–1914), another member of the group, who coined it in the early 1870s.
“The pragmatist clings to facts and concreteness, observes truth at its work …”
William James, Pragmatism (1907)
The unofficial motto of the state of Missouri is “Show me,” a phrase well suited to the sensibility of the pragmatists. To the pragmatist, the worth of an idea, and even its claim to truth, comes from empirical progress, from deriving a concrete use. This attitude initially met with a high level of interest, resulting in pragmatist movements in social science, applied public administration, and even urban development. However, after its early adopters and proponents died, the idea fell out of favor with many academics, although it did gain some renewed interest in the later part of the twentieth century. MT
1873
Comparative Psychology
Douglas Spalding
Studying animal behavior can also illuminate human psychological processes
A chick responding to a rubber ball as if it were its mother, in a comparative psychology experiment in c. 1955.
The first person to carry out empirical research on animal intelligence was English biologist Douglas Spalding (1841–77), who experimented with chicks to find out which behavioral traits were innate and which learned. He published his early work in an article for Macmillan’s Magazine in 1873. Meanwhile, naturalist Charles Darwin (1809–82) and his friend George Romanes (1848–94) were writing on the evolutionary relationship between animals and humans.
Most comparative psychology studies since then have focused on learning, including conditioning and association. One approach has been concerned with traits that many species, including humans, have in common, mainly focusing on shared instinctual processes and the way in which these influence learning. This line of study can be undertaken under artificial conditions in the laboratory. Another approach, looking at the way in which evolutionary processes select for particular behavior adaptations, is more suited to field studies in the natural habitat. Scientists have also compared the behavior of modern animals with that of their ancient ancestors, or related species that have become extinct.
“We see that living beings need many things in conjunction …”
Lucretius, On the Nature of the Universe (first century BCE)
Comparative psychology gave rise to a shift in the way that we perceive both animal intelligence and the mental processes that cause animal behavior. Rather than assuming an innate superiority in human mental processes, scientists now study the similarities and differences in how humans and animals learn, and in the behaviors associated with their mental abilities, in order to better understand human psychology. JF
1873
Product Placement
Jules Verne
The surreptitious advertising of brands in a normally advertisement-free context
Ford’s placement in Bullitt (1968) was so successful that the company later built special Bullitt editions of the Mustang GT.
Perhaps surprisingly, given its prevalence in modern entertainment, product placement is anything but a late twentieth-century phenomenon. Instead, the practice originated in the late nineteenth century, when shipping magnates beat a path to the door of world-renowned French author Jules Verne (1828–1905), begging him to mention their companies in his new novel, Around the World in Eighty Days, published in 1873; he was happy to do so—for a price.