Pareto efficiency has been an influential idea in economics, and it has also found applications in engineering and the social sciences. In economics, it was long assumed that the operations of market actors, untouched by government intervention, would tend toward Pareto optimality. However, it is now recognized that markets inherently suffer from a number of failures, such as tendencies toward imperfect competition and information asymmetries, that inhibit the organic development of Pareto-efficient outcomes.
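The underlying test is simple to state. As a minimal illustrative sketch (the allocations and utility figures below are hypothetical, not drawn from Pareto), an outcome is a Pareto improvement if it makes at least one person better off and nobody worse off; an allocation is Pareto efficient when no such improvement remains.

```python
# Minimal sketch with hypothetical utility allocations: a move is a Pareto
# improvement if no one is made worse off and at least one person gains.

def pareto_improves(new, old):
    """Return True if allocation `new` Pareto-improves on `old`.

    Each allocation is a list of utilities, one entry per person.
    """
    no_one_worse = all(n >= o for n, o in zip(new, old))
    someone_better = any(n > o for n, o in zip(new, old))
    return no_one_worse and someone_better

# Moving from (3, 5) to (4, 5) helps person A without hurting person B,
# so it is a Pareto improvement; (6, 4) is not, because B is made worse off.
print(pareto_improves([4, 5], [3, 5]))  # True
print(pareto_improves([6, 4], [3, 5]))  # False
```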

“When it is useful to them, men can believe a theory of which they know nothing …”

Vilfredo Pareto

Despite the shortcomings of its use in certain strands of neoclassical economics, the concept of Pareto efficiency has proven vastly influential in economic thought. Pareto’s legacy to economics has been profound: he has even been called the “Newton of economics” for giving the discipline a unified mathematical form, and Pareto efficiency remains a cornerstone of modern welfare economics. JE

1907

Racialism / Racism

Leon Trotsky

The belief that people can be divided into a hierarchy of biological categories

Racism is the idea that there are distinct biological groups called “races,” some of which are superior to others. There is much controversy as to the definition of racism and its origin. Although human history is riddled with instances of one group claiming to be better than another, racism specifically requires the belief that groups can be divided into biological categories. Racism, then, in the strictest sense, is not only the idea that one group, culture, or nation is better than another, but also, specifically, that one group, due to its race, is better than another. The issue is also complicated by the inability to actually distinguish groups of people by “race” in any legitimate scientific sense.

“Racialism,” a precursor term to racism, first appeared in the Oxford English Dictionary in 1907. It was defined as “belief in the superiority of a particular race.” The term “racism” was first used in response to Adolf Hitler and the rise of Nazi Germany. Critical of the Third Reich in the early 1930s, Leon Trotsky (1879–1940) argued that Hitler used racism as a means by which to rationalize repressing the working class and consolidating power for himself.

“Racism is a vapid and bombastic variety of chauvinism in alliance with phrenology.”

Leon Trotsky

Racism is now best understood as the classification of others by race for the purpose of subjugation, and it continues to be used as a way to justify bigotry, intolerance, and genocide. In many countries, legislation against racial discrimination exists. Although racism remains pervasive, the general agreement within the scientific community that racial categories have no genuine biological basis is slowly undermining justification for oppression based on race. NM

1907

Steiner Education

Rudolph Steiner

A humanist approach to education that stresses the role of the imagination

Austrian-born Rudolph Steiner (1861–1925) was many things—philosopher, architect, political reformer, and literary critic—but he is best remembered as the driving force behind the establishment of his own distinct brand of education. The founder of anthroposophy, the idea that there is a spiritual world that can be accessed and comprehended through personal development, he integrated this belief into his philosophy on the education of the whole child. In The Education of the Child (1907), Steiner described children as threefold beings: body, spirit, and soul. He emphasized the importance of imagination, the visual arts, and participation in festivals and ceremonies in order to connect the child more fully with nature and the wider cosmos.

“To truly know the world look deeply within your own being.”

Rudolph Steiner

The first Steiner/Waldorf school opened for the children of employees of the Waldorf-Astoria Cigarette Company in Stuttgart, Germany, in 1919. There are currently over 600 schools in sixty countries worldwide, teaching more than 125,000 students. Today’s curriculum is tailored to the various phases of a child’s development. In a child’s early years, activities such as clay modeling are favored, along with basic mathematics and an introduction to reading and writing. There are no textbooks until the age of eleven or twelve; instead, children are encouraged to produce their own texts based on their own experiences. Subjects comprise the usual core disciplines, but may also include weaving and toy making. Learning at primary level takes place in a noncompetitive environment, and the use of electronic devices is discouraged. BS

1907

Atonality

Arnold Schoenberg

A form of pitch organization without reference to tonal centers

The term “atonal music” has several different but interrelated meanings, including music free of the boundaries of major/minor tonality or music free of any other form of pitch centricity. The term began to be applied to compositions written by Austrian composer Arnold Schoenberg (1874–1951) and the Second Viennese School, beginning with his second string quartet in 1907. He and his students, Alban Berg and Anton Webern, pioneered important atonal works that are still in the repertoire, including Schoenberg’s Pierrot Lunaire (1912) for voice and chamber ensemble.

Atonality became a driving force for experimentation during the twentieth century at a time when composers were searching for new approaches. The methods they pursued were manifold, including using microtones (scale steps smaller than half tones) and randomly generated notes, and applying different mathematical formulas to pitch generation. Electro-acoustic music that utilizes sound objects rather than clearly defined pitches could also be considered an extension of atonal composition.

“One must be convinced of the infallibility of one’s own fantasy …”

Arnold Schoenberg

Atonal techniques also made their way into popular culture. In movie music, atonality became a powerful element not only in suspenseful, violent, or dramatic scenes but also in reflective scenes to suggest some kind of ambiguity. Both approaches are present in Alex North’s movie scores, such as Viva Zapata! (1952). Avant-garde jazz and rock artists and groups, such as U.S. saxophonist John Coltrane and the British rock group Henry Cow, have also used atonality, thus extending the boundaries of their genres. PB

1907

Cubism

Pablo Picasso and Georges Braque

A style of visual art distinctive for its use of geometric shapes, angles, and lines

Picasso’s seminal cubist painting Les Demoiselles d’Avignon (1907) deeply shocked the art world.

A radical style of visual art pioneered by Pablo Picasso (1881–1973) and Georges Braque (1882–1963), Cubism attempted to capture reality as a composite of elementary shapes and angles as if seen from multiple perspectives at once. Art critic Louis Vauxcelles had scornfully described a series of Braque’s paintings exhibited in 1908 as “full of cubes,” but the beginnings of the style were already evident in Picasso’s painting Les Demoiselles d’Avignon (1907).

With two of its female faces resembling tribal masks, the painting showed the influence of African art. The women’s bodies form angular shapes that appear as part of the flat background. The Cubists rejected the idea of perspective that had long been the basis of visual art. They were inspired by Paul Cézanne’s idea that a painting is, after all, a work of art—a conscious attempt to reconstruct a scene viewed from a particular perspective. Abandoning such conventions, Cézanne had used dense layers of color in his Postimpressionist paintings to convey depth and movement. The early Cubists took these ideas much further, literally following Cézanne’s advice that nature should be depicted “in terms of the cylinder, the sphere, the cone, all in perspective.”

“Let them eat their fill of their square pears on their triangular tables!”

Marc Chagall, artist

From 1910 Picasso and Braque took this interest in geometric patterns to a more abstract level, and Picasso took it further still when he invented the collage technique. By 1912 Cubism was in decline, but with its insistence on unveiling the many disparate elements and perspectives that construct any image, this short-lived movement had a profoundly subversive and liberating influence on all modern art. TJ

1908

Expressionism in Music

Arnold Schoenberg

A school of music, often atonal in style, that focused on evoking heightened emotions

Expressionism originated in the visual arts, with the German group Die Brücke (The Bridge) in 1905, and was later extended to music. Although not stylistically homogeneous, the movement’s program provides stylistic clues that apply across the visual arts, literature, cinema, and music: fragmentation rather than unity, no imitation of nature, and an emphasis on the individualized, subjective artist.

In music, the early atonal works of Arnold Schoenberg (1874–1951), from about 1908, most clearly manifested these characteristics. They were highly original pieces, avoiding standard melodic and harmonic constructs and repetitions, and in the case of texted compositions, they dealt with human emotion rather than lyrical description. Schoenberg’s Erwartung (Expectation, 1909), for example, a drama for a single voice, connects musical expressionism with a surreal, psychoanalytically laden libretto: a woman wanders in the forest searching for her lover and eventually finds his bloodstained corpse. Schoenberg described his intention as being to “represent in slow motion everything that occurs during a single second of maximum spiritual excitement, stretching it out to half an hour.”

“Music is at once the product of feeling and knowledge.”

Alban Berg, composer

The Nazi seizure of power, followed by the Degenerate Art Exhibition in Munich in 1937 (in which many expressionistic works were displayed), ended the movement in Germany, although it continued to develop in art and cinema. In music, it regained popularity in the 1950s. Hungarian composer György Ligeti (1923–2006), in particular, used expressionistic stylistic traits and reinvigorated Expressionism. PB

1908

Modern Christian Apologetics

G. K. Chesterton

A branch of Christian theology that offers modern defenses of the Christian religion

The Christian apologetic texts of G. K. Chesterton (here in 1909), such as Orthodoxy (1908) and The Everlasting Man (1925), were well received partly because they were so entertaining.

The Renaissance and Enlightenment were largely Christian projects, advancing specifically Christian understandings of philosophy and science. Isaac Newton, for example, wrote more theology than physics, and in his physics he made sure that his model of the universe was an open one, allowing for supernatural causation (miracles) to take place. However, some of the immediate children of the Enlightenment, such as Karl Marx, Sigmund Freud, and Friedrich Nietzsche, attempted to divorce reason from religion, and then to either put down religion (Marx, Freud) or to put down both reason and religion (Nietzsche). In either case, subsequent centuries saw the gradual erosion of Christian influence in the academies of the West.

“It is idle to talk always of the alternative of reason and faith. Reason is itself a matter of faith. It is an act of faith to assert that our thoughts have any relation to reality at all.”

G. K. Chesterton, Orthodoxy (1908)

In the early twentieth century, however, a modern Christian apologetic renaissance began with Catholic polemicist G. K. Chesterton (1874–1936), who published a classic text on the subject, Orthodoxy, in 1908. Chesterton’s wit and repartee in the face of rising atheism quickly influenced another wave of Christian apologists, including C. S. Lewis (1898–1963). Lewis—an Oxford-trained philosopher and atheist turned Christian—has been called the greatest Christian apologist of all time, and even if this is an exaggeration, he helped to release many Christians in the West from the belief that blind faith was their only option. Among the philosophers influenced by this revival was Alvin Plantinga (b. 1932), who, with similar distinction, has been called the greatest living philosopher, and whom Time magazine credits with the rebirth of Christian philosophy in North America.

The influence of modern Christian apologetics has been huge. Today nearly one in three philosophers, and even more scientists, in the West claim to be serious, miracle-believing Christians, and this number continues to rise. AB

1909

Futurism

Filippo Marinetti

A short-lived movement that glorified the dynamism and power of the mechanized world

Futurist architect Antonio Sant’Elia’s drawing Power Plant from 1914 celebrates the advent of electricity. Although most of his designs were never built, they influenced architects and artists.

When the Italian poet and novelist Filippo Marinetti (1876–1944) published his “Futurist Manifesto” in the Paris newspaper Le Figaro on February 20, 1909, he hoped it would ignite a revolution and erase society’s aging values—values perpetuated by what he called “cultural necrophiliacs.” He implored his countrymen to throw off the “secondhand clothes” they had been wearing for far too long and argued that the time was ripe, at the start of a new and increasingly industrialized century, to create a new art and a new world representative of what was to come: speed, new technologies, and the glory of war.

“It is from Italy that we launch through the world this violently upsetting incendiary manifesto of ours.”

Filippo Marinetti, “Futurist Manifesto” (1909)

Marinetti’s manifesto found resonance within Italy’s cultural and artistic circles, and its devotees included architects, musicians, artists, graphic and industrial designers, and filmmakers. Futurist poetry dispensed with adjectives, adverbs, and punctuation—annoyances, the poets said, that served only to interrupt the stream of images conjured by their strings of nouns. Futurist architects such as Antonio Sant’Elia (1888–1916) drew sketches of towering concrete cities devoid of Baroque curves and ornamentation. The artist Giacomo Balla (1871–1958) painted Dynamism of a Dog on a Leash (1912) as a blur of legs, intended to depict the coming world in a state of rapid and inexorable change.

Initially, Futurism seemed like a breath of fresh air after decades of sentimental romanticism. Futurists appeared optimistic, even heroic; devotees of everything modern: automobiles, noise, speed, and burgeoning cities, as well as the industrialization that made it all possible. Violence and patriotism were also among their hallowed ideals (“We will glorify war, the world’s only hygiene …”), and many enlisted when Italy entered World War I in 1915. The revealed horror of the trenches ironically spelled the end of this ill-conceived movement. BS

c. 1910

The Dozens

United States

An African game of insults that became a social pastime for black Americans

Blues musician Speckled Red (pictured in 1960) was known for his recordings of “The Dirty Dozens.”

Trying to find a clear origin in time and place for The Dozens—a game of spoken words involving two contestants who take it in turns to hurl insults (or “snaps”) at one another until one of them either surrenders or commits an act of violence against the other—is not easy. When the insults relate to sexual issues, the game is known as The Dirty Dozens. Also known as “woofing,” “wolfing,” “joning,” “sounding,” and “sigging,” the game originated in West Africa, in countries such as Nigeria and Ghana, where it indicated a measure of each person’s intelligence and social status. However, it first started to become a recognizable social phenomenon in the United States in the black districts of New Orleans in the early years of the twentieth century.

“We played The Dozens for recreation, like white folks play Scrabble.”

H. Rap Brown, Die Nigger Die! (1969)

Early examples of a Dozens contest would often involve twelve rounds of insults. In 1929 the hit song “The Dirty Dozen” by the African American songwriter and pianist Speckled Red conveyed the art of the insult in its lyrics: “Cause he’s a rotten mistreater, robber, and a cheater; slip you with a dozen, yo pop ain’t yo cousin; and you mama do the lordylord!” Speckled Red recalled playing The Dozens in his own community when he was a child, when men of words were looked up to. The game was also valued as an outlet for aggression in a society in which violence against whites could never be countenanced. A Dozens game might begin with: “Now, first thing I’m gonna talk about is your ol’ momma”; the first one to crack and throw a punch lost. Participants were overwhelmingly male, although females were known to play, too. JMa

c. 1910

Jazz Music

United States

Blues and ragtime are fused to create a unique new form of improvised music

One of the United States’ great cultural gifts to the world, jazz music most likely began in the early 1910s on the streets surrounding New Orleans’s red light district of Storyville. Born out of a mix of African and European musical traditions, jazz features syncopation, improvisation, polyrhythms, and blue notes. There was no city like New Orleans in the early years of the twentieth century. Mass migration to the city after years of depressed cotton prices had given it a rare cosmopolitan mix, and musicians and brass bands were in high demand. There was no shortage of instruments; the city had been flooded with trumpets, trombones, cornets, and drum kits since the end of the Spanish-American War in 1898, when U.S. military bands disembarked in the city.

“Jazz is restless. It won’t stay put and it never will.”

J. J. Johnson, trombonist

Music became embedded in the daily lives of its citizens, while segregation formed African Americans into tight, insular communities—perfect ingredients, it turned out, for producing new forms of musical self-expression. Some claim this exotic new combination of blues, ragtime, and never-heard-before rhythms emerged spontaneously in cities across the United States, from Chicago to Kansas City, but evidence points to New Orleans as being the only city that possessed all of the factors necessary to create this new music, which thrived on improvisation and instrumental diversity.

The term “jazz” did not emerge until the 1920s and it was possibly a derivative of “jass,” a reference to jasmine perfume, popular among the prostitutes of New Orleans. Incorporating Dixieland, swing, even funk and hip-hop, it is and always will be “America’s music.” BS

1910

The Oedipus Complex

Sigmund Freud

Drawing parallels between an ancient legend and an early developmental stage in boys, Freud launched a controversial psychoanalytic theory

Ingres’s painting Oedipus and the Sphinx (c. 1808). When Oedipus solved the sphinx’s riddle, he was made king of Thebes and wedded to Jocasta, who was later revealed to be his mother.

Oedipus is a character from ancient Greek mythology who, in unwitting fulfillment of an oracular prophecy, murders his father and goes on to marry his mother. Viennese psychologist Sigmund Freud (1856–1939), familiar with this legend through Sophocles’s tragedy Oedipus the King, made connections between this myth and the material he claimed to have uncovered through dream interpretation, psychoanalysis of neurotic patients, and self-analysis. Freud first used the term “Oedipus complex” in print in 1910, by which time it had formed an integral part of his thinking on child development and the origin of neuroses.

“Freud … has not given an explanation of the ancient myth. What he has done is to propound a new myth.”

Ludwig Wittgenstein, philosopher

Freud read the Oedipus myth as an expression of the repressed fantasies of every young male child. “It is the fate of all of us,” he wrote, “to direct our first sexual impulse toward our mother and our first murderous wish against our father.” According to Freud, at the age of three boys entered a stage of psychosexual development in which their genitals became the prime erogenous zone. In this “phallic stage,” boys focused their desire upon their mothers; they violently resented and feared their fathers as more powerful rivals. The theory was later extended to female children in the form of the “Electra complex”—girls desiring their fathers and feeling murderous toward their mothers.

Freud claimed that, in a normal individual, the Oedipal stage was left behind as the child learned to identify with the parent of its own gender and choose a sexual object outside the family. Those who remained fixated on their opposite-gender parent, however, developed neurotic symptoms and sexual deviance. Critics have pointed out that there is little in the way of solid evidence to support Freud’s vision of the hidden psychosexual dramas of childhood. Nevertheless, the Oedipus complex has itself become a myth with great potency in modern culture. RG

1911

Emergency Room

Louisville City Hospital, Kentucky

“Accident Services” herald a new generation of medical treatment

In the mid-1800s the Medical Institute of Louisville, Kentucky, was home to Samuel Gross (1805–84), one of the giants of nineteenth-century surgery. Gross pioneered new suturing techniques, set up one of the country’s first surgical laboratories, and performed the first successful surgery on an abdominal stab wound. Gross is depicted performing surgery in Thomas Eakins’s celebrated painting The Gross Clinic (1875), and he influenced future generations of medical students. Graduates from the University of Louisville established Kentucky’s first clinic to educate students in clinical medicine and performed that state’s first appendectomy.

By the turn of the century Louisville was at the forefront of medical research in the United States, and it therefore came as no surprise when, in 1911, Louisville City Hospital opened the first “Accident Service,” an early form of trauma care similar to services already being offered by railway companies and other workers’ organizations in Europe. Louisville Hospital’s emergency services were further expanded and improved with radical new approaches to trauma care instituted by the surgeon Dr. Arnold Griswold in 1935, and became a precursor of the modern emergency room, which began to take shape in the mid-1950s.

The advent of the Korean War in the early 1950s saw the approach to emergency care evolve further, with the realization that immediate treatment on the battlefield could save countless lives. This spurred the development of the emergency room; however, by the 1960s emergency room care in the United States was still uneven at best, and it became clear that special training was required to properly equip doctors, surgeons, and nurses with the skills to cope with trauma. In 1968 the American College of Emergency Physicians was established to train a new breed of “emergency physicians.” BS

1911

Monetarism

Irving Fisher

The theory or practice of controlling the supply of money to stabilize the economy

Monetarism is a macroeconomic theory first considered by Martín de Azpilcueta, a Spanish theologian and early economist, in the sixteenth century. It was only developed as a theory—the quantity theory of money—in the late 1800s, and it was not fully elaborated until Irving Fisher (1867–1947), in The Purchasing Power of Money (1911), stressed the significance of the money supply as a primary factor in determining a nation’s gross domestic product (GDP). The theory, a mix of core policies and dogmatic theoretical constructs, led to a belief that economies are at their most efficient when subject to only minimal government interference, and also that the growth rate of money in an economy should be fixed to the growth rate of GDP, thus limiting the Federal Reserve Bank’s ability to “meddle.”
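Fisher’s starting point was the equation of exchange, shown here in its standard textbook form (the notation is conventional rather than quoted from his book):

```latex
% Equation of exchange (standard notation):
%   M = money supply, V = velocity of circulation,
%   P = price level,  T = volume of transactions.
MV = PT
% If V and T are assumed to be roughly stable, changes in M feed through
% to P -- the quantity-theory claim that the money supply drives prices.
```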

“The rate of interest acts as a link between income-value and capital-value.”

Irving Fisher

Many economists credit the rise of monetarism as an alternative to traditional Keynesian theory—which supports appropriate government regulation of the economy—to the U.S. economist Milton Friedman (1912–2006), who held that the growth rate and quantity of money circulating in an economy directly affect the prices of goods and services. Monetarism gained ground in the late 1970s after Keynesian policies failed in the face of stagflation: high inflation and high unemployment created by successive oil-price shocks.

However, monetarism was built on false assumptions: that central banks alone control the amount of money in circulation, and that inflation can be restrained simply by limiting the money supply. Even what constitutes “money” proved difficult to define. Monetarism was fated to be an experiment that failed. BS

1912

Blues Music

Southern African Americans

Music developed by African Americans in the rural southern United States

Unlike many rural bluesmen, W. C. Handy was an educated musician who encountered the blues at around age thirty. He remained passionate about the form until the end of his life.

Blues music, or, more simply, the blues, is a type of music original to the American Deep South, and one characterized by its scale structure, chord structure, notes, and lyrical themes. The blues often features “blue,” or bent, notes, produced by the musician raising or lowering a note after sounding it, as well as a three-line A-A-B verse form in which the first two lines repeat, and a twelve-bar stanza in four/four time paired with a three-chord progression.
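One common way of laying out the twelve-bar pattern described above (a widespread variant; many others exist) is sketched below, with the three chords written as Roman numerals.

```python
# A common twelve-bar blues pattern (one variant of many), written as
# Roman-numeral chords; in the key of E, I = E, IV = A, and V = B.
TWELVE_BAR = [
    "I", "I", "I", "I",    # bars 1-4: tonic
    "IV", "IV", "I", "I",  # bars 5-8: subdominant, then back to the tonic
    "V", "IV", "I", "I",   # bars 9-12: dominant, turnaround, resolution
]

key_of_e = {"I": "E", "IV": "A", "V": "B"}
print(" | ".join(key_of_e[chord] for chord in TWELVE_BAR))
```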

“The blues tells a story. Every line of the blues has meaning.”

John Lee Hooker, musician

Along with jazz, the blues is considered one of the few original U.S. art forms, one that represents a merger between African and European musical traditions. Since Elizabethan times (1558–1603), the color blue had been associated with feelings of sadness and melancholy. Slaves working in southern fields often used a call-and-response style of singing as they worked, with many of these songs and spirituals expressing aspects of their lives of forced labor and servitude. This music, when blended with the structures of English ballads, developed into the style that became the blues.

In 1912, composer W. C. Handy (1873–1958), known as “the father of the blues,” published the sheet music for a song, “Memphis Blues.” Arguably the first blues song, it introduced Handy’s style of twelve-bar blues. The first recording of instrumental blues music appeared the year after, in 1913, and blues music was gradually noticed before exploding in popularity in the 1920s; it was extremely popular for decades thereafter.

The history of the blues is an indelible part of modern music. Blues structures, particularly the twelve-bar blues form, are present in numerous contemporary popular songs, especially those of modern rock ’n’ roll, R&B, and pop music. Its influence has also stretched to jazz, country, and even modern classical music. Today, blues stars such as Eric Bibb and Robert Cray have enthusiastic followings. MT

1912

Knowledge by Description / Knowledge by Acquaintance

Bertrand Russell

All knowledge is ultimately dependent on experience

Philosopher Bertrand Russell, photographed in 1935. He was one of the founders of analytic philosophy, which remains the dominant philosophical tradition in the English-speaking world.

The distinction between “knowledge by description” and “knowledge by acquaintance” marks two ways in which we come to know truths about the world: through direct, “firsthand” experience of things and events, and through descriptions of them. For example, a man, “John,” who met Bertrand Russell (1872–1970) may know, by acquaintance, that Russell was a nice man. If John tells someone that Russell was nice, they know, by description, that “Bertrand Russell was a nice man.” The distinction between these two ways of learning about the world has profound implications for philosophy.

“Knowledge by acquaintance is always better than mere knowledge by description, and the first does not presuppose the second nor require it.”

A. W. Tozer, Christian preacher

Russell introduced the distinction in his seminal article “Knowledge by Acquaintance and Knowledge by Description” (1912). He was motivated to write about the distinction by the work of the nineteenth-century philosopher John Grote, but it was Russell who brought the idea into contemporary philosophy. He realized that the knowledge that we have by acquaintance is special because it is not really something that we put into sentences when we know it. For example, if a person met Russell and experienced his niceness, that experience is not really true or false; it is just an experience of niceness. It is not until they think about it and realize “Bertrand Russell is a nice man” that they can say whether the sentence is true or false. The striking consequence is that our own thoughts about our experiences are themselves knowledge by description. The experience we have is by acquaintance, but as soon as we start making sentences about it, even to ourselves, we are engaging in description. Philosophers are still trying to figure out the nature of experience as it relates to knowledge. If knowledge requires that our sentences be true or false, then we cannot say our experience, our “knowledge by acquaintance,” is really knowledge at all. NM

1912

Gestalt Psychology

Max Wertheimer et al

A school of psychology stressing that the whole of anything is greater than the sum of its parts

The term “Gestalt” was first coined by the Austrian philosopher Christian von Ehrenfels, who in 1890 published On Gestalt Qualities, a critique of the prevailing Atomists, who believed the natural world to be absolute, without any regard to context. Von Ehrenfels challenged this through a musical example: play a song in the key of C and then in A flat. No two notes are the same, yet the listener will recognize the tune as the same. In other words, there are no absolutes, and context matters. One of von Ehrenfels’s students was Max Wertheimer (1880–1943); the publication of his “Experimental Studies of the Perception of Movement” in 1912 marked the foundation of the Gestalt school.
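Von Ehrenfels’s transposition argument can be made concrete with a toy sketch (the melody below is hypothetical, and A flat is written enharmonically as G sharp): shift every note by the same interval and, for this tune, no pitch is shared between the two versions, yet the pattern of intervals, the “Gestalt” the listener recognizes, is identical.

```python
# Toy illustration of von Ehrenfels's transposition argument: move a short
# melody from C down to A flat (written here as G#, four semitones lower).
# For this tune no pitch is shared between the two versions, yet the interval
# pattern -- what a listener recognizes as "the tune" -- is unchanged.
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def transpose(melody, semitones):
    return [NOTES[(NOTES.index(n) + semitones) % 12] for n in melody]

def intervals(melody):
    return [(NOTES.index(b) - NOTES.index(a)) % 12 for a, b in zip(melody, melody[1:])]

tune_in_c = ["C", "D", "F", "G", "C"]
tune_in_a_flat = transpose(tune_in_c, -4)                 # ['G#', 'A#', 'C#', 'D#', 'G#']
print(set(tune_in_c) & set(tune_in_a_flat))               # set() -- no notes in common
print(intervals(tune_in_c) == intervals(tune_in_a_flat))  # True -- same interval pattern
```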

Gestalt psychology was formulated as a response to the inexorable accumulation of data through scientific investigation and, in the opinion of its originators, the resultant neglect of the very phenomena that scientific inquiry was supposed to be revealing. Much the same could be said of psychology and philosophy, pursuits replete with terms that seem full of potential (intuition, personality, existentialism), but when we try to comprehend their meaning, the terms fail us and we are left with a question: are these disciplines, in their present forms, equipped to deliver the enlightenment they proffer? Rather than ruminating over psychological hypotheses or scientific theorems, it was the hope of the founders of Gestalt psychology—Max Wertheimer and fellow psychologists Kurt Koffka (1886–1941) and Wolfgang Köhler (1887–1967)—to determine the true nature of things without forever puzzling over unanswerable riddles. BS

“A man is not only a part of his field, he is also one among other men.”

Max Wertheimer

1913

Delusion

Karl Jaspers

The capacity for pathological beliefs is endemic to the human condition

The Cat (c. 1930), by Louis Wain, was possibly informed by the English artist’s late-onset schizophrenia.

Delusions are beliefs that are adhered to with conviction despite a dearth of evidence to support them. They are almost always pathological, and most occur as a result of impaired neurological function or mental illness. German psychiatrist and philosopher Karl Jaspers (1883–1969) was the first to define the criteria for a belief to be considered delusional, in his textbook General Psychopathology (1913).

Delusions can be broken down into four categories: bizarre (strange and implausible); nonbizarre (possible, though still false); mood-congruent (consistent with a depressive diagnosis); and mood-neutral (a belief that bears no relation to a person’s emotional state). There are also a myriad of delusional “themes,” such as poverty, in which the sufferer believes they are impoverished; delusional parasitosis, a severe reaction against an imagined parasitic infestation; erotomania, thinking that someone is in love with you when they are not; and grandiose delusions, beliefs that usually revolve around a supernatural, religious, or science fiction theme that can be related to mental illnesses, such as bipolar disorders and schizophrenia.

“Might not [delusions] be imaginings that are mistaken for beliefs by the imaginer?”

Stanford Encyclopedia of Philosophy

Regardless of how many guises we attribute to it, delusion does not have to be a destructive false assumption. Religious belief, for example, possesses many of the same attributes common to pathological delusions, and yet can be empowering. The fact is that delusion remains an imagined contrivance, the product of intellectual thought, and in that sense it is a potential pitfall common to all. It is only the destructive, shared delusions referred to above that we tend to label. BS

1913

Infinite Monkey Theorem

Émile Borel

A monkey typing at random for an infinite amount of time will type a given text

Kokomo, Jr. the chimpanzee poses at a typewriter in 1957 for a news story on the infinite monkey theorem.

This completely implausible theorem, attributed to the French mathematician Émile Borel (1871–1956) in 1913 and popularized by astronomer Arthur Eddington (1882–1944) in his book The Nature of the Physical World (1928), asks us to believe that by taking an infinite number of monkeys and sitting them down at typewriters, eventually one of them will type a known work of literature, such as William Shakespeare’s Hamlet (c. 1600). The raft of dilemmas that this illustration of the concepts of infinity and probability creates is self-evident. Philosophically, how can one calculate the probability of achieving a desired result if one cannot first demonstrate that the goal is feasible? The laws of probability also condemn it. A monkey has a one in twenty-six chance of typing the first letter of Hamlet correctly. The odds of typing the first two letters correctly lengthen to one in 676. After just fifteen characters, the odds have stretched to roughly one in 1.7 sextillion, and there are still some 130,000 correct letters to go.
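The arithmetic is easy to check. A minimal sketch, assuming (as the example above does) a typewriter with only the twenty-six letters and no spaces or punctuation:

```python
from fractions import Fraction

# Probability that a randomly typing monkey gets the first n characters of a
# text exactly right, assuming a 26-key typewriter (letters only, as above).
def prob_first_n_correct(n, alphabet_size=26):
    return Fraction(1, alphabet_size) ** n

print(prob_first_n_correct(1))   # 1/26
print(prob_first_n_correct(2))   # 1/676
print(prob_first_n_correct(15))  # 1/1677259342285725925376, about 1 in 1.7 x 10^21
```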

“To be or not to be—that is thegrrdnm zsplkt.”

Bob Newhart, comedian

In 2003 researchers at the University of Plymouth in the United Kingdom put six crested macaques in front of a computer. Occasionally, in between urinating on the keyboard and smashing it with a rock, the monkeys typed out long strings of the letter “S.” The theorem, however, is not without its unintended uses. In his book The Blind Watchmaker (1986), the evolutionary biologist Richard Dawkins cited the theorem to explain how complexity can arise from random mutation: if any correct letters could somehow be “frozen” and the incorrect letters discarded, then a Darwinian platform would be created for the monkey’s descendants to improve on. BS

1913

Behaviorism

John Watson

Our response to external stimuli becomes, for a time, a possible window to the mind

Psychological behaviorism can trace its beginning to the publication of the article “Psychology as the Behaviorist Views It” (1913) by John Watson (1878–1958), who believed behavior is acquired through conditioning. In the world of the behaviorist, the reason we do the things we do can be understood without the need to resort to complex psychological concepts that are too subjective and defy analysis. How we act is determined by external forces, not the machinations of our cognitive, emotional minds. Behaviorists such as Burrhus Frederic Skinner thought psychiatry need concern itself only with a subject’s observable behavior in response to external physical stimuli. But can all the complexities of our behavior simply be attributed to the world around us, to mere stimulus and response?

“Thinking is behavior. The mistake is in allocating the behavior to the mind.”

B. F. Skinner, About Behaviorism (1974)

One of the forerunners of this form of behaviorism was the Russian physiologist Ivan Pavlov, whose work with dogs led him to theorize that behavior that is rewarded immediately is more likely to be repeated. Pavlov observed dogs salivating when about to be fed, but also noticed that after repeated feedings, the dogs began to salivate even when he entered the room without any food. Pavlov realized that the dogs had been conditioned to begin salivating; their behavior had been altered. What Pavlov had discovered, and Skinner after him, was that behavior, whether animal or human, can be “shaped.” But to what extent?

Behaviorism was often criticized for belittling psychiatry, and it did not stay long on the disciplinary landscape. It was too deterministic and left too little room for the expression of free will. BS

1914

Gratuitous Act

André Gide

An act of crime or evil that defies classification and interpretation

In the existential novel Lafcadio’s Adventures (1914) by French author André Gide (1869–1951), a young man named Amedee is pushed off the roof of a moving train by his half-brother Lafcadio, and is killed. Lafcadio, like Albert Camus’s Meursault in The Stranger (1942), is oddly unmoved by his own actions, which are deemed a gratuitous act: an unmotivated crime committed without any apparent objective or subsequent remorse. There is no premeditation, only impulse—and the gaining of immediate gratification. Gide did more than anyone to develop the idea of the unmotivated, unpremeditated, gratuitous act.

On a London street on April 1, 2009, a Scotland Yard policeman used a baton to beat a homeless man who had unwittingly and innocently wandered into the vicinity of a public protest. The homeless man’s liver was injured in the assault, and fifty minutes later he died. The court described the incident as “a gratuitous act” by an officer who had “lost all self-control.” The officer himself could offer no explanation for his actions.

“It is better to be hated for what you are than loved for what you are not.”

André Gide

The notion of the gratuitous act has long been used to argue against the existence of God. The argument runs thus: if God existed, gratuitous evil would not exist; gratuitous evil exists; therefore God cannot exist. This, of course, presupposes that the gratuitous act actually exists and that there can indeed be an effect without a cause. Most, however, would scoff at such a suggestion. There cannot be an effect without a preceding cause, and perhaps the phrase is nothing more than a convenient label for acts that have no apparent psychological explanation. BS

1914

Postmodern Biblical Criticism

Great Britain

A postmodernist approach to theology and biblical studies

Postmodern biblical criticism came into fashion in the 1950s, but it originated much earlier. A British quarterly philosophical review, The Hibbert Journal, applied the term “postmodern” in a religious context in 1914, in an article titled “PostModernism.” It used the phrase to describe the contemporary change in beliefs and attitudes in religious criticism: “The raison d’être of PostModernism is to escape from the double-mindedness of Modernism by being thorough in its criticism by extending it to religion as well as theology, to Catholic feeling as well as to Catholic tradition.”

Postmodernist biblical criticism tackles issues of authorship, ethnicity, fantasy, gender, ideology, linguistics, and sexuality in the same way that postmodernism in general tackles its subjects. In philosophy, postmodernist thought is not one thing: it can refer to various types of thinking, such as deconstruction, structuralism, poststructuralism, and postcolonialism. Similarly, postmodernist biblical criticism does not refer to only one type of critical biblical analysis. It addresses issues regarding identity from the point of view of the writers of the Bible and those whom they wrote about, as well as from readers in their interpretation of the Bible.

Postmodern biblical criticism remains very much alive as a discipline, and various philosophers and theologians continue to work in the field. French philosopher Jacques Derrida (1930–2004) famously used the example of the biblical Tower of Babel to explain multiple layers of meaning in his deconstructivist work Des Tours de Babel (Towers of Babel, 1985), in which he stressed the importance of reading the narrative in Hebrew. Contemporary writers engaged in postmodernist biblical criticism include the Jewish-American historian and Talmudic scholar Daniel Boyarin, the U.S. theologian and priest A. K. M. Adam, and the Australian theologian Mark G. Brett. CK

1916

Dadaism

Artists of the Cabaret Voltaire

A nihilistic art movement that was opposed to war and traditional modes of artistic creation, favoring group collaboration, spontaneity, and chance

Dadaist ready-mades, such as Marcel Duchamp’s Fountain (1917)—a porcelain urinal, which was signed “R. Mutt”—sparked heated debate about the meaning of art.

More a loosely knit group than an organized movement, Dadaism, which had its beginnings in Eastern Europe in the years prior to 1914, coalesced in Zurich, Switzerland, in 1916 as a protest against the barbarism of World War I (1914–18) among the artists and acolytes of the Cabaret Voltaire, a Zurich nightclub owned by the German poet Hugo Ball (1886–1927). After a knife placed at random in a dictionary fell on the word “dada,” meaning “hobbyhorse,” the group chose the word as the name for their anti-aesthetic activities. Considered a traitor in his own country for his opposition to the war, Ball wrote the “Dada Manifesto” (1916) and went on to become one of the movement’s leading proponents. His Cabaret Voltaire was the birthplace of Dada, safely ensconced as it was within neutral Switzerland while Europe was at war around it.

“For us, art is not an end in itself … but it is an opportunity for the true perception and the criticism of the times we live in.”

Hugo Ball, Dadaist

The Dadaists despised nationalism, materialism, and colonialism: anything that upheld the existing order and that had brought Europe to war was vilified, including its artistic traditions. Dadaists blamed the causes and continuing promulgation of the war on bourgeois and colonial interests, and used a new, deliberately irrational art to express their disgust at having been led into a conflict that nobody, aside from politicians and the military, seemed to want.

The art of Dada was whimsical, often sarcastic, vibrant, and occasionally silly. It did not adhere to any rules of form, although abstraction and Expressionism were never far away, and ready-made objects were a common theme, such as Marcel Duchamp’s Fountain (1917), a porcelain urinal mounted on a pedestal, a statement that illustrated Dada’s nonsensical nature. The movement in the United States was centered at Alfred Stieglitz’s New York gallery “291.” Dadaism proved resilient, only losing its relevance three decades after its birth in the face of post-World War II optimism. BS

1916

Supermarket

Clarence Saunders

A large, self-service store that offers a wide variety of goods for purchase

Customers discover a new way to shop for groceries at the Piggly Wiggly store in Memphis, USA, in 1918.

Grocer Clarence Saunders (1881–1953) first developed the idea of self-service retail when he opened the Piggly Wiggly store in 1916 in Memphis, Tennessee. Customers entered the store via a turnstile, took a shopping basket, and viewed the merchandise available, which was displayed in elaborate aisles. By 1922, there were 1,200 Piggly Wiggly stores, and they became the model for the supermarket.

The first supermarket to incorporate the ideas of self-service, with separate product departments, discount pricing, and volume selling, was King Kullen, which opened in 1930 in New York City with the slogan “Pile it high. Sell it low.” Supermarkets took advantage of the widespread affluence in the United States and elsewhere that resulted from the economic growth in the decades after World War II (1939–45), which also saw increasing automobile ownership and suburban development. Self-service shopping took longer to catch on in the United Kingdom, where supermarkets only became established from the 1950s.

The rise of supermarkets and their later, larger incarnation, the superstore, led to the decline of many smaller, local, family-owned shops. Many superstores, including Walmart in the United States, utilized novel purchasing and distribution logistics. These included buying in bulk and only building stores within a specific distribution radius so that transportation costs could be minimized. Businesses such as Tesco in the United Kingdom came to believe—with superior profits supplying abundant reasons—that, given the option, consumers would rather shop at one place with lower prices for all the items they required than shop at multiple locations. Supermarkets changed the way that people shopped in the latter portion of the twentieth century and have had a marked effect on the business culture of towns and cities. CRD

1916

General Relativity

Albert Einstein

The geometric theory of gravitation alters the fundamentals of the universe

An artwork shows the huge gravitational pull of a black hole (bottom right), which warps the fabric of space.

One of the foundation stones of modern physics, general relativity (GR), or the general theory of relativity—the geometric theory of gravitation—was published by German-born physicist Albert Einstein (1879–1955) in 1916, and caused a scientific revolution.

With one sweeping theory, space and time were no longer absolute and passive; now they were dynamic, able to be bent and slowed, their behavior altered by the very matter that composed and surrounded them. Even light itself could be shifted—general relativity predicted that it would be curved by the gravitational pull of the interstellar objects it passed, a prediction confirmed by observations during a solar eclipse just three years later. The universe had all of a sudden lost its static predictability. Physics, and the universe it was struggling to make sense of, had instantly become a lot more interesting.
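The heart of the theory can be stated compactly. Its field equations, given here in their standard modern form for illustration, equate the curvature of spacetime on the left with the energy and momentum of matter and radiation on the right:

```latex
% Einstein's field equations: spacetime curvature (left-hand side) is
% determined by the energy and momentum of matter and radiation (right).
R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} = \frac{8\pi G}{c^{4}} T_{\mu\nu}
% Einstein's 1917 modification added a cosmological-constant term,
% + \Lambda g_{\mu\nu}, to the left-hand side to permit a static universe.
```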

GR was first and foremost a theory of gravity—one of the most fundamental forces in the universe—and it turned on its head the universally accepted gravitational laws of Isaac Newton, developed in the seventeenth century. No longer was gravity a straight line, an apple falling from a tree that continues straight down until it collides with the ground. In Einstein’s universe, with planets and suns causing the space around them to be warped and altering the very fabric of spacetime, Newton’s straight lines now had to bend to Einstein’s curved, puckered universe. Orbits, in the words of U.S. astronomer and astrophysicist Carl Sagan, were really “tramlines in the sky,” bent by gravity to take moons around planets and planets around suns—a predestined, endless loop along their otherwise straight, applelike trajectories. GR also described a universe in a state of flux, a notion that so railed against Einstein’s own belief in a static universe that he added a “cosmological constant” to his equations to hold the universe still, a term he reportedly came to regard as his greatest blunder. BS

1918

Wormholes

Hermann Weyl

Hypothetical tunnels in spacetime that would enable faster than light travel

A conceptual artwork illustrates a theoretical wormhole between Earth and its nearest star, Alpha Centauri.

First postulated by German theoretical physicist Hermann Weyl (1885–1955) as part of his book analyzing electromagnetic field energy, Raum, Zeit, Materie (Space, Time, Matter, 1918), wormholes are hypothetical constructs in spacetime (the merging of space and time into a single continuum). They are presumed theoretical shortcuts or tunnels through space, not unlike the rabbit hole Alice fell into only to emerge in the parallel, fantastical world of Wonderland. The name “wormhole” was given to them later, in 1957, by U.S. theoretical physicist John Wheeler (1911–2008).

Wormholes wholly lack any observational evidence. Their existence, however, is theoretically possible according to the laws of general relativity (1916) governing the universe of Albert Einstein (1879–1955). Einstein and his collaborator, Nathan Rosen (1909–95), saw them as tunnel-like pathways connecting disparate regions of spacetime. Their constructs were given a name: Einstein-Rosen bridges. The principle, however, did not sit well with many physicists who balked at the notion that such travel would, in effect, allow the transmission of data and objects at a speed greater than the speed of light, thus breaching one of the foundational aspects of special relativity (1905).

In 1962 Wheeler claimed that the wormhole of Weyl and Einstein/Rosen was inherently unstable and prone to closing at both ends moments after opening. Wheeler then divided wormholes into two categories: Lorentzian wormholes along the lines of Einstein’s standard model, and Euclidean wormholes, which exist only in “imaginary time” and the world of quantum mechanics. In 2010, British physicist Stephen Hawking (b. 1942) wrote that wormholes may be present not only in interstellar space, but may also be infinitely small and all around us here on Earth, winking in and out of existence unnoticed. He asserted that natural radiation would always rule out their use for time travel. BS

1918

Deflationary Theory of Truth

Gottlob Frege

The argument that there is no such thing as a truth beyond what is simply asserted

Frege noted that “I smell the scent of violets” means the same as “It is true that I smell the scent of violets.”

According to the deflationary theory of truth, the claim that a sentence is true means the same thing as simply asserting the sentence. The first notable defender of the deflationary theory was German philosopher and logician Gottlob Frege (1848–1925), who first presented the idea in 1918.

The central idea of the deflationary theory is that there is no deep nature to truth, and that adding the words “is true” to a sentence does not change the meaning of the sentence. The search for the nature of truth will always be frustrated, the deflationist says, because there is nothing there. So, if someone were to say that the sentence “there are 1001 ideas in this book” is true, they would have said simply that there are 1001 ideas in this book. This minimalism is opposed to “inflationary” theories of truth, such as the correspondence theory or coherence theory, according to which truth requires correspondence with the facts or coherence with your beliefs, respectively. Early deflationary theorists such as Frege, Frank Ramsey (1903–30), and A. J. Ayer (1910–89) were motivated by a distrust of traditional philosophical metaphysical debates regarding the nature of truth, particularly because there seemed to be no hope of resolution. The theorists nevertheless recognized that the concept of truth was a useful one to have, as it allows people conveniently to affirm or deny whole sets of sentences, such as “everything that Einstein wrote is true” or “some things that Descartes said are true.”

The deflationary theory has remained popular. Modern versions often claim that everything that there is to know about truth can be captured by an equivalence schema:

⟨p⟩ is true if and only if p. The angle brackets on the left-hand side of the schema indicate that the proposition, belief, or sentence p is being mentioned rather than actually asserted. The right-hand side of the schema then uses it. BSh

1918

Modal Logic

Clarence Lewis

A form of logic intended to make sense of the ambiguities of everyday phrases

Modal logic was devised in 1918 by the U.S. academic philosopher Clarence Lewis (1883–1964), one of the founders of modern philosophical logic, in an attempt to circumvent the paradox of implication—that a false proposition was capable of implying any proposition. It has since developed into a mathematical tool for the study of description languages that discuss various types of relational structures.

Lewis was interested in looking at the reasoning behind everyday modalities (modes of truth), such as “It should be that … ,” and “It ought to be the case that … ,” and how their ambiguous phrasing seemed to allow for two kinds of truth, necessary and contingent. Consider the statement, “It is summer.” But is it necessarily summer? Is it summer right now, or at some point in the future? Fly to the opposite hemisphere and it certainly would not be summer there. Logicians refer to such modifications to an initial assertion as “modalities” and examine the mode in which the statement could be considered to be true. The truth tables of basic logic cannot easily handle such ambiguities, and so Lewis promulgated “modal logic” to tease apart the contradictions in beliefs, possibilities, and the supposed truth of judgments.
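Lewis’s own remedy for the paradox of implication was “strict implication,” which builds necessity into the conditional; in modern notation (a standard formulation rather than a quotation from Lewis):

```latex
% The paradox of material implication: a false proposition implies anything,
%   \neg p \rightarrow (p \rightarrow q).
% Lewis's strict implication blocks this by requiring the conditional to
% hold necessarily:
\text{``}p\text{ strictly implies }q\text{''} \;:=\; \Box(p \rightarrow q)
% where \Box reads "necessarily" and its dual \Diamond ("possibly") satisfies
\Diamond p \;\equiv\; \neg\Box\neg p
```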

“Only the true statements are provable, and all the provable statements are true.”

Joel McCance, “A Brief Introduction to Modal Logic”

Modal logic’s principles are now used in linguistics, artificial intelligence, computer science, and game theory. Given impetus by the many logicians who have since expanded upon Lewis’s original precepts, modal logic has come a long way since he first propounded it, as the biennial Advances in Modal Logic conferences attest. BS

1919

Prohibition

United States

Prevention of the sale of alcohol with the aim of obtaining partial or total abstinence

Government agents dispose of confiscated whiskey during prohibition in the 1920s.

The temperance movement in the United States attributed society’s ills to alcohol consumption. Progression toward national prohibition of the manufacture, sale, and distribution of alcoholic beverages was gradual, beginning with sporadic prohibitions arising from religious revivalism in the 1820s and 1830s. The Anti-Saloon League, founded in 1893, led state prohibition drives between 1906 and 1913, and during World War I a Wartime Prohibition Act was enacted to conserve grain for food production.

An amendment to the Constitution of the United States was passed by the U.S. Congress in December 1917 and ratified by the requisite three-fourths of the states in January 1919. To enforce the Eighteenth Amendment, congressman Andrew Volstead (1860–1947) championed the National Prohibition Act, commonly referred to as the Volstead Act, which was passed in October 1919 over the veto of President Woodrow Wilson.

“We have seen the evil of … liquors … let us see what [prohibition] will do for us.”

Thomas Jordan Jarvis, U.S. congressman

By January 1920, prohibition was in effect in thirty-three states; however, the law was enforced only when and where the population was sympathetic to it. In December 1933, the Twenty-first Amendment repealed federal prohibition but allowed prohibition at the state and local levels; by 1966, all had abandoned it.

Prohibition gave impetus to entire illegal economies—bootlegging, speakeasies, and distilling operations. Criminal activity increased, and illegal manufacture and sales of liquor flourished, resulting in higher prices for liquor and beer. The Eighteenth Amendment is the only Constitutional amendment to have been ratified and later repealed. BC

1919

Archetypes

Carl Jung

Universally recognized prototypes that inform our understanding and behavior

Carl Jung (pictured here in 1960) suggested the notion of unconscious archetypes in rejection of the idea that people must learn everything from birth.

In the English language, the word “archetype” was first coined in the 1540s to describe something from which copies are made. It would not enter popular parlance until 1919, when Swiss psychiatrist Carl Jung (1875–1961) used it in an essay titled “Instinct and the Unconscious” to describe any type of behavior, symbol, term, or concept that was copied or emulated by others. In other words, it is a sort of universally recognized prototype. Jung asserted that archetypes are “ancient or archaic images that derive from the collective unconscious,” the fragment of our unconscious mind that is passed down to us as an inherited gift. He also described archetypes as “primordial images” that have been expressed continuously throughout history in folklore and as far back as prehistoric rock art.

“The concept of the archetype, which is an indispensable correlate of the idea of the collective unconscious, indicates the existence of definite forms in the psyche which seem to be present always and everywhere.”

Carl Jung, The Archetypes and the Collective Unconscious (1934)

New archetypes are not so much created as discovered. They have always been within us, Jung claimed, hidden and waiting to be found as each of us journeys individually into our own psyche. Archetypal beings include the hero, the wise old man, and the fraud, while archetypal events include birth and death. According to Jung, these and countless other varieties of archetypes act as conduits for our experiences and emotions. Because they surface as recognizable patterns of behavior, archetypes can be studied in order to gain insights into human behavior.

Jung’s five primary archetypes are: the self, the center of our psyche; the shadow, the opposite of our ego; the anima, the female aspect of a man’s psyche; the animus, the masculine image in a woman’s; and the persona, the image that we as individuals present to those around us. Archetypes are all around us in literature, from the “ill-fated lovers” of Shakespeare’s Romeo and Juliet and the “brooding antihero” of 1940s artist Bob Kane’s Batman to the “villain” Voldemort in author J. K. Rowling’s Harry Potter series of novels. BS

1919

Ponzi Scheme

Carlo Ponzi

A fraudulent investment operation promising significant returns on a risk-free investment

Carlo Ponzi faces the police camera following arrest for forging a check in 1910. Ten years later, he received a five-year sentence for his pyramid scam as a warning to others.

Carlo Ponzi (1882–1949), an Italian immigrant to the United States, scammed thousands of New Englanders out of millions of dollars with a get-rich-quick plan that involved investing in the conversion of European postage coupons into U.S. currency. In 1919, Ponzi formed the Securities Exchange Company, assuring investors that they would double their investments in ninety days. He ultimately defrauded thousands of investors out of approximately $15 million using only $30 worth of stamps. Ponzi was arrested in 1920 and charged with multiple counts of fraud and larceny and sentenced to prison. He was in and out of U.S. jails until 1934, when he was deported to Italy.

“I landed in this country with $2.50 in cash and $1 million in hopes, and those hopes never left me.”

Carlo Ponzi

Variations of the Ponzi scheme, also known as a “pyramid scheme,” had existed since the seventeenth century, and they all had but one objective: to make a fortune for the operator. First, the operator entices a small group of initial investors into the scheme. The early investors receive tremendous investment returns from funds secured from a second group of investors. The second group, in turn, receives funds obtained from a third group of investors, and so on until the operator absconds with the funds when the operation is about to collapse.
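
To make the arithmetic of the collapse concrete, the short sketch below simulates a generic pyramid of this kind: each round of promised payouts is funded entirely by the stakes of the next round of recruits, so the number of new investors required roughly doubles each time and soon exhausts any finite pool of victims. The figures and function are hypothetical illustrations, not a reconstruction of Ponzi’s actual accounts.

```python
# A minimal sketch of why a Ponzi-style operation must collapse: every round of
# promised returns can be paid only out of money recruited in the next round.
# All figures are invented for illustration.

def rounds_until_collapse(initial_investors=100, stake=100.0,
                          promised_return=1.0, pool_of_victims=1_000_000):
    """Count the payout rounds possible before the scheme runs out of new
    recruits. A promised_return of 1.0 means 'double your money'."""
    investors = initial_investors          # people owed a payout this round
    recruited_so_far = initial_investors   # everyone drawn in so far
    rounds = 0
    while True:
        owed = investors * stake * (1 + promised_return)  # current liabilities
        needed = int(owed // stake) + 1                   # recruits to cover them
        if recruited_so_far + needed > pool_of_victims:
            return rounds                  # no more victims: the scheme folds
        recruited_so_far += needed
        investors = needed
        rounds += 1

# Even with a million potential victims, a scheme that promises to double every
# stake survives for only around a dozen payout rounds.
print(rounds_until_collapse())
```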

Ponzi was by no means the first or the last to profit from such a scheme. In 2008, Bernard Madoff (b. 1938), a former chairman of the NASDAQ stock exchange, was charged with defrauding investors out of some $50 billion. Madoff orchestrated a multibillion-dollar scheme that swindled money from thousands of investors; although he did not promise spectacular short-term investment returns, his investors’ phony account statements showed consistent positive returns even during turbulent market conditions. In March 2009, he pleaded guilty to eleven charges of fraud and money laundering, and in June he was sentenced to 150 years in prison. BC

1919

Bauhaus

Walter Gropius

An integration of artist and craftsman that bridged the gap between art and industry

In 1925 the Bauhaus school moved from Weimar to these new premises in the German town of Dessau.

The Bauhaus approach to design originated in Weimar, Germany, with the foundation, in 1919, of a private architectural practice by Walter Gropius (1883–1969) and Adolph Meyer (1881–1929). It was a style born of the production lines of the new industrial age, which Gropius, after having witnessed firsthand the horrors of World War I (1914–18), was determined to harness for the betterment of society. In contrast to the skilled, romantic woodwork of the rural-based Arts and Crafts period, Bauhaus (German for “house of construction”) would be an urban, city-bred movement with its roots firmly in modernism. When Gropius was appointed master of the Grand-Ducal Saxon School of Arts and Crafts, also in 1919, he transformed it into what would become known as the Bauhaus School.

“A modern … lively architecture is the visible sign of an authentic democracy.”

Walter Gropius

A Bauhaus building was almost always cube-shaped, replete with right angles and overwhelmingly asymmetrical. Ornamentation was shunned in favor of functionalism, open floor plans, and smooth facades. Advances in engineering allowed for walls to be built around skeletons of iron and steel for the first time, freeing them from the need to support the structure. Bauhaus colors were mostly gray, black, and white, roofs were flat, and industrial aestheticism meant a reduction of form to the barest of essentials. Rooms within Bauhaus buildings were sparsely furnished, too.

Bauhaus design extended to interior design, sculpture, painting, and even typography and poster art. Informing myriad disciplines, Bauhaus demonstrated the exhilarating idea that an individual’s artistic spirit could exist in harmony with mass production. BS

1919

Biblical Form Criticism

Martin Dibelius

Analysis that classifies the Bible according to the literary forms in which it is written

German theologian Hermann Gunkel (1862–1932) was the first to develop biblical form criticism, but it was German theologian Martin Dibelius (1883–1947) who pioneered the discipline’s analytical method. Furthermore, it was Dibelius’s work Die Formgeschichte des Evangeliums (From Tradition to Gospel, 1919) that gave form criticism its name. Dibelius analyzed the gospels in terms of oral traditions and asserted that in their earliest forms they consisted of short sermons written for the early Christian community. Dibelius also demonstrated that the first-century CE gospel writer Luke had access to St. Paul’s written records.

In biblical form criticism, the Bible is analyzed and classified into literary forms, such as elegies, parables, poems, and sayings. Critics then attempt to clarify the history of their formation. Form criticism enables scholars to take into account the tradition behind the Gospels at the time of writing, and attempt to reconstruct the life of Jesus and his authentic sayings.

“An original-original form never existed, or at least not in … missionary … Greek.”

Martin Dibelius, From Tradition to Gospel (1919)

Dibelius’s work was well received by his contemporaries. German theologian Rudolf Bultmann (1884–1976) developed form criticism, and in turn Bultmann’s work revolutionized biblical scholarship, inspiring academics to further distinguish the sayings of Jesus from those of the authors of the Gospels.

Form criticism led to the foundation of the Jesus Seminar in 1985 in the United States. Consisting of some 150 scholars and laymen, it attempted to verify the historicity of Jesus’s sayings and actions, going on to produce new translations of the apocrypha and New Testament for use as textual sources. CK

1919

Surrealism

André Breton

A movement giving artistic expression to dreams, imagination, and the unconscious

The Surrealist Manifesto defined surrealism as “psychic automatism in its pure state.”

The Surrealist movement in art first emerged from within the Dada-esque artistic communities of Paris in the years after World War I (1914–18). An attempt to unlock the imagination through the channeling of the unconscious, its first artistic expression is generally ascribed to French writer and poet André Breton (1896–1966), who, in 1919, wrote what is considered the movement’s first literary work, Les Champs magnétiques (The Magnetic Fields); he followed that, in 1924, with Le Manifeste du surréalisme (The Surrealist Manifesto), later seen as the movement’s anthem.

Surrealists utilized dream imagery to uncover and comment on our deepest anxieties, often using collage and employing images from popular culture. The movement proved enormously resilient to change, surviving challenges from existentialists and Abstract Expressionists as they sought to usurp its role as the preeminent artistic voice of the unconscious. However, it began to fracture when Breton tried to alter its primary focus from art to political activism, a move that split the movement into those who believed art to be inherently political and those who did not. Still, Surrealism would not die, and an exhibition of Surrealist art in New York in 1942, organized by art collector Peggy Guggenheim, was an example of just how pervasive and universal the movement had become.

Influenced by Sigmund Freud’s pioneering work in psychology, the Surrealists believed that the conscious mind, burdened as it was with all of society’s accumulated cultural and religious taboos, needed to be freed, not only so that it could better express itself artistically, but also to help reveal society’s multitude of destructive contradictions. Surrealism may not have gone on to achieve its lofty social objectives, but artistically it flourished, becoming arguably the most pervasive and influential art movement of the twentieth century. BS

1919

Talkies

Lee de Forest

Motion pictures that featured the innovative addition of sound

A huge crowd waits outside Warners’ Theatre in New York City to see Al Jolson in The Jazz Singer in 1927.

The synchronization of motion pictures with sound, producing what were referred to at the time as “talkies,” was first demonstrated at the Paris Exposition in 1900. Further demonstrations were occasionally held after that, but the technical difficulties involved in the process—which basically worked by crudely synchronizing the starting of the movie projector with an audio playback cylinder—made widespread commercial screenings all but impossible for two decades. However, in 1919, U.S. electrical engineer Lee de Forest (1873–1961) patented his Phonofilm sound-on-film process, which recorded sound directly onto the movie strip in the form of parallel lines of variable density. The lines recorded electrical waveforms from a microphone photographically, and were translated back into sound when the movie was projected.

The first commercial screening of talkies came on April 15, 1923, at the Rivoli Theater, New York City, with a set of short movies promoted by De Forest Phonofilms. The first commercial hit for talkies, The Jazz Singer, appeared four years later, in 1927, although its sound system was not de Forest’s. The movie had fewer than 400 spoken words, but its success was undeniable, and within three years 95 percent of all new movies made used talkie technology. This proved a momentous change in the movie industry, simultaneously raising the cost of film production, which drove many smaller production companies out of business, and reducing the availability of jobs for many silent film stars, because the addition of sound to movies brought radical changes in the style of acting required. The bringing together of motion pictures and sound caused a revolution in film technology, the impact of which can be seen in the almost complete absence of professionally made silent films today. JE

c. 1920

Inferiority Complex

Alfred Adler

A psychological condition that is displayed through a lack of self-worth

The term “inferiority complex” was first used in passing by Sigmund Freud and Carl Jung, but assumed central importance in psychology during the 1920s on account of the work of Austrian psychotherapist Alfred Adler (1870–1937). Adler had tired of Freud’s emphasis on the unconscious as a factor in human behavior and believed that the reality of much behavior was relatively easy to explain. He believed that everyone begins life with feelings of inferiority simply because, as children, we lack many of the skills common to older people. This sense of inferiority—what Adler called inferiority feelings, not to be confused with a complex—is gradually overcome as people grow and develop. Thus, Adler’s notion of the inferiority complex was one capable of being corrected.

“Exaggerated sensitiveness is an expression of the feeling of inferiority.”

Alfred Adler

Sufferers of the inferiority complex feel inferior to those around them and believe they will never be able to compensate for their inadequacies. The complex develops first in childhood and is often the result of discrimination, bullying, physical disability, or some form of personal rejection. Later, as adults, when confronted with a challenge that cannot be overcome, a secondary inferiority complex can develop. Inferiority complexes can remain dormant, and associated feelings can include resentment, depression, and aggressive or irritable behavior. However, overcoming an inferiority complex can be relatively simple. If a person feels inferior because they do not fit into a particular group, studies indicate that changing groups and finding acceptance elsewhere are often all that is necessary to provide a way forward. BS

c. 1920

Country Music

United States

A form of popular music developed from U.S. rural folk and blues roots

Country music duo the Delmore Brothers playing guitar in c. 1930.

Country music originated in the United States in the 1920s, in the folk culture of the rural South. In turn, U.S. folk had grown out of the folk music brought by successive waves of immigrants from Europe, especially the British Isles. Over time, such existing music was drawn upon by banjo players, hillbilly string bands, virtuoso fiddlers, blind balladeers, and gospel singers to generate a louder kind of music that could be heard above the hubbub of community functions. Some of this music came to be known as bluegrass. As southern musicians moved northward, their music came with them, and radio and recordings did much to popularize the style. With the likes of singers Gene Autry (1907–98), Roy Rogers (1911–98), and Tex Williams (1917–85), the music gained particular popularity in the 1940s, with the label “country” replacing the slightly derogatory “hillbilly.”

“I think there’s enough room in country music for everybody.”

Charley Pride, country musician

Some of the earliest country music recordings emerged from Atlanta. The Grand Ole Opry, a concert hall in Nashville, Tennessee, itself considered the country music capital of the United States, provided radio performances for fans of country music from 1925. The Grand Ole Opry was housed in the Ryman Auditorium in 1943, where it remained until moving into the Grand Ole Opry House in 1974.

In the 1950s, the rise of rock ’n’ roll challenged country music, but the latter evolved over time into a more pop-oriented form, and now has gained an international listenership. In recent years many rock acts—including Jon Bon Jovi and Kid Rock—have had crossover hits in the country music realm. JE

1920

Autosuggestion

Émile Coué

The routine repetition of a formula that tricks people into feeling better

High-wire performers succeed mainly because, says Émile Coué, their minds tell them that they can.

“Every day in every way I am getting better and better.” This is the phrase the physician and psychologist Émile Coué (1857–1926) urged his patients to say to themselves twenty to thirty times every morning after waking and again at night before going to sleep. A former advocate of hypnotism, Coué abandoned that approach in favor of what he called “the laws of suggestion” as an adjunct to prescription medicines. As he explained in his book Self-Mastery Through Conscious Autosuggestion (1920), he thought autosuggestion to be much more than mere “positive thinking”—he believed a change in our unconscious thoughts actually had the potential to heal physical ailments. Rather than thinking of himself as a healer, Coué stressed that he merely provided his patients with the mental tools needed to heal themselves. Any idea implanted into the conscious mind, as long as it was within the realms of possibility, could become that person’s reality.

“Autosuggestion is an instrument … with which we play unconsciously all our life.”

Émile Coué

Coué argued that walking along a plank of wood lying on the ground without falling off it is a relatively simple task. But take that plank and place it 100 feet (30 m) above the ground between two buildings, and suddenly crossing it becomes almost impossible. The plank has not changed, but now the mind is saying that crossing it cannot be done. A person accustomed to heights, however, could cross it with relative ease.

Coué himself defined autosuggestion as the influence of the imagination upon the moral and physical being of mankind. Whatever it was, people with all kinds of ailments came to him to be, at least in part, cured by what he called his “trick.” BS

c. 1920

Bell Curve Grading

United States

A method of assigning grades according to a predetermined distribution

The “bell curve” describes the tendency of most things or events to cluster around a middle value, with fewer occurring at the upper and lower extremes. In other words, the “mean” (average) is also the “mode” (most frequent). When this is plotted on a graph, the resulting shape resembles a bell. Known in mathematics as the normal distribution (historically also called the “de Moivre” or “Laplace–Gaussian” distribution), it is often used in education for grade distribution determinations.

The practice of bell curve grading, also known as grading on a curve, originated in the United States in the 1920s. It was developed in response to a 1912 study by two Wisconsin researchers that found that high school English teachers in different schools assigned highly varied percentage grades to the same papers. By grading on the basis of “normal” probability, as depicted by the bell curve, it was thought that a fairer distribution of grades could be ensured among teachers and the subjective nature of scoring could be brought into check.

“Grading on a curve is extremely distorting as a reference of mastery.”

Rick Wormeli, Fair Isn’t Always Equal (2006)

When grading on an A = 100–90, B = 89–80, C = 79–70, D = 69–60, and F = 59–0 distribution, the top of the bell should be a middle C of 75 percent. If we assume that the grade distribution should fit a “normal” curve, then all grades must be normalized: overly low scores will be adjusted up and overly high ones adjusted down. It is a distortion of the original data for the sake of fulfilling the expectations of the statistical concept being used. There is some controversy over using the bell curve, especially in terms of modern concerns about grade inflation. LW
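
As a concrete illustration of that normalization, the sketch below converts raw scores into z-scores (distances from the mean measured in standard deviations) and assigns letters by fixed cutoffs, forcing the grade distribution toward a bell shape whatever the raw marks happen to be. The scores and cutoffs are invented for illustration and are not any official scale.

```python
import statistics

def curve_grades(scores):
    """Assign letter grades from z-scores so that the distribution of grades
    approximates a bell curve regardless of how hard the test was.
    The cutoffs here are illustrative, not a universal standard."""
    mean = statistics.mean(scores)
    sd = statistics.pstdev(scores) or 1.0   # guard against a zero spread
    def letter(z):
        if z >= 1.5:  return "A"
        if z >= 0.5:  return "B"
        if z >= -0.5: return "C"            # scores near the mean land here
        if z >= -1.5: return "D"
        return "F"
    return {score: letter((score - mean) / sd) for score in scores}

# A harsh exam: no raw score reaches 90, yet the strongest papers still earn an
# A and the distribution of letters comes out roughly bell-shaped.
print(curve_grades([42, 55, 58, 61, 63, 65, 68, 71, 74, 88]))
```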

1920

Nuclear Fusion

Arthur Eddington

The means by which the sun emits energy, and a potential source of unlimited power

The Tokamak-3 nuclear fusion reactor at the Kurchatov Institute of Nuclear Power near Moscow, built in 1960.

In 1920, while engaged in a search for neon isotopes, the British physicist Francis Aston (1877–1945) measured the mass of hydrogen and helium atoms and discovered that four hydrogen atoms weighed more than a single atom of helium. The significance of this was seen immediately by the British astrophysicist Arthur Eddington (1882–1944), who realized that the differential in mass of helium and hydrogen atoms as measured by Aston was the mechanism that allowed the sun, through the conversion of hydrogen into helium, to shine, producing heat and light.
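
The energy implied by Aston’s measurement follows directly from Einstein’s mass–energy relation; using rounded modern values for the atomic masses (a back-of-the-envelope illustration, not Aston’s original figures), the arithmetic runs as follows.

```latex
% Mass defect when four hydrogen atoms are fused into one helium-4 atom
% (rounded modern values, for illustration only)
\[
\Delta m = 4m_{\mathrm{H}} - m_{\mathrm{He}}
         \approx 4(1.0078\,\mathrm{u}) - 4.0026\,\mathrm{u}
         \approx 0.0287\,\mathrm{u},
\]
\[
E = \Delta m\,c^{2} \approx 0.0287 \times 931.5\ \mathrm{MeV} \approx 26.7\ \mathrm{MeV},
\]
```

That is, roughly 0.7 percent of the original mass is released as heat and light for every helium nucleus formed.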

Nuclear fusion—the conversion of matter into energy that occurs when atomic nuclei are brought together to create a larger, more stable nucleus—occurs naturally within stars. Our sun fuses more than 600 million tonnes of hydrogen into helium every second. But replicating that “artificially” on Earth, on a large scale via a process that does not involve the detonation of a hydrogen bomb (the only process by which any meaningfully large amount of fusion has so far been achieved), is an entirely different proposition.

Research in nuclear fusion increased around the world after World War II (1939–45), although individual state-sponsored programs were oriented toward military uses and independent of each other. In 1958 this changed with the “Atoms for Peace” conference in Geneva, Switzerland. For the first time, research became coordinated and information was shared. It was agreed in Geneva that nuclear fusion was an achievable goal, but it was also acknowledged that inherent plasma instabilities meant that many hurdles would need to be overcome. The first fusion-generating devices, called “tokamaks,” were designed and built in Russia in the late 1960s, and today the Joint European Torus tokamak near Oxford in the United Kingdom holds the record for fusion generation: it briefly produced 16 megawatts of fusion power. JMa

1920

Robotics

Karel Capek

Machines that reproduce human physical and/or mental capabilities

A scene from Rossum’s Universal Robots (1920), a play written by Karel Capek, who coined the term “robot.”

The idea of a mechanical device mimicking the form and functions of a human being, perhaps able to perform work for its creator, has long preoccupied mankind. Inventor and artist Leonardo da Vinci (1452–1519) sketched a humanoid robot in 1495, and in the eighteenth and nineteenth centuries innumerable life-sized automatons were built, including a mechanical duck, designed and constructed by inventor Jacques de Vaucanson (1709–82), that could move its wings and swallow food. Even so, it would not be until 1920 that Karel Capek (1890–1938), a Czech science fiction writer long before science fiction was considered a recognized genre, first used the word “robot” (a word suggested by his brother, Josef) in his play Rossum’s Universal Robots. The play describes a futuristic world of humanlike mechanical beings and takes place in a factory where “artificial people” are made.

Departing from the realm of fiction, engineer Joseph Engelberger (b. 1925) and inventor George Devol (1912–2011) designed and built the world’s first programmable robotic arm in 1954; named Unimate, it became the first industrial robot when it was installed on a General Motors production line in 1961. IBM launched its System/360 family of mainframe computers in 1964, and a swathe of advanced robotics accompanied Apollo 11 to the moon in 1969. In July 1994, an eight-legged tethered robot, Dante II, built by students at Pittsburgh’s Carnegie Mellon University, descended into Alaska’s Mount Spurr and collected fumarole gas samples, a major advance in the use of robotic explorers to access extreme terrain.

Traditionally robotics has promised more than it could deliver because computing power has always lagged behind the human imagination. But with the processing power of computer mainframes growing so fast, technicians predict that by 2050 robotic brains will be capable of executing 100 trillion commands per second, and will begin to rival human intelligence. BS

1920

Id, Ego, and Superego

Sigmund Freud

The view of our personalities as the product of three competing regions of our mind

The Id, Ego, and Superego are the three elements in Sigmund Freud’s (1856–1939) psychoanalytic theory of personality. It is the interactions between them, he claimed in “Beyond the Pleasure Principle” (1920), that define our behavior. The Id contains our most basic and instinctual impulses, and acts according to Freud’s so-called “pleasure principle.” If the Id were all we possessed we would be at the same level as the animals. Hunger, anger, thirst, sex—these are the Id’s insatiable needs. It is disorganized, incapable of judgment, and has no sense of morality or of good and evil. The Id exists solely for self-gratification. A human who acts solely on instincts derived from their Id would be akin to a sociopath. The Id gives us what we need to survive, but it is also the “devil on our shoulder.”

“The poor Ego has a still harder time of it; it has to serve three harsh masters …”

Sigmund Freud, New Introductory Lectures … (1932)

Our conscience, our ability to moralize and criticize the rampant desires of the Id, is the realm of the Superego. It provides us with our sense of right and wrong. Freud referred to the Superego as the psyche’s equivalent of the father figure, who raises us to be good citizens and disciplines us when we misbehave. It encourages us to remain within the boundaries of social expectations and cultural norms.

The Ego is now thought of as suppressing the Id’s urges, and also as a sort of organizer that takes the competing needs of the Id and Superego and turns them into something coherent and workable. Freud himself saw the Ego as more of a mediator in a strictly hierarchical structure, driven by the demands of the Id while at the same time confined within the boundaries of the Superego. BS

1920

Death Drive

Sigmund Freud

The theory that everyone is subject to a subconscious impulse for self-destruction

In 1920, Austrian neurologist Sigmund Freud (1856–1939) published an essay, “Beyond the Pleasure Principle,” about the struggles everyone has between two opposing impulses: Eros, the drive to creativity, pleasure, and sexual reproduction; and Thanatos, the Death Drive, an aggressive desire for self-destruction. Perhaps unsurprisingly, the theory—one of Freud’s most obscure and enigmatic—was also his most poorly received. Nobody, neither psychologist nor layman, wanted to think of themselves as possessing an unstoppable drive to die.

Freud had always believed that every decision that people made was motivated by the pursuit of pleasure. Now he began seeking out alternate drives to explain an impetus toward destruction that he saw in the post-traumatic dreams of his patients. He also began to see death as part of the unfolding journey of life, a compulsion of the body—the organism—to return to its primordial inertia. This was, according to Freud, the body’s most primitive, inhuman element.

Critics of the theory point to its coincidental emergence just months after the death of Freud’s daughter, Sophie, who was a victim of the Spanish flu that ravaged Europe. Or maybe he was inspired by Friedrich Nietzsche’s concept of the “eternal return,” from death back to life? A “drive to death” did not please evolutionists, either; it was a theory that seemed to fly in the face of Darwinism and the survival of the fittest.

More likely, however, it was simply an outgrowth of Freud’s own bewilderment at his troubled patients’ resistance to treatment, and the persistent presence of depression and angst that he was unable to treat, particularly in veterans of World War I. The Death Drive seemed to fill Freud’s own conceptual “gap,” even as it left almost everyone else wondering just where it had come from. BS

1921

Tractatus Logico-Philosophicus

Ludwig Wittgenstein

A philosophical treatise that sought to identify the nature of the relationship between language and reality

Ludwig Wittgenstein was a member of one of the richest families in Austria, but also one of the most troubled. Three of his brothers committed suicide in differing circumstances.

One of the twentieth century’s most seminal philosophical works, the Tractatus Logico-Philosophicus (Logical-Philosophical Treatise) was published in 1921, three years after its author, Austrian-British philosopher Ludwig Wittgenstein (1889–1951), first developed its propositions while serving in the Austro-Hungarian army on the battlefields of Europe. He completed the manuscript in the summer of 1918 at his family’s summer house in Vienna, and kept it with him while subsequently held as a prisoner of war in an Italian camp.

“A man will be imprisoned in a room with a door that’s unlocked and opens inward as long as it does not occur to him to pull rather than push.”

Ludwig Wittgenstein

Consisting almost entirely of a series of assumptions considered by the author to be self-evident, the treatise is virtually devoid of arguments as it attempts to explain the underlying logic of language, and how language represents the limits of what can be said and, therefore, of what can be reasoned. For Wittgenstein, the limitations of philosophy and the limitations of language are one and the same. By linking logic to metaphysics using language, he emptied philosophy of its circular reasoning, saying that, “If a question can be framed at all, it is also possible to answer it.” The Tractatus provided new insights into the interconnectedness of language, of thought, and of the outside world, and supplied the everyday tools required to separate sense from nonsense.

Wittgenstein was admired by the Vienna Circle of philosophers at the University of Vienna, and British philosopher Bertrand Russell (1872–1970) described him as “the most perfect example I have ever known of genius.” But he was a lonely and troubled man. He considered Russell his only peer, but that relationship soured to the point where he opposed Russell writing an introduction to the Tractatus (he did, even so). Tormented by thoughts of suicide, he regarded his professorial career as a “living death” and held many of his fellow philosophers in contempt. BS

1921

Rorschach Test

Hermann Rorschach

A subtle diagnostic tool that was derived from a childhood game

Rorschach inkblots serve as neutral yet suggestive stimuli for the purpose of psychological assessment.

The use of ambiguous shapes and patterns to determine an individual’s personality may first have been considered by prolific inventor and artist Leonardo da Vinci (1452–1519). But it would not be until 1921, in his book Psychodiagnostik (Psychodiagnostics), that Swiss psychiatrist Hermann Rorschach (1884–1922) would present the well-known inkblot test named after him. Rorschach’s monograph contained ten cards, and the author believed that subjects invited to interpret these might reveal a tendency toward schizophrenia. The use of the cards as a general personality test dates only from 1939.

Rorschach had taken Klecksography—a game popular among Swiss children that involved taking a random wet inkblot on a piece of paper and trying to fold it so that it made the shape of a bird or a butterfly—and transformed it into a science. As an adolescent he had always been torn between following a path in drawing or the natural sciences, and here he had devised for himself an ingenious compromise.

It was our ability to perceive, rather than to construct or imagine, that interested Rorschach most, and that is what the test was designed to measure. He was interested in our perceptions, what triggered them, and how they varied from person to person. He saw the inkblots as optical stimuli, put forward to harness our propensity to project our own interpretations onto ambiguous images, in the hope that analysts might be able to identify deeply buried traits and impulses.

Rorschach had trained with Carl Jung and was influenced by Sigmund Freud. Like many psychologists of his era, he was influenced by symbolism and symbolic associations. For example, in one paper he observed that neurotics were often obsessed with their watches, because the roundness of the faces symbolized for them their mother’s breast, and the ticking, her heartbeat. His inkblots acted as symbols in a comparable way. BS

1921

Serialism

Arnold Schoenberg

An approach to composition involving the systematic organization of elements

Composer Arnold Schoenberg was also one of the most influential teachers of the twentieth century.

Developed by Austrian composer Arnold Schoenberg (1874–1951), serialism is a method of composition whereby all twelve notes of the chromatic scale (or in later instances rhythms, dynamics, and other musical features) are systematically employed in comprehensive distribution, utilizing rows and sets as organizing principles. Embodied in his “twelve-tone technique,” invented in 1921, serialism was envisioned as an attempt to distance composition from traditional tonality and melody. Schoenberg’s phrase “emancipation of dissonance” related to his pursuit of atonality and his reframing of musicality.
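
The row manipulations at the heart of the twelve-tone technique reduce to simple arithmetic on pitch classes modulo 12, as the sketch below illustrates; the example row is invented for the purpose and is not one of Schoenberg’s.

```python
# Basic twelve-tone row operations. Pitch classes are integers 0-11 (C = 0,
# C sharp = 1, and so on). The prime row is an invented example.

PRIME = [0, 11, 7, 8, 3, 1, 2, 10, 6, 5, 4, 9]

def transpose(row, n):
    """Shift every pitch class up by n semitones, wrapping around the octave."""
    return [(p + n) % 12 for p in row]

def inversion(row):
    """Mirror each interval about the row's first note."""
    first = row[0]
    return [(2 * first - p) % 12 for p in row]

def retrograde(row):
    """Play the row backward."""
    return list(reversed(row))

print("P0: ", PRIME)                           # prime form
print("I0: ", inversion(PRIME))                # inversion
print("R0: ", retrograde(PRIME))               # retrograde
print("RI0:", retrograde(inversion(PRIME)))    # retrograde inversion
print("P5: ", transpose(PRIME, 5))             # prime transposed up a fourth
```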

Frequently, the serialist approach to composition institutes other constraints on organization beyond mere prescribed order, such as the use of elements equally and in similar proportion. Common intervals between tones may be established, or dynamic changes based on an inclusive pattern. While mathematical in its organization, the twelve-tone method was, according to Schoenberg, just as expressive as any other approach to composition. He famously compared his music to all other forms of human work, with a skeleton and a circulatory and nervous system—“an honest and intelligent person who comes to us saying something he feels deeply and which is of significance to all of us.”

Serialism can also be seen in a broader range of composition and production that extends to fields beyond music. It paralleled advances in mathematics, such as set theory, which deals with the properties of well-defined collections of objects, while also influencing the visual arts through the work of artists such as Josef Albers and Sol LeWitt. The structural understanding of sets and permutations in music also influenced scholars and critics, who returned to classical pieces in light of serialist principles to discover new features of their organization. JD

1921

Child Development

Jean Piaget

The theory that a child’s intellectual development occurs in distinct stages

Jean Piaget (1896–1980) first came across psychoanalysis during a brief period of study at the University of Zurich, and in 1919 moved to Paris to work in the laboratory founded by the late Alfred Binet (1857–1911), the inventor of the IQ test. Piaget became fascinated with the young children who, in response to certain test questions, continually gave the same wrong answers. By 1921 Piaget had begun to publish his conclusions—that the cognitive processes of young children are very different from those of adults—marking the beginning of a lifetime of study that would make Piaget the undisputed pioneer of cognitive development in children.

Piaget saw a child’s intellectual development as evolutionary, having certain points at which it simply “takes off.” This is prompted by two ongoing processes: assimilation and accommodation. By sucking on everything around it as a reflex action, a baby is assimilating its world, transforming it to meet its individual needs. As the baby develops, it begins to engage in accommodation by picking things up and putting them in its mouth—a modification of the reflex response. Piaget came to believe that the conflict generated by these two opposing and ongoing actions creates the impetus for cognitive development.

“ … knowledge of the external world begins with an immediate utilization of things …”

Jean Piaget

Piaget’s four stages of child development—the egocentric stage, the “magical thinking” stage, the development of logical thought, and the gaining of abstract reasoning—saw him labeled a “cognitive constructivist,” as opposed to a “social constructivist,” one who holds that language and exposure to other people are equally influential. BS

1922

Fixed-point Theorem

Stefan Banach

A mathematical principle demonstrating how a function can have a fixed point

First stated in 1922 and one of the great mathematical principles devised by Polish mathematician Stefan Banach (1892–1945), the Banach fixed-point theorem has become a vital ingredient in the theory of metric spaces (sets in which a notion of distance, called a metric, between elements of the sets is defined). The theorem has been used extensively in the field of topology and mapmaking. Explaining it without the aid of graphs and algebra, however, is not easy.
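
For readers comfortable with a little notation, however, the theorem itself is compact; a standard textbook formulation runs roughly as follows.

```latex
% Banach's contraction mapping theorem (standard formulation, paraphrased)
\[
\text{If } (X,d) \text{ is a complete metric space and } f : X \to X \text{ satisfies }
d\bigl(f(x),f(y)\bigr) \le q\,d(x,y) \text{ for all } x, y \text{ and some } 0 \le q < 1,
\]
\[
\text{then } f \text{ has exactly one fixed point } x^{*} = f(x^{*}),
\text{ and the iterates } x_{n+1} = f(x_{n}) \text{ converge to } x^{*} \text{ from any starting point } x_{0}.
\]
```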

Imagine somebody climbing a mountain. The climber begins at midday and reaches the summit six hours later. He or she stays on the summit overnight and the following day descends, beginning at midday again and taking just four hours to return to the starting point. Now imagine this depicted on a graph. There are two lines; one—the ascent—begins in the bottom left of the graph and heads to the top right, denoting time and height; the other line—the descent—starts at the top left of the graph, the summit’s starting point, and goes down toward the bottom right. It is inevitable that at some point the two lines are going to have to intersect, and the point at which they do is an example of a mathematical “fixed point.”

Fixed-point theorems are not restricted to Banach’s, nor are their fixed points always stationary. For example, according to the Brouwer fixed-point theorem devised by Dutch mathematician Luitzen Brouwer (1881–1966), when a cup of coffee is stirred, at any given instant at least one point of the liquid’s surface is exactly where it started. As the swirling turbulence continues, however, the location of that fixed point can itself wander across the surface. Over the past fifty years, fixed-point theorems have also been adapted for use in engineering, biology, and quantum mechanics. The Kakutani version (1941) is widely used in game theory and economics. BS

1922

Etiquette in Society, in Business, in Politics, and at Home

Emily Post

A compendium of advice on how to negotiate the minefield of what, and what not, to say and do in social situations

Emily Post (pictured in 1923) provided advice on etiquette to everyone from small boys to motorcar drivers, but her core market was young debutantes, suitors, and newly-weds.

Rules of etiquette first began to be formed in the court of Louis XIV (1638–1715), the so-called “Sun King” of France, with a series of elaborate social customs and protocols compiled by idle courtiers. Future first president of the United States, George Washington (1732–99), when still barely sixteen years of age, wrote out his “Rules of Civility and Decent Behavior in Company and Conversation,” comprising 110 rules, such as: “Every action done in company ought to be with some sign of respect to those that are present.”

“Manners are made up of trivialities of deportment which can be easily learned if one does not happen to know them; manner is personality …”

Emily Post

It would not be until 1922, however, that a book that virtually codified every minute aspect of etiquette in a single, all-encompassing volume was produced. Written by the U.S. author Emily Post (1872–1960), Etiquette in Society, in Business, in Politics, and at Home found a ready market and was a bestseller. In her book, Post referred to the “Best Society,” not a fellowship of the wealthy but a Brotherhood “which spreads over the entire surface of the globe.” Etiquette and good manners were not only for the few; they applied to everyone and went far beyond mere manners and deportment, extending into the realm of ethics.

Post leaves no stone unturned in her quest to reassure us about what we should say and do in every conceivable social situation. When do we shake hands, what do we say when introduced? When should a gentleman take off his hat, or merely raise it? When should one bow and, just as importantly, when should one not? And unless completely unavoidable, never call out someone’s name in public. Being told to say “How do you do?” instead of “Charmed!” caused some to criticize Etiquette as a dated and irrelevant work. However, the book is currently in its eighteenth edition, proof that for those who consider manners to be the glue that holds society together, proper etiquette never goes out of fashion. BS

1924

Copenhagen Interpretation

Niels Bohr

An explanation for the bizarre and random behavior of subatomic particles

Quantum mechanics takes a mathematical, at times almost metaphysical, approach toward understanding subatomic particles, providing physicists with tools to conduct experiments that might provide insights into their behavior. Yet quantum mechanics at times appears bizarre even to some physicists, and one of its most bizarre offshoots is the later Many Worlds interpretation, which suggests that, for each possible outcome of any given event, the universe splits in two, allowing the quantum particles involved to behave in two entirely different ways.

Starting in 1924, at the University of Copenhagen, Danish physicist Niels Bohr (1885–1962) set out to explain the seemingly contradictory behavior of quantum particles. In what came to be called the Copenhagen Interpretation, he theorized that quantum particles do not exist solely in one state or another, but in all of their possible states at the same time. This condition is referred to by physicists as coherent superposition.
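
Coherent superposition has a compact mathematical expression; for a particle restricted to just two possible states it is conventionally written as follows (a generic textbook rendering, not Bohr’s own notation).

```latex
% A two-state superposition: the particle occupies a weighted blend of both
% states until a measurement forces an outcome.
\[
|\psi\rangle = \alpha\,|1\rangle + \beta\,|2\rangle,
\qquad |\alpha|^{2} + |\beta|^{2} = 1,
\]
```

where |α|² and |β|² give the probabilities of finding the particle in state 1 or state 2 when it is finally observed.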

“When it comes to atoms, language can be used only as in poetry.”

Niels Bohr

Moreover, every subatomic particle has two aspects, a wave aspect and a particle aspect, which may be compared to the two sides of the same coin. And just as a coin tossed in the air can land only on one side, so too a subatomic particle can behave as either a wave or a particle, though never both at the same time. Bohr’s interpretation suggests that if a particle were to be observed in its natural state, the observer would see it only after it has been forced to “choose” which path to take—that path then becomes the “observed state.” Consequently, electrons in identical experiments are capable of behaving differently, depending on which of their two aspects comes into play. BS

1925

Art Deco

The Society of Decorator Artists

A visual arts design style characterized by geometric shapes and bold colors

Designed by William van Alen in 1928, the 1930 Chrysler Building in New York is a masterpiece of art deco style.

The term “art deco” was first used to describe the new style of decorative arts showcased at the Exposition internationale des arts décoratifs et industriels modernes, held in Paris in 1925. A showcase of French design, the exposition included furniture and interior design, with motifs drawn from botany, geometry, and the natural world. No object was spared the style’s intervention: lamps, haircombs, footstools, light fittings, automobiles. It was a philosophy of design that celebrated a new age of post-World War I exuberance, a prosperous era of style, fashion, skyscrapers, and the internal combustion engine.

It all began with La Société des artistes décorateurs (The Society of Decorator Artists), an artistic association formed in Paris after the city’s Universal Exposition of 1900 with one purpose in mind: to show the world the beauty and form of French artistic and architectural expression. It was their work, including pottery by Raoul Lachenal, graphic design by Eugène Grasset, luxury printed books by François-Louis Schmied, and bookbindings by Paul Bonet, that ignited the art deco movement and took it from the exhibition halls of Paris to the rest of the world.

“ … an assertively modern style … [that] responded to the demands of the machine.”

Bevis Hillier, historian

Art deco architecture featured stylized reliefs, geometric ornamentation, stepped lines, and machine-age chrome and Bakelite, with a grandeur reminiscent of the Roman Republic (509–27 bce). Americans added the principles of streamlining, characterized by strong curves and clean lines, applied them to construction and automobiles, and gave the world the Chrysler Building and the Buick Y-Job. Sadly, art deco came to an end in the mid-1940s, its perceived gaudiness not in keeping with the austerity of post-World War II economies. BS

c. 1926

Cultural Hegemony

Antonio Gramsci

A nonconfrontational approach to the overthrow of capitalist social structures

In the 1920s, Italian writer, political theorist, and cofounder and leader of the Communist Party of Italy Antonio Gramsci (1891–1937) coined the term “cultural hegemony” to describe how one single social class can slowly come to dominate the society of which it is a part. Believing that established capitalist societies were too entrenched to be taken over by force, Gramsci instead proposed a process of incremental change, of gaining a toehold within its hierarchical structures by influencing the leaders of churches, schools, and the bureaucracy to alter, over time, their thinking.

Gramsci compared capitalist social structures with defensive battlements. The state, he said, was an “outer ditch, behind which there stood a powerful system of fortresses and earthworks. The superstructures of civil society are like the trench systems of modern warfare.” The infiltration of society through cultural hegemony, he believed, provided a way of gradually undermining these social foundations and effecting change. Gramsci’s influence on evolving concepts within political science continued long after his death, with the publication in 1971 of Selections from the Prison Notebooks, which he wrote while imprisoned from 1926 to 1934. Hegemony, he said, is a constant struggle, its victories never final and always precarious.

“Gramsci’s social thought contains some remarkably suggestive insights …”

T. J. Jackson Lears, historian

Intent on loosening the rigid formulas of Marxism, cultural hegemony broadened the base of socialist ideology, and acknowledged the role of the state as a political reality and not just as a tool of the ruling class. It continues to aid historians striving to understand how an idea can undermine prevailing social hierarchies. BS

1927

Television

Philo Taylor Farnsworth

An electronic communications system that transmits both images and sound

Philo Taylor Farnsworth adjusts a television camera during a demonstration of his television system in 1934.

Television was not the product of a singular moment of scientific genius or technical insight, but rather a culmination of work by numerous engineers and inventors. Although Philo Taylor Farnsworth (1906–71) is credited with having produced the first successful electronic television in 1927, a mechanical television system had been patented as early as 1884 by German inventor Paul Gottlieb Nipkow (1860–1940). The first commercial television sets appeared in England and the United States in 1928. In 1937 Londoners purchased 9,000 television sets, allowing them to watch the coronation of King George VI.

The television is a device that can receive, decode, and display images and sounds sent over a great distance. Television images can be broadcast over radio frequencies and through closed-circuit television loops, as well as through satellite and cable transmissions. The earliest televisions could only display black-and-white analog images, although advances in technology led to color images, stereo sound, and digital signals.

“It’s the menace that everyone loves to hate but can’t seem to live without.”

Paddy Chayefsky, writer

By 1996, there were approximately one billion televisions in use around the world; a number that has continued to grow. The medium became synonymous with technological advances and significant moments of the twentieth century, and has had a profound impact on the way people communicate, interact socially, and spend their time. Its ubiquitous presence is an endless source of entertainment, news, information, and distraction, and it has played a pivotal role in social and political change around the world, used by both tyrant and liberator alike. MT

1927

Hubble’s Law

Georges Lemaître

A law that provides evidence for the expansion of the universe

Albert Einstein talking to Georges Lemaître in Pasadena, California, in 1933. At a series of seminars held there, Lemaître detailed his theory and was applauded by Einstein.

Although often attributed to him, the idea that the universe is expanding did not originate with the astronomer Edwin Hubble (1889–1953), despite his paper of 1929 detailing his work at the Mount Wilson Observatory in the hills outside Los Angeles. Using its 100-inch (250 cm) mirror, Hubble was able to detect redshifts in the light spectra of galaxies and concluded that they were moving away from each other at an almost unfathomable rate. The further away a galaxy was, he said, the faster it was traveling. His discovery also enabled him to estimate the approximate point at which this expansion began, and so the Big Bang theory of the birth of the universe was born. The problem was, however, that two years earlier his conclusions had already been arrived at and published by the Belgian-born physicist and secular priest Georges Lemaître (1894–1966).
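
The relationship Hubble reported, and Lemaître had anticipated, is a simple proportionality between recession velocity and distance; in modern notation, and with today’s approximate value of the constant (far smaller than the figures first derived in the 1920s), it reads:

```latex
% The Hubble-Lemaitre law: recession velocity is proportional to distance
\[
v = H_{0}\,d, \qquad H_{0} \approx 70\ \mathrm{km\,s^{-1}\,Mpc^{-1}},
\]
```

so that a galaxy 100 megaparsecs away recedes at roughly 7,000 kilometers per second.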

“Equipped with his five senses, man explores the universe around him and calls the adventure science.”

Edwin Hubble, astronomer

In 1927, while a part-time lecturer at the Catholic University of Louvain in Belgium, Lemaître published a paper with an exhaustingly long title: “A Homogeneous Universe of Constant Mass and Growing Radius Accounting for the Radial Velocity of Extragalactic Nebulae.” In it he postulated an expanding universe, only to be chided by Albert Einstein, who took exception to the theory: “Your math is correct, but your physics is abominable.” Lemaître even suggested a “rate of expansion,” later called the Hubble Constant. Unfortunately Lemaître’s conclusions were published in a little-known Belgian scientific journal and were largely overlooked by the worldwide astronomical fraternity.

In 1917 Einstein had produced a cosmological model based upon his theory of general relativity, but he inserted a “cosmological constant” to keep the universe static, a fix he reportedly came to describe as “the biggest blunder of my career.” In 1931 Einstein paid Hubble, not Lemaître, a visit, and thanked him for his pioneering work, although he later acknowledged Lemaître’s theory in 1933. BS

1927

The Question of Being

Martin Heidegger

An attempt to recover the true task of philosophy—the analysis of “being”

Despite the extremely dense and inaccessible nature of the text, Heidegger’s Sein und Zeit (Being and Time, 1927) led to his recognition as one of the world’s leading philosophers.

In his masterpiece Sein und Zeit (Being and Time), published in 1927, Martin Heidegger (1889–1976) reinvented the term Dasein, which is translated from the German as “being there” or “existence.” Heidegger’s term did not denote a static subject but a specific human and active consciousness that was fully engaged in the world. He remarked, “Dasein is ontically distinguished by the fact that, in its very Being, that Being is an issue for it.” This means that Dasein is not only “existence” but it is also the choice to exist or not to exist. Human beings, unlike rocks, plants, and animals, not only exist but also have a purpose that they themselves define. Dasein was finally the fundamental philosophic category that could serve as the basis for all other fundamental philosophic discussions.

“Why are there beings at all instead of nothing? That is the question. Presumably it is not an arbitrary question, ‘Why are there beings at all instead of nothing’—this is obviously the first of all questions.”

Martin Heidegger

Heidegger’s account of Dasein was substantially shaped by his reading of Aristotle, the ancient Greek philosopher who also devoted a great deal of time to discussions of being. While writing Being and Time, Heidegger rejected not only the traditional account of the separation between subject and object, which had existed in much of Western philosophy, especially after the seventeenth century, but also the philosophy of his time.

Being and Time is among the most important works produced in European philosophy in the twentieth century. It reinvigorated the study of ontology, the philosophic study of being and the nature of existence. Its publication made Heidegger a prominent public intellectual, but his sudden status had consequences. In 1933, Heidegger joined the Nazi Party and was elected the rector of Freiburg. Although he resigned a year later, numerous scholars have called attention to his support for the Nazi regime. This has presented a problem for posterity: are a philosopher’s arguments tainted by his politics or may the two be evaluated independently? CRD

1927

Uncertainty Principle

Werner Heisenberg

A challenge of measurement inherent in the quantum mechanical description of nature

Also called the principle of indeterminacy, the uncertainty principle was formulated in 1927 by Werner Heisenberg (1901–76) in the midst of debates between matrix and wave mechanic interpretations of quantum theory. Heisenberg showed that imprecision was unavoidable if a person tried to measure the position and momentum of a particle at the same time, and furthermore that the more accurately a person could determine the position of a particle, the less well known its momentum would be (and vice versa).

Heisenberg’s insight is an observation relating to waves in general. While we can measure either the frequency of a wave or its position, we cannot measure both precisely at the same time. The model of reality proposed by quantum physics, which invokes a wave-particle duality, implies an inherent uncertainty and complementarity in the structure of the universe.
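
In its modern textbook form the principle is written as an inequality between the spreads in position and momentum, with ħ the reduced Planck constant:

```latex
% Heisenberg's uncertainty relation for position and momentum (modern form)
\[
\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2},
\qquad \hbar = \frac{h}{2\pi} \approx 1.055 \times 10^{-34}\ \mathrm{J\,s},
\]
```

so the more tightly a particle’s position Δx is pinned down, the larger the spread Δp in its momentum must become, and vice versa.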

“Not only is the universe stranger than we think, it is stranger than we can think.”

Werner Heisenberg

While uncertainty is present in any subatomic system, the principle is also related to the wider issue of how actual observation and measurement of the quantum world influences that world. Quantum particles themselves are affected by light, among other things, and through testing and measuring we inevitably disturb a given system. These insights in turn led Heisenberg back into the consideration of uncertainty as an ontological reality, not merely a result of observation. A superposition of states is resolved on observation (the “wave function” collapses). Heisenberg wrote, “I believe that one can formulate the emergence of the classical ‘path’ of a particle pregnantly as follows: the ‘path’ comes into being only because we observe it.” JD

1927

The Big Bang Theory

Georges Lemaître

The universe was originally a tiny, dense point that rapidly expanded into its current state

An artwork showing the universe’s evolution from the Big Bang (top) to the present day (bottom).

The Big Bang theory of cosmology states that the entire universe began its existence as a singular, incredibly hot point with infinite density and no volume. From this singularity, the universe rapidly expanded and cooled, eventually reaching the state it is in today. This rapid expansion was not merely that of the planets, nebulae, or galaxies, but actually of space itself along with all it contains.

Questions of how humanity and the universe originated are as old as human thought. At the start of the twentieth century, scientists made great strides in understanding the universe, although its origins remained largely a mystery. In 1929, astronomer Edwin Hubble observed that large bodies in the universe were moving away from the Earth, and that they also appeared to be moving away from one another at the same time. Two years prior to that, Belgian astronomer Georges Lemaître (1894–1966) had proposed the idea that the universe originated from a primeval atom, expanding over time. After Hubble released his observations showing that the universe was expanding, implying a singular moment and point from which it all began, astronomer Fred Hoyle referred to the idea as a “big bang” in order to differentiate it from the rival picture of the universe that he himself supported, the “steady state” theory.

“ … all the matter in the universe was created in one big bang.”

Sir Fred Hoyle, astronomer

The Big Bang theory is the widely accepted cosmological model for how the universe came to be. Its impact extends well beyond astronomy and physics. It has met with much objection from religious thinkers, while others have adopted it as evidence of a single solitary moment of creation. MT

1928

Antibiotics

Alexander Fleming

The discovery of medication that destroys bacteria and prevents infection

Alexander Fleming, discoverer of penicillin, pictured in his laboratory in 1943.

Antibiotics are powerful medications or chemical compounds that do not allow bacteria to grow and spread. The human body is host to numerous types of bacteria that can cause harmful and potentially deadly infections. Antibiotics, often referred to today as antibacterials or antimicrobials, provide a way to fight these bacteria and prevent such complications.

Although humankind has used medicinal plants and concoctions for millennia, it was not until a purely accidental occurrence that the world of medicine had access to antibiotics. On September 3, 1928, Scottish bacteriologist Alexander Fleming (1881–1955) stumbled upon antibiotics when he discovered that one of his bacteria-coated glass plates contained a ring of fungus that was fending off the harmful bacteria surrounding it. That fungus was a species of the green mold Penicillium notatum and, following experiments to investigate the notion that it could be used to cure animals of infections without harm, penicillin became the first antibiotic.

It is difficult now to imagine modern medicine before the introduction of antibiotics, but prior to their widespread use beginning in the twentieth century, an infection of any kind could frequently prove fatal. Since Fleming’s discovery of penicillin, numerous other antibiotics have been discovered, invented, and improved upon. These drugs form a cornerstone of modern medicine, and their discovery and the later discovery of how to produce them in large numbers have played a key role in the development of modern pharmaceutical sciences. Penicillin alone—heralded as a miracle drug after it was introduced during World War II—has been credited with saving more than 200 million lives. While some bacteria have since developed resistance to antibiotics, these drugs changed the way we look at medicine, single-handedly demonstrating the promise that pharmaceuticals could deliver. MT

1928

Antimatter

Paul Dirac

Antiparticles that have the same mass as ordinary particles but have opposite charge

Paul Dirac demonstrates a quantum mechanical model of the hydrogen molecule in 1933.

In 1928 the English theoretical physicist Paul Dirac (1902–84) constructed a quantum theory governing the behavior of electrons and electric and magnetic fields. Although he did not appreciate it at the time, his theory required the presence of another particle possessing the same mass as the electron but having a positive, rather than a negative, charge. The theorized particle was the electron’s undiscovered “antiparticle.” Dirac’s theory was confirmed in 1932 by the U.S. physicist Carl Anderson (1905–91), who photographed the curvature of particles as they passed through a magnetic field. It was Anderson who would rightly be credited with the discovery of the positron, the electron’s antiparticle and the world’s first undeniable proof of the existence of antimatter, and he was awarded a Nobel Prize, with Victor Francis Hess of Austria, for his work.

There is no inherent difference between matter and antimatter; the laws of physics apply almost equally to both, although if the two collide they annihilate one another and cease to exist. The discovery set off the rewriting of an array of scientific textbooks. Since the beginnings of physics, “matter” had meant all that existed and all that we could see; with the addition of antimatter, that definition required narrowing. Antimatter was made of antiparticles, such as antineutrons and antinuclei. Antiprotons were later discovered at the Lawrence Radiation Laboratory in California in the 1950s, antihydrogen was created in 1995, and antihelium was detected in 2011. Dirac had opened a quantum Pandora’s box, and although it was Anderson who found the positron, its discovery was foreseen in a paper published a year earlier by Dirac, in which he predicted an as yet unseen particle that he prophetically named the “antielectron.” BS

c. 1928

Well-formed Formula

David Hilbert and Wilhelm Ackermann

The appropriate and correct rendition of a word in a formal language

The concept of a “well-formed formula” (WFF) is a key element of modern logic, mathematics, and computer science. The idea of an appropriate and correct rendition of a word in a formal language is first explicitly mentioned in David Hilbert (1862–1943) and Wilhelm Ackermann’s (1896–1962) Grundzüge der Theoretischen Logik (Principles of Mathematical Logic) in 1928. A formal language can be thought of as identical to the set of its WFFs, and the words or symbols in formal languages are governed by the rules of the formal language. The set of WFFs may be broadly divided into theorems (statements that have been proven on the basis of previously established statements) and non-theorems. Formal languages are, more specifically, artificial logical languages, governed by a highly regulated set of rules. In the formal language of symbolic logic (the logic of propositions), for example, one basic inference pattern built from WFFs, known as modus ponens, looks like this:

“Mathematics … is a conceptual system possessing internal necessity.”

David Hilbert

If P, then Q (in symbols, P → Q)

P

Therefore, Q

We can fill in the P and Q using philosopher René Descartes’s well-known dictum: “If I think, then I am. I think. Therefore, I am.”
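
To see how a formal grammar separates WFFs from arbitrary strings of symbols, here is a minimal sketch—our own illustration, not anything drawn from Hilbert and Ackermann—of a checker for a toy propositional language whose only formation rules are that any single capital letter is a WFF and that, if A and B are WFFs, so is “(A -> B)”:

```python
# A toy well-formed formula (WFF) checker. The grammar is an assumption
# made purely for illustration:
#   1. A single capital letter (P, Q, R ...) is a WFF.
#   2. If A and B are WFFs, then "(A -> B)" is a WFF.
# Anything else is not well formed.

def is_wff(s: str) -> bool:
    s = s.strip()
    # Rule 1: an atomic proposition is a single capital letter.
    if len(s) == 1 and s.isalpha() and s.isupper():
        return True
    # Rule 2: a conditional "(A -> B)" whose parts are themselves WFFs.
    if s.startswith("(") and s.endswith(")"):
        inner = s[1:-1]
        depth = 0
        for i in range(len(inner) - 1):
            if inner[i] == "(":
                depth += 1
            elif inner[i] == ")":
                depth -= 1
            elif depth == 0 and inner[i:i + 2] == "->":
                return is_wff(inner[:i]) and is_wff(inner[i + 2:])
    return False

print(is_wff("(P -> Q)"))   # True: a well-formed conditional
print(is_wff("P -> Q"))     # False in this grammar: parentheses required
print(is_wff("(P -> )"))    # False: the consequent is missing
```

Only the first string is accepted; the other two are mere sequences of symbols, not WFFs of the toy language.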

All formal languages must have a syntax: concrete definitions not only of the language’s vocabulary but also of which strings of that vocabulary count as WFFs. The concept of a WFF has become well known, for example, through the game WFF ‘N PROOF: The Game of Modern Logic, designed to teach symbolic logic to children. JE

1929

A Room of One’s Own

Virginia Woolf

A feminist text that urged women to seek their own independence

Penned by English novelist Virginia Woolf (1882–1941), one of the founders of modernism and one of the most influential of all English-speaking female authors, A Room of One’s Own (1929) is an extended essay drawn from a series of lectures the author gave at Newnham and Girton colleges in Cambridge, England, in 1928, and it remains a landmark of twentieth-century feminism.

In it Woolf lamented the lack of legal rights for women from the Elizabethan period (1558–1603) onward. Mostly the essay is a critical history bemoaning the dearth of women writers—“It would have been impossible, completely and entirely, for any woman to have written the plays of Shakespeare in the age of Shakespeare”—while at the same time praising the work of Jane Austen and the Brontë sisters. Woolf asks readers to accept the fact that women sometimes fall in love with women, and rails against society’s embedded patriarchal and hierarchical structures. Where there are gaps in the factual records about women, Woolf draws upon fictional accounts to fill them in an attempt to give balance to a history written almost entirely by men.

“A woman must have money and a room of her own if she is going to write.”

Virginia Woolf

The essay employs a fictional narrator, a woman named Mary, itself an oblique reference to a sixteenth-century figure who insisted on living outside of marriage, rejected the ideal of motherhood, and was eventually hanged for her convictions. Woolf has been criticized by some for excluding women of color from her narrative, but her overriding concern was always the need for women writers—and by default all women of all colors and backgrounds—to have their own independence. BS

1930

Hays Code

Will H. Hays

A self-regulated code of ethics that limited free expression in the U.S. movie industry

In the first decades of the twentieth century there were more than forty separate censorship boards spread across the United States, all charged with the task of making sure the nation’s filmmakers produced movies that did not stray beyond their myriad sets of moral boundaries. In 1922 the Motion Picture Producers and Distributors of America (later renamed the Motion Picture Association of America, or MPAA) was formed and the first tentative steps toward self-regulation were taken—fueled by rising complaints regarding questionable content and high-profile scandals involving Hollywood personalities—culminating in the establishment of the Motion Picture Production Code, also known as the Hays Code, in 1930. It was named for the association’s first president and chief censor, U.S. lawyer and political figure Will Hays (1879–1954), who was one of its authors along with the organization’s other censors.

For the next thirty years, the Hays Code stipulated what was morally acceptable on the big screen. A cartoon might be banned because it made light of violent crime; nudity and overt sexual behavior were not permitted; the mocking of religions was banned, as was drug use unless essential to the plot. Homosexuals were unheard of, and sex symbols kept their clothes on.

The Hays Code did not apply to the theater, which meant playwrights could broach subjects that Hollywood could not. In the 1960s, when European cinema became increasingly popular, there was pressure for the code to be relaxed; Hollywood was also under pressure to compete with television for viewers. The code was eventually abandoned in the late 1960s and was replaced with the MPAA film rating system in 1968. The Hays Code has been criticized by some film critics for being unreasonably draconian. However, it should not be forgotten that despite its regulatory grip Orson Welles made Citizen Kane (1941) and Billy Wilder gave us The Lost Weekend (1945), the first Hollywood film on alcoholism. BS

1930

Sex Reassignment Surgery

Kurt Warnekros

The development of surgical techniques to help transsexuals to be themselves

Ex-GI Christine Jorgensen, who underwent sex reassignment surgery in 1952, photographed with her fiance.

Sex reassignment surgery (SRS) is not a new concept. Operations were performed in ancient Greece and sexually permissive Rome. Also known as gender reassignment, SRS refers to a surgical procedure whereby a transsexual’s gender appearance and/or genitalia are altered to mirror those of the opposing sex and of the person they inwardly feel themselves to be. The first recipient of male-to-female surgery in the twentieth century was the Danish painter Lili Elbe, who was likely born intersex (possessing a reproductive system not consistent with males or females). In Dresden in 1930 Elbe had her testicles removed by German gynecologist Dr. Kurt Warnekros (1882–1949). Four further operations followed over the next year, including the removal of her penis and the transplanting of ovaries and a uterus, but Elbe died in September 1931 from complications shortly after the fifth procedure.

Contemporary SRS as we now know it, however, only began when new hormonal drugs and surgical procedures—including penile inversion surgery pioneered by French plastic surgeon Georges Burou—became available in the 1950s. Surgeons also began constructing vaginas using skin grafts obtained from patients’ buttocks. In 1952 Christine Jorgensen became the United States’ first nationally known transsexual after the New York Daily News printed her story on its front page under the banner headline: “Ex-GI Becomes Blonde Beauty.”

In the 1960s finding surgeons capable of performing such complex, groundbreaking surgery was not easy and led to many transsexuals resorting to desperate measures. Some, such as the pioneer transsexual Aleshia Brevard, sliced off their own testicles. “In mid-operation I was left alone on a kitchen table,” Brevard would later recount, “draped with Lysol-scented sheets. I sat up and finished my own castration.” BS

c. 1930

Rastafarianism

Marcus Garvey

A religious movement combining Christianity and pan-African consciousness

When he became emperor in 1930, Haile Selassie I was hailed as the new African messiah by Rastafarians.

Rastafarianism, often referred to as Rastafari, holds that the Judeo-Christian God, referred to as Jah, manifested in human form as both Jesus Christ and Emperor Haile Selassie I of Ethiopia. For Rastafari adherents there is no afterlife, but instead heaven exists on Earth in Africa, or Zion. Physical and spiritual immortality, or ever living, is possible for true believers, while resisting the power of white Europe and the United States (Babylon) is essential and accomplished by holding oneself as a humble, spiritual, and peaceful person.

Rastafarianism began in the 1930s in Kingston, Jamaica, when Marcus Garvey (1887–1940) started a black nationalist movement to teach Jamaicans about their origins as Africans and take pride in their roots. Garvey believed that the descendants of African slaves who lived in Jamaica and other parts of the Western world were the true Israelites, and that a new, African messiah would arise. When Ras Tafari Makonnen (1892–1975), after whom the movement named itself, became emperor of Ethiopia in 1930, many of Garvey’s followers believed that this was the sign of their impending return to Africa and that he, renamed Haile Selassie, was the physical embodiment of God.

“Rastafari means to live in nature, to see the Creator in the wind, sea, and storm.”

Jimmy Cliff, reggae musician

Today there are estimated to be fewer than one million Rastafarians worldwide. Followers are often associated with their use of marijuana, a plant that naturally grows in Jamaica and is used for medicinal and spiritual purposes. With no formal church, tenets, or dogma, Rastafarianism gave its followers a spiritual, peaceful ethos, one many nonadherents became familiar with through Jamaican music, such as ska and reggae. MT

c. 1930

Salsa

Cuban Dancers

A Latin American style of dancing with Afro-Cuban roots

Salsa (a Spanish word literally meaning “sauce”) is a Latin American dance, usually performed to salsa music, a hybrid musical form. Salsa is a fusion of Caribbean and European elements and has its origins in the Cuban son and Afro-Cuban styles of dance. The son (which combines Spanish guitar playing with the call and response tradition of African music) originated in rural eastern Cuba and spread to Havana in the early twentieth century. Historians disagree about the exact beginnings of salsa; however, musicologist Max Salazar has traced the origin of the word “salsa” to the early 1930s, when Cuban musician Ignacio Piñeiro composed “Échale salsita,” a Cuban son about tasteless food. Others suggest that the word was used as a cry to the band to up the tempo for the dancers or as a cry of appreciation for a particular solo. Salsa dance is performed by couples in four/four time, with quick, fluid movements and rhythmic footwork. It has become internationally popular since the 1980s, and continues to evolve and incorporate elements of other styles of dance.

“Salsa was an unmistakeable product of the pan-Caribbean.”

Deborah Pacini Hernandez, anthropologist

Western culture has gradually assimilated Cuban dance in general since the mid-1800s, including the conga, cha-cha, rumba, and various mambos. Cuban culture itself has absorbed a number of Iberian and African influences in its complex history, and such an interplay had produced the son montuno (a subgenre of son Cubano) style of dance, which became internationally popular in the 1930s (and was often mistakenly called rumba). This became one of the primary bases of contemporary salsa dance. JE

1930

Oscillating Universe

Albert Einstein

Cosmological theory of an eternally expanding and contracting universe

The oscillating universe theory, briefly entertained and then set aside by Albert Einstein (1879–1955) in 1930, claims that the universe is in the midst of an endless cycle of expansion and contraction. The galaxies are moving apart but will eventually reach a point of exhaustion whereby they will begin to collapse inward to a “Big Crunch” before exploding in another “Big Bang.” The universe we currently exist in may be the first, or it may be the latest in a series of collapses and rebirths.

There are a variety of cosmological hurdles to overcome for the adherent of the oscillating universe theory: there is not sufficient mass in the universe to generate the gravitational force to bring it all back for a “Big Crunch”; the speed at which galaxies are moving is simply too great for their direction to be reversed; and reduced levels of nuclear fuel will result in a loss of mass that will reduce the gravitational pull of galaxies and make a reversal impossible, and ultimately all nuclear fuel in every galaxy will be exhausted and the universe will grow cold and die in a “Big Freeze.”

“The universe is older and lighter weight than has previously been thought.”

American Astronomical Society

Arguing the merits of an oscillating universe pits one against an array of impressive science. The American Astronomical Society says the universe is far lighter than previously supposed and that “galaxies and their stars are unlikely to stop expanding.” Princeton University states: “There’s simply not enough matter to close the universe,” while the Harvard-Smithsonian Center for Astrophysics says it is “95 percent confident” there is nothing to keep the universe from continuing to balloon out forever. BS

1931

Incompleteness Theorem

Kurt Gödel

A theory proving that there are limits to what can be ascertained by mathematics

After 2,000 years of struggling to prove Euclid’s five postulates, including the idea that straight lines can be drawn infinitely in opposing directions (true but annoyingly difficult to demonstrate), the last thing mathematicians wanted published was a “theory of incompleteness.” Obsessed with a need to prove everything, mathematicians had begun the twentieth century in a wave of optimism—huge strides in mathematical theory were being taken and it seemed that, at last, the academic world was closing in on a unifying, all-encompassing “theory of everything.” The Austrian mathematician Kurt Gödel (1906–78), however, put the brakes on that optimism when he proved his theorem in 1931. “Anything you can draw a circle around,” he suggested, “cannot explain itself without referring to something outside the circle—something you have to assume, but cannot prove.”

Gödel had taken the “I am a liar” paradox (if you are a liar then saying you are a liar is itself a lie) and transformed it into a mathematical formula. And if you accept that no equation has the capacity to prove itself, it must follow that there will always be more things in life that are true than we are able to prove. And, at a fundamental level, any conceivable equation will always have to be constructed on assumptions, at least some of which will be unprovable.

According to Gödel’s theory, if you draw a circle around the universe itself, then the universe cannot explain itself. To make sense of the universe, there must be something outside of it, something that we have to assume is there but whose existence we cannot prove. Gödel’s circle may seem a simple construct but, as he acknowledged, it leaves the door open for the argument to turn decidedly theological. If God is not on the far side, then what is? BS

1931

Swing Music

African American Musicians

A style of exuberant music that became the sound of a depression-hit generation of Americans

Duke Ellington (center), pictured in 1939, had a big hit in the 1930s with “It Don’t Mean a Thing (If It Ain’t Got That Swing).” It was probably the first song title to reference the musical style.

Taken from the phrase “swing feel,” which was used to describe an offbeat or weak musical pulse, swing music began to take shape in the United States in the early 1920s when subtle, stylistic rhythm changes started to surface from within the established jazz repertoire, most notably among stride pianists. By the mid-1930s, swing had become a musical style all of its own with a sort of rhythmic “bouncing groove,” providing a new kind of energetic intensity that was a welcome relief from the everyday tribulations of the Great Depression. Some historians have traced swing music’s stage debut to 1931 and the music of the Chick Webb Orchestra, the house band of Harlem’s Savoy Ballroom in New York City.

“Ah, swing, well we used to call it syncopation then they called it ragtime, then blues, then jazz. Now it’s swing. White folks yo’all sho is a mess.”

Louis Armstrong, musician

The small ensembles common to jazz, often including a single clarinet, trombone, trumpet, piano, bass, and drums, had to be expanded to accommodate swing music’s more powerful and complex sounds: instruments were doubled, trebled, or more to give a section three or four trumpeters and a similar number of trombonists and saxophonists. All of a sudden there was more of everything; the era of the Big Band—of Duke Ellington’s “It Don’t Mean A Thing (If It Ain’t Got That Swing)”—had arrived.

More a style of music than a genre, swing is more than the sum of its technical parts, more than just music comprised of “triplet note subdivisions.” Its definition is more cultural than technical, with a sound that still makes people want to get up and dance, to move their bodies with energy and rhythm. Swing bands had strong rhythm sections with an emphasis on basses and drums, which were used as an anchor for the woodwinds, brass, and occasional stringed instruments such as violins or guitars. In the hands of later masters, such as Glenn Miller and Tommy Dorsey, it was loved by World War II’s younger generation. BS

1931

The American Dream

James Truslow Adams

The idealistic belief that all individuals have the opportunity for prosperity and success

Margaret Bourke-White’s photograph (1937) highlights the social divide of the Depression-era United States.

The phrase “the American Dream” may have only entered the popular lexicon with the publication in 1931 of The Epic of America by historian James Truslow Adams (1878–1949), but the idea was as old as the Declaration of Independence itself, embedded there in Thomas Jefferson’s grand statement that “all men are created equal,” and “endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty, and the pursuit of Happiness.” Abraham Lincoln then furthered the idea of a comfortable life for all in the 1860s by encouraging people to believe that through hard work and enterprise, everyone could achieve comfortable, middle-class lives. By the time Adams coined his well-known phrase, the United States had long been puritan lawyer John Winthrop’s “city upon a hill,” the “rags-to-riches” land of nineteenth-century children’s author Horatio Alger, and the land of opportunity for almost everyone else.

What Adams meant by “dream” was not the pursuit of wealth or status, despite finishing his manuscript in the midst of a “can-do” period, on the day President Herbert Hoover threw a switch in New York City to light up the just completed Empire State Building. Adams’s dream was of a nation with an egalitarian social order, in which people were free to realize their individual potential and be accepted for what they were regardless of class, religion, or race: “that dream of a land in which life should be better and richer and fuller for everyone, with opportunity for each according to ability or achievement.” Therein lay the problem. The idea was as diverse as the nation itself, and for African Americans for 200 years it was an unattainable goal created by whites for whites.

Keeping Adams’s national ethos alive in the twenty-first century is proving difficult. Inequality is rife, and the gap between rich and poor in the United States is as great as ever. BS

1932

Planned Obsolescence

Bernard London

The practice of deliberately designing products with a limited useful life

Discarded computer hardware pays testament to the built-in obsolescence inherent in the computer industry.

It is an idea as old as the Industrial Revolution (1760–1840): design a product with a built-in deficiency so that it becomes unusable or obsolete within a predetermined period of time. In effect, an item is “made to break” or “die.” In the 1930s executives at General Electric investigated how they might be able to decrease the lifespan of their lightbulbs, and automobile makers began discussing how they might be able to build cars with approximate “use-by” dates.

The term “planned obsolescence” entered the lexicon for the first time in 1932 in a pamphlet titled “Ending the Depression through Planned Obsolescence” written by a Manhattan real estate broker, Bernard London, who advocated writing obsolescence into law in order to stimulate domestic consumption during the Great Depression.

Planned obsolescence decreased in popularity in the late 1950s, however, as ethical concerns caused manufacturers to see themselves as con men, cheating workers out of their hard-earned money by selling them so-called “death-dated” products. Times have changed, however. Planned obsolescence is still with us, and it is becoming increasingly sophisticated. Toothbrush manufacturers, for example, now design their bristles with colorings to indicate when they should be replaced, whether they need to be or not.

Obsolescence, though, is not always “planned.” There is the obsolescence of function; for example, when the starter motor was introduced in 1913, all cars before it became obsolete overnight. There is also the obsolescence of desirability: when a product becomes dated and falls out of favor for aesthetic reasons. However, it is only planned obsolescence that brings with it ethical and moral considerations. Should products be intentionally made to fail so that businesses can sell more products and increase their profits at the expense of the hapless consumer? BS

1932

Neoliberalism

Alexander Rüstow and Louis Rougier

A movement of free market economics that rose from the turmoil of the Great Depression

“Neoliberalism” entered the English language in 1932, when the German sociologist and economist Alexander Rüstow (1885–1963) used it at a Dresden conference to describe a new generation of liberalism with beliefs that contrasted starkly with those of the civil libertarians and free marketeers of classical liberalism. Rüstow’s neoliberalism developed out of the new economics of John Maynard Keynes, which were formed in the early 1930s in response to the failure of traditional economic theories to halt or correct the effects of the Great Depression.

The core principles of neoliberalism were the freeing of private enterprise from all government and bureaucratic shackles, the abolition of any regulation that might hinder the creation of profits, reduced public expenditure on social services and infrastructure, the selling-off of state-owned enterprises into private hands, and replacing the idea of “community” with that of individual responsibility. Although espoused by many leading economists in its heyday, by the 1960s neoliberalism was advocated by only a small minority.

“The very design of neoliberal principles is a direct attack on democracy.”

Noam Chomsky, Hopes and Prospects (2010)

Neoliberalism found its earliest organized expression in the Colloque Walter Lippmann, a congress of twenty-six handpicked businessmen, economists, and civil servants who met in Paris in 1938 to push the neoliberal agenda. The movement was also given impetus by the failure of the French left-wing coalition, the Front Populaire, to transform France’s existing social and economic structures. It was by no means a unified theory. Some called it, simply, “individualism,” others “positive liberalism,” with the term “neoliberalism” only emerging for reasons of strategy and commonality. BS

1932

Dark Matter

Jan Hendrik Oort

A type of matter hypothesized to account for a large part of the universe’s total mass

In 1932 the Dutch astronomer Jan Hendrik Oort (1900–92) turned his telescope toward visible stars in the disk of our Milky Way galaxy and began to measure their velocities. The stars that Oort observed appeared to be traveling too fast considering the combined gravitational pull of the visible matter in the system in which they were embedded. He concluded that there had to be some form of hidden gravitational influence keeping them bound—some kind of hidden matter.

Simply put, dark matter is the unaccounted for, yet understood to be hidden, “stuff” of the universe, the presence of which can only be inferred by studying the gravitational effects it has on observable celestial bodies. Because it neither emits nor absorbs light or radiation, it cannot be seen, and its precise nature is still only being guessed at.

“ … the riddle is that the vast majority of the mass of the universe seems to be missing.”

William J. Broad, The New York Times (1984)

The year after Oort’s discovery, the Swiss astronomer and physicist Fritz Zwicky (1898–1974) correctly concluded, after examining the redshifts of galaxies in the Coma cluster, that the universe seemed to be made up mostly of what he called “missing mass.” Zwicky, who mapped thousands of galaxies that presently make up the Zwicky Galaxy Database, was the first to measure evidence of unseen mass using the virial theorem, which relates a system’s kinetic energy to its gravitational potential energy.

Dark matter—currently thought to consist of a new kind of subatomic particle, the discovery of which is one of the great quests of particle physics—is estimated to make up almost 85 percent of the matter in the universe. BS

1933

Religious Humanism

The Chicago 34

A new philosophy dispenses with God and puts humankind in charge of its own fate

In the 1920s a new philosophy and assortment of ethical perspectives began to emerge among academics in the United States. It represented a radical challenge to established religious beliefs and attitudes that, it was thought, had lost their significance in the twentieth century and were no longer capable of solving many of the complexities and dilemmas of life. In 1933, thirty-four humanist scholars and academics at the University of Chicago put their signatures to, and then published, A Humanist Manifesto, outlining the tenets of this new worldview: the universe is self-existing and not created, man emerged from nature, self-realization should be everyone’s ultimate goal, religion is little more than a mix of wishful thinking and sentimental hopes, and wealth needs to be evenly distributed to create equitable societies and to foster the common good. It was man, with his free will, intelligence, and power, not God, who was solely responsible for the realization of his dreams and destiny. This new philosophy was called, ironically perhaps, religious humanism.

“Humanism is a philosophy of joyous service for the greater good of all humanity.”

Linus Pauling, Nobel Prize-winning scientist

Religious humanism, like the religions it hoped to replace, had its own tenets, doctrines, and articles of faith (such as an unquestioning belief in evolution, and the separation of church and state), though many of its adherents would argue that it is more about perspectives and attitudes than dogma. Although humanist philosophies predate Socrates, the development of an organized school of thought has been surprisingly late in flowering. Today there are an estimated five million secular humanists throughout the world. BS

1934

Falsifiability

Karl Popper

An argument that unless a theory has the capacity to be falsified, it cannot be science

In science, a theory is falsifiable if it has the capacity to be proven false—if some possible test or observation could, sooner or later, show it to be wrong. If a theory is not capable of being proven false, then scientists should place it in the category of pseudoscience. For example, the statement that you would be a lot happier if you were twenty years younger is not falsifiable, because it is not possible to make yourself younger and so test the hypothesis.

The influential Austrian-born philosopher Karl Popper (1902–94) first introduced his theory of potential falsifiability in Logik der Forschung (The Logic of Scientific Discovery, 1934). Popper held that in science first comes observation, then speculation, then falsification. Popper was uncompromising on the issue of falsifiability and not even the giants of the scientific world escaped his scrutiny. Himself a “critical rationalist,” Popper criticized neurologist Sigmund Freud (1856–1939) for what he considered his “unscientific methods,” and even called to account Darwinism, saying that evolution was “not a testable scientific theory but a metaphysical research program,” little more than a framework for future scientific inquiry.

“Our theory of evolution … cannot be refuted by any possible observations.”

L. C. Birch and P. R. Ehrlich, Nature (1967)

Popper rejected confirmation as a means of proving theories, using his “white swan” concept as an example. No matter how many white swans are observed, it is unscientific to say that “all swans are white.” To verify the statement, someone would have to travel the world and catalog and account for every swan in existence—sheer logistics make verification impossible. To falsify it, however, just one black swan is all a person needs to find. BS

1934

Weak Force

Enrico Fermi

A fundamental force involved in the radioactive decay of subatomic particles

Enrico Fermi photographed in 1948 with equipment for the particle accelerator at the University of Chicago.

One of the four fundamental forces in nature, the existence of weak force was first hypothesized by Enrico Fermi (1901–54) in 1934. It is responsible for the decay of nuclear particles and is essential for the nuclear fusion that causes the sun’s heat. It was also crucial in the process of nuclear synthesis that produced the heavy elements of which we are all composed.

The discovery of elementary particles and their interactions has often been triggered by difficulties in explaining the behavior of already observed particles. With the discovery of the neutron in 1932, the atom was known to consist of protons, neutrons, and electrons. Yet it soon became apparent that the actions of these particles could not be adequately described by the forces then known. Radioactivity occurs when the nucleus decays, and some of it can be explained by the electromagnetic force; but that force could not account for a form of radiation called beta decay, in which a neutron is changed into a proton. As the strong nuclear force was already known and could not account for it either, Fermi was led to propose a weak force that would involve a particle called the neutrino, which was finally detected in 1956.
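
In the standard modern notation (not taken from Fermi’s own paper), beta decay can be written as a reaction in which a neutron becomes a proton while emitting an electron and an electron antineutrino—the elusive particle that Fermi’s theory required:

```latex
% Beta-minus decay, mediated by the weak force: a neutron becomes a
% proton, an electron, and an electron antineutrino.
n \;\longrightarrow\; p + e^{-} + \bar{\nu}_{e}
```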

“The weak force operates in a variety of radioactive transmutations …”

Dr. Paul Davies, New Scientist (August 9, 1979)

The weak force could not be fully described, however, until other particles were found to mediate these weak interactions. The existence of these W and Z bosons was confirmed by particle accelerator experiments in 1983. These high-energy experiments have also shown the weak force to be remarkably powerful over very short distances, having a strength similar to that of the electromagnetic force. This led to a new theory that unifies the two forces as the electroweak force. TJ

1934

Orgone Energy

Wilhelm Reich

A hypothetical life force composed of a massless, omnipresent substance

One of the more radical figures in psychiatry, the Austrian-born psychoanalyst Wilhelm Reich (1897–1957) put forward the notion that there was a secret, unseen energy present throughout the universe, which he called “orgone energy”—a combination of the words “orgasm” and “organism,” hardly a surprise, perhaps, for a man who was a disciple of the Freudian idea that libidinal energy was the energy of life itself. Reich, however, carried Sigmund Freud’s theory to bizarre extremes, isolating himself from mainstream psychiatry after massaging his disrobed patients in order to help dissolve their “muscular armor.”

Orgone, which has never been proven to exist, is supposedly an omnipresent form of energy, a life force that is in constant motion and is, according to Reich, responsible for everything from the color of the sky and the presence of gravity to the failure of political revolutions and the satisfaction inherent in orgasm. He claimed he had seen transitional, microscopic beings that he called bions (which only staunch orgonomists can see). He claimed to have invented a new science, and felt that mainstream scientists only attacked his work because they felt it too “emotionally disturbing.”

“It is sexual energy which governs the structure of human feeling and thinking.”

Wilhelm Reich

Reich also constructed what he called “orgone accumulators”—metal and organic-lined boxes that he said “trapped” orgone energy and harnessed it for use in biology, medicine, and even meteorology. Reich died in the Federal Penitentiary in Lewisburg, Pennsylvania, on November 3, 1957—imprisoned for criminal contempt when he refused to stop selling his discredited “orgone accumulators.” BS

1935

Impossible Object

Oscar Reutersvärd

Illusional drawings that challenge our perceptions as to what is real—while the eye accepts them as eminently feasible, the mind knows them to be impossible

A concept derived by Swedish artist Oscar Reutersvärd, this figure has become known as the Penrose triangle, named for the mathematician, Roger Penrose, who popularized it.

An impossible object is an undecipherable figure, an optical illusion represented by a two-dimensional drawing that our brain immediately perceives as three-dimensional despite its existence being a geometric impossibility. Impossible objects can take many forms. They can include ambiguous line drawings, such as the Necker cube, possibly the world’s first impossible object, which was drawn by the Swiss crystallographer Louis Necker in 1832. They can present false perspectives, such as lines seeming to be parallel when they are in fact tilted, or depth ambiguity as seen in the well-known Penrose stairs, drawn by the English mathematical physicist Roger Penrose. Comprising a set of steps that connects back onto itself with four ninety-degree turns in a continuous loop, the drawing made it appear possible to ascend or descend the staircase infinitely without, in the process, ever getting any higher or lower.

“The impossibility of most impossible objects is not immediately registered, but requires scrutiny and thought before the inconsistency is detected …”

Irving Biederman, neuroscientist

The “father of the impossible object,” however, was the Swedish artist Oscar Reutersvärd (1915–2002), who drew his first object—the “impossible triangle”—during an idle moment in a Latin lecture at Stockholm University in 1934, and which Penrose would later describe as “impossibility in its purest form.” Reutersvärd went on to produce more than 2,500 such drawings over his lifetime.

Impossible objects have garnered the attention of psychologists because they provide insights into the nature of perception. Computer scientists use them to develop models of our perceptive abilities, and mathematicians study their abstract structures. Moreover, they are whimsical conundrums, tricking our minds by initially giving us the impression of a known spatial object before their subtle geometry begins to confound us with images that at first seem possible, and yet are fundamentally and forever ambiguous. BS

1935

Alcoholics Anonymous

Dr. Robert Smith and William B. Wilson

An organization established to assist alcoholics with their recovery

Alcoholics Anonymous (AA) is an organization whose fundamental intuition is that addicts require a conversion moment. This breakthrough is needed in order for them to recognize their own fallibility and relinquish control to God, a higher power, or whatever they conceive that to be. Recovery, one day at a time, involves the support of other members but with their identities protected. Former addicts can feel they are in a safe space in which to consider how to make amends for past harm.

On June 10, 1935, Dr. Robert Smith (1879–1950), an alcoholic surgeon, finally succumbed to the urgings of his new friend and former alcoholic, William B. Wilson (1895–1971), to turn his fight with alcohol over to God. This became the founding day of the organization. Medical expertise had shrugged its shoulders, admitting that clinical treatment of alcoholics was doomed to failure and the only “cure” was a spiritual awakening. “Bill W.” could personally testify to this when he experienced a flash of insight while attending the Oxford Group, a newly founded Christian sect. Both he and “Dr. Bob” acknowledged the need for a more ecumenical approach, and this led to Wilson developing his quasi-religious “Twelve Steps.”

In 1939 Wilson wrote Alcoholics Anonymous, known as the “Big Book” by AA members. The Big Book’s methods have been emulated by those seeking recovery from almost any potentially obsessive activity, from sex to shopping. How it works and who it works for, however, remain unanswered questions. Designed to be every bit as habit-forming as the initial addiction, its anonymity and “no dues or fees” rules keep it accessible and safe. A lack of hierarchical structure fosters a feeling of egalitarianism. The idea’s success may have nothing to do with the steps, however. It may derive, instead, from the powerful sense of belonging that membership creates. LW

1935

Keynesian Economics

John Maynard Keynes

A macroeconomic model that supports government intervention in fiscal policy

It is difficult to overestimate the significance of the economic theories of John Maynard Keynes (1883–1946). Finally, after 150 years of the laissez-faire economics of Adam Smith, the eighteenth-century founder of contemporary economics, along came a set of principles, honed in the fires of the Great Depression, that turned traditional economics on its head. Keynes outlined his ideas in The General Theory of Employment, Interest and Money, published in 1935–36. Instead of allowing the market to govern itself, Keynes argued that governments needed to get more involved in economic management. They needed to spend more, regulate more, and intervene more. In depressions or downturns, governments should inject their economies with cash so that spending can increase. More money needed to be printed and pumped into society so that people would purchase and consume more. As far as Keynes was concerned, when spending increases so do earnings. In turn, higher earnings then lift demand and productivity, and pull economies out of recession. Once good times return, borrowed monies can be repaid.

“All production is for the purpose of ultimately satisfying a consumer.”

John Maynard Keynes

U.S. President Franklin Roosevelt put Keynes’s ideas into action in the mid-1930s with a massive public works program, but it was the United States’ entry into World War II in 1941 that seemed to validate his theories, boosting productivity and laying the foundations for U.S. postwar economic prosperity. Keynes’s theories are still debated today, and his approach of pump-priming the economy was returned to during the recent global financial crisis. BS

1935

Lobotomy

António Moniz and Pedro Almeida Lima

A surgical procedure that severed the brain’s neural pathways with serious side effects

Dr. Walter Freeman, who specialized in lobotomy in the United States, performs the procedure in 1949.

A lobotomy, or prefrontal leukotomy, is a neurological procedure that involves severing the neural pathways that lead to and from the brain’s prefrontal cortex. The world’s first lobotomy on a human was performed in 1935 by the Portuguese neurologist António Moniz (1874–1955) and neurosurgeon Pedro Almeida Lima (1903–85), who were inspired by the successful removal of frontal-lobe tissue from chimpanzees by two U.S. neuroscientists, John Fulton and Carlyle Jacobsen, earlier that same year. Moniz and Lima drilled two holes into the head of their patient, and injected ethanol into the prefrontal cortex. It was hoped the ethanol would interfere with the brain’s neuronal tracts, which were believed to encourage the processing of irrational thoughts common to the mentally ill. The paranoia and anxiety that the patient was suffering underwent a marked decrease after the groundbreaking operation, which would likely not have been possible were it not for Moniz’s own work eight years earlier while at the University of Lisbon developing cerebral angiography, a technique allowing for visualization of the blood vessels surrounding the brain.

Moniz, convinced that patients who exhibited obsessive compulsive disorders were suffering from “fixed circuits” in their brains, later designed an instrument called a leukotome, which he used to remove brain matter and so disrupt the neuronal tracts more efficiently. The U.S. neurologist Walter Freeman, impressed by Moniz’s work, performed the first lobotomy in the United States in 1936; he claimed that the debilitating effect the surgery had on patients needed to be weighed against a lifetime of padded cells and straitjackets. Freeman invented an ice pick-like instrument for performing the surgery. In 1949 Moniz shared a Nobel Prize for his pioneering work; however, the procedure began to fall out of favor in the 1950s with the introduction of a new generation of psychiatric (or antipsychotic) medicines. BS

1935

The Holocaust

Adolf Hitler

The state-sponsored killing of Jews and others by Nazi Germany

The arrival of a train transporting deported Jews at the Auschwitz death camp in Poland.

The Holocaust refers to the wholesale murder of six million European Jews and millions of others, including Gypsies, during World War II. The word was chosen (from the Greek meaning “sacrifice by fire”) because the ultimate manifestation of the Nazi killing program was the extermination camps where victims’ bodies were consumed in crematoria. The biblical word “Shoah” became the Hebrew term for the genocide in the 1940s.

Although the word “Holocaust” itself was only chosen later to describe the Nazi genocide, the ideology that led to it commenced with the Nürnberg Laws enacted on September 15, 1935. The Law for the Protection of German Blood and German Honor and the Reich Citizenship Law were the centerpiece of anti-Semitic legislation and they were used to categorize Jews in all German-controlled lands. Categorization was the first stage of what the Nazis called “the final solution to the Jewish question.” Besides Jews, trade unionists and social democrats were among the first to be arrested and incarcerated in concentration camps. Gypsies were the only other group that the Nazis systematically killed in gas chambers alongside the Jews. In 1939, the Nazis instituted the T4 Program, euphemistically a “euthanasia” program, to exterminate Germans with intellectual disabilities, physical disabilities, and mental illnesses, who were deemed to violate the Nazi ideal of Aryan supremacy. After the invasion of Poland in 1939, German occupation policy sought to systematically destroy the Polish nation and society.

In the immediate aftermath of the Holocaust, the International Military Tribunal in Nuremberg, Germany, was constituted. In October 1945, the defendants were formally indicted for crimes against peace, war crimes, crimes against humanity, and conspiracy to commit these crimes. The Nuremberg tribunal set precedents in seeking justice, however inadequate, for victims of atrocities. BC

1935

The Radiant City

Le Corbusier

A radical new vision for the skylines of our modern cities

Le Corbusier photographed standing behind one of his high-rise architectural models in c. 1950.

In the 1920s the Swiss-born, modernist architect Le Corbusier (1887–1965) began formulating and refining his ideas on futuristic inner city living, with cruciform skyscrapers sheathed in glass, steel, and concrete that would, he believed, create a more efficient, classless society. In 1935 he published his ideas under the title La Ville Radieuse (The Radiant City).

Toronto in Canada embraced Le Corbusier’s vision in a number of high-rise communities with appealing names such as Regent Park and Parkway Forest. Set well back from surrounding streets and footpaths, these developments left residents feeling no connection to their environment, inhabiting structures separated by broad, mostly empty streets that were in fact just driveways devoid of pedestrian traffic, shops, or any other kind of “pulse.”

“ … To bring back the sky. To restore a clear vision of things. Air, light, joy.”

Le Corbusier

Whereas most observers who looked at the skyline of 1930s Stockholm saw overwhelming beauty and grace, Le Corbusier saw only “frightening chaos and saddening monotony.” In 1925 he had proposed bulldozing most of central Paris north of the Seine and replacing it with his cruciform towers. In the 1930s and 1940s he attempted to implement his vision of an ideal city by building a series of unités—the housing block unit of his Radiant City. The best-known example of these was the Unité d’Habitation (Housing Unit), constructed in 1952. Although Le Corbusier’s designs were initially seen as utopian and geared to improving living conditions in urban areas, his work was later criticized for being soulless and totalitarian, and his vision has become associated with the alienating effects of modern urban planning and architecture. BS

1935

Schrödinger’s Cat

Erwin Schrödinger

A thought experiment used to question the status of subatomic particles

Schrödinger’s cat is a paradoxical experiment in quantum mechanics devised in 1935 by the Austrian physicist Erwin Schrödinger (1887–1961). It is a thought experiment designed as a response to the Copenhagen interpretation, which implies that subatomic particles can exist in two distinct states up until the moment they are observed. To demonstrate his own skepticism of the Copenhagen interpretation, Schrödinger conducted a hypothetical experiment using a Geiger counter, a radioactive substance, a flask, a hammer, some acid, a steel chamber, and a cat.

The cat is placed inside the chamber so it cannot be seen, along with the aforementioned items. If the radioactive substance decays sufficiently over the course of the test period, the hammer shatters the flask, which contains hydrocyanic acid (a poisonous gas), and the cat dies. But has the flask been shattered? Is the cat dead? While the chamber remains sealed, it is impossible to know. According to superposition and the Copenhagen interpretation designed to explain it, particles can simultaneously exist in all their possible states; therefore, the cat inside the chamber is, until observed, both alive and dead. But how long is a superposition supposed to exist? At what point does the poor cat become either demonstrably alive or very dead? The Objective Collapse theory says superpositions end spontaneously, meaning the cat will be alive or dead long before the chamber is opened. In any case, just because something may occur on the subatomic level hardly means it can be replicated with a cat. BS

“When I hear about Schrödinger’s cat, I reach for my gun.”

Stephen Hawking, physicist

1936

The Work of Art in the Age of Mechanical Reproduction

Walter Benjamin

An assessment of the impact of mass reproduction on the unique work of art

A photographic portrait of Walter Benjamin, taken in 1925. Tragically, he later committed suicide in 1940 while attempting to escape from the Nazis in France.

It may have been written more than seventy-five years ago, but the German literary critic Walter Benjamin’s (1892–1940) essay “The Work of Art in the Age of Mechanical Reproduction” (1936) remains a seminal and oft-debated work. Benjamin argued that the meaning of a piece of art alters as the means of creating it alter. Composed in an era of significant technological advancements, he saw the advent of mechanical (and therefore mass) reproduction as a progression from one-off pieces that had characterized art since its inception. Benjamin believed that greater production meant that more people could be introduced to art than ever before. Art, for the first time, would lose the “aura” it had acquired after centuries of depicting religious themes. No longer would it be there only for the privileged few; no longer would it be so authoritarian. It would, at last, be there for us all.

“For the first time in world history, mechanical reproduction emancipates the work of art from its parasitical dependence on ritual.”

Walter Benjamin

Benjamin felt it was important not to view reproductions, which could be mass-produced in books and other media, as lesser works simply because they lack the aura imparted to one-off masterworks by their inherent exclusivity. Consider photography, he told us. It makes no sense to make a single photograph from a photographic negative—there is no “original” photograph. Every identical image, no matter how many copies are made from that same negative, is as valid as the next. The same argument might be applied to film. People needed, in response to the new technological age in which they now found themselves, to alter how they saw and perceived art. In a time of extreme ideologies, such as fascism and communism, that used art as propaganda, Benjamin sought to de-politicize art, to make it beautiful—and truly universal. His arguments remain pertinent today, particularly in relation to contemporary debates about the opportunities for participation in art that are offered by electronic media. BS

1936

Turing Machine

Alan Turing

A hypothetical machine that provided the basis for modern computers

The Turing machine was a theoretical construct first described by its inventor Alan Turing (1912–54) in his paper “On computable numbers, with an application to the Entscheidungsproblem” (1936–37). According to Turing’s explanation, it showed that machines could provide calculations from “an unlimited memory capacity obtained in the form of an infinite tape marked out into squares, on each of which a symbol could be printed. At any moment there is one symbol in the machine; it is called the scanned symbol. The machine can alter the scanned symbol, and its behavior is in part determined by that symbol, but the symbols on the tape elsewhere do not affect the behavior of the machine.” In other words, Turing’s machine was able to read a paper tape and the symbols it contained, and carry out certain operations on the tape. The machine was also able to store a limited amount of information. This information enabled the machine to decide what to do as it scanned each symbol: change the information it had stored, write a new symbol onto the current tape cell, and/or move one cell left or right. The machine could also decide to halt; when solving a mathematical query, this was the point at which the answer had been reached.
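
As a concrete illustration—ours, not Turing’s—the following sketch simulates such a machine in a few lines of modern code. The transition table format and the example machine (which simply inverts the 0s and 1s on its tape and halts at the first blank) are hypothetical, chosen only to show the cycle of scanning a symbol, writing, moving, and changing state described above:

```python
# A minimal Turing machine simulator: a tape of symbols, a scanned cell,
# and a transition table mapping (state, scanned symbol) to
# (symbol to write, direction to move, next state).

def run_turing_machine(tape, transitions, state="start", blank=" "):
    cells = list(tape)
    head = 0
    while state != "halt":
        symbol = cells[head] if head < len(cells) else blank
        write, move, state = transitions[(state, symbol)]
        if head < len(cells):
            cells[head] = write
        else:
            cells.append(write)
        head += 1 if move == "R" else -1
    return "".join(cells)

# A hypothetical example machine: invert 0s and 1s, halt on a blank.
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", " "): (" ", "R", "halt"),
}

print(run_turing_machine("0110 ", invert))  # prints "1001 " — the bits inverted
```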

“Everyone who taps at a keyboard … is working on an incarnation of a Turing machine.”

Time magazine (March 29, 1999)

Turing was a British mathematician and computer scientist who later worked at Britain’s codebreaking cypher school at Bletchley Park during World War II (1939–45). He created his “theoretical computing machine” to act as a model for complex mathematical calculations. It could be set to run indefinitely, to operate in a loop, or to continue until it arrived at a given set of conditions. By the time that digital computers began to be developed in the 1940s, Turing’s papers on how they would work were already ten years old. His machine was the first step in answering a fundamental question of computer science: what does it mean for a task to be computable? BS

1938

Nausea

Jean-Paul Sartre

A novel that is one of the canonical works of existentialism

The front cover of a 1977 edition of Jean-Paul Sartre’s La Nausée (Nausea, 1938).

Set in 1930s France, La Nausée (Nausea, 1938)—the first novel by the existential philosopher Jean-Paul Sartre (1905–80)—is an account of the life of fictional historian Antoine Roquentin, who after years spent traveling the world returns to his native France. He settles down in a seaside town with the intention of continuing his research into the life of an eighteenth-century diplomat, Marquis de Rollebon. However, the Marquis ceases to be a historical figure as Roquentin brings him from the past into the present, and absorbs his subject slowly into himself: “I could feel him,” Roquentin says, “like a slight glow in the pit of my stomach.”

The “problem” with Roquentin is that he is compulsively introspective, to the point at which he observes even himself as little more than an object. He strives to conform his consciousness to the world around him but fails. He becomes depressed, and a nauseous, “sweetish sickness” begins to descend upon him. In bringing the Marquis into the present in a quest for completeness, Roquentin—and Sartre—is rejecting, in true nihilist fashion, the psychoanalytic notion that to ignore the past is to annihilate one’s own roots. Roquentin, however, triumphs over the world’s indifference to man’s aspirations, and commits to using his freedom to create his own meaning.

“Nausea gives us a few of the … most useful images of man in our time that we possess.”

Hayden Carruth, poet

So, is Nausea a novel or a work of philosophy? French writer Albert Camus felt it an “uneasy marriage” of the two. The U.S. poet Hayden Carruth described it as a “proper work of art.” Certainly it acknowledges that life can be meaningless, but also that it can be imbued with meaning through the choices we make. BS

1938

People’s War

Mao Zedong

An uprising of the peasantry in a struggle to overthrow an entrenched ruling class

The idea of a “people’s war” emerged in its purest form in mid-1920s China, when the young revolutionary Mao Zedong (1893–1976) began to comprehend the latent potential for rebellion among China’s peasantry. It would not be until the late 1930s, however, that Mao’s thoughts on precisely what a people’s war should be, and how it should be fought, finally came together. His On Protracted War (1938) referred to three essential dogmas: first, that any rebellion would be rooted in the peasantry and promulgated from remote, rural bases; second, that the battles to be fought would be more political and ideological than military in nature; and third, that it would need to be a protracted war, with very clear stages and agreed long-term goals. Mao also believed people to be spiritual, strong, and resourceful beings, much more powerful than weapons, which he saw as lesser, “material” things.

“Weapons are … not the decisive factor; it is people, not things, that are decisive.”

Mao Zedong

The primary goal of Mao’s people’s war was political mobilization, and the greatest challenge was to make his political objectives known to all. Mao was acutely aware that the peasantry yearned for a measure of economic prosperity, and so his economic reform program also had to be seen to offer the peasants a new sense of dignity and identity. It followed that the accomplishments of the leadership had to be made unambiguously clear to the people, as did its policies and objectives. The people were brought into the Communist Party fold by being made aware of, and being encouraged to identify with, the party’s aims and ideals. Thus they could see that their fate and the fate of the party were one and the same. BS

1939

The Birthday Problem

Richard von Mises

A mathematical problem with a solution that seems paradoxical

One of the best-known problems in the world of probability theory is what the U.S. mathematician Richard von Mises (1883–1953) called his birthday problem—calculating the likelihood of people in a given group sharing the same birthday. So what is the probability of two people sharing their special day, or three people, or four or more, and why does it matter? Prior to any analysis, two assumptions must be made: leap years are ignored—the year in question is always 365 days long—and birthdays are taken to be spread uniformly throughout the year. And so …

“ … we tend to mistake the question for one about ourselves. My birthday.”

Steven Strogatz, The New York Times (October 1, 2012)

Person A has a birthday, and the chance that Person B shares it is 1 in 365: Person A’s birthday can fall on any day (365/365), and Person B must match it (1/365), which works out at roughly 0.27 percent. Adding more people increases the chance of finding two with the same birthday, but at first not by much: with three people the likelihood of a shared birthday is still under 1 percent, and with five it is still under 3 percent. These low probabilities seem intuitive enough, but the confounding beauty of the birthday problem is that the odds then climb far faster than intuition suggests: working through the calculation shows that only twenty-three subjects are needed to reach a 50 percent probability (50.73 percent, to be precise) that two people in a group will share the same birthday.
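The arithmetic is easy to verify. The short Python sketch below (a minimal illustration, using the same assumptions as above of 365 equally likely birthdays and no leap years) computes the probability that at least two people in a group of n share a birthday:

```python
# Probability that at least two of n people share a birthday,
# assuming 365 equally likely birthdays and no leap years.

def probability_shared_birthday(n):
    p_no_match = 1.0
    for k in range(n):
        p_no_match *= (365 - k) / 365   # each new person must avoid all earlier birthdays
    return 1 - p_no_match

for n in (2, 5, 23, 50):
    print(n, round(probability_shared_birthday(n) * 100, 2), "%")
# 2 people  -> about 0.27 %
# 5 people  -> about 2.71 %
# 23 people -> about 50.73 %
# 50 people -> about 97.04 %
```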

Only twenty-three, when one has 365 possibilities? Here is why we find the birthday problem so confounding: our intuition fixes on one person’s birthday, when what matters is the number of possible pairs in the group, and twenty-three people already yield 253 pairs. It has the look of a paradox when in fact it is a straightforward, demonstrable calculation. BS

1940

MMPI

Starke Hathaway and John McKinley

A set of questions designed to provide revealing insights into our psyches

One of the most widely used personality assessment techniques in the world of mental health, the MMPI, or Minnesota Multiphasic Personality Inventory, was designed in 1940 to assess the psychopathology and personality traits of patients with suspected clinical and mental health problems, and to indicate how they might respond to possible treatments, including shock therapy. The original MMPI was based on more than 1,000 questions, or “items,” collected by its originators, psychologist Starke Hathaway (1903–84) and psychiatrist John McKinley (1891–1950) of the University of Minnesota, who gleaned them from psychology textbooks and clinical records of the time before reducing them to a “workable” 550.

“1. I like mechanics magazines

2. I have a good appetite …”

MMPI-2 true or false test questions

New scales have since been added to improve the accuracy of interpretation, and in 1989 the MMPI was revised as the MMPI-2 (comprising 567 items) to reflect our changing characteristics. The MMPI and its variants, though not perfect tools in the diagnosis of mental illness, have nevertheless retained their relevance. They have been used in court rooms, and as a screening technique in the selection of applicants for a variety of occupations, including airline pilots, police officers, and roles in the U.S. Secret Service. The MMPI-2 is applied widely around the world, and it has been translated into more than thirty-four languages. A diagnosis should never be made on the strength of the MMPI alone, but it remains a key tool in determining a person’s degree of social or personal maladjustment. BS

1940

How to Read a Book: The Art of Getting a Liberal Education

Mortimer Adler

A guide to reading comprehension for the general reader

A photograph of Mortimer Adler in 1947, surrounded by his “Great Ideas”—a selection of 102 of the most important concepts in Western literature and thought.

U.S. philosopher and educator Mortimer Adler (1902–2001) received a scholarship to Columbia University in the 1920s; although he completed his undergraduate degree requirements in philosophy, he was never granted a BA. Nonetheless, he enrolled in the graduate program, was granted a PhD, and became a professor of the philosophy of law at the University of Chicago. While teaching, Adler came to believe that students needed to be grounded in the “great ideas” embodied within the great classics. He was an advocate of reading and discussing great books in the pursuit of a liberal education, and it was this that led him to publish his work, How to Read a Book: The Art of Getting a Liberal Education (1940).

“In the case of good books, the point is not to see how many of them you can get through, but rather how many can get through to you.”

Mortimer Adler

How to Read a Book is divided into four parts. Part one covers what Adler termed the first two levels of reading: elementary and inspectional reading. Part two contains the third level of reading: analytical reading. Part three tells how to read different types of literature, including practical books, imaginative literature, stories, plays, poems, history, philosophy, science, mathematics, and social science. The final part of the book is dedicated to the ultimate goals of reading. The first goal is syntopical reading—the reading of different works on the same subject with a view to constructing a general understanding of that subject; the last goal of reading is to expand one’s mind for further understanding.

Adler believed that, in a democracy, people must be enabled to discharge the responsibilities of free men and that liberal education is an indispensable means to that end. He wrote that the art of reading well is intimately related to the art of thinking clearly, critically, and freely. Adler described How to Read a Book as a book about reading in relation to life, liberty, and the pursuit of happiness. He went on to found the Great Books of the Western World series with Robert Hutchins in 1952. BC

1941

Three Laws of Robotics

Isaac Asimov

A fictional set of laws for governing the behavior of artificially intelligent beings

Runaround (1941), the short story in which Isaac Asimov first introduced his Laws of Robotics, was one of nine science fiction short stories featured in his collection, I, Robot (1950).

In 1941 the great science fiction writer Isaac Asimov (1920–92) wrote Runaround, the latest in a series of short stories about robots with positronic, humanlike brains in the era of space exploration. In Runaround, two astronauts and a robot, SPD-13, are sent to an abandoned mining station on Mercury. While there, SPD-13 is dispatched to collect selenium, a compound necessary to power the station’s life-giving photo-cells, but conflicting imperatives leave him confused and unable to operate properly under the three Laws of Robotics. These laws are: 1) A robot must not injure a human being or allow a human to come to harm; 2) A robot must obey orders given to it by a human, except where such orders would conflict with the First Law; and 3) A robot must protect its own existence as long as doing so does not conflict with the First or Second Laws.

“Many writers of robot stories, without actually quoting the Three Laws, take them for granted, and expect the readers to do the same.”

Isaac Asimov

Over the course of his career, Asimov’s position on the inviolability of the Three Laws varied, from seeing them as mere guidelines through to uncompromising subroutines hardwired into the robot’s brain. The Three Laws lifted robots out of the ranks of mindless, Frankenstein-like machines and creatures with no guiding principles that had characterized horror and science fiction for decades, and gave them the capacity to wrestle with moral dilemmas. In 1985 Asimov added a fourth law, known as the “Zeroth Law,” to precede the First Law: a robot may not harm humanity, or, by inaction, allow humanity to come to harm.

Asimov’s laws were so highly regarded that it was thought they would, in the real age of robotics to come, be a foundation stone of a positronic Brave New World. The reality is that no computer or robot has so far had the Three Laws built into it. They were, and remain, little more than imaginary literary devices, designed to further the plot of some of the finest science fiction novels ever written. BS

1942

Epigenetics

C. H. Waddington

The study of modifications in genes that do not involve changes in DNA sequence

A computer artwork of a U1 Adaptor (orange) silencing genes and disrupting DNA code.

The biologist C. H. Waddington (1905–75) coined the term “epigenetics” in 1942, calling it the “biology which studies the causal interactions between genes and their products which bring the phenotype [an organism’s observable characteristics] into being.” Many geneticists at that time believed that there was a simple correspondence between an organism’s genes and its characteristics; Waddington, however, argued that the phenotype of an organism was determined by the interaction of many genes with each other and with the environment.

The definition of epigenetics has since altered over time, but it is safe to say that the science of modern epigenetics is the study of alterations in gene activity not derived from changes to the genetic code. Or, to put it another way, epigenetics attempts to understand what can influence the development of an organism other than the usual DNA sequencing.

How genes are expressed is governed by the epigenome, a layer of chemical markers that sits, in effect, on top of the genome. These markers act like “tags” on our DNA, telling genes how to behave and carrying information crucial to the genome’s development—information that can then be passed down to following generations. For example, scientists know that if a pregnant woman eats poorly, her developing baby will suffer nutritional deficiencies after birth. But could such transfers be carried a step further, to before pregnancy? How likely is it that the effects of a nutritional or dietary deficiency—from a prolonged famine, say—could be passed down through subsequent generations?

Epigenetics is helping us to understand how a parent’s diet or illness can in effect switch their child’s genes on or off long before the mother ever falls pregnant. Our genetic slates are no longer conveniently wiped clean with each successive generation, and it is no longer simply a case of “you are what you eat.” BS

1942

The Myth of Sisyphus

Albert Camus

The notion that hoping to gain a sense of meaning from the universe is absurd

Destined to forever push a boulder up a hill, Sisyphus represents man’s futile search for life’s meaning.

In ancient Greek mythology, Sisyphus, a king of Ephyra (Corinth), was condemned to an eternity of futility and toil by the gods. He was forced forever to push a boulder up a steep hill, only for it to roll back down each time he neared the summit, so that his task is never complete. This endless, repetitive toil that culminates only in failure sums up everyone’s life, according to French philosopher and novelist Albert Camus (1913–60). The universe in which we live offers no chance of a life of meaning or purpose, although by understanding this we can obtain whatever measure of meaning is possible.

Camus published his essay, The Myth of Sisyphus, in 1942. In it, he described how mankind can best live in the light of what he calls “the absurd.” Absurdity arises when humanity attempts to extract some meaning from a meaningless universe that offers only a life of endless toil followed by inevitable death. A life such as that, according to Camus, allows no purpose or meaning unless a person chooses to make an irrational leap of faith and adopt some religious credo. Any person who does not might otherwise conclude that the only real option is suicide.

In The Myth of Sisyphus, Camus argues that meaning can only come from accepting the toil and absurdity of life at face value. Yes, all life ends in death, and so it is only by being free to think and behave as we wish, pursuing our passions and rebelling against passive resignation to our certain fate, that human beings can achieve any true measure of happiness.

Camus’s myth of Sisyphus is often grouped, fairly or not, with existential philosophy and its reflections on the nature of being, or ontology. For anyone struggling to find meaning in a world perceived as fundamentally meaningless, Camus suggested that defiance of meaninglessness could be the foundation on which meaning could be built. MT

1942

The Manhattan Project

Franklin D. Roosevelt

A scientific research program established to develop the first atomic weapon

The Manhattan Project was the codename given to the U.S. research and development program that created the first nuclear weapons. It lasted from 1942 to 1945, resulting in the detonation of the first atomic device on July 16, 1945, at a site near Alamogordo, New Mexico.

In 1939, as World War II was breaking out in Europe, President Franklin D. Roosevelt (1882–1945) received the Einstein-Szilárd letter, a plea drafted by physicists Leó Szilárd and Eugene Wigner and signed by Albert Einstein, warning him of the possibility that Nazi Germany could develop an atomic bomb. Within a year, the United States was conducting research into the feasibility of such a weapon. The project grew steadily over time, and in 1942 President Roosevelt signed an order directing the creation of a nuclear weapon. The program eventually fell under the direction of General Leslie Groves (1896–1970), while the scientific team that designed and built the device was led by physicist J. Robert Oppenheimer (1904–67).

“Now I am become Death, the destroyer of worlds.”

J. R. Oppenheimer, on witnessing the first nuclear test

The moment the Manhattan Project succeeded, it irrevocably altered human history and ushered in the Atomic Age. On August 6, 1945, the United States dropped an atomic weapon on Hiroshima, Japan, followed three days later by another weapon dropped on Nagasaki, Japan. Four years later, the Soviet Union developed its own atomic weapon, and the Cold War that followed lasted for decades. By its end, several nations possessed nuclear weapons that could reach across continents, threatening nuclear annihilation at the push of a button. Though the Cold War has ended, the threat nuclear weapons pose to humanity remains. MT

1942

Carpooling

United States

The practice of sharing a car, usually for the daily commute to and from work

A wartime poster from 1942 by Weimer Pursell encourages Americans to join a carpool to help beat Hitler.

The concept of carpooling first arose in the United States in 1942. Oil was in short supply as a result of World War II, and so the U.S. government ran a marketing campaign to encourage people to join car-sharing clubs as part of a general approach to rationing. Enthusiasm for carpooling waned after the end of the war, thanks to low fuel prices, but the practice was revived in the 1970s following further fuel shortages. In response to the oil crisis of 1973, President Nixon’s administration enacted a number of measures to provide funding for carpool initiatives. President Carter later built on these, introducing a bill that sought to create a National Office of Ridesharing.

The idea has since spread to more than forty countries, and its benefits are plain: reduced travel expenses, forgoing the need to purchase a second car, less overall fuel consumption, and less congested roads. Some countries have even designated carpooling lanes to encourage the practice, while businesses, too, have designated parking spaces for carpoolers.

“Carpooling is often described as the ‘invisible mode’ [of transport] …”

Paul Minett, Trip Convergence Ltd.

There were more than a billion automobiles in the world as of 2010, and in 2012, for the first time in history, the number of cars produced exceeded 60 million in a single year. This represents an inexorable upward trend that shows no sign of abating. Although carpooling does have its drawbacks, not least the issue of legal liability for the driver should passengers be injured in an accident, it still represents one of the more practical, immediate solutions to the problem of our dependence on fossil fuels, the burning of which degrades the air quality of our congested cities. BS

1943

Grandfather Paradox

René Barjavel

A hypothetical problem posed by the practicalities of time travel, which questions whether a time traveler can perform an action that eliminates the possibility of that action in the future

The Hollywood movie Back to the Future II (1989) was part of a franchise exploring plotlines in which a visitor from the future potentially affects present-day—and therefore future—events.

One of the great theoretical problems inherent in time travel, the grandfather paradox was first explored in Le Voyageur Imprudent (Future Times Three, 1943), a novel by René Barjavel (1911–85). It goes like this: you go back in time and kill your grandfather before he has children, thus making it impossible for your parents to have been born and, in turn, you. The conundrum is self-evident: how can killing your only chance of being born possibly result in you going on to exist? How could someone whose grandparents never conceived their own father or mother possibly come into existence?

“In our version of time travel, paradoxical situations are censored. The grandfather is a tough guy to kill.”

Seth Lloyd, professor of mechanical engineering

The grandfather paradox has been used to argue that backward time travel must be impossible. One proposed solution involves two parallel timelines, in each of which one reality plays out. Another, murkier proposal is the Huggins Displacement Theory, which holds that a time traveler who goes back, say, three years is also displaced by three light years, and is thereby prevented by Albert Einstein’s theory of relativity from doing anything that would tamper with his own timeline. And if you do not like those theories, there are plenty more: the Parallel Universe Theory, the Nonexistence Theory, and the Temporal Modification Negation Theory. Traveling back in time can be a confounding business.

Perhaps the best chance of rendering the paradox neutral comes from Seth Lloyd, a professor of mechanical engineering at the Massachusetts Institute of Technology. Lloyd’s experiments with photons and a quantum mechanics principle called postselection (conditioning on measurement outcomes, so that only self-consistent histories are admitted) suggest that if anyone did go back in time, the laws of probability would so conspire against them as to make the task of doing away with a grandparent nigh on impossible. BS

1943

Hierarchy of Needs

Abraham Maslow

A pyramid of wellbeing intended to make the world a healthier place

Abraham Maslow (1908–70) was a U.S. psychologist who railed against the notion in psychiatry that people are simply “bags of symptoms.” In 1943 he wrote a paper titled “A Theory of Human Motivation,” published in Psychological Review. In it, he stressed the importance of striving for self-actualization by pursuing love and a sense of belonging. He wanted to emphasize the positive aspects of being human, and believed that the study of “crippled, stunted, immature, and unhealthy specimens can yield only a crippled psychology.” It was time to start affirming people again.

Maslow’s model set out what he called a hierarchy of needs. This hierarchy began with the most basic—our biological and physical need for air, food, shelter, sexual fulfilment, and sleep. Next came our need for safety in the form of law and order and protection from the elements; then the need for a sense of belonging, whether in a family, in a group, or in relationships; followed by self-esteem, which involved the pursuit of cognitive growth, independence, prestige, and status; and finally the need to achieve self-actualization through realizing our own potential, personal growth, fulfilment, and peak experiences.
