What Makes Humans Human?

Little, today, is as it was.

Anatomically modern humans have existed for about 200,000 years, but only since the end of the eighteenth century has artificial lighting been widely used. Gas lamps were introduced in European cities about that time, and electric lights came into use only in the twentieth century.

In other words, for most of human history, when night fell, it fell hard. Things got really, really dark,

and people gathered under the stars, which they could actually see, in those days before nighttime light pollution,

and under those stars, they told stories.

In EVERY culture around the globe, storytelling, in the form of narrative poetry, existed LONG before the invention of writing. We know this because the earliest manuscripts that we have from every culture record stories that were already ancient when they were finally written down. One of the earliest texts in English is that of the poem Beowulf. It reworks and retells, in a much distorted manner, much, much older stories—ones that predate the emergence of English as a distinct language. Stith Thompson, the great folklorist, did the literary world an enormous favor by compiling a massive index, today known as the Aarne-Thompson Index, of motifs of ancient folktales worldwide. Name a story motif—three wishes, talking animals, the grateful dead, cruel stepsisters, golden apples, dragons, the fairy or demon lover, the instrument that plays itself—and you will find that the motif has an ancient pedigree and was already spread about the world long before historical times.

English is a Germanic language. All ancient Germanic societies had official storytellers whose job it was to entertain people in those days before modern entertainments like television and movies and the Internet and drones with laser-guided Hellfire missiles. In ancient Denmark, the storyteller was called a skald. In Anglo-Saxon England, the storyteller was a scop (pronounced like Modern English “shop”). The scop accompanied his stories on the Anglo-Saxon harp, a kind of lyre.

Of course, the telling of stories wasn’t the only entertainment around campfires. In most cultures, people danced and chanted and sang as well, and sometimes stories were told by the dancers or singers or chanters. All this was part of acting out the stories. (Want to know where the Christian devil, with his red body and horns, comes from? Well, in ancient Europe, people worshiped an Earth Mother and her consort, a Lord of the Forest, and they told stories of the hunt. When they acted these out around campfires, they held up to their heads animal horns, or branches in the shape of horns, and that’s how they pictured their Lord of the Forest, as a therianthrope, red from the campfire, with horns. When the Christians spread north across Europe, they made the god of the Old Religion into The Adversary. Grendel’s mother, the monster from the bog in Beowulf, is a demonized version, in a Christian story, of the ancient Anglo-Saxon fertility goddess Nerthus, to whom sacrifices were made by binding people, cutting their throats, and throwing them into a bog. You can see an ancient bas-relief of the Lord of the Forest, by the way, on the Gundestrup cauldron, dating from 150 to 1 BCE. See the accompanying illustration.)

But where does this storytelling urge among humans come from, and why is it universal? Storytelling takes energy. And it doesn’t produce tangible results. It doesn’t mend bones or build houses or plant crops. So, why would it survive and be found among every people on Earth from the earliest times onward?

Contemporary cognitive scientists have learned that storytelling is an essential, built-in part of the human psyche, involved in every aspect of our lives, including our dreams, memories, and beliefs about ourselves and the world. Storytelling turns out to be one of the fundamental ways in which our brains are organized to make sense of our experience. Only in very recent years have we come to understand this. We are ESSENTIALLY storytelling creatures, in the Aristotelian sense of essentially. That is, it’s our storytelling that defines us. If that sounds like an overstatement, attend to what I am about to tell you. It’s amazing, and it may make you rethink a LOT of what you think you know.

At the back of each of your eyes are retinas containing rods and cones. These take in visual information from your environment. In each retina, there is a place where the optic nerve breaks through it. This is the nerve that carries visual signals to your brain. Because of this interruption of the retinas, there is a blind spot in each where NO INFORMATION AT ALL IS AVAILABLE. If what you saw was based on what signals actually hit your retina at a given moment, you would have two big black spots in your field of vision. Instead, you see a continuous visual field. Why? Because your brain automatically fills in the missing information for you, based on what was there when your eye saccaded over it a bit earlier. In other words, your brain makes up a story about what’s there. Spend some time studying optical illusions, and you will learn that this is only one example of many ways in which you don’t see the world as it is but, rather, as the story concocted by your brain says it is.

This sort of filling in of missing pieces also happens with our memories. Scientists have discovered that at any given moment, people attend to at most about seven bits of information from their immediate environment. There’s a well-known limitation of short-term memory to about seven items, give or take two, and that’s why telephone numbers are seven digits long. So, at any given moment, you are attending to only about seven items from, potentially, billions in your environment. When you remember an event, your brain FILLS IN WHAT YOU WERE NOT ATTENDING TO AT THE TIME based on general information you’ve gathered, on its predispositions, and on general beliefs that you have about the world. In short, based on very partial information, your brain makes up and tells you a STORY about that past time, and that is what you “see” in memory in your “mind’s eye.”

So, people tend to have a LOT of false memories because the brain CONFABULATES—it makes up a complete, whole story about what was PROBABLY the case and presents that whole memory to you, with the gaps filled in, for your conscious inspection. In short, memory is very, very, very faulty and is based upon the storytelling functions of the brain! (And what are we except our memories? I am that boy in the Dr. Dentons, in my memory, sitting before the TV with the rabbit ears; I am that teenager in the car at the drive-in with the girl whom I never thought in a million years would actually go out with me. But I’m getting ahead of myself.)

You can also see this storytelling function of the brain at work in dreaming. Years ago, I had a dream that I was flying into the island of Cuba on a little prop plane. Through the window, I could see the island below the plane. It looked like a big, white sheet cake, floating in an emerald sea. Next to me on the airplane sat a big, red orangutan smoking a cigar.

Weird, huh? So why did I have that dream? Well, in the days preceding the dream I had read a newspaper story about Fidel Castro, the leader of Cuba, being ill; I had flown on a small prop plane; I had attended a wedding where there was a big, white sheet cake; I had been to the zoo with my grandson, where we saw an orangutan; and I had played golf with some friends, and we had smoked cigars.

The neural circuits in my brain that had recorded these bits and pieces were firing randomly in my sleeping brain, and the part of the brain that does storytelling was working hard, trying to piece these random fragments together into a coherent, unified story. That’s the most plausible current explanation of why most dreams occur. The storytelling parts of the brain are responding to random inputs and tying them together—making sense of this random input by making a plausible story of them. This is akin to the process, pareidolia, that leads people to see angels in cloud formations and pictures of Jesus on their toast.

So, those are three important reasons why the brain is set up as a storytelling device. Storytelling allows us to see a complete visual field; creates for us, from incomplete data, coherent memories; and ties together random neural firings in our brains into the wholes that we call dreams.

But that’s not all that storytelling does for us. Storytelling about the future allows us to look ahead—for example, to determine what another creature is going to do. We often play scenarios in our minds that involve possible futures. What will she say if I ask her to the prom? What will the boss say if I ask for a raise? How will that go down? In other words, storytelling provides us with a THEORY OF MIND for predicting others’ behavior.

Stories also help people to connect to one another. When we tell others a story, we literally attune to them. We actually get “on the same wavelengths.” Uri Hasson, a neuroscientist at Princeton, recorded the brainwaves of people during rest and while listening to a story. During rest, their waves were all over the place. While listening to the same story, even at different times and places, those people had brainwaves that were in sync.

Storytelling also provides a mechanism for exploring and attempting to understand others generally. Our basic situation in life is that your mind is over there and mine is over here. We’re different, and we have to try to figure each other out—to have a theory of other people’s minds. By telling myself a story about you, I can attempt to bridge that ontological gap. Unfortunately, the stories we tell ourselves about others tend to be fairly unidimensional. You are simply this or that. I, on the other hand, am an international man of mystery. This is a tendency we need to guard against.

We also tell stories in order to influence others’ behavior—to get them to adopt the story we’re telling as their own. This is how advertising works, for example. The advertiser gets you to believe a story about how you will be sexier or smarter or prettier or more successful or of higher status if you just buy the product with the new, fresh lemony scent. And it’s not just advertisers who do this. Donald Trump sold working-class Americans a fiction about how he could strike deals that would make America great again because he was such a great businessman, one who started with nothing and made billions. The coach tells a story in which her team envisions itself as the winners of the Big Game. The woo-er tells the woo-ee the story of the great life they will have together (“Come live with me and be my love/And we shall all the pleasures prove”). And so on. Successful cult leaders, coaches, lovers, entrepreneurs, attorneys, politicians, religious leaders, marketers, etc., all share this in common: they know that persuasion is storytelling. The best of them also understand that the most successful stories, in the long run, are ones that are true, even if they are fictional.

When we tell stories, we spin possible futures—we try things on, hypothetically. And that helps us to develop ideas about who we want to be and what we want to do. Gee, if I travel down that road, I may end up in this better place.

And that observation leads to one final, supremely important function of storytelling: Who you are—your very SELF—is a story that you tell yourself about yourself and your history and your relations to others—a story with you as the main character. The stories you tell yourself about yourself become the person you are. The word person, by the way, comes from the Latin persona, for a mask worn by an actor in the Roman theatre.

So, our very idea of ourselves, of our own personal identity, is dependent upon this storytelling capacity of the human brain, which takes place, for the most part, automatically. There is even a new form of psychotherapy called cognitive narrative therapy that is all about teaching people to tell themselves more life-enhancing, affirmative stories about themselves, about who they are.

Telling yourself the right kinds of stories about yourself and others can unlock your creative potential, improve your relationships, and help you to self-create—to be the person you want to be.

So, to recapitulate, storytelling . . .

helps us to fill in the gaps so that we have coherent memories,

ties together random firings in the brain into coherent dreams,

enables us to sort and make sense of past experience,

gives us theories of what others think and how they will behave,

enables us to influence others’ behavior,

enables us to try on various futures, and

helps us to form a personal identity, a sense of who we are.

Kinda important, all that!

Storytelling, in fact, is key to being human. It’s our defining characteristic. It’s deeply embedded in our brains. It runs through every aspect of our lives. It makes us who we are.

It’s no wonder then, that people throughout history have told stories. People are made to construct stories—plausible and engaging accounts of things—the way a stapler is made to staple and a hammer is made to hammer. We are Homo relator, man the storyteller.

(BTW, the root *man, meaning “human being” in general, without a specific gender reference, is ancient. It goes all the way back to Proto-Indo-European, but there’s still good reason, today, to seek out gender-neutral alternatives, when possible, of course.)

Copyright 2015. Robert D. Shepherd. All rights reserved.

Art: Detail from the Gundestrup Cauldron. Nationalmuseet [CC BY-SA 3.0 (https://creativecommons.org/licenses/by-sa/3.0) or CC BY-SA 2.5 (https://creativecommons.org/licenses/by-sa/2.5)]


For more pieces by Bob Shepherd on the topic of Education “Reform,” go here: https://bobshepherdonline.wordpress.com/category/ed-reform/

For more pieces on the teaching of literature and writing, go here: https://bobshepherdonline.wordpress.com/category/teaching-literature-and-writing/


It’s about Time (a Catena)



A brief tour of fascinating (and lunatic) notions that philosophers (and a few poets) have had about time. 

The Mystery of Time

“What then is time? If no one asks me, I know; if I wish to explain it to one who asks, I know not.”

–St. Augustine (354–430 CE), Confessions

PART 1: What Is Time? Types of Time

Absolute or Scientific Newtonian Time

“Absolute, true and mathematical time, of itself, and from its own nature flows equably without regard to anything external, and by another name is called duration.”

–Sir Isaac Newton (1643–1727), Philosophiae naturalis principia mathematica (Mathematical Principles of Natural Philosophy)

The Specious (Nonexistent) Present

“The relation of experience to time has not been profoundly studied. Its objects are given as being of the present, but the part of time referred to by the datum is a very different thing from the conterminous of the past and future which philosophy denotes by the name Present. The present to which the datum refers is really a part of the past — a recent past — delusively given as being a time that intervenes between the past and the future. Let it be named the specious present, and let the past, that is given as being the past, be known as the obvious past. [Each of] all the notes of a bar of a song seem to the listener to be contained in the [specious] present. [Each of] all the changes of place of a meteor seem to the beholder to be contained in the [specious] present. At the instant of the termination of [each element in] such series, no part of the time measured by them seems to be [an obvious] past. Time, then, considered relatively to human apprehension, consists of four parts, viz., the obvious past, the specious present, the real present, and the future. Omitting the specious present, it consists of three . . . nonentities — the [obvious] past, which does not [really] exist, the future, which does not [yet] exist, and their conterminous, the [specious] present; the faculty from which it proceeds lies to us in the fiction of the specious present.”

–E. Robert Kelly, from The Alternative, a Study in Psychology (1882). Kelly’s concept of the specious present has been extremely influential in both Continental and Anglo-American philosophy despite the fact that Kelly was not a professional philosopher.

Subjective Time

“Oh, yeah. Hegel’s Phenomenology of Spirit. I never finished it, though I did spend about a year with it one evening.”

Experienced Time: The “Wide” Present

“In short, the practically cognized present is no knife-edge, but a saddle-back, with a certain breadth of its own on which we sit perched, and from which we look in two directions into time. The unit of composition of our perception of time is a duration, with a bow and a stern, as it were—a rearward- and a forward-looking end. It is only as parts of this duration-block that the relation of succession of one end to the other is perceived. We do not first feel one end and then feel the other after it, and from the perception of the succession infer an interval of time between, but we seem to feel the interval of time as a whole, with its two ends embedded in it.”

–William James, “The Perception of Time,” from The Principles of Psychology, Book I

A, B, and C Series Time (Three Ways of Looking at Time)

  • The A Series: Time as Past, Present, and Future
  • The B Series: Time as Earlier, Simultaneous, and Later
  • The C Series: Time as an Ordered Relation of Events (with the direction being irrelevant)

Influential distinctions made by John Ellis McTaggart in “The Unreality of Time.” Mind 17 (1908): 456-476. The three types are much discussed by philosophers in the Anglo-American analytic tradition.

See also the Block Time hypothesis, below.

PART 2: Does Time Exist?

No, It Doesn’t: Change Is a Self-Contradictory Idea

“For this view can never predominate, that that which IS NOT exists. You must debar your thought from this way of search. . . .There is only one other description of the way remaining, namely, that what IS, is. To this way there are very many signposts: that Being has no coming-into-being . . . . Nor shall I allow you to speak or think of it as springing from not-being; for it is neither expressive nor thinkable that what-is-not is. . . . How could Being perish? How could it come into being? If it came into being, it is not; and so too if it is about-to-be at some future time. . . .For nothing else either is or shall be except Being, since Fate has tied it down to be a whole and motionless; therefore all things that mortals have established, believing in their truth, are just a name: Becoming and Perishing, Being and Not-Being, and Change of position, and alteration of bright color.”

–Parmenides of Elea (c. 475 BCE), fragment from The Way of Truth, in Ancilla to the PreSocratic Philosophers, ed. Kathleen Freeman

“Does the arrow move when the archer shoots it at the target? If there is a reality of space, the arrow must at all times occupy a particular position in space on its way to the target. But for an arrow to occupy a position in space that is equal to its length is precisely what is meant when one says that the arrow is at rest. Since the arrow must always occupy such a position on its trajectory which is equal to its length, the arrow must be always at rest. Therefore, motion is an illusion.”

–Zeno of Elea (c. 450 BCE), fragment from Epicheiremata (Attacks), in Ancilla to the PreSocratic Philosophers, ed. Kathleen Freeman

“One part of time has been [the past] and is not, while the other is going to be and is not yet [the future]. Yet time, both infinite time and any time you care to take, is made up of these. One would naturally suppose that what is made up of things which do not exist could have no share in reality.”

–Aristotle (384–322 BCE), Physics, IV, 10–14. 217b–224a.

Yes, It Does: Change Is the Fundamental Reality of Our Lives

“It is not possible to step twice into the same river.”

–Heraclitus (c. 475 BCE), fragment from unnamed book, in Ancilla to the PreSocratic Philosophers, ed. Kathleen Freeman

[Heraclitus seems to have held this fact to be one of many indications of the essential unworthiness/irredeemability of this life; the other fragments of his writings that have survived suggest that Heraclitus was a kind of 5th-century fundamentalist preacher, upset about the moral decay around him, who viewed the world as synonymous with decay, and who wanted to point his readers, instead, toward the eternal Logos. Plato inherited this view; the Christian church inherited Plato’s. Such contemptus mundi (contempt for the world) is often, in that tradition, expressed as contempt for that which exists “in time” and is not eternal.]

“Time is nature’s way of keeping everything from happening at once.”

–Woody Allen (1935– )


No, It Doesn’t: Time Is an Illusion Due to Vantage Point in an Eternal Spacetime (the “Block Time” Hypothesis)

“Now Besso has departed from this strange world a little ahead of me. That means nothing, for we physicists believe the separation between past, present, and future is only an illusion, although a convincing one.”

–Albert Einstein (1879–1955), in a letter written to the family of Michele Besso, on Besso’s death

“All time is all time. It does not change. It does not lend itself to warnings or explanations. It simply is. Take it moment by moment, and you will find that we are all, as I’ve said before, bugs in amber.”

–Kurt Vonnegut, Jr. (1922–2007), who is in heaven now, Slaughterhouse-Five

Time present and time past
Are both perhaps present in time future,
And time future contained in time past.
If all time is eternally present
All time is unredeemable.

–T.S. Eliot (1888–1965), “Burnt Norton,” from Four Quartets

No, It Doesn’t: The Now as Consequence of the Blindness of the Brain to Its Own Processing of Temporal Data (the “Blind Brain” Hypothesis)

“Nothing, I think, illustrates this forced magic quite like the experiential present, the Now. Recall what we discussed earlier regarding the visual field. Although it’s true that you can never explicitly ‘see the limits of seeing’–no matter how fast you move your head–those limits are nonetheless a central structural feature of seeing. The way your visual field simply ‘runs out’ without edge or demarcation is implicit in all seeing–and, I suspect, without the benefit of any ‘visual run off’ circuits. Your field of vision simply hangs in a kind of blindness you cannot see.

“This, the Blind Brain Hypothesis suggests, is what the now is: a temporal analogue to the edgelessness of vision, an implicit structural artifact of the way our ‘temporal field’–what James called the ‘specious present’–hangs in a kind [of] temporal hyper-blindness. Time passes in experience, sure, but thanks to the information horizon of the thalamocortical system, experience itself stands still, and with nary a neural circuit to send a Christmas card to. There is time in experience, but no time of experience. The same way seeing relies on secondary systems to stitch our keyhole glimpses into a visual world, timing relies on things like narrative and long term memory to situate our present within a greater temporal context.

“Given the Blind Brain Hypothesis, you would expect the thalamocortical system to track time against a background of temporal oblivion. You would expect something like the Now. Perhaps this is why, no matter where we find ourselves on the line of history, we always stand at the beginning. Thus the paradoxical structure of sayings like, “Today is the first day of the rest of your life.” We’re not simply running on hamster wheels, we are hamster wheels, traveling lifetimes without moving at all.

“Which is to say that the Blind Brain Hypothesis offers possible theoretical purchase on the apparent absurdity of conscious existence, the way a life of differences can be crammed into a singular moment.”

–Scott Bakker, “The End of the World As We Knew It: Neuroscience and the Semantic Apocalypse”

PART 3: What Contemplation of Time Teaches Us about Living

Carpe Diem

“Such,” he said, “O King, seems to me the present life of men on Earth, in comparison with that time which to us is uncertain, as if when on a winter’s night, you sit feasting . . . and a simple sparrow should fly into the hall, and coming in at one door, instantly fly out through another. In that time in which it is indoors it is indeed not touched by the fury of winter; but yet, this smallest space of calmness being passed almost in a flash, from winter going into winter again, it is lost to our eyes.

“Something like this appears the life of man, but of what follows or what went before, we are utterly ignorant.”

–The Venerable Bede (c. 672–735), Ecclesiastical History of the English People, Book II


“Seize the day, trusting as little as possible in the future.”

–Horace (65–8 BCE), Odes 1.11

Oh, come with old Khayyam, and leave the Wise
To talk; one thing is certain, that Life flies;
One thing is certain, and the Rest is Lies;
The Flower that once has blown for ever dies.

–Omar Khayyám (1048–1131), “Rubáiyát,” trans. Edward FitzGerald

Gather ye rosebuds while ye may
Old time is still a-flying:
And this same flower that smiles to-day
To-morrow will be dying.

–Robert Herrick (1591–1674), “To the Virgins, to Make Much of Time”

But at my back I alwaies hear
Times winged Charriot hurrying near:
And yonder all before us lye
Desarts of vast Eternity.
Thy Beauty shall no more be found;
Nor, in thy marble Vault, shall sound
My ecchoing Song: then Worms shall try
That long preserv’d Virginity:
And your quaint Honour turn to dust;
And into ashes all my Lust.
The Grave’s a fine and private place,
But none I think do there embrace.
Now therefore, while the youthful hew
Sits on thy skin like morning glew,
And while thy willing Soul transpires
At every pore with instant Fires,
Now let us sport us while we may;
And now, like am’rous birds of prey,
Rather at once our Time devour,
Than languish in his slow-chapt pow’r.
Let us roll all our Strength, and all
Our sweetness, up into one Ball:
And tear our Pleasures with rough strife,
Thorough the Iron gates of Life.
Thus, though we cannot make our Sun
Stand still, yet we will make him run.

–Andrew Marvell (1621–1678), “To His Coy Mistress”

“Get it while you can.
Don’t you turn your back on love.”

–The American philosopher Janis Joplin (1943–1970)

Give Up/It’s All Futile Anyway

“A man finds himself, to his great astonishment, suddenly existing, after thousands of years of nonexistence: he lives for a little while; and then, again, comes an equally long period when he must exist no more. The heart rebels against this, and feels that it cannot be true.

“Of every event in our life we can say only for one moment that it is; for ever after, that it was. Every evening we are poorer by a day. It might, perhaps, make us mad to see how rapidly our short span of time ebbs away; if it were not that in the furthest depths of our being we are secretly conscious of our share in the inexhaustible spring of eternity, so that we can always hope to find life in it again.

“Consideration of the kind, touched on above, might, indeed, lead us to embrace the belief that the greatest wisdom is to make the enjoyment of the present the supreme object of life; because that is the only reality, all else being merely the play of thought. On the other hand, such a course might just as well be called the greatest folly: for that which in the next moment exists no more, and vanishes utterly, like a dream, can never be worth a serious effort.”

–The ever-cheerful Arthur Schopenhauer (1788–1860), “The Vanity of Existence,” from Studies in Pessimism

Three Phenomenologist/Existentialist Views of Time

NB: the following are NOT quotations. I’ve summarized material that appears in much longer works. You’re welcome. I have included Husserl in this section, even though his work is just an attempted explanation of time, because the other two philosophers treated here are reacting to Husserl’s ideas.

Husserl (very bright dude, this one): All our ideas about time spring from our conscious experience of the present. That experience is characterized by being intentional, by being toward something. We typically recognize three kinds of time: 1. scientific, objective, Newtonian time, which we think of as being independent of ourselves and as independently verifiable; 2. subjective time, in which events seem to move slower or faster; and 3. phenomenological or intentional time, which is the fundamental experience on which the other concepts of time are based, from which the other concepts derive because the phenomenological present includes not only awareness of present phenomena (the present), but retention (awareness of that which is not present because it no longer is—the past), and protention (awareness of that which is not present because it is about to be). The present is intentionality toward phenomena before us here, now. The past is present intentionality toward phenomena that are not present but are with us and so must be past (that’s where the definition of past comes from). The future is present intentionality toward phenomena that are not present and, unlike the past, are not yet with us, and so must be the future, which will be (that’s where the definition of future comes from). Therefore, in their origins in our phenomenological experiences, the future and the past are parts of the present, conceptual phenomena held in the present, alongside actual phenomena, as phenomena no longer present and not yet present.

Heidegger: Husserl had it all wrong. It’s the future, not the present, that is fundamental. We are future-oriented temporalities by nature, essentially so. Our particular type of being, Dasein, or being-there, is characterized by having care (about its projects, its current conditions, about other beings)—about matters as they relate to those projects. Our being is characterized by understanding, thrownness, and fallenness. Understanding is the most fundamental of the three. It is projection toward the future, comportment toward the possibilities that present themselves, potentiality for being. Our understanding seizes upon projects, projecting itself on various possibilities. In its thrownness, Dasein always finds itself in a certain spiritual and material, historically conditioned environment that limits the space of those possibilities. As fallenness, Dasein finds itself among other beings, some of which are also Dasein and some of which (e.g., rocks) are not Dasein, and it has, generally respectively, “being-with” them or “being alongside” them, and these help to define what possibilities there are. “Our sort of being (Dasein) is being for which being is an issue.” Why is it an issue? Well, we are finite. We know that we are going to die. This is the undercurrent that informs our essential being, which is care, concern. We are projections toward the future because undertaking these projects is an attempt, however quixotic, to distract ourselves from or even to cheat death. We care about our projects because, at some level, we care about not dying, having this projection toward the future for which we are living.

Sartre: The world is divided into two kinds of being: being-for-itself (the kind of being that you and I have) and being-in-itself (the kind of being that a rock or a refrigerator has). Let’s think a bit about our kind of being. Take away your perceptions, your body, your thoughts. Strip everything away, and you still have pure being, the being of the being-for-itself, but it is a being that is also nothing. (The Buddha thought this, too). Being-for-itself has intentional objects, but itself is no object (there’s no there there) and so is nothing, a nothingness. Time is like being in that respect. It consists entirely of the past (which doesn’t exist) and the future (which doesn’t exist) and the present (which is infinitesimally small and so doesn’t exist). So time, like being, is a nothingness. This being-for-itself is not just nothingness, however; it has some other bizarre, contradictory characteristics: Its being, though nothing, allows a world to be manifest (how this is so is unclear), a world that includes all this stuff, including others, for example, who want to objectify the being-for-itself, to make it into a something, a thing, a being-in-itself, like a rock. (“Oh, I know you. I’m wise to you. You’re . . . .” whatever.) The being-for-itself also has a present past (in Husserl’s sense) and is subject to certain conditions of material construction (the body) and material conditions (in an environment of things), and all these givens—the body, the environment, one’s own past, and other people seen from the outside in their thinginess—make up the being-for-itself’s facticity. The being-for-itself wants to be SOMETHING, and so lies to itself. It acts in bad faith, playing various roles (playing at being a waiter, for example) and creating for itself an ego (via self-deceptive, magical thinking). But in fact, being in reality nothing, being-for-itself (each of us) knows that that’s all a lie.
We transcend our facticity and can be anything whatsoever, act in any way whatsoever. In other words, we are absolutely free and therefore absolutely responsible. This responsibility is absurd, because there is no reason for being/doing any particular thing. “Man is a meaningless passion.” But the absolute freedom that derives from our essential nothingness also allows for action to be truly authentic (as opposed to the play-acting) in addition to being responsible. Only in death does the being-for-itself succeed in becoming a being-in-itself, a completed thing, and then only if and in the manner in which he or she is remembered by others. A person who is not remembered never existed. Death is a time stamp or, if we are not remembered, an expiration date.

The Eternal Return and the Weight of Being

“341. The Greatest Weight. What, if some day or night a demon were to steal after you into your loneliest loneliness and say to you: ‘This life as you now live it and have lived it, you will have to live once more and innumerable times more; and there will be nothing new in it, but every pain and every joy and every thought and sigh and everything unutterably small or great in your life will have to return to you, all in the same succession and sequence—even this spider and this moonlight between the trees, and even this moment and I myself. The eternal hourglass of existence is turned upside down again and again, and you with it, speck of dust!’

“Would you not throw yourself down and gnash your teeth and curse the demon who spoke thus? Or have you once experienced a tremendous moment when you would have answered him: “You are a god and never have I heard anything more divine.” If this thought gained possession of you, it would change you as you are or perhaps crush you. The question in each and every thing, “Do you desire this once more and innumerable times more?” would lie upon your actions as the greatest weight. Or how well disposed would you have to become to yourself and to life to crave nothing more fervently than this ultimate eternal confirmation and seal?”

–Friedrich Nietzsche (1844–1900), The Gay Science

The Fleeting One-Offness of Everything and the Resulting Unbearable Lightness of Being

“But Nietzsche’s demon is, of course, wrong. There is no eternal return. Where does that leave us? Isn’t life ALWAYS a matter of I should have’s and I would have’s and if I had only knowns? “[W]hat happens but once, might as well not have happened at all. If we have only one life to live, we might as well not have lived at all. . . .

“The heaviest of burdens crushes us, we sink beneath it, it pins us to the ground. But in love poetry of every age, the woman longs to be weighed down by the man’s body. The heaviest of burdens is therefore simultaneously an image of life’s most intense fulfillment. The heavier the burden, the closer our lives come to the earth, the more real and truthful they become. Conversely, the absolute absence of burden causes man to be lighter than air, to soar into heights, take leave of the earth and his earthly being, and become only half real, his movements as free as they are insignificant. What then shall we choose? Weight or lightness?”

–Milan Kundera (1929–    ), contra Nietzsche, from The Unbearable Lightness of Being

Copyright 2010, Robert D. Shepherd. All rights reserved.

Posted in Existentialism, Metaphysics, Philosophy, Philosophy of Mind, Time

Why Writing Science Fiction Is Impossible (a Science Fiction Writer’s Confession)

When I was still a child, I fell in love with Sci Fi. I stayed up nights devouring stories by Asimov and Heinlein, Clarke and Bradbury, Poul Anderson, Frederik Pohl, Theodore Sturgeon, Pierre Boulle, and Harlan Ellison. Sometimes, I read fat collections of short stories, and these often had introductions. I ate those up, too. Those introductions were the first literary criticism I ever read, though I didn’t know at the time that that’s what they were.

From those introductions, I learned what Science Fiction and Fantasy have in common. Both are imaginative literature dealing with alternate worlds. Often, the reader or the characters or both get to those other worlds by going into the distant past (The Time Machine) or into the far future (The Foundation Trilogy) or to another planet (Dune) or to an exotic civilization in the jungle (King Kong) or far underground (Journey to the Center of the Earth) or to the sea floor (the lost city of Atlantis in Twenty Thousand Leagues under the Sea). You might get there in some sort of vehicle—Ezekiel’s chariot, a whirlwind or tornado, a spaceship or a time machine or a submarine, or you might go through a portal by stepping through a wardrobe or a mirror, falling down a rabbit hole, or traveling through a wormhole. I learned that there was a really ancient tradition of fanciful Traveler’s Tales that posited exotic places full of wonders—the trippy travels described in The Book of Enoch; in works by Homer, Virgil, and Dante; in Herodotus;  in Lucian of Samosata’s True History (second century CE). Nephilim/Watchers, Gorgons and Cyclops and Cerberus, demons and Seraphim, and Amazons and armies on the moon, oh my!

I was a child in the Hippie Era, when lots of middle-class American young people were first experimenting with psychedelic drugs—LSD, mescaline, peyote, magic mushrooms, STP, and so on. Another way you could get to one of those alternate worlds was to take one of these drugs, which were also portals. I learned that shamans had been taking people to alternate worlds for a long, long time.

And yet another way was to experience an extreme state of consciousness, by practicing austerities such as fasting or exposure or by experiencing extreme trauma or madness. Edgar Allan Poe invented a way of structuring stories that became a staple of Fantasy and Science Fiction stories. The entire story would ride on an ambiguity. The strange events and characters in the suddenly strange place of the story could be interpreted in one of two ways—either these were supernatural events and characters that actually existed, or they were figments of the diseased mind—of the madness—of the narrator. Part of Poe’s genius was that he just let the ambiguity ride, never resolving it as many lame screenwriters do today.

Both kinds of story, Science Fiction and Fantasy, bent reality. They asked, “What if?” What if the world were different in this way or in these ways? What if there is a monster under the bed? What if there were a country inhabited only by women? What if there were dragons? What if men replaced their wives with robots? What if a person could be invisible? or the size of an ant or of a redwood?

So, the two kinds of story—Science Fiction and Fantasy—both dealt with alternate worlds. That’s what made them so interesting, and it’s also what made them important as critiques of human life, culture, and norms. One could argue, for example, as H. G. Wells did in The Time Machine, that if the rich keep exploiting the workers as they do, living in ever-increasing refinement and comfort and ease while the workers live ever more crudely, harshly, and basely, eventually they will evolve into weak, pathetic Eloi (the rich) fed upon by the strong, bestial Morlocks (the workers). Or you could argue, as Orwell did in 1984 (published in 1949), that if the state continued growing more powerful and better at surveillance, propaganda, command, coercion, control, we would end up in a dystopian nightmare. So, alternate worlds provided a means for critiquing this one. What if we extrapolate this trend into the future? What if our best-laid plans, our technologies, go terribly astray, as when Victor Frankenstein’s attempt to conquer death leads to the creation of a hated monster (Mary Shelley’s Frankenstein, the first real Science Fiction novel [talk about genius, at the freaking age of 18!]) or the world is mostly destroyed and thrown back into barbarity (Stephen Vincent Benét’s “By the Waters of Babylon,” the first post-apocalyptic fiction [another creation of a whole genre; again, real genius and still a GREAT read today])?

Fine. But what distinguishes the two types of fiction dealing with alternate worlds? Well, the difference is that Science Fiction has to be possible, given what is known of science or given some future scientific discovery, whereas Fantasy does not. So, The Invisible Man is Science Fiction because it posits the discovery of a chemical that makes people invisible, whereas The Hobbit isn’t because it doesn’t present the hobbits as an actual scientific discovery (European adventurers travel to remote Papua New Guinea and discover the Valley of Mordor). It’s Fantasy.

Supernatural Fiction, like Bram Stoker’s Dracula or Ira Levin’s brilliant Rosemary’s Baby, would be, according to this distinction, a kind of Fantasy literature. But because it is rooted in religious superstition, it has claims on “realness” that rival, but are different from, those of Science Fiction. If you believe that witches’ covens that can call up the Devil actually exist, then you are very, very indoctrinated and confused but will think of something like Rosemary’s Baby not as Fantasy but as Horror Realism. Lord help you, living in such a demon-haunted world. We have real human demons enough to contend with, thank you. One of them is currently in the Oval Office in the now Whiter House.

I will admit to disliking a lot of Fantasy literature because much of it seems, to me, just stupid, the stuff of childhood. Yes, I like “Jack and the Beanstalk” as much as you do, and Alice in Wonderland is AMAZING, a work of genius. But a steady diet of this stuff? 900-page books about the wizardress who trains dragons and leads them into battle against the forces of the Evil Zauron of Xacharia? Uh, no thanks. The problem, it long seemed to me, with Fantasy literature is that it was too easy. You could simply make up anything without having to worry about plausibility. Much of it requires readers slow-witted enough that they are willing to suspend disbelief about anything–are capable of entertaining anything as true. (The main character is in a jam, but hey, I didn’t tell you, but he has this superpower, and he can immediately teleport himself elsewhere because, well, no because, he just can.) I dutifully read to my children all of the Harry Potter books, for they loved them, but I must admit that they mostly bored me to tears. They were derivative and slow paced, and the author wasn’t exactly a great prose stylist. But, hey, she’s almost a billionaire. There’s an audience for that kind of stuff.

I must admit I’ve always been a bit snobbish about this distinction between Sci Fi and Fantasy. Science Fiction, I thought but rarely said, was for bright folk, and Fantasy for dummies. There was a problem with this formulation of mine, however. Science Fiction writers tended to be men more interested in science than in other people. The poet Dylan Thomas spoke of meeting a fellow at a party who wanted to be a novelist even though he didn’t know any people well. LOL. That’s a problem. Because Sci Fi tends to be written by nerdy men, characters in Science Fiction often tended to be extraordinarily shallow. Michael Crichton created Sci Fi novels based on current scientific work (DNA extraction—Jurassic Park; nanotechnology—Prey), and because the ideas were interesting to people and the books were full of action, they sold well, even though you could literally exchange the names of characters, making this man a woman and that woman a man, for example, and it would make no difference, for they had almost no interior lives, certainly not interior lives that were deeply explored.

However, this was not the only problem, I’ve come to realize, with my beloved genre of Science Fiction. A problem that Sci Fi writers are now facing is that the pace of change today is so fast, and the technologies currently being developed are so bizarre, that they typically outpace the imaginations of even the most imaginative, the most far-out Sci Fi writers. Yes, we have discovered what causes aging, and people are working on curing it. Yes, we shall soon control evolution and be able to create humans vastly different from us, leading, very possibly, to human speciation. Yes, we are working on building microscopic molecular robotic assemblers that can reproduce and take things apart and put them together into other things—a nano-fog that can turn a sofa and some house plants into a cello or everything into grey goo. Yes, we are working on technology to read minds and broadcast the results. Yes, we are working on building artificial life forms. Yes, we are working on devices that will enable people to be remotely present anywhere while physically remaining in place. Yes, coming soon, people with super strength who have blue skin with telescopic and microscopic vision in the ultraviolet. Pity Science Fiction writers who try to come up with strange new worlds based on science at the cutting edge when actual science is this weird, is more outlandish than their crazy imaginations are!

But that’s not the only problem. One problem goes to the heart of the definition of the genre—plausibility. Remember, to be a Sci Fi story, and not a mere Fantasy, the story has to be plausible scientifically.

Sci Fi posits scientific breakthroughs (Orwell’s two-way telescreens that are both watched by people and that watch people; Mary Shelley’s animation of dead tissue using Galvanism) and spins yarns based on those. Here’s the problem: really advanced science doesn’t look realistic. It doesn’t look like science. It looks like magic. This is what Science Fiction master Arthur C. Clarke meant when he said that “Any sufficiently advanced technology is indistinguishable from magic.”

Imagine an airplane landing in a remote area inhabited by tribespeople who have never seen such technology. A strange, enormous, featherless bird lands. It immediately gives birth to strange white beings who carry sticks that shoot fire. A famous example of this happened during World War II on the island of Tanna, Vanuatu, in the Pacific. Natives stumbled upon military supplies dropped on the island from military planes and encountered an American named John Frum who came to collect them. Frum traded with the natives, and then left, and the natives created a whole religion based upon the return of this god, sometimes called John Frum and sometimes Tom Navy, who would bring goods, the “cargo.” This was one example of several cargo cults that emerged in the Pacific after such first contact. In Papua New Guinea, one early explorer returned to his airplane to find that a native had strapped himself to the bottom of its fuselage with ropes made of vines, hoping to travel back to the gods’ abode and see it for himself.

Sci Fi involves an encounter with or discovery of an advanced technology, but the problem with that is that truly advanced technologies aren’t explicable in terms that we have today and are impossible to envision. It’s funny to read mid-20th-century Sci Fi because none of the major writers of the time envisioned a world of cell phones and the Internet. In the film Blade Runner, based on Philip K. Dick’s book Do Androids Dream of Electric Sheep?, released in 1982 and set in the far future (November 2019), the protagonist at one point runs around frantically looking for a pay phone. The director didn’t imagine that cell phones would become ubiquitous and phone booths a rare curiosity from the past. Yes, Mark Twain and H. G. Wells both foresaw something like an instantaneous worldwide communication system that would allow text and pictures to be sent through wires. H. G. Wells predicted something like genetic engineering and something like the atomic bomb (though he got the mechanisms completely wrong). Arthur C. Clarke invented, in fiction, the geosynchronous satellite, on which modern communications systems are based, long before it appeared in reality. But the point remains that generally, if we actually encountered a truly advanced technology, we would not recognize it as technology because we wouldn’t understand it at all. It would look like magic because its mechanisms would not be explainable in any terms that we can understand. In other words, it would look like Fantasy, not Science Fiction.

This is a kind of death ray for Science Fiction writers, n’est-ce pas?

This is a problem that I, as a Science Fiction writer, have to figure out how to solve.


Copyright 2020. Robert D. Shepherd. All rights reserved. This essay may be freely shared/distributed as long as this copyright notice is retained.


For short stories by Bob Shepherd, and more pieces about the short story and fiction generally, go here: https://bobshepherdonline.wordpress.com/category/short-stories/


Posted in Epistemology, Short Stories, Teaching Literature and Writing

History of Ideas: Background to Puritan and Pilgrim Protestantism in North America

For some 1,300 years, from roughly 300 to 1600 CE, the Catholic Church, headquartered for most of this time in the city of Rome, was the most powerful institution in the Western world. It was the ultimate spiritual and temporal authority in Europe.

In the sixteenth century, after years of turmoil in the Church, a reform movement arose. The movement was called Protestantism, from the Latin protestari, meaning “to assert publicly, or witness.” There followed a period of a couple hundred years that has come to be known as the Protestant Reformation, during which many Europeans switched their allegiance away from the Catholic Church and toward the new Protestant churches. The Protestant Reformation had dramatic effects on later events, including the founding of the colonies that became the United States.

In 1517, Martin Luther (1483–1546) initiated the Protestant Reformation when he nailed his Ninety-Five Theses to the door of the Castle Church in Wittenberg, Germany, at whose university he taught theology. Luther opposed the sale of indulgences, which were held to provide remission from punishment for sins confessed to a member of the clergy. Luther’s opposition to the sale of indulgences was related to a more fundamental break that he had made with the Church. Both Catholics and Protestants believed that all people had inherited sin passed down from the first humans, Adam and Eve, who had disobeyed God. This was called the doctrine of Original Sin. The Catholic Church held that via the purchase of indulgences and the performance of certain actions, called sacraments, which were administered by clergy, one could cleanse oneself of this Original Sin and of later sins that one committed. Luther came to believe, instead, the following:

  • That salvation could occur only through the Covenant of Grace (mercy extended to people despite their essential unworthiness)
  • That people received justification in the eyes of God through faith alone (not via our works, or actions, such as the purchase of indulgences or the taking of sacraments such as penitence)

And these beliefs were the basis of his Ninety-Five Theses, which challenged the sale of indulgences and the authority of the Pope.

Around the same time as Luther, the French theologian John Calvin (1509–64) taught the primacy of revelation through the scripture and the doctrines of Election and Predestination. Calvin believed

  • that there exists a direct relation between the individual and God, without an intermediary (a priest or any member of the clergy—a bishop, archbishop, or Pope). Belief in direct relationships with God without an intermediary led Protestants like Luther and Calvin to translate the scriptures into vernacular (common) languages so that ordinary people could read them.
  • in predestination, that God exists outside time and already knows what course a person will take, for good or ill; those who would be saved were called the Elect, and salvation was known as election. As Calvin writes in his Institutes of the Christian Religion, Chapter 21: “When we attribute prescience to God, we mean that all things always were, and ever continue, under his eye; that to his knowledge there is no past or future, but all things are present, and indeed so present, that it is not merely the idea of them that is before him (as those objects are which we retain in our memory), but that he truly sees and contemplates them as actually under his immediate inspection. This prescience extends to the whole circuit of the world, and to all creatures. By predestination we mean the eternal decree of God, by which he determined with himself whatever he wished to happen with regard to every man. All are not created on equal terms, but some are preordained to eternal life, others to eternal damnation; and, accordingly, as each has been created for one or other of these ends, we say that he has been predestinated to life or to death.” The notion that people play no role in their election but that God alone makes the decision as to who will and will not be saved is known as the doctrine of Absolute Sovereignty.

Like the Catholics before them, Luther and Calvin believed in Original Sin, that because of Adam’s sin, we all inherit sinfulness. The Catholics and the Protestants differed, however, in how they thought we could expiate, or rid ourselves of, this sinfulness. Catholicism stressed works (actions taken, like taking the sacraments), while Protestantism stressed God’s grace, extended to people despite their essential unworthiness. Calvin recognized only two sacraments—baptism and communion—and he denied transubstantiation (the literal transformation of the bread and wine into the body and blood of Christ), a key Catholic belief. Denial of the efficacy of sacraments such as confirmation and penance and last rites, indulgences, and works generally was part of Protestant Covenant Theology, propounded by Calvin, which held that God originally made a Covenant of Works with Adam, whereby he would receive eternal life in return for obedience, the so-called Old Covenant, which was replaced by the New Covenant, or Covenant of Grace, whereby one would be saved by belief in Christ, who died to redeem people from their sins.

Rejection of the Church and its authority led to Protestant belief in the right of individual congregations to govern themselves, a precursor to American ideas about local authority and democratic government. The strong belief in local governance held by many U.S. citizens today has one of its roots in this rejection of the distant authority of Rome.

Those who held beliefs like the ones outlined above were called Protestants because they protested the Catholic Church and its power.

These ideas led to a struggle for power throughout Europe—the Protestant Reformation.

The Protestant Reformation included a break by the country of England from the authority of Rome in 1534. At that time, King Henry VIII of England declared himself supreme head of the church and dissolved the monasteries. The church he established is known as the Church of England, or Anglican Church.

For years after Henry VIII, there was contention among various factions—some who wished to return to Catholicism (called by their enemies Papists), some who wished to reform the Anglican Church to purify it even more (called by their enemies and later by themselves Puritans), and some who believed that one could not reform the Anglican Church but needed to separate from it entirely (called Separatists). In the early 1600s, some English Separatists fled to Holland to escape persecution. Then, in 1620, they sailed for the New World and established a colony at Plymouth, called the Plymouth Plantation. This was a kind of pilgrimage, and these Separatists were called by their leader, William Bradford, Pilgrims, and the name stuck. As Bradford wrote in his history Of Plimoth Plantation,

“So they lefte [that] goodly & pleasante citie, which had been ther resting place, nere 12 years; but they knew they were pilgrimes, & looked not much on these things; but lift up their eyes to ye heavens, their dearest cuntrie, and quieted their spirits.”

Eight years later, in 1628, Puritans under John Endicott sailed for the New World and established the colony of Salem. In 1630, John Winthrop sailed for the New World carrying a royal charter for the Massachusetts colony, and Endicott’s Salem became the Massachusetts Bay Colony (with which Plymouth Plantation later merged). Unlike the Pilgrims, who wanted to separate from the Church of England entirely, the Puritans were reformers who wished to remain within the Anglican Church but to practice a “more pure” version of the religion. The Puritans believed in local governance of their churches, in which congregations elected their own ministers, so their churches were called Congregationalist churches.

One of the ways that you could know that a person was among the elect was that he or she lived a simple, frugal, hard-working, devout life and received various blessings as a result. Thus was born the Protestant Work Ethic—one showed through one’s actions, one’s hard work, that one was a member of the Elect. Of course, such an ethic was essential to a people carving out new lives “in the wilderness” of the New World.

Some important dates:

1492 – Columbus lands on island of Hispaniola
1493 – Papal Bull “Inter Caetera” (“Among other Works”) expounds the “Doctrine of Discovery,” saying that Christians can claim non-Christian lands as their own
1494 – Treaty of Tordesillas divides the New World up between Portugal and Spain

1498 – Explorer John Cabot sails along Massachusetts coast
1606 – King James I grants charter to Plymouth Company
1620 – Colony at Plymouth established after Mayflower Voyage
1628 – Colony at Salem established by John Endicott
1629 – Massachusetts Bay Company chartered
1630 – Massachusetts Bay Colony established at Boston; it would be led, off and on, by John Winthrop
1632 – Boston is made capital of Massachusetts Bay Colony
1634 – Four Year War with Pequots begins, nearly wipes out tribe; remnant of tribe sold into slavery
1636 – Harvard College established at Cambridge
1638 – Slave Ship Desire arrives at Salem from Nicaraguan Coast
1641 – Province of New Hampshire merged into Massachusetts Bay Colony
1648 – Margaret Jones, herbal practitioner, hanged as a Witch at Boston
1661 – William Leddra hanged at Boston for practicing the Quaker religion
1675 – King Philip’s War (Wampanoags) endangers colony for 3 Years
1680 – Province of New Hampshire separated from Mass Bay Colony
1692 – Salem witch hysteria occurs
1823 – Supreme Court in Johnson v. McIntosh rules unanimously that the principle of discovery gave Europeans an absolute right to the New World

Approximate Population of BOSTON

1650 – 3,000
1680 – 4,500
1690 – 7,000


How does all of this relate to the world today?

We study history because it has consequences for today. To understand the world we live in, we need to understand how it got to be that way.

  1. Today, Christianity is the most widely practiced religion in the world, with 33 percent of the world population identifying as Christian. Christianity today, as then, is divided into many denominations, most of which identify as either Catholic or Protestant. Numbers in the United States reflect the Puritan and Pilgrim heritage, with 70.6 percent of Americans identifying as Christians, 46.5 percent Protestant and 20.8 percent Roman Catholic.[1]
  2. The Puritan and Pilgrim insistence on local governance, as opposed to governance by a distant authority (the Pope), strongly influenced the development of democratic ideals in the later United States. People came to believe that they had a right to govern themselves and not to be governed by distant Great Britain, and many insisted that their governance be local. The ongoing conflict between proponents of federal power and states’ rights, a recurring theme in U.S. politics throughout the history of the country, is another consequence of this seed planted by the Puritans and the Pilgrims.
  3. The Protestant Work Ethic made the later United States the most productive country in the world. Even today, U.S. workers are far more productive than are most of their counterparts elsewhere, producing, on average, $63,885 of wealth per worker per year.[2]
  4. The insistence by Puritans and Pilgrims on the basic sinfulness of people, due to Original Sin, led them to be quite harsh in their punishments. This continues to the present day. As of 2013, 2.8 percent of the adult U.S. population was under correctional supervision (on probation or parole, in jail or in prison).[3] That’s almost three people out of every hundred and is the second highest rate of punishment in the world after that of the island nation of Seychelles.
  5. The founders of the United States were rebels who insisted upon individual liberty and who rebelled against the authority of the English government. This rebelliousness had its precedents in the rebellion of Puritans against the authority of the Catholic Church and of the Pilgrims against the authority of both the Catholic and Anglican Churches. While the Puritans and Pilgrims came here to have the freedom to practice their religion, they were not particularly tolerant, on the whole, toward other religious beliefs, though there were notable exceptions. Roger Williams, founder of Rhode Island, broke with the Massachusetts Bay Colony over, among other issues, his insistence on the right of individual religious liberty.
  6. The Puritans saw nature as something to be subdued—as something to have dominion over. They viewed themselves as a small band of God’s people alone in a dangerous and encroaching wilderness. Their tasks were to survive it and to tame it. The vast natural resources of the New World made them very successful, but from them Europeans learned an ethic of exploitation that continues unabated.
  7. The Puritans viewed sex and sexuality as sinful, and they held modesty to be a great virtue. We inherited this Puritanical worldview.
  8. In “A Model of Christian Charity,” John Winthrop called upon the Puritans establishing the Massachusetts Bay Colony to share their “superfluities” with one another, to treat one another as brothers, and so to be a model to the world, a “city on a hill.” So, there was arguably among at least some Puritans an ethic of social welfare. It would be a long, long time, however, before basic protections for workers and the elderly became law in this country.

There are other legacies as well: the Congregationalist Churches of New England, the public Jeremiad, the revivalist style of preaching (from the Great Awakening), and, sadly, both enslavement of African Americans and genocide against the Native American peoples. Massachusetts inaugurated the genocide against Native Americans in the United States with the 1637 Mystic Massacre. The first slaves arrived in Jamestown, Virginia, in 1619. However, in 1641, Massachusetts became the first colony on the North American continent to enact slavery into law in a bill called, ironically, “The Body of Liberties.”

[1] “The Pew Forum on Religion & Public Life – Asian Americans: A Mosaic of Faiths”. Pewforum.org. 2012-07-19. Retrieved 2012-12-29.

[2] “U.S. Workers World’s Most Productive.” CBS Money Watch. Sept. 3, 2007. http://www.cbsnews.com/news/us-workers-worlds-most-productive/. Accessed 9.2.15.

[3] Glaze, Lauren E., and Danielle Kaeble. “Correctional Populations in the United States, 2013.” Washington: U.S. Bureau of Justice Statistics (BJS). Dec., 2014.


Copyright 2017. Robert D. Shepherd. All rights reserved. This piece may be copied and freely distributed by teachers provided this copyright notice is retained.

For other pieces by Bob Shepherd dealing with the topic of religion, go here: https://bobshepherdonline.wordpress.com/category/religion/

For other pieces by Bob Shepherd dealing with the topics of teaching literature and writing, go here: https://bobshepherdonline.wordpress.com/category/teaching-literature-and-writing/

For other pieces by Bob Shepherd dealing with topics in philosophy, go here: https://bobshepherdonline.wordpress.com/category/philosophy/


Mary Shelley’s Frankenstein: Notes Introducing a Debate Unit on Transformative Technologies

This is a little backgrounder on Shelley’s Frankenstein that I created for a PowerPoint presentation to a high-school debate class. We were starting a unit in which students would debate the merits of various emerging transformative technologies.

In preparation for this unit, I want to tell you about a truly remarkable young woman who, at about your age, some 200 years ago, did an amazing thing. That’s her boyfriend and, later, her husband, the poet Percy Bysshe Shelley, in the center. The other fellow is their friend George Gordon, Lord Byron. Percy Shelley and Byron were both poets—very great poets.

Mary Shelley was born Mary Godwin, into a family of political radicals. Her father, William Godwin, was a utopian political philosopher. Her mother, Mary Wollstonecraft, is widely considered to be the founder of the Women’s Rights Movement.

When Mary was only 16, much to the displeasure of her father, she became involved with the young radical poet Percy Shelley. Although Shelley died young, just before his 30th birthday, he had already produced work that made him, according to critic Harold Bloom, the greatest poet who ever wrote in the English language, excepting, perhaps, Shakespeare. Percy Shelley certainly was the greatest of the Romantic poets. It’s impossible to exaggerate how much his work influenced our culture. He invented a way of speaking, in verse, that has permeated our popular culture and personal lives—the ways in which we talk about our feelings, about romantic love, about justice and idealism. Every popular love ballad owes a debt to him. He was a brilliant intellect, a master of ancient Greek, an idealist, a vegetarian, a political and social radical. He died when his boat was destroyed while sailing on the Gulf of La Spezia, in Italy. It is possible that the British intelligence services arranged for his boat to be rammed and capsized, for Percy Shelley supported Irish rebellion against the British and he wrote in favor of revolution—like those that had occurred in the United States and France—to overthrow the nobility and to establish social equality and democracy. Shelley detested tyranny, and he was fearless. It’s easy to see why Mary was attracted to him. Many of the young women of Europe, at the time, were. He was something of a counterculture hero of his day.

In 1816, Mary and Percy joined Lord Byron and Byron’s girlfriend Claire Clairmont at this house on Lake Geneva, in Switzerland, for a holiday. Byron was rich, and he could afford this sort of thing. (Shelley was also from a noble family of some means, but his father had disowned him for his radical behavior and views.) Also joining the friends was the young English physician John Polidori.

The friends had intended to spend their summer holiday sailing and hiking in the mountains.

As it turned out, that year came to be known as “the year without a summer.” A volcano, Mount Tambora, had erupted on an island in Indonesia, throwing enough dust into the air to affect climate worldwide. The weather was awful, and the friends found themselves confined indoors by the cold and rain.

The friends amused themselves by telling German ghost stories, and Byron proposed a contest: They would each write a ghost story to tell to the group. Naturally, since Percy and Lord Byron were both famous writers (though they were quite young), they expected that one of them would win the contest.

But Mary Shelley surprised everyone by writing the best story of all, one that would make her internationally famous. Though she was only 18 years old, it was here, at this time, that she began the novel Frankenstein.

And Polidori wrote, for this same contest, the first modern vampire tale in English, The Vampyre—a riveting story, but not nearly as well written as Mary’s was. She won the contest—and immortal fame.

Consider this fact: Humans have never created a new technology that they have not then used. They have never looked at a technological possibility and said, “No, that’s crazy. We shouldn’t do that.” But that’s just the question that young Mary Shelley’s novel raised: “Should we do that?”


And so, at the age of 18, Mary Shelley created a new genre of literature—the science fiction novel—and a new area of thought regarding what moral limits, if any, should be placed on technological and scientific “progress.”

Mary’s idea for the novel was based on one of the latest scientific discoveries of her time. The Italian scientist Luigi Galvani had discovered that if you applied an electric current to a nerve, the muscles operated by that nerve would twitch. Many thought that this “animal electricity” might be the secret to life itself. Mary wrote a novel in which a young scientist, Victor Frankenstein, uses galvanism to reanimate a creature sewn together from parts of corpses. Today, the science of the electrical properties of cells, including neural cells, is called electrophysiology. (Note: the scientist, not the “monster,” is named Frankenstein. Victor Frankenstein’s creation is rejected, unloved, and unnamed. The creature refers to himself as the “Adam of [Victor’s] labours.”)

So, the overarching Essential Question for our debate will be

Should technological progress proceed unchecked?

or

Should limits be imposed on some technological developments?

Issues we shall be debating related to this question:

  • Greater-than-human general artificial intelligence
  • Genetic engineering of humans/manipulation of the human genome/designer babies
  • Indefinite life extension/the end of aging
  • High-tech surveillance
  • Brain-computer interfaces, or BCIs
  • Human-computer hybrids, or cyborgs

Look around you. How many items can you list, in your immediate environment, that are artificial, or made by humans using technology? (Yes, even that bag of potato chips is a technological marvel.)

What are some ways in which we benefit from technology? What are some of the existential risks (risks to our continued existence) that our technologies pose?

Are there technologies that should be banned or strictly controlled? What limits or controls should exist? And are such limits or controls even possible? (Consider: What if some countries ban genetic engineering but others don’t? What happens then?)

Why is it especially important for young people to be asking these questions? (Consider: You will live in the world that these new technologies create.)


For more on Education Deform, by Bob Shepherd, go here: https://bobshepherdonline.wordpress.com/category/ed-reform/

For more on teaching literature and writing, by Bob Shepherd, go here: https://bobshepherdonline.wordpress.com/category/teaching-literature-and-writing/

For short stories, flash fictions, and writing about fiction, by Bob Shepherd, go here: https://bobshepherdonline.wordpress.com/category/short-stories/


How Do I Loathe Thee?

Trump, as Trump will tell you, is the best. I quite agree. He is, of politicians, the most

vile, and


For more songs and poems about the Trump the Chump misadministration, go here: https://bobshepherdonline.wordpress.com/…/trump-don-the-con/

For more humor from Bob Shepherd, go here: https://bobshepherdonline.wordpress.com/category/humor/


Essential Reading | Book Review


Diane Ravitch’s Slaying Goliath: The Passionate Resistance to Privatization and the Fight to Save America’s Public Schools

In these dark days of Trumpism, reasons for optimism are the spars to which the rest of us, the passengers on the now disastrously helmed ship of state, attempt to cling. Diane Ravitch’s new book, Slaying Goliath, is such a spar. It’s a celebration of those who have pushed back against the oligarch-led disruption and attempted privatization of our preK-12 educational system. But it’s more than just a lot of cheering stories (though it is that, and we need those; reading this, you will find yourself cheering again and again). It’s also, effectively, a manual for the Resistance, a how-to book detailing a way forward not only for parents and teachers but for workers generally (and so, like classrooms themselves, it has profound import beyond the classroom). And, of course, the book is imbued with the defining style, wit, intelligence, courage, compassion, and moral clarity we’ve come to associate with the de facto leader of the Resistance to privatization and oligarchy, Diane Ravitch. In stark contrast to, say, our Narcissist-in-Chief part-time President, Ravitch doesn’t speak to promote Ravitch. The greatest leader in the Democratic Resistance Movement not touted and thanked in this book is Diane Ravitch herself. So, let me do that here. Thank you, Dr. Ravitch, for all that you do, every day, because you give a damn about kids and parents and teachers and workers and democracy.

Like Uncle Tom’s Cabin or Silent Spring, Slaying Goliath is one of those books that can make important change happen. Tremble, oligarchs, for our Jeanne d’Arc, our Boadicea, our David is in the field, and millions are ranged behind her, not many millions of Gates or Koch or Walton or DeVos dollars, mind you, but millions of teachers and students and parents and others who care about public schools and other democratic institutions. As Ravitch explains in this book, education disruption and deformation (so-called “Reform”) is not a real movement. It depends entirely on paid Vichy collaborators in league with a handful of profiteering oligarchs in the Billionaire Boys’ and Girls’ Club. But that makes it all the more insidious, pernicious, dangerous.

We are in a phase transition, like a pot of water just before it starts boiling. Or, to change the metaphor, there’s a war going on to determine whether the United States, in the future, will

instantiate a New Feudal Order of oligarchical command, coercion, and control*

or

revive its democratic institutions, flourish free, and prosper.

This book is the chronicle of the beginnings of that war for democracy, of the many battles the good guys have won recently, and an explanation of how we’ve won those battles and can win the war. It’s an inspiring, moving work about teachers in the street and in the statehouse–teachers teaching other workers how democratic change, real change, is made. Ever the historian, but here treating very recent history (and history makers!), Ravitch details in this book an important piece of the current phase transition. And like all truly great historians, she presents the vivid, concrete facts, not a lot of blithering generalities. It is a mark of Ravitch’s keen intelligence that the generalizations she does present–the roadmap to a better world in the making–are earned, are so clearly won.

Need a shot of optimism for the future? Find it here.

Why the optimism? Well, read the book and find out. It’s the most important reading you will do this year.

*An attempt to create a national curriculum gatekeeper, a Curriculum Commissariat and Thought Police, is coming soon from the oligarchs and their sycophantic, toadying, dependent minions–paid bobblehead politicians, judicial and bureaucratic wind-up figures, and Vichy collaborators in think tanks where thinking tanks; be warned. If you care about democracy, kill that beast in its cradle.

This post is hereby released into that beautiful place, the Commons, the Public Domain. Feel free, without changing its wording, to copy and disseminate it. (Please do!) When possible, please include an attribution to its author. Thank you!

Buy the book here: https://www.amazon.com/Slaying-Goliath-Passionate-Resistance-Privatization/dp/0525655379

For more by Bob Shepherd on Education “Reform,” go here: https://bobshepherdonline.wordpress.com/category/ed-reform/

A good place to start: https://bobshepherdonline.wordpress.com/2020/01/06/stopping-by-school-on-a-disruptive-afternoon/

Another: https://bobshepherdonline.wordpress.com/2014/04/10/on-developing-curricula-in-the-age-of-the-thought-police/

For more by Bob Shepherd (including cartoons!) on the Trump misadministration, go here: https://bobshepherdonline.wordpress.com/category/trump-don-the-con/



Stopping by School on a Disruptive Afternoon

after decades of test-driven education “reform”

Whose schools these are, I think I know.
His house is near Seattle though.
He will not see me stopping here
to watch what kids now undergo.

My better angels think it queer
to see a place so void of cheer
what with the tests and data chats,
the data walls with children’s stats.

Where are the joys of yesterday—
when kids would draw and sing and play?
The only sound I hear’s defeat
and pencils on the bubble sheets.

Disrupters say, unflappable,
“We’re building Human Capital!”
Such word goes out from their think tanks,
as they their profits build and bank.

“Music, stories, art, and play
won’t teach Prole children to obey
with servile, certain, gritful grace
and know their rightful, lowly place.”

The fog is heavy, dark and deep.
Where thinking tanks, Deformers creep
and from our children childhood steal
and grind them underneath the wheel.


Disruption of the Commonweal
is that in which Deformers deal
that they might thereby crises fake
as cover whereby they might take
(the smiling villains!) take and take
and take and take and take and take.

Robert D. Shepherd. Copyright 2020. This post may be shared freely. (Please do!) But please include the attribution. Thanks!


For more on Education Deform, by Bob Shepherd, go here: https://bobshepherdonline.wordpress.com/category/ed-reform/


Rudolph the Brown-Nosed | Bob Shepherd

(to the tune of “Rudolph the Red-Nosed Reindeer”)

Rudolph the Ghouliani
had a very brown, brown nose,
squandered his former goodwill,
stroking Trump on TV shows.

All of the other Trumpties
used to laugh and call him names.
Even those abject toadies
thought him crooked and insane.

Then one Foggy Bottom eve,
Trumpty called to say,
“Rude one with your nose so brown,
won’t you take Joe Biden down?”

What happened then’s sheer folly:
thanks to lows the two men reached,
history will most remember that
Trumpty Dumpty was impeached.


For more songs and poems about the Trump the Chump misadministration, go here: https://bobshepherdonline.wordpress.com/…/trump-don-the-con/

For more humor from Bob Shepherd, go here: https://bobshepherdonline.wordpress.com/category/humor/

For more poetry by Bob Shepherd, and essays on the art of poetry, go here: https://bobshepherdonline.wordpress.com/category/poetry/
