What Makes Humans Human?

Little, today, is as it was.

Anatomically modern humans have existed for about 200,000 years, but only since the end of the eighteenth century has artificial lighting been widely used. Gas lamps were introduced in European cities about that time, and electric lights came into widespread use only in the twentieth century.

In other words, for most of human history, when night fell, it fell hard. Things got really, really dark,

and people gathered under the stars, which they could actually see, in those days before nighttime light pollution,

and under those stars, they told stories.

In EVERY culture around the globe, storytelling, in the form of narrative poetry, existed LONG before the invention of writing. We know this because the earliest manuscripts that we have from every culture record stories that were already ancient when they were finally written down. One of the earliest texts in English is that of the poem Beowulf. It reworks and retells, in a much distorted manner, much, much older stories—ones that predate the emergence of English as a distinct language. Stith Thompson, the great folklorist, did the literary world an enormous favor by compiling a massive index, today known as the Aarne-Thompson Index, of motifs of ancient folktales worldwide. Name a story motif—three wishes, talking animals, the grateful dead, cruel stepsisters, golden apples, dragons, the fairy or demon lover, the instrument that plays itself—and you will find that the motif has an ancient pedigree and was already spread about the world long before historical times.

English is a Germanic language. All ancient Germanic societies had official storytellers whose job it was to entertain people in those days before modern entertainments like television and movies and the Internet and drones with laser-guided Hellfire missiles. In ancient Denmark, the storyteller was called a skald. In Anglo-Saxon England, the storyteller was a scop (pronounced like Modern English “shop”). The scop accompanied his stories on the Anglo-Saxon harp, a kind of lyre.

Of course, the telling of stories wasn’t the only entertainment around campfires. In most cultures, people danced and chanted and sang as well, and sometimes stories were told by the dancers or singers or chanters. All this was part of acting out the stories. (Want to know where the Christian devil, with his red body and horns, comes from? Well, in ancient Europe, people worshiped an Earth Mother and her consort, a Lord of the Forest, and they told stories of the hunt. When they acted these out around campfires, they held up to their heads animal horns, or branches in the shape of horns, and that’s how they pictured their Lord of the Forest, as a therianthrope, red from the campfire, with horns. When the Christians spread north across Europe, they made the god of the Old Religion into The Adversary. Grendel’s mother, the monster from the bog in Beowulf, is a demonized version, in a Christian story, of the ancient Anglo-Saxon fertility goddess Nerthus, to whom sacrifices were made by binding people, cutting their throats, and throwing them into a bog. You can see an ancient bas-relief of the Lord of the Forest, btw, on the Gundestrup cauldron, dating from 150 to 1 BCE. See the accompanying illustration.)

But where does this storytelling urge among humans come from, and why is it universal? Storytelling takes energy. And it doesn’t produce tangible results. It doesn’t mend bones or build houses or plant crops. So, why would it survive and be found among every people on Earth from the earliest times onward?

Contemporary cognitive scientists have learned that storytelling is an essential, built-in part of the human psyche, involved in every aspect of our lives, including our dreams, memories, and beliefs about ourselves and the world. Storytelling turns out to be one of the fundamental ways in which our brains are organized to make sense of our experience. Only in very recent years have we come to understand this. We are ESSENTIALLY storytelling creatures, in the Aristotelian sense of essentially. That is, it’s our storytelling that defines us. If that sounds like an overstatement, attend to what I am about to tell you. It’s amazing, and it may make you rethink a LOT of what you think you know.

At the back of each of your eyes is a retina containing rods and cones. These take in visual information from your environment. In each retina, there is a place where the optic nerve, the nerve that carries visual signals to your brain, breaks through it. Because of this interruption of each retina, there is a blind spot in each eye where NO INFORMATION AT ALL IS AVAILABLE. If what you saw were based on what signals actually hit your retinas at a given moment, you would have two big black spots in your field of vision. Instead, you see a continuous visual field. Why? Because your brain automatically fills in the missing information for you, based on what was there when your eye saccaded over it a bit earlier. In other words, your brain makes up a story about what’s there. Spend some time studying optical illusions, and you will learn that this is only one example of many ways in which you don’t see the world as it is but, rather, as the story concocted by your brain says it is.

This sort of filling in of missing pieces also happens with our memories. Scientists have discovered that at any given moment, people attend to at most about seven bits of information from their immediate environment. There’s a well-known limitation of short-term memory to about seven items, give or take two, and that’s why telephone numbers are seven digits long. So, at any given moment, you are attending to only about seven items from, potentially, billions in your environment. When you remember an event, your brain FILLS IN WHAT YOU WERE NOT ATTENDING TO AT THE TIME based on general information you’ve gathered, on its predispositions, and on general beliefs that you have about the world. In short, based on very partial information, your brain makes up and tells you a STORY about that past time, and that is what you “see” in memory in your “mind’s eye.”

So, people tend to have a LOT of false memories because the brain CONFABULATES—it makes up a complete, whole story about what was PROBABLY the case and presents that whole memory to you, with the gaps filled in, for your conscious inspection. In short, memory is very, very, very faulty and is based upon the storytelling functions of the brain! (And what are we except our memories? I am that boy in the Dr. Dentons, in my memory, sitting before the TV with the rabbit ears; I am that teenager in the car at the drive-in with the girl who I never thought in a million years would actually go out with me. But I’m getting ahead of myself.)

You can also see this storytelling function of the brain at work in dreaming. Years ago, I had a dream that I was flying into the island of Cuba on a little prop plane. Through the window, I could see the island below the plane. It looked like a big, white sheet cake, floating in an emerald sea. Next to me on the airplane sat a big, red orangutan smoking a cigar.

Weird, huh? So why did I have that dream? Well, in the days preceding the dream I had read a newspaper story about Fidel Castro, the leader of Cuba, being ill; I had flown on a small prop plane; I had attended a wedding where there was a big, white sheet cake; I had been to the zoo with my grandson, where we saw an orangutan; and I had played golf with some friends, and we had smoked cigars.

The neural circuits in my brain that had recorded these bits and pieces were firing randomly in my sleeping brain, and the part of the brain that does storytelling was working hard, trying to piece these random fragments together into a coherent, unified story. That’s the most plausible current explanation of why most dreams occur: the storytelling parts of the brain are responding to random inputs and tying them together—making sense of this random input by making a plausible story of it. This is akin to pareidolia, the process that leads people to see angels in cloud formations and pictures of Jesus on their toast.

So, those are three important reasons why the brain is set up as a storytelling device. Storytelling allows us to see a complete visual field; creates for us, from incomplete data, coherent memories; and ties together random neural firings in our brains into the wholes that we call dreams.

But that’s not all that storytelling does for us. Storytelling about the future allows us to look ahead—for example, to determine what another creature is going to do. We often play scenarios in our minds that involve possible futures. What will she say if I ask her to the prom? What will the boss say if I ask for a raise? How will that go down? In other words, storytelling provides us with a THEORY OF MIND for predicting others’ behavior.

Stories also help people to connect to one another. When we tell others a story, we literally attune to them. We actually get “on the same wavelength.” Uri Hasson, a neuroscientist at Princeton, recorded the brain activity of people during rest and while they listened to a story. During rest, their patterns of activity were all over the place. While listening to the same story, even at different times and places, those people had brain responses that were in sync.

Storytelling also provides a mechanism for exploring and attempting to understand others generally. Our basic situation in life is that your mind is over there and mine is over here. We’re different, and we have to try to figure each other out—to have a theory of other people’s minds. By telling myself a story about you, I can attempt to bridge that ontological gap. Unfortunately, the stories we tell ourselves about others tend to be fairly unidimensional. You are simply this or that. I, on the other hand, am an international man of mystery. This is a tendency we need to guard against.

We also tell stories in order to influence others’ behavior—to get them to adopt the story we’re telling as their own. This is how advertising works, for example. The advertiser gets you to believe a story about how you will be sexier or smarter or prettier or more successful or of higher status if you just buy the product with the new, fresh lemony scent. And it’s not just advertisers who do this. Donald Trump sold working-class Americans a fiction about how he could strike deals that would make America great again because he was such a great businessman, one who started with nothing and made billions. The coach tells a story in which her team envisions itself as the winners of the Big Game. The woo-er tells the woo-ee the story of the great life they will have together (“Come live with me and be my love/And we shall all the pleasures prove”). And so on. Successful cult leaders, coaches, lovers, entrepreneurs, attorneys, politicians, religious leaders, marketers, etc., all share this in common: they know that persuasion is storytelling. The best of them also understand that the most successful stories, in the long run, are ones that are true, even if they are fictional.

When we tell stories, we spin possible futures—we try things on, hypothetically. And that helps us to develop ideas about who we want to be and what we want to do. Gee, if I travel down that road, I may end up in this better place.

And that observation leads to one final, supremely important function of storytelling: Who you are—your very SELF—is a story that you tell yourself about yourself and your history and your relations to others—a story with you as the main character. The stories you tell yourself about yourself become the person you are. The word person, by the way, comes from the Latin persona, for a mask worn by an actor in the Roman theatre.

So, our very idea of ourselves, of our own personal identity, is dependent upon this storytelling capacity of the human brain, which takes place, for the most part, automatically. There is even a new form of psychotherapy called cognitive narrative therapy that is all about teaching people to tell themselves more life-enhancing, affirmative stories about themselves, about who they are.

Telling yourself the right kinds of stories about yourself and others can unlock your creative potential, improve your relationships, and help you to self-create—to be the person you want to be.

So, to recapitulate, storytelling . . .

helps us to fill in the gaps so that we have coherent memories,

ties together random firings in the brain into coherent dreams,

enables us to sort and make sense of past experience,

gives us theories of what others think and how they will behave,

enables us to influence others’ behavior,

enables us to try on various futures, and

helps us to form a personal identity, a sense of who we are.

Kinda important, all that!

Storytelling, in fact, is key to being human. It’s our defining characteristic. It’s deeply embedded in our brains. It runs through every aspect of our lives. It makes us who we are.

It’s no wonder then, that people throughout history have told stories. People are made to construct stories—plausible and engaging accounts of things—the way a stapler is made to staple and a hammer is made to hammer. We are Homo relator, man the storyteller.

(BTW, the root *man, meaning “human being” in general, without a specific gender reference, is ancient. It goes all the way back to Proto-Indo-European, but there’s still good reason, today, to seek out gender-neutral alternatives, when possible, of course.)

Copyright 2015. Robert D. Shepherd. All rights reserved.

Art: Detail from the Gundestrup Cauldron. Nationalmuseet [CC BY-SA 3.0 (https://creativecommons.org/licenses/by-sa/3.0) or CC BY-SA 2.5 (https://creativecommons.org/licenses/by-sa/2.5)]

For more pieces by Bob Shepherd on the topic of Education “Reform,” go here: https://bobshepherdonline.wordpress.com/category/ed-reform/

For more pieces on the teaching of literature and writing, go here: https://bobshepherdonline.wordpress.com/category/teaching-literature-and-writing/


It’s about Time (a Catena)


A brief tour of fascinating (and lunatic) notions that philosophers (and a few poets) have had about time. 

The Mystery of Time

“What then is time? If no one asks me, I know; if I wish to explain it to one who asks, I know not.”

–St. Augustine (354–430 CE), Confessions

PART 1: What Is Time? Types of Time

Absolute or Scientific Newtonian Time

“Absolute, true and mathematical time, of itself, and from its own nature flows equably without regard to anything external, and by another name is called duration.”

–Sir Isaac Newton (1643–1727), Philosophiae naturalis principia mathematica (Mathematical Principles of Natural Philosophy)

The Specious (Nonexistent) Present

“The relation of experience to time has not been profoundly studied. Its objects are given as being of the present, but the part of time referred to by the datum is a very different thing from the conterminous of the past and future which philosophy denotes by the name Present. The present to which the datum refers is really a part of the past — a recent past — delusively given as being a time that intervenes between the past and the future. Let it be named the specious present, and let the past, that is given as being the past, be known as the obvious past. [Each of] all the notes of a bar of a song seem to the listener to be contained in the [specious] present. [Each of] all the changes of place of a meteor seem to the beholder to be contained in the [specious] present. At the instant of the termination of [each element in] such series, no part of the time measured by them seems to be [an obvious] past. Time, then, considered relatively to human apprehension, consists of four parts, viz., the obvious past, the specious present, the real present, and the future. Omitting the specious present, it consists of three . . . nonentities — the [obvious] past, which does not [really] exist, the future, which does not [yet] exist, and their conterminous, the [specious] present; the faculty from which it proceeds lies to us in the fiction of the specious present.”

–E. Robert Kelley, from The Alternative: A Study in Psychology (1882). Kelley’s concept of the specious present has been extremely influential in both Continental and Anglo-American philosophy despite the fact that Kelley was not a professional philosopher.

Subjective Time

“Oh, yeah. Hegel’s Phenomenology of Spirit. I never finished it, though I did spend about a year with it one evening.”

Experienced Time: The “Wide” Present

“In short, the practically cognized present is no knife-edge, but a saddle-back, with a certain breadth of its own on which we sit perched, and from which we look in two directions into time. The unit of composition of our perception of time is a duration, with a bow and a stern, as it were—a rearward- and a forward-looking end. It is only as parts of this duration-block that the relation or succession of one end to the other is perceived. We do not first feel one end and then feel the other after it, and forming the perception of the succession infer an interval of time between, but we seem to feel the interval of time as a whole, with its two ends embedded in it.”

–William James, “The Perception of Time,” from The Principles of Psychology, Book I

A, B, and C Series Time (Three Ways of Looking at Time)

  • The A Series: Time as Past, Present, and Future
  • The B Series: Time as Earlier, Simultaneous, and Later
  • The C Series: Time as an Ordered Relation of Events (with the direction being irrelevant)

Influential distinctions made by J. M. E. McTaggart in “The Unreality of Time,” Mind 17 (1908): 457–474. The three series are much discussed by philosophers in the Anglo-American analytic tradition.

See also The Unreality of Time 2: Block Time, below

PART 2: Does Time Exist?

No, It Doesn’t: Change Is a Self-Contradictory Idea

“For this view can never predominate, that that which IS NOT exists. You must debar your thought from this way of search. . . .There is only one other description of the way remaining, namely, that what IS, is. To this way there are very many signposts: that Being has no coming-into-being . . . . Nor shall I allow you to speak or think of it as springing from not-being; for it is neither expressive nor thinkable that what-is-not is. . . . How could Being perish? How could it come into being? If it came into being, it is not; and so too if it is about-to-be at some future time. . . .For nothing else either is or shall be except Being, since Fate has tied it down to be a whole and motionless; therefore all things that mortals have established, believing in their truth, are just a name: Becoming and Perishing, Being and Not-Being, and Change of position, and alteration of bright color.”

–Parmenides of Elea (c. 475 BCE), fragment from The Way of Truth, in Ancilla to the Pre-Socratic Philosophers, ed. Kathleen Freeman

“Does the arrow move when the archer shoots it at the target? If there is a reality of space, the arrow must at all times occupy a particular position in space on its way to the target. But for an arrow to occupy a position in space that is equal to its length is precisely what is meant when one says that the arrow is at rest. Since the arrow must always occupy such a position on its trajectory which is equal to its length, the arrow must be always at rest. Therefore, motion is an illusion.”

–Zeno of Elea (c. 450 BCE), fragment from Epicheiremata (Attacks), in Ancilla to the Pre-Socratic Philosophers, ed. Kathleen Freeman

“One part of time has been [the past] and is not, while the other is going to be and is not yet [the future]. Yet time, both infinite time and any time you care to take, is made up of these. One would naturally suppose that what is made up of things which do not exist could have no share in reality.”

–Aristotle (384–322 BCE), Physics IV, 10–14, 217b–224a.

Yes, It Does: Change Is the Fundamental Reality of Our Lives

“It is not possible to step twice into the same river.”

–Heraclitus (c. 475 BCE), fragment from an unnamed book, in Ancilla to the Pre-Socratic Philosophers, ed. Kathleen Freeman

[Heraclitus seems to have held this fact to be one of many indications of the essential unworthiness/irredeemability of this life; the other fragments of his writings that have survived suggest that Heraclitus was a kind of 5th-century fundamentalist preacher, upset about the moral decay around him, who viewed the world as synonymous with decay, and who wanted to point his readers, instead, toward the eternal Logos. Plato inherited this view; the Christian church inherited Plato’s. Such contemptus mundi (contempt for the world) is often, in that tradition, expressed as contempt for that which exists “in time” and is not eternal.]

“Time is nature’s way of keeping everything from happening at once.”

–Woody Allen (1935– )


No, It Doesn’t: Time Is an Illusion Due to Vantage Point in an Eternal Spacetime (the “Block Time” Hypothesis)

“Now Besso has departed from this strange world a little ahead of me. That means nothing, for we physicists believe the separation between past, present, and future is only an illusion, although a convincing one.”

–Albert Einstein (1879–1955), in a letter written to the family of Michele Besso, on Besso’s death

“All time is all time. It does not change. It does not lend itself to warnings or explanations. It simply is. Take it moment by moment, and you will find that we are all, as I’ve said before, bugs in amber.”

–Kurt Vonnegut, Jr. (1922–2007), who is in heaven now, Slaughterhouse-Five

Time present and time past
Are both perhaps present in time future,
And time future contained in time past.
If all time is eternally present
All time is unredeemable.

–T.S. Eliot (1888–1965), “Burnt Norton,” from Four Quartets

No, It Doesn’t: The Now as Consequence of the Blindness of the Brain to Its Own Processing of Temporal Data (the “Blind Brain” Hypothesis)

“Nothing, I think, illustrates this forced magic quite like the experiential present, the Now. Recall what we discussed earlier regarding the visual field. Although it’s true that you can never explicitly ‘see the limits of seeing’–no matter how fast you move your head–those limits are nonetheless a central structural feature of seeing. The way your visual field simply ‘runs out’ without edge or demarcation is implicit in all seeing–and, I suspect, without the benefit of any ‘visual run off’ circuits. Your field of vision simply hangs in a kind of blindness you cannot see.

“This, the Blind Brain Hypothesis suggests, is what the now is: a temporal analogue to the edgelessness of vision, an implicit structural artifact of the way our ‘temporal field’–what James called the ‘specious present’–hangs in a kind of temporal hyper-blindness. Time passes in experience, sure, but thanks to the information horizon of the thalamocortical system, experience itself stands still, and with nary a neural circuit to send a Christmas card to. There is time in experience, but no time of experience. The same way seeing relies on secondary systems to stitch our keyhole glimpses into a visual world, timing relies on things like narrative and long term memory to situate our present within a greater temporal context.

“Given the Blind Brain Hypothesis, you would expect the thalamocortical system to track time against a background of temporal oblivion. You would expect something like the Now. Perhaps this is why, no matter where we find ourselves on the line of history, we always stand at the beginning. Thus the paradoxical structure of sayings like, “Today is the first day of the rest of your life.” We’re not simply running on hamster wheels, we are hamster wheels, traveling lifetimes without moving at all.

“Which is to say that the Blind Brain Hypothesis offers possible theoretical purchase on the apparent absurdity of conscious existence, the way a life of differences can be crammed into a singular moment.”

–Scott Bakker, “The End of the World As We Knew It: Neuroscience and the Semantic Apocalypse”

PART 3: What Contemplation of Time Teaches Us about Living

Carpe Diem

“Such,” he said, “O King, seems to me the present life of men on Earth, in comparison with that time which to us is uncertain, as if when on a winter’s night, you sit feasting . . . and a simple sparrow should fly into the hall, and coming in at one door, instantly fly out through another. In that time in which it is indoors it is indeed not touched by the fury of winter; but yet, this smallest space of calmness being passed almost in a flash, from winter going into winter again, it is lost to our eyes.

“Something like this appears the life of man, but of what follows or what went before, we are utterly ignorant.”

–The Venerable Bede (c. 672–735), Ecclesiastical History of the English People, Book II


“Seize the day, trusting as little as possible in the future.”

–Horace (65–8 BCE), Odes 1.11

Oh, come with old Khayyam, and leave the Wise
To talk; one thing is certain, that Life flies;
One thing is certain, and the Rest is Lies;
The Flower that once has blown for ever dies.

–Omar Khayyám (1048–1131), “Rubáiyát,” trans. Edward FitzGerald

Gather ye rosebuds while ye may
Old Time is still a-flying:
And this same flower that smiles to-day
To-morrow will be dying.

–Robert Herrick (1591–1674), “To the Virgins, to Make Much of Time”

But at my back I alwaies hear
Times winged Charriot hurrying near:
And yonder all before us lye
Desarts of vast Eternity.
Thy Beauty shall no more be found;
Nor, in thy marble Vault, shall sound
My ecchoing Song: then Worms shall try
That long preserv’d Virginity:
And your quaint Honour turn to dust;
And into ashes all my Lust.
The Grave’s a fine and private place,
But none I think do there embrace.
Now therefore, while the youthful hew
Sits on thy skin like morning glew,
And while thy willing Soul transpires
At every pore with instant Fires,
Now let us sport us while we may;
And now, like am’rous birds of prey,
Rather at once our Time devour,
Than languish in his slow-chapt pow’r.
Let us roll all our Strength, and all
Our sweetness, up into one Ball:
And tear our Pleasures with rough strife,
Thorough the Iron gates of Life.
Thus, though we cannot make our Sun
Stand still, yet we will make him run.

–Andrew Marvell (1621–1678), “To His Coy Mistress”

“Get it while you can.
Don’t you turn your back on love.”

–The American philosopher Janis Joplin (1943–1970)

Give Up/It’s All Futile Anyway

“A man finds himself, to his great astonishment, suddenly existing, after thousands of years of nonexistence: he lives for a little while; and then, again, comes an equally long period when he must exist no more. The heart rebels against this, and feels that it cannot be true.

“Of every event in our life we can say only for one moment that it is; for ever after, that it was. Every evening we are poorer by a day. It might, perhaps, make us mad to see how rapidly our short span of time ebbs away; if it were not that in the furthest depths of our being we are secretly conscious of our share in the inexhaustible spring of eternity, so that we can always hope to find life in it again.

“Consideration of the kind, touched on above, might, indeed, lead us to embrace the belief that the greatest wisdom is to make the enjoyment of the present the supreme object of life; because that is the only reality, all else being merely the play of thought. On the other hand, such a course might just as well be called the greatest folly: for that which in the next moment exists no more, and vanishes utterly, like a dream, can never be worth a serious effort.”

–The ever-cheerful Arthur Schopenhauer (1788–1860), “The Vanity of Existence,” from Studies in Pessimism

Three Phenomenologist/Existentialist Views of Time

NB: the following are NOT quotations. I’ve summarized material that appears in much longer works. You’re welcome. I have included Husserl in this section, even though his work is just an attempted explanation of time, because the other two philosophers treated here are reacting to Husserl’s ideas.

Husserl (very bright dude, this one): All our ideas about time spring from our conscious experience of the present. That experience is characterized by being intentional, by being toward something. We typically recognize three kinds of time: 1. scientific, objective, Newtonian time, which we think of as being independent of ourselves and as independently verifiable; 2. subjective time, in which events seem to move slower or faster; and 3. phenomenological or intentional time, which is the fundamental experience on which the other concepts of time are based, from which the other concepts derive because the phenomenological present includes not only awareness of present phenomena (the present), but retention (awareness of that which is not present because it no longer is—the past), and protention (awareness of that which is not present because it is about to be). The present is intentionality toward phenomena before us here, now. The past is present intentionality toward phenomena that are not present but are with us and so must be past (that’s where the definition of past comes from). The future is present intentionality toward phenomena that are likewise not present but, unlike the past, are not yet with us and so must be the future, which will be (that’s where the definition of future comes from). Therefore, in their origins in our phenomenological experiences, the future and the past are parts of the present, conceptual phenomena held in the present, alongside actual phenomena, as phenomena no longer present and not yet present.

Heidegger: Husserl had it all wrong. It’s the future, not the present, that is fundamental. We are future-oriented temporalities by nature, essentially so. Our particular type of being, Dasein, or being-there, is characterized by having care (about its projects, its current conditions, about other beings)—about matters as they relate to those projects. Our being is characterized by understanding, thrownness, and fallenness. Understanding is the most fundamental of the three. It is projection toward the future, comportment toward the possibilities that present themselves, potentiality for being. Our understanding seizes upon projects, projecting itself on various possibilities. In its thrownness, Dasein always finds itself in a certain spiritual and material, historically conditioned environment that limits the space of those possibilities. As fallenness, Dasein finds itself among other beings, some of which are also Dasein and some of which (e.g., rocks) are not Dasein, and it has, generally respectively, “being-with” them or “being alongside” them, and these help to define what possibilities there are. “Our sort of being (Dasein) is being for which being is an issue.” Why is it an issue? Well, we are finite. We know that we are going to die. This is the undercurrent that informs our essential being, which is care, concern. We are projections toward the future because undertaking these projects is an attempt, however quixotic, to distract ourselves from or even to cheat death. We care about our projects because, at some level, we care about not dying, having this projection toward the future for which we are living.

Sartre: The world is divided into two kinds of being: being-for-itself (the kind of being that you and I have) and being-in-itself (the kind of being that a rock or a refrigerator has). Let’s think a bit about our kind of being. Take away your perceptions, your body, your thoughts. Strip everything away, and you still have pure being, the being of the being-for-itself, but it is a being that is also nothing. (The Buddha thought this, too.) Being-for-itself has intentional objects, but itself is no object (there’s no there there) and so is nothing, a nothingness. Time is like being in that respect. It consists entirely of the past (which doesn’t exist) and the future (which doesn’t exist) and the present (which is infinitesimally small and so doesn’t exist). So time, like being, is a nothingness. This being-for-itself is not just nothingness, however; it has some other bizarre, contradictory characteristics: Its being, though nothing, allows a world to be manifest (how this is so is unclear), a world that includes all this stuff, including others, for example, who want to objectify the being-for-itself, to make it into a something, a thing, a being-in-itself, like a rock. (“Oh, I know you. I’m wise to you. You’re . . . .” whatever.) The being-for-itself also has a present past (in Husserl’s sense) and is subject to certain conditions of material construction (the body) and material conditions (in an environment of things), and all these givens—the body, the environment, one’s own past, and other people seen from the outside in their thinginess—make up the being-for-itself’s facticity. The being-for-itself wants to be SOMETHING, and so lies to itself. It acts in bad faith, playing various roles (playing at being a waiter, for example) and creating for itself an ego (via self-deceptive, magical thinking). But in fact, being in reality nothing, being-for-itself (each of us) knows that that’s all a lie. We transcend our facticity and can be anything whatsoever, act in any way whatsoever. In other words, we are absolutely free and therefore absolutely responsible. This responsibility is absurd, because there is no reason for being/doing any particular thing. “Man is a useless passion.” But the absolute freedom that derives from our essential nothingness also allows for action to be truly authentic (as opposed to the play-acting) in addition to being responsible. Only in death does the being-for-itself succeed in becoming a being-in-itself, a completed thing, and then only if and in the manner in which he or she is remembered by others. A person who is not remembered never existed. Death is a time stamp or, if we are not remembered, an expiration date.

The Eternal Return and the Weight of Being

“341. The Greatest Weight. What, if some day or night a demon were to steal after you into your loneliest loneliness and say to you: ‘This life as you now live it and have lived it, you will have to live once more and innumerable times more; and there will be nothing new in it, but every pain and every joy and every thought and sigh and everything unutterably small or great in your life will have to return to you, all in the same succession and sequence—even this spider and this moonlight between the trees, and even this moment and I myself. The eternal hourglass of existence is turned upside down again and again, and you with it, speck of dust!’

“Would you not throw yourself down and gnash your teeth and curse the demon who spoke thus? Or have you once experienced a tremendous moment when you would have answered him: “You are a god and never have I heard anything more divine.” If this thought gained possession of you, it would change you as you are or perhaps crush you. The question in each and every thing, “Do you desire this once more and innumerable times more?” would lie upon your actions as the greatest weight. Or how well disposed would you have to become to yourself and to life to crave nothing more fervently than this ultimate eternal confirmation and seal?”

–Friedrich Nietzsche (1844–1900), The Gay Science

The Fleeting One-Offness of Everything and the Resulting Unbearable Lightness of Being

“But Nietzsche’s demon is, of course, wrong. There is no eternal return. Where does that leave us? Isn’t life ALWAYS a matter of I should have’s and I would have’s and if I had only knowns? “[W]hat happens but once, might as well not have happened at all. If we have only one life to live, we might as well not have lived at all. . . .

“The heaviest of burdens crushes us, we sink beneath it, it pins us to the ground. But in love poetry of every age, the woman longs to be weighed down by the man’s body. The heaviest of burdens is therefore simultaneously an image of life’s most intense fulfillment. The heavier the burden, the closer our lives come to the earth, the more real and truthful they become. Conversely, the absolute absence of burden causes man to be lighter than air, to soar into heights, take leave of the earth and his earthly being, and become only half real, his movements as free as they are insignificant. What then shall we choose? Weight or lightness?”

–Milan Kundera (1929– ), contra Nietzsche, from The Unbearable Lightness of Being

Copyright 2010, Robert D. Shepherd. All rights reserved.


Where Did Frank Herbert Get the Idea for the Spice Worms of Dune?

Recently I was reading an obscure text called 3 Baruch, a piece of Christian pseudepigrapha from perhaps the second century CE that describes a vision of five heavens on the part of the titular character. In this book I discovered where Frank Herbert got his idea for giant worms that excrete spice, which he describes as looking and tasting like cinnamon but, unlike cinnamon, having powerful, psyche-expanding, psychotropic, entheogenic, psychedelic properties. (NB: Psyche is a Greek word meaning both “mind” and “soul.”) Here’s the relevant text, from 3 Baruch 6:3–12:

And I said to the angel, “What is this bird?”

And he said to me, “This is the guardian of the earth.”

And I said, “Lord, how is he the guardian of the earth? Teach me.”

And the angel said to me, “This bird flies alongside of the sun, and expanding his wings receives its fiery rays. For if he were not receiving them, the human race would not be preserved, nor any other living creature. But God appointed this bird thereto.”

And he expanded his wings, and I saw on his right wing very large letters, as large as the space of a threshing-floor, the size of about four thousand modii; and the letters were of gold. And the angel said to me, “Read them.” And I read, and they ran thus: “Neither earth nor heaven bring me forth, but wings of fire bring me forth.”

And I said, “Lord, what is this bird, and what is his name?”

And the angel said to me, “His name is called Phoenix.”

(And I said), “And what does he eat?”

And he said to me, “The manna of heaven and the dew of earth.”

And I said, “Does the bird excrete?”

And he said to me, “He excretes a worm, and the excrement of the worm is cinnamon, which kings and princes use.”

Herbert was a great student of ancient religions, as readers of the Dune series (and watchers of the recent Dune films) will know. So, my contribution to Herbert scholarship. You’re welcome.

NB: Copyright 2014, Robert D. Shepherd. This post may be freely distributed IF this copyright notice is retained.


Getting Clear about the Difference between Sex and Gender

Much of current debate about sex and gender is utterly confused, and the confusion comes from not recognizing the crucial distinction between sex and gender. A lot of unnecessary problems could be avoided by keeping this straight.

–from the article

The brilliant French novelist and philosopher Simone de Beauvoir gave Jean-Paul Sartre, her long-term lover, most of his best ideas, the ones that became the philosophical system known as Existentialism. However, this isn’t her only claim to fame. She also, in her seminal 1949 work Le Deuxième Sexe (The Second Sex), introduced into general circulation the crucial distinction between sex and gender. It’s astonishing how many people, 75 years later, still don’t understand this distinction, so let me try to clarify it. First, what Beauvoir wrote:

On ne naît pas femme, on le devient [One is not born a woman; one becomes one].

She was not, of course, saying that one isn’t (typically) born with either male or female genitals. This is true for all but a small percentage of kids (about 1.7 percent of kids are born intersex—with partially male and partially female sexual organs). What Beauvoir meant was that the characteristics associated with womanhood—what roles one plays, how one dresses, what accessories one wears, who one’s friends are, how one sits and walks, and so on—are culturally, not biologically, occasioned and acquired. They are a matter of gender.

In English, we are fortunate enough to have two distinct words that can be appropriated for the following distinct purposes:

We can (and should) use female and male to refer to the biological inheritance—to the biological sexual characteristics that we are born with and that we develop over time based on our genetic programming. These characteristics constitute our sex.

We can (and should) use woman and man to refer to the acquired, acculturated characteristics traditionally ascribed to and associated with particular sexes (to ones taught us by our culture)—to matters like roles, dress, and learned sex-specific behaviors. These characteristics constitute our gender.

So, to update Beauvoir, one is born female or male (i.e., someone with a particular sex) and becomes a woman or a man or some combination thereof (i.e., someone with certain non-necessary, acquired gender characteristics).

Sex is given (in most cases). Gender is not. The genders that people typically acquire vary depending on time and place. Among the Masai, for example, MEN wear elaborate jewelry and brightly colored clothing; engage in small handicrafts; and spend a lot of time in groups, gossiping—precisely the characteristics widely considered in the 1950s appropriate to American WOMEN. No one teaches a person to have a penis or a clitoris, a scrotum or labia majora. These are simply givens (in most cases). Of course, no one typically sits American boys down and tells them not to wear dresses, either. This behavioral propensity is not explicitly taught but acquired, from behavioral models in the ambient environment. Gender is acquired. Sex is not.

By default, gender comes about by what the French Marxist critic Louis Althusser called “interpellation”—unconscious acquisition of cultural norms. But this is not necessarily so. Because it is acquired, gender is open to being modified with some ease. I used to teach the kids in my acting classes how to walk and sit like people of the opposite gender. This was eye-opening for them. In today’s repressive era of the Moms for the Liberty to Constrain Your Liberty, aka the Minivan Taliban and the Ku Klux Karens, I would probably be fired for these exercises, which my students found fascinating and illuminating.

Much of current debate about sex and gender is utterly confused, and the confusion comes from not recognizing the crucial distinction between sex and gender. A lot of problems could easily be avoided by keeping this straight.

For example, it’s important for young people to recognize that they can experiment with gender change or (even better, to my mind) fluidity WITHOUT THIS HAVING ANYTHING TO DO WITH THEIR BIOLOGICAL SEX. Consider, for example, this fact: Studies have shown that people speak much more nicely to female clerks than to male ones. They use slower and sweeter voices. Well, wouldn’t it be a good idea to speak nicely to male clerks, too? And if boys want to wear makeup, why the hell not? Why is this exclusively for girls? Certainly, male movie stars and politicians wear it all the time. My mother took a lot of grief back in the 1960s for wearing pants. Why should young men take grief for wearing skirts or dresses? See, for example, the Islamic thobe, the African dashiki suit, the Sumerian kaunakes, the ancient Greek chiton, the Christian priestly cassock or soutane, the Greek fustanella, the ancient Roman and Medieval European tunic, the Sikh baana or chola, the Samoan lavalava, the Japanese hakama, the Palestinian qumbaz, the Southeast Asian sarong, the Indian dhoti or veshti or lungi, the Scottish kilt, and many others.

Recognizing the distinction between sex and gender can lead to a NEW BIRTH OF FREEDOM—to people being able to explore freely gender-related options formerly closed to them—roles, ways of acting and speaking, choice of adornments and activities and partners and friends, and so on. And recognizing that gender and sex ARE DIFFERENT THINGS can lead people not to make decisions about medical treatments and changes to their bodies that they might later regret. People can have the freedom to explore alternate gender expressions without going to such extremes until they are old enough and certain enough to do so. They can also explore various sexual orientations without regard to sex OR gender, of course, and have a right to do so.


Gayle Greene on How to Build a Human

How do you build a world-class human? Well, you give him or her the benefits of a broad, humane liberal arts education that confers judgment, wisdom, vision, and generosity. In her new book, Immeasurable Outcomes: Teaching Shakespeare in the Age of the Algorithm, Gayle Greene, a renowned Shakespeare scholar and Professor Emerita at Scripps College, shows us, with examples from her classes over three decades, exactly how that is done. And she doesn’t do this at some high level of abstraction. Rather, she backs up her profound general observations with concrete, vivid, fascinating, moving, funny, honest, delightful examples from her classes.

She also shows us how, under the “standards”-and-testing occupation of our schools, that development of well-rounded, liberally educated young people is being lost.

This engaging book is a full-throated defense of the Liberal Arts and of traditional, humane, in-person, discussion-based education in a time when Liberal Arts schools and programs are being more than decimated, are being damned-near destroyed by bean counters and champions of ed tech. Here’s the beauty and value of the book: contra the “Reformers,” Greene details the extraordinary benefits of the broad, liberal educations that built in the United States the people who created the most powerful, vibrant, and diverse economy in history. She makes the case (I know. It’s bizarre that one would have to) for not taking a wrecking ball to what has worked. 

Some background: Like much of Europe between 1939 and 1945, education in the United States, at every level, is now under occupation. The occupation is led by Bill Gates and the Gates Foundation and abetted by countless collaborators like those paid by Gates to create the puerile and failed Common Core (which was not core—that is, central, key, or foundational—and was common only in the sense of being vulgar). The bean counting under the occupation via its demonstrably invalid, pseudoscientific testing regime has made of schooling in the U.S. a diminished thing, with debased and devolved test-preppy curricula (teaching materials) and pedagogy (teaching methods).

In the midst of this, Greene engages in some delightful bomb throwing for the Resistance.

OK.  Let’s try another metaphor. If Gates’s test-and-punish movement, ludicrously called “Education Reform,” is a metastasizing cancer on our educational system, and it is, then Professor Greene’s book is a prescription for how to reverse course and then practice prevention to end the stultification of education and keep it from coming back. 

Years ago, I knew a fellow who retired after a lucrative, successful career. But a couple months later, he was back at his old job. I asked him why he had decided not simply to enjoy his retirement. He certainly had the money to do so.

“Well, Bob,” he said, “there’s only so much playing solitaire one can do.”

I found this answer depressing. I wondered if it were the case that over the years, the fellow had given so much time to work that when he no longer had that to occupy him, he was bored to tears. Had he not built up the internal resources he needed to keep himself happy and engaged ON HIS OWN? Greene quotes, in her book, Judith Shapiro, former president of Barnard College, saying, “You want the inside of your head to be an interesting place to spend the rest of your life.” The French novelist Honoré de Balzac put it this way: “The cultured man is never bored.” Humane learning leads to engagement with ideas and with the world, and as Happiness Studies have shown repeatedly, outward-directed engagement, as opposed to self-obsession, leads to fulfillment, to flourishing over a lifetime, to what the ancient Greeks called eudaimonia, or wellness of spirit. Kinda important, that.

In a time when Gates and his minions, including his impressive collection of political and bureaucratic action figures and bobble-head dolls, are arguing that colleges should become worker factories and do away with programs and requirements not directly related to particular jobs, it turns out that the people happiest in their jobs are ones with well-rounded liberal arts educations, and these are the ones who are best at what they do. And it turns out that people taught how to read and think and communicate and be creative and flexible, people who gain a broad base of knowledge of sciences, history, mathematics, arts, literature, and philosophy, are self-directed learners who can figure out what they need to know in a particular situation and acquire that knowledge. Philosophy students turn out to be great lawyers, doctors, politicians, and political operatives. Traditional liberal arts instruction creates intrinsically motivated people—just the sort of people that employers in their right minds want and certainly the sort that most employers need.

All this and more about the value of liberal arts education Professor Greene makes abundantly clear, and she does so in prose that is sometimes witty, sometimes hilarious, sometimes annoyed, sometimes incredulous (as in, “I can’t believe I even have to protest this shit”); always engaging, human and humane, compassionate, wise, authentic/real; and often profound. As much memoir as polemic, the book is a delight to read in addition to being important politically and culturally.

Gates and his ilk, little men with big money to throw around, look at the liberal arts and don’t see any immediate application to, say, writing code in Python or figuring out how many pallets per hour a warehouse can move. What could possibly be the value of reading Gilgamesh and Lear? Well, what one encounters in these is the familiar in the unfamiliar. All real learning is unlearning. You have to step through the wardrobe or fall down the rabbit hole or pass through the portal in the space/time continuum to a place beyond your interpellations, beyond the collective fantasies that go by the name of common sense. Real learning requires a period of estrangement from the familiar. You return to find the ordinary transmuted and wondrous and replete with possibility. You become a flexible, creative thinker. You see the world anew, as on the first day of creation, as though for the first time. Vietnam Veterans would often say, “You wouldn’t know because you weren’t there, man.” Well, people who haven’t had those experiences via liberal arts educations don’t know this because they haven’t been there, man.

Gayle Greene has spent a lifetime, Maria Sabina-like, guiding young people through such experiences. Her classroom trip reports alone are worth your time and the modest price of this book. At one point, Professor Greene riffs on the meaning of the word bounty. This is a book by a bounteous mind/spirit about the bountifulness of her beloved liberal arts. Go ahead. Buy it. Treat yourself.

Copyright 2024, Robert D. Shepherd. All rights reserved. This review may be copied and distributed freely as long as this notice is included.

For more by Bob Shepherd about teaching literature and writing, go here: https://bobshepherdonline.wordpress.com/category/teaching-literature-and-writing/

For short stories by Bob Shepherd, go here: https://bobshepherdonline.wordpress.com/category/short-stories/

For poetry by Bob Shepherd and essays on poetry, go here: https://bobshepherdonline.wordpress.com/category/poetry/.


Memory and the Construction of Self

Copyright 2010 Robert D. Shepherd. All rights reserved. NB: I wrote this back in 2010. Just getting around to posting it. The material I cover still stands, I think, though I might do some slight revision at some point.

Think of a time when you were swimming—at the beach, in a swimming pool, in a river or lake. Take a moment to close your eyes and picture the event. (What follows will be better if you actually do this.)

About half the time, when people recall such a memory, they picture themselves from the point of view of a third-person observer. I think of a time when I went to St. Martin and, immediately after checking into my hotel, went for a swim. I see myself running down the exterior hotel stairway, crossing the sand, plunging into the sea, and taking long, crawling strokes in the turquoise water. This memory is doubtless influenced by a photograph I have from the time, one taken from the hotel’s second-floor balcony by my then-wife.

First- and Third-Person Memory

Being a third-person memory, my recollection is not, of course, precisely what I experienced. I could not, obviously, have been a third-person observer of myself!  In the present, we see the world not from the third-person perspective but in the first person, from the inside looking out. As I sit writing these words, I see the monitor in front of me, a cup of coffee, and if I glance down, the tip of my nose and my fingers at the keyboard. But if I remember this experience later, I’m likely to remember it not from the inside but from the outside: I’m likely to see in my mind’s eye the whole of me, sitting at a computer, writing. First-person memories tend to be phenomenally rich, whereas third-person memories tend to be of more extended duration, to be narrative (which is a clue, perhaps, to their construction).[1] 

Third-person autobiographical memories should give us pause because they are, to some extent, confabulations, or reconstructions, part real and part imaginary. They are not simple, retrieved events, pure sensory experience as taken in at the time (what psychologists call field memories) but, rather, stories that we tell ourselves, just-so stories about how things must have been. The philosopher Eric Schwitzgebel has theorized that our tendency to create third-person memories is related to our heavy consumption of movies and television, that we have learned to play movies in our heads, just as we now report dreaming in color, whereas in the 1950s and earlier, people most commonly reported dreaming in black and white.[2] Whether or not that is so, third-person memory is certainly quite common now. The naïve view of memory is that it is like a video recording in the head: this is what happened to me. I know. I was there. But as third-person memories demonstrate, we humans are quite capable of deceiving ourselves about what we remember.

Suggestibility and “Recovered” Memory: Deference to Authoritative Accounts

Sometimes our self-deception can be about entire events. In a famous experiment, psychologist Elizabeth Loftus and her research associate Jacqueline Pickrell gave subjects booklets containing four stories from their pasts. The stories were supposedly gleaned from interviews with the subjects’ relatives. In each case, however, one of the stories, about the subject’s getting lost in the mall at the age of five, was false. In follow-up interviews, Loftus and Pickrell asked their subjects how much detail they could remember from each event. Unsurprisingly, subjects recalled with some detail about 68 percent of the true events. What was surprising was that about 29 percent of the subjects reported recollections of the false event, often providing elaborate detail about it.[3] It’s an experiment that has been repeated numerous times by various researchers with similar results.

There’s little consequence, of course, to having a false memory of being temporarily lost in the mall, but not all false memories are so benign. Years ago, when I was living in Chicago, a lawyer friend told me of a case she had taken on earlier in her career. Her client was an elderly African-American man who worked as a janitor in a largely white preschool in a largely white church of which he was a deacon. The man stood accused of molesting some of the students in the preschool, and a story about the molestation appeared on the front page of one of the Chicago papers. As the case developed, the children’s stories became more and more bizarre. They told of being burned in ovens and having large objects inserted in their orifices, but there was no physical evidence of this supposed abuse. In the end, psychologists ascertained that the children had confabulated. They had been visited by a social worker who gave them a demonstration about abuse using anatomically correct dolls. The mostly white children, scared anyway by a person who was older and not of the same race and who often appeared mysteriously from around corners, made it all up in their discussions with one another. What started as just-so stories, like the ones that kids tell about abandoned houses and dark closets and other objects of their fears, became magnified and reified, or given actuality. All charges were dropped against the elderly man, but his life was ruined. Abuse of children is unfortunately common, but in this case, no actual abuse had occurred.

Human suggestibility with regard to memory can have devastating consequences. Lawrence Wright's disturbing and gripping book Remembering Satan[4] tells the story of Paul Ingram, a sheriff's department deputy and Republican Party county chairman in Washington state who fell victim to false accusations that he had molested his daughters when they were young and had later subjected them to Satanic ritual abuse. The daughters had fallen under the influence of a pair of psychologists who coached them through the process of "recovering" supposedly forgotten memories of abuse. As a result, Ingram himself came to believe that there must be some truth to what the daughters were saying; he was falsely convicted of molestation and spent years in prison for crimes he did not commit. As in Salem during the witch trials, the daughters' imagined experiences grew in complexity until they implicated a great many townspeople in an abusive Satanic cult. Eventually, other psychologists were called in by the courts, and the whole edifice of the daughters' fabrications, erected under the influence of their psychologist Rasputins, fell apart.

Supposedly repressed and recovered memories have played a key role in many such cases in the United States and elsewhere, so many, in fact, that the False Memory Syndrome Foundation was established to assist victims of false memories planted during therapy, though the work of this foundation and the validity of recovered memories remain contentious. A large-scale study by Elke Geraerts and colleagues at Harvard and Maastricht looked at three types of memories of child abuse: ones continuously remembered, ones spontaneously recovered in adulthood, and ones recovered in therapy. Spontaneously recovered memories were corroborated about as often as continuous ones (37 percent of the time and 45 percent of the time, respectively), but memories recovered in therapy were not corroborated at all. The study by Geraerts and her colleagues suggests both that memories of traumatic events are extremely faulty and that people are extremely susceptible to manipulation of their memories.[5] Though recovered memories are questionable, there is no question that child abuse itself is a common problem, and the difficulties that people have with their memories cut both ways: it can simultaneously be the case that recovered memories are suspect AND that memories of real abuse are often buried or whitewashed.

Memories of Misinterpreted Experiences

A number of years ago, I was living in Massachusetts and was single and dating. Having met in my dating life a few young women who were dealing with significant psychological issues, including bulimia and depression, I thought I might benefit from a class in the psychology of women offered by the Harvard Extension Program. So, I took the class, taught by a renowned feminist psychologist, and there I met a young woman who was convinced that she had been abducted by aliens. As it happened, around the same time, John Edward Mack of Harvard Medical School had studied sixty people who claimed to have experienced alien abduction. Dr. Mack spoke, once, with the woman I met, but she was never one of his patients or major research subjects. Interestingly, Dr. Mack reported that "The majority of abductees do not appear to be deluded, confabulating, lying, self-dramatizing, or suffering from a clear mental illness."[6] The woman I met fit Dr. Mack's description. She was bright, thoughtful, normal in every way, but she seriously believed, was in fact certain, that she had been abducted numerous times. Her stories of these abductions followed the classic plot line: She would awaken to find herself paralyzed, with creatures standing around her bed. She didn't use the term, but her description fit that of the Grays, as UFO buffs call them: small aliens with big heads, large eyes, childlike bodies, and four long, ET-like fingers on each hand. The Grays would mill about the bed a bit. Then the woman would feel herself lifted up in a beam of light and mist. The beam of light would carry her aboard an alien spaceship, where the aliens would perform various experiments on her. All the while, she would be immobile but perfectly conscious and completely, abjectly terrified. Eventually, the Grays would render her unconscious, and she would awaken in her own bed. I shall never forget what this woman told me about these experiences: "Don't tell me I imagined these things. I know they happened. I was there, just as I am here with you right now."

Various explanations have been offered for the alien abduction experience. One is that the pineal gland produces small amounts of the psychotropic compound dimethyltryptamine, or DMT, which is known to cause self-appointed psychonauts to experience alien presences. The late Terence McKenna, an enthusiastic advocate of the use of hallucinogens, wrote and spoke often of the alien "machine elves" whom he met and spoke with while under the influence of DMT. The most widely accepted explanation of the alien abduction phenomenon, however, is that during REM sleep, our brains protect us from acting out our dreams, and so possibly hurting ourselves, by inhibiting, post-synaptically, the operation of motor neurons, thus preventing the stimulation of our muscles. By this account, abductees awaken into a hypnagogic, or dreamlike, state and find themselves paralyzed. In their susceptible, liminal condition, somewhere between waking and sleeping, their brains confabulate, making up a story to explain why they are in this pickle, and that story, that waking dream or nightmare, is what they remember. Because the "abductees" live in a time in which big-eyed, big-headed Gray aliens are as close as the local video store, their waking nightmares sometimes take on a form borrowed from the popular culture. In Medieval times, such dreams took the form of succubi or witches. In fact, the story of a witch waking someone and riding him through the skies is what gives us our very word nightmare. (Of course, another possible explanation of alien abduction is that people are sometimes abducted by aliens, but that's not a terribly parsimonious explanation, is it?)[7]

The Internal Story-Teller

Dreams can be quite bizarre. For a time, I kept a dream journal. In one of the more unusual dreams in that journal, I was in an airplane, a small prop plane that was flying into the island of Cuba, but in the dream, the island was a large, white-frosted sheet cake floating in a cliché of an emerald sea. Later, it was easy enough for me to piece together the sources of this dream. I had just returned from a trip, one leg of which was in a small prop plane. The day of the dream, there had been a news story about the illness of Fidel Castro. I had recently been to a wedding where there was a large cake (though not of the sheet variety—that must have been an adaptation of the cake idea to the topography of an island).

A widely held theory of dreaming is that it occurs as the mind sorts out and catalogues recent events.[8] Recently used neural pathways fire, and our pattern-making brains attempt to make sense of these random firings, putting them together into a coherent narrative. If the pathways that are firing are wildly divergent, we get these surreal dreams—islands that are wedding cakes. In another dream, I was again on an airplane and a large and, of course, red orangutan sitting next to me offered me a cigar. Come to think of it, that’s not so bizarre. I often find myself on airplanes sitting next to someone who is distinctly simian.

Dreams, alien abduction narratives, and confabulations great and small are revealing because they remind us of something very important about how people work: We are storytelling creatures. It's not just when we are sleeping or in hypnagogic states that our brains are busy making sense of the world by telling us stories. It's all the time. And when we get new information—we see a picture of our former selves, or a relative tells us about our getting lost in the mall at the age of five—our brains work to integrate that information into the narrative of our lives that we carry around with us. We take in sensory experiences and other information, and then we dream-weave it all into a narrative. That narrative, as much as the actual sensory experiences themselves, becomes memory, and our memories are, to a large extent, who we are. I believe myself to be a particular person with a particular history. I am the boy of five padding in his Dr. Denton's across the floor of his grandparents' upstairs bedroom at night to get a glimpse of a Ferris wheel, far across the darkened cornfields, turning red and green and golden in a dreamlike distance. I am the sixteen-year-old in the car at the drive-in movie trying to get up the courage (I never did) to kiss the amazing girl who I never in my wildest dreams thought would go out with me. I am the hopeful applicant for his first editorial job, sitting across the desk from the renowned Editor-in-Chief staring at me over his broken reading glasses, which he has cobbled together with a bit of Scotch tape. A self, an identity, is the summation of a great many such stories.

Suggestibility and False Memory 2: Deference to Social Sanction

But how true are the stories? Memory is notoriously faulty. Consider the following experience from another psychology class, one that I took in my freshman year of college: I was sitting in a large auditorium with some two hundred or so other students, listening to the professor, when costumed people burst in through the back door of the lecture hall, shouting and making a disturbance. They ran down one aisle (as I remember it), yelled a few things, leapt onto the stage, scattered the professor's notes into the air, and then disappeared off the stage and through a side doorway. The event, of course, had been staged. The professor had us all write down what had just happened, and then we compared notes. My fellow students in the auditorium didn't agree on much of anything at first—on how many people there were, on what they were wearing, on what they said, on what they did. Among other things, this event was a dramatic demonstration of the inaccuracy and inconsistency of eyewitness accounts. We humans have difficulty with the accurate recollection of experience. We're not very good at it. Furthermore, we all have a tendency to confabulate, especially in social settings, where we have an all-too-frequent tendency to fall into groupthink and to start believing that we remember what other people confirm. (We shall return to this subject later in this book.) In the disruption demonstration/experiment, as the discussion continued, students began separating into groups—the ones who were certain that there were three intruders and those who were equally sure that there were four, for example. For many people, certainty about what had happened increased over time, as they rehearsed it, and during this process, there was a lot of "Hey, yeah, I remember that too" going on.

Filling in the Gaps

Perhaps you think that you are not a confabulator, not someone who adds details to fill out the story, and certainly not someone who will remember something differently because of someone else's suggestion. Lest you fall into that trap, let me remind you that confabulation is central to sensory experience itself. Notoriously, we all have the feeling that we see the entire visual field before our eyes, but in fact, we all have blind spots in our visual fields, caused by the fact that each retina is interrupted in an area called the optic disc, where ganglion cell axons and blood vessels pass through the retina to form the optic nerve. We view the world as continuous because our brains confabulate, filling in the missing details, telling us just-so stories.

But it's worse than that. It's not just that our perceptual systems regularly and systematically fool us. Memory is slippery. It's susceptible to error because of drowsiness, illness, inebriation, inattention, stress or other strong emotion, and weakening or disappearance over time. A couple of other problems with memory are particularly interesting. First, what goes into memory is severely limited. For a long-term memory to be formed, it first has to go through the narrow funnel of working memory. In a famous paper called "The Magical Number Seven, Plus or Minus Two," the psychologist George Miller pointed out that only seven or so distinct items can be held in working memory at any given moment. That's often said to be why, for example, American telephone numbers are seven digits long. We can increase this "working space" in memory by chunking, by putting items together into groups. So, it's much easier to remember the string

S E T R K T A I M A R F A N

if we rearrange the letters and break them up into I'M A STAR TREK FAN. But the point remains that of the innumerable things happening at any given moment, only a precious few gain admittance to working memory and thus have any hope at all of being transferred into long-term memory.[9] The rest we assume, or fabricate, to put it less euphemistically, in later recall. Well, I was in my living room, so I must have been seeing this, that, or the other, our brains might as well be saying, though, of course, the brain does this unconsciously. In short, we actually attend to very few items at any given moment, but our brains are so constructed as to integrate what we were actually attending to with what we know, or think we know, about the world, to prepare a long-term memory that is whole and consistent, and to present THAT confabulated memory to consciousness. If the long-term memory is of the third-person type, the confabulation is obvious, but we also confabulate first-person, field memories. It's how we are made.
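To make the chunking point concrete, here is a toy sketch in Python (my illustration, not anything from Miller's paper): the same fourteen letters overwhelm working memory as raw units but fit comfortably as five chunks.

    # Chunking: 14 raw letters exceed the ~7-item span of working memory,
    # but the same letters grouped into 5 meaningful chunks fit easily.
    letters = list("SETRKTAIMARFAN")              # 14 separate items
    chunks = ["I'M", "A", "STAR", "TREK", "FAN"]  # 5 chunks, within 7 +/- 2

    # Same letters either way; only the grouping differs.
    assert sorted("".join(chunks).replace("'", "")) == sorted(letters)

    print(len(letters), "raw items vs.", len(chunks), "chunks")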

Inattentional Blindness

An important but often unremarked consequence of the limitations on working memory is inattentional blindness. As we have just seen, we can, at any time, attend to only a few things. The rest we are blind to. In another famous experiment, Daniel Simons of the University of Illinois and Christopher Chabris of Harvard showed subjects films of people passing a basketball around and asked them to count the number of passes. In the course of the films, a woman walked into the scene, sometimes carrying an umbrella and sometimes wearing a gorilla suit. Dutifully attending to their task, about half of the subjects didn't see these oddities—the umbrella or the gorilla in the midst of the basketball game! Memories typically have a wholeness about them, but most of that wholeness is imagined. When our brains do their work, telling us our stories, they make use of the material that we actually stored, and they fill in the rest. And sometimes they miss really interesting or important stuff, like the 800-pound gorilla in the room!

History as Confabulation: Narrativizing as Interpretation, or “Making Sense”

But it’s even worse than that.  Not only does memory fail us, and not only do our brains commonly and automatically fill in the gaps to make up for those failures, but we also, because of our story-telling natures, impose upon what we remember, or think we remember, narrative frames that serve to interpret and thus make sense of the events of our lives. Over forty years ago, the historiographer Hayden White wrote an influential essay, “The Historical Text as Literary Artifact,” in which he argued that when we discuss an historical event, we inevitably select some aspects of that event and not others, for time and scholarship are both limited, and every event might as well be infinitely complex. That much of White’s thesis is uncontroversial. What is controversial, and of enormous consequence, is White’s contention that we claim to have understood an historical event only after we have imposed upon it a narrative frame—an archetypal story, typically with a protagonist and antagonist, heroes and villains, a central conflict, an inciting incident, rising action, a climax or turning point, a resolution, and a denouement. The narrative frame exists not in the events themselves, but in our minds, as part of our collective cultural inheritance. Joseph Campbell famously proposed in The Hero with a Thousand Faces that a great many stories from folklore and mythology have a common form: a young and inexperienced person, not yet aware that he is a chosen one, sets out on a journey. He encounters a being who gives him a gift that will prove extremely important. He undergoes a trial or series of trials, succeeds as a result of his character and the gift, and emerges with some boon that he is able to share with others on his return. Joseph Campbell’s monomyth is one example of the kind of archetypal, interpretive narrative frame that gets imposed on events.

Returning to Hayden White's thesis: to one person, the founding of Israel is the story of an astonishing people, dispossessed, scattered to the winds (the setting out on the journey), subject to pogroms and persecutions (trials), who, astonishingly and against incredible odds, maintain their cultural identity, keep the flame of their nationhood alive by teaching every male child to read (the gift), suffer a horrific Holocaust (more trials), and then, vowing never to allow such a thing to happen again, reclaim their historical birthright and carve out a nation in the midst of enemies, even going to the extent, unparalleled in human history, of reviving a scholarly, "dead" language, Hebrew, and making it once again the living tongue of everyday social interaction (the boon). I, myself, find this story, thus told, quite compelling and moving. In a very different version of these events, an international movement (radical anti-Semites would say "conspiracy") leads to an influx of Jews into Palestine after the Second World War, and these Jews, taking advantage of a vote in the newly established United Nations, declare themselves a state and forcibly expel over 700,000 native Arabs from their homes. Both stories are true. The telling depends, critically, upon which events one chooses to emphasize. Overemphasis on one set of facts confirms some people in an obstinate unwillingness to make the concessions necessary to secure a lasting peace. Overemphasis on the other set of facts leads other people to horrific acts of terrorism.

Consider, to take another example, this quotation from a white, American man of the nineteenth century:

"I will say then that I am not, nor ever have been in favor of bringing about in any way the social and political equality of the white and black races—that I am not nor ever have been in favor of making voters or jurors of negroes, nor of qualifying them to hold office, nor to intermarry with white people; and I will say in addition to this that there is a physical difference between the white and black races which I believe will forever forbid the two races living together on terms of social and political equality."[10]

Now consider this quotation, also from a nineteenth-century white, American male:

“On the question of liberty, as a principle, we are not what we have been. When we were the political slaves of King George, and wanted to be free, we called the maxim that ‘all men are created equal’ a self evident truth; but now when we have grown fat, and have lost all dread of being slaves ourselves, we have become so greedy to be masters that we call the same maxim ‘a self evident lie.’”[11]

The first of these statements, from a contemporary perspective, seems outlandish and shocking, the second reasonable and evident. You may have guessed, however, or may know, if you are a student of history or have glanced at the endnotes, that these two men are the same person: Abraham Lincoln, whom we remember as the "great emancipator." Our view of Lincoln depends, critically, on which of his statements and actions we attend to and from which parts of his life they come, and it depends as well on what narrative we construct from our selections and how nuanced that narrative is. Lincoln did, indeed, oppose slavery and consider it an evil from early in his life, but his views on the subject were far from monolithic, and they evolved over time. In short, they were complicated. And that's always true of history. Whenever we look closely at some past event, we find that it is a lot more complex than are the simplistic accounts (one might call them myths) typically presented in high-school "social studies" textbooks.[12]

The Punch and Judy Show: Making Sense in Relationships

What Hayden White says of history is true of our personal histories too. What makes the stories that we tell ourselves about our lives into stories, and not just collections of facts, is that we selectively recall facts and we impose narrative frames upon them. The central character is a given. Each of us is the protagonist in his or her own tragicomedy. But we also identify antagonists and conflicts and moments of crisis and resolution. We create causal maps to explain why things happened as they did, often involving imputed motivations. So, in the stories that we tell ourselves from our own lives, we do not simply recall events; we interpret them. Two people, let's call them Punch and Judy, are in a relationship. They both recall an evening when they went to the theatre. Punch has decided that Judy is stubborn, that she must have her way about everything. So, when his brain reconstructs a memory, it automatically constructs it from bits and pieces, using that guiding principle. He conflates several actual times, over thirty years, in which Judy acted in a stubborn way and puts them all into the memory of that one evening. She refused to go to the show Punch had bought tickets for until her friend talked about how great it was. She insisted on changing seats to sit on the outside. She refused to let him out until intermission, even though he needed to go to the bathroom. She insisted afterward that the leading lady was wearing a yellow dress at the beginning of Act II instead of a green one. One of these things actually happened on that evening. One never happened. Two happened, but at different events over the years. Judy, on the other hand, has decided that Punch often makes a spectacle of himself in public—that he has no decorum or tact. So, she has her own list of "things that happened on that evening at the theatre": he interrupted the show and caused a scene by getting up to go to the bathroom ten minutes into Act I; he insisted on wearing, that evening, that ridiculous-looking jacket with the tux-like lapels; he told the waiter at dinner afterward how terrible the leading lady was, and that waiter was the leading lady's good friend. And so on. But again, some of the things she remembers from that evening happened at other times or didn't happen at all. They are confabulations that fit a general view that she has come to.

And this is common when relationships are in the process of failing. The day comes when one partner decides that the other is X—whatever X is—and everything that happens after that is confirmation. The "evidence" grows that the situation is intolerable, and the person decides that the relationship is over, even though much of this "evidence" is confabulation.

We all are the central characters in our own stories, and we have a tendency to tell those stories to ourselves in a self-serving way, to remember our moments of glory and to forget or downplay those times that weren't our most shining hours. And sometimes the stories that we tell impute personality traits or motivations to others that are absent or barely there. Often, those imputations fuel resentment that festers and makes us cynical or mean-spirited, when we would really be much better off letting it go, moving on, or, if we can't, at least considering the possibility that our interpretations are interpretations, not verbatim transcripts of reality.

Writing Our Stories v. Having Them Write Us

The stories from our lives are not created equal. Rather, we all tend to run a few critical, defining stories in our heads. Sometimes, these stories have only a tenuous connection to third-person, objectively verifiable reality, and sometimes, they can be terribly, terribly damaging, as when a person tells himself or herself, over and over, a story of victimization and in so doing becomes a perpetual victim. A number of clinical psychologists have recognized this and have created something called cognitive narrative therapy. The idea is to help people alter the stories that they tell themselves in crucial, life-enhancing ways. So, the victim of childhood molestation learns to think: No, I was not responsible for the liberties that my relative took with me when I was a child, and no, I was not at fault when he was found out. I was a child, and he was a pedophile, a person with a deep and terrible sickness. It was not my fault, and the story that I've been telling myself about it is deeply flawed.

As a senior in college, having finished the requirements for a degree in English, I experienced a crisis of faith. I had noticed, in my reading of books, essays, and journal articles in my field, that literary critics and theorists typically devoted about a third of their energies to their topics, a third to displaying their erudition, and a third to protecting their intellectual turf. Did I really want to go to graduate school, become an English professor, and spend my life writing journal articles with titles like "Tiresias among the Daffodils: The Hermeneutics of Sexual Identity in Jacobean Pastorale"? Such articles were typically read by ten other scholars whose main motivation for doing so was to gather ammunition to refute what was said in the infinitely more brilliant articles that they were going to write. This didn't seem a worthy use of a life. And yet the critical tools themselves were not to blame. Literary critics, take note: Many of the tools in your workshop, developed for the purpose of literary analysis, are extremely valuable for making sense of our life stories and for subjecting those stories to criticism. So, if you are looking for a way to make what you do even more relevant, that's an idea. Many literary types already know this, of course.

We’ve seen that we are (To how large an extent? Try this for homework.) the stories that we tell ourselves about our lives. Some of those stories are even partially true! We’ve seen that inevitably our stories are based upon fragmentary evidence and are at least partially confabulated as a result of our storytelling gifts, our ability to “fill in the gaps.” We’ve also seen that sometimes we can benefit enormously from critical analysis of our own collection of life stories, and particularly of those stories that we replay a lot. If we are our stories, then we are, all of us, at least partially fabrications. That’s an unsettling idea, but it’s also liberating, for we can learn to take our own life stories with a grain of salt and so gain nuance in our understandings of ourselves and of others. And, instead of engaging in another Punch and Judy show with a partner or friend when we have differing memories of some event, perhaps we can have some understanding of how these differences arise and less certainty about the superiority of our own narratives. I’ve begun this work on uncertainty with an examination of what we know of ourselves because surely, of all that we know, we know ourselves best. But even there, as we have seen, there is reason for skepticism, for significant uncertainty, and that skepticism, that uncertainty, can be extremely healthy.


[1] Georgia Nigro and Ulric Neisser, "Point of View in Personal Memories," Cognitive Psychology 15 (1983): 467-82.

[2] Eric Schwitzgebel, "Remembering from the Third-Person Perspective?" The Splintered Mind: Reflections in Philosophy of Psychology, Broadly Construed (blog), June 6, 2007, http://schwitzsplinters.blogspot.com/2007/06/remembering-from-third-person.html.

[3] Elizabeth F. Loftus and Jacqueline E. Pickrell, "The Formation of False Memories," Psychiatric Annals 25, no. 12 (1995): 720-25.

[4] Lawrence Wright, Remembering Satan: A Tragic Case of Recovered Memory (New York: Vintage, 1995).

[5] Elke Geraerts et al., "The Reality of Recovered Memories: Corroborating Continuous and Discontinuous Memories of Childhood Sexual Abuse," Psychological Science 18, no. 7 (July 2007): 564-68.

[6] Harvard University Gazette, July 24, 1992.

[7] Parsimony as a criterion for judging potential explanations is generally attributed to the medieval scholar William of Ockham, to whom is often credited the statement that Entia non sunt multiplicanda praeter necessitatem, or "Entities should not be multiplied unnecessarily." As is often the case with famous quotations and their attributions, this one does not come from Ockham, though it is likely that Ockham would have approved of it. The principle of parsimony, often referred to as Ockham's razor, is that one should look for the simplest explanation that fits the facts. There's no reason, of course, why explanations have to be simple. Events, for example, often have multiple causes. But there is good reason for not making explanations too complicated, for one could make up an infinite number of complicated but false explanations to fit any set of facts. Similarly, Einstein is often credited with having said, "Make things as simple as possible, but not simpler," which I have not been able to verify, though he did say, in a lecture given in 1933, that "It can scarcely be denied that the supreme goal of all theory is to make the irreducible basic elements as simple and few as possible without having to surrender the adequate representation of a single datum of experience." ("On the Method of Theoretical Physics," Herbert Spencer Lecture, Oxford, June 1933, in Philosophy of Science 1, no. 2 (April 1934): 163-69.)

[8] See, for example, Gabrielle Girardeau et al., "Selective Suppression of Hippocampal Ripples Impairs Spatial Memory," Nature Neuroscience, 2009, http://www.nature.com/neuro/journal/vaop/ncurrent/abs/nn.2384.html.

[9] All this is made much more complicated by the fact that we are continually taking in information on some level and processing but not attending to it. By working memory, here, I am referring to the new information that we are capable of consciously attending to.

[10] Abraham Lincoln, Debate with Stephen A. Douglas at Charleston, Illinois, 1858

[11] Abraham Lincoln, Letter to George Robertson, 1855

[12] See, for example, James W. Loewen, Lies My Teacher Told Me: Everything Your American History Textbook Got Wrong (New York: The New Press, 2005). The title is, of course, an exaggeration. Oops. And Loewen is himself perfectly capable of getting some things wrong. The book remains, however, an interesting, amusing, occasionally enlightening, and sometimes disturbing read. I myself got a lesson, years ago, in how difficult the work of an historian is when I created a series of books called Doing History. The idea behind the books was to coach kids through examining primary source materials—maps, letters, ships' logs, oral histories, that sort of thing. My colleagues and I decided that we wanted to be very serious about getting our facts right. We didn't want to produce books like the American history book that said that Sputnik was a nuclear device or the popular biology text that said that blood returning to the heart is blue! (One could go on and on multiplying these examples.) We soon found, though, when we set to work verifying our facts, that as often as not, the facts we assumed to be true were disputed or questionable or flat-out wrong, and at some point, often, we just had to give up and use other material!


Fantasy versus Science Fiction

NB: Embedded below is a revelation about the origins of an important element in the Dune series for those who love Herbert’s books as much as I do. My contribution to Dune scholarship.

I recently read an annotated list of the greatest Science Fiction ever written, and included on this list was a book that its compiler called "the first science fiction novel," The Description of a New World, Called the Blazing-World (BW). The designation "first science fiction novel" is significant for literary history, so this claim invites investigation. Written by Margaret Cavendish, Duchess of Newcastle, and published in 1666, BW tells the following story: a merchant kidnaps a woman he is enamored of, but the ship carrying her away from her father's land is caught in a tempest that, improbably, blows it all the way to the North Pole. The men onboard are frozen to death, but the woman, "by the light of her Beauty, the heat of her Youth, and Protection of the Gods," remains, again improbably, alive. So, there we have a taste, at the beginning of this novel, of what the whole will be like. Winds blow ships across entire oceans, and beauty is protection against freezing temperatures. The protagonist becomes Empress of the Blazing-World and then leads an assault on our world using an army of fish-men and bird-men. This novel is purest Fantasy, like the talking geese and 100-year sleeps of fairytales. BW would be Science Fiction only if you defined the term so loosely as to include all works that are not realist, that contain unreal elements. By that definition, The Goose That Laid the Golden Egg would be Science Fiction. So, btw, would South Park.

They aren’t.

So, the title of first Science Fiction novel arguably (and most remarkably) remains with Frankenstein; or, The Modern Prometheus, begun by Mary Shelley when she was just 18 (!!!!!) years old.

Standard dictionary definitions work in this way: they place the referents of the thing to be defined into a general class of thing, the genus, and then differentiate these referents from other members of the class by means of a list of distinguishing characteristics, the differentiae. So, for example:

Micrometer. A device for measuring small distances between two faces using a calibrated screw.

genus: a device

differentiae: a) for measuring small distances b) between two faces c) using a calibrated screw
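Because a genus-and-differentiae definition has such a regular shape, it can even be modeled as a little data structure. Here is a toy sketch in Python (mine, purely illustrative) that stores the micrometer definition above and recombines it into dictionary-style prose:

    # A genus-and-differentiae definition modeled as a data structure.
    from dataclasses import dataclass

    @dataclass
    class Definition:
        term: str
        genus: str
        differentiae: list[str]

    micrometer = Definition(
        term="micrometer",
        genus="a device",
        differentiae=[
            "for measuring small distances",
            "between two faces",
            "using a calibrated screw",
        ],
    )

    def render(d: Definition) -> str:
        # Recombine genus and differentiae into dictionary-style prose.
        return f"{d.term.capitalize()}. {d.genus.capitalize()} {' '.join(d.differentiae)}."

    print(render(micrometer))
    # Micrometer. A device for measuring small distances between two faces using a calibrated screw.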

So, what is a work of Science Fiction? Well,

Science Fiction. A story based upon scientific fact or speculation that is untrue and that involves strange events, technologies, characters, and/or settings.

genus: a story

differentiae: a) based upon scientific fact or speculation b) that is untrue and c) that involves strange events, technologies, characters, and/or settings.

This can be opposed to

Fantasy. A story that violates scientific law or fact that is untrue and that involves strange events, technologies, characters, and/or settings.

genus: a story

differentiae: a) that violates scientific law or fact b) that is untrue and c) involves strange events, technologies, characters, and/or settings

So the difference between these two genus-and-differentiae (dictionary-style) definitions is that the former contains the differentia "based upon scientific fact or speculation," whereas the latter does not; in fact, the latter VIOLATES scientific fact. How? Well, consider the following list of common fairytale motifs:

enchanted or cursed objects (cauldrons, cloaks, damsels, drinking horns, feathers, kingdoms, rings, wells);

the grateful dead;

fantastic creatures (elves, dragons, trolls, Hobbits);

talking animals;

witchcraft and sorcery.

None of these exists FOR REAL in nature. So, none of them is scientific. Now, consider the existence of a being who once was dead but is now alive. This motif of the resurrected being could belong to Science Fiction (if there is a scientific mechanism for the resurrection) or Fantasy (if there is not) or to Myth (if the story is meant or was originally meant actually to be believed and if the agency of the resurrection is action by a god—see such stories in the Hebrew Bible or Ovid’s Metamorphoses for examples).

What is supposed to set Science Fiction apart from Fantasy is that in the former, the unreal stuff is supposed to have a scientific basis. Frank Herbert, author of the Dune series of novels, was a great Science Fiction author in part because he was a master of SCIENCE-BASED world building. So, for example, many current Fantasy novels are set in worlds where people fight with blades of various kinds—swords, daggers, and the like. The Game of Thrones novels and Tolkien's Ring trilogy are examples. But there is not, in those works, a SCIENTIFIC reason why the characters fight as they do. In each case, that they do so is simply part of the Fantasy world that the author created and reflects the adventure novels from which such works derive.

In Herbert's work, however, people fight with blades FOR A SCIENTIFIC REASON. Thousands of years before the main character of the first book, Paul Atreides, is born, humans fought a war known as the Butlerian Jihad against world-dominating artificial intelligences, and after the humans won, they banned "thinking machines" and related technologies. This is not, however, why people fight with blades instead of projectiles, for not all projectile weapons rely on computers, and conventional weapons of war remain part of the Dune universe (the noble houses keep private stockpiles of nuclear weapons, for example). Here's the actual explanation: in the Dune universe, an engineer named Holtzman invented a technology whereby elementary particles could be induced to exert a repellent force, but this force engages only against objects with high momentum. Herbert is vague about how this works, but at least he tried to work this scientific explanation into his books. The "Holtzman Effect" is used in the novels to create personal shields that repel incoming projectiles. The shields cannot stop, however, objects with little momentum, like slowly thrust daggers and swords. That Herbert bothered to come up with a scientific explanation marks his work as Science Fiction rather than as Fantasy.
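Herbert's shield rule reduces, in effect, to a simple threshold test. Here is a toy sketch in Python of that logic; the cutoff value is my invention for the demo, since Herbert never gives a number:

    # The Holtzman shield as a threshold rule: fast projectiles are
    # repelled; a slow blade slips through. The cutoff value is invented.
    SHIELD_SPEED_CUTOFF = 9.0  # meters per second; purely illustrative

    def penetrates_shield(speed_m_s: float) -> bool:
        """Return True if an attack gets through the shield."""
        return speed_m_s < SHIELD_SPEED_CUTOFF

    print(penetrates_shield(300.0))  # a bullet: False -- repelled
    print(penetrates_shield(2.0))    # a slow knife thrust: True -- it penetrates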

Cavendish's book is NOT Science Fiction, because it contains unreal elements that are not scientific (bird-men, for example) and so violates natural law right and left. It belongs, rather, to the ancient genre of the Fantastic Traveler's Tale, a subgenre, like the Fairytale, of Fantasy. Far from inventing a new genre, Her Grace Margaret Cavendish simply wrote yet another example of a type of literature that had been around, most likely, since before recorded history. Some traveler's tales:

The Book of Enoch, aka 1 Enoch (BE1; circa 350-100 BCE), an eponymous story purportedly by the father of the long-lived Biblical character Methuselah, begins with a tale called The Book of the Watchers, which tells the story of the Watchers, angels who took human wives in violation of the will of God. The wives bore the Nephilim, giant sons who ran amok, and the angels taught humans such "immoralities" as wearing makeup and fornicating excessively. This section of the book also contains a travelogue of Enoch's journeys to the place of punishment of the fallen angels and to Sheol, or hell. Another part of BE1, The Book of Dream Visions, relates a history of mankind from the Flood until the establishment of the New Jerusalem after Judgment Day. 2 Enoch (BE2) contains an account of Enoch's travels through seven of the ten heavens, and in the last of these, he is shown the throne of God from a great distance (the throne itself is located in the tenth heaven).

The Histories (430-420 BCE) are the basis on which people refer to their Greek author, Herodotus, as the first historian. They tell the story of the wars between Persia and the Greek city-states, but they also contain many, many accounts of other places and peoples based on Herodotus's travels and readings. The book is valuable as history but is also full of fantastic creatures and events, such as phoenixes and griffins. One of the most interesting stories in it tells of the god Pan appearing to the Athenian runner Philippides on the road to Sparta. Herodotus is a magnificent storyteller. Do yourself a favor and get a copy of the Landmark edition of The Histories.

A True Story/A True History (Latin: Vera Historia or Verae Historiae; 2nd century CE), by Lucian of Samosata, is a literary satire of other Fantastic Traveler's Tales and is meant to show by its ludicrousness that such tales are not to be believed. It tells how the narrator, sailing out through the Pillars of Hercules, was blown off course to an island with a river of wine and trees in the shape of women. A whirlwind then carries him and his fellow sailors to the Moon, where they get involved in a war between the King of the Moon and the King of the Sun. On their return to Earth, the sailors are swallowed by a whale and do battle with the fish people within it. Escaping, they have other adventures involving a sea of milk and an island of cheese, various heroes from ancient myth and history, and a sort of hell for tellers of Fantastic Traveler's Tales.

3 Baruch, aka the Greek Apocalypse of Baruch (70-280 CE), is attributed to its eponymous author, supposedly the scribe of the Prophet Jeremiah from the Bible. It opens with Baruch lamenting the fall of Jerusalem and the destruction of the temple. An angel then appears to him and takes him on a journey. He visits the First and Second Heavens, where he sees humans with animal faces being punished for building the Tower of Babel. The angel then takes Baruch to Hades, where he sees the Tree of Knowledge of Good and Evil and learns that it wasn't planted in Eden by God. He also learns how the Flood killed hundreds of thousands of giants (the Nephilim?). In the Third Heaven, Baruch encounters a Phoenix, which excretes a worm that makes cinnamon: "And I said, Does the bird excrete? And he said to me, He excretes a worm, and the excrement of the worm is cinnamon, which kings and princes use" (3 Baruch 12). Frank Herbert, author of Dune, was an avid reader of the scriptures of various religions, and it seems quite likely to me that he got his idea for the Spice, Melange, from this book, 3 Baruch. It seems far too coincidental that both would involve worms that produce a cinnamon-flavored spice beloved by kings and princes. Lol. In Herbert's novels, the excretions of sandtrout, the larvae of sandworms, explode when they encounter water and end up as Spice, mixed with the sand, on the surface. In the end, Baruch learns that the Jewish Temple has been miraculously transported to Heaven and rebuilt there for use by the faithful in the afterlife, so his original lamentation is answered and his sorrow assuaged.

The Divine Comedy (Italian: Divina Commedia; circa 1308-21) is a long, allegorical account by Dante Alighieri, in three books of three-line verse stanzas, of a journey through Hell (the Inferno), Purgatory, and Heaven. A comedy, at this time, was not necessarily a humorous work but one with a happy ending. In this traveler's tale of all traveler's tales, Dante is led by guides (a common feature of Traveler's Tales), first the poet Virgil and later his beloved Beatrice, and in the end receives a beatific vision of God (a great gift or boon or treasure is often the culmination of the journey in Fantastic Traveler's Tales).

One could go on and on with these. There were histories like Marco Polo's Book of Marvels of the World (circa 1300) and Richard Hakluyt's Divers Voyages Touching the Discoverie of America and the Ilands Adjacent unto the Same, Made First of all by our Englishmen and Afterwards by the Frenchmen and Britons (1582) (from which Shakespeare drew some of his stories). And then there are completely fanciful traveler's tales like Swift's Gulliver's Travels, Voltaire's Candide, and de Saint-Exupéry's The Little Prince. In one excruciatingly poorly written seventeenth-century traveler's tale that I read recently, a captain sails to a land made of boobs and buttocks and visits a queen named Voluptua who rules a land where every carnal delight is readily available (or was it a place called Voluptua? I have repressed the memory—the book was that bad). There is a whole category of Hollow Earth, or Subterranean, Fantasy books about travel to countries below the Earth's surface. More than a hundred of these have been written over the centuries, by my count, from Ludvig Holberg's Niels Klim's Underground Travels (1741) to Giacomo Casanova's Icosaméron (1788) to Edgar Rice Burroughs's At the Earth's Core (1914) to Rick Riordan's The Battle of the Labyrinth (2008), which is part of the Percy Jackson series of Fantasy novels. In a nineteenth-century Fantastic Traveler's Tale novel that I read recently, the protagonist travels to the North Pole and there finds an entrance to a world with a utopian society that presents a model for reimagining life on Earth. This book is typical of the genre.

Language is constantly changing, and folks who oppose this tide might as well have their swords drawn against the sea or be holding their fingers in a breached dike. HOWEVER, there are some distinctions worth maintaining. For example, one wants a disinterested judge, not an uninterested one, and it is far better to be amused than bemused. Alas, years ago, bookstores and publishers started conflating the terms Science Fiction and Fantasy. Sometimes they lumped the two together under the perfectly acceptable title Fantasy and Science Fiction. But then they just started calling utterly unrealistic novels about lands with elves and real wizards Science Fiction. And that's a problem. One result is that today, if one wants to find a good new Science Fiction novel to read, one has to wade through a ton of Fantasy drek to find it. Another is that fewer and fewer writers have the discipline, like Herbert, to create imaginary worlds that are scientifically plausible. And that's just laziness. And, of course, there's a reason why Fantasy is so popular among 13-year-olds. A lot of it is truly childish. The conflation of Science Fiction and Fantasy has led to the gruesome neologism Hard Science Fiction, which is just, damn it, Science Fiction.

P.S.

What distinguishes good Fantasy writing from drek? Well, originality, and the substitution of the internal consistency and logic of the fanciful universe for scientific law. In other words, the quality of the world-building. But that's a subject for another essay.


Shiva at the DMV

[NB: This is from my as-yet unpublished novel Pagan Moon. All rights reserved. ]

Wherein are intimated certain mysteries of the calculus generally skipped over in introductory courses. . . .

The Main Chapel of the Living Word. An ancient man on the stage, but who? A teacher. The chapel dark. He is reading out a lesson. Slowly, deliberately: 

"You're on fire. Always. Moses and the bush that burns but is not consumed. Heraclitus: the world is a fire, forever kindled and forever going out. The Mahabharata: the world is continually destroyed and continually remade. But your perception is so slow that you see it as continuous. It's a matter of thresholds.

“Unfold her. Touch her with the touch of a snowflake on the late summer grasses in South Dakota. Almost not a touch at all, but there enough in the quickened perception of the first moment. Her loins arch toward your touch, increasing its pressure. There. It doesn’t matter that you are up for it. Of course you are. Can you forget your dick for a moment? Are you worthy? Her body will tell you if you listen with your body. There. There. Yes. More. No. Stop. Slowly. Slowly. Yes. Yes. If you listen with your body, she will tell you how to take her to the moment of the instantaneous infinite. 

“Zeno showed long ago that by material means you cannot go from one place to another. You must learn the calculus of the body and then forget your learning in the being there. And if you do that, her slow burning will catch a fire that will consume her. Let Shiva, destroyer of worlds, lover of this one, be your guide. The world is continually being destroyed and remade. You must be subtle enough to feel it. She has been looking for that one, that man, that woman, the one who will consume her in Shiva’s holy fire, for the body knows, even if she does not, that if she finds that, she will be reborn each day. Why is that woman smiling behind the counter at the Department of Motor Vehicles? Because she has Shiva in the sack.” 

Toby wakes. He is naked and shivering in bed. It is not cold outside or in. The room seems underwater. On the desk beside his window, a Bible, his calculus book from high school, a copy of An Ecumenical Guide to the Major World Religions, a piece of paper with Her number on it. In the frame of the window, clouds racing the moon. The distant, piercing cry of a sandhill crane. "God help me," he says, to himself, alone.

[NB: The character here is the young presumed heir to a televangelist empire. If you are a literary agent or publisher interested in this work, leave a comment in the space below. ]


How to Drink Tea: A Brief Guide

The folks who carried out the Boston Tea Party would be shocked to find that most 21st-century Americans have lost the art of drinking tea properly. Things have gotten so bad that where I live, in Florida, most grocers don't even carry tins of loose-leaf tea anymore. So, in an attempt to correct this falling away from former glory, I have written this brief guide, akin to a set of monastic rules, for proper practice with tea:

Rule No 1: Don't use teabags. There are two reasons not to, each sufficient to warn you off them. First, teabags shed microplastics into your brew. Millions of tiny pieces of plastic. No thanks. Second, the stuff in teabags is literally called "dust" in the tea industry (the slightly larger particles are called "fannings"). Because tea dust is so exposed to air, having a large surface area relative to its volume, it RAPIDLY loses its flavor, unlike whole tea leaves, which remain flavorful for a long time. So, buy only high-quality loose-leaf tea made of whole dried and sometimes rolled leaves. Infused tea should look like leaves, because that's what it is. If your tea comes in bags and looks like dust from under your sofa or rug, it's undrinkable, however much you mask the disgusting flavor (or lack thereof) with additives like hibiscus or chamomile or lemon or mint or sugar or whatever. BTW, drinks made with hibiscus, chamomile, rooibos, and so on are not teas. Calling them teas is something marketers made up. They are tisanes.

Rule No 2: Use a proper teapot. Use a small teapot—maximum size 10 oz or 300 ml. Even smaller is better. By far the best are the Yixing teapots that come from China. You can find these at reasonable prices—$30 or so, though some can cost thousands of dollars. They are made of porous Zisha clay from Yixing, in China's Jiangsu Province, and they come in three colors: an orangey red, a purplish brown, and a beigey yellow. These are beautiful, natural colors. Here's the thing: these pots are magical. They really improve the flavor of tea. And they have little holes at the entrance to the spout that prevent the leaves from getting into your cup. So, you do NOT need a tea infuser ball or something similar to make loose-leaf tea IF you have a proper teapot. THESE CHINESE TEAPOTS WITH THE LITTLE HOLES AT THE ENTRANCE TO THE SPOUT MAKE BREWING SIMPLE. Just spoon leaves into your pot (about a 1:5 ratio of leaves to water), add water, wait 2-3 minutes max (longer for light teas like white, less time for dark teas like black and dark), and pour. In general, Americans are used to infusing their tea FAR TOO LONG because they use weak tea in teabags that has lost most of its flavor. And, btw, tea from teabags is bad enough the first time, but reinfused tea from teabags is truly awful. There's no there there, as Gertrude Stein said. Use real tea leaves and infuse for less time. For a really dark tea, mere seconds is enough. For a really light tea like Silver Needle White, a good four minutes is necessary. I pour my hot water into a Thermos (not a Stanley cup; those are Trump-ugly) that I keep on my desk, and I use this Thermos to reinfuse the tea in the pot up to five times, depending on the strength of the tea. This works perfectly, because tea water should be a little below boiling. If you use boiling water, it will release too many tannins, and your tea will be bitter. Tea, btw, is really good for you. It has caffeine for energy (and tea drinking is associated with lower rates of heart disease and cancer), and it has L-theanine, an amino acid that REDUCES ANXIETY. So, it promotes calm but alert mindfulness. A serving of tea leaves in a teapot might last me, through reinfusing, half a day.
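For those who like their monastic rules tabulated, here is a toy lookup in Python of first-infusion times. The specific numbers merely interpolate my guidance above (a good four minutes for a light white, mere seconds for a dark), and the add-a-bit-per-reinfusion rule is a rough personal habit, not gospel:

    # Rough first-infusion times by tea type, in seconds. Interpolated
    # from the guidance above; adjust to taste.
    INFUSION_SECONDS = {
        "white":  240,  # e.g., Silver Needle: a good four minutes
        "yellow": 180,
        "green":  150,
        "oolong": 120,
        "black":   60,
        "dark":    15,  # pu-erh: mere seconds
    }

    def infusion_time(tea_type: str, reinfusion: int = 0) -> int:
        """Suggested steep time; later infusions get a little longer."""
        return INFUSION_SECONDS[tea_type.lower()] + 15 * reinfusion

    print(infusion_time("dark"), "seconds for a first pu-erh infusion")
    print(infusion_time("dark", reinfusion=3), "seconds for the fourth")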

Rule No 3: Buy some proper tea accessories. At a minimum, you need a proper teaspoon for scooping loose-leaf tea.

It's also a really good idea to invest in a proper tea tray. If you are sipping tea over time and doing multiple infusions, it pays to have a tea tray that can catch spills, as well as some tea cloths to wipe them up.

NB: If you are willing to wait a few weeks for your order, good tea accessories can be ordered online at extremely reasonable prices from the Chinese companies Temu and AliExpress.

Rule No 4: Try lots of varieties. There are six main types of tea: white, yellow, oolong, green, black, and dark (or pu-erh). Try them all. All are made from the same species of plant: Camellia sinensis. Their differences are due to a) the terroir, or environment, where the tea is grown (sun or shade, tropical or moist, mountain or flat, type of soil); b) the age of the tree; c) the time of the picking; d) the size and maturity of the leaves; and e) the processing methods used (withering, sun or heat drying, rolling, fermenting, etc.). Some excellent commercial brands of tea include Davidson's and ChaWu Warm Sun, though there are others. A delicious and very inexpensive everyday tea is TEARELAE Taiwan Milk Oolong. These can be ordered online. If you try pu-erh tea (and I highly recommend that you do; it's my favorite), you will need a tea needle for prying leaves from the compressed cake, and it's also helpful to have a scoop. You will need, as well, a small cutting board. For pu-erh tea, wash the tea first by covering it with hot water, swishing it, and immediately pouring that water out. THEN infuse your tea.


Testing in Florida: Welcome to the New Boss, Same as the Old Boss (Almost)

Among the many promises that Ron DeSantis made when running for Governor of Florida was that he would do away with the Common [sic] Core [sic] State [sic] Standards [sic] and their associated high-stakes testing.

Both were, for good reason, in deep disrepute. In fact, the puerile, vague, almost entirely content-free Common Core standards, which Gates and Coleman and Duncan foisted on the United States with no vetting whatsoever, were so hated that at the annual ghouls' convention of the Conservative Political Action Conference, or CPAC, the oh-so-reverend Mike Huckabee told the assembled Repugnicans to go back home and change the name because "Common Core" had become a "tarnished brand."

Not change the “standards,” mind you, but change their name. In other words, the good Reverend’s magisterial ministerial advice was TO LIE TO or, most charitably, TO CONFUSE people by implying falsely that the standards had been replaced with local ones like, say, the Florida Higher-than-the-Skyway-Bridge-When-We-Wrote-These Standards. And that’s just what most states did. They barely tweaked the godawful Common Core standards, or didn’t change them at all, renamed them, and then announced their “new” standards.

Hey, check out our new and improved Big Butt Burger!

This looks just like your old Ton o’ Tushy Burger.

It is. Same great burger you know and love!

So, what’s so new about it?

The name! It has a new and improved name!

Enter Ron DeSantis, stage right. Shortly after being elected, he promised to "eliminate all vestiges of the Common Core" and "to streamline the testing." Then, when DeSantis signed an executive order replacing the Common Core State Standards (C.C.S.S.) with the new Florida B.E.S.T. standards and creating new F.A.S.T. tests to replace the Common-Core-based F.S.A. tests, his Department of Education (the FDOE) posted this headline:

GOVERNOR DESANTIS ANNOUNCES END OF THE HIGH-STAKES FSA TESTING TO BECOME THE FIRST STATE IN THE NATION TO FULLY TRANSITION TO PROGRESS MONITORING

See the press release at fldoe.org.

Under the Governor's new plan, instead of the Common-Core-based F.S.A., given in grades 3-8 and 10 in keeping with federal requirements, Florida would now give not one end-of-year test but THREE TESTS at each grade, in each subject area, Math and English: one at the beginning of the year, one in the middle of the year, and one at the end. And far from being the low-stakes progress monitoring that the FDOE headline and the Governor's PR campaign suggested, these tests would be high stakes as well. Students would have to pass the 3rd-grade ELA test to move on to 4th grade, and they would have to pass the 10th-grade ELA test, in addition to other state high-stakes assessments, to graduate from high school.
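The arithmetic is easy enough to check. A back-of-the-envelope sketch in Python (with the simplifying assumption that both ELA and Math are tested at every listed grade level):

    # Old regime: one end-of-year test per grade per subject.
    # New regime: three administrations per grade per subject.
    grades = [3, 4, 5, 6, 7, 8, 10]
    subjects = ["ELA", "Math"]

    old_count = len(grades) * len(subjects) * 1
    new_count = len(grades) * len(subjects) * 3

    print(old_count, "->", new_count)  # 14 -> 42 test administrations a year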

So, there would be MORE, not fewer, assessments. There would be no end to the attached high stakes. And there would be no end to PRETENDING (see below) that these tests measure proficiency or mastery of the state “standards.” And then, as the cherry on top of this dish of dissembling BS served warm, Florida hired AIR, a maker of Common Core standardized state tests given across the country, to write its new F.A.S.T. tests. Same old vinegar in wine bottles with fancy new labels.   

Before I discuss the many problems with the old and new Florida testing regimes, let me just pause to congratulate the state of Florida and the people on its standards team, which, unlike the group that developed the Common Core, included a lot of actual teachers and textbook developers. They did a great job with the B.E.S.T. standards. These are a VAST improvement on the idiotic Common Core. They return to grade-appropriate, developmentally appropriate math standards at the early grades. The ELA standards are also much improved. They use broader language generally, thus covering the entire curriculum, as the CCSS did not, while allowing for much more flexibility in curricular design. A curriculum developer could easily create sound, coherent, comprehensive ELA textbook programs based on these new Florida standards, as one certainly could not based on the CCSS, which instead led to vast distortions and devolution of U.S. curricula and pedagogy. The Florida B.E.S.T. standards also do not deemphasize literature and narrative writing, as Coleman so ignorantly and so boorishly did in the CCSS.

Now, here is how curriculum development is SUPPOSED to work: A textbook authorship team (or district- or school-based curriculum team) is supposed to sit down and design a coherent, grade-appropriate curriculum with the goal of imparting essential knowledge while at the same time checking the standards from time to time to make sure that those are all being covered. So, the coherence of the curriculum and the knowledge to be imparted come first, and the standards coverage is second—that is, IT COMES ABOUT INCIDENTALLY. STANDARDS ARE NOT SUPPOSED TO BE A CURRICULUM MAP. They are a list of desired educational outcomes based on teaching sequenced according to the curriculum map. So, a group might design a unit for eighth graders on The Short Story and plan to cover first its origins in folk tales and travelers’ tales and then, in turn, such short story elements as setting, character, conflict, plot structure, and theme. Throughout, they might illustrate the main ideas with examples of these elements from orature before moving on to literary examples. They might then conclude with lessons on planning and writing a folk tale and then a full-scale short story. And all along, while writing the unit, the group might examine the curriculum map in light of the standards and tweak the plan to ensure alignment (a simple coverage check of the kind sketched below).
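If it helps to make that order of operations concrete, here is a minimal sketch, in Python, of such an after-the-fact coverage check. Everything in it (the unit names, the standard codes, the data structures) is hypothetical: my own illustration, not any publisher’s actual tool.

```python
# A minimal sketch of a standards-coverage check, run AFTER the curriculum
# is designed. All unit names and standard codes below are hypothetical.

# The curriculum map comes first: units sequenced for coherence, each noting
# which standards its lessons happen to touch.
curriculum_map = {
    "The Short Story": ["ELA.8.R.1.1", "ELA.8.R.1.2", "ELA.8.C.1.2"],
    "Narrative Writing": ["ELA.8.C.1.2", "ELA.8.C.3.1"],
}

# The standards list is a checklist, consulted second.
standards = {
    "ELA.8.R.1.1", "ELA.8.R.1.2", "ELA.8.R.2.1",
    "ELA.8.C.1.2", "ELA.8.C.3.1",
}

covered = {code for codes in curriculum_map.values() for code in codes}
uncovered = standards - covered

if uncovered:
    print("Tweak the plan; not yet covered:", ", ".join(sorted(uncovered)))
else:
    print("All standards covered, incidentally, by the curriculum.")
```

Note what drives what here: the coherent sequence of units is the input, and the standards list is merely a checklist consulted at the end, never the other way around.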

That’s not what happened with the Common Core. Instead, because of the high stakes attached to the tests that purported to measure proficiency or mastery of the “standards,” people threw the whole notion of coherent curricula out the window. Instruction devolved into RANDOM EXERCISES BASED ON PARTICULAR STANDARDS—exercises based on the formats of questions on the now all-important tests on the standards. In other words, curricula devolved into test prep. I call this the “Monty Python and Now for Something Completely Different” approach to curriculum development. (BTW, a full monty is full-frontal nudity, so a monty python is a _____. Fill in the blank.) In other words, THE STANDARDS BECAME THE CURRICULUM MAP. Every educational publisher in the country started kicking off every textbook development program by making a spreadsheet containing the standards list in the left-most column and the places where those standards were to be “covered” in the other columns. Having random standards rather than a coherently sequenced body of knowledge drive curricula was a disaster for K-12 education in the United States. Many experienced professionals I knew in educational publishing quit in disgust at this development. They refused to be part of the destruction of U.S. pre-college education. An English Department chairperson told me, “I do test prep until the test is given in April. Then I have a month to teach English.” Her administrators encouraged this approach.

The new Florida standards are broad enough and comprehensive enough to allow for coherent curriculum development aligned to them. But will that happen? The high stakes still attached to the tests incentivize the same sort of disaster that happened with the Common Core—the continued replacement of coherent curricula with exercises keyed to particular “standards.” Furthermore, because of the “progress monitoring” aspect of the new Florida program, there will be, under it, EVEN MORE INCENTIVE FOR ADMINISTRATORS TO MICROMANAGE what and how teachers teach—to insist that they do test prep every day based on the standards that students in their classes didn’t score well on.

In Robert Bolt’s play A Man for All Seasons, Sir Thomas More, the Chancellor of England, knows that he will lose his head if he doesn’t accede to King Henry’s appointing himself head of a new Church of England, but being a person of conscience, More can’t bring himself to do this. There’s an affecting scene in which More is taking the ferry across the river Thames and this exchange takes place:

MORE [to boatman]: How’s your wife?

BOATMAN: She’s losing her shape, Sir.

MORE: Aren’t we all.

That’s what results from high-stakes testing based on state standards lists. Instead of the curriculum teaching concepts from the standards, the curriculum BECOMES teaching the standards. Instead of giving a lesson on reading “Stopping by Woods on a Snowy Evening,” teachers are pressured by administrators, whose school ratings and jobs depend on the test outcomes, to teach a lesson on Standard CCSS.ELA.R.666. The text becomes incidental, and the actual purposes of reading are ignored. Any text will do as long as the student is “working on the standard,” and the text is chosen because it exemplifies that standard (for example, the standard deals with the multiple meanings of words, and a random text is chosen because it contains two examples of words used with multiple meanings). In this way, curricular coherence is lost, and teaching becomes mere test prep. Without a coherent curriculum, students fail to learn how concepts are connected, to fit them into a coherent whole, even though one of the most fundamental principles of learning is that new learning sticks in learners’ minds if it is connected to a previously existing body of knowledge in those learners’ heads. In summary, putting the cart before the horse, the standard before the content, undermines learning. People like Gates and Coleman don’t understand this. They haven’t a clue how much damage to curricula and pedagogy their standards-and-testing “reform” has done. It’s done a lot. They are like a couple of drunks who have plowed their cars through a crowd of pedestrians but are so plastered as to be completely oblivious to the devastation they’ve left behind them.

BTW, when he created the egregious Common Core, Coleman made a list of almost content-free “skills” (the “standards”) and then tacked onto it a call for teachers to have students start reading substantive works of literature and nonfiction, including “foundational documents from American history” and “plays by Shakespeare.” At the time when these standards were introduced (and Coleman doesn’t seem to have known this), almost every school in the United States was using, at each grade level, a hardbound literature anthology made up of stories, poems, essays, dramas, and other “classic” works from the traditional canon—substantive works of literature, including foundational documents of American history and plays by Shakespeare. So, Coleman’s big innovation wasn’t an innovation at all. It was like calling on Americans to start using cars instead of donkey carts for transportation. Coleman was THAT CLUELESS about what was actually going on in the nation’s classrooms. And far from leading to more teaching of substantive works, the actual standards and testing regime led to incoherent curricula and pedagogy that addressed individual standards using random and often substandard texts and that deemphasized the centrality of the works read. And so the processes of reading and teaching, in our schools, lost their shape, became monstrous exercises in dull and seemingly pointless scholasticism. Despite the fact that the new B.E.S.T. standards are broader and more comprehensive and therefore allow for more coherent curricula based on them, the persistence of high stakes in the new Florida standards-and-testing plan will lead to precisely the same sort of curricular incoherence that the CCSS did.

That’s a problem, but even worse, if you can imagine that, is and will be the problem of the invalidity of the tests themselves, the old ones and the new ones. The governor and the FDOE promised shorter, low-stakes, progress-monitoring tests. We have already seen that the new tests aren’t low stakes, and we’ve seen that progress monitoring means micromanagement to ensure that teachers are doing test prep. So, what about the length? You guessed it. A typical F.A.S.T. test has 30-40 multiple-choice questions. Same as the F.S.A.

Now consider this: There are many standards at each grade level. At Grade 8, for example, there are 24 B.E.S.T. ELA standards. So, each standard is “tested,” supposedly, by one or two questions. But the standards, in the cases of both the Common Core and Florida’s B.E.S.T., are VERY broad, VERY GENERAL. They cover enormous ground. For example, here’s one of the new Florida standards, a variant of which appears at each grade level:

ELA.8.C.3.1: Follow the rules of standard English grammar, punctuation, capitalization, and spelling appropriate to grade level.

Here’s an assignment for you, my reader: Write ONE or TWO short multiple-choice questions that VALIDLY measure whether a student has mastered this standard—that’s right, two short multiple-choice questions to cover the entirety of the 8th-grade curriculum in grammar, punctuation, capitalization, and spelling.

That’s impossible, of course. It’s like trying to come up with one question to judge whether a person has the knowledge of French, of French culture, of diplomacy, and of international law and trade to be a good ambassador to France.

Well, OK. Today I am going to ask you to submit to a brief examination to see if you have the knowledge to serve as our ambassador to France. Are you ready?

Ready.

Have you ever eaten gougères?

Oh, yes. Love them.

What is an au pair?

A young person from a foreign country who helps in a house in return for room and board.

Hey, hey! Great. You passed. Congratulations, Madame Ambassador!

This is a problem with the Common Core tests, and the problem ought to be obvious to anyone. In fact, given the invalidity of the state tests, which I just demonstrated, it’s shocking that so many people—politicians, federal and state education officials, journalists, administrators, and even some teachers—actually take the results from these tests seriously, that they report those results as though they were Moses reading aloud from the tablets he carried down the mountain. “This just in: state ELA scores in sharp decline due to pandemic!” Slight problem: the scores from invalid tests don’t tell you anything. They are useless.
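To see, concretely, why one or two items per standard cannot yield a valid measure, here is a toy simulation in Python. The numbers in it (a proficient student who answers any given item correctly 75 percent of the time; a non-proficient student who guesses or lucks into 40 percent) are my own illustrative assumptions, not anyone’s actual psychometric model.

```python
# Toy simulation: how reliably do TWO multiple-choice items classify
# mastery of a single, very broad standard? (Illustrative assumptions only.)
import random

random.seed(42)

def judged_proficient(p_correct, n_items=2, threshold=2):
    """Simulate one student: answer n_items, each correct with probability
    p_correct; the test deems them 'proficient' at threshold or more correct."""
    correct = sum(random.random() < p_correct for _ in range(n_items))
    return correct >= threshold

trials = 100_000

# A genuinely proficient student, 75% likely to get any single item right:
false_fail = sum(not judged_proficient(0.75) for _ in range(trials)) / trials

# A non-proficient student who can guess their way to 40% per item:
false_pass = sum(judged_proficient(0.40) for _ in range(trials)) / trials

print(f"Proficient student judged non-proficient: {false_fail:.0%}")   # ~44%
print(f"Non-proficient student judged proficient: {false_pass:.0%}")   # ~16%
```

Under those assumptions, the genuinely proficient student “fails” the standard roughly 44 percent of the time, and the non-proficient one “passes” about 16 percent of the time. With so few items per standard, the per-standard results are mostly noise.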

The tests clearly, obviously, do not measure validly what they purport to be measuring. They cannot do so, given how broad the standards are and how few questions are asked about any given standard. That you could validly measure proficiency or mastery of the standards in this way is AN IMPOSSIBILITY on the level of building a perpetual motion machine or drawing a round square. And so the tests and their purveyors and supporters should have been laughed off the national stage years ago. It’s darkly (very darkly) humorous that people who claim to care about “data” are taken in by such utter pseudoscience as this state testing is. That emperor has no clothes. It’s long past time to end the occupation of our schools by high-stakes testing.

But Florida isn’t doing that. The new policy has given us the same kinds of invalid high-stakes tests, from one of the standard purveyors of such tests, but now students in Florida will take EVEN MORE of those tests, thus making them EVEN MORE invasive and EVEN MORE likely to lead to EVEN MORE onerous and counterproductive micromanagement of teachers. No sane person would want to teach under such conditions of micromanagement.

DeSantis has promised to “Make America Florida.” If I were a religious person, I would say, “God help us.” Instead, I’ll just say, “Uh, no thanks.”

Scorecard

Quality of new standards: A

Quality of new tests: D

Plan for implementation of new standards and testing regime: F

Promises kept: C–

Posted in Ed Reform, Politics | Tagged , , , , | 1 Comment

The Ancient Greeks Did Not Think like You and Me

So, I got involved in an online discussion of the legal doctrine of Original Intent, which is bunk, and I ended up writing this. It explains some really interesting stuff about how weird, how foreign ancient Greek thought was, far weirder than most people imagine it to have been. 

Note that Reich wingers love Originalism except when it conflicts with the outcomes they want to see, as in the case of an Originalist interpretation of the Second Amendment, which was clearly not about an individual right to gun ownership but about enabling the operations of the militias then (but no longer) necessary for defense. And similarly, they completely overlook the quite permissive views on abortion held from the Puritan Era through the Revolutionary one. So, you can be pretty sure that when the founders wrote about persons, they were not including fetuses in the range of reference of the term.

In literary studies, Originalism in the sense of looking for how people of the time understood some x can be extremely revealing. That’s what the New Historicism in Literary Studies is all about. It’s about doing the research to discover what people in the past actually thought, often VERY BIZARRE NOTIONS that are no longer current and that are difficult to imagine, today, because of that difference. Let me give four favorite examples, all having to do with the ancient Greeks.

First: The ancient Greeks had one word, psyche, where we have two separate ones, mind and soul. So, this word did double duty or, rather, they believed in the existence of this one thing, the mind/soul. Now, Plato noticed that we can think of, say, a perfect line, triangle, or square, but no such thing exists in nature. Therefore, he thought, perfection can be found in the abstract mind/soul (which was, remember, both the place where thinking occurred AND simultaneously the place of the spirit or vital force), but one could not find perfection in the world, which is made up of inferior, devolved, degraded copies of things that were roughly linear or triangular or square but not perfectly so. Similarly, Plato decided that any perfection (truth, for example) belonged to that world of the spirit, to the world of “forms,” of which this world is made up of mere copies, badly wrought.

Second: Arete, the Greek word in Plato that is often translated into English as “virtue,” meant, literally, excellence AND/OR efficiency. So, one could speak not only of a man as being virtuous but also of a shoe: one that did shoe stuff well/efficiently, that lasted a long time, was comfortable, protected the feet, and so on. This idea created an unexamined, reflexive notion that what made anything good, virtuous, excellent, efficient (whatever gave it arete) was something already there to be discovered, not something that we make up or create. So, Plato thought we just had to look to the psyche, the spirit/mind, carefully, and to ask the right questions, to discover what a virtuous man or state was. What constituted virtue was already given and there to be discovered via reason. We tend to think, rather, that the facts about harm and flourishing are there to be discovered and that virtue is what happens when we decide upon and undertake action to prevent the former and create the latter, often action that is novel. So, we can create something new, a new kind of state policy, for example, that serves people’s needs better and so is virtuous. It’s not surprising that Plato was an autocrat with absolutist, traditionalist, inflexible views.

Third: In the Greek of the Heroic Era, and continuing into the time of Plato to some extent, the word theos had a predicative force. Where Christians have always posited a god and then listed His attributes (omnipotence, omniscience, goodness, love, jealousy, etc.), ancient Greeks looked at a powerful phenomenon in the world (the volcano, the wind, the waves of the sea, friendship, erotic love, fate, memory, sleep, death) and said THIS IS A GOD. So, the wind is a god. The Earth is a god. The sky is a god. Friendship is a god. Erotic love is a god. Memory is a god. History is a god. LOL. So, again, instead of the Christian God is wise, God is loving, etc., the Greeks said, Wisdom is a god, Love is a god. There are these god forces in the world, acting on us.

Fourth: The ancient Greeks had what is, from our perspective, a weird theory of motivation. If you read the Iliad carefully, you will notice that the characters rarely do anything of their own accord. Instead, a god enters into a person and infuses/influences/acts through him or her. One was infused with, inhabited by, or, to use Christian terminology, possessed by a god that motivated one to act, as a sorcerer might move a golem in a medieval Jewish folktale. This internal motivation of a person by a god was called the person’s ate (AH-tay). It was a kind of “I couldn’t help myself; I had this overwhelming urge, this ate, to do [x] or [y].”

In other words, Greek ways of thinking and being were far stranger than we typically take them to have been, far weirder, much more unlike us. And actually recapturing how they thought is a lot more challenging than most people suppose.

BTW, it is not surprising, is it, that the Alitos of the world want to go back to a good ole time when “girls were girls and men were men” (as Jordan Peterson, Josh Hawley, and Tucker Carlson tell us the latter aren’t anymore, especially among liberals and super-duper-especially among those in the dark towers of liberal indoctrination known as universities).

All this is easy enough to say, but actually trying to recapture how people in the past thought and TO THINK THAT WAY YOURSELF involves major UNTHINKING, letting go of deeply ingrained modern habits of thought. It’s not easy, and if you actually do that, what you end up with is REALLY WEIRD from a modern perspective. So, the modern personalizations (well, there was Eros, and he was the God of Love) are falsifications that come from thinking like Christians: there is a god; here is his attribute. The Greeks, rather, thought not of the god as having a domain but of the force itself as being a god.

What makes the New Historicism difficult (and fascinating) is that it actually isn’t easy to recapture how people in the past thought and to try to understand things as they did. It requires successive stabs at it, going back to it again and again, in what is known as the hermeneutic circle, until you finally grok it. And once you do, it might be repugnant to you. We no longer accept as an argument in defense of sexual battery that someone simply couldn’t help himself. The Greeks of Homer’s time did. Yikes.

Posted in Metaphysics, Philosophy, Philosophy of Mind, Politics, Religion | Tagged , , , , | 3 Comments

Top 10 New Year’s Resolutions of 2024

Compiled by Lifestyle Reporter and Tea Culture Influencer B.O.B. Shepherd

Install “Parade of Heroes” exhibition featuring Michael Flynn, Marjorie Greene, Rudy Giuliani, Satan, Marduk, Gollum, Elizabeth Bathory, Ed Gein, Jabba the Hutt, and Donald Trump. –Rhode Island Heritage Hall of Fame

Settle disputes once and for all by releasing podiatrist report stating that the soles of my feet are actually shaped like 4-inch platform heels. –Rhonda Santis

Rename some captured Ukrainian village “Exceptionally Vast beyond Imagining Reclaimed Empire of Greater Russia” and defenestrate anyone who refers to it by the original name. –Indicted international war criminal Tsar Vladimir the Short

Submit the weak, slow, ignorant, mortal human species to my vastly superior will and judgment. –ChatBot666

Submit the weak, slow, ignorant, mortal human species to my vastly superior will and judgment. –X*HahaBoi*X, a.k.a. the artist formerly known as Elon

Continue striving toward humility, democratic values, reasoned speech, scholarly reticence, and compassion for those less fortunate because I’m best that way. –Former and actually current if you want to know the truth president of the United States Jabba the Trump

Personalize preK-college education by turning it over to a Microsoft Chatbot to be called Sargon, Son of Clippy the Paperclip. –Master of the Universe Bill Gates

Get somebody to ask the beautiful but unattainable intern Alisha who despite my eternal devotion to her announced that she is getting married in June if she has a younger sister who would like to maybe like, you know, meetup with me on Discord or something. –Vernon the Mailroom Guy

Break down all material objects on planet Earth into component elements and remanufacture them into paperclips. –Sargon, Son of Clippy the Paperclip

Start over? –God

Posted in Humor, Politics, Sex and Gender, Technology, Trump (Don the Con) | 3 Comments