The Vast Unseen and the Vast Unseeable: Reconciling Belief and Nonbelief

Those who are not philosophically inclined can be divided, roughly, into two groups:

There are the Naïve Realists who think that what is available to the senses or potentially available to the senses is all that there is.

And then there are those who think that alongside or in addition to all that empirically available stuff, there is another world (or other worlds) of the unseen.

Let’s call the first group the nonbelievers and the second the believers.

The believers come in immense variety. From the earliest archaeological remains of human civilization down to the present day, there have been, literally, many hundreds of thousands of belief systems regarding the ordinarily unseen—belief systems involving spirits that live within things in the natural world, disembodied spirits of ancestors existing alongside us, and a rich phantasmagoria of gods and demigods and demons and other magical beings: Tiamat and Marduk, Isis and Hecate and Bastet, YHWH and the Nephilim, the Aeons and the Rex Mundi, Anansi and White Buffalo Woman, Cernunnos and Brigid, ogres and trolls and fairies in the garden, the machine elves of Terence McKenna’s psychonautic excursions on dimethyltryptamine.

From the beginning, people seem to have imagined (?) unseen spirits that inhabited physical things—rivers, mountains, oceans, plants, people, and nonhuman animals, for example. But also, from time immemorial, they imagined (?) unseen worlds that were separate parts of this one world (universe) that we live in. The Anglo-Saxons talked of the middangeard (the “middle Earth”) between heaven above and hell below. The ancient Chinese, and some of the Greek Gnostics, imagined vast numbers of heavens “up there.” Many peoples placed their realms of the gods atop high mountains or in the clouds or across the sea on some island. Many cultures had their chthonian deities, ones who inhabited realms under the ground—the world of the Hindu Nagas, for example, or the realm of Hades, or the cave beneath the bog of Grendel’s mother, who may have been the ancient British goddess Nerthus made demonic in a Christian retelling. These abodes of the gods were unseen but potentially seeable, if only you got there, to that place.

Increasingly, as we have plumbed the whole of the Earth, from the summits to the depths, and have come to understand, better, what is in the heavens above us and under the ground beneath our feet, those who believe in the unseen have retreated to the less physical instantiations of their other worlds. These modern believers are of the “spirit within” camp. Their unseen worlds are invisible universes, spirit worlds, that exist—somewhere else—in parallel to our own or in some other dimension or within things, somehow. The entities who live in that spirit world, they say, might be all around us right now. You might, for example, hold a séance or take a drug or pray and talk to them.

Now, the nonbelievers like to point out that despite the certainties that believers tend to have about their unseen worlds, their views are innumerable and mutually inconsistent and can’t all be right, and it’s not exactly easy to produce EVIDENCE about any of these unseen worlds, so there is no compelling reason to believe in any one of them, at least no reason that an impartial observer would have to accept. And the nonbelievers are frankly astonished, by and large, that at this late date in human history, there are still large numbers of people who believe in unseen worlds and unseen entities, who talk to them regularly, for example, and take guidance from them. In short, the nonbelievers think it really peculiar that so many people continue, in a scientific age, to hold fantastic ideas involving the unseen. And they are horrified that folks whom they consider so gullible and superstitious, people who sometimes talk to invisible friends, are nonetheless trusted with positions of power and authority.

I do understand that point of view. I even sympathize with it, for in doing so, I am sympathizing with the view held by my own younger self. But here’s a problem for it, a really big problem, it seems to me:

While it seems reasonable not to accept as true propositions for which there is little or no evidence, it is also entirely unreasonable to imagine that what we have access to via our senses is the whole of the universe. We have a particular set of senses and a particular cognitive apparatus, a particular operating system, if you like. Our sensory and cognitive equipment, our operating system, differs enormously from that of other creatures on the planet. Consider the “lowly” bugs known as ticks. We know that there are vast parts of the universe that we perceive that simply are not available to ticks. Stars do not exist to a tick. Neither do temperatures above or below a narrow range around 37 degrees Centigrade. There is no smell of roses in the universe that the tick perceives; there is no sound of laughter. The tick does not have any perceptual or cognitive access to these things. They are UNSEEN by the tick, but WE know that they exist.

In other words, the tick teaches us that it is inevitable that, given the particular sensory apparatus and cognitive makeup that a creature has, given a particular creature’s operating system, some of what is, of what actually exists, will be available to that creature, and SOME WILL NOT. That bit of the universe that is available to a given creature is the creature’s Umwelt (to use the term popularized by Jakob von Uexküll, whose ideas about ticks I have shared here).

So, how are we any different from ticks in this regard? We’re not some sort of special case. What is true of ticks is doubtless true of us—that we have access to only a small part of what is really going on. This is an inductive conclusion strongly warranted by our knowledge of comparative neural and perceptual physiology, so strongly warranted, in fact, that I think that we are compelled to accept it on purely inductive, empirical, scientific grounds. And it’s a truly mind-blowing conclusion, I think.

It therefore seems highly likely that Hamlet was right when he said, “There are more things in heaven and earth, Horatio, / Than are dreamt of in your philosophy.” In other words, the believers are almost certainly right about this much, that there is a VAST UNSEEN. A vast unseeable, in fact. And nonbelievers have to accept that much. I do.

Before I proceed, let me deal with a predictable objection to this line of reasoning: Evolution designs creatures to exploit whatever realities there are, and over time, they exploit them more fully, and so we reach this pinnacle in humans at which we have cognitive and perceptual access to the way things are. Now, here’s where I think that that argument is wrong (anthropocentric arguments tend to be downright silly; they are kin to the old ideas that the Earth and humans are at the center of the universe): Evolution is nothing if not parsimonious. It reaches for what works in a niche, and it ignores everything else. Exhibit 1 for my rebuttal: cyanobacteria, largely unchanged for some three and a half billion years. Exhibit 2: beetles that attempt to mate with female-looking beer bottles so persistently that they allow ants to eat them alive. Exhibit 3: humans and their well-documented cognitive limitations, more suited to life on the savannah than to life in, say, New York City. Phenotypes tend to be local maxima on the larger fitness landscape.
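The idea of a local maximum on a fitness landscape can be made concrete with a toy hill-climbing sketch. The landscape below is entirely invented for illustration: it has a low peak and a higher one, and a greedy climber that accepts only immediate improvements settles on whichever peak is nearest, much as a parsimonious selection process settles for what works in a niche.

```python
import random

# A toy one-dimensional "fitness landscape," invented for illustration:
# a low peak (height 1.0 at x = 2) and a higher peak (height 2.0 at x = 8).
def fitness(x):
    return max(0.0, 1.0 - abs(x - 2) / 2) + max(0.0, 2.0 - abs(x - 8) / 2)

def hill_climb(x, step=0.1, iterations=1000):
    """Greedy local search: accept a random neighbor only if it is strictly better."""
    for _ in range(iterations):
        candidate = x + random.choice([-step, step])
        if fitness(candidate) > fitness(x):
            x = candidate
    return x

random.seed(0)
peak = hill_climb(1.0)  # start on the slope of the lower peak
# The climber settles near x = 2 (fitness about 1.0) and never discovers
# the higher peak at x = 8, because every route there leads downhill first.
```

Evolution, like the climber, has no foresight: a lineage parked on a local peak stays there unless drift or a changed landscape opens a downhill route.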

Where do we go from this conclusion? For conclusions are beginnings, aren’t they? Clearly, there will be situations in which what can be experienced by a given creature is affected causally by that which the creature cannot experience, and this may be the situation that obtains with regard to many conundrums, great and small: the mind/body question, the question of free will, the irreducibility of simple arithmetic to logic, the incompatibility of relativity and quantum mechanics, the elusive proofs of the Goldbach or Polignac conjectures, the seeming violation by plankton of the competitive exclusion principle, experimental proof of the existence of more than four dimensions, the explanation of nonlocal consequences of entanglement in physics, the appearance and disappearance of virtual particles, the solution to the paradox of the disappearance of the present, the violation by certain quantum-mechanical phenomena of the law of the excluded middle, the development of an optimally nonviolent social structure given the conflict between minimal liberalism and Pareto optimality, and, of course, the question of questions, the nature or even the existence of an ultimate reality, or noumenon. Years ago, AI pioneer and Nobel Prize-winning economist Herbert Simon argued that many of the problems faced in everyday life admit, as a practical matter, of no optimal solution because of limitations of time and resources, forcing us to rely, instead, upon satisfactory solutions reached by heuristics (what he called “satisficing”). Similarly, the philosopher Alan Watts argued that while the universe might, at bottom, be deterministic, as a practical matter, we haven’t the resources to do the Laplacian calculations, and so we are stuck with acting as if from free will. (Whether the universe is deterministic remains an open question, though most physicists today believe that it is not.)
And in the same vein, the philosopher Daniel Dennett has argued that combinatorial explosion makes the project of simulating a virtual reality indistinguishable from the universe impossible, for doing so would require computational resources greater than those provided by the universe. (This last claim is highly debatable; a self-computing universe is not impossible under various scenarios.) The point I am making goes further, however, for the claim is that we have every reason to believe that there are aspects of reality that are not only not accessible as a practical matter but that are not accessible AT ALL at present. There’s no reason to think that we apes currently have the cognitive and perceptual apparatus to arrive at complete solutions to such problems because the mechanisms involved may well be beyond our ken.
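Dennett’s combinatorial-explosion point can be felt in miniature. The 10^80 figure below is only the usual rough estimate of the number of atoms in the observable universe, used here purely for scale:

```python
# The number of possible joint states of n two-state elements is 2**n.
# The 10**80 below is the usual rough estimate of the number of atoms
# in the observable universe, used here only for scale.
ATOMS_IN_OBSERVABLE_UNIVERSE = 10**80

n = 0
while 2**n <= ATOMS_IN_OBSERVABLE_UNIVERSE:
    n += 1
# n is now 266: the joint states of fewer than 300 coin flips already
# out-count the atoms. A simulation tracking every configuration of even
# a modest system is resource-starved almost immediately.
```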

But this realization is in itself a boon. It should give us pause. It should make us humble. As a result of it, we should recognize that many of our most cherished, most fundamental assumptions might well be misconceptions based on our limitations. We must face squarely the fact that we are like savages, familiar with fire and with chariots, claiming that the nature of the sun is quite obvious: it is a fiery chariot being driven across the sky. Or we are like the square in Abbott’s Flatland who thinks of a cone passing through its world in three-dimensional space as an expanding or contracting circle.

Let’s consider one such cherished assumption, one of the latest in a long, sad series of scientific predictions that proved to be false because unknown unknowns were not taken into account. Richard Dawkins famously argued in The Blind Watchmaker that we can be certain that wherever we might go in the universe, the laws of evolution apply. But certain is a big word. Scientific laws are not tautologies. And it seems not only possible but probable that, in fact, evolution itself is ultimately a self-defeating mechanism, not in the sense that life inevitably consumes all resources until it dies out but in the sense that at some point, sufficiently evolved creatures begin to control their own evolution, at which time there is a decisive break, a disjunction, a stochastic leap, for evolution becomes no longer blind but teleological, at which point, all bets are off. We might well choose the ultimate in self-preservation, substituting the preservation and growth of the phenotype, to which we are each of us committed, for the reproduction of the genotype, for there is a fundamental non-concurrence of interest between selfish genes and selfish phenotype, especially in creatures that reproduce sexually. We will soon be at a point where evolution is definitively divorced from mate selection and sexual reproduction, for it is highly doubtful that future reproductive strategies will depend upon these. And it is altogether possible that resources are not a limiting factor, for many cosmologists now believe that the universe itself is “the ultimate free lunch,” that it arose ex nihilo from the quantum foam, which is, in theory, harvestable. So, is evolution by natural selection a universal law? It’s highly doubtful that this is so. In fact, it is more likely that this is yet another example of a spatio-temporal local maximum. The Earth is a relatively young planet circling a relatively young star.
We now know that there are many, many billions of other such planets in the universe, most of them far older, and it is altogether reasonable, given the similarity of conditions elsewhere, to assume that life has evolved on these and long since passed through our present infancy. We know that recursive systems like minds are positive feedback mechanisms leading to exponential change, and we speculate, with much warrant, that such a process will lead to a singularity. And what happens then? By definition, we do not know. But it is highly probable, to a point approaching certainty, that this has, in fact, happened in the universe already, and we are not the entities to which it has happened. Philosopher and transhumanist Nick Bostrom believes that what we think of as reality is not reality at all but, rather, a simulation being run by such entities.

Bostrom’s simulation hypothesis is an example of warranted speculation, and that such speculation can be highly warranted—one of many possibilities unlike those that we typically entertain—suggests, at the very least, that we should check our hubris. Things may not be at all as they appear to be.

Which brings me back to the question of belief versus nonbelief. It certainly makes sense to suppose that the notions about ultimate realities entertained by stone-age savages bear little relation to actual ultimate realities. However, what is in fact the case is probably equally bizarre and, given our current limitations, beyond imagining. It is quite possible, probable even, given what we currently know, that there are entities in the universe with attributes traditionally ascribed to the gods, including the power to bring universes into being (a potential technology for doing just that and a series of steps toward development of such a technology are outlined in cosmologist Alan Guth’s The Inflationary Universe). And, of course, it is entirely possible, for the reasons described here, that our universe was the creation of a single such entity. We should admit, however, I think, that when we speculate about these matters, we are in the position of the Neolithic farmer venturing opinions on the causes of epileptic seizures and that we should do so in a spirit of play, of speculation, of creation, of frumsceaft.

The phrase current limitations, above, was chosen with care. Creatures with technologies, however rudimentary, have already crossed a Rubicon, for technologies are prostheses that give access to further aspects of reality. The long-standing question of whether there is a noumenal reality separable from perceptual reality has long been answered (though, oddly, some professional philosophers seem not to have gotten the memo), for as we have built new technologies to extend our access—mathematics, thought experiments, Galileo’s telescope, spectrometers, superconducting supercolliders, fMRI machines, and so on—more and more of the universe has been revealed, as surely as the contents of a gift-wrapped package are revealed when we remove the packaging, but with a couple of important caveats: 1. the packaging of reality appears to be so many Matryoshka dolls, how many, we do not know. Perhaps it is turtles all the way down. And 2. those prostheses simply become part of a new, extended, but also limited perceptual and conceptual repertoire. As we continue, at an exponential rate, to develop prostheses for extending our access to the universe, we shall doubtless encounter many surprises, many of which will be as disjunctive as was, say, the atomic hypothesis. We have already learned, or think we have learned, that the macroscopic world of solid objects with which we are familiar on a quotidian basis is illusory, that it is, on a deeper plane, a whir of elementary particles and, on a deeper plane yet, interacting fields. Such conceptions would have seemed utterly preposterous to most of our ancestors. (The atomic hypothesis was still highly controversial at the turn of the twentieth century.)
Given this history, assuming that we have reached the bottom of the rabbit hole (or that it is even a hole to begin with) is ludicrous, and there is nothing in our current knowledge that precludes quite fantastic possibilities, including the possibility that the current physical reductionists have it exactly backward and that

  1. perception is an interface that bears relations to but does not show reality, as the icons on your computer screen bear relations to but do not show the underlying reality of the mechanism within your computer.
  2. the functional structures of the mind are, in part, an operating system enabling the creation of that interface based upon incoming data; are, in part, processors of data; and are, in part, storage systems for temporal states of that data. (NB: These are probably not so easily separable, for brains are not constructed like computers.)
  3. the perceived world is simply a collection of icons that constitute an interface to a reality that lies behind it.
  4. consciousness might well be fundamental and matter derivative, not vice versa.

Numbers 1 and 2 and 3 are, I think, as incontrovertible as the best of our scientific inferences. Number 4 is another matter. It’s a highly speculative proposition but one that is not inconsistent with anything that we think we know via scientific inference and is weakly warranted by speculations such as Bostrom’s involving highly developed nonhuman intelligence or intelligences in the universe. Together, these propositions, advanced by cognitive psychologist and expert on perception Donald Hoffman, show a marked similarity to what Aldous Huxley refers to as “the perennial philosophy,” arrived at via convergent cultural evolution in various religious traditions worldwide—in the thought of persons as diverse as the authors of the Bhagavad-Gita and the Chandogya Upanishad, the Sufi mystic Jalal ad-Din Muhammad Rumi, Chuang Tzu, Meister Eckhart, Black Elk, and Terence McKenna. (Note: I had independently arrived at conclusions 1, 2, and 3 before having encountered Hoffman. I am intrigued by his reasons for embracing conclusion 4.)
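Hoffman’s desktop analogy (propositions 1 through 3 above) can be put in the form of a little sketch. Everything in it (the records, the icon fields) is invented for the analogy; the point is only that an interface can be lawfully related to an underlying reality while showing almost none of it.

```python
# The underlying "reality": records with machinery the user never sees.
# All details here are invented for the analogy.
reality = [
    {"address": 0x7F3A, "bytes": 48_211, "fragments": [3, 91, 17], "name": "thesis"},
    {"address": 0x11C0, "bytes": 1_024, "fragments": [8], "name": "notes"},
]

def render_icon(record):
    """The 'perception': a drastic but action-guiding simplification."""
    return (record["name"], "large" if record["bytes"] > 10_000 else "small")

desktop = [render_icon(r) for r in reality]
# desktop == [('thesis', 'large'), ('notes', 'small')]: systematically
# related to the reality (the lawful mapping is render_icon), yet silent
# about addresses, fragmentation, and everything else beneath.
```

The icon is useful precisely because it omits; a user who mistook the icon for the file’s inner constitution would be making the error the propositions above describe.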

Ancient accounts of ultimate realities are suspect, but so are our limited current perceptions and scientific models of reality. One cannot make the leap from speculation to knowledge, for knowledge is by definition dependent upon the prosthesis that discloses. The mystics claim to have developed such prostheses, though they may have simply encountered the currently inexplicable and have rushed to explanation in the inadequate terms available to them. However, I do not know this to be true, and even if it were, that fact should give physical reductionists no comfort, for theirs is a metaphysics as incoherent and unsupportable as any witch doctor’s tale of his own understanding and control of all that is in the heavens and the earth, as I hope I have made abundantly clear above. And on the basis of an argument such as that which I have proposed here, I think that those with scientific and with religious orientations can find common ground and engage in fruitful dialogue. But to get there, both sides must recognize that they don’t at present know as much as they think that they do. In other words, what is required of us is that rare thing, informed humility.

Copyright 2012. Robert D. Shepherd. All rights reserved. This note may be distributed freely as long as this copyright notice is retained and the text is unchanged.


A Cup of Tea

Many years ago, a professor from one of the western world’s great universities went to visit the Japanese master Nan-in to learn about Zen. Nan-in invited the professor to sit and offered him tea. As Nan-in prepared the tea, the professor talked. And talked. And talked some more. Nan-in served the tea. He poured his visitor’s cup full, and then kept pouring. The professor watched the tea pouring onto the table and floor until he could no longer restrain himself. “It is overfull,” he said. “No more will go in!”

“Like this cup,” Nan-in replied, “you are full of ideas and opinions. How can I show you Zen unless you first empty your cup?”[1]

[1] Adapted from 101 Zen Stories, by Nyogen Senzaki, 1919, a compilation of Zen anecdotes. Senzaki’s compilation also includes a translation of the Shasekishu, or Sand and Pebbles, a collection of Buddhist parables by the Japanese monk Muju, written in 1283.

Photo, Samovar Tea House, by Christopher Michel, file licensed under the Creative Commons Attribution 2.0 Generic license.

Copyright 2014, Robert D. Shepherd. All rights reserved. This file may be freely distributed as long as this copyright notice is retained.


The Limits of Learning

OK, I admit it. I haven’t read The Vicar of Wakefield.

I’m always suspicious of people who have that air about them of having read everything. I’m onto them. Here’s why: Years ago, when I was an undergraduate at Indiana, I went to the library to work on a paper on Robert Frost. The Indiana University library was my Internet, in those days before the Internet, and also, to me, a kind of temple. In its seven million or so volumes was to be found, I felt, the collective experience of our species. Sometimes, I would just wander aimlessly in the stacks, like a mushroom hunter in an old-growth forest, pulling off the shelves these weird wonders: a fourteenth-century guide to courtly love, great monographs on the sand flea, grammars of Old Icelandic. I thought it wild and wacky, half mad, and altogether beautiful that someone would devote his or her life to the study of the sand flea.

But on this particular evening, long ago, I had work to do: the paper on Frost. As I stood there in the stacks looking at the library’s hundreds of books about Frost, an unsettling thought occurred to me. I knew that as an American male, I had a life expectancy of about 70 years. There are 52 weeks in a year. If I read a book in my subject area every week for the approximately 52 years left to me, I could read, in my lifetime, about 2,700 of those books. I didn’t even have time enough, in the rest of my life, to read the works of criticism of mid-century American literature in the library’s collection, much less those monographs on the sand flea.

Do you remember when you first learned that you were going to die? Most people learned this so early that they don’t recall having done so, but I must be a slow learner, for I remember vividly when I learned that remarkable fact. I was five or six and watching a Twilight Zone episode on a black-and-white television with rabbit ears. In the episode, a girl in rural Arkansas or Kentucky or someplace like that sold her soul to the devil in exchange for the love of the handsomest young man in town. As part of the bargain, she had to spend some of her evenings running about the countryside in the form of a mountain lion. A few days before the girl’s wedding, of course, the handsome young man joined a posse to hunt down the mountain lion, which had been terrorizing locals, and of course, not being a sensitive, environmentally conscious guy (he would have made a lousy husband anyway), he shot and killed her. So, there I was, at five or six, sitting on the floor in my Dr. Denton’s and bawling my eyes out when my grandmother came in to see why I was fussing. When I told her, she looked at me in her no-nonsense sort of way and said, “Why, child, everybody’s gonna die sometime.” I lay awake for hours that night, aghast. Sometimes I still do.

For me, that later evening in the library was like learning that I was going to die all over again. I had come to Indiana University to become a scholar, and damn it, I was going to do so. I was going to read everything. Everything. I was going to become the kind of scholar whom people speak of in hushed and reverent tones. What that evening taught me, of course, is that whatever I chose to study professionally, I could barely put a crack in it.

And my professors. My God! I had found them out. I still revered them, some of them, for their learning, but . . . that amazing man E. Talbot Donaldson, the great medievalist, whose lectures I was privileged to attend and whose memory I shall forever honor, didn’t know squat about sand fleas. Freud wrote in his Introductory Lectures on Psychoanalysis about the trauma that kids go through when they figure out that their parents don’t know everything. He was wrong about that, as about much else. Kids get wise to us early on, and that’s a good thing, I think. It’s delightful to watch toddlers pushing the limits, probing, exploring what’s possible, finding out how far things go and when they break. Their elders should do a lot more of that.

But I admit that this recognition floored me. I would have to resign myself, forever, to being mostly ignorant about mostly everything. There is a magnificent literature in Korean, full of beauty and insight that would deepen my understandings beyond measure, but I shall probably never, ever know it. It’s on my list, but art is long, and life is short.

Copyright 2014, Robert D. Shepherd. All rights reserved. This little essay may be freely distributed as long as this copyright notice is retained.


Aiden Reading on the Way to Preschool


So, Aiden was asking whether the being of self-consciousness is such that in its being its being is in question, and I said, “Come on, Aiden, you’re old enough to look that up yourself,” which he did, but not before pointing out that I was clearly acting in bad faith and should make it up to him with a juice box.


Prototypes versus Aristotelian Categories in the Teaching of Writing


During the last few decades of the twentieth century, rhetorical ideas dominated academic discourse in the humanities. It is difficult to overstate, for example, the influence during that time of such ideas as “All speech is political” or “Readers construct texts.” Ironically, however, all the late twentieth-century academic Sturm und Drang about rhetoric had very little influence on actual practice in the teaching of writing. Rhetoric in the sense of the quotidian practice of writing teachers appears to be one of those fields like the building of houses in which true innovation is extremely rare. From time to time, of course, someone comes along and suggests that favelas made of corrugated tin be replaced by homes built of discarded tires or that kids’ compositions might be scored holistically, but given the long history of the teaching of rhetoric, it is surprising how rarely our basic paradigms have undergone more than minor modification. To an extent not generally appreciated, teachers of writing run their wagons in ruts produced by Aristotle two millennia ago. It’s time to get out of those ruts, which, as ruts do, keep us from going anywhere we’ve not already been. In particular, practical rhetoric can benefit tremendously from throwing over concepts formulated by means of Aristotelian categorical thinking.

Aristotle was the archetypal taxonomist. Key to his thought is the notion that entities in the world can be understood by delineation of their essential properties. So, for example, a concatenation of properties such as “has webbed feet,” “has a bill,” “quacks,” and so on defines a category—a class or set—of things that we call ducks. This category has external reality—it exists in nature—and so Aristotle’s theory of categories is called the theory of natural kinds. An Aristotelian essential property is a sine qua non. Therefore, testing for class membership comes down to testing for one or more essential properties.
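The Aristotelian picture can be stated as a one-line membership test. The sketch below is illustrative only: the duck features are the ones named above, and the example birds are simplified (Muscovy ducks, which hiss rather than quack, show how a strict essential-property test excludes borderline cases entirely).

```python
# Aristotelian classification as a strict test for essential properties.
# The duck features are those named in the text; the example birds are
# simplified for illustration.
ESSENTIAL_DUCK_PROPERTIES = {"has webbed feet", "has a bill", "quacks"}

def is_duck(properties):
    """Each essential property is a sine qua non: one miss excludes entirely."""
    return ESSENTIAL_DUCK_PROPERTIES <= properties  # subset test

mallard = {"has webbed feet", "has a bill", "quacks", "migrates"}
muscovy = {"has webbed feet", "has a bill", "hisses"}  # Muscovies rarely quack

assert is_duck(mallard)      # all essentials present
assert not is_duck(muscovy)  # fails one test, so it is no duck at all
```

The Muscovy, a duck in every ordinary sense, shows how brittle the test is; Wittgenstein’s objection, quoted below, generalizes the point.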

Beginning with Aristotle and continuing down to the present day, rhetoric, like most other fields of intellectual endeavor in the West, has been powerfully influenced by the theory of natural kinds. Thus one of the first major rhetoricians to write in English, George Campbell, delineates the kinds, or modes, of speech:

“All the ends of speaking are reducible to four; every speech being intended to enlighten the understanding, to please the imagination, to move the passions, or to influence the will” (Campbell, 1776).

And one finds a devolved version of this sort of thing in modern textbooks:

“Descriptive writing allows you to paint word pictures about anything and everything in the world. . . . Narrative writing tells a story. . . . Explanatory writing informs and explains. . . . Persuasive writing allows you to use the power of language to inform and influence others.” (Applebee, 2001).

(Note that Campbell’s classification, made two and a quarter centuries ago, has distinct advantages over the contemporary one, being based as it is on an appeal to different rhetorical functions vis-à-vis an audience and on reasonably distinct human faculties, in accordance with the “faculty psychology” of his day.)

But categorical thinking has inherent problems. Ludwig Wittgenstein famously attacked the theory of natural kinds in his Philosophical Investigations:

“Consider . . . the proceedings that we call ‘games.’ I mean board-games, card-games, ball-games, Olympic games, and so on. What is common to them all?—Don’t say: ‘There must be something common, or they would not be called games‘—but look and see whether there is anything common to all.—for if you look at them you will not see something that is common to all, but similarities, relationships, and a whole series of them at that. To repeat: don’t think, but look!—Look for example at board-games, with their multifarious relationships. Now pass to card-games; here you find many correspondences with the first group, but many common features drop out, and others appear. When we pass next to ball-games, much that is common is retained, but much is lost.—Are they all ‘amusing’? Compare chess with noughts and crosses. Or is there always winning and losing, or competition between players? Think of patience. In ball games there is winning and losing; but when a child throws a ball at the wall and catches it again, this feature has disappeared. Look at the parts played by skill and luck; and at the difference between skill in chess and skill in tennis. Think now of games like ring-a-ring-a-roses; here is the element of amusement, but how many other characteristic features have disappeared! And we can go through the many, many other groups of games in the same way; can see how similarities crop up and disappear.

“And the result of this examination is: we see a complicated network of similarities overlapping and criss-crossing: sometimes overall similarities, sometimes similarities of detail.

“I can think of no better expression to characterize these similarities than ‘family resemblances’; for the various resemblances between members of a family: build, features, color of eyes, gait, temperament, etc. etc. overlap and criss-cross in the same way.—And I shall say: games form a family” (Wittgenstein, 1953).

Anyone who has thought carefully about the definitions of rhetorical terms such as poem, paragraph, essay, narrative, or exposition will see the application of Wittgenstein’s observations. The objects in the world that people denote using the word poem share no common characteristic or set of characteristics. Thus defining the term poem in the traditional Aristotelian way, by genus and differentia, is impossible, because all the characteristics of individual poems—rhyme, rhythm, musical language, strong emotion, the voice of a speaker—fail as essential, defining characteristics. None definitively delineates the group of poems, bounding those things and only those things that are poems and therefore excluding all those that are not.

Fortunately, studies in cognitive science have provided a new approach to the description of groups that improves upon Aristotelian categorization. As George Lakoff points out at length in his fascinating work Women, Fire, and Dangerous Things, most sets that people actually use are ill formed in the same way that the set of poems is ill formed (Lakoff, 1987). They are like the set of women, fire, and dangerous things designated by the word balan in the Aboriginal language Dyirbal. Despite the bankruptcy of Aristotelian categorical thinking, people are nonetheless able to deal in a practical way with sets or categories because they think about them in a way that has more in common with fuzzy logic than with Aristotelian syllogistic. Studies by Rosch and others have shown that people tend to form categories as more or less loose associations around “perceptually salient ‘natural prototypes’” (Rosch, 1973). For example, people can easily choose from a list of birds certain species—sparrows and robins, to be precise—that they think of as “most birdlike.” Other species, such as owls, penguins, ostriches, cassowaries, rheas, and blue-footed boobies, are less so. They are birds, yes, but not as clearly so as sparrows and robins are. Studies of children have shown that they tend to learn prototypes and their characteristics first and superordinate or subordinate categories later.

Like the theory of natural types that preceded it, the theory of natural prototypes has dramatic consequences for practical rhetoric, for it invites us to revisit and rethink the taxonomic basis of writing instruction. Consider, for example, how we go about teaching the writing of paragraphs. In 1866, Alexander Bain published his English Composition and Rhetoric: A Manual, the great-grandfather of the writing textbooks of today. It was Bain who first characterized the paragraph as school texts have characterized it ever since: as a group of sentences related to or supporting a single topic sentence and characterized by unity and coherence. Here we have a classic categorical definition. The set of paragraphs has these essential, or defining, characteristics:

  • Possession of a topic sentence
  • Possession of a number of sentences related to or supporting the topic sentence
  • Unity
  • Coherence

Building on this definition, a school text might provide the following heuristic for writing a paragraph: “State a general idea. Then back it up with specific details (or examples or instances). Make sure not to include any unrelated ideas, and make sure to make the connections among your ideas clear by using transitions.”

Of course, individual paragraphs in the real world simply do not fit the standard textbook definition, though that definition has been repeated with only minor variation ever since Bain. Most pieces of writing and, ipso facto, most paragraphs are narrative, and rarely does a narrative paragraph have a topic sentence. Narrative paragraphs are typically just one damned thing after another. Two of the most common types of paragraphs, those that make up newspaper articles and those that present dialogue in stories, typically contain only one or two sentences, and a paragraph in dialogue can be as short as a grunt or an exhalation. And, of course, it makes little sense to speak of a sentence or fragment as being unified or coherent in the senses in which those terms are usually used when describing paragraphs.

The fact is that the traditional definition of a paragraph describes the fairly rare case in which a single general main idea is illustrated by specifics. Few paragraphs in the real world work that way. Throw a dart at a page in Harper’s magazine. You will not hit a Bain-style paragraph. There are many, many other ways to put several sentences together sensibly. The narrative way is the simplest: Present one damned thing after another. But one can also write quite an effective paragraph that, for example, consists of a thesis, an antithesis, and a synthesis; such a paragraph comes to a conclusion but has no overall main idea in any reasonable sense of the term “main idea.” Many well-crafted nonnarrative paragraphs depart radically from the schoolbook model, having no overall, paragraph-level organizational scheme but, rather, only a part-by-part organization in which each sentence is connected to the one before it and to the one after it in any of a myriad of ways. In such cases, the writer often begins a new paragraph simply because he or she has run out of steam. Whew! The study of these part-by-part connections that hold ideas together is sometimes referred to as discourse analysis.

What the theory of natural prototypes allows us to do is to posit a prototypical paragraph—Bain’s model, for example—and then present variations on the theme. So, after presenting the prototypical Bain-style paragraph, we might present variations like these: Topic sentence first. Topic sentence last. Embedded topic sentence. Implied topic sentence. One-sentence paragraph. Two-sentence paragraph. Paragraph that is just a series of events with no topic sentence. Dialogue paragraph. Introductory paragraph. Clincher paragraph. Transitional paragraph. One-sentence paragraph for emphasis. And so on. And it would certainly be worthwhile to combine this study of paragraph-level structures with activities that expose children to and give them practice in creating pairs or groups of sentences with a wide variety of relations: addition, negation, conjunction, generalization and example, generalization and deduction, examples and inductive generalization, whole followed by parts, parts followed by whole, cause and effect, effect and cause, entity and its characteristics, opinion and support, nonsequitur, entity and judgment or evaluation upon it, ascending hierarchy, descending hierarchy, relation in space or time, and so on. (These possible relations between utterances can be multiplied indefinitely, but implicit familiarity with the most common of them, acquired rather than explicitly learned, is surely a large part of the toolkit of a skillful writer.)

What I have suggested be done for instruction about paragraphs can, of course, be done for most of the ill-formed traditional rhetorical categories. So, for example, we might present Robert Frost’s “Stopping by Woods on a Snowy Evening” as a prototypical (lyric) poem. It rhymes. It has a regular meter. It presents the strong feelings of a speaker. It deals with nature. Then we could present variants up to and including found poems and concrete poems and epic poems and verse plays and dramatic monologues and slam poetry and prose poems like those of Margaret Atwood, and all the other types of poems that do not fit nicely into our prototypical set.

The vexed concept of modes of composition is ripe for such theme-and-variations treatment. We might begin, say, with a prototypical narrative—a fictional narrative with a standard plot structure; one that observes the unities of time, place, and action; one with an antagonist and a protagonist; one with a conflict that is introduced, developed, and resolved. Something like Stephen Vincent Benét’s “The Devil and Daniel Webster” or E. B. White’s Charlotte’s Web springs to mind. Then, having worked with our students to analyze and/or emulate the prototype, we might explore with them a number of variations with increasing remoteness from that prototype, down to and including nonlinear, open-ended, or recursive metafictions.

Such an approach would allow writers of composition textbooks and teachers of writing to avoid telling falsehoods to their students (e.g., “Most paragraphs have topic sentences” or “An essay is a short nonfiction composition with a controlling purpose.”) because it would replace general statements about categories with specific statements about specific prototypes and about specific variants. Furthermore, in writing instruction, specification, as opposed to generalization about whole categories, has virtues far beyond simple truthfulness. Whatever is specified can be described in terms of concrete operations for students to perform. (Invent a character by filling out this attributes sheet. Think of a conflict or struggle that this character might face. Invent a situation—a time, place, social setting—in which the character might be introduced to this conflict or struggle. And so on.)

The theme-and-variations approach to practical rhetoric would allow students’ understanding of rhetorical categories to develop naturally, as in real-life learning about practical matters such as chairs and trees and birds. Such an approach would lend itself naturally to true integration of writing and literature instruction and would make models, which are intrinsically interesting because of their concreteness, even more central to our teaching than they are now. Another advantage of this approach is that it would encourage fruitful, creative thought about the differences among variants. If we begin with a prototype for short fiction and move to a prototype for a particular kind of nonfiction story—say, one that, like the fictional story, involves a conflict—then we have created an occasion for posing penetrating, evocative questions that compare the two prototypes. If our students are old enough and sophisticated enough, we might ask, for example, to what extent any nonfiction story is fictionalized by virtue of having a narrative frame, such as that of the hero’s journey, imposed upon it. This is the fascinating and fruitful question posed by historiographer Hayden White thirty-five years ago: When we say we understand history, is that because we have imposed an archetypal narrative frame upon past events, and don’t we choose the very events to fit the frame, and doesn’t the framing therefore necessarily falsify (White, 1974)?

At any rate, reworking our textbooks and learning progressions to replace the theory of natural types they now instantiate with a theory of natural prototypes would amount to that rare thing, a real revolution in writing pedagogy. Couple that with some serious work in stylistics and in sentence combining and modification, and our classes would really be cooking. But those are subjects for other essays.

Works Cited

Applebee, Arthur N., et al. The Language of Literature, Grade 8. Evanston, IL: Houghton/McDougal, 2001.

Campbell, George. The Philosophy of Rhetoric. 1776.

Lakoff, George. Women, Fire, and Dangerous Things: What Categories Reveal about the Mind. Chicago: Univ. of Chicago P., 1987.

Rosch, Eleanor. “Natural Categories.” Cognitive Psychology, vol. 4, no. 3, May 1973, pp. 328–350.

White, Hayden. “The Historical Fact as Literary Artifact.” Clio, vol. 3, no. 3, June 1974, pp. 277–303.

Wittgenstein, Ludwig. Philosophical Investigations. 3rd ed. Trans. G. E. M. Anscombe. New York: Macmillan, 1953.

Posted in Teaching Literature and Writing

The Tractatus Comico-Philosophicus: Martin Heidegger

Martin Heidegger Cares (Except When He Doesn’t)

  1. We didn’t ask for this crap. We fell into it, like some amnesiac thrown onto a stage, without a script, in the middle of a play already underway. (So, your first reaction is, “Oh, gee, hi,” when it ought to be, “What are my ontological commitments here?” You would be asking the right question if philosophers had not forgotten the question of Being.)
  2. And to make things worse, the other actors don’t even realize that there is little in the way of a script, that they, collectively–Das Man–are to a large extent just making stuff up or mindlessly taking up whatever affordances are provided by the props and set pieces and other people that happen to be on stage in their historical time and place.*
  3. You could just play along, but that would be a big, fat lie. It wouldn’t be authentic. (So, your next reaction should be, “WTF?”)
  4. You know this in your heart of hearts because you are Dasein, the kind of being for whom its own being is in question.
  5. And the answer to the really big question about your being, you realize, is that soon you won’t be. You will die.
  6. So, you have anxiety, care about the future, which you express in projects. In fact, you ARE your caring, your projects; those cares push out the one REALLY BIG CARE.
  7. My projects were being a collaborator with fascists and writing a big book that I didn’t finish but published anyway and then a lot of dwelling in the woods where I encountered gods in the clearings. I’m the only guy who ever understood Hegel, and no one ever understood me, even though I was the greatest philosopher since Aristotle.

*They don’t make up the props and set pieces. Those have their own being, which discloses itself, sometimes as being alongside and sometimes as being ready to hand. Like all beings, they are the infinite sum of their potential appearances. I learned that from my teacher, Husserl, whom I repaid by barring him from the university where he taught me this and much, much else. What can I say? I was half genius and half provincial, Black Forest peasant, Rasse und Seele, and I was overcome by visions of a Volkish paradise to come.

Ed note: I often refer to Martin Heidegger when explaining the genetic fallacy. He is proof positive that one cannot discount a truth because it originated in the mind of a horrible human being. Heidegger was a horrible human being. And he was a horrible writer. But he was also one of the greatest thinkers who ever lived. The effort that it takes to learn his language is repaid and repaid and repaid.

Who said philosophy was difficult?

The tractatus comico-philosophicus. Dedicated to bringing the wisdom of the ages to all, for why shouldn’t you be as confused as they were?

Copyright 2014 by Robert D. Shepherd. All rights reserved.

Posted in Existentialism, Philosophy

The Tractatus Comico-Philosophicus: René Descartes

René Descartes Explains It All to You

  1. Here’s my method: I start by trying to doubt everything.
  2. I realize, however, that I cannot doubt that I am doubting, that I am thinking.
  3. Therefore, I exist. (I think, therefore, I am.)
  4. What am I thinking about? I am clearly and distinctly thinking of a supremely perfect being. Let’s call this being God.
  5. And, of course, in order to be supremely perfect, he must exist; that’s a tautology.
  6. In other words, it’s no more possible to envision God and have Him not exist than it is to envision a triangle that does not have three angles. Got that? You can trust me on this. I invented Analytic Geometry, after all.
  7. Now, for my final magic trick. You’re going to love this one: A supremely perfect being would not fool me about the important stuff.
  8. Therefore, all my prejudices and cultural presuppositions must be true. Did you really think I ever doubted those? Mon dieu!*

Who said philosophy was difficult?

Homework assignment 1: OK, the problem from 4 on is pretty clear, but what are the problems with 2 and 3? Those are trickier. How does he get from the fact that some thinking is going on to an “I” that is doing the thinking, if he hasn’t yet let in anything but the thinking itself? See the problem? Does this wreck his argument?

Homework assignment 2: Why doesn’t D’s ontological argument apply to ANY clearly and distinctly imagined perfect entity–a supremely perfect fairy flying hippo, for example? What saves his argument there (though the argument fails for other obvious reasons)?

*NB: There are two schools of thought about Descartes’ appropriation and update of the ontological argument from Anselm. Some think that he believed this crap. Others think he recognized that his method of doubt had taken him into dangerous territory (people were routinely put to death for such thinking in Descartes’ time) and so wisely tacked this “proof” onto the end of his skeptifest as an insurance policy, recognizing that bright folks would discount it (wink, wink), just as Lucretius, being no fool, tacked the invocation to Venus onto his De Rerum Natura (Message: It’s just atoms and the void, so chill, dude.) to keep himself from suffering the fate of Socrates.

The tractatus comico-philosophicus. Dedicated to bringing the wisdom of the ages to all, for why shouldn’t you be as confused as they were?

Copyright 2014 by Robert D. Shepherd. All rights reserved.

Posted in Philosophy, Philosophy of Mind