A Cup of Tea

Many years ago, a professor from one of the western world’s great universities went to visit the Japanese master Nan-in to learn about Zen. Nan-in invited the professor to sit and offered him tea. As Nan-in prepared the tea, the professor talked. And talked. And talked some more. Nan-in served the tea. He poured his visitor’s cup full, and then kept pouring. The professor watched the tea pouring onto the table and floor until he could no longer restrain himself. “It is overfull,” he said. “No more will go in!”

“Like this cup,” Nan-in replied, “you are full of ideas and opinions. How can I show you Zen unless you first empty your cup?”


[1] Adapted from 101 Zen Stories, by Nyogen Senzaki, 1919, a compilation of Zen anecdotes. Senzaki’s compilation also includes a translation of the Shasekishu, or Sand and Pebbles, a collection of Buddhist parables written in 1283 by the Japanese monk Muju.

Photo, Samovar Tea House, by Christopher Michel, file licensed under the Creative Commons Attribution 2.0 Generic license.

Copyright 2014, Robert D. Shepherd. All rights reserved. This file may be freely distributed as long as this copyright notice is retained.

Posted in Epistemology, Philosophy, Teaching Literature and Writing | 6 Comments

The Limits of Learning

OK, I admit it. I haven’t read The Vicar of Wakefield.

I’m always suspicious of people who have that air about them of having read everything. I’m onto them. Here’s why: Years ago, when I was an undergraduate at Indiana, I went to the library to work on a paper on Robert Frost. The Indiana University library was my Internet, in those days before the Internet, and also, to me, a kind of temple. In its seven million or so volumes was to be found, I felt, the collective experience of our species. Sometimes, I would just wander aimlessly in the stacks, like a mushroom hunter in an old-growth forest, pulling off the shelves these weird wonders: a fourteenth-century guide to courtly love, great monographs on the sand flea, grammars of Old Icelandic. I thought it wild and wacky, half mad, and altogether beautiful that someone would devote his or her life to the study of the sand flea.

But on this particular evening, long ago, I had work to do: the paper on Frost. As I stood there in the stacks looking at the library’s hundreds of books about Frost, an unsettling thought occurred to me. I knew that as an American male, I had a life expectancy of about 70 years. There are 52 weeks in a year. If I read a book in my subject area every week for the approximately 52 years left to me, I could read, in my lifetime, about 2,704 of those books. I didn’t even have time enough, in the rest of my life, to read the works of criticism of mid-century American literature in the library’s collection, much less those monographs on the sand flea.

Do you remember when you first learned that you were going to die? Most people learned this so early that they don’t recall having done so, but I must be a slow learner, for I remember vividly when I learned that remarkable fact. I was five or six and watching a Twilight Zone episode on a black-and-white television with rabbit ears. In the episode, a girl in rural Arkansas or Kentucky or someplace like that sold her soul to the devil in exchange for the love of the handsomest young man in town. As part of the bargain, she had to spend some of her evenings running about the countryside in the form of a mountain lion. A few days before the girl’s wedding, of course, the handsome young man joined a posse to hunt down the mountain lion, which had been terrorizing locals, and of course, not being a sensitive, environmentally conscious guy (he would have made a lousy husband anyway), he shot and killed her. So, there I was, at five or six, sitting on the floor in my Dr. Denton’s and bawling my eyes out when my grandmother came in to see why I was fussing. When I told her, she looked at me in her no-nonsense sort of way and said, “Why, child, everybody’s gonna die sometime.” I lay awake for hours that night, aghast. Sometimes I still do.

For me, that later evening in the library was like learning that I was going to die all over again. I had come to Indiana University to become a scholar, and damn it, I was going to do so. I was going to read everything. Everything. I was going to become the kind of scholar whom people speak of in hushed and reverent tones. What that evening taught me, of course, is that whatever I chose to study professionally, I could barely put a crack in it.

And my professors. My God! I had found them out. I still revered them, some of them, for their learning, but . . . that amazing man E. Talbot Donaldson, the great medievalist, whose lectures I was privileged to attend and whose memory I shall forever honor, didn’t know squat about sand fleas. Freud wrote in his Introductory Lectures on Psychoanalysis about the trauma that kids go through when they figure out that their parents don’t know everything. He was wrong about that, as about much else. Kids get wise to us early on, and that’s a good thing, I think. It’s delightful to watch toddlers pushing the limits, probing, exploring what’s possible, finding out how far things go and when they break. Their elders should do a lot more of that.

But I admit that this recognition floored me. I would have to resign myself, forever, to being mostly ignorant about mostly everything. There is a magnificent literature in Korean, full of beauty and insight that would deepen my understandings beyond measure, but I shall probably never, ever know it. It’s on my list, but art is long, and life is short.

Copyright 2014, Robert D. Shepherd. All rights reserved. This little essay may be freely distributed as long as this copyright notice is retained.

Posted in Epistemology, Teaching Literature and Writing | 2 Comments

Aiden Reading on the Way to Preschool


So, Aiden was asking whether the being of self-consciousness is such that in its being its being is in question, and I said, “Come on, Aiden, you’re old enough to look that up yourself,” which he did, but not before pointing out that I was clearly acting in bad faith and should make it up to him with a juice box.

Image | Posted on by | 2 Comments

Prototypes versus Aristotelian Categories in the Teaching of Writing


During the last few decades of the twentieth century, rhetorical ideas dominated academic discourse in the humanities. It is difficult to overstate, for example, the influence during that time of such ideas as “All speech is political” or “Readers construct texts.” Ironically, however, all the late twentieth-century academic Sturm und Drang about rhetoric had very little influence on actual practice in the teaching of writing. Rhetoric in the sense of the quotidian practice of writing teachers appears to be one of those fields like the building of houses in which true innovation is extremely rare. From time to time, of course, someone comes along and suggests that favelas made of corrugated tin be replaced by homes built of discarded tires or that kids’ compositions might be scored holistically, but given the long history of the teaching of rhetoric, it is surprising how rarely our basic paradigms have undergone more than minor modification. To an extent not generally appreciated, teachers of writing run their wagons in ruts produced by Aristotle two millennia ago. It’s time to get out of those ruts, which, as ruts do, keep us from going anywhere we’ve not already been. In particular, practical rhetoric can benefit tremendously from throwing over concepts formulated by means of Aristotelian categorical thinking.

Aristotle was the archetypal taxonomist. Key to his thought is the notion that entities in the world can be understood by delineation of their essential properties. So, for example, a concatenation of properties such as “has webbed feet,” “has a bill,” “quacks,” and so on defines a category—a class or set—of things that we call ducks. This category has external reality—it exists in nature—and so Aristotle’s theory of categories is called the theory of natural kinds. An Aristotelian essential property is a sine qua non. Therefore, testing for class membership comes down to testing for one or more essential properties.
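
To make the contrast with what follows concrete, here is a minimal sketch, entirely my own illustration rather than anything from Aristotle or from the rhetoric texts discussed below, of categorization by essential properties: membership is all-or-nothing, decided by a checklist of sine qua nons. The property list is invented for the example.

```python
# A toy model of Aristotelian "natural kinds": an entity belongs to the category
# "duck" if and only if it has every essential property. Membership is binary.
# The essential properties below are hypothetical, echoing the example above.

ESSENTIAL_DUCK_PROPERTIES = {"has webbed feet", "has a bill", "quacks"}

def is_duck(observed_properties: set) -> bool:
    """Test for class membership by testing for the essential properties."""
    return ESSENTIAL_DUCK_PROPERTIES.issubset(observed_properties)

print(is_duck({"has webbed feet", "has a bill", "quacks", "swims"}))  # True
print(is_duck({"has a bill", "swims"}))  # False: no such thing as a partial duck
```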

Beginning with Aristotle and continuing down to the present day, rhetoric, like most other fields of intellectual endeavor in the West, has been powerfully influenced by the theory of natural kinds. Thus George Campbell, one of the earliest of the great modern rhetoricians writing in English, delineates the kinds, or modes, of speech:

“All the ends of speaking are reducible to four; every speech being intended to enlighten the understanding, to please the imagination, to move the passions, or to influence the will” (Campbell, 1776).

And one finds a devolved version of this sort of thing in modern textbooks:

“Descriptive writing allows you to paint word pictures about anything and everything in the world. . . . Narrative writing tells a story. . . . Explanatory writing informs and explains. . . . Persuasive writing allows you to use the power of language to inform and influence others.” (Applebee, 2001).

(Note that Campbell’s classification, made two and a quarter centuries ago, has distinct advantages over the contemporary one, being based as it is on an appeal to different rhetorical functions vis-à-vis an audience and on reasonably distinct human faculties, in accordance with the “faculty psychology” of his day.)

But categorical thinking has inherent problems. Ludwig Wittgenstein famously attacked the theory of natural kinds in his Philosophical Investigations:

“Consider . . . the proceedings that we call ‘games.’ I mean board-games, card-games, ball-games, Olympic games, and so on. What is common to them all?—Don’t say: ‘There must be something common, or they would not be called games’—but look and see whether there is anything common to all.—For if you look at them you will not see something that is common to all, but similarities, relationships, and a whole series of them at that. To repeat: don’t think, but look!—Look for example at board-games, with their multifarious relationships. Now pass to card-games; here you find many correspondences with the first group, but many common features drop out, and others appear. When we pass next to ball-games, much that is common is retained, but much is lost.—Are they all ‘amusing’? Compare chess with noughts and crosses. Or is there always winning and losing, or competition between players? Think of patience. In ball games there is winning and losing; but when a child throws a ball at the wall and catches it again, this feature has disappeared. Look at the parts played by skill and luck; and at the difference between skill in chess and skill in tennis. Think now of games like ring-a-ring-a-roses; here is the element of amusement, but how many other characteristic features have disappeared! And we can go through the many, many other groups of games in the same way; can see how similarities crop up and disappear.

“And the result of this examination is: we see a complicated network of similarities overlapping and criss-crossing: sometimes overall similarities, sometimes similarities of detail.

“I can think of no better expression to characterize these similarities than ‘family resemblances’; for the various resemblances between members of a family: build, features, color of eyes, gait, temperament, etc. etc. overlap and criss-cross in the same way.—And I shall say: games form a family” (Wittgenstein, 1953).

Anyone who has thought carefully about the definitions of rhetorical terms such as poem, paragraph, essay, narrative, or exposition will see the application of Wittgenstein’s observations. The wide range of objects in the world that people denote using the word poem has no common characteristic or set of characteristics. Thus defining the term poem in the traditional Aristotelian way, by genus and differentia, is impossible because all the characteristics of individual poems—rhyme, rhythm, musical language, strong emotion, the voice of a speaker—fail as essential, defining characteristics. None definitively delineates the group of poems, bounding those things and only those things that are poems and therefore excluding all those that are not.

Fortunately, studies in cognitive science have provided a new approach to the description of groups that improves upon Aristotelian categorization. As George Lakoff points out at length in his fascinating work Women, Fire, and Dangerous Things, most sets that people actually use are ill formed in the same way that the set of poems is ill formed (Lakoff, 1987). They are like the set of women, fire, and dangerous things designated by the word balan in the Aboriginal language Dyirbal. Despite the bankruptcy of Aristotelian categorical thinking, people are nonetheless able to deal in a practical way with sets or categories because they think about them in a way that has more in common with fuzzy logic than with Aristotelian syllogistic. Studies by Rosch and others have shown that people tend to form categories as more or less loose associations around “perceptually salient ‘natural prototypes’” (Rosch, 1973). For example, people can easily choose from a list of birds certain species—sparrows and robins, to be precise—that they think of as “most birdlike.” Other species, such as owls, penguins, ostriches, rheas, cassowaries, and blue-footed boobies, are less so. They are birds, yes, but not as clearly so as sparrows and robins are. Studies of children have shown that they tend to learn prototypes and their characteristics first and superordinate or subordinate categories later.
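
By way of contrast with the Aristotelian sketch above, here is a minimal sketch of prototype-style categorization in the spirit of Rosch’s findings. Again, this is my own toy illustration, not a method drawn from Rosch or Lakoff, and the features and weights are invented: membership is graded by resemblance to a salient prototype rather than decided by a checklist of essential properties.

```python
# A toy model of prototype-based categorization: each candidate gets a graded
# "birdlikeness" score measuring its overlap with a prototypical bird
# (something sparrow- or robin-like). Features and weights are invented.

PROTOTYPE_BIRD = {"flies": 1.0, "sings": 1.0, "small": 1.0, "perches in trees": 1.0}

def birdlikeness(features: dict) -> float:
    """Return a 0-to-1 score of overlap with the prototype's weighted features."""
    overlap = sum(min(features.get(f, 0.0), w) for f, w in PROTOTYPE_BIRD.items())
    return overlap / sum(PROTOTYPE_BIRD.values())

robin = {"flies": 1.0, "sings": 1.0, "small": 1.0, "perches in trees": 1.0}
penguin = {"flies": 0.0, "sings": 0.0, "small": 0.0, "swims": 1.0}

# Both are birds, but one is far more "birdlike" than the other:
print(birdlikeness(robin))    # 1.0
print(birdlikeness(penguin))  # 0.0 -- a bird, but nothing like the prototype
```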

Like the theory of natural kinds that preceded it, the theory of natural prototypes has dramatic consequences for practical rhetoric, for it invites us to revisit and rethink the taxonomic basis of writing instruction. Consider, for example, how we go about teaching the writing of paragraphs. In 1866, Alexander Bain published his English Composition and Rhetoric: A Manual, the great-grandfather of the writing textbooks of today. It was Bain who first characterized the paragraph as school texts have ever since, as a group of sentences related to or supporting a single topic sentence and characterized by unity and coherence. Here we have a classic categorical definition. The set of paragraphs has these essential, or defining, characteristics:

  • Possession of a topic sentence
  • Possession of a number of sentences related to or supporting the topic sentence
  • Unity
  • Coherence

Building on this definition, a school text might provide the following heuristic for writing a paragraph: “State a general idea. Then back it up with specific details (or examples or instances). Make sure not to include any unrelated ideas, and make sure to make the connections among your ideas clear by using transitions.”

Of course, individual paragraphs in the real world simply do not fit the standard textbook definition, though that definition has been repeated with only minor variation ever since Bain. Most pieces of writing and, ipso facto, most paragraphs, are narrative, and rarely does a narrative paragraph have a topic sentence. Narrative paragraphs are typically just one damned thing after another. Two of the most common types of paragraphs, those that make up newspaper articles and those that present dialogue in stories, typically contain only one or two sentences, and a paragraph in dialogue can be as short as a grunt or an exhalation. And, of course, it makes little sense to speak of a sentence or fragment as being unified or coherent in the senses in which those terms are usually used when describing paragraphs.

The fact is that the traditional definition of a paragraph describes the fairly rare case in which a single general main idea is illustrated by specifics. Few paragraphs in the real world work that way. Throw a dart at a page in Harper’s magazine. You will not hit a Bain-style paragraph. There are many, many other ways to put several sentences together sensibly. The narrative way is the simplest: Present one damned thing after another. But one can also write quite an effective paragraph that, for example, consists of a thesis, an antithesis, and a synthesis; such a paragraph comes to a conclusion but has no overall main idea in any reasonable sense of the term “main idea.” Many well-crafted nonnarrative paragraphs depart radically from the schoolbook model, having no overall, paragraph-level organizational scheme but, rather, only a part-by-part organization in which each sentence is connected to the one before it and to the one after it in any of myriad ways. In such cases, the writer often begins a new paragraph only because he or she has exhausted one head of steam. Whew! The study of these part-by-part connections that hold ideas together is sometimes referred to as discourse analysis.

What the theory of natural prototypes allows us to do is to posit a prototypical paragraph—Bain’s model, for example—and then present variations on the theme. So, after presenting the prototypical Bain-style paragraph, we might present variations like these: Topic sentence first. Topic sentence last. Embedded topic sentence. Implied topic sentence. One-sentence paragraph. Two-sentence paragraph. Paragraph that is just a series of events with no topic sentence. Dialogue paragraph. Introductory paragraph. Clincher paragraph. Transitional paragraph. One-sentence paragraph for emphasis. And so on. And it would certainly be worthwhile to combine this study of paragraph-level structures with activities that expose children to and give them practice in creating pairs or groups of sentences with a wide variety of relations: addition, negation, conjunction, generalization and example, generalization and deduction, examples and inductive generalization, whole followed by parts, parts followed by whole, cause and effect, effect and cause, entity and its characteristics, opinion and support, non sequitur, entity and judgment or evaluation upon it, ascending hierarchy, descending hierarchy, relation in space or time, and so on. (These possible relations between utterances can be multiplied indefinitely, but an implicit familiarity with the most common of them, acquired rather than explicitly learned, is surely a large part of the toolkit of a skillful writer.)

What I have suggested be done for instruction about paragraphs can, of course, be done for most of the ill-formed traditional rhetorical categories. So, for example, we might present Robert Frost’s “Stopping by Woods on a Snowy Evening” as a prototypical (lyric) poem. It rhymes. It has a regular meter. It presents the strong feelings of a speaker. It deals with nature. Then we could present variants up to and including found poems and concrete poems and epic poems and verse plays and dramatic monologues and slam poetry and prose poems like those of Margaret Atwood, and all the other types of poems that do not fit nicely into our prototypical set.

The vexed concept of modes of composition is ripe for such theme-and-variations treatment. We might begin, say, with a prototypical narrative—a fictional narrative with a standard plot structure; one that observes the unities of time, place, and action; one with an antagonist and a protagonist; one with a conflict that is introduced, developed, and resolved. Something like Stephen Vincent Benét’s “The Devil and Daniel Webster” or E. B. White’s Charlotte’s Web springs to mind. Then, having worked with our students to analyze and/or emulate the prototype, we might explore with them a number of variations with increasing remoteness from that prototype, down to and including nonlinear, open-ended, or recursive metafictions.

Such an approach would allow writers of composition textbooks and teachers of writing to avoid telling falsehoods to their students (e.g., “Most paragraphs have topic sentences” or “An essay is a short nonfiction composition with a controlling purpose.”) because it would replace general statements about categories with specific statements about specific prototypes and about specific variants. Furthermore, in writing instruction, specification, as opposed to generalization about whole categories, has virtues far beyond simple truthfulness. Whatever is specified can be described in terms of concrete operations for students to perform. (Invent a character by filling out this attributes sheet. Think of a conflict or struggle that this character might face. Invent a situation—a time, place, social setting—in which the character might be introduced to this conflict or struggle. And so on.)

The theme-and-variations approach to practical rhetoric would allow students’ understanding of rhetorical categories to develop naturally, as in real-life learning about practical matters such as chairs and trees and birds. Such an approach would lend itself naturally to true integration of writing and literature instruction and would make models, which are intrinsically interesting because of their concreteness, even more central to our teaching than they are now. Another advantage of this approach would be that it would encourage fruitful, creative thought about the differences among variants. If we begin with a prototype for short fiction and move to a prototype for a particular kind of nonfiction story—say one that, like the fictional story, involves a conflict—then we have created an occasion for posing penetrating, evocative questions that compare the two prototypes. If our students are old enough and sophisticated enough, we might ask, for example, to what extent any nonfiction story is fictionalized by virtue of having a narrative frame, such as that of the hero’s journey, imposed upon it. This is the fascinating and fruitful question posed by the historiographer Hayden White forty years ago: When we say we understand history, is that because we have imposed an archetypal narrative frame upon past events, and don’t we choose the very events to fit the frame, and doesn’t the framing therefore necessarily falsify (White, 1974)?

At any rate, reworking our textbooks and learning progressions to replace the entrenched theory of natural kinds with a theory of natural prototypes would amount to that rare thing, a real revolution in writing pedagogy. Couple that with some serious work in stylistics and in sentence combining and modification, and our classes would be really cooking. But those are subjects for other essays.

Works Cited

Applebee, Arthur N., et al. The Language of Literature, Grade 8. Evanston, IL: Houghton/McDougal, 2001.

Campbell, George. The Philosophy of Rhetoric. 1776. Available at http://people.cohums.ohio-state.edu/Ulman1/Campbell/

Lakoff, George. Women, Fire, and Dangerous Things: What Categories Reveal about the Mind. Chicago: Univ. of Chicago P., 1987.

Rosch, Eleanor. “Natural Categories.” Cognitive Psychology, vol. 4, no. 3 (May 1973): 328–350.

White, Hayden. “The Historical Text as Literary Artifact.” Clio, vol. 3, no. 3 (June 1974): 277–303.

Wittgenstein, Ludwig. Philosophical Investigations. 3rd ed. Trans. G. E. M. Anscombe. New York: Macmillan, 1953.

Posted in Teaching Literature and Writing | 2 Comments

The Tractatus Comico-Philosophicus: Martin Heidegger

Martin Heidegger Cares (Except When He Doesn’t)

  1. We didn’t ask for this crap. We fell into it, like some amnesiac thrown onto a stage, without a script, in the middle of a play already underway. (So, your first reaction is, “Oh, gee, hi,” when it ought to be, “What are my ontological commitments here?” You would be asking the right question if philosophers had not forgotten the question of Being.)
  2. And to make things worse, the other actors don’t even realize that there is little in the way of a script, that they, collectively–Das Man–are to a large extent just making stuff up or mindlessly engaging whatever affordances happen to be provided by the props and set pieces and other people that happen to be on stage in their historical time and place.*
  3. You could just play along, but that would be a big, fat lie. It wouldn’t be authentic. (So, your next reaction should be, “WTF?”)
  4. You know this in your heart of hearts because you are Dasein, the kind of being for whom its own being is in question.
  5. And the answer to the really big question about your being, you realize, is that soon you won’t be. You will die.
  6. So, you have anxiety, care about the future, which you express in projects. In fact, you ARE your caring, your projects; those cares push out the one REALLY BIG CARE.
  7. My projects were being a collaborator with fascists and writing a big book that I didn’t finish but published anyway and then a lot of dwelling in the woods where I encountered gods in the clearings. I’m the only guy who ever understood Hegel, and no one ever understood me, even though I was the greatest philosopher since Aristotle.

*They don’t make up the props and set pieces. Those have their own being, which discloses itself, sometimes as being alongside and sometimes as being ready to hand. Like all beings, they are the infinite sum of their potential appearances. I learned that from my teacher, Husserl, whom I repaid by barring him from the university where he taught me this and much, much else. What can I say? I was half genius and half provincial, Black Forest peasant, Rasse und Seele, and I was overcome by visions of a Volkish paradise to come.

Ed note: I often refer to Martin Heidegger when explaining the genetic fallacy. He is proof positive that one cannot discount a truth because it originated in the mind of a horrible human being. Heidegger was a horrible human being. And he was a horrible writer. But he was also one of the greatest thinkers who ever lived. The effort that it takes to learn his language is repaid and repaid and repaid.

Who said philosophy was difficult?

The tractatus comico-philosophicus. Dedicated to bringing the wisdom of the ages to all, for why shouldn’t you be as confused as they were?

Copyright 2014 by Robert D. Shepherd. All rights reserved.

Posted in Existentialism, Philosophy | Leave a comment

Philosophical Zombies with Chairs in Philosophy of Mind, or Confessions of a Neo-Dualist


For Rebecca Goldstein

Daniel Dennett called one of his books Consciousness Explained. He should have called it Why, Even with My Gifts, I Cannot Explain Consciousness Away. Dennett gives a compelling account of how various unconscious events occur in our brains and how some of these come to our attention, and he makes jolly good fun of what he calls the “Cartesian theatre” with its “homunculus” inside, “watching the show,” but he stops just short of explaining away consciousness, and there is a reason for this: It cannot be done. I shall argue, below, that some of the pretheoretical intuitions that nonphilosophers have about the reality of consciousness are surprisingly robust, clear, and distinct, while the dismissal of those intuitions by orthodox contemporary professional Anglo-American philosophers (who, by and large, accept some version of physicalism or functionalism) is an article of “faith” strong enough to cause them to dismiss that which is not only the clearest, most distinct of our understandings but also prerequisite to any clarity or distinctness. For the physicalists and functionalists, believing is seeing. Theirs is a kind of theology, and, as Wallace Stevens says, “theology after breakfast sticks to the eye.”

There are many, many binary oppositions that seem to entail a default dualism on the part of those who employ them: mind/body, immaterial/material, universal/particular, proposition/sentence in a language, use/mention, a priori/a posteriori, sentience/nonsentience, being-for-itself/being-in-itself, thought/action, nominal/real, rule-creating/rule-following, subjective/objective, first-person/third-person, free/determined, teleological/nonteleological, experiencer/experienced. Contemporary reductionists who would exorcise mind from our vocabulary as though it were necessarily a ghost in the machine aren’t taking, I think, a scientific attitude with regard to the mind/body problem. George Santayana wrote in Reason in Religion that it’s easy enough for a worm-eaten old satirist to poke fun at the scientific inaccuracies of religion, but it’s much more difficult to account for it. The same is true of the default dualism that informed the development of those binaries. The burden is upon us, as philosophers, to account for the fact of the overwhelming tendency, throughout the centuries, of philosophers and nonphilosophers alike to carve up nature in that way. That most people, some pretheoretically and some theoretically, have considered it given that such binaries carve nature at its joints, to use an unpleasant but apt analogy, is itself a fact of nature that it is the job of our science to observe and explain. That these binaries are perfectly intelligible and that each is well-attested in our experience of the world means something, as I hope to show, below, and it means more than simply that we’ve always been confused, as when we thought that the sun rose and night fell. The inescapable fact is that intelligibility itself requires an ontological entity to which a concept can be said to be intelligible—that spooky experiencing mind that reductionists so anxiously wish to rid us of.

Let’s deal with the last of these challenges to reductionism first. It’s simple enough to create physical processes that instantiate functional analogues of concepts, such as the sum of an addition, but there’s a difference between such a set of processes and an entity such as ourselves that experiences the having of concepts. That’s the point, of course, of Searle’s Chinese Room Gedankenexperiment, which has never been adequately refuted. People who identify the two are making a category mistake as surely as do those who confuse, say, use and mention. What is true of concepts and other ideas such as beliefs is true, as well, of qualia. One can say that my experience of pain entails that I have these C-fiber impulses (such an experience does except in a few bizarre, wildly pathological states), but one cannot say that the experience IS those. C-fiber impulses and experienced pains are ontologically distinct entities. A feeling is not the neural impulses that give rise to it. A mind is not a brain. Entailment or supervenience is not identity. One cannot make the mind-body problem go away, as, say, Richard Rorty attempts to do in Philosophy and the Mirror of Nature, by claiming that it’s just a matter of using the language wrong, of hypostatizing a predicate and making it a subject, in which case “This pain is terrible” is, like “Redness is yummy,” a loose use of language. (Rorty’s argument is a little more complicated, but that’s what it boils down to.) The mind-body problem and the universals-particulars problem are no more the same than a proposition is the same as the English (or Swahili) sentence that states it. Only a highly trained analytic philosopher could come to Rorty’s conclusion, and it’s conclusions that preposterous that give philosophy a bad reputation among people who think that philosophers are supposed to be MORE careful in their thinking than the rest of us are. (Though, of course, there are highly trained analytic philosophers, Saul Kripke, for example, who don’t speak so outlandishly on this subject. See his discussion of pains and brains in Naming and Necessity.)

Because of the concomitance of physical and mental states, there is, of course, a sense in which talk about mental entities/experiences/states is adjectival, and it’s philosophically interesting that that’s so, but one can’t say, as Rorty does, that making that observation “dissolves” the mind-body problem. Neither is there anything scientific about trying to do so. A scientist does not ignore some facts, in this case the experience of entertaining a thought or of feeling a pain, just because they are inconvenient. Instead, a real scientist says about unexplained phenomena, “There are some things that I don’t yet understand.” There is no shame in that. Quite the contrary. That’s just being honest about the current state of knowledge and understanding. Dennett dismissively calls those who think that there is a hard problem of consciousness mysterians. But isn’t the proper response to a mystery to say, “That’s mysterious”? What other response is possible except silence?

One must sit down before the facts like a little child, Huxley said. That statement is in the true spirit of science and of philosophy, and it is in this spirit that we must reject the reductionist program that runs from Laplace, through the Vienna Circle misinterpreters of the younger Wittgenstein, to Ayer and Carnap and Ryle, to the twentieth-century behaviorists and the contemporary physicalists, functionalists, and heterophenomenologists, a program that simply ignores much of what requires explanation. As Einstein is reputed to have said but did not, to our knowledge, actually say in precisely these words, our job is to seek answers that are as simple as possible, BUT NOT SIMPLER. Qualia (perceptions, bodily sensations, imagery, emotions) and other mental experiences (representations, beliefs, concepts, goals, intuitions) are not the physical processes on which they supervene. The world we experience has this dual aspect. As David Chalmers put it, that I have toes and that I have thoughts are very different things. That’s simply the way things are. The universe contains us and other sentient entities with these spooky things called experiences, and professional philosophers need to accept that if they are to stop talking what is clearly arrant, errant nonsense.

Imagine a grandfather and his grandchild watching a PBS production of Romeo and Juliet on television. Romeo finds Juliet in the crypt, seemingly dead, and kills himself. The grandchild exclaims, “Why did he do that?!” The grandfather explains that a television camera recorded some actors in a studio, that the analogue recording was edited and sampled and digitized and stored as orientations of metal filaments on disc drives, that the recordings were played back by means of a mechanism that translated those orientations of metal filaments into streams of photons sent down fiber optic cables to a set-top box that transduced them into streams of electrons shot by a gun onto a fluorescent screen at one end of a cathode ray tube. An explanation like the grandfather’s (much elaborated) might be close to complete (though it cannot, our science tells us, be complete), but IT IS NOT AN ANSWER TO THE GRANDCHILD’S QUESTION, which is asked AT A COMPLETELY DIFFERENT DESIGN LEVEL about phenomena at a completely different design level.

I suspect that there are two aspects of the hard problem of consciousness that make it particularly hard. One is that we have limited access to the way things are. We are like a child, confined to a room throughout her life, looking through a particular window at a particular courtyard and taking that courtyard for the world. Many years ago, the semiotician Jakob von Uexküll asked us to consider the lowly tick. A tick has four senses: she can sense light on her back; she can smell butyric acid, given off by the skin glands of mammals; she can sense temperatures in a narrow range around 35 degrees centigrade (the temperature of mammalian blood); she can feel with her feet. That’s it. That’s the entire access to the universe of a tick. Now, imagine that it is raining. For the tick, it’s welling up. The point is that we are ticks, too. There are explanations for phenomena experienced by the tick to which the tick has no access whatsoever. Clearly, the same is true for us, as the examples of other creatures with different kinds of access and the changing of our access through the building of prostheses, such as spectroscopes and electron microscopes and superconducting supercolliders, demonstrate, but at any given time, there is much that we simply cannot know because we haven’t the perceptual or other cognitive tools, and in fact there are truths about the world that we are pretty sure that we shall never know, such as whether Socrates felt rain on his forehead on a particular day in his 23rd year. There’s no reason to think that we great apes have the conceptual machinery to understand why we have experiences as well as toes or how we can have real agency that is neither determined nor random. Yet it is impossible for us to dispense with these very real aspects of our lives. That’s problem one. The second problem is that many complexly interacting systems evince characteristics at the design level of the system that do not exist in the substrate of the system’s components. In other words, they give rise to emergent phenomena. We are existence proofs that certain complexly interacting physical systems give rise to the phenomena of qualia, mattering, agency, purpose, freedom, what Sartre referred to as nonpositional reflective being, contents of consciousness, situatedness, thrownness, fallenness, orientation to Others, mind. We haven’t a clue how that can be so. But it is.

Some reductionists have fallen back on an epiphenomenal account of consciousness. Yes, experiences are quite real, but they ride on top of and are entirely accounted for by bottom-up systems (whether or not these are conceived of as deterministic). However, it should be obvious enough to those with scientific dispositions that nature is rarely so frivolous, so lacking in economy, as to create something complex that serves no purpose whatsoever. It’s a violation of the economy, the parsimony, so evident, everywhere, in nature to believe that experiencing does not, itself, play a causative role, top down. It’s difficult to understand why the same sorts of people who would accept as wholly reasonable the heuristic of, say, cladistic parsimony (motivated by repeatedly confirmed economy in nature) would think of experiencing as an inconsequential, non-causative free rider.

It would be the ULTIMATE irony, wouldn’t it, if that were so? That that to which things matter is the one thing that doesn’t matter?

The stubborn persistence of Anglo-American academic philosophers in their denial of mind is almost enough to make one think that David Chalmers’s philosophical zombies actually exist. Maybe there are such entities, and they all hold chairs of philosophy, and for THEM, pains are just C-fiber impulses because they don’t have a clue (or a functional physical process resulting in an output interpretable by a sentient consciousness as a clue) about what it means when people talk about “the experience of a pain.” Maybe one of those was named Richard Rorty. Maybe another is named Daniel Dennett. Maybe these zombie philosophers are marvelous to behold–philosophy machines of enormous sophistication–but just don’t have any there there, to borrow Gertrude Stein’s marvelous phrase. Maybe Kripke and Nagel and Chalmers have qualia and Dennett and the Churchlands, say, don’t. That would explain a lot.

Dennett points out that people are default dualists. As an evolutionist, he should have paused over that. It’s like saying that we’re default eaters or default procreators. We’re that way because it is possible for stuff to have experiences, because having experiences itself confers survival value and aids reproduction, enables us in this minimal sense to fare well, and evolution is a machine for faring well in the world. There’s nothing supernatural about any of this, of course, for whatever is, is the natural. But there’s nothing to suggest that experiencing is necessarily explainable by creatures such as us or by any creature living in a universe constructed like ours. Perhaps in the mind/body problem we are bumping up against our current cognitive limits or against more fundamental limits to any understanding that we might ever develop. There is much in nature that remains mysterious, much that our science tells us MUST remain mysterious, much that many think is not knowable, in principle (such as the precise simultaneous position and momentum of an elementary particle).

Some of the Gnostics said that first there was Sophia, and she gave rise to Eros, but they had it backward. First, there was the raw fact that some entities fared well and others didn’t. Blind processes (perhaps–we have scant evidence to the contrary) gave rise, in time, to a spectacularly successful means for faring well, to sentience and to an essential characteristic of sentient entities, the ability to reason, which is deeper than and considerably antecedent to the ability to create symbolic representations. I suspect, as Ginsburg and Jablonka have argued, that sentience is very, very ancient, that the first life forms with some version of sentience arose at the end of the Precambrian and that that event in the history of life accounts, in part, for the Cambrian Explosion because, of course, even rudimentary sentience has enormous survival value. Perhaps inexplicable experiencing and making of choices acts upon the substrate that first gave rise to it, making it different in ways that confer survival value; perhaps reasons are fundamentally different from reactions, as the naively unphilosophical are inclined to think when they contrast a twitch with a grasp. Certainly, the choices that we make at the wholly different design level of the experiencing mind affect our wiring. Philosophy, if it is to be, in fact, fundamental, needs to go back to its source, its spring. It should conceive itself as the art of applying reason to the goal of faring well, which is why reason evolved in the first place.

While I’m dissociating myself from a number of received notions current among reigning philosophic and scientific intelligentsia, I might as well add this: It’s a commonplace of evolutionary theory that evolution is not teleological, and there is a sense, of course, in which that is true. But once the blind process of evolution hit upon the strategy of creating conscious minds, it gave birth to purposeful designers and purposeful design and so became teleological. We choose our mates, and often enough, fortunately, we do so based on the quality of their minds, though we have no notion what those might be.

And finally, in the next step, should we survive, we are in the process of becoming the designers of the designing process itself and may even, in time, use technological means that exploit the supervenience of the mental upon the physical to develop the means to bridge the ontological gap between subjectivities (my mind over here, yours over there), as, perhaps, other entities throughout the universe have long since done. That’s a scary and exciting possibility, fraught with potential dangers and rewards. But it does seem that we are headed in that direction, and again, as was true after the arrival of sentience, everything changes.

BTW, even now we have a rough means for bridging that ontological gap, of course. We call it love.

Copyright 2013, Robert D. Shepherd. All rights reserved.

Posted in Metaphysics, Philosophy, Philosophy of Mind | 8 Comments

It’s about Time


  

A brief tour of fascinating (and lunatic) notions that philosophers (and a few poets) have had about time. 

The Mystery of Time

“What then is time? If no one asks me, I know; if I wish to explain it to one who asks, I know not.”

–St. Augustine (354–430 CE), Confessions

PART 1: What Is Time? Types of Time

Absolute or Scientific Newtonian Time

“Absolute, true and mathematical time, of itself, and from its own nature flows equably without regard to anything external, and by another name is called duration.”

–Sir Isaac Newton (1643–1727), Philosophiae naturalis principia mathematica (Mathematical Principles of Natural Philosophy)

The Specious (Nonexistent) Present

“The relation of experience to time has not been profoundly studied. Its objects are given as being of the present, but the part of time referred to by the datum is a very different thing from the conterminous of the past and future which philosophy denotes by the name Present. The present to which the datum refers is really a part of the past — a recent past — delusively given as being a time that intervenes between the past and the future. Let it be named the specious present, and let the past, that is given as being the past, be known as the obvious past. [Each of] all the notes of a bar of a song seem to the listener to be contained in the [specious] present. [Each of] all the changes of place of a meteor seem to the beholder to be contained in the [specious] present. At the instant of the termination of [each element in] such series, no part of the time measured by them seems to be [an obvious] past. Time, then, considered relatively to human apprehension, consists of four parts, viz., the obvious past, the specious present, the real present, and the future. Omitting the specious present, it consists of three . . . nonentities — the [obvious] past, which does not [really] exist, the future, which does not [yet] exist, and their conterminous, the [specious] present; the faculty from which it proceeds lies to us in the fiction of the specious present.”

–E. Robert Kelley, from The Alternative, a Study in Psychology (1882). Kelley’s concept of the specious present has been extremely influential in both Continental and Anglo-American philosophy despite the fact that Kelley was not a professional philosopher.

Subjective Time

“Oh, yeah. Hegel’s Phenomenology of Spirit. I never finished it, though I did spend about a year with it one evening.”

Experienced Time: The “Wide” Present

“In short, the practically cognized present is no knife-edge, but a saddle-back, with a certain breadth of its own on which we sit perched, and from which we look in two directions into time. The unit of composition of our perception of time is a duration, with a bow and a stern, as it were—a rearward- and a forward-looking end. It is only as parts of this duration-block that the relation or succession of one end to the other is perceived. We do not first feel one end and then feel the other after it, and forming the perception of the succession infer an interval of time between, but we seem to feel the interval of time as a whole, with its two ends embedded in it.”

–William James, “The Perception of Time,” from The Principles of Psychology, Book I

A, B, and C Series Time (Three Ways of Looking at Time)

  • The A Series: Time as Past, Present, and Future
  • The B Series: Time as Earlier, Simultaneous, and Later
  • The C Series: Time as an Ordered Relation of Events (with the direction being irrelevant)

Influential distinctions made by J. M. E. McTaggart in “The Unreality of Time,” Mind 17 (1908): 456–476. The three types are much discussed by philosophers in the Anglo-American analytic tradition.

See also the “Block Time” hypothesis in Part 2, below.

PART 2: Does Time Exist?

No, It Doesn’t: Change Is a Self-Contradictory Idea

“For this view can never predominate, that that which IS NOT exists. You must debar your thought from this way of search. . . .There is only one other description of the way remaining, namely, that what IS, is. To this way there are very many signposts: that Being has no coming-into-being . . . . Nor shall I allow you to speak or think of it as springing from not-being; for it is neither expressive nor thinkable that what-is-not is. . . . How could Being perish? How could it come into being? If it came into being, it is not; and so too if it is about-to-be at some future time. . . .For nothing else either is or shall be except Being, since Fate has tied it down to be a whole and motionless; therefore all things that mortals have established, believing in their truth, are just a name: Becoming and Perishing, Being and Not-Being, and Change of position, and alteration of bright color.”

–Parmenides of Elea (c. 475 BCE), fragment from The Way of Truth, in Ancilla to the PreSocratic Philosophers, ed. Kathleen Freeman

“Does the arrow move when the archer shoots it at the target? If there is a reality of space, the arrow must at all times occupy a particular position in space on its way to the target. But for an arrow to occupy a position in space that is equal to its length is precisely what is meant when one says that the arrow is at rest. Since the arrow must always occupy such a position on its trajectory which is equal to its length, the arrow must be always at rest. Therefore, motion is an illusion.”

–Zeno of Elea (c. 450 BCE), fragment from Epicheiremata (Attacks), in Ancilla to the PreSocratic Philosophers, ed. Kathleen Freeman

“One part of time has been [the past] and is not, while the other is going to be and is not yet [the future]. Yet time, both infinite time and any time you care to take, is made up of these. One would naturally suppose that what is made up of things which do not exist could have no share in reality.”

–Aristotle (384–322 BCE), Physics, IV, 10–14, 217b–224a.

Yes, It Does: Change Is the Fundamental Reality of Our Lives

“It is not possible to step twice into the same river.”

–Heraclitus, (c. 475 BCE), fragment from unnamed book, in Ancilla to the PreSocratic Philosophers, ed. Kathleen Freeman

[Heraclitus seems to have held this fact to be one of many indications of the essential unworthiness/irredeemability of this life; the other fragments of his writings that have survived suggest that Heraclitus was a kind of fifth-century-BCE fundamentalist preacher, upset about the moral decay around him, who viewed the world as synonymous with decay, and who wanted to point his readers, instead, toward the eternal Logos. Plato inherited this view; the Christian church inherited Plato’s. Such contemptus mundi (contempt for the world) is often, in that tradition, expressed as contempt for that which exists “in time” and is not eternal.]

“Time is nature’s way of keeping everything from happening at once.”

–Woody Allen (1935–      )


No, It Doesn’t: Time Is an Illusion Due to Vantage Point in an Eternal Spacetime (the “Block Time” Hypothesis)

“Now Besso has departed from this strange world a little ahead of me. That means nothing, for we physicists believe the separation between past, present, and future is only an illusion, although a convincing one.”

–Albert Einstein (1879–1955), in a letter written to the family of Michele Besso, on Besso’s death

“All time is all time. It does not change. It does not lend itself to warnings or explanations. It simply is. Take it moment by moment, and you will find that we are all, as I’ve said before, bugs in amber.”

–Kurt Vonnegut, Jr. (1922–2007), who is in heaven now, Slaughterhouse-Five

Time present and time past
Are both perhaps present in time future,
And time future contained in time past.
If all time is eternally present
All time is unredeemable.

–T.S. Eliot (1888–1965), “Burnt Norton,” from Four Quartets

No, It Doesn’t: The Now as Consequence of the Blindness of the Brain to Its Own Processing of Temporal Data (the “Blind Brain” Hypothesis)

“Nothing, I think, illustrates this forced magic quite like the experiential present, the Now. Recall what we discussed earlier regarding the visual field. Although it’s true that you can never explicitly ‘see the limits of seeing’–no matter how fast you move your head–those limits are nonetheless a central structural feature of seeing. The way your visual field simply ‘runs out’ without edge or demarcation is implicit in all seeing–and, I suspect, without the benefit of any ‘visual run off’ circuits. Your field of vision simply hangs in a kind of blindness you cannot see.

“This, the Blind Brain Hypothesis suggests, is what the now is: a temporal analogue to the edgelessness of vision, an implicit structural artifact of the way our ‘temporal field’–what James called the ‘specious present’–hangs in a kind of temporal hyper-blindness. Time passes in experience, sure, but thanks to the information horizon of the thalamocortical system, experience itself stands still, and with nary a neural circuit to send a Christmas card to. There is time in experience, but no time of experience. The same way seeing relies on secondary systems to stitch our keyhole glimpses into a visual world, timing relies on things like narrative and long term memory to situate our present within a greater temporal context.

“Given the Blind Brain Hypothesis, you would expect the thalamocortical system to track time against a background of temporal oblivion. You would expect something like the Now. Perhaps this is why, no matter where we find ourselves on the line of history, we always stand at the beginning. Thus the paradoxical structure of sayings like, “Today is the first day of the rest of your life.” We’re not simply running on hamster wheels, we are hamster wheels, traveling lifetimes without moving at all.

“Which is to say that the Blind Brain Hypothesis offers possible theoretical purchase on the apparent absurdity of conscious existence, the way a life of differences can be crammed into a singular moment.”

–Scott Bakker, “The End of the World As We Knew It: Neuroscience and the Semantic Apocalypse”

PART 3: What Contemplation of Time Teaches Us about Living

Carpe Diem

“Such,” he said, “O King, seems to me the present life of men on Earth, in comparison with that time which to us is uncertain, as if when on a winter’s night, you sit feasting . . . and a simple sparrow should fly into the hall, and coming in at one door, instantly fly out through another. In that time in which it is indoors it is indeed not touched by the fury of winter; but yet, this smallest space of calmness being passed almost in a flash, from winter going into winter again, it is lost to our eyes.

“Something like this appears the life of man, but of what follows or what went before, we are utterly ignorant.”

–The Venerable Bede (c. 672–735), Ecclesiastical History of the English People, Book II


“Seize the day, trusting as little as possible in the future.”

–Horace (65–8 BCE), Odes 1.11

Oh, come with old Khayyam, and leave the Wise
To talk; one thing is certain, that Life flies;
One thing is certain, and the Rest is Lies;
The Flower that once has blown for ever dies.

–Omar Khayyám (1048–1131), The Rubáiyát of Omar Khayyám, trans. Edward FitzGerald

Gather ye rosebuds while ye may
Old time is still a-flying:
And this same flower that smiles to-day
To-morrow will be dying.

–Robert Herrick (1591–1674), “To the Virgins, to Make Much of Time”

But at my back I alwaies hear
Times winged Charriot hurrying near:
And yonder all before us lye
Desarts of vast Eternity.
Thy Beauty shall no more be found;
Nor, in thy marble Vault, shall sound
My ecchoing Song: then Worms shall try
That long preserv’d Virginity:
And your quaint Honour turn to dust;
And into ashes all my Lust.
The Grave’s a fine and private place,
But none I think do there embrace.
Now therefore, while the youthful hew
Sits on thy skin like morning glew,
And while thy willing Soul transpires
At every pore with instant Fires,
Now let us sport us while we may;
And now, like am’rous birds of prey,
Rather at once our Time devour,
Than languish in his slow-chapt pow’r.
Let us roll all our Strength, and all
Our sweetness, up into one Ball:
And tear our Pleasures with rough strife,
Thorough the Iron gates of Life.
Thus, though we cannot make our Sun
Stand still, yet we will make him run.

–Andrew Marvell (1621–1678), “To His Coy Mistress”

“Get it while you can.
Don’t you turn your back on love.”

–The American philosopher Janis Joplin (1943–1970)

Give Up/It’s All Futile Anyway

“A man finds himself, to his great astonishment, suddenly existing, after thousands of years of nonexistence: he lives for a little while; and then, again, comes an equally long period when he must exist no more. The heart rebels against this, and feels that it cannot be true.

“Of every event in our life we can say only for one moment that it is; for ever after, that it was. Every evening we are poorer by a day. It might, perhaps, make us mad to see how rapidly our short span of time ebbs away; if it were not that in the furthest depths of our being we are secretly conscious of our share in the inexhaustible spring of eternity, so that we can always hope to find life in it again.

“Considerations of the kind, touched on above, might, indeed, lead us to embrace the belief that the greatest wisdom is to make the enjoyment of the present the supreme object of life; because that is the only reality, all else being merely the play of thought. On the other hand, such a course might just as well be called the greatest folly: for that which in the next moment exists no more, and vanishes utterly, like a dream, can never be worth a serious effort.”

–The ever-cheerful Arthur Schopenhauer (1788–1860), “The Vanity of Existence,” from Studies in Pessimism

Three Phenomenologist/Existentialist Views of Time

NB: the following are NOT quotations. I’ve summarized material that appears in much longer works. You’re welcome. I have included Husserl in this section, even though his work is just an attempted explanation of time, because the other two philosophers treated here are reacting to Husserl’s ideas.

Husserl (very bright dude, this one): All our ideas about time spring from our conscious experience of the present. That experience is characterized by being intentional, by being directed toward something. We typically recognize three kinds of time: 1. scientific, objective, Newtonian time, which we think of as independent of ourselves and as independently verifiable; 2. subjective time, in which events seem to move slower or faster; and 3. phenomenological or intentional time, which is the fundamental experience from which the other two concepts derive, because the phenomenological present includes not only awareness of present phenomena (the present), but also retention (awareness of that which is not present because it no longer is—the past) and protention (awareness of that which is not present because it is about to be). The present is intentionality toward phenomena before us here, now. The past is present intentionality toward phenomena that are not present but are still with us, and so must be past (that’s where the definition of past comes from). The future is present intentionality toward phenomena that likewise are not present but, unlike the past, are not yet with us, and so must be the future, which will be (that’s where the definition of future comes from). Therefore, in their origins in our phenomenological experience, the future and the past are parts of the present: conceptual phenomena held in the present, alongside actual phenomena, as phenomena no longer present and not yet present.

Heidegger: Husserl had it all wrong. It’s the future, not the present, that is fundamental. We are future-oriented temporalities by nature, essentially so. Our particular type of being, Dasein, or being-there, is characterized by having care (about its projects, its current conditions, about other beings)—about matters as they relate to those projects. Our being is characterized by understanding, thrownness, and fallenness. Understanding is the most fundamental of the three. It is projection toward the future, comportment toward the possibilities that present themselves, potentiality for being. Our understanding seizes upon projects, projecting itself on various possibilities. In its thrownness, Dasein always finds itself in a certain spiritual and material, historically conditioned environment that limits the space of those possibilities. As fallenness, Dasein finds itself among other beings, some of which are also Dasein and some of which (e.g., rocks) are not, and it has, generally respectively, “being-with” them or “being alongside” them, and these relations help to define what possibilities there are. “Our sort of being (Dasein) is being for which being is an issue.” Why is it an issue? Well, we are finite. We know that we are going to die. This is the undercurrent that informs our essential being, which is care, concern. We are projections toward the future because undertaking these projects is an attempt, however quixotic, to distract ourselves from or even to cheat death. We care about our projects because, at some level, we care about not dying, having this projection toward the future for which we are living.

Sartre: The world is divided into two kinds of being: being-for-itself (the kind of being that you and I have) and being-in-itself (the kind of being that a rock or a refrigerator has). Let’s think a bit about our kind of being. Take away your perceptions, your body, your thoughts. Strip everything away, and you still have pure being, the being of the being-for-itself, but it is a being that is also nothing. (The Buddha thought this, too.) Being-for-itself has intentional objects, but is itself no object (there’s no there there) and so is nothing, a nothingness. Time is like being in that respect. It consists entirely of the past (which doesn’t exist), the future (which doesn’t exist), and the present (which is infinitesimally small and so doesn’t exist). So time, like being, is a nothingness. This being-for-itself is not just nothingness, however; it has some other bizarre, contradictory characteristics. Its being, though nothing, allows a world to be manifest (how this is so is unclear), a world that includes all this stuff, including others, for example, who want to objectify the being-for-itself, to make it into a something, a thing, a being-in-itself, like a rock. (“Oh, I know you. I’m wise to you. You’re . . .” whatever.) The being-for-itself also has a present past (in Husserl’s sense) and is subject to certain conditions of material construction (the body) and material conditions (in an environment of things), and all these givens—the body, the environment, one’s own past, and other people seen from the outside in their thinginess—make up the being-for-itself’s facticity. The being-for-itself wants to be SOMETHING, and so lies to itself. It acts in bad faith, playing various roles (playing at being a waiter, for example) and creating for itself an ego (via self-deceptive, magical thinking). But in fact, being in reality nothing, the being-for-itself (each of us) knows that that’s all a lie. We transcend our facticity and can be anything whatsoever, act in any way whatsoever. In other words, we are absolutely free and therefore absolutely responsible. This responsibility is absurd, because there is no reason for being/doing any particular thing. “Man is a useless passion.” But the absolute freedom that derives from our essential nothingness also allows for action to be truly authentic (as opposed to the play-acting) in addition to being responsible. Only in death does the being-for-itself succeed in becoming a being-in-itself, a completed thing, and then only if and in the manner in which he or she is remembered by others. A person who is not remembered never existed. Death is a time stamp or, if we are not remembered, an expiration date.

The Eternal Return and the Weight of Being

“341. The Greatest Weight. What, if some day or night a demon were to steal after you into your loneliest loneliness and say to you: ‘This life as you now live it and have lived it, you will have to live once more and innumerable times more; and there will be nothing new in it, but every pain and every joy and every thought and sigh and everything unutterably small or great in your life will have to return to you, all in the same succession and sequence—even this spider and this moonlight between the trees, and even this moment and I myself. The eternal hourglass of existence is turned upside down again and again, and you with it, speck of dust!’

“Would you not throw yourself down and gnash your teeth and curse the demon who spoke thus? Or have you once experienced a tremendous moment when you would have answered him: ‘You are a god and never have I heard anything more divine.’ If this thought gained possession of you, it would change you as you are or perhaps crush you. The question in each and every thing, ‘Do you desire this once more and innumerable times more?’ would lie upon your actions as the greatest weight. Or how well disposed would you have to become to yourself and to life to crave nothing more fervently than this ultimate eternal confirmation and seal?”

–Friedrich Nietzsche (1844–1900), The Gay Science

The Fleeting One-Offness of Everything and the Resulting Unbearable Lightness of Being

“But Nietzsche’s demon is, of course, wrong. There is no eternal return. Where does that leave us? Isn’t life ALWAYS a matter of I should have’s and I would have’s and if I had only knowns? “[W]hat happens but once, might as well not have happened at all. If we have only one life to live, we might as well not have lived at all. . . .

“The heaviest of burdens crushes us, we sink beneath it, it pins us to the ground. But in love poetry of every age, the woman longs to be weighed down by the man’s body. The heaviest of burdens is therefore simultaneously an image of life’s most intense fulfillment. The heavier the burden, the closer our lives come to the earth, the more real and truthful they become. Conversely, the absolute absence of burden causes man to be lighter than air, to soar into heights, take leave of the earth and his earthly being, and become only half real, his movements as free as they are insignificant. What then shall we choose? Weight or lightness?”

–Milan Kundera (1929–), contra Nietzsche, from The Unbearable Lightness of Being

Copyright 2010, Robert D. Shepherd. All rights reserved.

Posted in Existentialism, Metaphysics, Philosophy, Philosophy of Mind, Time | 4 Comments