Connecting the Pieces: Open Source, Big Data, and the Origins of the Common [sic] Core [sic]

How educational publishers PLAYED and PWNED a nation’s educrats and politicians

(A term from the gaming world, pwned, a blend of pawn and owned, is a neologism meaning “achieved total control and/or domination over.” If an opponent uses you, against your own interests, to achieve his or her own objectives, or if you are obliterated within seconds of the beginning of game play, then you have been pwned.)

The last state has now pulled out of the proposed national database of student responses and scores. Those who were horrified at the prospect of such a privately held, Orwellian Total Information Awareness system for K-12 public school education, one that would have served as a de facto checkpoint and censor librorum for curricula, are cheering.

But don’t think for a moment that Big Data has been beaten. I am going to explain why. I hope that you will take the trouble to follow the connections in the story below. The story is a bit complicated, and some of it hinges on matters of business and economics that make for dull reading. I think, however, that you’ll find the story as a whole both shocking and extraordinarily consequential and so worth the effort. The tale I am going to tell is a birth narrative. It’s the story of a monstrous birth, like that of the monsters that sprang from the primordial ocean in ancient Mesopotamian mythology. But this is a true story, and the monstrous birth was engineered. This is the story, as I understand it, of the birth of the Common [sic] Core [sic].

And what rough beast, its hour come round at last,
Slouches towards Bethlehem to be born?

The emergence of the Internet presented a challenge to the business model of the big educational publishers. It presented the very real possibility that they might go the way of the Dodo, the Passenger Pigeon, Kingman’s Prickly-Pear (date of extinction: 1978), and the Columbia Basin Pygmy Rabbit (date of extinction: 2007). Why? With a bit of effort, you will be able to find, right now, if you choose to look, some 80 complete, high-quality, absolutely FREE open-source textbooks on the Internet–textbooks written by various professors–textbooks in geology, biology, astronomy, physics, law, grammar, foreign languages, every conceivable topic in mathematics, and other subjects.

The possibility of publishing via the Internet, combined with the wiring of all public schools for broadband access, removed an important barrier to entry into the educational publishing business–the costs of paper, printing, binding, sampling, warehousing, and shipping. Pixels are cheap. Objects made of dead trees aren’t. In the Internet Age, small publishers with alternative texts could easily flourish. Some of those—academic self-publishers interested not in making money but in spreading knowledge of their subjects—would even do substantive work for free. Many have, already. There are a dozen great intro statistics texts, some with complete answer keys and practice books and teachers’ guides, available for FREE on the Web today.

Think of what Wikipedia did to the Encyclopedia Britannica. That’s what open-source textbooks were poised to do to the K-12 educational materials monopolists. The process had already begun in college textbook publishing. The big publishers were starting to lose sales to free, open-source competitors. The number of open-source alternatives would grow exponentially, and the phenomenon would spread down through the grade levels. Soon. . . .

How were the purveyors of textbooks going to compete with FREE?

What’s a monopolist to do in such a situation?

Answer: Create a computer-adaptive ed tech revolution. The monopolists figured out that they could create computer-adaptive software keyed to student responses in databases that they, and they alone, could get access to. No open-source providers admitted. They could also team up with tablet providers and sell districts tablets with their curricula preloaded, tablets locked to prevent access to other publishers’ materials.

Added benefit: By switching to computerized delivery of their materials, the educational publishing monopolists would dramatically reduce their costs and increase their profits, for the biggest items on the textbook P&L, after the profits, are costs related to the physical nature of their products–costs for paper, printing, binding, sampling, warehousing, and shipping.

By engineering the computer-adaptive ed tech revolution and having that ed tech keyed to responses in proprietary databases that only they had access to, the ed book publishers could kill open source in its cradle and keep themselves from going the way of typewriter and telephone booth manufacturers.

The Big Data model for educational publishing would prevent the REAL DISRUPTIVE REVOLUTION in education that the educational publishers saw looming–the disruption of THEIR BUSINESS MODEL posed by OPEN-SOURCE TEXTBOOKS.

A little history:

2007 was the fiftieth anniversary of the Standard &amp; Poor’s 500 Index. On the day the S&amp;P turned fifty, 70 percent of the companies that were originally on the Index no longer existed. They had been killed by disruptions that they didn’t see coming.

The educational materials monopolists were smarter. They saw coming at them the threat that open-source textbooks posed to their business model. And so, to head off that threat, they cooked up computer-adaptive ed tech (including online state tests) keyed to standards, with responses in proprietary databases that they would control. The adaptive ed tech/big data/big database transition would maintain and even strengthen their monopoly position.

But to make that computer-adaptive ed tech revolution happen and so prevent open-source textbooks from killing their business model, the publishers would first need ONE SET OF NATIONAL STANDARDS. And that’s why they, and their new tech partners, paid to have the Common [sic] Core [sic] created. That one set of national standards would provide the tags for their computer-adaptive software. That set of standards would be the list of skills that the software would keep track of in the databases that open-source providers could not get access to. Only they would have access to the BIG DATA.
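
To make the mechanism concrete, here is a minimal sketch, in Python, of what “computer-adaptive software keyed to standards tags” looks like in outline. Everything in it is my own illustration, not a description of any actual vendor’s product: the tiny item bank, the standard IDs used as tags, the “serve the weakest skill” rule, and the little SQLite store standing in for the proprietary response database are all assumptions made purely for the sake of the example.

# A purely illustrative sketch of adaptive ed tech keyed to standards tags.
# The item bank, tag IDs, and "weakest skill" rule below are invented for
# this example; no actual vendor's product is being described.
import random
import sqlite3

# Hypothetical item bank: every exercise carries a standards tag.
ITEM_BANK = [
    {"id": 1, "tag": "CCSS.ELA-LITERACY.L.8.1.A", "prompt": "Identify the gerund phrase..."},
    {"id": 2, "tag": "CCSS.ELA-LITERACY.CCRA.R.8", "prompt": "Evaluate the argument in..."},
    {"id": 3, "tag": "CCSS.ELA-LITERACY.L.8.1.A", "prompt": "Explain the infinitive in..."},
]

def open_database(path="student_responses.db"):
    """The 'proprietary' store: every response, keyed by student and tag."""
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS responses
                  (student TEXT, item_id INTEGER, tag TEXT, correct INTEGER)""")
    return db

def mastery(db, student):
    """Per-tag proportion correct: the 'skills the software keeps track of.'"""
    rows = db.execute(
        "SELECT tag, AVG(correct) FROM responses WHERE student = ? GROUP BY tag",
        (student,)).fetchall()
    return {tag: avg for tag, avg in rows}

def next_item(db, student):
    """Adaptive step: serve an item tagged with the student's weakest standard."""
    scores = mastery(db, student)
    tags = {item["tag"] for item in ITEM_BANK}
    weakest = min(tags, key=lambda t: scores.get(t, 0.0))  # unseen tags count as weakest
    return random.choice([item for item in ITEM_BANK if item["tag"] == weakest])

def record(db, student, item, correct):
    """Write the response back to the store that only the vendor can read."""
    db.execute("INSERT INTO responses VALUES (?, ?, ?, ?)",
               (student, item["id"], item["tag"], int(correct)))
    db.commit()

if __name__ == "__main__":
    db = open_database(":memory:")           # in-memory store for the demo
    item = next_item(db, "student-001")
    record(db, "student-001", item, correct=False)
    print(mastery(db, "student-001"))        # e.g., {'CCSS.ELA-LITERACY...': 0.0}

The business point is visible in the last two functions: whoever holds that response table, and the single list of tags it is keyed to, holds the gate through which every other provider of materials would have to pass.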

In other words, the Common [sic] Core [sic] was the first step in A BUSINESS PLAN.

A certain extraordinarily wealthy computer mogul described that business plan DECADES ago–the coming disruptive programmed learning model in education, the model now commonly referred to as computer-adaptive learning based on Big Data.

So, that’s the story, in a nutshell. And it’s not an education story. It’s a business story.

And a WHOLE LOTTA EDUCRATS haven’t figured that out and have been totally PLAYED. They are dutifully working for PARCC or SBAC and dutifully attending conferences on implementing the “new, higher standards” and are basically unaware that they have been USED to implement a business plan. They don’t understand that the national standards were simply a necessary part of that plan.

And here’s the kicker: The folks behind this plan also see it as a way to reduce, dramatically, the cost of U.S. education. How? Well, the biggest cost, by far, in education is teachers’ salaries and benefits. But, imagine 300 students in a room, all using software, with a single “teacher” walking around to make sure that the tablets are working and to assist when necessary. Good-enough training for the children of the proles. Fewer teachers’ salaries. More money for data systems and software. Ironically, the publishers and their high-tech Plutocratic partners were able to enlist both major teachers’ unions to serve as propaganda ministries for their new national bullet list of standards, even though the game plan for those standards is to reduce the number of teachers’ salaries that have to be paid. Thus the education deform mantra: “Class size doesn’t matter.”

Think of the money to be saved.

And the money to be made.

The wrinkle in the publishers’ plan, of course, is that people don’t like the idea of a single, Orwellian national database. From the point of view of the monopolists, that’s a BIG problem. The database is, after all, the part of the plan that keeps the real disruption, open-source textbooks, from happening–the disruption that would end the traditional textbook business as surely as MP3 downloads ended the music CD business and video killed the radio star.

So, with the national database dead, for now, the education deformers have to go to plan B.

What will they do? Here’s something that’s VERY likely: They will sell database systems state by state, to state education departments, or district by district. Those database systems will simply be each state’s or district’s system (who could object to that?), and only approved vendors’ materials (guess who?) will flow through each. Which vendors? Well, the ones with the lobbying bucks and with the money to navigate whatever arcane procedures are created by the states and districts implementing them, with the monopolists’ help, of course. So, the new state and district database systems will work basically as the old textbook adoption system did, as an educational materials monopoly protection plan.

So, to recap: to hold onto their monopolies in the age of the Internet, the publishers would use the Big Data ed tech model, which would shut out competitors, and for that, they would need a single set of national standards.

In business, such thinking as I have outlined above is called Strategic Planning.

The plan that a certain computer mogul had long had for ed tech proved to be just what the monopolist educational publishers needed. That plan and the publishers’ need to disrupt the open-source disruption before it happened proved to be a perfect confluence of interest–a confluence that would become a great river of green.

The educational publishing monopolists would not only survive but thrive. There would be billions to be made in the switch from textbooks to Big Data and computer-adaptive ed tech. Billions and billions and billions.

And that’s why you have the Common [sic] Core [sic].

 

Copyright 2014. Robert D. Shepherd. All rights reserved. This piece may be freely distributed if this copyright notice is retained on all copies.

Posted in Ed Reform, Uncategorized | 16 Comments

Becoming an EduPundit Made EZ

Photo: Rotten Apples

The nineteenth century was the era of the traveling medicine show. Grifters slithered from town to town in rural parts of the country, peddling magical elixirs. John D. Rockefeller’s father was one such. He would show up in a town, put on a little spectacle, sell some bottled cures for cancer and lameness, and then skedaddle off just ahead of the law.

Today, in place of the Snake Oil Salesman, we have the EduPundit.

The EduPundit doesn’t sell magic elixirs. He or she sells Magic Formulas for learning. Now, how does the Aspiring EduPundit come up with a Magic Formula to sell? Well, that’s the easy part. Magic Formulas are lying around all over the place.

The secret to becoming a well-remunerated EduPundit is to take a blindingly obvious idea and make it into a Magic Formula by giving it a Brand Name. Or, if you are in a hurry, start with the Brand Name and then come up with the Magic Formula based on that. I’ve done some of this work for you. Just choose items from the following lists. Note: The Brand Name for your Magic Formula doesn’t have to have an item from List Three. Those are optional. And it can have an item from List Four OR List Five OR both.

List one:
Degrees
Design
Dimensions
Foundations
Paths
Program
Strategies

List two:
of
for

List three:
Close
Collaborative
Critical
Diagnostic
Disruptive
Effective
Empowering
Gritful
Formative
FUNdamental
Innovative
Metacognitive
Multidimensional
Peer
Performative
Positive
Rigorous
Successful
Total
Value-Added

List four:
Knowledge
Learning
Portfolio(s)
Reading
Teaching
Thinking

List five:
Assessment
Evaluation
Growth
Motivation
Outcomes
Performance
Power
Success
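
If you are in a real hurry, here is a toy sketch, in Python, that assembles Magic Formula Brand Names from the lists above. The assembly rules follow the note before List One; the 70 percent chance of drawing a List Three modifier is my own arbitrary touch.

# Toy Brand Name generator built from the lists above. List Three is optional;
# the tail comes from List Four OR List Five OR both, per the note before the lists.
import random

LIST_ONE = ["Degrees", "Design", "Dimensions", "Foundations", "Paths", "Program", "Strategies"]
LIST_TWO = ["of", "for"]
LIST_THREE = ["Close", "Collaborative", "Critical", "Diagnostic", "Disruptive", "Effective",
              "Empowering", "Gritful", "Formative", "FUNdamental", "Innovative", "Metacognitive",
              "Multidimensional", "Peer", "Performative", "Positive", "Rigorous", "Successful",
              "Total", "Value-Added"]
LIST_FOUR = ["Knowledge", "Learning", "Portfolio(s)", "Reading", "Teaching", "Thinking"]
LIST_FIVE = ["Assessment", "Evaluation", "Growth", "Motivation", "Outcomes", "Performance",
             "Power", "Success"]

def magic_formula():
    parts = [random.choice(LIST_ONE), random.choice(LIST_TWO)]
    if random.random() < 0.7:                       # List Three is optional
        parts.append(random.choice(LIST_THREE))
    tail = random.choice(["four", "five", "both"])  # List Four OR List Five OR both
    if tail in ("four", "both"):
        parts.append(random.choice(LIST_FOUR))
    if tail in ("five", "both"):
        parts.append(random.choice(LIST_FIVE))
    return " ".join(parts)

if __name__ == "__main__":
    for _ in range(5):
        print(magic_formula())   # e.g., "Dimensions of Rigorous Thinking Success"

A typical run produces something like “Foundations for Metacognitive Learning Outcomes.” The Key Graphic, alas, you will still have to make yourself.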

If you would like the complete Aspiring EduPundit iPhone App for Choosing Your Aspiring EduPundit Brand, which includes many more lists like the ones above (Jump Starting Formative Engagement! Jump-Starting Engaging Formatives! Engaging Formative Jump Starting! et cetera), just sign up at our website or write your name on a stack of hundred-dollar bills and send them to yours truly.

Of course, in addition to the Brand Name, you will need a “Key Graphic” or “Concept Map.” This you can very easily create yourself using Smart Art in Microsoft Word. A circle made of three arrows, an idea pyramid, a web—these are all standard. You know the shtick. Remember: In presentations, you must always unveil your inane graphic with great drama, as though it were the Holy of Holies. It is THE REVELATION.

2014 update: Be aware that the great river of Edupundit green is now running almost exclusively from the bank accounts of a few Ed Deform Plutocrats and from the coffers of those Plutocrats’ wind-up toys in foundations, think tanks, state departments of education, and the USDE. So, if you want to be a big barker on the educational midway this carnival season, if you want to be invited to speak at conferences, to write professional books for teachers, and to chair committees, if you want to get paid for putting your name on textbooks you didn’t actually write or edit—if you want to be a PLAYAH—you will have to PRACTICE YOUR EQUIVOCATION. Hold your nose and learn to collaborate with Ed Deform, but do so with sufficient finesse that you can deny your collaboration when actual classroom teachers seem ready to identify you as Vichy swine.

For a copy of Equivocating on the Common Core and Standardized Testing for Aspiring EduPundits, sign up for my course at Anyone Can Be an InstaEdupundit dot com.

 

Copyright 2014. Robert D. Shepherd.


Posted in Ed Reform, Teaching Literature and Writing | Tagged | 1 Comment

The Vast Unseen and the Vast Unseeable: Reconciling Belief and Nonbelief

Those who are not philosophically inclined can be divided, roughly, into two groups:

There are the Naïve Realists who think that what is available to the senses or potentially available to the senses is all that there is.

And then there are those who think that alongside or in addition to all that empirically available stuff, there is another world (or other worlds) of the unseen.

Let’s call the first group the nonbelievers and the second the believers.

The believers come in immense variety. From the time of the earliest archaeological remains of human civilization down to the present day, there have been, literally, many hundreds of thousands of belief systems regarding the ordinarily unseen—belief systems involving spirits that live within things in the natural world, disembodied spirits of ancestors existing alongside us, and a rich phantasmagoria of gods and demigods and demons and other magical beings–Tiamat and Marduk, Isis and Hecate and Bastet, YHWH and the Nephilim, the Aeons and the Rex Mundi, Anansi and White Buffalo Woman, Cernunnos and Brigid, ogres and trolls and fairies in the garden, the machine elves of Terence McKenna’s psychonautic excursions on dimethyltryptamine.

From the beginning, people seem to have imagined (?) unseen spirits that inhabited physical things—rivers, mountains, oceans, plants, people, and nonhuman animals, for example. But also, from time immemorial, they imagined (?) unseen worlds that were separate parts of this one world (universe) that we live in. The Anglo-Saxons talked of the middangeard (the “middle Earth”) between heaven above and hell below. Ancient Chinese, and some of the Greek Gnostics, imagined vast numbers of heavens “up there.” Many peoples placed their realms of the gods atop high mountains or in the clouds or across the sea on some island. Many cultures had their chthonian deities, ones who inhabited realms under the ground—the world of the Hindu Nagas, for example, or the realm of Hades, or the cave beneath the bog of Grendel’s mother, who may have been the ancient Germanic goddess Nerthus made demonic in a Christian retelling. These abodes of the gods were unseen but potentially seeable, if only you got there, to that place.

Increasingly, as we have plumbed the whole of the Earth, from the summits to the depths, and have come to understand, better, what is in the heavens above us and under the ground beneath our feet, those who believe in the unseen have retreated to the less physical instantiations of their other worlds. These modern believers are of the “spirit within” camp. Their unseen worlds are invisible universes, spirit worlds, that exist—somewhere else—in parallel to our own or in some other dimension or within things, somehow. The entities who live in that spirit world, they say, might be all around us right now. You might, for example, hold a séance or take a drug or pray and talk to them.

Now, the nonbelievers like to point out that despite the certainties that believers tend to have about their unseen worlds, their views are innumerable and mutually inconsistent and can’t all be right, and it’s not exactly easy to produce EVIDENCE about any of these unseen worlds, and so there is no compelling reason to believe in any one of them, at least no reason that an impartial observer would have to accept. And the nonbelievers are frankly astonished, by and large, that at this late date in human history, there are still large numbers of people who believe in unseen worlds and unseen entities, who talk to them regularly, for example, and take guidance from them. In short, the nonbelievers think it really peculiar that so many people continue, in a scientific age, to hold fantastic ideas involving the unseen. And they are horrified that folks whom they consider so gullible and superstitious, people who sometimes talk to invisible friends, are nonetheless trusted with positions of power and authority.

I do understand that point of view. I even sympathize with it, for in doing so, I am sympathizing with the view held by my own younger self. But here’s a problem for it, a really big problem, it seems to me:

While it seems reasonable not to accept as true propositions for which there is little or no evidence, it is also entirely unreasonable to imagine that what we have access to via our senses is the whole of the universe. We have a particular set of senses and a particular cognitive apparatus, a particular operating system, if you like. Our sensory and cognitive equipment, our operating system, differs enormously from that of other creatures on the planet. Consider the “lowly” creatures known as ticks. We know that there are vast parts of the universe that we perceive that simply are not available to ticks. Stars do not exist to a tick. Neither do temperatures above or below a narrow range around 37 degrees Centigrade. There is no smell of roses in the universe that the tick perceives; there is no sound of laughter. The tick does not have any perceptual or cognitive access to these things. They are UNSEEN by the tick, but WE know that they exist.

In other words, the tick teaches us that it is inevitable that, given the particular sensory apparatus and cognitive makeup that a creature has, given a particular creature’s operating system, some of what is, of what actually exists, will be available to that creature, and SOME WILL NOT. That bit of the universe that is available to a given creature is the creature’s Umwelt (to use the term popularized by Jakob von Uexküll, whose ideas about ticks I have shared here).

So, how are we any different from ticks in this regard? We’re not some sort of special case. What is true of ticks is doubtless true of us—that we have access to only a small part of what is really going on. This is an inductive conclusion strongly warranted by our knowledge of comparative neural and perceptual physiology, so strongly warranted, in fact, that I think that we are compelled to accept it on purely inductive, empirical, scientific grounds. And it’s a truly mind-blowing conclusion, I think.

It therefore seems highly likely that Hamlet was right when he said, “There are more things in heaven and earth, Horatio, / Than are dreamt of in your philosophy.” In other words, the believers are almost certainly right about this much, that there is a VAST UNSEEN. A vast unseeable, in fact. And nonbelievers have to accept that much. I do.

Before I proceed, let me deal with a predictable objection to this line of reasoning: Evolution designs creatures to exploit whatever realities there are, and over time, they exploit them more fully, and so we reach this pinnacle in humans at which we have cognitive and perceptual access to the way things are. Now, here’s where I think that that argument is wrong (Anthropocentric arguments tend to be downright silly; they are kin to the old ideas that the Earth and humans are at the center of the universe): Evolution is nothing if not parsimonious. It reaches for what works in a niche, and it ignores everything else. Exhibit 1 for my rebuttal: cyanobacteria, unchanged for nearly four billion years. Exhibit 2: Beetles that attempt to mate with female-looking beer bottles so persistently that they allow ants to eat them alive. Exhibit 3: humans and their well-documented cognitive limitations, legacies of minds suited more to life on the savannah than to life in, say, New York City. Phenotypes tend to be local maxima on the larger fitness landscape.

Where do we go from this conclusion, for conclusions are beginnings, aren’t they? Clearly, there will be situations in which what can be experienced by a given creature is affected causally by that which the creature cannot experience, and this may be the situation that obtains with regard to many conundrums, great and small–the mind/body question, the question of free will, the irreducibility of simple arithmetic to logic, the incompatibility of relativity and quantum mechanics, the elusive proofs of the Goldbach or Polignac conjectures, the seeming violation by plankton of the competitive exclusion principle, experimental proof of the existence of more than four dimensions, the explanation of nonlocal consequences of entanglement in physics, the appearance and disappearance of virtual particles, the solution to the paradox of disappearance of the present, the violation by certain quantum-mechanical phenomena of the law of the excluded middle, the development of an optimally nonviolent social structure given the conflict between minimal liberalism and Pareto optimality, and, of course, the question of questions, the nature or even the existence of an ultimate reality, or noumenon. Years ago, AI pioneer and Nobel Prize-winning economist Herbert Simon argued that many of the problems faced in everyday life admit, as a practical matter, of no optimal solution because of limitations of time and resources, forcing us to rely, instead, upon satisfactory solutions, or heuristics. Similarly, the philosopher Alan Watts argued that while the universe might, at bottom, be deterministic, as a practical matter, we haven’t the resources to do the Laplacian calculations, and so we are stuck with acting as if from free will. (Whether the universe is deterministic is an open question, but most physicists, today, do not believe this to be the case.) And in the same vein, the philosopher Daniel Dennett has argued that combinatorial explosion makes the project of simulating a virtual reality indistinguishable from the universe impossible, for doing so would require computational resources greater than those provided by the universe. (This last claim is highly debatable; a self-computing universe is not impossible under various scenarios.) The point I am making goes further, however, for the claim is that we have every reason to believe that there are aspects of reality that are not only not accessible as a practical matter but that are not accessible AT ALL at present. There’s no reason to think that we apes currently have the cognitive and perceptual apparatus to arrive at complete solutions to such problems because the mechanisms involved may well be beyond our ken.

But this realization is in itself a boon. It should give us pause. It should make us humble. As a result of it, we should recognize that many of our most cherished, most fundamental assumptions might well be misconceptions based on our limitations. We must face squarely the fact that we are like savages, familiar with fire and with chariots, claiming that the nature of the sun is quite obvious: it is a fiery chariot being driven across the sky. Or we are like the square in Abbott’s Flatland who thinks of a sphere passing through its world in three-dimensional space as an expanding or contracting circle.

Let’s consider one such a cherished assumption, one of the latest in a long, sad series of scientific predictions that proved to be false because unknown unknowns were not taken into account. Richard Dawkins famously argued in The Blind Watchmaker that we can be certain that wherever we might go in the universe, the laws of evolution apply. But certain is a big word. Scientific laws are not tautologies. And it seems not only possible but probable that, in fact, evolution itself is ultimately a self-defeating mechanism, not in the sense that life inevitably consumes all resources until it dies out but in the sense that at some point, sufficiently evolved creatures begin to control their own evolution, at which time there is a decisive break, a disjunction, a stochastic leap, for evolution becomes no longer blind but teleological, at which point, all bets are off. We might well choose the ultimate in self-preservation, substituting the preservation and growth of the phenotype, to which we are each of us committed, for the reproduction of the genotype, for there is a fundamental non-concurrence of interest between selfish genes and selfish phenotype, especially in creatures that reproduce sexually. We are already at a point where, very soon, evolution will be definitively divorced from mate selection and sexual reproduction, its being highly doubtful that future reproduction strategies will depend upon these. And it is altogether possible that resources are not a limiting factor, for many cosmologists now believe that the universe itself is “the ultimate free lunch,” that it arose ex nihilo from the quantum foam, which is, in theory, harvestable. So, is evolution by natural selection a universal law? It’s highly doubtful that this is so. In fact, it is more likely that this is yet another example of a spatio-temporal local maximum. The Earth is a relatively young planet circling a relatively young star. We now know that there are many, many billions of other such planets in the universe, most of them far older, and it is altogether reasonable, given the similarity of conditions elsewhere, to assume that life has evolved on these and long since passed through our present infancy, for we know that recursive systems like minds are positive feedback mechanisms leading to exponential change, and we speculate with much warrant that such a process will lead to a singularity. And what happens then? By definition, we do not know. But it is highly probable, to a point approaching certainty, that this has, in fact, happened in the universe already, and we are not the entities to which it has happened. Philosopher and transhumanist Nick Bostrom believes that what we think of as reality is not reality at all but, rather, a simulation being run by such entities.

Bostrom’s simulation hypothesis is an example of warranted speculation, and that such speculation can be highly warranted—one of many possibilities unlike those that we typically entertain—suggests, at the very least, that we should check our hubris. Things may not be at all as they appear to be.

Which brings me back to the question of belief versus nonbelief. It certainly makes sense to suppose that the notions about ultimate realities entertained by stone-age savages bear little relation to actual ultimate realities. However, what is in fact the case is probably equally bizarre and, given our current limitations, beyond imagining. It is quite possible, probable even, given what we currently know, that there are entities in the universe with attributes traditionally ascribed to the gods, including the power to bring universes into being (a potential technology for doing just that and a series of steps toward development of such a technology are outlined in cosmologist Alan Guth’s The Inflationary Universe). And, of course, it is entirely possible, for the reasons described here, that our universe was the creation of a single such entity. We should admit, however, I think, that when we speculate about these matters, we are in the position of the Neolithic farmer venturing opinions on the causes of epileptic seizures and that we should do so in a spirit of play, of speculation, of creation, of frumsceaft.

The phrase current limitations, above, was chosen with care. Creatures with technologies, however rudimentary, have already crossed a Rubicon, for technologies are prostheses that give access to further aspects of reality. The long-standing question of whether there is a noumenal reality separable from perceptual reality has long been answered (though, oddly, some professional philosophers seem not to have gotten the memo), for as we have built new technologies to extend our access—mathematics, thought experiments, Galileo’s telescope, spectrometers, superconducting supercolliders, fMRI machines, and so on—more and more of the universe has been revealed, as surely as the contents of a gift-wrapped package are revealed when we remove the packaging, but with a couple of important caveats: 1. the packaging of reality appears to be so many Matryoshka dolls, how many, we do not know. Perhaps it is turtles all the way down. And 2. those prostheses simply become part of a new, extended, but also limited perceptual and conceptual repertoire. As we continue, at an exponential rate, to develop prostheses for extending our access to the universe, we shall doubtless encounter many surprises, many of which will be as disjunctive as was, say, the atomic hypothesis. We have already learned, or think we have learned, that the macroscopic world of solid objects with which we are familiar on a quotidian basis is illusory, that it is, on a deeper plane, a whir of elementary particles and, on a deeper plane yet, interacting fields. Such conceptions would have seemed utterly preposterous to most of our ancestors. (The atomic hypothesis was still highly controversial at the turn of the twentieth century.) Given this history, assuming that we have reached the bottom of the rabbit hole (or that it is even a hole to begin with) is ludicrous, and there is nothing in our current knowledge that precludes quite fantastic possibilities, including the possibility that the current physical reductionists have it exactly backward and that

  1. perception is an interface that bears relations to but does not show reality, as the icons on your computer screen bear relations to but do not show the underlying reality of the mechanism within your computer.
  2. the functional structures of the mind are, in part, an operating system enabling the creation of that interface based upon incoming data; are, in part, processors of data; and are, in part, storage systems for temporal states of that data. (NB: These are probably not so easily separable, for brains are not constructed like computers.)
  3. the perceived world is simply a collection of icons that constitute an interface to a reality that lies behind it.
  4. consciousness might well be fundamental and matter derivative, not vice versa.

Numbers 1 and 2 and 3 are, I think, as incontrovertible as the best of our scientific inferences. Number 4 is another matter. It’s a highly speculative proposition but one that is not inconsistent with anything that we think we know via scientific inference and is weakly warranted by speculations such as Bostrom’s involving highly developed nonhuman intelligence or intelligences in the universe. Together, these propositions, advanced by cognitive psychologist and expert on perception Donald Hoffman, show a marked similarity to what Aldous Huxley refers to as “the perennial philosophy,” arrived at via convergent cultural evolution in various religious traditions worldwide—in the thought of persons as diverse as the authors of the Bhagavad-Gita and the Chandogya Upanishad, the Sufi mystic Jalal ad-Din Muhammad Rumi, Chuang Tzu, Meister Eckhart, Black Elk, and Terence McKenna. (Note: I had independently arrived at conclusions 1, 2, and 3 before having encountered Hoffman. I am intrigued by his reasons for embracing conclusion 4.)

Ancient accounts of ultimate realities are suspect, but so are our limited current perceptions and scientific models of reality. One cannot make the leap from speculation to knowledge, for knowledge is by definition dependent upon the prosthesis that discloses. The mystics claim to have developed such prostheses, though many certainly simply encountered the currently inexplicable and then rushed to explanation in the inadequate terms available to them. However, even if this were true, that fact should give physical reductionists no comfort, for theirs is a metaphysics as incoherent and unsupportable as any witch doctor’s tale of his own understanding and control of all that is in the heavens and the earth, as I hope I have made abundantly clear above. And on the basis of an argument such as that which I have proposed here, I think that those with scientific and with spiritual orientations can find common ground and engage in fruitful dialogue. But to get there, both sides must recognize that they don’t at present know as much as they think that they do. In other words, what is required of us is that rare thing, informed humility.

Copyright 2012. Robert D. Shepherd. All rights reserved. This note may be distributed freely as long as this copyright notice is retained and the text is unchanged.

Posted in Epistemology, Metaphysics, Philosophy, Philosophy of Mind, Religion, Uncategorized | Leave a comment

What Happens When Amateurs Write “Standards”


John Martin, illustration for John Milton’s Paradise Lost, “Satan Presiding at the Infernal Council.” (Or is this an illustration of a meeting of the CCSSO and NGA to come up with a totalitarian set of “standards” to foist upon everyone else in the country?)

I am having a lot of fun identifying the howlers in the Common [sic] Core [sic] State [sic] Standards [sic] for English Language Arts. Here’s one for your amusement:

This is reading “anchor standard” 8:

CCSS.ELA-LITERACY.CCRA.R.8. Delineate and evaluate the argument and specific claims in a text, including the validity of the reasoning as well as the relevance and sufficiency of the evidence.

Amusingly, the “literature standards” tell us, over and over, that this anchor “standard” is “not applicable to literature,” that it applies only to “informational text.”

That would be news to the speaker of Milton’s Paradise Lost, who invokes the Holy Spirit at the beginning of the poem and asks this Christian Muse to help him present an argument to “justify the ways of God to men.”

Maybe it’s been a while since you read or thought about Paradise Lost. Go have a look at Book I. You will find, at the beginning of it, something the author actually calls “The Argument.” It’s a brief preface that serves as an abstract of the claims, reasoning, and evidence to be presented in the book.

Did the folks who put together these amateurish “standards” actually think that literary works never present arguments, make claims, use reasoning of varying degrees of validity, or present evidence of varying degrees of relevance and sufficiency?

Do they actually think that Ambrose Bierce’s “Chickamauga,” Thomas Hardy’s “Channel Firing” or “The Man He Killed,” Wilfred Owen’s “Dulce et Decorum Est,” Erich Maria Remarque’s All Quiet on the Western Front, Dalton Trumbo’s Johnny Got His Gun, Tolstoy’s War and Peace, Kurt Vonnegut’s Slaughterhouse-Five, and Randall Jarrell’s “The Death of the Ball Turret Gunner” do not present implicit and explicit arguments against war, do not advance specific claims, and do not employ reasoning and evidence in support of those claims? And what on earth would they imagine such poems as Hesiod’s Works and Days, Lucretius’s De rerum natura, Pope’s “An Essay on Man” and “An Essay on Criticism,” Wordsworth’s The Excursion, and Erasmus Darwin’s The Temple of Nature to be if not, primarily, arguments?

And do they really think that arguments are not put forward in, say, Rumi’s “Like This,” Donne’s “The Sun Rising,” Marvell’s “To His Coy Mistress,” Gray’s “Stanzas Wrote in a Country Church-Yard,” Burns’s “Song Composed in August,” Blake’s The Marriage of Heaven and Hell, Tennyson’s “In Memoriam A.H.H.,” Dickinson’s “I heard a Fly buzz—when I died,” FitzGerald’s The Rubáiyát of Omar Khayyám, Yeats’s “Adam’s Curse,” Eliot’s “Burnt Norton,” Wallace Stevens’s “Credences of Summer,” MacLeish’s “Ars Poetica,” Frost’s “Directive,” Levertov’s “A Tree Telling of Orpheus,” Mary Oliver’s “The Summer Day,” and Billy Collins’s “Introduction to Poetry”?

Really? Seriously? I know, it’s almost unimaginable that they do.

But let’s do a little CLOSE READING of the “standards” to see what EVIDENCE we can find to help us answer those questions. Inquiring minds want to know.

If you turn to the writing “standards,” the suspicion will grow in you that the authors of these “standards” were, indeed, that naïve. The breathtakingly puerile Common [sic] Core [sic] writing “standards” neatly divide up all writing into three “modes”–narrative, informative, and argumentative–and encourage teachers and students to think of these as DISTINCT classes, or categories, into which pieces of writing can be sorted.

Imagine, for a moment, that you are reading an exposé on this blog or Mercedes Schneider’s or Diane Ravitch’s that tells the story of how some people got together in a backroom and cooked up a bullet list of “standards” and foisted these on the entire country with no learned critique or vetting.

Perhaps such a piece would only SEEM to be an informative narrative told to advance an argument. Perhaps writing consists entirely of five-paragraph themes written in distinct modes and we’ve been hallucinating JUST ABOUT EVERYTHING ELSE EVER WRITTEN, which doesn’t fit neatly into the categories advanced in the “standards.”

LOL

And, standard after standard, one encounters the same sort of simple-mindedness about literary types and taxonomy. One gets the impression, reading these “standards,” that a group of nonliterary noneducators–some small-town insurance executives perhaps–got together and made up a bullet list of “stuff to learn in English class” based on their vague memories of what they studied in English back in the day. (I don’t intend, here, BTW, to disparage the literary sophistication of all insurance executives; Wallace Stevens was one, after all, and he may well have been the greatest American poet of the twentieth century. On what other would we place the red cloak?)

Of course, what the folks behind these “standards” really did was hire an amateur who hadn’t taught and who knew very little about the domains he was going to work in to hack together a bullet list based on a review of the lowest-common-denominator educratic groupthink in the previously existing state “standards.” In effect, a few plutocrats appointed this person (by divine right?) absolute monarch of instruction in the English language arts in the United States. My feeling is that similar results would have been obtained if a group of plutocrats had handed David Coleman a copy of the 1858 edition of Gray’s Anatomy and sent him to a cabin in Vermont to write new standards for the practice of medicine.

And, of course, the plutocrats hired this guy to do this because they wanted ONE set of standards for the entire country to which to correlate the products that they planned to sell “at scale.” In other words, the single bullet list was a necessary part of an ed tech business plan. One ring to rule them all!

And that ought to be obvious enough, for surely no one who thought even a bit about these matters would conclude that

a) this CC$$ ELA bullet list is the best we could come up with or that

b) one list is appropriate for all students and for all purposes or that

c) these matters should be set in stone instead of being continually rethought and revisited in light of the discoveries and innovations made by the millions of classroom practitioners, scholars, researchers, and curriculum developers working in the domains that the “standards” cover.

Obviously.

Of course, it’s typical of a certain kind of philistine to divide the world neatly up into the objective (informative works) and the subjective (literary works) and so to think that simple-minded categorizations like the ones to be found in the Common [sic] Core [sic] make sense. The same sort of person thinks that one can reduce learning to a bullet list in a stack of PowerPoint slides.

And, it’s typical of such people to have a rage for order and an inclination toward authoritarianism. Such people admire regimentation and expect others–all those teachers, and curriculum coordinators and curriculum developers out there–simply to obey. In his Devil’s Dictionary, Ambrose Bierce defines arrayed as “drawn up and given an orderly disposition, as a rioter hanged from a lamppost.” I suspect that the people behind these “standards”–the folks who claim that standardization, centralization, and regimentation will lead to innovation, as Bill Gates just did in a speech to the National Board for Professional Teaching Standards–would approve of Bierce’s definition. And they would probably like to see folks like me so arrayed.

 

NB: Gray’s original title for the poem now known as “Elegy Written in a Country Churchyard” was “Stanzas Wrote in a Country Church-Yard.” The word Wrote is a purposeful grammatical error, or solecism, of the kind that might be spoken by one of the subjects of his poem.

Posted in Ed Reform, Poetry, Teaching Literature and Writing | 11 Comments

A Cup of Tea

Many years ago, a professor from one of the western world’s great universities went to visit the Japanese master Nan-in to learn about Zen. Nan-in invited the professor to sit and offered him tea. As Nan-in prepared the tea, the professor talked. And talked. And talked some more. Nan-in served the tea. He poured his visitor’s cup full, and then kept pouring. The professor watched the tea pouring onto the table and floor until he could no longer restrain himself. “It is overfull,” he said. “No more will go in!”

“Like this cup,” Nan-in replied, “you are full of ideas and opinions. How can I show you Zen unless you first empty your cup?”[1]


[1] Adapted from 101 Zen Stories, by Nyogen Senzaki, 1919, a compilation of Zen anecdotes. Senzaki’s compilation also includes a translation of the Shasekishu, or Sand and Pebbles, a collection of Buddhist parables by the Japanese monk Muju, written in 1283.

Photo, Samovar Tea House, by Christopher Michel, file licensed under the Creative Commons Attribution 2.0 Generic license.

Copyright 2014, Robert D. Shepherd. All rights reserved. This file may be freely distributed as long as this copyright notice is retained.

Posted in Epistemology, Philosophy, Teaching Literature and Writing | 6 Comments

The Limits of Learning

OK, I admit it. I haven’t read The Vicar of Wakefield.

I’m always suspicious of people who have that air about them of having read everything. I’m onto them. Here’s why: Years ago, when I was an undergraduate at Indiana, I went to the library to work on a paper on Robert Frost. The Indiana University library was my Internet, in those days before the Internet, and also, to me, a kind of temple. In its seven million or so volumes was to be found, I felt, the collective experience of our species. Sometimes, I would just wander aimlessly in the stacks, like a mushroom hunter in an old-growth forest, pulling off the shelves these weird wonders: a fourteenth-century guide to courtly love, great monographs on the sand flea, grammars of Old Icelandic. I thought it wild and wacky, half mad, and altogether beautiful that someone would devote his or her life to the study of the sand flea.

But on this particular evening, long ago, I had work to do: the paper on Frost. As I stood there in the stacks looking at the library’s hundreds of books about Frost, an unsettling thought occurred to me. I knew that as an American male, I had a life expectancy of about 70 years. There are 52 weeks in a year. If I read a book in my subject area every week for the approximately 52 years left to me, I could read, in my lifetime, about 2,704 of those books. I didn’t even have time enough, in the rest of my life, to read the works of criticism of mid-century American literature in the library’s collection, much less those monographs on the sand flea.

Do you remember when you first learned that you were going to die? Most people learned this so early that they don’t recall having done so, but I must be a slow learner, for I remember vividly when I learned that remarkable fact. I was five or six and watching a Twilight Zone episode on a black-and-white television with rabbit ears. In the episode, a girl in rural Arkansas or Kentucky or someplace like that sold her soul to the devil in exchange for the love of the handsomest young man in town. As part of the bargain, she had to spend some of her evenings running about the countryside in the form of a mountain lion. A few days before the girl’s wedding, of course, the handsome young man joined a posse to hunt down the mountain lion, which had been terrorizing locals, and of course, not being a sensitive, environmentally conscious guy (He would have made a lousy husband anyway), he shot and killed her. So, there I was, at five or six, sitting on the floor in my Dr. Denton’s and bawling my eyes out when my grandmother came in to see why I was fussing. When I told her, she looked at me in her no-nonsense sort of way and said, “Why, child, everybody’s gonna die sometime.” I lay awake for hours that night, aghast. Sometimes I still do.

For me, that later evening in the library was like learning that I was going to die all over again. I had come to Indiana University to become a scholar, and damn it, I was going to do so. I was going to read everything. Everything. I was going to become the kind of scholar whom people speak of in hushed and reverent tones. What that evening taught me, of course, is that whatever I chose to study professionally, I could barely put a crack in it.

And my professors. My God! I had found them out. I still revered them, some of them, for their learning, but . . . that amazing man E. Talbot Donaldson, the great medievalist, whose lectures I was privileged to attend and whose memory I shall forever honor, didn’t know squat about sand fleas. Freud wrote in his Introductory Lectures on Psychoanalysis about the trauma that kids go through when they figure out that their parents don’t know everything. He was wrong about that, as about much else. Kids get wise to us early on, and that’s a good thing, I think. It’s delightful to watch toddlers pushing the limits, probing, exploring what’s possible, finding out how far things go and when they break. Their elders should do a lot more of that.

But I admit that this recognition floored me. I would have to resign myself, forever, to being mostly ignorant about mostly everything. There is a magnificent literature in Korean, full of beauty and insight that would deepen my understandings beyond measure, but I shall probably never, ever know it. It’s on my list, but art is long, and life is short.

Copyright 2014, Robert D. Shepherd. All rights reserved. This little essay may be freely distributed as long as this copyright notice is retained.

Posted in Epistemology, Teaching Literature and Writing | 4 Comments

A Brief Analysis of Two Common [sic] Core [sic] State [sic] Standards [sic] in ELA

“And be these juggling fiends no more believed, / That palter with us in a double sense.”
–William Shakespeare, Macbeth

The defenders of the CC$$ often make the claim that “the standards do not tell you what to teach.” That’s purest equivocation.

The standards are a list, by domain, of outcomes to be measured in mathematics and in English language arts. If a standard says that a student will be able to x, then that means that the student will be taught to x. It also assumes that x should be taught, implies that x is to be taught explicitly, and, importantly, takes time from teaching y, where y is something not in the standards. The whole point of implementing standards is to have them drive curricula and pedagogy, and claims to the contrary are equivocation.

The equivocation from deformers on this issue means one of two things: a) they don’t know what they are talking about or b) they are dissembling. So, let’s look at a couple of specific “standards” taken at random from the CC$$ and do the sort of work that would have been done if the CC$$ in ELA had been subjected to any real critique. Bear in mind that the same sort of process that I’m going to carry out below could be carried out for almost any “standard” on the CC$$ bullet list.

Analysis of a Sample CC$$ Language “Standard”

CC$$.ELA-Literacy.L.8.1a. Explain the function of verbals (gerunds, participles, infinitives) in general and their function in particular sentences.

This standard tells us students are to be assessed on their ability a) to explain the function of verbals (gerunds, participles, infinitives) in general and b) to explain their function in particular sentences. In order for students to do this, they will have to be taught, duh, how to identify gerunds, participles, and infinitives and how to explain their functions generally and in particular sentences. That’s several curriculum items. So much for the Common Core not specifying curricula.

Furthermore, in order for the standard to be met, these bits of grammatical taxonomy will have to be explicitly taught and explicitly learned, for the standard requires students to be able to make explicit explanations. Now, there is a difference between having learned an explicit grammatical taxonomy and having acquired competence in using the grammatical forms listed in that taxonomy. The authors of the standard seem not to have understood this. Instead, the standard requires a particular pedagogical approach that involves explicit instruction in grammatical taxonomy. So much for the standards not requiring particular pedagogy.

So, to recap: the standard requires particular curricula and a particular pedagogical approach.

Let’s think about the kind of activity that this standard envisions our having students do. Identifying the functions of verbals in sentences would require that students be able to do, among other things, something like this:

Underline the gerund phrases in the following sentences and tell whether each is functioning as a subject, direct object, indirect object, object of a preposition, predicate nominative, retained object, subjective complement, objective complement, or appositive of any of these.

That’s what’s entailed by PART of the standard. And since the standard just mentions verbals generally and not any of the many forms that these can take, one doesn’t know whether it covers, for example, infinitives used without the infinitive marker “to,” so-called “bare infinitives,” as in “Let there be peace.” (Compare “John wanted there to be peace.”) Would one of you like to explain to your students how the infinitive functions in that sentence and to do the months and months of prerequisite work in syntax necessary for them to understand the explanation? Have fun. Then tell me whether you think it a good idea to waste precious class time getting kids to the point where they can parse that sentence and explain the function of the verbal in it.

Shouldn’t there have been SOME discussion and debate about this, at the very least? Do the authors of these “standards” have any notion how much curricula and what kinds of pedagogical approaches would be necessary in order for 8th-grade students to be able to do this?

And so it goes for the rest of the long, long list of specific, grade-level standards. All have enormous entailments, and none of these, it seems, were thought through; certainly, none were subjected to scholarly critique, and no mechanism was created for revision in light of such critique.

Given what contemporary syntacticians now know about how gerunds, participles, and infinitives function in general and in particular sentences, I seriously doubt that the authors of this “standard” understood what they were calling for or that students can be taught to explain these at all accurately, at this level (Grade 8), without that teaching being embedded in an overall explicit grammar curriculum. Furthermore, the authors of the standard doubtless had in mind a prescientific folk theory of grammar that doesn’t remotely resemble contemporary, research-based models of syntax–so they are doing the equivalent, here, of, say, telling teachers of physics to explain to kids that empty space is filled with an invisible ether or telling teachers of biology to explain that living things differ from nonliving ones because of their élan vital.

Of course, people do not acquire competence in using syntactic forms via explicit instruction in those forms and the rules for using them. Anyone with any training whatsoever in language acquisition would know that. For example, you know, if you are a speaker of English, that

*the green, great dragon

is ungrammatical and that

the great, green dragon

is not. But you know this not because you were taught explicit rules for the order of precedence of adjectives in English.

While there are, arguably, some reasons for learning an explicit grammar (for example, one might want to do so in the process of training for work as a professional linguist), what we are (or should be) interested in as teachers of English is assisting students in developing grammatical competence, which, again, is done by means other than via explicit instruction in taxonomy and rules (e.g., through oral language activities involving language that uses the forms properly, through committing to memory sentences containing novel constructions, through exposure to these constructions in writing, through modeling of corrections of deviations from standard grammatical rules). The science on this is overwhelming, but the authors of these standards clearly weren’t familiar with it. Their “standard” requires particular curricula and pedagogical approaches if it is to be met, and these aren’t supported by what we know, scientifically, about language acquisition–about how the grammar of a language is acquired by its speakers. Many of the new “standards” assume and/or instantiate such backward, hackneyed, prescientific notions about what we should teach and how.

And, of course, again, these “standards” were foisted on the country with no professional vetting or critique, and no mechanism was created for ongoing improvement of them based on such critique.

Imagine, if you will, the whole design space of possible curricula and pedagogical approaches in the English language arts, a sort of Borges library of curricula and pedagogy. Standards such as these draw rather severe boundaries within that space and say, “What is within these boundaries is required, and what is outside these boundaries is not permitted.” In other words, the new “standards,” as written, preclude some curricula and pedagogical approaches and require others. Basically, they apply a severe prior constraint on curricular and pedagogical innovation based on current knowledge and emerging practice and research.

I happen to believe, BTW, that there is a role to be played in the language and writing and literary interpretation portions of our curricula for explicit instruction in some aspects of current scientific models of syntax. However, that’s another discussion entirely, and it’s one that none of us will be having because the decisions about what we are to consider important in instruction have been made for us by Lord Coleman, and ours is but to obey.

That seems, sadly, to be OK with the defenders of the amateurishly prepared CC$$ in ELA.

Let’s turn to the place of this “standard” in the overall learning progression laid out by the Common Core.

Why verbals at this particular level? Why not case assignment or the complement/adjunct distinction or explicit versus null determiners or theta roles or X-bars or varieties of complement phrases or any of a long list of other equally important syntactic categories and concepts? And why are all those left out of the learning progression as a whole, across all the grades, given that they are key to understanding explicit models of syntax, which, evidently, the authors of these “standards” think important for some reason or another? Answer: this “standard” appears at this grade level pretty much AT RANDOM, not as part of a coherent, overall progression, the purpose of which was clearly thought out based on current best practices and scientific understanding of language acquisition. It’s as though one opened a text on syntax, laid one’s finger down randomly on a topic, and plopped that topic into the middle of the Grade 8 standards with no consideration of the prerequisites for tackling it.

Let’s move on to how the existence of the “standard” precludes development of alternative curricula and pedagogical approaches—to how it stifles innovation in both areas. Suppose I had an argument to make that it’s useful for kids to learn construction of basic syntax trees for coordination as part of a section of a writing program in which students are learning how to create more varied, more robust sentences. Now, you can agree or disagree with this proposal, but the point is that you should have the right to do so–to look at the specific proposal and accept it, reject it, or accept it with modifications. The answer to the question, “Should we do that?” should NOT BE, “Well, it’s not in the standards,” or “We can’t do this because we have to be concentrating on the functions of verbals at these grade levels.” Instead, educators should consider the relative merits of these proposals.

But now, because of the CC$$ in ELA, and previously, because of the state “standards,” those are the standard answers to most suggestions for innovation in curricula and pedagogy.

That’s not how you get continuous improvement. Continuous improvement comes about when people put forward their suggestions for curricula and pedagogy, without such prior constraint, and those are evaluated critically.

Why has there not been more critique, like this one, of the “standards” themselves? Now, THERE’S A PROBLEM. In order REALLY to be able to counter claims about the new “standards” made by the education deformers promoting them, one has to do fairly detailed analysis of particular “standards” and what they entail. That’s a big job. And the moment one starts to talk about those matters, people’s eyes glaze over. This stuff can’t be done in pithy soundbites of the kind that are the stock in trade of organizations like Achieve, the Chiefs for Change, Students First, and the Thomas B. Fordham Institute.

There are some nasty devils in the materials ancillary to the Common Core–the ones that present the educational philosophy of that renowned pedagogical theorist ex nihilo, Lord Coleman. But there are many, many devils in the details. Many of them. And there are NO MEANS WHATSOEVER built into the CC$$ implementations for exorcising those.

Analysis of a Sample CC$$ Literature “Standard”

CC$$.ELA-Literacy.RL.11-12.5. Analyze how an author’s choices concerning how to structure specific parts of a text (e.g., the choice of where to begin or end a story, the choice to provide a comedic or tragic resolution) contribute to its overall structure and meaning as well as its aesthetic impact.

One of the problems with the CC$$ is that they are full of unexamined assumptions (one can also drive whole curricula through their lacunae, but that’s another issue). In this case, the standard [sic] assumes a particular hermeneutics, or theory of interpretation–that an author’s choices are a proper object of study. This is an extremely controversial position, and one that I hold, with reservations, but it is taken for granted in the standard [sic] as though there were no learned disagreement regarding it. The authors of these “standards” seem to be oblivious of the fact that E. D. Hirsch stood almost alone, throughout much of the past century, in his heroic defense of the author’s choices, or intentions, as proper objects of scholarly attention. During that time, many scholars and critics, perhaps most professional literary people, contended that the author’s choices, or intentions, were irrelevant or irrecoverable or both and that we must attend, instead,

• to the text itself (Ransom, Tate, Empson, Brooks, Warren, Wimsatt, Beardsley, and others of the New Critical school) or to formal or structural features or relations within the text (Propp, Jakobson, Stith Thompson, Levi-Strauss, and other Formalists and Structuralists);

• to the reader’s construction of the text (in their various ways, Barthes, Fish, Rosenblatt, Derrida, and other Reader Response, Postmodernist, and Deconstructionist critics); or

• to historically determined responses to the text and differences in these over time (Heidegger, Gadamer, Foucault, Greenblatt, and other Historicist and New Historicist critics).

It’s fairly typical of these standards [sic] to be worded in complete obliviousness of the fact that people have thought pretty seriously about literature over the past hundred and fifty years, have learned a few things in the course of all that, and have developed many possible approaches to literary study about which the authors of the “standards” seem to have been clueless. The controversial notion that we should focus on authors’ intentions was CENTRAL to the raging debates over approaches to literary interpretation, or hermeneutics, in the twentieth century. Who decided that David Coleman and Susan Pimentel had the right to overrule every scholar, every teacher, every curriculum designer, every curriculum coordinator, who belongs to a different camp, who champions a different approach? Are we to have a central committee deciding what IDEAS are acceptable? And isn’t the New Critical approach of these “standards,” generally, incompatible with this emphasis, in this one standard, on this one example, of an author exercising intention?

Now, let me hasten to add that I understand and share the concern that led Hirsch to his defense of the author’s intention. Hirsch recognized that our basic ontological position is that your mind is over there, and mine is over here, and that cultural products are created to bridge that ontological gap. If we throw out the idea of the author’s intention, we undermine faith in the notion that an idea can be conceived and communicated–faith in the very possibility of faithful cultural transmission. The Ancient Mariner wants you to hear HIS story, not your deconstruction of it, and he fairly clearly insists upon that. So, the reports of the death of the author have been greatly exaggerated. However, all that said, it’s valuable for us, as readers, to poke at the author and intentions that we’ve posited in our reading when we are doing our rereading. The best reading is often such rereading.

Often, for example, in rereading my own work, I’m surprised at the one or ones I glimpse there, behind it all. And as I sit down to write, who is the “author” there? We are all of many parts and roles and conditionings, layers upon layers, worlds within worlds. At times when we write, it seems that we are simply transcribing (that’s almost always when the writing is best, of course), and at others we are very much aware of consciously assembling an experience for the reader–laying a trap, casting a spell, tossing the reader off a cliff into something dark and disturbing–whatever. And at times the autoclitic or performative aspect of the speech is definitely foregrounded, both when we are writing and in a text we are reading. Intention is complex.

I read Nietzsche’s Zarathustra, for example, and it seems to me that I am reading someone intent, foremost, on my having an authentic encounter with someone like him, encountering a mad genius of many parts content, often, with simply displaying those parts in all their dazzling self-contradiction. God or gnat? Well, both, in superimposition, like Schrodinger’s cat, and much more besides, and he’s aware of that and having a good laugh at us and himself whenever we presume to attain a bourgeois clarity, though he has experienced that other clarity, the Eleusinian one, atop the mountain, and shares some of what that is like, too, or claims to–even that he laughs at, as a model to us of what is, finally, it seems to me, a stance he wants to show us how to take or a dance to dance. The author I posit, there, is one who demands that I be him reading him but in his irreducibly suggestive, pregnant, generative entirety, not at all in the way of the narrow propagandist, not at all at all.

So, as writers, we cannot ourselves know all that went into the work we have produced or clearly sort out what was intentional and what was not (for a great many reasons and in a great many ways). How much more removed are we then, in our vantage as readers, given all our complexities, all our unexamined interpellations of everything, our often great distance in time and place from that author we think we are conversing with? For example, when I was a lot younger, Plato seemed, to me, clear as a Bach cantata, but now that I am older, much of what is there seems to me extremely alien and perhaps irrecoverable, and that may be true of the Bach, too. I have my theories about Plato. I think, for example, that he had a transformative experience when he participated in the mysteries at Eleusis that is key to understanding him, but that’s an informed speculation, not a fact, about Plato. My notion, there, is a valuable lens through which to read him, I think, but I cannot claim more for it, though without the presupposition that there was a Plato with an intent that it is my work to discover, I cannot believe that the critical enterprise makes any sense.

So, I thought of myself in Hirsch’s camp on this stuff long before I met the man and came to admire him close up, but I also read critics of other schools to my great instruction and delight. We don’t even know ourselves, so it’s some presumption to think that we can have an easy grasp on the intentions of an Other–often a very distant Other–who is, after all, not some specimen of Lepidoptera labeled, pinned to a card, archived, and cataloged. There are many, many ways in and out of poems and tales, and it’s possible to read by many lights. A deconstruction of a text may run afoul of sense about cultural transmission and common understanding, and it may mistake significance for what Hirsch calls “verbal meaning,” but even if it’s a mistake in interpretation, as Hirsch contends in his valuable Validity in Interpretation, one can learn a lot from it.

The ed deformers, bless their simple, walnut-like hearts, don’t seem to understand that–that one can read by many lights.

This essay is not the place for me to lay out a hermeneutics of my own–the theory that informs my own teaching of literature (I will post more on this in time). Suffice it to say that there are complex issues involved that the framers of these “standards” either did not understand or simply ignored. There are many ways in and out of literary texts, and the attempt to posit intent is only part of that, though an extremely important part. I believe that Hirsch is right that to the extent that we deny the determinacy of meaning, we deny the very possibility of faithful communication, but it is also true that we read because doing so matters to us, and it matters because of the significance of the text as vicariously and potentially lived experience, which will vary. There are many meanings of meaning, and one of these, the one that matters in the end, is “mattering” itself. Imagine Heidegger, in his mysticism about the German Volk, writing in the early days of Nazism, in praise of folk festivals. Then think of him rereading his own words in 1948. Words with the same meaning as intended model (Bild) will have a different meaning as mattering and so a different meaning in possible use. He might be able to re-cognize his intent, but he will find living in that building uncomfortable enough to drive him quite insane for a time. Enowning can be difficult.

Let’s proceed with analysis of the rest of this “standard.” Why, at this level (Grades 11 and 12), are students being asked to concentrate, in particular, on the structures of specific parts of a text? Would it make more sense, instead, to address overall structure at these grade levels, building upon analyses of structures of specific parts of texts done at earlier grade levels? Was this possibility considered? Might this not be the time, at the end of the K-12 program, to sum up what has been learned in earlier grades about specific literary structures, to draw some broad conclusions about common overall literary structures and their determinative influence on the making of literary works? Do we want to make sure, before they graduate, that students understand the basics of conventional plot structure? Shouldn’t we review that because it is so fundamental and because this is our last chance to do so before we ship kids off into their post-secondary colleges and careers? Shouldn’t a school system or a planner of an instructional sequence be free to decide that such an approach would be preferable in grades 11 and 12? Did someone make Coleman and Pimentel the “deciders” (to use George Bush’s unfortunate phrase) for everyone else in this regard? Were such questions considered by the authors of these standards [sic]? I doubt it.

Another issue: aren’t the relations of specific structure to a) overall structure, b) meaning, and c) aesthetic impact quite distinct topics of study? Why are they lumped together in this standard [sic]? Don’t these require quite a lot of unpacking? This is a common fault of the standards [sic]. They often combine apples and oranges and shoelaces and are ALL OVER THE PLACE with regard to their level of generality or specificity. Often, there seems to be no rationale for why a given standard is extremely specific or extremely broad or, like this one, both, in parts.

Yet another: does it make sense, at all, to work in this direction, from general notions about literary works as expressed in a standard [sic] like this, rather than from specific case studies? Wouldn’t real standards be encouraging empirical, inductive thinking, beginning with specific works, with study of patterns of relationship in those works, and then and only then asking students to make generalizations or exposing them to generalizations made by knowledgeable scholars who have thought systematically about those patterns of relationship? Wouldn’t that be a LOT more effective pedagogically? Isn’t that what the Publishers’ Criteria say? Isn’t the overall approach taken in these standards [sic] antithetical to the very “close reading” that they purport to encourage? Isn’t it true that by handing teachers and students nationwide a bunch of implicit generalizations like those in this standard [sic], the makers of the standards [sic] are encouraging uncritical acceptance of those generalizations about texts rather than an empirical approach that proceeds inductively, based on real analysis, to build understanding?

And another: what is meant by this word structure in the standard [sic]? The examples given (where the piece begins, comedic or tragic resolution) suggest that students are to analyze narrative structures, but there are many other kinds of structures in literary works. Are teachers to ignore those and concentrate on narrative structures? Was that among the “choices” that the authors of the standards made for the rest of us? Certainly, there is much that we know about structure in texts that is quite important to the interpretation of works of all kinds, literary and otherwise, that is never addressed anywhere in the standards [sic]. Unfortunately, the standards [sic] do not build in students, over time, familiarity with many extremely common structural patterns–episodic structure, cyclical structure, choral structure, the five-act play, the monomyth, the three unities–one could make a long list. What about rhetorical structures? metrical structures? logical structures? imitative or derivative structures based on forms in other media (e.g., John Dos Passos’s “Newsreels”)? Are teachers to ignore those? Is it unimportant for 11th- and 12th-grade students to learn about the reductio (Vonnegut’s Sirens of Titan or Bellow’s Henderson the Rain King); the thesis, antithesis, synthesis structure, or dialectic (Rebecca Goldstein’s Mazel); choral structure (The Book of Job, Antigone); or metrical structures like the ghazal or formulaic oral composition (the Sundiata, the Iliad)? Again, one could pilot whole curricula, whole learning progressions, through the lacunae in these standards [sic].

And doesn’t all this attention, based on the “standards,” to identification of tropes and forms skip right over, render unimportant, authentic engagement with what the author intended to communicate–what he or she is saying to us? Shouldn’t THAT be what we are emphasizing, not why the author chose to use this or that structural element? Aren’t we concentrating on making sure the brightwork is all nice and shiny and ignoring the gaping hole in the hull?

One asks oneself, again and again, when reading these putative “standards,” why are students studying this, in particular, and not that? Why at this grade level? Why is this and this and this and this left out? And the answer seems to be that the authors of these standards [sic] didn’t think to ask such questions. Or, in short, that they didn’t think.

One could do the same as I have done here for most of the other CCSS ELA literature standards [sic]. My more general point is that these standards promote some approaches and preclude others and so enforce dramatic prior restraint on possible curricula and pedagogy.

Another significant issue is that the authors of these “standards” did not bother to think about differences in what might be meant by standard in each of the domains covered. In other words, they did not revisit the notion of a “standard” at its most fundamental level, that of its categorical conceptualization. Did anyone involved in drafting these standards stop to think for one moment about the fact that with very, very few exceptions, they are descriptions of abstract formal analysis skills? Did that not strike them as BIZARRE? It does me. The writing “standards” are almost identical from grade level to grade level and encourage the writing of five-paragraph themes in one of three spurious “modes” and contain no mention of any of the thousands and thousands of concrete techniques from the toolkits of writers, and so they will inevitably lead, are already leading, to non-operationalized instruction in vague generalities, to writing instruction that is worse than useless because of its opportunity costs. One gets the impression, reading the writing “standards,” that Coleman and Pimentel simply ran out of time or energy and decided to copy over a few puerile generalizations at each grade, with slight rewording from year to year. These are amateurish in the extreme and will have dire consequences for writing instruction. Really, Coleman and Pimentel could have bothered to learn even a tiny bit from the vast and fruitful literature on instruction in rhetoric and composition before foisting their embarrassing writing “standards” onto the entire country. And the language standards–well, I have given you a taste of those above–these are backward, unscientific, and seem to be placed at particular grade levels almost entirely AT RANDOM. In general, no thought was given by the authors of these “standards” to the differences among different kinds of learning and acquisition and thus to what should be measured, if at all, and how. I read these “standards” and think of the line spoken to Mehitabel, that cat of ill repute, by the elderly theater cat in Don Marquis’s “The Old Trouper”:

mehitabel he says
both our professions
are being ruined
by amateurs

And here’s another general point: Why weren’t these standards [sic] subjected to nationwide critique of the kind that I have given here, for these two standards [sic]? And why should we not be continuously subjecting proposals for standards, frameworks, pedagogical approaches, etc., to revision and critique? Why shouldn’t there be MANY voices as opposed to these two, the voices of a couple of people chosen by Achieve to dictate to the rest of the country?

As I mentioned above, I happen to be one of those literature teachers who thinks that the reports of the “death of the author” (the phrase comes from Roland Barthes) were exaggerated, but it is not for me (or for Lord Coleman or for anyone else) to make that decision FOR EVERY OTHER LITERATURE TEACHER IN THE COUNTRY. The critique of the idea of authorial intention was fundamental to many schools of literary criticism developed in the twentieth century, and the authors of the standards betray, in their reference to analyzing the author’s choices, what has to be either complete ignorance of that or complete disregard of the opinions of thousands and thousands of scholars and critics and teachers of literature.

But who are we mere mortals to argue? After all, the masters at Achieve have appointed David Coleman and Susan Pimentel, by divine right, absolute monarchs of English language arts instruction in the United States, and surely, as Hobbes argued in the Leviathan, monarchy is best. Surely, in Hobbes’s words, we all need to live under “a common power to keep [us] all in awe,” for as Queen Elizabeth I wrote in 1601, “The Royal Prerogative [is] not to be canvassed, nor disputed, nor examined, and [does] not even admit of any limitation.”

In other words, forget about thinking for yourself about outcomes to be measured and learning progressions in the English language arts. Lord Coleman will do that for you. What a relief! All that thinking was so hard.

An Alternative to the CC$$

Education deformers love asking, “What’s your alternative?” But they expect stone-cold silence in response. Sorry to disappoint. Here’s an alternative to top-down, invariant, inflexible, mandatory, amateurish “standards” like those foisted on the country with no vetting whatsoever:

in place of the grade-by-grade bullet list, a few general guidelines (a very broad framework–perhaps four or five principles), continually revisited and critiqued, that provide the degrees of freedom within which real curricular and pedagogical innovation can occur

and

open-source crowdsourcing of alternative, innovative ideas. In other words, we could have

  • Competing, voluntary standards, frameworks, learning progressions, curriculum outlines, reading lists, pedagogical approaches, lesson templates, etc.,
  • for particular domains,
  • posted by scholars, researchers, curriculum developers, and teachers to an open national portal or wiki, and
  • subjected to ongoing, vigorous, public debate and refinement
  • based on results in the classroom and ongoing research and development,
  • freely adopted by autonomous local schools and districts
  • and subjected to continual critique by teacher-led schools–teachers who are given the time in their schedules to subject those, and their own practice, to ongoing critique via something like Japanese Lesson Study.

General Objections to Standardization

“I believe in standardizing automobiles. I do not believe in standardizing human beings. Standardization is a great peril which threatens American culture.”

— Albert Einstein, Saturday Evening Post interview, 10/26/1929

“There’s no bullet list like Stalin’s bullet list.”

—Edward Tufte, “The Cognitive Style of PowerPoint”

I’ve shared, above, some objections to a couple of specific ELA Common Core “standards.” Again, one could do the same for the rest of the CC$$ bullet list. But let me emphasize that the comments above reflect my own views, and no individual’s views of these matters should be transmogrified into mandates for the entire country. I’ve spent a lifetime thinking about K-12 ELA curricula, but I would not presume to tell everyone else in the country how he or she must teach English. Beyond the level of basic decoding skills, there are many, many possible paths that can lead to desirable outcomes. There are many, many possible ways in which to develop superb readers, writers, speakers, listeners, and thinkers, and the best of these have yet to be conceived. What strikes me most, reading through the CC$$ in ELA, is how mind-numbingly unimaginative, hackneyed, received, and pedestrian they are. They are Common in the sense of being base and vulgar. The last thing we need is a forced march along a path of mediocrity.

Let me conclude with the following list of general objections to the whole idea of a single, invariant, top-down set of national standards and summative tests. Each could itself be a suitable topic for a book-length work. NB: If you haven’t the patience to read through this entire list, please skip to the last two, which summarize extremely important objections to the general approach taken in the CCSS for ELA.

  1. The CC$$ in ELA seem to have been written by amateurs with no knowledge of the sciences of language acquisition and little familiarity with best practices in the various domains that the standards cover. Achieve would have got similar results if it had handed David Coleman copies of Galen and of the 1858 edition of Gray’s Anatomy and sent him to a cabin in Vermont to write new “standards” for the medical profession.
  2. The CC$$ in Math barely tweak a long-existing consensus about the progression and approach to mathematics education, one that leaves most adult products of that education, a few years after they’ve happily put it behind them, basically innumerate and fine with that. (The preceding state standards were almost all based on the NCTM standards and so were remarkably similar.) Furthermore, the grade-by-grade math standards are forcing math teachers, all over the country, to teach and test whatever the standards [sic] say for that grade level, even when their students haven’t, at all, the necessary background for this study. So, for example, if you are a junior, you’re doing precalc, period, even if you can’t add and subtract fractions.
  3. Having national standards creates economies of scale that educational materials monopolists can exploit, enabling them to crowd out/keep out smaller competitors. This is a HUGE issue with the new national “standards” that has received almost no attention. There’s a reason why the education materials monopolists kicked in a lot of money to create these “standards.”
  4. Kids differ. Standards do not.
  5. Standards are treated by publishers AS the curriculum and imply particular pedagogical approaches, and so they result in DRAMATIC distortions of curricula and pedagogy. Every publisher in the country–God help us–is now beginning every project in ELA by making a spreadsheet with the amateurish CC$$ in one column and the places in their program where these are “covered” in the next. So much for curricular coherence.
  6. Innovation in educational approaches comes about from the implementation of competing ideas; creating one set of standards ossifies practice; it PRECLUDES potentially extraordinarily valuable innovation.
  7. Ten years of doing this standards-and-testing stuff under NCLB hasn’t worked. It’s idiotic to do more of what hasn’t worked and to expect real change/improvement.
  8. In a free society, no unelected group (the CCSSO) has the right to overrule every teacher, curriculum coordinator, and curriculum developer with regard to what the outcomes of educational processes should be.
  9. High-stakes tests lead to teaching to the test–for example, to having kids do lots and lots of practice using the test formats–and all this test prep has significant opportunity costs; it crowds out important learning.
  10. A complex, diverse, pluralistic society needs kids to be variously trained, not identically milled.
  11. The folks who prepared these standards did their work heedlessly; they did not stop to question what a standard should look like in a particular domain but simply made unwarranted but extremely consequential decisions about that based on current practice in state standards that were themselves the product of lowest-common-denominator educratic groupthink.
  12. The tests and test prep create enormous test anxiety and undermine the development of love of learning.
  13. Real learning tends to be unique and unpredictable. It can’t be summarized in a bullet list. The last thing that we need is this Powerpointing of U.S. K-12 education.
  14. We are living in times of enormous change; kids being born today are going to experience more change in their lifetimes than has occurred in all of human history up to this point, so they need to be intrinsically, not extrinsically, motivated to learn; high-stakes tests belong to a nineteenth-century and older extrinsic punishment/reward school of educational theory and fly in the face of the prime directives of the educator: to identify the unique gifts of unique kids, to build upon those, and so to assist in the creation of intrinsically motivated, independent, life-long learners.
  15. If we create a centralized Common Core Curriculum Commissariat and Ministry of Truth, that is a first step on a VERY slippery slope. Have we come to the point in the United States where we are comfortable with legislating ideas?
  16. The standards-and-testing regime usurps local teacher and administrator autonomy, and no one works at all well under conditions of low autonomy.
  17. The standards and the new tests have not been tested.
  18. The standards and the new test formats, though extremely consequential in their effects on every aspect of K-12 schooling, were never subjected to national debate, nor were they subjected to the equivalent of failure modes and effects analysis.
  19. The legislation that created the Department of Education specifically forbade it from getting involved in curricula, but as E. D. Hirsch, Jr., pointed out on this blog a few weeks ago, the new math standards clearly ARE a curriculum outline, and the USDE has forced this curriculum outline on the country.
  20. No mechanism exists for ongoing critique and revision of these standards by scholars, researchers, and practitioners.
  21. The new tests—PARCC (spell that backward) and not-Smart imBalanced (collectively, the Common Core College and Career Ready Assessment Program, or C.C.R.A.P.) are just awful. There’s going to be a policy supernova when these hit nationwide.
  22. The ELA standards are a bullet list of abstractly formulated skills that barely touches upon knowledge of what (world knowledge) and that treats procedural knowledge (knowledge of how) so vaguely–without operationalization–that valid assessment based on the standards as written is impossible. I heartily approve of some of the general guidelines that surround these standards–read substantive, related texts closely–but I disapprove of the narrow New Critical emphasis of the standards generally (texts exist in context) and of the general formulation of the CCSSO bullet list as descriptions of abstract skills.
  23. The creators of these standards did not seem to understand that much learning in ELA is acquisition–that it does not come about through explicit instruction. ALMOST NONE of the vocabulary and grammar that a person commands was learned via explicit teaching of that vocabulary and grammar. It’s extremely important that English teachers understand this and understand how, in fact, grammar and vocabulary are acquired so that they can create the circumstances wherein this acquisition can happen, and they are not going to begin to do that based on this bullet list, which, in its treatment of acquisition of linguistic competence, can most charitably be described as prescientific–as instantiating discredited mythologies or folk theories on which it is counterproductive to build curricula and pedagogy. In their instantiation of prescientific, folk theories of language acquisition, the new “standards” are rather like having new standards for the U.S. Navy that warn of the possibility of sailing off the edge of the world.

Copyright 2014, Robert D. Shepherd. All rights reserved. This essay may be freely copied and distributed as long as this copyright notice is retained.
