Slurping the Money River

Uncle Sam's Island Home
Thousand Islands region, Saint Lawrence Seaway

“Life is hard enough, without people having to worry themselves sick about money, too. There’s plenty for everybody in this country, if we’ll only share more.”

“And just what do you think that would do to incentive?”

“You mean fright about not getting enough to eat, about not being able to pay the doctor, about not being able to give your family nice clothes, a safe, cheerful, comfortable place to live, a decent education, and a few good times? You mean shame about not knowing where the Money River is?”

“The what?”

“The Money River, where the wealth of the nation flows. We were born on the banks of it—and so were most of the mediocre people we grew up with, went to private schools with, sailed and played tennis with. We can slurp from the mighty river to our hearts’ content. And we even take slurping lessons so we can slurp more efficiently.”

“Slurping lessons?”

“From lawyers! From tax consultants! From customers’ men! We’re born close enough to the river to drown ourselves and the next ten generations in wealth, simply by using dippers and buckets. But we still hire the experts to teach us the use of aqueducts, dams, reservoirs, siphons, bucket brigades, and the Archimedes’ screw. And our teachers in turn become rich, and their children become buyers of lessons in slurping.”

“I wasn’t aware that I slurped.”

Conversation between Senator Rosewater and Eliot Rosewater in God Bless You, Mr. Rosewater by Kurt Vonnegut, pp. 121-122

*For the record, I was raised in a desert of sorts, where the Kern River is mostly choked off by a dam. My mother and father taught me not to slurp.

Letter from Elsinore

Dear Ophelia—

Elsinore isn’t quite what I expected, or maybe there’s more than one, and I’ve come to the wrong one. The high school football players here call themselves “The Fighting Danes.” In the surrounding towns they’re known as “The Melancholy Danes.” In the past three years they have won one game, tied two, and lost twenty-four. I guess that’s what happens when Hamlet goes in as quarterback.

The last thing you said to me before I got out of the taxi was that maybe we should get a divorce. I did not realize that life had become that uncomfortable for you. I do realize that I am a very slow realizer. I still find it hard to realize that I am an alcoholic, though even strangers know this right away.

Maybe I flatter myself when I think that I may have things in common with Hamlet, that I have an important mission, that I’m temporarily mixed up about how it should be done. Hamlet had one big edge on me. His father’s ghost told him exactly what he had to do, while I am not operating with instructions. But from somewhere something is trying to tell me where to go, what to do there, and why to do it. Don’t worry, I don’t hear voices. But there is this feeling that I have a destiny far away from the shallow and preposterous posing that is our life in New York. And I roam.

And I roam.

Kurt Vonnegut Jr., God Bless You, Mr. Rosewater, 34-35 (1965, 2006).

We watched Who’s Afraid of Virginia Woolf (1966) last night (it’s hard not to drink while watching that movie). The story begins with the violation of a trust, the disclosure of a secret imaginary story shared by an academic couple (George and Martha, America’s founding parents, played impressively by Liz Taylor and Richard Burton). It doesn’t take long to figure out that the form of the scathing banter is the point— not its content. The outsiders, George Segal and Sandy Dennis, furnish the only “trustworthy” content. You grow to expect that Burton and Taylor are simply confabulating from every fact they can get their hands on. The posing is all quite cruel and as a side effect, somewhat funny. It’s hard to look away, once this expectation is aroused in the audience.

This effect, interestingly, is precisely where Kenneth Burke begins his excursus on “Psychology and Form” (1931) in Counter-Statement (1931, 1953, 1968). Burke uses Hamlet as an example, painstakingly describing how Shakespeare sets up the audience to receive, indeed to expect, the ghost that drives the narrative ahead. In retrospect, Burke sees his own early approach as an oversimplification:

Counter-Statement shows signs of emergence out of adolescent fears and posturings, into problems of early manhood (problems morbidly intensified by the market crash of ’29). The role or persona of the author seems not that of a father, or even of brother, but of conscientiously wayward son (whom the Great Depression compelled to laugh on the other side of his face).

He had early decided that ideally, for each of Shakespeare’s dramatic tactics, modern thought should try to find the correspondingly critical formulation. But he soon came to see that any such orderly unfolding of the past into the present would be greatly complicated, if not made irrelevant or completely impossible, by the urgencies and abruptness of social upheaval.

Kenneth Burke, “Curriculum Criticum” published in Counter-Statement, p. 213 (1953, 1968)

It is interesting to me that Vonnegut’s Rosewater and Albee’s Woolf both address dealing with the past through the distortions of alcohol and questionable deployments of history. Albee’s past is a mixture of fabrication and fact, while Vonnegut’s approach is clever punning. Eliot Rosewater is writing from the Elsinore, California Volunteer Fire Department. His grasp on reality is under scrutiny as we meet him through the revelation of certain facts about his life, always a mixture of fact and fabrication. Vonnegut invented a word that suits it: chronkling. He explains it in the dedication to his collected essays:

This book is dedicated to the person who helped me regain my equilibrium [his wife, documentary photographer Jill Krementz]. I say she chronkled me. That is another coined word. She came to me with an expressed wish to “chronicle” my wonderful life from day to day on photographic film. What eventuated was much deeper than mere chronicling.

Kurt Vonnegut Jr. “Preface,” Wampeters, Foma and Granfalloons p. xxi (1974)

Even here, Vonnegut is punning on newscaster Walter Cronkite (the most trusted man in America!). Both Vonnegut and Albee self-consciously interrogate the ends of literary form. What Burke, Albee, and Vonnegut (in the sixties, at least) have in common, though, is a celebration of the conscientiously wayward son. I’m not familiar enough with Albee to know if he eventually rejected this, but Vonnegut and Burke do conclude that this approach is fatally flawed.

The excuse for lying and behaving badly? In all cases, it’s a matter of form. The villains (or heroes; it’s hard to tell through all the irony) are always champions of impeccable form. As opposed to what? It seems fair to ask. Content might be the easy answer, but I think Burke nails it down better than that. For Burke it isn’t that content is somehow unimportant or bad; rather, it is the scientization of content as information that he seeks to vilify. This is strikingly similar to the bit I posted a few days ago from Werner Herzog.

One of the most striking derangements of taste which science has temporarily thrown upon us involves the understanding of psychology in art. Psychology has become a body of information (which is precisely what psychology in science should be, or must be). Similarly, in art, we tend to look for psychology as the purveying of information. . . .[Joyce, Homer, and Cézanne are summoned as examples]

. . . Thus, the great influence of information has led the artist also to lay his emphasis on the giving of information— with the result that art tends more and more to substitute the psychology of the hero (the subject) for the psychology of the audience. Under such an attitude, when form is preserved it is preserved as an annex, a luxury, or, as some feel, a downright affectation. It remains, though sluggish, like the human appendix, for occasional demands are still made upon it; but its vigor is gone, since it is no longer organically required. Proposition: the hypertrophy of the psychology of information is accompanied by the corresponding atrophy of the psychology of form [emphasis mine].*

Kenneth Burke, “Psychology and Form” published in Counter-Statement, p. 32-33 (1931, 1968)

There’s a lot to digest in this short essay. One of the key things, I think, is his claim that information is intrinsically interesting but not necessarily intrinsically valuable. This tends to be borne out by the focus on the commonplace by many modernists like Walker Evans and Edward Weston; it’s as if they sought to provide a sort of value-added by reintroducing form to an audience’s perception of common objects. Of course, Evans did so with great irony and Weston might be considered irony-deficient. Both were uncomfortable with any sort of psychological criticism. I suspect it’s because, as Burke claims, people are too interested in the psychology of the hero (or artist) while ignoring the psychology of the audience.

I suspect we’d all be more comfortable if Hamlet’s father’s ghost would show up to tell us what to do.

And I roam, too.

*I’ll have to come back to that— I think that we might have swung too far in the other direction in the ensuing years.

Time and the Machine

Time and the Machine by Aldous Huxley (1936)

Time, as we know it, is a very recent invention. The modern time-sense is hardly older than the United States. It is a by-product of industrialism – a sort of psychological analogue of synthetic perfumes and aniline dyes.

Time is our tyrant. We are chronically aware of the moving minute hand, even of the moving second hand. We have to be. There are trains to be caught, clocks to be punched, tasks to be done in specified periods, records to be broken by fractions of a second, machines that set the pace and have to be kept up with. Our consciousness of the smallest units of time is now acute. To us, for example, the moment 8:17 A.M. means something—something very important, if it happens to be the starting time of our daily train. To our ancestors, such an odd eccentric instant was without significance  –  did not even exist. In inventing the locomotive, Watt and Stevenson were part inventors of time.1 [emphasis mine]

Another time-emphasizing entity is the factory and its dependent, the office. Factories exist for the purpose of getting certain quantities of goods made in a certain time. The old artisan worked as it suited him with the result that consumers generally had to wait for the goods they had ordered from him. The factory is a device for making workmen hurry. The machine revolves so often each minute; so many movements have to be made, so many pieces produced each hour. Result: the factory worker (and the same is true, mutatis mutandis, of the office worker) is compelled to know time in its smallest fractions. In the hand-work age there was no such compulsion to be aware of minutes and seconds.

Our awareness of time has reached such a pitch of intensity that we suffer acutely whenever our travels take us into some corner of the world where people are not interested in minutes and seconds. The unpunctuality of the Orient, for example, is appalling to those who come freshly from a land of fixed meal-times and regular train services. For a modern American or Englishman, waiting is a psychological torture. An Indian accepts the blank hours with resignation, even with satisfaction. He has not lost the fine art of doing nothing. Our notion of time as a collection of minutes, each of which must be filled with some business or amusement, is wholly alien to the Oriental, just as it was wholly alien to the Greek. For the man who lives in a pre-industrial world, time moves at a slow and easy pace; he does not care about each minute, for the good reason that he has not been made conscious of the existence of minutes.3

This brings us to a seeming paradox.2 Acutely aware of the smallest constituent particles of time – of time, as measured by clock-work and train arrivals and the revolutions of machines – industrialized man has to a great extent lost the old awareness of time in its larger divisions. The time of which we have knowledge is artificial, machine-made time. Of natural, cosmic time, as it is measured out by sun and moon, we are for the most part almost wholly unconscious. Pre-industrial people know time in its daily, monthly and seasonal rhythms. They are aware of sunrise, noon and sunset, of the full moon and the new; of equinox and solstice; of spring and summer, autumn and winter. All the old religions, including Catholic Christianity, have insisted on this daily and seasonal rhythm. Pre-industrial man was never allowed to forget the majestic movement of cosmic time.

Industrialism and urbanism have changed all this. One can live and work in a town without being aware of the daily march of the sun across the sky; without ever seeing the moon and stars. Broadway and Piccadilly are our Milky Way; our constellations are outlined in neon tubes. Even changes of season affect the townsman very little. He is the inhabitant of an artificial universe that is, to a great extent, walled off from the world of nature. Outside the walls, time is cosmic and moves with the motion of sun and stars. Within, it is an affair of revolving wheels and is measured in seconds and minutes – at its longest, in eight-hour days and six-day weeks. We have a new consciousness; but it has been purchased at the expense of the old consciousness.

1 I located this essay through the article on James Watt on Wikipedia, which referenced a magazine article from 1973 that cited the emphasized quote. It turns out that this six-paragraph essay was printed in a wide variety of writing textbooks, including An American Rhetoric by William Whyte Watt. I love this Amazon review of An American Rhetoric:

This book is, unfortunately for the literary world, out of print although it is probably only of interest to ‘true and thoughtful’ followers of English composition and literature. I am interested in the teaching of Mr Watt and also other leaders and instructors who developed the notions of creative and responsible writing that influenced writers of the period from the 1950s through the 1980s, after which, sadly to say, literature seemes to have ‘gone to hell in a handbasket’. I believe it is unfortunate that these fine Professors of detail and research have fallen into disfavor. I purchased the book at a premium price in order to once again enjoy the detailed works and guidance of one of the few who clung to attention and to fact and extactness. [sic]

2 The use of this essay to test comprehension has persisted, as evidenced by this note on the 2002 New York State Regents English Literary Arts Exams:

3. Day Two, Part One: The “Compare and Contrast” Essay: The exam uses the last two paragraphs of a six-paragraph essay by Aldous Huxley, Time and the Machine. The altered passage now begins with the sentence: “This brings us to a seeming paradox.” Students cannot know what “this” refers to, without the preceding paragraphs. Compounding the problem, students are asked to answer a question about what the “paradox” refers to.

3 This paragraph, elaborating on the paradox of the culturally specific creation of time, was perhaps the offending part for the NYS examiners. Huxley’s deployment of cultural difference was not politically correct, but it was hardly racist. These days, though, it seems accurate because I suspect no corner of the globe can be characterized as “pre-industrial.” This oversimplified version of culturally relative “time” doesn’t wear well into the twenty-first century. It’s far more complex than six paragraphs can describe.

Pleasure, of a sort

[Photographs: Oildale; Breckenridge Mountain views]

Reflecting on the two photographs I chose from the cloud of images I took when I visited Bakersfield in 2008, my first return after a decade or so, I suppose the only criterion was that both pictures please me. It is tricky to speak of images as “texts” (I do not wish to offer a “reading” of either picture), and yet it is pleasing to locate the studium and punctum, à la Barthes.

On the left, the studium dominates— when I think of the California I knew, it is punctuated with parking lots (in this case a Dairy Queen’s) and palm trees. These are the “facts” which I never really tired of studying, a perverse sort of pleasure in their constancy. On the right, it is the painted cattle guard as a sort of border between the valley and the mountains. What pricks me (the punctum) is not an emotional connection with a pretty sunset, but rather an intellectual pleasure in the knowledge (found only outside the frame, on a map) that this is a more than symbolic boundary1 between the open ranges of the mountains and my fenced valley home. In both cases, reducing the images to symbolic content leaves a taste— a remainder from the division— of a place I once called home. The pleasure “for me” is complex and, as Barthes suggests, “neither subjective nor existential”:

If I agree to judge a text according to pleasure, I cannot go on to say: this one is good, that bad. No awards, no “critique,” for this always implies a tactical aim, a social usage, and frequently an extenuating image-reservoir. I cannot apportion, imagine that the text is perfectible, ready to enter a play of normative predicates: it is too much of this, not enough of that; the text (the same is true of the singing voice) can wring from me only this judgment, in no way adjectival: that’s it! And further still: that’s it for me! This “for me” is neither subjective nor existential, but Nietzschean (“. . . basically, it is always the same question: What is it for me? . . .”).

Roland Barthes, Pleasure of the Text (13)

When I read this passage a couple of days ago I puzzled over his usage of “Nietzschean.” It took quite some effort to track down the passage he quotes, assuming everyone knows it. Asserting that the “what is it for me?” question— in matters of pleasure— is not subjective seems to contradict the definition of subjective. After all, isn’t all pleasure contingent on the existence of the self? It’s easy to accept that pleasure can’t be existential (because pleasure cannot exist outside the self). The implication that pleasure can be tactical or strategic (or have any sort of pragmatic dimension) is rightfully discarded, enhancing the connection with aesthetic pleasure. But why isn’t pleasure subjective? Perhaps only because of his disclaimer: pleasure in this Barthesian sense has no use and therefore is not a matter of personal benefit or perspective. So the pressure is all the stronger on the for me: to what end, if not a personal utility?

The answer, near as I can tell, is in the passage from The Will to Power that he quotes so ambiguously and imprecisely:

The answer to the question, “What is that?” is a process of fixing a meaning from a different standpoint. The “essence” the “essential factor,” is something which is only seen as a whole in perspective, and which presupposes a basis which is multifarious. Fundamentally, the question is “What is this for me?” (for us, for everything that lives, etc. etc.)

A thing would be defined when all creatures had asked and answered this question, “What is that?” concerning it. Supposing that one single creature, with its own relationship and stand in regard to all things, were lacking, that thing would remain undefined.

In short: the essence of a thing is really only an opinion concerning that “thing.” Or, better still; “it is worth” is actually what is meant by “it is” or “that is.”

One may not ask: “Who interprets, then?” for the act of interpreting itself, as a form of the Will to Power, manifests itself (not as “Being” but as a process, as Becoming) as a passion.

Will to Power

I am certainly not an expert on Nietzsche, and I have many quarrels with most of his interpreters, but it seems to me that most of this is fairly easy to grasp— up to a point. To say that something “is” always entails an opinion and a corresponding value judgment. But the conclusion alludes to (this is a fragmentary and incomplete text) a sort of metaphysical (at least it seems to me) resolution of the problem of missing universal things: universal will. Described here as a passion, it seems to me that what Barthes is summoning in his “Nietzschean sense” is a sort of will to pleasure that exists beyond the existential and the subjective.

Thus, Barthes’ parenthetical benefits from the more emphatic/complete substitution from Nietzsche’s notes:

. . . that’s it! And further still: that’s it for me! This “for me” is neither subjective nor existential, but Nietzschean [Fundamentally, the question is “What is this for me?” (for us, for everything that lives, etc. etc.)]

So the aesthetic impulse (pleasure) is in this case a universalizing one, a conjecture that the pleasure might be something more than personal/subjective feeling. I like this idea a lot: the possibility that taste, in some way, might transcend its social/communicative utility. But this is a big leap. The commentators I have read on Barthes’ text emphasize the pleasures of the text as a way of escaping the subject position, the possibility of liberation— but no one I have read seems to notice that this path leads through universals.

Universals just aren’t Barthesian. The dissonance jars me; I don’t have that much problem with universal claims, as long as they are identified as such. His way of circumventing universals is sly: the claim is for universal processes rather than universal values. Nonetheless, following Nietzsche’s suggested substitution of “it is worth” for “it is,” there is no escape from value judgments and pragmatic utilities. Barthes’ core claims are at odds with each other.

Who interprets? I think we are doomed to ask that question. 

1Although the lines are an illusory barrier, they are nonetheless a physical presence in the world and not merely a symbol.


When I was a kid, I was really interested in eastern religions (particularly Zen). The framework of Buddhism just didn’t work for me though— absence of striving? WTF? Fine for rocks and trees, not so useful for people. People need things. That’s the only way we learn anything— we must need to learn. Learning, in fact, seems to me to be the goal of consciousness.

Learning should not be confused with literacy. Learning seems to be more deeply connected with a core part of being human, our predisposition toward planning. Squirrels, regardless of the clichés, don’t have retirement plans. Sociality (and thus communication skills) is nonetheless a central need for those with an eye on the future. We hairless apes are not particularly self-sufficient. We band together to survive, honing specific skills to be accepted within our tenuous circles of sociality. We figure out how to fulfill someone else’s needs, so that we may in turn be satisfied.

When I was a kid, I craved images. I would sort pictures into little piles, trying to figure out why I liked them and wanted to return to them. I usually couldn’t vocalize, let alone write down why I liked certain images over others. The more images I saw and collected, the more inexplicable the whole process became. I read a lot, at first to figure out how to make/do things and later to understand why people did the things they did (my father suggested Shakespeare and the Russian writers). I discovered Blake, Milton, and the rest of the usual suspects (Vonnegut, Kerouac and the Beats, etc) that a young man reads. But I didn’t need to be a writer. I was satisfied with reading; but I felt like I could make images. I spent decades learning most everything I could about it.

Somewhere around 36, I finally felt a need to learn how to write. I had to come up with an exhibition statement. A friend named Jeff, who was completing a master’s in English Lit at UC Irvine, went through my rough draft with a pen, reducing it by about two thirds. Reading between all the blacked-out words, it was better. It seemed like writing was a lot like composing images— getting rid of the junk so you can see more clearly the subject that interests you. Writing didn’t seem that hard.

At 37, I found a need to write. I was smitten by a woman half a continent away, and the primary form I had to relate myself and my feelings was through (electronic) love letters. Words worked out well between us, but when I moved to Arkansas to be with her, the reality did not. I didn’t understand why I failed so miserably. At 38, I went back to school, both to try to meet new people and to learn how to make a better living.

At first, I wanted to study everything— Art, History, Literature, etc.— but eventually two paths emerged. I loved literature, so hanging out with other people who liked reading it too was cool. But it wasn’t much of a career plan. Images weren’t being kind to me by this time either; everything seemed so painful. I felt like a walking nerve. People seemed to think I was a good writer. I was never quite sure why. An English professor suggested that I look into technical writing. So I got a dual BA in Rhetoric (it seemed much more interesting than “technical writing”) and Literature.

Moving forward into a Master’s degree in Rhetoric (I never could stay interested in technical writing), I was allowed to teach, and I loved that. The fundamental need of most students who want to survive college is the need to write. Granted, writing papers is not nearly so interesting as writing love letters or novels, but it is a specific survival skill. Later, I grew to love teaching technical writing as well because it is clearly writing that fills a need in the world. The world doesn’t need a lot more hackneyed love letters or lame novels. We need to understand what we are saying to each other more completely.

I didn’t finish my Ph.D., though I did all the course work, because I simply couldn’t find the need to. I loved teaching, but I never loved the politics of evaluation. How can you really know if you are meeting other people’s needs? I’m not so god-like as to profess to know.

It’s hard to find any real need to write these days. I’m happy, and I really want to rediscover my relationship with images. Some things, you just can’t describe in words. Maybe it’s time to get back to sorting it out. I suppose it’s best to blot out the bits that don’t fit.1

1 When I first entered the composition classroom, this was the first assignment that greeted me: compose a literacy biography describing your relationship with writing. Often, there were hidden traumas in there. For example, during my first attempt at community college a million years ago, I dropped out because I wasn’t getting a good enough grade in “English composition.” I was a victim of the grammar cops. So what? I also didn’t have a reason to write— a detail that seems far more important to me.



Watched Hubert Selby Jr: It/ll Be Better Tomorrow last night. His last words were a list:

A list of indignities

  • Birth
  • Death

Most writing begins with autobiography. The craft involves placing one word after another, and one’s self as a topic is always close at hand. Free-association aside, writing involves communicating something to someone and it’s easier to communicate something you have knowledge of. Before taking Chuck Anderson’s class, I hadn’t given a lot of thought to how we lie to ourselves and each other when reconstructing events. Birth and death are great examples: without the buffer of fictionalizing these experiences, they might be too much to bear. Trauma is never far away from our memories. Trauma is how we learn.

Of course, one could make the argument that we are guided by pleasure as much as pain. We learn that lying in the sun is pleasant, that certain foods or behaviors make us feel good, etc., but these things lack the persistence of memory found in the unpleasant. We dwell (with good reason) on what we don’t want to happen again more than what we wish to repeat. Pleasures, when repeated, are often diminished and lose their luster. Pain shines through in the quiet moments when we don’t have much else to occupy our consciousness.1 Most sane people would not choose pain; it chooses us. So we narrate painful memories from a position outside them.

Autobiographical writing, then, is largely a subset of fiction writing. It’s frequently pathetic and not particularly interesting to read unless the writer has a talent for embellishment that isn’t eclipsed by the inclination to whine. When I was teaching writing, it seemed like the hardest task was to get people past the fiction— to quit whinging (and wanking) and write something of real world consequence. While it is certainly true that autobiographical writing is of great consequence to the writer, its circle of influence seldom stretches beyond personal rationalizations of the indignities of life. We write about what we know for an audience that we know cares: ourselves. Rationalization is essential, because otherwise what we label as experience is meaningless.

Selby’s Last Exit to Brooklyn was one of the few books that I couldn’t read without putting down, over and over again. The scenes that unfolded were too much to bear. But I suspect that was part of the game he was playing with himself: testing just how far he could push language into the indescribable, into the sublime. Over and over, people in the film remarked what a “regular” guy he was in person. His books are literature, not autobiography, though I have no doubt that parts of them began in experience. He pushed them the other way down the axis, closer to irrationalization. His friends spoke of his books as redemptive; I never found redemption. The universe is dark in there.

1 I had a dentist once who explained it this way: During the day there are a lot of distractions that keep you from thinking about your pain. In the middle of the night is when the toothache really hits you and becomes unbearably severe. You can’t sleep because the quieter you become, the more intense the pain is.

Poetry Corner

I’ve always been a huge fan of the work of Paul Henning, but I wasn’t expecting to find this gem featuring Dennis Hopper. It gives me an excuse to work on the problems of hosting/embedding flash video.

I can generate .flv files with no problems, but the player part of the equation had me stumped until I happened on an easy solution. Now full screen video is within reach: more experiments below the fold.

Continue reading “Poetry Corner”

Storytelling (1)

I’ve been thinking a lot lately about the problem of storytelling, particularly about the way that technology impacts the way that we tell stories. There’s a lot to say about it, but it seems like some throat-clearing is in order.

Over the last few days, a couple of rhetoricians have weighed in on Doris Lessing’s Nobel prize acceptance speech—seemingly without bothering to read it first. This tactic reminds me of the sort of snap judgments that first-year composition students make—they accept the consensus of their peers without question. I suppose it’s one of the hazards of the rapid-fire atmosphere of electronic discourse—it’s easier to twit than to perform any sort of analytic work.

Lessing’s speech is also a wonderful example of the classic solitary, originary, proprietary model of writing, which might provide an interesting contrast to the newly emergent models of distributed collaborative authorship if more close reading were applied. But there isn’t space or time for that at this moment; I’ll press on with the reactive component, hoping I can return at a later date to the analytic problem.

Dennis Jerz and Clay Spinuzzi are not stupid people. I wouldn’t normally expect this sort of knee-jerk reaction. I remember, months ago, rr linked to a video of Lessing being ambushed by journalists when she won the prize. She couldn’t think of anything to say, apparently, and ended up asking the reporters to tell her what to say so that she could repeat it back to them— a tactic first suggested by Andy Warhol in one of his books, as I recall. Jerz and Spinuzzi didn’t misread the speech as far as I know; they simply parroted back the critique of TechCrunch and Ars Technica— which read Lessing as claiming that the internet makes you dumb, or that it was the cause of our fragmented culture. Really? That’s not what I read. Here is the pertinent section, as printed by the Guardian:

Continue reading “Storytelling (1)”

There is an eloquence in true enthusiasm

. . .a few of the articles suggested that the great man’s brain had been visible to onlookers during the procedure.

The first of these was an undated letter to the editor of The Baltimore Gazette, which claimed that “a medical gentleman” had seen “that the brain of the poet Poe, on the opening of his grave … was in an almost perfect state of preservation,” and that “the cerebral mass, as seen through the base of the skull, evidenced no signs of disintegration or decay, though, of course, it is somewhat diminished in size.”

The second was an 1878 article in the St. Louis Republican, noting that “the sexton who attended to the removal of the poet’s body” had lifted the head during the exhumation and reported seeing the brain “[rattling] around inside just like a lump of mud.” The sexton reportedly thought that “the brain had dried and hardened in the skull.”

“What I realized was, if that was the case, it would be the only physical evidence we have of what Poe’s condition was at his time of death,” Mr. Pearl said.

Intrigued, Mr. Pearl asked a coroner for an expert opinion. “I read her the description,” Mr. Pearl said, “and she said, ‘Well, that person is just wrong. Unless you embalm the body, the brain is the first thing to liquefy. There’s no way it would still be there 25 years later.’”

Poe’s Mysterious Death: The Plot Thickens!


Our modern science, abandoning the search for the Absolute, has been scrutinizing every atom, to weigh and name it, and to discover its relation with its neighbors. “Relativity” has been the watchword. Science literally knows neither great nor small: it examines the microbe and Sirius with equal interest; it draws no distinction between beauty and ugliness—having no preference for the toadstool or the rose, the sculpin or the trout: it is impartial; it seeks only to know. By observation and experiment, by advancing from the known to the unknown, science has begun to make the first accurate inventory of substances, laws, and properties of the worlds of matter. Its achievements have already been stupendous. Its methods have dominated all other works in our time; it was inevitable that they should encroach on the sphere of art and of literature.

Continue reading “Epidermists”