Midnight Picnic

By Nick Antosca (Word Riot Press, 2009)

Bram pulls into the parking lot half asleep and the crunch of gravel under his tires becomes the crunch of bone. Something screams.

The old deerhound that lives at the bar—it’s pouring tonight and he didn’t even see her.

That crunch.

He gets out of his dented Pontiac, hunches against the downpour. He doesn't want to look. It's 3:30 A.M. and the bar is dark. No light to see by except the Pontiac's headlights, ghostly cones of white slashed by rain.

He kneels to look under the car.

Nothing.

"Baby!" he yells, getting up. "Where are you?"

Movement off in the darkness, on the other side of the car. The deerhound, dragging herself away. She looks less like a dog than a man in a dog suit, huge, crawling across the gravel. He goes to her side.

"Oh, I'm sorry, I'm sorry,"—his voice splintering—"Hold on, let me look."

The damage is catastrophic; the dog will die.

Not quickly, though.

These are the opening paragraphs of Nick Antosca's Midnight Picnic, a short and terrifying book that I read a few months ago and just can't get out of my head. The plot follows Bram, an aimless young man living in West Virginia who finds the bones of a murdered child. Hours later, the dead child finds Bram and asks him to help avenge his death. What follows would be, in outline, part ghost story, part revenge story, except that the experience of reading it is less like either and more like having a waking nightmare.

I don't use the word terrifying lightly. As anyone who's been to a bad horror movie knows, scaring people is not easy. Do it wrong and it's boring, or maybe just kind of disgusting, or worse, unintentionally funny. (That the line between horror and comedy is so thin and blurry is one of the reasons, I think, that horror-comedy has blossomed into such a delightful genre.) Do it right, though, and you tap into the fear that early humans must have felt when the sun went down and it began to rain, and they were huddled in a group under a tree that did not provide shelter, and they knew that predators were coming for them. For me, the first two-thirds of The Shining do that (though not the final third, which becomes boring); perhaps all of 28 Days Later and much of Clive Barker's stuff do, too.

But Midnight Picnic's particular brand of scare reminds me most of David Lynch, who pulls horror from simple elements—lighting, sound, costume, a good line, clever camera work—capturing with eerie effectiveness the experience of having a very bad and extremely compelling dream. Antosca's own use of such dream logic is the best I've come across in a long time. There are a few missteps—at one point, about halfway through the book, Bram interrogates the dead child in a way that very nearly breaks the spell—but here I'm just quibbling. I could give you passage after passage of the images and conversations that engrossed and frightened me, but I don't want to ruin them.

Also, and most impressively, Antosca manages to give his story what many horror tales never even reach for: heart. Yes, Midnight Picnic is scary. But it's also, keenly and unexpectedly, touching and tragic; for underneath the ghosts and revenge is another story about a boy looking for his father, but not being quite ready for what he finds.

So, Iguana be a citizen?

My friend Molly and I were strolling through East Rock Park last Saturday morning. Not unlike the joggers and the church picnickers, we were thinking about life and what it felt like to live it on that sunny morning. We were happily yammering away when, in the middle of the path, in broad daylight, we came upon a four-foot-long iguana, unmoving and prone. There was a man standing next to it, looking down at it sadly. Up close, we realized the iguana was really hurt. I mean really hurt: when prodded, he reared his head and opened his big mouth to hiss a silent hiss of dying. His guts were in his mouth. The poor thing was busted up near dead.

The man, on his BlackBerry, was Justin of Friends of East Rock. He had already phoned the police and was on hold with animal control. Molly and I took turns getting closer looks at the lizard, at once morbidly curious and frightened.

Justin looked at us and said earnestly, "I have to go to a meeting. I've called the police…" And with that, we were charged with responsibility, immediacy, and yes, citizenship. He left us and there we stood guarding the dying iguana. Thus began a Saturday morning taste of real community.

A man walked up with a baby boy, checked out the iguana, and told us it was supposed to be green, not the jaundiced yellow it was. We wondered together if it had been dumped, already hit by a car, or if a Parks and Rec. truck had run it over. The cynic in me thought it had been hit and then put in the middle of the park to be found and buried. The glass-half-full woman in me believed it had been living happily in the park for months, and upon reaching for a far-off branch, had fallen from the tall oaks above us.

The man with the baby offered to stay with the iguana while Molly gathered sticks to weigh down a makeshift trash-bag shroud for the thing. I went to houses around the park, knocking on doors, asking if anyone was missing a pet iguana. I interrupted a woman mowing her lawn and explained the story, and she told me she was certain none of the neighbors to her right had a pet iguana. But the people two houses down, who knew? She didn't really know them. At another house, a man came to the door while on hold on the telephone. "I hope you aren't missing an iguana," I greeted him. He was happy to report he wasn't, and was kind enough to ask the operator to keep holding while I filled him in on what was going on in the park.

And what was going on in the park, as I now looked back to see Molly amid a small and curious crowd, was the business of community. Some sixth graders came with their bikes and their father. Turns out they were from my school, Foote School. Turns out they were coming from an alderman's party. Turns out the man with the baby wants to run for alderman. Turns out the local poet Alice Mattison and her husband Ed came to see what was going on. Her husband is a former alderman.

Then the policeman arrived and declared, "The Green Iguana is not native to this park." At first I thought, no shit, but as he talked it was clear he was familiar with reptiles; he's got a few snakes as pets. He reckoned the iguana had been kept by some ignoramuses who dropped it in the park, and that a truck then came by and squashed the thing. He went to check on the iguana under the Glad bag, and when he poked it, nothing happened.

It didn't move. It was dead. It had died just there. It was alive and then it was dead.

He picked up the body and folded the thing in, and it arced at the bottom of the trash bag.

People's faces were all sad. We were all sad for this poor alien, this poor orphan, and this poor untold story of a living thing.

And that was it. We used the bathroom and kept walking down Orange Street, feeling like we belonged to something bigger than ourselves. The charge of respecting a helpless living thing, no matter how random and bizarre, had brought people out of their own lives and brought us together. Iguana community like that, don't you?

Seidel'd

One of my more interesting reading experiences last fall was provided by Frederick Seidel's Ooga-Booga (2006). I don't know much about Seidel except he's rich, was born in 1936, published his first book of poems in 1962, and didn't publish another book until 1979. His Collected Poems, 1959-2009 was released a few months ago. I'm hoping to dawdle through it this summer. Whatever we expect a poet to show us, it's rare that he shows us a lifestyle to which only that elusive 5% of the population with 37% of the wealth are accustomed. In Seidel's case, as in "Barbados," there is an outrageous tendency to be as rancid as anything he might witness. Poets with political axes to grind do, of course, give us glimpses of brutal acts and consequences to jar us out of our literary complacency. But Seidel somehow seems to suggest that all he's grinding is his pencil, to make it sharper. Whatever the outcome of the chaos we live in, he seems to shrug, I was there.

But what makes his writing so hard to fathom is its childlike simplicity. Or, rather, its simplicity is so arch, so tongue-in-cheek, so craftily artless, that one always waits to be slapped or jabbed by the inevitable line that arrives with all the specific, precise density -- drowning in acid -- of Robert Lowell or T. S. Eliot when they suddenly drop the right phrase into its inevitable place.

Huntsman indeed is gone from Savile Row,
And Mr. Hall, the head cutter.
The red hunt coat Hall cut for me was utter
Red melton cloth thick as a carpet, cut just so.
One time I wore it riding my red Ducati racer -- what a show! --
Matched exotics like a pair of lovely red egrets.
London once seemed the epitome of no regrets
And the old excellence one used to know
Of the chased-down fox bleeding its stink across the snow.
--"Kill Poem"

Yeah, and "a savage servility slides by on grease." To me, the echoes of Lowell's "Skunk Hour" dance through a poem that strikes me as a Charles Bukowski poem for an uncannily different demographic. But Bukowski came to mind while reading Seidel, not only for the "fuck you if you can't take me" ethos that these poems exude, but for a sense of the poem as the only possible response to a life of this tenor. Once your lines become this spare, they spare nothing.

But look at how the diction does whatever it wants -- the beautiful balance of line 3 ends with that hanging "utter" that is itself pretty damned utter. And then the "what a show!" interpolation in a flash makes speaker and poem as cartoonish as anything -- or at least as any inconceivable commercial for Ducati racers(!) could be. Then the "matched exotics" of "egrets" and "regrets" are so funny and so baldly bad, as we veer into "the old excellence" that ends with a line worthy of Lowell and an image that suddenly brings in the death and blood that lurk so smugly behind all our diversionary tactics. Gee.

What I like about Seidel is the way he plays our banalities back at us, but first subjects them to a sea-change that causes the acrid brine of his own peculiar vision to cling to them:

The young keep getting younger, but the old keep getting younger.
But this young woman is young. We kiss.
It's almost incest when it gets to this.
This is the consensual, national, metrosexual hunger-for-younger.
--"Climbing Everest"

What is said is what anyone commenting on how the rich old court the fresh young might say -- but it would be said in a finger-wagging way, or at least with mockery of the jaded, fading oldster trying to ignite himself via youth. But Seidel says it with a kind of rueful surprise at being the oldster accepted by youth in his "hunger-for-younger." In other words, it's not jaded at all, but almost charmingly surprised by the mores of "almost incest," where the words "consensual, national" do the job of making both old and young part of a machine that operates simply because it operates. "My dynamite penis / Is totally into Venus," Seidel quips, the intonation of youth appropriated by age to make the sex act partake of "the moment" as, we tend to think, only youth can. The insinuation of the poem -- that such sex acts, like that Ducati racer, are grandiose acts of death-courting -- never stops asserting itself after that first verse of foreplay, and each joke gets a little edgier, stripped of any self-satisfaction, but gripped by the vanity of vanitas, which is to say that being vain is a vain endeavor, that the grave is grave, and that "the train wreck in the tent" is addicted to all the tender mercies he can get.

Judging by Ooga-Booga, Seidel is an acquired taste that I'm on my way to acquiring, because his poems confront me the way the poets I end up living with for a while do. Bring on those Collected Poems.

Classics I Hate

When I was in the midst of receiving my doctorate in American literature from the City University of New York Graduate Center, I made my obligatory pilgrimages to the annual convention of the Modern Language Association. My first was a doozy. I vividly remember a panel I attended on canonical and non-canonical works, where two well-regarded scholars, one incendiary and one all-too-conservative, duked it out over the Western canon and the validity of the "classic." Both trotted out their respective arguments, and in the many years since I have come to take stock of the merits of the two sides. There is definitely room in the canon—whatever that is—for new work that need not labor in the shadow of Melville and Emerson or even the critical sensibility that placed them at the top. On the other hand, there is absolutely no way to regard all published works of literary fiction on par with one another in terms of quality or even critical interest. Charles Dickens is better than Stephen King, just as Stephen King is better than John Saul, who is really not much better than anyone. Now we can argue about what we mean by better, but if we take as one aspect of it my second criterion of "critical interest"—worthiness and worthwhile-ness for critical examination—then, yeah, Dickens is better than King. There is more to say about Dickens' work than there is to say about King's, and on multiple fronts, too: historical, economic, linguistic, sociopolitical.

So, in my mind, there are such things as classics, although I don't much love the term and the baggage it carries. Classics presumably point to works of quality that support that much more critical interest than other works. And this raises, in turn, an issue I have become quite fascinated by: classics I hate.

The hated classic finds its antithesis in the guilty pleasure, which in today's world is hardly a source of shame. Hell, my wife is more than happy to talk about her preferences for American Idol—even though she was less invested in this year's selection of Kris Allen—and I can freely admit my penchant for old Kung Fu movies and Firefall's "You Are the Woman" (please don't hit me). There are many who happily boast a passion for various species of bad art. I have friends who love Z-movie vampire flicks. My sister thinks Dumb and Dumber is one of the greatest comedies ever made. I had a boss who watched Buffy the Vampire Slayer religiously. There is even a much-talked-about documentary circulating about the "best worst movie" ever made. Need I go on?

However, we tend to be more circumspect about how much we dislike great art. True, it is easy enough to confide among friends our gut feeling that Giacometti's sculptures seem childish or Verdi is a bore. But put us in a room of intellectual peers or, even worse, acknowledged superiors, and suddenly it becomes a more vexing matter. We still may not like Giacometti or Verdi, but try justifying your response without sounding entirely solipsistic ("What can I say? It doesn't do a thing for me"), all of which seems to raise important questions about our response—and those of our peers. What do they know that I don't? Is it a question of unacknowledged personal immaturity? Or is this classic just another example of mass hysterical bad judgment? (It's been known to happen.) Or perhaps questions of taste really are relative and Stephen King can be as good as Charles Dickens—Heaven forfend!

With bad art, I suspect we're allowed to indulge our innate solipsism. Why am I willing to overlook how crappy old Kung Fu movies are? The escapism, the formulaic storytelling, and the acrobatic choreography are all psychological creature comforts of the circus and childish wish fulfillment. But why do I hate The Scarlet Letter? It's dull, dull, dull, and I'll take The Blithedale Romance over it in a heartbeat. So what the hell am I missing?

This is not an insignificant question. As a former college teacher, I was constantly placed in the position of rebutting student charges of dullness, an eternal source of frustration that seemed little more than the response of the lazy mind. In my struggle to teach students how to appreciate works by Conrad, Austen, Poe, Blake, and innumerable others, this response surfaced again and again as an ever-elusive combatant whom I could never quite grasp and pin down.

So why do I hate The Scarlet Letter? Why is my memory of it hardly a pleasurable one? Why has this novel never moved me in any way whatsoever? These are all questions that deserve a better answer than "I'm sorry, but it's just a dull read." After all, I am more than willing to tolerate the lengthy mood settings in Joseph Conrad or the fine needlepoint psychological excursions of Henry James. I know The Scarlet Letter is a classic; I can even sense it! But there is a radical disconnect, one that flummoxes any attempt at quick explication.

So for now, I am without answers; someday I hope to offer better ones. Until then, let me turn it over to you: Which classics have you found to be an utter failure in your experience as a reader?

The street where I live . . .

I have been thinking about turning what I wrote about my street, West Rock Avenue, into a book, and so I have been doing a lot of reading about urbanism, town planning, and architecture. Basically, I am trying to figure out what makes some streets livable and others not. A good deal of the literature — by people including New Havener Philip Langdon, whose A Better Place to Live has given me a whole new outlook on what makes a space a happy one in which to dwell — boils down to this: don't depend on cars. People are happier when they can walk to see neighbors, ride their bicycles, and live close enough to their neighbors that they know them. This small-town mythology is one that I am particularly susceptible to, having grown up in a neighborhood that had many of a small town's virtues. And I find myself, as I read these books, falling prey to an unfortunate smugness, as if growing up on streets laid out on an easily navigated grid, with houses on quarter-acres instead of large lots, is the only way to have a happy childhood.

But that can't be right. For one thing, this mythos runs contrary to another important American mythos, the rural farm. I don't think many of us would want to say that children growing up in the countryside, learning to milk cows by their parents' sides, are unhappy. Nobody thinks that that's an uninspiring or despairing way to grow up. And, to be fair, the writers I'm reading aren't reacting against that way of life, which may be dying out; they are reacting against suburban sprawl, which seemed poised to dominate the American landscape.

But what of that suburban sprawl — especially those cul-de-sac developments that have proved so popular in late-20th-century construction? Can one have a happy childhood where there are no sidewalks, where it's too dangerous to ride a bicycle, where there are no secret passageways behind garages or corner stores at which to buy candy?

I don't know. On the one hand, I don't want to underestimate children's capacity for self-mystification. I suspect that most children, at least most of those who grow up middle-class, and sheltered from anything too abysmal in the family's home life, look back at their early years with a certain sense of awe and wonder. Those lookalike houses in Del Boca Vista Estates are not lookalike to the children inside them, who know which house has the best video-game system, which kid has the dad who makes the best forts with the dining room table and some blankets, whose parents go out late and don't hire a babysitter (all the better for watching verboten TV channels).

On the other hand, there is empirical evidence that suburban life of this kind can lead to bad things: obesity, too much time in the car, fewer friends, less play. And teenagers—forget about it. If they can, they flee to the city. Or at least the curious ones do.

But what I don't have are good sympathetic non-fiction books about life in suburban sprawl. For every book critical of that way of life — Langdon's book, Duany et al.'s Suburban Nation, Ray Oldenburg's The Great Good Place — there seem to be exactly zero books about why it can be pleasurable to grow up in spaces that are, after all, safe, predictable, and quiet, which are all good things.

I want the other side of the story. Ideas, anyone?

New Haven in the 21st Century...?

Driving through the flooded streets of Long Wharf this wet week, then ditching the car to hike up my skirt and trudge barefooted through filthy knee-high water in front of Union Station -- all to catch a Metro-North commuter train to Manhattan -- I flashed on a day when life would seem less at odds with nature. Usually, I'm not a green utopist, but a visually stunning and provocatively written new book, Mannahatta: A Natural History of New York City (Abrams), is turning me on to the ecological origins and conceivable futures of cities. I'm so taken with this book that I've bought several copies as gifts (perfect for Father's Day) and am planning to present it to my daughter's grade school to adapt into the curriculum -- that's how important I think the ideas are.

Here's the premise: even a city as developed as Manhattan began as a natural landscape of forests, trees, rivers, and streams. In 1609, when explorer Henry Hudson first came upon Mannahatta -- the "island of many hills" to the Lenape tribe -- it teemed with flora and fauna among fifty-five ecosystems that "reused and retained water, soil, and energy, in cycles established over millions of years." Back then, Mannahatta supported a human population of three hundred to twelve hundred. Today, Manhattan is home to millions of people on a planet of billions. Yet, as author Eric Sanderson argues, it's a "conceit" to think any city and its inhabitants -- no matter its technological and economic development -- "can escape the shackles that bind [us] to our earthly selves, including our dependence on the earth's bounty and the confines of our native place."

Scholarship and research aside, Mannahatta is a fun read. Full-page color photos of today's Manhattan juxtaposed with photographic visualizations of 1609 Mannahatta open up, centerfold-like, throughout the book. Sanderson, a landscape ecologist with the Wildlife Conservation Society, is adept at making complex scientific issues accessible to the lay reader. Charts and maps show habitats of species, distribution of fauna, and even the location of beavers in the vicinity of today's Times Square. City buffs can pore over bathymetry and topography maps. I laughed aloud when I learned that the spot where the bronze bull now stands at Bowling Green (near the New York Stock Exchange) was once a hill twenty feet high.

By 2050, the majority of people on earth will live in cities, and they will have to become greener and more hospitable if they are to continue being the vibrant cultural and business centers they are today, as well as comfortable places to live in. In the last chapter, “Manhattan 2409,” Sanderson addresses basic human needs – food, water, shelter, energy – acknowledging that changes in infrastructure will most likely come piecemeal. Waterways and greenways will slowly replace beltways and avenues. Sanderson proposes new green buildings with lizard-like second skins that can both shade and insulate, open and close depending upon the season.

As to this week’s flooding at the New Haven train station? The next time the city renovates Union Street, they might use permeable paving materials that capture rainwater below the surface. Or perhaps, we can add a “green roof” -- a thin layer of soil planted with grasses and flowers that slows water flow and cools a building -- atop the New Haven Police Station? I can always dream.

Note: September 12, 2009, marks the 400th anniversary of the arrival of Henry Hudson in New York Harbor. See also the exhibition Mannahatta/Manhattan: A Natural History of New York City.

Bernard Wolfe: Anyone? Anyone?

Though New Haven is rich in intellectual history and, as a corollary to that, has a small place in literary history, one hears little of writers who've actually lived here. By writers I mean not writers who had to take teaching posts to get by but writers who grew up here and went on to Great Things (or even Greatish Things), or who just happened to wind up living here. It does happen. Sometimes. One of my parents is a huge sci-fi/fantasy fan from way back and so I grew up in a household that had ridiculous numbers of those cheap mass market paperbacks with lurid covers. I did not inherit the sci-fi/fantasy gene, so I was and am uninterested in this stuff, but there's one writer in that genre who fascinates me: Bernard Wolfe. Wolfe's stuff came to my attention maybe ten or fifteen years ago. It's not that I was so interested in his writing but I was intrigued by his writing career. He wrote a number of novels, and his subject matter was all over the place. His best known novel -- and a book that is, I understand, a sort of classic in the field -- is called Limbo. I'm sure if you like this kind of stuff it's great; I've tried to read it twice, been bored to tears each time, and don't expect to ever get through it.

But Wolfe wrote a lot of other stuff, too. He co-wrote, with Mezz Mezzrow, a classic of jazz lit, Really the Blues; he wrote a Hollywood novel (Come On Out, Daddy); he wrote political novels; and, as one can glean from the title of his memoir, Confessions of a Not Altogether Shy Pornographer, he wrote pornography. It was this last book that I read from cover to cover, and which made me wonder: What the hell?

Because it turns out that Bernard Wolfe was a townie. The guy grew up in New Haven. Went to Hillhouse. Went to Yale -- an unusual thing for a Jewish guy of his generation. He graduated from college and was sure that he'd kick some ass in publishing only to find that no one would hire him. So he started out writing porn. (His being a Trotskyite was probably an issue, too, but what can you do.) He had worked as Trotsky's bodyguard, at one point. But he became a real literary figure in his day, and published many works which got reviewed by, you know, professional book reviewers. The New York Times Book Review knew who he was.

So how is it that no one I've talked to around here knows anything about him? Back when I had daily contact with literary geeks of many stripes, I would ask, periodically, "So tell me what you know about Bernard Wolfe." And I'd get very little back. People of a certain generation recalled the name, and that was pretty much it.

Even if Wolfe was just a hack, wouldn't you think that his name would be mentioned more often in New Haven? I mean, as a famous hack? Let's face it, this is a town that will grab desperately at any straw that seems like it'd be good p.r. ("We got restaurants! Boy, do we have restaurants! We got some damn good restaurants! No, don't go over there.... come over here, where there's some good restaurants!"). You'd think maybe that places like, oh, I don't know, the New Haven Free Public Library, for example, might have some Bernard Wolfe stuff sitting around. Well, they've got a copy of Limbo. That's it. The Institute Library has a copy of The Magic of Their Singing, which is Wolfe's take on Beat culture (and pretty entertaining at that). Yale, of course, has tons of stuff, but most of it is in the Beinecke (i.e. not circulating), and I think was mostly given to the place by Wolfe himself (though I may be wrong on that point). They have some of his papers, and I've read them, but it's a spotty collection. It's like the guy evaporated shortly before he died, leaving almost nothing behind. Spooky.

So come on. There's more to this. Wolfe was obviously something of a wonderful maniac in his day. Why don't we know more about him? There was clearly a bunch to know... and I, for one, would like to know it.

“My upbringing was pretty weird,” says David Bowie's son

I know. You're thinking, "No WAY." But sure enough. Or so Duncan Jones, the artist formerly known as Zowie Bowie, told the New York Times last week.

Jones was recalling the formative years during which his father introduced him to the likes of George Orwell, J.G. Ballard, and Philip K. Dick, and let him hang around the sets of movies such as Labyrinth. The occasion for these revelations was a publicity push for Jones' feature film directorial debut, Moon--an impressive piece of work, not least because its general disposition is so steadfastly down to Earth.

Sam Rockwell stars as a near-future moon-base laborer who for three years has spent his days alone, mining the lunar soil's rich supply of helium-3, with which his far-flung corporate overseer claims to be solving Earth's energy crisis. Alone, that is, until an entirely unlikely visitor arrives and turns out not to be good company.

In recent years, Rockwell has been building a fine body of work by wondering how men live with themselves, and Moon is all about that. It’s hard to discuss in detail without giving the whole plot away, and of course the plot--developed by Jones with screenwriter Nathan Parker--is pretty ridiculous. Let’s just say that it takes place on the mysterious frontier between space madness and corporate malfeasance, and that my disbelief was suspended.

I like the movie’s peculiar personality, its way of being a functional assembly of nice touches--like Clint Mansell’s driving score, or the deliberate tactility of the production design, or the obligatory omnipresent talking computer being voiced by Kevin Spacey, whose performances always strike me as facsimiles of humanness anyway.

Most of all, I like that it's not ever too peculiar. As a conscious throwback to the unabashedly philosophical, pre-CGI science fiction of Jones' youth, Moon also has just enough astronomical distance from his famously spaced-out dad. If we want to call Jones' good taste an inheritance, we should allow that so, too, is his discretionary independence.

On Editing, Part 1

A couple of days ago, to commemorate the twentieth anniversary of the Tiananmen Square massacre, the New York Times website ran a fascinating piece about the iconic photograph of the so-called tank man—the man in the white shirt, holding what look like shopping bags, standing defiantly in front of a column of approaching tanks. The Times, quite justifiably, calls it "one of the most famous photographs in recent history." Thing is, unlike many other famous photographs—Robert Capa's falling soldier comes immediately to mind—there isn't just one of them. There were a lot of photojournalists covering the event that day, and as video of the confrontation between the tank man and the tanks shows, the incident lasted long enough for several photographers to capture essentially the same image. The Times piece has the recollections of four of them: Charlie Cole (who was working for Newsweek), Stuart Franklin (Time), Jeff Widener (Associated Press), and Arthur Tsang Hin Wah (Reuters). Each photographer is given several paragraphs to explain how he took his picture, what was happening around him at the time, and what happened afterward.

David Nickerson, a political scientist at Notre Dame, emailed me a link to the New York Times piece; we are friends from college and are always emailing each other newspaper items with snarky comments attached (this piece is written with Prof. Nickerson's generous consent). As we talked more about the piece, two themes emerged, concerning both the event itself and the apparent personalities of the photographers. But really, we were talking about editing: how our perceptions of a thing are shaped by every word we read about it and every word that is left out.

The tank man is a compelling figure partly because he's anonymous. His back is always to the photographer, and to this day, as the Times piece pointed out, his identity and whereabouts are unclear. This may seem almost impossible to believe: His showdown with the tanks took place in the middle of a wide street in broad daylight, witnessed apparently by hundreds of people, many of whom, even at the time, knew they were witnessing something extraordinary. But look at the recollections of the four photojournalists of what happened to the tank man immediately after he stopped the tanks:

[Cole:] Finally, the PSB (Public Security Bureau) grabbed him and ran away with him.

[Franklin:] He then disappeared into the crowd after being led away from the tank by two bystanders.

[Wah:] Four or five people came out from the sidewalk and pulled him away. He disappeared forever.

Was it two bystanders or four or five? And what led Cole to think that they were security officers? (Were they?) Right from the start, the stories of eyewitnesses to the scene don't square—and these are journalists at the top of their game, people who are much better than most of us at getting the facts right. At this point, my editor brain kicked in. Surely the editor of the Times piece noticed the discrepancies. Did they ask the photographers any follow-up questions? Wouldn't you just love to get the three of them in a room right now to sort this all out? (Presumably somebody has done something like that and more.)

Meanwhile, Widener's piece doesn't mention at all what happened to the tank man after he took his picture, which brings us to a more delicate matter. As Prof. Nickerson pointed out, Widener's account of the actual taking of the picture is very different from the accounts of the other three photographers. Cole, Franklin, and Wah recall mostly what was happening around them—who was there, what they were doing. In their accounts, they're photojournalists doing their jobs, but their concern for the tank man's safety, not to mention their own and that of everyone around them, is evident. Cole even finds time to take a wider view:

As the tanks neared the Beijing Hotel, the lone young man walked toward the middle of the avenue waving his jacket and shopping bag to stop the tanks. I kept shooting in anticipation of what I felt was his certain doom. But to my amazement, the lead tank stopped, then tried to move around him. But the young man cut it off again. Finally, the PSB (Public Security Bureau) grabbed him and ran away with him. Stuart [Franklin; apparently they were standing next to each other, which accounts for the similarities in their photographs —ed.] and I looked at each other somewhat in disbelief at what we had just seen and photographed.

I think his action captured peoples’ hearts everywhere, and when the moment came, his character defined the moment, rather than the moment defining him. He made the image. I was just one of the photographers. And I felt honored to be there.

Now look at Widener:

I loaded the single roll of film in a Nikon FE2 camera body. It was small and had an auto-exposure meter. As I tried to sleep off the massive headache that pounded my head, I could hear the familiar sound of tanks in the distance. I jumped up. Kurt/Kirk [a college kid who's helping them and whose name Widener never quite caught —ed.] followed me to the window. In the distance was a huge column of tanks. It was a very impressive sight. Being the perfectionist that I am, I waited for the exact moment for the shot.

Suddenly, some guy in a white shirt runs out in front and I said to Kurt/Kirk “Damn it—that guy’s going to screw up my composition.” Kurt/Kirk shouted, “They are going to kill him!” I focused my Nikon 400mm 5.6 ED IF lens and waited for the instant he would be shot. But he was not.

The image was way too far away. I looked back at the bed and could see my TC-301 teleconverter. That little lens adapter could double my picture. With it, I could have a stronger image but then I might lose it all together if he was gone when I returned.

I dashed for the bed, ran back to the balcony and slapped the doubler on. I focused carefully and shot one … two … three frames until I noticed with a sinking feeling that my shutter speed was at a very low 30th-60th of a second. Any camera buff knows that a shutter speed that slow is impossible hand-held with an 800mm focal length. I was leaning out over a balcony and peeking around a corner. I faced the reality that the moment was lost.

In comparing the four entries (in their entirety as they appear in the piece, not only in what I've excerpted here) it was easy for me to think a few uncharitable thoughts about Widener. He seems more concerned with his camera equipment than with what's happening to the people around him. Most tellingly—or so it would seem at first read—at the point in his narrative where he might tell us what happened to the tank man, it seems that Widener was actually looking at his camera, noticing that the shutter speed was too low. In the Cole, Franklin, and Wah accounts, the man appears as a man; in Widener's, he is an element of composition.

But then my editor brain kicked in again. Was I really being fair to Widener? After all, my opinion of his account was formed, first, from comparing it with the other three accounts. Without the other three accounts right next to his, the idea that Widener was more preoccupied with his photograph than the subject of the photograph might never have occurred to me. Widener's piece might even have struck me as funny and refreshingly candid—the story of a man trying to do his job in trying circumstances.

Then the real doubt started to creep in. As an editor, I should know better than to think I'd even read all of what the photojournalists wrote (let alone what they would say if I were to be fortunate enough to have dinner with them). Perhaps Widener's piece had a lot about the tank man himself, but it was cut out of concern for the overall length of the piece. Perhaps the other three had a lot more talk about the gear they had, but Widener's gear talk was better, so they left Widener's in and cut the rest out. All these pieces had been through the editing mill (one editor? Two? Three?), and they were edited (presumably, and no matter how invasively) to make the piece as a whole as tight and coherent as possible, while sacrificing as little of the authors' voice and intent as possible. But no edit is made without giving up something; the question is whether what you're giving up is worth what you get in return. And there's no predicting how people will read the final product: Maddeningly, wonderfully, some readers respond to even the most straightforward text in ways that surprise even the most careful writers and editors.

In every published text, in the passage from thought or experience to writer to editors to reader, perception piles onto perception, subjectivity piles onto subjectivity, and the original thing—the truth, or the closest thing we have to it—is buried under layers of reconsiderations, rewordings, and manipulated grammar. It sounds like a bunch of postmodern claptrap, I know; but it's the nature of the job. And twenty years later, we have these stories of what happened in Beijing that June 4—or May 35, as some call it to avoid censorship from China's government—but we still don't know who the tank man was, or where he is today.

We Shouldn't Should Teach Creative Writing

When I saw Louis Menand's "Show or Tell: Should Creative Writing Be Taught?" in this week's New Yorker, I cringed, sighed, and devoured the article right at the kitchen table. As one of the many MFAs and teachers of Creative Writing, I am intimately and darkly interested in this topic. Turns out, Menand's piece is more of a review of Mark McGurl's new book, The Program Era, in which McGurl focuses on fiction writing programs in relation to the Marshall Plan and post-WWII literature. The article wanders through some sound investigations and is full of surprising statistics.

Oddly enough, Menand has nothing to say about poetry programs except, "I don't think (undergraduate) workshops taught me too much about craft, but they did teach me about the importance of making things, not just reading things. You care about things that you make, and that makes it easier to care about things that others make." He recalls his days at college where "all we were required to do was to talk about each other's poems," and that it "seemed like a great place to be."

My experience at NYU did indeed help inspire a type of care for "made" pieces, and graduate school was a great place to be. But studying in a Creative Writing Program did a lot more than simply inspire in me a compassion for other writers. To think that MFA programs are as Menand writes, "Designed on the theory that students who have never published a poem can teach other students who have never published a poem how to write a published poem," is convenient for his argument, but not entirely accurate. (But not one Creative Writing Program student was interviewed for the article -- also convenient.)

Many of my MFA colleagues came to the program having already published poems and essays in widely circulated journals and magazines. Many programs require their students to take at least two Theory or Critical courses for the degree. Some programs have a language requirement; some have a study abroad requirement. But all programs (that are worth their salt) will certainly compel a student to do more than talk about other students' poems. Many of my teachers (Anne Carson, Sharon Olds, Philip Levine) would bring in poems as models for the class, and would conduct mini lectures on that poem's strengths, allusions, or patterns. We were apprentices more than we were a gaggle of "twelve-on-one group therapy" goers.

With the increase in MFA programs and graduates, as is the law of supply and demand, the cachet of the degree has worn off. But questioning whether or not Creative Writing should be taught, or if it should exist in the realm of the Academy, seems like a hackneyed old pitch. (Didn't Dana Gioia go there already?)

The real turn for me, though, is that Menand's article contains within its title the word "should." Would the New Yorker publish, "Sight and Vision: Should Painting Be Taught?" or "Stories upon Stories: Should Architecture Be Taught?" or even "Eat Your Cake Too: Should the Culinary Arts Be Taught?" I don't think so. How and why is writing held to a different standard? Is it that ultimately we don't as a nation really consider writing to be an art form? That we can't understand that painting, buildings, and poems can all narrate humanity, just through different media?

Marcia Bartusiak and the day we disappeared

Broadly speaking, the history of astronomy reads something like the story of how we humans have discovered our insignificance in the cosmos. In the last two thousand years, major discoveries about the solar system, our galaxy and the universe have shuffled the likelihood of our existence deeper and deeper into the realm of improbable chance and fortuitous coincidence. In the big picture, we're barely a pixel. I suppose it's only unsettling if you start out assuming that human beings are, quite literally, the center of the universe.

Two thousand years ago, Aristotle stepped up and said the Earth sits at the center of an eternal, unchanging universe; our planet is surrounded by celestial spheres most easily observed by the movements of the Sun and planets. People believed Aristotle and moved on with their lives. In the 16th century, Copernicus stepped up to the plate, swung, and blasted the Earth out of the center. He explained the motions of the planets by invoking a heliocentric view of the cosmos. The sun, not we, sat in the middle. Earth had officially been displaced as the center of everything—at least in theory. (And Copernicus himself was displaced. Last November, archaeologists finally found his long-lost remains buried under the floor of a cathedral in Poland.) In the 17th century, Galileo improved upon the design of a Dutch engineer to build a telescope, with which he delivered good evidence in support of Copernicus. In this case, good evidence made the Dominicans mad, and Galileo was rewarded by the Catholic Church with house arrest (albeit a very comfortable house arrest) for the rest of his life.

And so on—each new discovery places humanity further from a central position. The greatest decentralizing act of our time may have occurred on New Year’s Day, 1925—the eponymous "day" in the title of science writer Marcia Bartusiak’s latest offering, The Day We Found the Universe. Bartusiak is an experienced and award-winning writer, a fellow of the AAAS and a former Knight fellow at MIT. (I feel fortunate to count myself among her former students.)

On that day in 1925, astronomers were gathered in Washington, D.C., for the annual meeting of the American Association for the Advancement of Science. On the third day of the meeting, someone stood up, gave a talk, and changed everything. Edwin Hubble, a 35-year-old stargazer working at the observatory atop Mount Wilson in California, had found conclusive evidence that the Milky Way—our beloved home—was not the only galaxy in the cosmos. (Hubble, fearful of sullying his reputation, didn’t even present his own findings.)

It’s easy to be blasé about the impossible-to-comprehend infinitude of the cosmos now; after all, almost everyone alive today grew up surrounded by the knowledge that the Milky Way is no loner. And we science junkies, with our varying deficits of attention, learn, marvel and move on, looking for the next big thing. Bartusiak, however, writes about that blustery day in the nation’s capital with the infectious excitement of a giddy astrophile:

In one fell swoop, the visible universe was enlarged by an inconceivable factor, eventually trillions of times over. In more familiar terms, it’s as if we had been confined to one square yard of Earth’s surface, only to suddenly realize that there were now vast oceans and continents, cities and villages, mountains and deserts, previously unexplored and unanticipated on that single plug of sod.

Or here:

The Milky Way, once the universe’s lone inhabitant floating in an ocean of darkness, was suddenly joined by billions of other star-filled islands, arranged outward as far as telescopes could peer. Earth turned out to be less than a speck, the cosmic equivalent of a subatomic particle hovering within an immensity still difficult to grasp.

That’s just the beginning: In Day, Bartusiak lovingly and meticulously traces the origins and development of a big idea. Hubble’s name is familiar to most of us mainly because of recent news about the space telescope that bears his name, but he doesn’t really show up in the book until two-thirds of the way through—a structural choice that demonstrates that Hubble stood on the shoulders of many, many giants. His name may be forever associated with the discovery of the universe, but his finding was no instantaneous flash of brilliance that launched him from obscurity into the annals of science.

In fact, Bartusiak calmly puts to rest the idea that scientific advancements arrive in discrete packages marked by the word “Eureka!” Despite the legend of Archimedes, scientists aren’t usually struck out of the blue by the clear light of truth: "Answers did not arrive in one eureka moment, but only after years of contentious debates over conjectures and measurements that were fiercely disputed. The avenue of science is more often filled with twists, turns, and detours than unobstructed straightaways."

Her enjoyable book is an exhaustively researched exploration of both major and minor players. She points out that Hubble wasn’t the first person to suspect the great vastness. The Roman poet Lucretius thought it ludicrous to imagine a finite universe; the polymath mystic Emanuel Swedenborg mused that there must be ‘spheres’ beyond our own; Immanuel Kant correctly discerned galactic shape and suggested galaxies were scattered throughout space.

Many unsung heroes get a nod: Vesto Slipher clocked the speeds of distant spiral-shaped nebulae (later identified as other galaxies) and found that most of them were speeding away from the Earth—“a precocious intimation of the cosmic expansion that took many more years to fully recognize.” Henrietta Leavitt’s studies of variable stars made it possible to measure the distance between us and galaxies far, far away. Leavitt, one of many women hired to be a human “computer” at the Harvard College Observatory, died in 1921—four years before the Royal Swedish Academy of Sciences tried to nominate her for a Nobel Prize in Physics. (It wasn’t to be—Nobel nominees must be living.)

The book ends with something of a cliffhanger. Hubble and his fellow giants had found that not only is the universe filled with other galaxies, but these galaxies are retreating. Further investigation revealed that these “galaxies are not rushing through space but instead are being carried along as space-time inflates without end.” No matter which way we look in the sky, we see this vast universe rushing away from us. (In a way, that does put us back at the center of things, but only because every point in the universe is the center…)

Modern astronomy continues to tell us how far we are from the center—and that we still can’t comprehend the weirdness of reality. Ninety-six percent of the universe barely interacts with the protons, electrons and neutrons that make up our reality. (It’s most likely flowing through you right now, and there’s no way to know.) Astronomers are scanning the skies for the telltale spectrographic signatures of distant, rocky worlds that may harbor life. Other astronomers are looking for the “chameleon,” a theoretical entity that adjusts its mass to its environment and may help explain dark energy, the unknown goo that fills most of space. Also unsettling is the idea that our infinite universe is just one of an infinite number of universes that together form the “multiverse.” Is there no limit to enormousness?

(PS: The International Astronomical Union and the United Nations declared this year the International Year of Astronomy, an initiative that invites countries and institutions to step up their education of the public in all matters regarding the universe. Bartusiak’s book would be a great way to celebrate. So would a trip to Yale’s new planetarium, which offers free shows on Tuesday nights and, on June 12, a showing of “War of the Worlds.”)

Search Me

'Of course the company founded by Sergey Brin and Larry Page in 1998 - now reckoned to be the world's most powerful brand - does not offer any substitute for the originators of content nor does it allow this to touch its corporate conscience. That is probably because one detects in Google something that is delinquent and sociopathic, perhaps the character of a nightmarish 11-year-old.' -- Henry Porter, The Observer, 5 April 2009

Porter's article, which I found because two Facebook friends linked to it, resonated very tellingly after I attended a symposium, 'Library 2.0,' held at Yale Law School on Saturday, April 4.

After an intro that featured much 'lifted' film content; a bright, buzzword-laden welcome that urged us to Tweet and blog and upload photographs from our cellphones; and a paper by Josh Greenberg of the New York Public Library that celebrated the outreach potential of blogs, we finally got to a presentation, by Michael Zimmer of the Univ of Wisconsin-Milwaukee, that offered a few caveats to the collective zeitgeist of online über alles, invoking Neil Postman's notion of technology as always offering a Faustian bargain.

Given the need for the internet in contemporary communications, we might think Zimmer was simply playing devil's advocate or was a Luddite at heart, a throwback to the ancient days before we all went online. But not so: what Zimmer was really cautioning us about was all the unexamined consequences of our lemming-like acceptance of internet interaction. As librarians have at times had to stand up for civil liberties, like the right to privacy about one's intellectual inquiries and sources of information, Zimmer had reason to wonder if 'Library 2.0' -- the library as modeled on Google, essentially -- will continue to provide a 'safe harbor for anonymous inquiry.' Not simply 'who owns the content' of what we post -- but who owns the documentation, who gets to data-mine, and so forth. Ted Striphas, of Indiana Univ., extended this 'Big Brother is Watching' paranoia to Amazon's Kindle system, which relays its users' annotations, bookmarks, notes, and highlights back to 'the mothership.'

In the course of the day, there were several references to 'the Death Star': the four huge publishing conglomerates that now exist where twice that many major publishers existed a decade before. But the real 'Death Star' emerged when the topic of Google's digitization plans for out-of-print books was on the table in the day's last panel. Already we had heard, in an excellent presentation by John Palfrey of Harvard Law School, how 100% of a focus group of what he called 'digital natives' (those hitting 13-22 since the major internet wave of the late '90s) used Google to search for information and all went to the Wikipedia entry on the subject first. Though Palfrey didn't elaborate on this at the time, the point became clear in the Google discussion when Frank Pasquale, Visiting Professor at Yale Law School, spoke of the possible consequences of putting all our searches for information in the hands of 'proprietary black box algorithms subject to manipulation.' Wikipedia is always the first or second entry in any Google search. The first ten are apparently all anyone looks at. Everything that gets buried by the algorithm is as good as not there. This is not how research is conducted.

Then there's the question of all those out-of-print books. Obviously it would be to the public good to have them searchable and accessible online, if only because anything not online or available through Kindle (in other words, anything not part of the Death Star of Google and Amazon) falls into the 'here be monsters' territory of off-the-map ignorance. Already Jonathan Band, a lawyer, had told us that 'fair use' was becoming more accommodating of technological and creative appropriation, and Denise Covey of Carnegie Mellon University Libraries and Ann Wolpert of MIT Libraries had spoken about faculties pursuing an open access policy in which anything they publish can be searched and referenced online -- a blow to academic publishers, but a victory for the notion that research on the internet should not be hampered by commercial considerations.

In other words, the notion of open access to all information, via the internet, of complete 'transparency' of provider and user, was more or less the mantra of the day. But what the Faustian bargain came to seem finally was not with the technology itself, but with giants such as Google and Amazon as the Big Brothers playing Mephistopheles, offering us the interconnected, easy access world of our dreams, but a world where we sacrifice something of our own intellectual curiosity, restlessness, and desire to see outside or beyond that black box algorithm that makes things so easily manageable for us.

Think about how Wolpert pointed out that what made the MIT professors move for Open Access was their realization that, in the world of electronic text, libraries only 'lease' access to online work, rather than owning it like all those printed copies they store in perpetuity. If something happens to the provider or to the lease, all that material is no longer available. And now the publishing world seems poised to turn over all electronic control of out-of-print materials to Google to broker for us, and to disseminate to us according to its lights. As Brewster Kahle, co-founder of the Internet Archive, urged us to consider, there are alternatives. But as Ann Okerson, of Yale Libraries, said at the end of the final panel with a kind of 'fait accompli' finality: if Google accomplishes this digitization, the students and users of libraries at Yale will simply want access to it, and her job will be to work with it, not fight it.

'But it was all right, everything was all right, the struggle was finished. He had won the victory over himself. He loved Big Brother.' --George Orwell, 1984

Reviews, Reviews, Reviews

So much to talk about today, it's almost impossible to know where to start, so let's work backwards from what I last read… For years I've known of the achievements of Octavia Butler, who carries the distinction of being one of the few, if not the only, African-American female writers in the otherwise all-too-white and once upon a time all-too-male genre of science fiction. Butler's reputation, moreover, is stellar. She cleaned up in science fiction awards, Hugos and Nebulas among them, and even had the rare distinction for a science fiction writer of receiving a MacArthur Foundation "Genius" Grant. She died after a stroke at the relatively young age of 58, having authored some thirteen books in a writing career that spanned nearly 30 years.

Her last work before she died was a science fiction vampire novel, Fledgling, described by her as a "lark." At a minimum, let us say that it is any number of cuts above such fare as Stephenie Meyer's Twilight series, which I only know from DVD since I refuse to plow through the many thousands of pages of teen vampire angst run amok in the halls of our nation's high schools. Indeed, one wonders if Butler was not responding in part to this phenomenon that I have lovingly dubbed, for my teenage daughter, the "hickeys with holes" brand of fiction.

Fledgling is compelling. A child awakens in a cave, badly injured, in terrible pain, with no memory of her past and struggling to survive. Ravenously hungry, operating only on instinct, the girl, Shori, discovers that she is a 53-year-old vampire in the body of an 11-year-old child, a member of an ancient, anthropogenetic race known as the "Ina," who live alongside human beings. Shori's amnesia is a literary device that just borders on the trite for pumping up readers' feelings of suspense. But it's also an opportunity, in Butler's deft hands, to reimagine the human-vampire relationship as one of symbiosis instead of formal parasitism. What we get is Butler's latent utopianism, in which the idea of the family is reconfigured into a mixture of physical addiction and mutual dependence, open sexual relations and Western ideations of the village family unit.

But there's an added wrinkle: Shori, unlike all of her vampire relations, is black, purposely so, the result of experiments in skin pigmentation and Ina-human gene mixing. Presumably this should raise Fledgling to the level of allegory, a genre I generally favor when done right. But the material seems to get away from Butler, and what appeared so promising at its opening simply doesn't deliver on the possibilities suggested, an unfortunate result for a work that—as vampire novels today go—still surpasses its peers in depth and invention.

…..

If no one objects to my jumping around a bit for today's post, then let me pick up where my colleague left off by discussing a wonderful book by one of our own that has come into New Haven Review's hands.

When George Scialabba's What Are Intellectuals Good For? arrived at the doorstep, I was hooked. Right away I knew Scialabba would be my kind of intellectual, regardless of what intellectuals may or may not be good for. Gathered from the last two decades or so, this collection of essays and reviews raises the question in more ways than one. First and foremost is through the persona of the author himself, who is a public intellectual in perhaps the truest sense of the term. You see, Scialabba is not a professor at a major research university or a policy wonk at a think tank or a Gore Vidal-esque aesthete pontificating from an Italian villa or one of the liquid lunch crowd flowing in and out of the Condé Nast building. No, Mr. Scialabba appears to be one of those rarities: a working stiff whose vocation appears to have little to do with his avocation. When he's not busting Christopher Hitchens' chops or assessing Richard Rorty's contributions to American culture, he is presumably working budgets or dressing down contractors in his daylight existence as an assistant building superintendent. OK, I'll grant that even a plant manager at Harvard may have the advantage of proximity to some of the best minds in the country. But Harvard is hardly distinguished for its HVAC systems.

Scialabba, as a public intellectual, is part of a cultural tradition of thinkers who opt to keep their day jobs when no offers from MFA programs or think tanks are forthcoming. And, to be honest, that's something of a relief to me. It's probably no surprise then that Scialabba most admires those intellectuals whose qualities are defined less by their professional status than by the clarity and cogency of their writing, even when they're on the wrong side of an issue. As a result, What Are Intellectuals Good For? is a veritable who's who of publicly accessible intellectual discourse. Dwight Macdonald, Stanley Fish, Richard Rorty, Christopher Lasch, Alasdair MacIntyre, Irving Howe, and assorted others are the subjects of essays and reviews that are notable for their force of argument and precision of thought. There is nary a Continental thinker nor an American imitator to be found here.

There is a special fondness for the New York Intellectuals—Howe, Trilling, the rest of the Partisan Review crowd—in part for their achievements, in part for their apparent disdain of specialization and academicization. As a consequence, Scialabba's more recent heroes tend toward the plain-spoken and generally incisive: Russell Jacoby, Christopher Lasch, and Richard Rorty. Less admirable are the likes of Martha Nussbaum (too generic), Roger Kimball and Hilton Kramer (too conservative), and Christopher Hitchens (too crazy).

And yet whatever Scialabba's verdict, we'd do well to listen. He's often on point, even if you disagree, and quicker than most to get to the root issues in any writer's corpus of thought. But what really distinguishes this collection, especially the reviews, is how Scialabba lets the books and their authors take center stage. Too often in the big review venues, one gets the funny feeling that the reviews are more about the reviewer than the reviewed. Now, it would be mean-spirited to begrudge a reviewer his or her authorial voice: I can assure you Scialabba doesn't conceal his. But 4,000-word essays in which the title under review makes its grand entrance in the last two paragraphs do not always seem worth the price of admission. Reviewers with grand ideas and theories of their own are sometimes better off just writing their own books. Fortunately, Scialabba avoids this species of reviewing hubris.

But already I commit the very sin I deplore, too wrapped up in the sound of my own voice and not letting Scialabba's book take over from here on. But let me shamelessly plead the constraints of space and conclude on this note: What Are Intellectuals Good For? is, in a sense, the meditation of one deep-thinking critic on the work of other deep-thinking critics and their views of politics, social justice, and morality. In another sense, it is a reader's roadmap to some of the best cultural criticism written in the last half century. And in both senses taken together, it is a highly recommended starting point for anyone who cares deeply about this much-endangered species of criticism.

…..

So who the hell is Robert Levin? Well, there's always the Internet, where you can learn that he's a jazz critic, a short story writer, and a writer of music liner notes. He seems to have had his heyday here and there—a critical letter to the Village Voice that drew a year's worth of responses; a story named a "storySouth Million Writers Award Notable Story" in 2004.

That story is the title of a collection of Levin's writings. Dare I confess that I read this slim 90-page volume over the course of seven dog walks? (Yes, I can walk and read at the same time; I can also chew gum and type.) Let me add that it made for some of my more pleasurable dogwalking, an activity that is otherwise a dreadful bore. The reason is simple: Levin is funny. Leaving aside the eponymous lead short story, itself a ribald tale of mistaken identity and the sexual pleasures that can derive therefrom, the miscellany and commentary are laugh-out-loud grotesques, some weirdly Dickensian in their exaggeration of the mundane, others Jamesian in their syntactically elaborate transformations of the bizarre into the clinical or poetic. Only examples will do. In his screed "Recycle This!" on a recycling notice asking residents "to sort and…rinse [their] garbage before leaving it out," he writes: "So while I'll allow that self-immolation would constitute a disproportionate form of protest, I have to say that reacting with less than indignation to so gratuitous an imposition would also be inappropriate." That's a fairly ornate response to a recycling notice. Like I said, pure Dickens.

Or consider "Peggie (or Sex with a Very Large Woman)," a story so wonderfully offensive that it would be impossible not to relish the absurd attempt to poeticize the physical challenges set before Levin's narrator: "…Peggie's particular body could have served as a Special Forces training ground for the field of hazards and challenges it presented. I'm speaking of the twisting climbs and sudden valleys, the crags, the craters and the amazing plenitude of gullies, ravines and bogs that I was, and on my hands and knees, obliged to negotiate and traverse in my search." And don't even ask what he was searching for. You can probably guess.

In some ways, Levin is at his best wringing every drop of qualification from a feeling or thought, an instance of rage or fear, often in one long but densely packed sentence. The bathos of the stories and of some of the miscellany—there are cantankerous whines about cashiers and their stupidity, smoking bans, HMOs, aging, the aforementioned recycling notices—is actually what makes it all worth the reading. Levin, in essence, gets more out of the mundane through an overwrought prose style that is utterly apropos to the sensibility behind it.

But there's no substitute for the man himself, so let's conclude with his thoughts on when one of God's "natural wonders"—in this case a solar eclipse—fails to deliver the goods: "I'll allow that, however disappointing it may be, it's ultimately of small consequence when He mounts a shoddy eclipse. But it's something else again when, for one especially egregious example, He leaves you to blow out all your circuits trying to figure just where a mindless inferno of neuroticism like Mia Farrow fits into the notion that everyone's here for a reason." Consider my own circuits blown.

George Scialabba v’Eretz Yisrael

At long last, the critic is getting his due. I only met George once, about ten years ago, and I had forgotten how articulate he was in conversation; I was reminded by listening to his interview with Christopher Lydon on Lydon's web show (which is as good as anything on NPR). I did, however, want to take issue with one comment George makes—and I hope that my minor quibble will be taken in the context of the huge respect I have for George, who is an essential writer whose work you should buy and read. At one point, George takes a stab at explaining how many Jewish intellectuals moved rightward politically; his explanation, and it's not his alone, is that the 1967 war, when Israel's survival seemed to be at stake, caused many American Jews to become more attached to Israel, a country that until then had not been a major part of American Jewish consciousness, especially among intellectuals. Since then, he says, many Jews have been unwilling to follow their progressive principles if those principles might put them at odds with (their perception of) what's best for Israel's survival. And so we can understand how, for example, there was no large Jewish outcry about the invasion of Iraq, which they took to be in Israel's interests. (I hope this is a fair representation of George's position; I'm talking about one or two minutes in a 44-minute interview otherwise filled with fascinating discussions of Randolph Bourne, Walter Karp, and other too-forgotten intellectuals. If this is an unfair statement, I hope George will let me know in the comments section—although I understand if he has better things to do!)

On one level, George is of course right; in fact, he does not go far enough. Israel's remarkable victory in the '67 war not only heightened Jewish concern about the survival of their several million co-religionists in Israel, but it also—more important, I think—increased Jewish pride in identification with that state. Even my fervently anti-Zionist, left-wing grandparents were a little astonished at a country that had produced successful Jewish soldiers (or so my mother recalls). (And here I am reminded of the comedian Jackie Mason's line about the difference between Jews and Italians: Jews are wimps on the street corner, while Italians can f— you up; but put them in an army, and Jews are indestructible, whereas Italians can't shoot straight.)

But I think we have to know which Jews we're talking about. The Jews who became the famous neo-conservatives—Norman Podhoretz and Irving Kristol, most famously—were well on their way to the right before 1967, and they were swinging right in a way that was bound to sweep all their opinions to the right. Indeed, I have often marveled at the sheer, improbable drift of their move to the right—how can it be intellectually honest to just happen to move right on labor, foreign policy, economics, etc., all at the same time? Such a comprehensive move is more likely to be the result of cynicism or careerism. There is no reason, after all, why becoming more hawkish on foreign policy also entails becoming more hostile to labor unions. But with these guys, so it went, and there you have it.

Anyway, I don't think 1967 had much to do with where Kristol, Podhoretz, and Himmelfarb ended up. Nor did it have much effect on a lot of Jewish New Left types who were pretty irreligious to begin with, and Jewishly uninterested, and who make up one important core of anti-Zionism today. After all, while most Jews are not anti-Zionists, a lot of outspoken anti-Zionists and Israel critics actually are Jews. Jews may, in fact, be more disproportionately anti-Zionist than they are disproportionately Zionist, compared to the American population at large. Pretty much all Americans are, in their unthinking way, supportive of Israel—a goodly number of Jews, in a very thoughtful way, are critical of Israel. Especially among intellectuals, and that's whom Scialabba is talking about.

So who are these Jews whose foreign-policy ideas were warped, or subtly shifted, by the 1967 war? For whom was the war decisive in that way? The best case could be made, I think, by looking at my contemporaries (I am 34), rather than at neo-cons in the sixties and seventies. I would hazard that a lot of New Republic types (to just pick one useful marker), people like Peter Beinart, say (although there is no reason to pick on him, and he's written a lot about how his position on this has changed), were more inclined to support the Iraq invasion because of having grown up in a post-1967 world where the survival of Israel was an issue for young American intellectuals in a way that it wasn't for, say, my dad.

But I think what George was really getting at is a general despair, his and others', that the same people who have been central—indeed, indispensable—to so many other social-justice movements in America have seemed, to him, relatively absent on foreign policy. And that is a shame. But the causation isn't so simple.

Another point about Jews: most American Jews, even those who went Communist or socialist, have, in their own ways, been very supportive of the American project. This is, after all, the land that saved us from what had been happening, and what lay in wait, in Europe. So that deeply felt Americanism has been channeled into certain domestic progressive causes—like Civil Rights—where it is apparent that the United States is not living up to its ideals. And with our long tradition of women being at least moderately educated, and working outside the home (in the shtetl, scholars' wives often worked to support their husbands through endless hours of study), Jews were at the forefront of Second Wave feminism. And there was a history of labor radicalism that Jews brought from Europe. But Jews have not historically been pacifists, and we have been enthusiastic soldiers in every American war (including both sides of the Civil War). It may, therefore, be a bit of a mistake to read into that Jewish progressivism a congenital anti-war inclination. Yes, many Jews were at the fore of the anti-Vietnam movement (although perhaps not out of proportion to our representation on liberal college campuses, where the movement was centered). But it's not my sense that the leading pacifists in the Great War or World War II were Jews—they were Protestants, often of the Anabaptist or Radical Reformation stripe: Quakers, Mennonites, etc., with a smattering of Jehovah's Witnesses, and some more mainline Protestants.

So while it would be nice if there were a strong, identifiably Jewish foreign-policy left today, and in the run-up to the Iraq War, I am not sure that that was ever likely, or that there was a historical precedent, and I don't think its absence is as clearly related to the 1967 war as George Scialabba seems to think.

Artists' Books at Southern

What would William Blake have made of the Kindle, I asked myself the other morning during a private viewing of the Artists’ Book Collection at Southern Connecticut State University’s Buley Library. According to Tina Re, curator of the eighty-book collection, it was Blake who created the first artists’ books when he and his wife, Catherine, wrote, illustrated and bound books such as Songs of Innocence and Experience. Their venture beyond authorship into design, self-publishing and self-distribution remains a hallmark in the realm of artists’ books today.


“The artist’s book can be purely visual or text, one of a kind or an edition,” Re says, as we walk among tables of books as fans, venetian blinds, flags, accordions, and origami structures, to name a few. “It can take the form of the traditional codex – the book as we know it today, with binding on one side and pages in progression on the other side – or many others. Whatever the case, it’s very much about the artists having control over every aspect of the book.”

Wavy-haired and bespectacled, Re (rhymes with “Fey”) displays the quiet intensity you would expect from a Special Collections librarian. Under constant state budget constraints, she’s been amassing this collection for only three years. It serves mainly as a multi-disciplinary teaching collection for faculty and students, and is presently available to the public largely through private arrangement.

We stop at a book called Aunt Sallie’s Lament, a collaboration between poet Margaret Kaufman and Claire Van Vliet of Janus Press. With the words of a Southern quilter printed on thick, colored, uniquely shaped pages that layer and interlock with each other, the book is a harmonious blend of text, image, color, structure and craft.

Next we come upon a portfolio box housing Direction of the Road by Ursula K. Le Guin with Foolscap Press. It's the story of an oak tree standing in the road, and is a limited edition artist’s book incorporating anamorphic art, first recorded in the West by Leonardo da Vinci in 1485 in a drawing in his Codex Atlanticus. An anamorphic woodcut of an oak tree, intentionally distorted on one axis, becomes proportionate when we view it through an included standing cylindrical mirror. The reflection of the oak tree provides a backdrop to the book itself, which is printed letterpress on white linen paper with fabric leaves pressed into the paper.

“Listen when you read,” Re says to me. And as I turn the pages, the linen paper rustles, like leaves in a breeze. It feels as if we are standing directly below the tree.

I am hooked. I fully comprehend the urge to create a book in its whole, not just as author, but as designer, illustrator, papermaker, typographer, photographer, printer, calligrapher, and bookbinder.

Re said that she had similar responses after a recent open house of the collection. Students came up to her demanding to know more about bookmaking and printmaking, and talking about DIY, steampunk and the huge popularity of Etsy.com, the site to buy and sell handmade items. It appears that this generation, brought up on tactile-less text on computer, cellphone and Kindle screens, is starting to get back to the basics.

William Blake is surely resting easier in his grave.

Letter from New Orleans

Thinking about it now, I pause over the ramifications of moving from one new place to another over the past five years—from New York to New Haven and now to New Orleans. After years of banging around various locales in and around New York City, it wasn't too long after I moved to the Elm City that I was schooled by two locals on the question of emphasis when it comes to how you actually pronounce New Haven. New Yorkers, it seems, tend to put the emphasis on the NEW! and not the York. "I live in New York, not to be confused with Old York." But here, as Ideat Village impresarios Bill Saunders and Nancy Shea counseled one night, repeatedly, the emphasis, generally speaking, is on the Haven. Not NEW! Haven but New HAAAY-VEN. After a while I got it; you want to linger on the Haven a good long while, since it is a town that will grow on you. So I started emphasizing the Haven part of New Haven and was therefore able to live here for over four years. When it comes to New Orleans, hey, I'm having a hard enough time pronouncing some of these street names without worrying too much about where the emphasis ought to lie. Post-Katrina you could argue for NEW! Orleans, but that sounds like Chamber of Commerce-approved marketing of the most vanilla-cynical variety. Besides, the blessed and endemic lassitude of the Big Easy begs for a lingering over the Orleans. On the other hand, from a fact-checking point of view, there is an Orléans in France from which one can make a differentiation. So I have been worrying about it, but not too much.

The other day I was walking in the Marigny neighborhood of New Orleans, which, I have learned, is not pronounced “ma-RIG-knee,” and came upon one of those Volvo sub-wagons you see in every town, festooned with an abundance of bumper stickers citing various locales, pols and causes. Amid the Hillary! Stop War! and Peace is Patriotic stickers was one trumpeting the glories of . . . New Haven!—complete with an accompanying icon, a slice of pizza. The nudgenik in me thought, "Hey, that's wrong! None of the legendary places in town sells pizza by the slice! Outrageous!"

But seeing that bumper sticker, indeed the whole trove of them, did evoke yet another question of where to put one's emphasis, this time when it comes to the old canard, "Everything in moderation." It's one of the great cautionary aphorisms of all time, but that's only because most of us put the emphasis on the moderation. Embedded within lies the stunning and deeply gratifying notion that if you can swing the moderation, you can have everything!

And so there we were, the winter of 2008, myself and Bill and Nancy, and the writer Todd Lyon. By then I knew how to tell people I lived in New Haaaay-ven and it had stopped bothering me that my friends from New York would always want to know how I was enjoying Hartford. In any case, we had an agenda. Oh, such a one as it was! We would hit four pizzerias in one night. We'd start at Zuppardi's in West Haven, get the double-dose at Sally's and Pepe's, and then grab a capper pie at Modern.

A slice here, a slice there, no gastro-problems would ensue if we paced ourselves; that was the plan. Alas. Immoderation won out in Wooster Square and we never made it to Modern. As for BAR, we kept it off the list. Too new, relatively speaking, to qualify for the tour.

Boston's Neat Graffitist vs. New Haven's.... Random Acts of Text

A Short Tribute to Selected Artiness I Remember from the 1980s, and a Hearty Recommendation of a Novel by Eric Kraft

When I was in high school, someone—I have no idea who—went around town putting up posters that said "New Haven is the Paris of the 1980s."

This was completely untrue, but it just slayed me and my friends. Every few years or so, I end up in a conversation with someone who was around then, and we go, "Remember the 'New Haven is the Paris of the 1980s' guy? Who the hell was that?" and then we laugh and have another beer.

In the 1990s I read all the available work by the sadly underrecognized literary genius Eric Kraft. I'd read his Herb 'n' Lorna, fallen madly in love with it, and begun to eat my way through the rest of his books. The one I liked best, and which is probably still my second favorite, is called Reservations Recommended. It is a sad comic novel (I know that sounds impossible, but trust me, it isn't) about a guy who lives a boring life working for a big company, but has an alter ego who is a restaurant critic in Boston. A lot of the novel is this guy's observations about Boston in general, and many of those observations focus on someone he calls the Neat Graffitist. The Neat Graffitist, actual identity unknown, goes around Boston neatly magic-markering the town with random statements, "in small, precise capital letters," such as:

NEVER FEAR PAIN. TIME DIMINISHES IT. BUT AVOID BOSTON CITY HOSPITAL. NURSES THERE WEAR USED UNIFORMS PURCHASED FROM BURGER KING, TREAT PATIENTS WITH FATALISTIC DETACHMENT.

There are many parts of Reservations Recommended that I reread with deep pleasure, just reveling in the wonderfulness of it, but the words of the Neat Graffitist are some of my favorite parts of the book. My husband and I are especially fond of this one:

TO HERBERT: YOU WERE BORN ONCE AND NOT TWICE AND WHEN YOU ARE DEAD YOU WILL BE DEAD FOREVER. GIVE ME BACK MY WATCHES. THEY WILL NOT MAKE YOU HAPPY. THEY ARE NO DEFENSE AGAINST DEATH.

The Neat Graffitist belongs in a category, I feel, with the "New Haven is the Paris of the 1980s" creator, along with the person who spent time writing pithy little sweet nothings on masking tape and putting them on parking meters downtown around 1984–85. I vividly remember strolling around downtown with a friend who noticed this and chortled: "Uh-oh—someone's getting arty with the parking meters." It was almost certainly a Yalie, but still pretty entertaining.

So whoever the "New Haven is the Paris of the 1980s" guy (or, okay, girl) is, thanks and hats off to him/her; also to the Masking Tape Artist of 1984. Where are you now? Do you even remember doing these things?

Meanwhile, Eric Kraft's been getting rave reviews for his recent fiction, but it's not likely you've actually acted on your fleeting thought, "Gee, I should pick that up sometime and read it." Listen to me. Start with Herb 'n' Lorna if you can; if you can't, with Reservations Recommended. Act on the fleeting thought. Fleeting thoughts are our friends.

Summary Observations: The Movie

Aside from intellectual property attorneys, who really knows where to get good movie ideas? Julie & Julia, due in theaters this August, is Nora Ephron's movie of Julie Powell's memoir (originally a blog) of the year she devoted to making every recipe in Julia Child's famous cookbook, Mastering the Art of French Cooking. Starring Amy Adams as Powell and Meryl Streep as Child, it is said to be the first wide-release movie developed from a book developed from a blog developed from a cookbook. And it just goes to show that potential entertainment properties are lurking everywhere. What most interests me, though, is its implied confidence in the supremacy of storytelling. If this film succeeds, it might inaugurate a whole new cinematic subgenre of movies dramatizing the doing of things described in instructional books.

Is an adaptation of Harold Bloom's How to Read and Why finally at hand? If so, what would it require? Perhaps the enterprising screenwriter might invent some twenty-something everyman, poised on the brink of self-actualization, and crosscut his intellectual development with telling formative vignettes from the life of Bloom.

Already I can picture our young, book-addled hero, sitting in an uncomfortable chair and contemplating “the vagaries of our current counter-Puritanism,” with the camera swirling and the music swelling around him; or standing by his apartment window, gazing out into the dusk and bearing in mind that “Irony will clear your mind of the cant of the ideologues, and help you to blaze forth as the scholar of one candle.” At first I had Jason Schwartzman in contention for the part, but now I’m seeing Michael Cera.

So OK, it’s looking like this will be a Ron Howard picture, dumbed down just enough for mainstream safety and perhaps controversial in its casting of Tom Bosley as Bloom (certain members of the blogorati having lobbied in vain for Martin Landau). A box-office success? Maybe. An Oscar magnet? Well, sure, as long as it gets across the notion that “We read not only because we cannot know enough people, but because friendship is so vulnerable, so likely to diminish or disappear, overcome by space, time, imperfect sympathies, and all the sorrows of familial and passional life.”

And if that doesn’t work out, there must still be a good movie to be made from How to Complain for Fun & Profit: The Best Guide Ever to Writing Complaint Letters, by Bruce Silverman. Or at least from The Garden Primer, by Barbara Damrosch.

Finding the War

It is common to hear that part of what contributed to victory in World War II, and the overwhelming sense that it was the right thing to do, was that nobody at home knew how awful it was for the soldiers fighting it abroad. For many years now, historians and journalists have been editing the popular story about the war, revealing its singular brutality and the myriad motivations that led the powers that be to fight it as they did. This has led to a bit of a crumbling of World War II's image as America's last good war, due both to those hoping to complicate its simple popular moral equation and to those hoping to give a clearer picture of just what the soldiers' sacrifice entailed. Yet the idea that, from 1939 to 1945, the people were sheltered from the war's horrors persists—an idea that I found myself questioning when I read Ernie Pyle's Brave Men, a collection of the war correspondent's dispatches from 1943 to 1944, when he covered the war in Italy and then the Allied push through France.

The 1944 edition of the book is a fascinating artifact of the time period: Along with the copyright, an eagle-emblazoned seal states that this is a Wartime Book: "books are weapons in the war of ideas," the sash fluttering from the eagle's beak proclaims. The seal goes on to state that "this complete edition is produced in full compliance with the government's regulations for conserving paper and other essential materials," and it's easy to see the result: It's printed on very thin paper (that nonetheless has held up remarkably well—even wartime cost-cutting seems to have produced a better-quality book than today's mass-market paperback printers do) and with a clear regard for cramming as much text onto the page as possible without rendering it illegible.
For me, a grandchild of those who lived through World War II, the effect of the book is to recall the stories I'd heard of who in my family served in the war, what they did, and where; and also, what the lives of the women who stayed behind were like, waiting for their husbands to come home, raising small children who had no recollection of their fathers.

Pyle, like the cartoonist Bill Mauldin, is celebrated for his honest yet dignifying depictions of the soldiers that he met, and Brave Men certainly gives you a lot of that. But Pyle's vision of the war does more than that: By giving us what he saw in Europe at the height of combat, and by making the soldiers he met human—naming them, talking about the meals he shared and the combat he experienced with them—Pyle undermines the idea of World War II, or any war, as good. Consider his description of an air battle he witnessed from the ground:

Someone shouted that one of the planes was smoking. Yes, we could all see it. A long faint line of black smoke stretched straight for a mile behind one of them. And as we watched there was a gigantic sweep of flame over the plane. From nose to tail it disappeared in flame, and it slanted slowly down and banked around the sky in great wide curves, this way and that way, as rhythmically and gracefully as in a slow-motion waltz. Then suddenly it seemed to change its mind and it swept upward, steeper and steeper and ever slower until finally it seemed poised motionless on its own black pillar of smoke. And then just as slowly it turned over and dived for the earth—a golden spearhead on the straight black shaft of its own creation—and disappeared behind the treetops. But before it was down there were more cries of, "There's another one smoking—and there's a third one now." Chutes came out of some of the planes. Out of some came no chutes at all. One of white silk caught on the tail of a plane. Men with binoculars could see him fighting to get loose until flames swept over him, and then a tiny black dot fell through space, all alone.

Or his description of soldiers approaching a firefight:

The men didn't talk amongst themselves. They just went. They weren't heroic figures as they moved forward one at a time, a few seconds apart. You think of attackers as being savage and bold. These men were hesitant and cautious. They were really the hunters, but they looked like the hunted. There was a confused excitement and a grim anxiety on their faces.

They seemed terribly pathetic to me. They weren't warriors. They were American boys who by mere chance of fate had wound up with guns in their hands, sneaking up a death-laden street in a strange and shattered city in a faraway country in a driving rain. They were afraid, but it was beyond their power to quit. They had no choice. They were good boys. I talked with them all afternoon as we sneaked slowly forward along the mysterious and rubbled street, and I know they were good boys. And even though they weren't warriors born to the kill, they won their battles. That's the point.

He goes on to describe all of them—their names, their addresses, and an epigram about them that makes them suddenly, startlingly real ("his New England accent was so broad I had to have him spell out 'Arthur' and 'Auburn' before I could catch what he said"; "Eddie was thirty, he was married, and used to work in a brewery back home; he was a bazooka man, but his bazooka was broken that day so he was just carrying a rifle"). Always giving them their dignity.

I like to think it was this dignifying impulse, and not the work of censors, that made Pyle use a kind of synecdoche when he described the beaches at Normandy just after the fighting was over. He lets the soldiers themselves tell you what the actual assault was like, the things that happened to them then, but he never pretends he was there with them. And when he takes a walk along the beach himself, he tells us that "men were sleeping on the sand, some of them sleeping forever. Men were floating in the water, but they didn't know they were in the water, for they were dead." But he lingers much longer on the mangled machinery, "empty life rafts and soldiers' packs and ration boxes, and mysterious oranges," and at last, "in a jumbled row for mile on mile were soldiers' packs." He goes on:

There were socks and shoe polish, sewing kits, diaries, Bibles, hand grenades. There were the latest letters from home, with the address on each one neatly razored out—one of the security precautions enforced before the boys embarked.

There were toothbrushes and razors, and snapshots of families back home staring up at you from the sand. There were pocketbooks, metal mirrors, extra trousers, and bloody, abandoned shoes. There were broken-handled shovels, and portable radios smashed almost beyond recognition, and mine detectors twisted and ruined.

There were torn pistol belts and canvas water buckets, first-aid kits, and jumbled heaps of life belts. I picked up a pocket Bible with a soldier's name in it, and put it in my jacket. I carried it a half a mile or so and then put it back down on the beach. I don't know why I picked it up, or why I put it back down again.

There are other things he sees that day, ironic and funny and pitiful and heartbreaking. And though Pyle himself never questions why they fought—if no war is good, fighting against Nazi Germany was certainly just—the overwhelming impression you get from the soldiers is that they're just trying to do their jobs and then get back home. If they believe in the cause, it's not so much for the lofty reasons that came out of the mouths of politicians, but because what they went through damn well better have meant something.