Reviews

All of it is Autobiographical

Rafael Yglesias’s new novel, A Happy Marriage, is wholly autobiographical, a fact which may interest some readers, this one included. Yglesias, a novelist and screenwriter who lost his wife, Margaret, to bladder cancer after nearly 30 years together, tells the story of a novelist and screenwriter, Enrique, who, after a long, happy marriage, loses his wife, Margaret, to bladder cancer. The novel alternates chapters between the couple when they first meet and at key points in the marriage, and their final three weeks together as Margaret makes the decision to take herself off intravenous feeding and bid farewell to family, friends and, of course, Enrique/Rafael. I was engrossed and delighted with the book. Reading it, though, I couldn’t help wondering whether what I knew about the author (as fully disclosed on the book flap and in the author bio) informed my reading, and if so, to what extent. Did I find the characters compelling because I automatically assumed the writer’s authority over them? Did I make allowances for contradictions and inconsistencies in characters because they sprang from real people? What did the known link between the writer and his material do for me as a reader? Did it lend a certain verisimilitude? Why is verisimilitude even necessary for me in a novel? Is truth indeed stranger than fiction?

When asked in a recent NPR Fresh Air interview why he didn’t simply write a memoir à la Didion’s The Year of Magical Thinking, Yglesias immediately clarifies that it wasn’t because he wanted to provide any “cover” for himself. Indeed, the protagonist Enrique as written is at times selfish, impotent, and unfaithful. Rather, Yglesias continues, he wanted to tell the story of a marriage and keep the reader very present in that marriage. Thus, he chose to use the fictional devices of dialogue—conversations as he remembered them from 30 years ago—and compression.

I like this thin line between novel and memoir. Lately, I find a resistance, perhaps even an aversion, toward fiction. Is it ego? I feel that my own life and head are so busy that I resent extending my attention and sympathy to invented characters; I want real ones, or at least ones based upon real people. However you label it, fiction or nonfiction, it all comes down to story. I read James Frey’s A Million Little Pieces because it was a memoir. When all hell broke loose, I couldn’t understand the uproar. He told a damned good tale, so what difference did it make whether it was all true or not? We all know stories are subject to embellishment. Frey would have saved himself a lot of trouble if, like Yglesias, he’d only called his book an autobiographical novel.

Conquest of the Useless

By Werner Herzog; translated from the German by Krishna Winston (Ecco/HarperCollins, 2009)

In the annals of moviemaking catastrophe--from Apocalypse Now to Cleopatra to Heaven’s Gate to Waterworld--perhaps no famously troubled production has been more copiously documented than Werner Herzog’s Fitzcarraldo.

Maybe it’s because, in that case, the making-of really is more interesting than the movie itself. Or maybe it’s because they tell the same story. Fitzcarraldo is a tale of one man’s nearly ruinous obsession with bringing opera to the Amazon jungle. Its backstory is a tale of one man’s nearly ruinous obsession with the first man’s obsession. So the annotation of Herzog’s 1982 movie, much of it from the filmmaker himself, just seems to flow like a--well, like a great, majestically indifferent tropical river.

You’ll find it in Herzog’s commentary on the Fitzcarraldo DVD. And in his 1999 documentary My Best Fiend, about his nutso leading man and nemesis Klaus Kinski. You’ll find a lot of it in Les Blank and Maureen Gosling’s exceptional documentary, Burden of Dreams, whose Criterion Collection DVD edition even comes with a book gathering Blank and Gosling’s journals from their experience of Herzog’s production. And now you can read the maestro’s own journal of the event, Conquest of the Useless: Reflections from the Making of Fitzcarraldo, originally published in 2004 and newly available in English from Ecco Press.

In his preface, Herzog writes: “These texts are not reports on the actual filming--of which little is said. Nor are they journals, except in a very general sense. They might be described instead as inner landscapes, born of the delirium of the jungle. But even that may not be entirely accurate--I am not sure.”

Uh, OK. And after 306 pages, he doesn’t seem much surer. Could anyone else get away with this? The book covers a very dreamlike two and a half years, through which Herzog remains mesmerized by his own restless tenacity. Only the most committed readers will do likewise, of course, but that’s exactly how the empathy of obsession is supposed to work.

Herzog’s narrating voice is an acquired taste. (Here’s his entry from July 20, 1979, in its entirety: “San Francisco. Emptiness.”) But you already knew that. The real fun to be had with Conquest of the Useless is in the cross-referencing.  Blank’s account of April 12, 1981, for instance, begins with instant coffee and vultures perched on a hotel roof. Herzog’s begins with a drowned workman and whiskey and card games. Consensus: Doom is in the air.

Those of us who remember Herzog’s comments on the obscenity and “overwhelming misery” of the jungle in Blank’s film, or his assertion that “I love it against my better judgement,” at last can have this clarification, of sorts, from April 14, 1981: “The Grand Emotions in opera, often dismissed as over the top, strike me on the contrary as the most concentrated, pure archetypes of emotion, whose essence is incapable of being condensed any further. They are axioms of emotion. That is what opera and the jungle have in common.” The next day, according to Blank’s account, “He expressed his intention to end his life if he failed to complete the filming.”

Rest assured, he did complete the filming--and apparently has yet to complete processing the experience of completing the filming. Maybe he never will.

Honor, thy father!

About a year ago, I wrote a review of The Bishop’s Daughter, Honor Moore’s memoir of her father, the late Episcopal bishop of New York, Paul Moore. The review never ran, but the recent release of the book in paperback prompted me to return to it, and I still think it contains some points worth making. So here it is; read on:

In the strongest sense, literature has no ethnicity, of course. Beloved is not African American, even if its author is; Studs Lonigan is not Irish-American, even if its author is. Art, that’s what they are. But for the sake of shorthand, and to describe our acquired tastes, we do use ethnic language for literature (and so Portnoy’s Complaint is obviously Jewish, to take a familiar example). By those arbitrary standards, The Bishop’s Daughter is a distinguished contribution to the very small genre we might call WASP confessional. Other writers have delivered the juicy, clam-baked goods, dishing on the sex and drugs and general dissolution hidden behind the brownstone walls, but the most notable of these works have usually been fictional, if just barely: from Edith Wharton to John Cheever, all can be told if no real names are used. In The Bishop’s Daughter, Moore quite plainly has decided that the old rules aren’t just old—they’re dead.

The book is thus instructive as an example of how meaningless ethnic literary categories are becoming, if they ever mattered at all. Having decided there’s nothing to be said for her tribe’s traditional discretion, Moore can yank her bisexual father, who died in 2003, quite rudely from the posthumous closet and write of her mother’s descent into mental illness, of her own abortion, of lesbian affairs, and of straight affairs too numerous to keep straight. Much of the book’s compelling scent is the strong whiff of transgression. It’s the odor of dirty sex coming off those sheets of paper. Who writes like this about her dead father’s sodomite tendencies? Who besmirches the church this way? Certainly not a Radcliffe alumna descended from a founder of Bankers Trust! Thank God few of her father’s St. Paul’s classmates are alive to see this. Moores just don’t do this.

That was, in any case, one way to read the message of several anguished letters that Honor Moore’s siblings wrote to The New Yorker after the magazine published an excerpt from The Bishop’s Daughter in March 2008. But what they actually spoke of was common, not aristocratic, decency. “With moving elegiac sentiments, my sister Honor Moore has outed my recently deceased father, Bishop Paul Moore, against his clearly and often stated will,” Paul Moore III wrote. “Many of her siblings were astonished when she decided to do so. Our family resembles many others in that we presume a natural confidentiality as we share our struggles in life.” Osborn Elliott, the former editor of Newsweek and a neighbor of Paul Moore’s in Stonington, Conn., added his two, acerbic cents: “Writing about what she learned growing up as a daughter of Bishop Paul Moore, Jr.—and later about his secret life—Honor Moore seems to have forgotten the Fifth Commandment: Honor thy father and thy mother.”

I for the most part agree with Honor’s younger siblings and with Mr. Elliott: it’s a bad thing to write a book like this. Not because it libels the dead: her father was, in this case, quite guilty and therefore not libeled in any case. And not because it violates class protocol. Rather, there has to be a very good reason to go against the wishes of those friends and other family who would rather not see their beloved exhumed for the purpose of Amazon rankings. And Moore’s defense, straight from the canons of romanticism, does not cut it: “I came to understand that my own sexual development was inextricably tied up with my father’s complicated erotic life, and…I thought that story important for me to understand. [B]ecause I was a writer, understanding meant telling.” “So you have to write this for your integrity?” Moore is asked. “Yes,” Moore answers.

Nonsense, I answer. And I say that as a writer, one with the same good notices and poor sales as Honor Moore. Our integrity cannot require us to hang Daddy’s dirty laundry in public, nor to ignore the feelings of our siblings. (As another eldest sibling in a large family, I am particularly galled by Moore’s sororal irresponsibility.)

Meanwhile, however, the book is very good. The language is lovely, showing Bishop Moore vividly in all the stations of his cross: as the prep school boy slowly coming to Jesus, the worker priest serving the poor yet conflicted about his own family’s wealth (and about his lust for men), the sad, widowed father of nine children, the nationally famous left-wing bishop of New York City in the 1970s and ’80s, and the elder pastor, outed to his family and hoping for some portion of their compassion. The book is not only beautifully written—a published poet, Moore will surely be remembered for this exercise in prose—and anthropologically interesting, taking us inside that world where possessions are rarely bought but always had, where every friend and lover has a summer home, and where practically the only Jew to be seen is “Arnold Weinstein,” a Portnoy-figure who honors the much younger Honor by making love to her “in daring, experimental ways.” It is also theologically profound, making a powerful case that Paul Moore’s progressive episcopate depended on his homosexual urges. He was an enlightened clergyman because of, not in spite of, what he believed was his sin-darkened heart.

In his daughter’s telling, Paul Moore appears to have been that rare creature, a genuine male bisexual. As a bachelor Paul Moore had courted more than one woman, and Honor, using her parents’ letters, reconstructs for us the winding road that led to Jenny McKean’s triumph over the competition. Then, beginning at least in seminary, already married, Paul Moore was having gay relationships. He continued having gay sex throughout his marriage. But when he and Jenny separated in 1970, probably because she knew about his affairs, they agreed to see other people…and soon, Honor later discovered, he was “dating no fewer than five women.” After Jenny’s death from cancer in 1973, a grieving Moore connected with at least one old female love but soon was remarried to a new love, Brenda Eagle. He seems truly, if inexplicably, besotted with his second wife, a falling-down drunk who wastes none of her small capacity for kindness on her stepchildren; but the marriage does not, at least, seem like a cynical arrangement meant to maintain a public persona. Meanwhile, Moore keeps his long-term male lover, abandons him when Brenda finds out, then goes back to him after Brenda’s death. He also goes back to women, taking at least one lover shortly before his death. (He told me about her when I interviewed him in 2002.) Long after he was out of the public eye, when he had no reputation to uphold, and when his children all knew about his gay past and present, he continued to love and make love to both men and women.

Honor Moore is very sensitive to the nuances of her father’s complicated sexuality, and she never tries to fix his erotic life to any theoretical matrix (his sexuality is never “on a continuum,” for example). She lets the facts speak for themselves, and saves her interpretation for the relationship between those sexual facts and his ministry. First, Honor notes, the overriding desire in Paul Moore’s life was not sexual but pastoral. He wanted to serve God in a very specific way: not as a theologian or church educator or deacon or choir director, but as an Episcopal priest of the traditional parish kind. That meant, in his estimation, having a wife, not just or even mainly for appearance’s sake, but rather because he would need a helpmeet in serving God. “Eventually,” Honor writes, “he found himself in love with my mother, his misgivings about her and his other desires subordinate to his quest for a partner in the life he was becoming more and more determined to pursue, a life in the church.” Attracted to both men and women, he chose to settle down with a woman, and as a young bride that woman helped him feed the poor and shelter the destitute in their parsonage in Jersey City; their joint ministry became a model in the church for engaged social action.

Honor seems to believe that the will does have some sway over the libido. Not only does her father choose women, but Honor herself, after a rocky time with men, loves only women for a long time, then returns to men. In this view, it is plausible that the bishop chose his double life in part because of the kinship it would give him with the suffering. Moore’s first great causes were justice for the Negro and for the poor man, and he was as far as can be from either. He did, however, have his own burden—homosexual love—and it’s one that gave him a sense of otherness, of what it was to be the Invisible Man.

“As my father lived his sexuality with men, it certainly was ‘something else,’” Honor writes, “something that moved beneath the surface of the life he lived with his wives, with his children, with parishioners and colleagues; something that moved between the interstices of language in the charged realm of desire, of imagination, of relationship with the unseen, informing his theology and compassion.” What’s more, if he “had disclosed that existence to his wives and children, he would have had to give up one life or the other….” This is not the time-worn drama of the tragic closet-case. Rather, Honor is arguing that her father’s refusal to choose between two worlds, even in old age, when gay rights were a fact of the world, and even at great cost to honest relations with his family, was the crucible in which his special Christian charity was forged.

That’s not to say that Paul always saw his bisexuality as a blessing. He was a man of his time, ashamed of his same-sex attraction, and he could be blunt about what he saw as a terrible failure. He did not valorize gay love as some sort of manly, Platonic ideal; to the contrary, he saw it as inferior to what a man shared with a woman. “It was an addiction,” he once told Honor. By contrast, “I loved your mother, and I love Brenda.” And at a time, the late 1960s, when other preachers were preaching a “situation ethics” that might allow for extramarital sex, Moore was slow to give up the belief that “all sexual activity outside marriage was per se sinful,” as he once wrote.

But of course that unflagging sense of rectitude contributed to Moore’s suffering, and therefore may have made him an even finer pastor. In 1969, James Stoll, the first openly gay Unitarian minister, compared the plight of homosexuals to the plight of blacks in America. “[T]here are many different groups of ‘Niggers’ in this country,” he wrote. “Mexican Americans, poor people, women, and yes, homosexuals.” Moore would never have preached in such off-color language, but he would have been in intuitive agreement with Stoll. “But what of the suffering?” Honor writes. “It was my father’s sacrifice and his gift. It was, as he had once told Andrew Verver”—his longtime lover—“what kept his ministry alive, what made his faith necessary.”

What made his faith necessary. The late twentieth century was not a good time for liberal religion, and certainly not for mainline Protestantism. The old establishment churches, the Congregationalists and the Episcopalians, hemorrhaged membership. All around Paul Moore, people like him were losing faith; Honor is quite typical in having rejected the church of her patrician ancestors, the church of her dad. But Paul Moore remained a believer; his faith did not waver. Was it being bisexual that made his faith not only possible but necessary? Without it, would he have been just a rich old white man with a sentimental side and a soft spot for high-church ritual? Did the poor and benighted whom he served have his “addiction” to thank—for his willingness to lead them into the light, for his table where he fed them?

When Paul Moore told his eldest daughter, “It’s come out that I’ve had gay affairs,” he quickly added that “it is not public, and…you are NOT going to write a short story about it.” Honor was disoriented, and one of her first thoughts was, “Doesn’t he know I don’t write fiction?” She sure doesn’t. This is an exercise in confessional, the kind her father knew only in a liturgical context. We cannot know how her father would have felt about this fine exercise in non-fiction, and we can only wonder if her siblings will ever forgive her. But for those interested in what makes even some flawed men great, what makes them give their lives over to an ideal that leads them to serve others, this book offers a fresh, provocative answer.

Summer Lovin’--in a Flashy 19th-Century Sort of Way

At the beach this week, my friend was reading Music for Torching by A.M. Homes. When she finished the novel, she couldn’t get her dramatic internal monologue to turn off. She confessed the novel left her narrating her life with a similar sort of agonizing ennui. She said it was something like: “Okay, it’s time for dinner.” She hated the way he swung the dishtowel over his shoulder like he’d actually been the one cooking dinner for the last eight years! Or:

“Great. Let’s go.” And for that moment, she believed they could love each other.

Flopped down there as I was on the beach, I was so happy to have an adventure novel to dig into. My beach book was packed with drama, to be sure, but was light on the simple-sentence quips between white suburban depressives. I turned to my yellowed little paperback Flashman in the Great Game. There I could give myself up to that randy ol’ rascal Sir Harry Paget Flashman of the “Flashman” series by George MacDonald Fraser.

The series came about in the 1970s, and the books are brilliant. The novels are chronological memoirs told as the found diaries of Sir Harry. (Fraser based his character on Tom Brown’s bully at Rugby School from the 1857 Tom Brown’s School Days.) The memoirs are artfully written, each book packed with forty or fifty encyclopedic footnotes offering geographic and biographical addenda for further historical reading. And they are saucy and witty as hell. The novels take us through Harry’s missions in India, Crimea, the slave-trading United States, Germany, and back again to Russia. In short, he emerges as the lucky and yet hexed hero of nearly all of the major wars of the 19th century.

What’s fun about reading Flashy is the novels’ absolute cheek in the face of feminism, heroism, patriotism, and religion. Flashman fancies himself a Victorian victor, and yet few who meet him fail to see through his brazen charade. Our hero is a confessed womanizer, whoremonger even, and an absolute coward in the thick of battle. He’d gladly throw a drugged, naked woman off a sled in Siberia to save his own skin from the Cossacks. In his own words, he’s "a scoundrel, a liar, a cheat, a thief, a coward—and oh yes, a toady." In The Great Game, Flashy manages to tell off a Christian zealot better than any ethicist, “roger” the princess of Jhansi (an Indian province in 1852), and escape execution by his own English army--all in a mere 300 pages.

I’ve been bingeing on Flashy--I’ve plowed through five of the series’ 12 books in the last two months--and have bought the first book, Flashman (about the First Anglo-Afghan War), for most of the readers in my family. (That makes me feel a bit odd, because the novels are littered with anglophile/intellectual/farcical sex scenes in which Flashman is unabashedly base and fervent. And yet--my dad loves them!) Best of all, to my mind, these books are a sort of adventurous and historical antidote to the likes of Music for Torching, books that remind us of our suburban monotony and cliché hairdos. I highly recommend going along for a ride with Sir Flashy.

I Hate My Generation

I hate my generation, I offer no apologies
I hate my generation, yeah
--Cracker

My recent visit to the Metropolitan Museum of Art’s show “The Pictures Generation, 1974-84" made this song, from 1996, leap to mind. Interestingly, in her review of the show for the Village Voice, Martha Schwendener bookended her take on the show with two generational markers: a comment from a seventy-something viewer (‘I have no appreciation for this’) and an anecdote by Robert Longo (one of the artists in the show) in which a 15-year-old asked if he got the idea for his Men in the Cities drawings from an iPod ad. Schwendener uses these bookends to talk about how art is generational, marked by what’s in the air at a given time; art can be distanced from us by the convictions we’ve spent a lifetime acquiring (the seventy-year-old), or as immediate as our own ignorance (the 15-year-old).

I appropriate Schwendener’s opening as my opening because I’m supposedly ‘of’ the ‘generation’ being represented in this show. Which is to say that the artists represented, born for the most part from the mid-40s to the mid-50s, are from five to fifteen years older than I am. In 1977, when the Pictures show was up at Artists Space, that seminal event from which this exhibit, curated by Douglas Eklund, takes its name, I graduated from high school. So these artists are my elders in the way that older siblings and such can be: annoying in their know-it-all cool, their endgame of art as no longer having ‘aesthetic’ quality, no longer being something specifically made as an ‘art object,’ but rather something concocted from images and existing only as commentary on the ubiquity of image, both as something we look at everywhere, as spectators and voyeurs, and as something that shows us ourselves, as reflection and simulacrum.

This, as almost every commentator on the show has underscored, is the first generation to come of age with TVs in the home. And that, we’re made to think, has made all the difference. Though why television should spell the death of the aesthetic object is another one of those mysterious givens of art history, as for instance when it became clear, to use Wallace Stevens’s line, that ‘it must be abstract.’ We can rehearse the reasons why -- point to Abstraction, point to Conceptualism and Minimalism, point to Pop Art -- and then sum up why the only self-respecting response to the ubiquity of Madison Avenue, the moneyed little brat that it is, is to appropriate it, thus making images of its images. Only this time with irony.

Fine. I can accept that. It was 1977, after all. Disco . . . Punk . . . New Wave, you get the idea. And, what’s worse, these people were all recently in art school. Let them have fun, let it rip . . . never mind the bollocks. But however much one might have been sympathetic to the stance at the time, something rankles when these brittle disquisitions on the staging of objecthood and send-ups of the mechanisms of attention, generally known as The Tube, get appropriated by The Museum and then hoisted onto walls where formerly masters of their medium had hung.

Is there a sense in which these artists -- Cindy Sherman, Robert Longo, Sherrie Levine, David Salle, Louise Lawler, Barbara Kruger, to name but a few -- are masters? Yes, if we mean ‘masters of an idiom,’ ‘masters of manipulation,’ for how else explain the manner by which these largely ephemeral works have become ‘permanent’ as art objects? Where once these artists might have protested The Tube’s appropriation of virtually every image, The Artworld’s appropriation of every possible style, The Museum’s appropriation of every ‘aesthetic object’ so-called (da Vinci to Duchamp, etc.), offering their appropriations as flick-offs of the Pop Art/Minimalist aesthetic that, to quote Saint Andy, was all surface or mere object, they have now, via this show, appropriated The Museum, appropriately enough. Because this day had to come. But as with the idea of exhibits in The Rock’n’Roll Hall of Fame, I can’t help feeling that everything about the art that gave it charm and brio has become a casualty of its enshrinement.

And what’s maybe even worse than that is the fact that the note-cards in the show play back to us the tired tale of epic struggle, not against the Artworld, or even against the bourgeoisie, but against The Image, and yet one is left wondering in whose name such struggle takes place. If it’s the artist decrying advertising, the idea was not new at the time and certainly not accurate to the world of the artist after Pop Art; if it’s the more thinky pleasure of deconstructing the Image, it’s not clear who can really receive this message. For there’s no evidence, in all this surfacedom, that anyone here can think outside the Image or that they would want to. When Sherrie Levine takes pictures of Walker Evans pictures she trades on the fact that those photographs are already known images. And her images of those photos become objects, but objects whose only purpose is the ancillary role of objectifying imagehood. When we watch clips of the TV show Wonder Woman we might hear an internal voice asking us: why was this televised, what does it say about the medium, and about the advertisers who paid to air it, and about those who tuned in -- or at least we might in an exhibit cataloguing eras of broadcast television -- but the question I ask myself here is what makes this objectification of an Image salient, provocative ... enduring?

The big gun of the show, we’re told by everyone, including Barry Schwabsky, whose critical take on the show for The Nation I’m mostly in agreement with, is Cindy Sherman, and it’s true that, in rooms of forgettable stuff, her images of herself as ‘forgotten film stills’ are already ‘unforgettable.’ But is that because they really do look like images of movies we may have seen once upon a time, or is it because we have seen these images before -- in the aggrandizing use of Sherman as the poster child of self-exploitative manipulation of gender signals in which every ‘look’ is aimed to see/show a cliché? Striking as these photographs might have been in their day, they now seem already to be like Walker Evans’ photos chez Levine, now objectified by the curatorial effort to tell a story in which the Image of the object (here ‘woman,’ the great absent signifié) reigns. And Sherman’s sad one-shot psychodramas are the best way to ‘reflect’ that.

The reason the seventy-year-old has no appreciation for this, we may say, is because a good part of her life was lived before The Tube changed forever the meanings of looking and watching and being seen ... enviable woman, she existed before The Image was everything. But the reason I have no appreciation for this is that I don’t see why The Museum has to capitulate to The Tube, nor why my looking at things and beings (odd that I should think such may be found in the world I live in, independent of images of them) has to be inflected by ersatz renderings of more commercial mediums (TV, magazine ads, pop music) for the sake of art history, and po-mo art history at that. For if the grand narrative was already kaput when this stuff first surfaced -- and these artists were cool with that in their glib image-happy heyday -- then the deflating irony comes in when we realize that, without those art-critic gestures to the narratives of Pop Art and Objectivism, these particular images mean -- to borrow another line from a song -- less than zero.

Let's Get Radical

A decade-and-a-half ago, somewhere in the far reaches of cloudy memory, a friend told me a wonderful story that went something like this: There was a political radical who had come to some unnamed municipality to agitate for the rights of its local black population. However, instead of the usual grist of petitions and protest marches, he embraced more disruptive methods laced with a good dose of humor. One particular action involved purchasing a hundred theatre tickets for an upcoming, nearly always white-only attended play and giving them to members of the black community whom he was then representing. Before entering the theatre building, the group feted itself with a meal notable for its preponderance of baked beans. Needless to say, the event's malodorous results—and the threat of more such actions—changed how the municipality's cultural centers treated its minority populations, namely for the better.

I forgot that story until this weekend, when I picked up Saul Alinsky's Rules for Radicals, published in 1971 by Random House (under the keen eye of its legendary editor-in-chief Jason Epstein). I didn't realize the story came from Alinsky's handbook for how to stir the political pot until I was over 100 pages in. Before I even came to the story itself, a sneaking feeling that I was in familiar territory had crawled up on me. Ten or twenty pages later, there it was: the scene, the Rochester Opera House in Rochester, New York; the instigator, Alinsky himself, the famed Chicago community activist and protégé of the great CIO leader John L. Lewis; the bad guys, Eastman Kodak, the University of Rochester, and Rochester City Hall; the cause of all this trouble, the race riot the year before that had paralyzed a city in which the community of stupefied white residents had assumed that, because there had been no such previous riots, all was right in their little world.

But tendrils of unconscious memory were not the reason I plucked the volume off the bookshelf of friends whom I was visiting in Chicago this weekend. No, the reason I was intrigued was the well-publicized fact that Alinsky's work had served as a model for Barack Obama's community activism in Chicago—hardly a surprise given Alinsky's long history of organizing in that city, where Obama worked for nearly a decade and has lived for over two.

In terms of sheer efficacy, there has never been a presidential campaign like that organized by Obama's brain trust, David Axelrod and David Plouffe. But many also attribute the training regimen and organizational keenness of the operation to Obama's own experience as a community organizer, skills he reapplied to the many thousands of campaign-focused community organizers his team churned out with such painfully meticulous efficiency. (The best piece ever on the Obama campaign's organization was authored by Zack Exley for the Huffington Post.)

Given the unique character of the campaign, Obama's community organizing background, and the influence of Alinsky's work and writings on Obama, there were those who argued that perhaps Republican campaign managers and organizations ought to turn a few pages in Alinsky's book and take notes. After all, Democrats had schooled themselves in the Republican playbook after repeated defeats during the Bush years. Surely Alinsky might shed some light on the wonders of the Obama machine.

Well, it does shed light, but not the kind I expected. At first, my assumption had been that, after a few preliminary remarks, Rules for Radicals would just dig in with a flurry of techniques and tactics—and, to a certain extent, it does. But it does more, in ways that I am still digesting. In brief, after Alinsky's prologue, the second chapter lays out the groundwork for an ethics of means and ends that out-Machiavellis Machiavelli by taking apart the old moral saw that "ends don't justify means." In Alinsky's dictionary, this is the very definition of foolishness. While he makes a noble effort to reformulate an ethics in which "particular ends justify particular means," the 11 rules that he in fact asserts make it hard to see how he hasn't merely updated Machiavelli for modern circumstances. Even when Alinsky tries to hem in his "any ends"-"any means" philosophy with such bottom-line provisos as "as long as it does not violate human dignity," it's weak tea, at best. Here are Alinsky's rules, recast in simpler English than the pseudo-mathematical language of professional philosophers that he adopts for no good reason:

  1. The more closely involved you are in the conflict, the less justification of means and ends matter.
  2. Ethical evaluations of means and ends depend upon the relation of your political position to them.
  3. In war, ends will justify almost any means.
  4. Means and ends can never be adequately judged in hindsight.
  5. The more means available for accomplishing an end, the more room there is for ethical considerations of them.
  6. The less important an end is, again the more room there is for ethical concerns.
  7. Success or failure is a strong determinant of the ethics of means and ends.
  8. The imminence of success or failure, victory or defeat, narrows any ethical considerations of means.
  9. The opposition will always cast effective means as unethical.
  10. Do what you can with what you have and clothe it with moral arguments.
  11. Use popular ideas and catch phrases to justify ends.

Here Alinsky liberally mixes the descriptive with the prescriptive, skipping the distinction for the hardened reality, based on his experience fighting large corporate interests on behalf of the underprivileged, that what is and what ought to be matter little when the rubber hits the road. Alinsky is playing to win, and probably goes even further than Machiavelli in recommending masking one’s methods with rhetoric (see rules 10 & 11). In fact, to gain community participation in an action, he shows absolutely no qualms about having supporters do, as he sees it, the right thing for the wrong reasons. For Alinsky, it’s always war, especially when the forces arrayed against you—corporations and their cadres of union-busting lawyers; city halls and their platoons of bureaucrats—will not be giving you any quarter.

Alinsky’s manifesto is a guide to political streetfighting, lessons that were not learned by the Gore or Kerry campaigns but were clearly absorbed by Obama’s. Notwithstanding the seeming noblesse oblige of his campaign—as opposed to the messy bomb-throwing that characterized the McCain camp—it was all a street fight, from beginning to end. Alinsky, for example, recognizing how little real power “have-nots” can bring to bear against “haves,” strongly recommends a kind of ju-jitsu (he has a chapter called “Hoist the Enemy by His Own Petard”) that the Obama campaign took to heart, almost encouraging (yes, encouraging!) the McCain campaign to wallow in its own muck.

Did Obama take the high road in his campaign? He did…and didn’t (see Rule 10 again). All that tut-tutting and wink-and-nod ridicule, as if all of us together couldn’t help but shake our heads at how foolish the McCain campaign acted, was just Alinsky-esque karate chops to the back of the neck as McCain and Palin careened forward with their misplaced drop kicks. Even Machiavelli would have to smile.

Occasional Paper #1: Rudolph Delson Reviews the Official GED Practice Test

This post marks the release of the New Haven Review's first occasional paper; as the title suggests, we expect to put out more such papers, well, occasionally (though we have more in the works right now). Why an occasional paper, you may ask? I answer: why not? In this occasional paper, novelist and essayist Rudolph Delson, a lawyer by training, reviews the Official GED Practice Tests (Steck-Vaughn Co., $21.95). No, it's not mean. And no, it's not smarmy. What is it, then? Download and find out. And let us know what you think—both of Delson's piece and of the idea of occasional papers generally.

Cruciverbalize This!

Puzzling as a sport was not a feature of my father’s love of the crossword. He enjoyed them thoroughly, but there was no fanaticism in his play, and thus neither stopwatches nor blasts of indignation at seemingly disingenuous clues or specious puns. He was a cruciverbalist—the technical moniker for the habitual crossword solver—in the most traditional of senses, solving at his leisure or on a lunch break. Moreover, he liked doing them in ink and all caps—both no-no’s according to Stanley Newman.

Midnight Picnic

By Nick Antosca (Word Riot Press, 2009)

Bram pulls into the parking lot half asleep and the crunch of gravel under his tires becomes the crunch of bone. Something screams.

The old deerhound that lives at the bar—it’s pouring tonight and he didn’t even see her.

That crunch.

He gets out of his dented Pontiac, hunches against the downpour. He doesn't want to look. It's 3:30 A.M. and the bar is dark. No light to see by except the Pontiac's headlights, ghostly cones of white slashed by rain.

He kneels to look under the car.

Nothing.

"Baby!" he yells, getting up. "Where are you?"

Movement off in the darkness, on the other side of the car. The deerhound, dragging herself away. She looks less like a dog than a man in a dog suit, huge, crawling across the gravel. He goes to her side.

"Oh, I'm sorry, I'm sorry,"—his voice splintering—"Hold on, let me look."

The damage is catastrophic; the dog will die.

Not quickly, though.

These are the opening paragraphs of Nick Antosca's Midnight Picnic, a short and terrifying book that I read a few months ago now and just can't get out of my head. The plot follows Bram, an aimless young man living in West Virginia who finds the bones of a murdered child. Hours later, the dead child finds Bram and asks him to help avenge his death. What follows would be, in the outline of the plot, part ghost story, part revenge story, except that the experience of reading it is less like either narrative and more like having a waking nightmare.

I don't use the word terrifying lightly. As anyone who's been to a bad horror movie knows, scaring people is not easy. Do it wrong and it's boring, or maybe just kind of disgusting, or worse, unintentionally funny. (That the line between horror and comedy is so thin and blurry is one of the reasons, I think, that the horror-comedy has blossomed into such a delightful genre.) Do it right, though, and you tap into the fear that early humans must have felt when the sun went down and it began to rain, and they were huddled in a group under a tree that did not provide shelter, and they knew that predators were coming for them. For me, the first two-thirds of The Shining do that (though not the final third, which becomes boring); perhaps all of 28 Days Later and much of Clive Barker's stuff does, too.

But Midnight Picnic's particular brand of scare reminds me most of David Lynch, who pulls horror from simple elements—lighting, sound, costume, a good line, clever camera work—capturing with eerie effectiveness the experience of having a very bad and extremely compelling dream. Antosca's own use of such dream logic is the best I've come across in a long time. There are a few missteps—at one point, about halfway through the book, Bram interrogates the dead child in a way that very nearly breaks the spell—but here I'm just quibbling. I could give you passage after passage of the images and conversations that engrossed and frightened me, but I don't want to ruin them.

Also, and most impressively, Antosca manages to give his story what many horror tales never even reach for: heart. Yes, Midnight Picnic is scary. But it's also, keenly and unexpectedly, touching and tragic; for underneath the ghosts and revenge is another story about a boy looking for his father, but not being quite ready for what he finds.

Seidel'd

One of my more interesting reading experiences last fall was provided by Frederick Seidel's Ooga-Booga (2006). I don't know much about Seidel except he's rich, was born in 1936, published his first book of poems in 1962, and didn't publish another book until 1979. His Collected Poems, 1959-2009 was released a few months ago. I'm hoping to dawdle through it this summer. Whatever we expect a poet to show us, it's rare that he shows us a lifestyle to which only that elusive 5% of the population with 37% of the wealth are accustomed. In Seidel's case, as in "Barbados," there is an outrageous tendency to be as rancid as anything he might witness. Poets with political axes to grind do, of course, give us glimpses of brutal acts and consequences to jar us out of our literary complacency. But Seidel somehow seems to suggest that all he's grinding is his pencil, to make it sharper. Whatever the outcome of the chaos we live in, he seems to shrug, I was there.

But what makes his writing so hard to fathom is its childlike simplicity. Or, rather, its simplicity is so arch, so tongue-in-cheek, so craftily artless, that one always waits to be slapped or jabbed by the inevitable line that arrives with all the specific, precise density -- drowning in acid -- of Robert Lowell or T. S. Eliot when they suddenly drop the right phrase into its inevitable place.

Huntsman indeed is gone from Savile Row,
And Mr. Hall, the head cutter.
The red hunt coat Hall cut for me was utter
Red melton cloth thick as a carpet, cut just so.
One time I wore it riding my red Ducati racer -- what a show!--
Matched exotics like a pair of lovely red egrets.
London once seemed the epitome of no regrets
And the old excellence one used to know
Of the chased-down fox bleeding its stink across the snow.
--"Kill Poem"

Yeah, and "a savage servility slides by on grease." To me, the echoes of Lowell's "Skunk Hour" dance through a poem that strikes me as a Charles Bukowski poem for an uncannily different demographic. But Bukowski came to mind while reading Seidel, not only for the "fuck you if you can't take me" ethos that these poems exude, but for a sense of the poem as the only possible response to a life of this tenor. Once your lines become this spare, they spare nothing.

But look at how the diction does whatever it wants -- the beautiful balance of line 3 ends with that hanging "utter" that is itself pretty damned utter. And then the "what a show!" interpolation in a flash makes speaker and poem as cartoonish as anything -- or at least as any inconceivable commercial for Ducati racers(!) could be. Then the "matched exotics" of "egrets" and "regrets" so funny and so baldly bad, as we veer into "the old excellence" that ends with a line worthy of Lowell and an image that suddenly brings in the death and blood that lurks so smugly behind all our diversionary tactics. Gee.

What I like about Seidel is the way he plays our banalities back at us, but first subjects them to a sea-change that causes the acrid brine of his own peculiar vision to cling to them:

The young keep getting younger, but the old keep getting younger.
But this young woman is young. We kiss.
It's almost incest when it gets to this.
This is the consensual, national, metrosexual hunger-for-younger.
--"Climbing Everest"

What is said is what anyone commenting on how the rich old court the fresh young might say -- but it would be said in a wagging finger way, or at least with mockery of the jaded, fading oldster trying to ignite himself via youth. But Seidel says it with a kind of rueful surprise at being the oldster accepted by youth in his "hunger-for-younger." In other words, it's not jaded at all, but almost charmingly surprised by the mores of "almost incest," where the words "consensual, national" do the job of making both old and young part of a machine that operates simply because it operates. "My dynamite penis / Is totally into Venus" Seidel quips, the intonation of youth appropriated by age to make the sex act partake of "the moment" as, we tend to think, only youth can. The insinuation of the poem -- that such sex acts, like that Ducati racer, are grandiose acts of death-courting -- never stops asserting itself after that first verse of foreplay, and each joke gets a little edgier, stripped of any self-satisfaction, but gripped by the vanity of vanitas, which is to say that being vain is a vain endeavor, that the grave is grave, and that "the train wreck in the tent" is addicted to all the tender mercies he can get.

Judging by Ooga-Booga, Seidel is an acquired taste that I'm on my way to acquiring, because his poems confront me in a way that the poets I end up living with for a while do. Bring on those Collected Poems.

The street where I live . . .

I have been thinking about turning what I wrote about my street, West Rock Avenue, into a book, and so I have been doing a lot of reading about urbanism, town planning, and architecture. Basically, I am trying to figure out what makes some streets livable and others not. A good deal of the literature — by people including New Havener Philip Langdon, whose A Better Place to Live has given me a whole new outlook on what makes a space a happy one in which to dwell — boils down to this: don't depend on cars. People are happier when they can walk to see neighbors, ride their bicycles, and live close enough to their neighbors that they know them. This small-town mythology is one that I am particularly susceptible to, having grown up in a neighborhood that had many of a small town's virtues. And I find myself, as I read these books, falling prey to an unfortunate smugness, as if growing up on streets laid out on an easily navigated grid, with houses on quarter-acres instead of large lots, is the only way to have a happy childhood.

But that can't be right. For one thing, this mythos runs contrary to another important American mythos, the rural farm. I don't think many of us would want to say that children growing up in the countryside, learning to milk cows by their parents' sides, are unhappy. Nobody thinks that that's an uninspiring or despairing way to grow up. And, to be fair, the writers I'm reading aren't reacting against that way of life, which may be dying out; they are reacting against suburban sprawl, which seemed poised to dominate the American landscape.

But what of that suburban sprawl — especially those cul-de-sac developments that have proved so popular in late-20th-century construction? Can one have a happy childhood where there are no sidewalks, where it's too dangerous to ride a bicycle, where there are no secret passageways behind garages or corner stores at which to buy candy?

I don't know. On the one hand, I don't want to underestimate children's capacity for self-mystification. I suspect that most children, at least most of those who grow up middle-class, and sheltered from anything too abysmal in the family's home life, look back at their early years with a certain sense of awe and wonder. Those lookalike houses in Del Boca Vista Estates are not lookalike to the children inside them, who know which house has the best video-game system, which kid has the dad who makes the best forts with the dining room table and some blankets, whose parents go out late and don't hire a babysitter (all the better for watching verboten TV channels).

On the other hand, there is empirical evidence that suburban life of this kind can lead to bad things: obesity, too much time in the car, fewer friends, less play. And teenagers—forget about it. If they can, they flee to the city. Or at least the curious ones do.

But what I don't have are good sympathetic non-fiction books about life in suburban sprawl. For every book critical of that way of life — Langdon's book, Duany et al.'s Suburban Nation, Ray Oldenburg's The Great Good Place — there seem to be exactly zero books about why it can be pleasurable to grow up in spaces that are, after all, safe, predictable, and quiet, which are all good things.

I want the other side of the story. Ideas, anyone?

New Haven in the 21st Century...?

Driving through the flooded streets of Long Wharf this wet week, then ditching the car to hike up my skirt and trudge barefooted through filthy knee-high water in front of Union Station -- all to catch a Metro-North commuter train to Manhattan -- I flashed on a day when life would seem less at odds with nature. Usually, I’m not a green utopist, but a visually stunning and provocatively written new book, Mannahatta: A Natural History of New York City (Abrams), is turning me on to the ecological origins and conceivable futures of cities. I’m so taken with this book that I’ve bought several copies as gifts (perfect for Father’s Day) and am planning to present it to my daughter’s grade school to adapt into the curriculum -- that’s how important I think the ideas are.

Here’s the premise: even a city as developed as Manhattan began as a natural landscape of forests, trees, rivers and streams. In 1609, when explorer Henry Hudson first came upon Mannahatta -- the “island of many hills” to the Lenape tribe -- it teemed with flora and fauna among fifty-five ecosystems that “reused and retained water, soil, and energy, in cycles established over millions of years.” Back then, Mannahatta supported a human population of three hundred to twelve hundred. Today, Manhattan is home to millions of people on a planet of billions. Yet, as author Eric Sanderson argues, it’s a “conceit” to think any city and its inhabitants -- no matter its technological and economic development -- “can escape the shackles that bind [us] to our earthly selves, including our dependence on the earth’s bounty and the confines of our native place.”

Scholarship and research aside, Mannahatta is a fun read. Full-page color photos of today’s Manhattan juxtaposed with photographic visualizations of 1609 Mannahatta open up, centerfold-like, throughout the book. Author Sanderson, a landscape ecologist with the Wildlife Conservation Society, is adept at making complex scientific issues accessible to the lay reader. Charts and maps show habitats of species, distribution of fauna, and even the location of beavers in the vicinity of today’s Times Square. City buffs can pore over bathymetry and topography maps. I laughed aloud when I learned that the spot where the bronze bull stands at Bowling Green (near the New York Stock Exchange) was once a hill twenty feet high.

By 2050, the majority of people on earth will live in cities, and they will have to become greener and more hospitable if they are to continue being the vibrant cultural and business centers they are today, as well as comfortable places to live in. In the last chapter, “Manhattan 2409,” Sanderson addresses basic human needs – food, water, shelter, energy – acknowledging that changes in infrastructure will most likely come piecemeal. Waterways and greenways will slowly replace beltways and avenues. Sanderson proposes new green buildings with lizard-like second skins that can both shade and insulate, open and close depending upon the season.

As to this week’s flooding at the New Haven train station? The next time the city renovates Union Street, they might use permeable paving materials that capture rainwater below the surface. Or perhaps, we can add a “green roof” -- a thin layer of soil planted with grasses and flowers that slows water flow and cools a building -- atop the New Haven Police Station? I can always dream.

Note: September 12, 2009, marks the 400th anniversary of the arrival of Henry Hudson in New York Harbor. See also the exhibition Mannahatta/Manhattan: A Natural History of New York City at the Museum of the City of New York.

“My upbringing was pretty weird,” says David Bowie's son

I know. You're thinking, "No WAY." But sure enough. Or so Duncan Jones, the artist formerly known as Zowie Bowie, told the New York Times last week.

Jones was recalling the formative years during which his father introduced him to the likes of George Orwell, J.G. Ballard and Philip K. Dick, and let him hang around the set on movies such as Labyrinth. The occasion for these revelations was a publicity push for Jones' feature film directorial debut, Moon--an impressive piece of work, not least because its general disposition is so steadfastly down to Earth.

Sam Rockwell stars as a near-future moon base laborer who for three years has spent his days alone mining the lunar soil's rich supply of helium-3, with which his far-flung corporate overseer claims to be solving Earth's energy crisis. Alone, that is, until an entirely unlikely visitor arrives and turns out not to be good company.

In recent years, Rockwell has been building a fine body of work by wondering how men live with themselves, and Moon is all about that. It’s hard to discuss in detail without giving the whole plot away, and of course the plot--developed by Jones with screenwriter Nathan Parker--is pretty ridiculous. Let’s just say that it takes place on the mysterious frontier between space madness and corporate malfeasance, and that my disbelief was suspended.

I like the movie’s peculiar personality, its way of being a functional assembly of nice touches--like Clint Mansell’s driving score, or the deliberate tactility of the production design, or the obligatory omnipresent talking computer being voiced by Kevin Spacey, whose performances always strike me as facsimiles of humanness anyway.

Most of all, I like that it's not ever too peculiar. As a conscious throwback to the unabashedly philosophical, pre-CGI science fiction of Jones' youth, Moon also has just enough astronomical distance from his famously spaced-out dad. If we want to call Jones' good taste an inheritance, we should allow that so, too, is his discretionary independence.

Marcia Bartusiak and the day we disappeared

Broadly speaking, the history of astronomy reads something like the story of how we humans have discovered our insignificance in the cosmos. In the last two thousand years, major discoveries about the solar system, our galaxy and the universe have shuffled the likelihood of our existence deeper and deeper into the realm of improbable chance and fortuitous coincidence. In the big picture, we're barely a pixel. I suppose it's only unsettling if you start out assuming that human beings are, quite literally, the center of the universe.

Two thousand years ago, Aristotle stepped up and said the Earth sits at the center of an eternal, unchanging universe; our planet is surrounded by celestial spheres most easily observed by the movements of the Sun and planets. People believed Aristotle and moved on with their lives. In the 16th century, Copernicus stepped up to the plate, swung, and blasted the Earth out of the center. He explained the motions of the planets by invoking a heliocentric view of the cosmos. The sun, not we, sat in the middle. Earth had officially been displaced as the center of everything—at least in theory. (And Copernicus himself was displaced. Last November, archaeologists finally found his long-lost remains buried under the floor of a cathedral in Poland.) In the 17th century, Galileo improved upon the design of a Dutch engineer to build a telescope, with which he delivered good evidence in support of Copernicus. In this case, good evidence made the Dominicans mad, and Galileo was rewarded by the Catholic Church with house arrest (albeit a very comfortable house arrest) for the rest of his life.

And so on—each new discovery places humanity further from a central position. The greatest decentralizing act of our time may have occurred on New Year’s Day, 1925—the eponymous "day" in the title of science writer Marcia Bartusiak’s latest offering, The Day We Found the Universe. Bartusiak is an experienced and award-winning writer, a fellow of the AAAS and a former Knight fellow at MIT. (I feel fortunate to count myself among her former students.)

On that day in 1925, astronomers were gathered in Washington, D.C., for the annual meeting of the American Association for the Advancement of Science. On the third day of the meeting, someone stood up, gave a talk, and changed everything. Edwin Hubble, a 35-year-old stargazer working at the observatory atop Mount Wilson in California, had found conclusive evidence that the Milky Way—our beloved home—was not the only galaxy in the cosmos. (Hubble, fearful of sullying his reputation, didn't even present his own findings.)

It’s easy to be blasé about the impossible-to-comprehend infinitude of the cosmos now; after all, almost everyone alive today grew up surrounded by the knowledge that the Milky Way is no loner. And we science junkies, with our varying deficits of attention, learn, marvel and move on, looking for the next big thing. Bartusiak, however, writes about that blustery day in the nation’s capital with the infectious excitement of a giddy astrophile:

In one fell swoop, the visible universe was enlarged by an inconceivable factor, eventually trillions of times over. In more familiar terms, it’s as if we had been confined to one square yard of Earth’s surface, only to suddenly realize that there were now vast oceans and continents, cities and villages, mountains and deserts, previously unexplored and unanticipated on that single plug of sod.

Or here:

The Milky Way, once the universe’s lone inhabitant floating in an ocean of darkness, was suddenly joined by billions of other star-filled islands, arranged outward as far as telescopes could peer. Earth turned out to be less than a speck, the cosmic equivalent of a subatomic particle hovering within an immensity still difficult to grasp.

That’s just the beginning: In Day, Bartusiak lovingly and meticulously traces the origins and development of a big idea. Hubble’s name is familiar to most of us mainly because of recent news about the space telescope that bears his name, but he doesn’t really show up in the book until two-thirds of the way through—a structural choice that demonstrates Hubble stood on the shoulders of many, many giants. His name may be forever associated with the discovery of the universe, but his finding was no instantaneous flash of brilliance that launched him from obscurity into the annals of science.

In fact, Bartusiak calmly puts to rest the idea that scientific advancements arrive in discrete packages marked by the word “Eureka!” Despite the legend of Archimedes, scientists aren’t usually struck out of the blue by the clear light of truth: "Answers did not arrive in one eureka moment, but only after years of contentious debates over conjectures and measurements that were fiercely disputed. The avenue of science is more often filled with twists, turns, and detours than unobstructed straightaways."

Her enjoyable book is an exhaustively researched exploration of both major and minor players. She points out that Hubble wasn’t the first person to suspect the great vastness. The Roman poet Lucretius thought it ludicrous to imagine a finite universe; the polymath mystic Emanuel Swedenborg mused that there must be ‘spheres’ beyond our own; Immanuel Kant correctly discerned galactic shape and suggested galaxies were scattered throughout space.

Many unsung heroes get a nod: Vesto Slipher clocked the speeds of distant spiral-shaped nebulae (later identified as other galaxies) and found that most of them were speeding away from the Earth—“a precocious intimation of the cosmic expansion that took many more years to fully recognize.” Henrietta Leavitt’s studies of variable stars made it possible to measure the distance between us and galaxies far, far away. Leavitt, one of many women hired to be a human “computer” at the Harvard College Observatory, died in 1921—four years before the Royal Swedish Academy of Sciences tried to nominate her for a Nobel Prize in Physics. (It wasn’t to be—Nobel nominees must be living.)

The book ends with something of a cliffhanger. Hubble and his fellow giants had found that not only is the universe filled with other galaxies, but these galaxies are retreating. Further investigation revealed that these “galaxies are not rushing through space but instead are being carried along as space-time inflates without end.” No matter which way we look in the sky, we see this vast universe rushing away from us. (In a way, that does put us back at the center of things, but only because every point in the universe is the center…)

Modern astronomy continues to tell us how far we are from the center—and that we still can't comprehend the weirdness of reality. Ninety-six percent of the universe barely interacts with the protons, electrons and neutrons that make up our reality. (It's most likely flowing through you right now, and there's no way to know.) Some astronomers are scanning the skies for the telltale spectrographic signature of distant, rocky worlds that may harbor life. Other astronomers are looking for the "chameleon," a theoretical entity that adjusts its mass to its environment and may help explain dark energy, the unknown goo that fills most of space. Also unsettling is the idea that our infinite universe is just one of an infinite number of universes that together form the "multiverse." Is there no limit to enormousness?

(PS: The International Astronomical Union and the United Nations declared this year the International Year of Astronomy, an initiative that invites countries and institutions to step up their education of the public in all matters regarding the universe. Bartusiak's book would be a great way to celebrate. So would a trip to Yale's new planetarium, which offers free shows on Tuesday nights and, on June 12, a showing of "War of the Worlds.")

Reviews, Reviews, Reviews

So much to talk about today, it's almost impossible to know where to start, so let's work backwards from what I last read… For years I've known of the achievements of Octavia Butler, who carries the distinction of being one of the few, if not the only, African-American female writers in the otherwise all-too-white and once upon a time all-too-male genre of science fiction. Butler's reputation, moreover, is stellar. She cleaned up in science fiction awards for her novella Bloodchild, landed a Nebula for Parable of the Talents, and even had the rare distinction for a science fiction writer of receiving a MacArthur Foundation "Genius" Grant. She died in 2006 from a stroke at the relatively young age of 58, after authoring some thirteen books in a writing career that spanned nearly 30 years.

Her last work before she died was a science fiction vampire novel, Fledgling, described by her as a "lark." At a minimum, let us say that it is any number of cuts above such fare as Stephenie Meyer's Twilight series, which I only know from DVD since I refuse to plow through the many thousands of pages of teen vampire angst run amok in the halls of our nation's high schools. Indeed, one wonders if Butler was not responding in part to this genre, which I have lovingly dubbed, for my teenage daughter's benefit, the "hickeys with holes" brand of fiction.

Fledgling is compelling. A child awakens in a cave, badly injured, in terrible pain, with no memory of her past and struggling to survive. Ravenously hungry, operating only on instinct, Shori discovers that she is a 53-year-old vampire in the body of an 11-year-old child, a member of an ancient, anthropogenetic race known as the "Ina," who live alongside human beings. Shori's amnesia is a literary device for pumping up readers' suspense that borders on the trite. But it's also an opportunity, in Butler's deft hands, to reimagine the human-vampire relationship as one of symbiosis instead of formal parasitism. What we get is Butler's latent utopianism, in which the idea of the family is reconfigured into a mixture of physical addiction and mutual dependence, open sexual relations and Western ideations of the village family unit.

But there's an added wrinkle: Shori, unlike all of her vampire relations, is black, purposely so, the result of experiments in skin pigmentation and Ina-human gene mixing. Presumably this should raise Fledgling to the level of racial allegory, a genre I generally favor when done right. But the material seems to get away from Butler, and what appeared so promising at its opening simply doesn't deliver on the possibilities suggested, an unfortunate result for a work that—as vampire novels today go—still surpasses its peers in depth and invention.

…..

If no one objects to my jumping around a bit for today's post, then let me pick up where my colleague left off by discussing a wonderful book by one of our own that has come into New Haven Review's hands.

When George Scialabba's What Are Intellectuals Good For? arrived at the doorstep, I was hooked. Right away I knew Scialabba would be my kind of intellectual, regardless of what intellectuals may or may not be good for. Gathered from the last two decades or so, this collection of essays and reviews raises the question in more ways than one. First and foremost is through the persona of the author himself, who is a public intellectual in perhaps the truest sense of the term. You see, Scialabba is not a professor at a major research university or a policy wonk at a think tank or a Gore Vidal-esque aesthete pontificating from an Italian villa or one of the liquid lunch crowd flowing in and out of the Condé Nast building. No, Mr. Scialabba appears to be one of those rarities: a working stiff whose vocation has little to do with his avocation. When he's not busting Christopher Hitchens' chops or assessing Richard Rorty's contributions to American culture, he is presumably working budgets or dressing down contractors in his daylight existence as an assistant building superintendent. OK, I'll grant that even a plant manager at Harvard may have the advantage of proximity to some of the best minds in the country. But Harvard is hardly distinguished for its HVAC systems.

Scialabba, as a public intellectual, is part of a cultural tradition of thinkers who keep their day jobs when no offers from MFA programs or think tanks are forthcoming. And, to be honest, that's something of a relief to me. It's probably no surprise, then, that Scialabba most admires those intellectuals whose qualities are defined less by their professional status than by the clarity and cogency of their writing, even when they're on the wrong side of an issue. As a result, What Are Intellectuals Good For? is a veritable who's who of publicly accessible intellectual discourse. Dwight Macdonald, Stanley Fish, Richard Rorty, Christopher Lasch, Alasdair MacIntyre, Irving Howe, and assorted others are the subjects of essays and reviews that are notable for their force of argument and precision of thought. There is nary a Continental thinker or American imitator to be found here.

There is a special fondness for the New York intellectuals—Howe, Trilling, the rest of the Partisan Review crowd—in part for their achievements, in part for their apparent disdain of specialization and academicization. As a consequence, Scialabba's more recent heroes tend toward the plain-spoken and generally incisive: Russell Jacoby, Christopher Lasch, and Richard Rorty. Less admirable are the likes of Martha Nussbaum (too generic), Roger Kimball and Hilton Kramer (too conservative), and Christopher Hitchens (too crazy).

And yet whatever Scialabba's verdict, we'd do well to listen. He's often on point, even when you disagree, and quicker than most to get to the root issues in any writer's corpus of thought. But what really distinguishes this collection, especially the reviews, is how Scialabba lets the books and their authors take center stage. Too often in the more prestigious review venues, one gets the funny feeling that the reviews are more about the reviewer than the reviewed. Now, it would be mean-spirited to begrudge a reviewer his or her authorial voice: I can assure you Scialabba doesn't conceal his. But 4,000-word essays in which the title under review makes its grand entrance in the last two paragraphs do not always seem worth the price of admission. Reviewers with grand ideas and theories of their own are sometimes better off just writing their own books. Fortunately, Scialabba avoids this species of reviewing hubris.

But already I commit the very sin I deplore, too wrapped up in the sound of my own voice and not letting Scialabba's book take over from here on. So let me shamelessly plead the constraints of space and conclude on this note: What Are Intellectuals Good For? is, in a sense, the meditation of one deep-thinking critic on the work of other deep-thinking critics and their views of politics, social justice, and morality. In another sense, it is a reader's roadmap to some of the best cultural criticism written in the last half century. And in both senses taken together, it is a highly recommended starting point for anyone who cares deeply about this much-endangered species of criticism.

…..

So who the hell is Robert Levin? Well, there's always the Internet, where you can learn that he's a jazz critic, a short story writer, and a writer of music liner notes. He seems to have had his heyday here and there—a critical letter to the Village Voice that drew a year's worth of responses; a story named a 2004 "storySouth Million Writers Award Notable Story."

That story is the title of a collection of Levin's writings, When Pacino's Hot, I'm Hot. Dare I confess that I read this slim 90-page volume over the course of seven dog walks? (Yes, I can walk and read at the same time; I can also chew gum and type.) Let me add that it made for some of my more pleasurable dogwalking, which is otherwise a dreadful bore. The reason is simple: Levin is funny. Leaving aside the eponymous lead short story, itself a ribald tale of mistaken identity and the sexual pleasures that can derive therefrom, the miscellany and commentary are laugh-out-loud grotesques, some weirdly Dickensian in their exaggeration of the mundane, others Jamesian in their syntactically elaborate transformations of the bizarre into the clinical or poetic. Only examples will do. In his screed "Recycle This!" on a recycling notice asking residents "to sort and…rinse [their] garbage before leaving it out," he writes: "So while I'll allow that self-immolation would constitute a disproportionate form of protest, I have to say that reacting with less than indignation to so gratuitous an imposition would also be inappropriate." That's a fairly ornate response to a recycling notice. Like I said, pure Dickens.

Or consider "Peggie (or Sex with a Very Large Woman)," a story so wonderfully offensive that it would be impossible not to relish the absurd attempt to poeticize the physical challenges set before Levin's narrator: "…Peggie's particular body could have served as a Special Forces training ground for the field of hazards and challenges it presented. I'm speaking of the twisting climbs and sudden valleys, the crags, the craters and the amazing plenitude of gullies, ravines and bogs that I was, and on my hands and knees, obliged to negotiate and traverse in my search." And don't even ask what he was searching for. You can probably guess.

In some ways, Levin is at his best wringing every drop of qualification from a feeling or thought, an instance of rage or fear, often in one long but densely packed sentence. The bathos of the stories and of some of the miscellany—there are cantankerous whines about cashiers and their stupidity, smoking bans, HMOs, aging, the aforementioned recycling notices—is actually what makes it all worth the reading. Levin, in essence, gets more out of the mundane through an overwrought prose style that is utterly apropos to the sensibility behind it.

But there's no substitute for the man himself, so let's conclude with his thoughts on what happens when one of God's "natural wonders"—in this case a solar eclipse—fails to deliver the goods: "I'll allow that, however disappointing it may be, it's ultimately of small consequence when He mounts a shoddy eclipse. But it's something else again when, for one especially egregious example, He leaves you to blow out all your circuits trying to figure just where a mindless inferno of neuroticism like Mia Farrow fits into the notion that everyone's here for a reason." Consider my own circuits blown.

George Scialabba v’Eretz Yisrael

At long last, the critic is getting his due. I only met George once, about ten years ago, and I had forgotten how articulate he was in conversation; I was reminded by listening to his interview with Christopher Lydon on his web show (which is as good as anything on NPR). I did, however, want to take issue with one comment George makes—and I hope that my minor quibble will be taken in the context of the huge respect I have for George, who is an essential writer whose work you should buy and read. At one point, George takes a stab at explaining how many Jewish intellectuals moved right-ward politically; his explanation, and it's not his alone, is that the 1967 war, when Israel's survival seemed to be at stake, caused many American Jews to become more attached to Israel, a country that until then had not been a major part of American Jewish consciousness, especially among intellectuals. Since then, he says, many Jews have been unwilling to follow their progressive principles if those principles might put them at odds with (their perception of) what's best for Israel's survival. And so we can understand how, for example, there was no large Jewish outcry about the invasion of Iraq, which they took to be in Israel's interests. (I hope this is a fair representation of George's position; I'm talking about one or two minutes in a 44-minute interview otherwise filled with fascinating discussions of Randolph Bourne, Walter Karp, and other too-forgotten intellectuals. If this is an unfair statement, I hope George will let me know in the comments section—although I understand if he has better things to do!)

On one level, George is of course right; in fact, he does not go far enough. Israel's remarkable victory in the '67 war not only heightened Jewish concern about the survival of their several million co-religionists in Israel, but it also—more important, I think—increased Jewish pride in identification with that state. Even my fervently anti-Zionist, left-wing grandparents were a little astonished at a country that had produced successful Jewish soldiers (or so my mother recalls). (And here I am reminded of the comedian Jackie Mason's line about the difference between Jews and Italians: Jews are wimps on the street corner, while Italians can f— you up; but put them in an army, and Jews are indestructible, whereas Italians can't shoot straight.)

But I think we have to know which Jews we're talking about. The Jews who became the famous neo-conservatives—Norman Podhoretz and Irving Kristol, most famously—were well on their way to the right before 1967, and they were swinging right in a way that was bound to sweep all their opinions to the right. Indeed, I have often marveled at the sheer, improbable drift of their move to the right—how can it be intellectually honest to just happen to move right on labor, foreign policy, economics, etc., all at the same time? Such a comprehensive move is more likely to be the result of cynicism or careerism. There is no reason, after all, why becoming more hawkish on foreign policy also entails becoming more hostile to labor unions. But with these guys, so it went, and there you have it.

Anyway, I don't think 1967 had much to do with where Kristol, Podhoretz, and Himmelfarb ended up. Nor did it have much effect on a lot of Jewish New Left types who were pretty irreligious to begin with, and Jewishly uninterested, and who make up one important core of anti-Zionism today. After all, while most Jews are not anti-Zionists, a lot of outspoken anti-Zionists and Israel critics actually are Jews. Jews may, in fact, be more disproportionately anti-Zionist than they are disproportionately Zionist, compared to the American population at large. Pretty much all Americans are, in their unthinking way, supportive of Israel—a goodly number of Jews, in a very thoughtful way, are critical of Israel. Especially among intellectuals, and that's whom Scialabba is talking about.

So who are these Jews whose foreign-policy ideas were warped, or subtly shifted, by the 1967 war? For whom was the war decisive in that way? The best case could be made, I think, by looking at my contemporaries (I am 34), rather than at neo-cons in the sixties and seventies. I would hazard that a lot of New Republic types (to just pick one useful marker), people like Peter Beinart, say (although there is no reason to pick on him, and he's written a lot about how his position on this has changed), were more inclined to support the Iraq invasion because of having grown up in a post-1967 world where the survival of Israel was an issue for young American intellectuals in a way that it wasn't for, say, my dad.

But I think what George was really getting at is a general despair, his and others', that the same people who have been central—indeed, indispensable—to so many other social-justice movements in America have seemed, to him, relatively absent on foreign policy. And that is a shame. But the causation isn't so simple.

Another point about Jews: most American Jews, even those who went Communist or socialist, have, in their own ways, been very supportive of the American project. This is, after all, the land that saved us from what had been happening, and what lay in wait, in Europe. So that deeply felt Americanism has been channeled into certain domestic progressive causes—like Civil Rights—where it is apparent that the United States is not living up to its ideals. And with our long tradition of women being at least moderately educated, and working outside the home (in the shtetl, scholars' wives often worked to support their husbands' endless hours of study), Jews were at the forefront of Second Wave feminism. And there was a history of labor radicalism that Jews brought from Europe. But Jews have not historically been pacifists, and we have been enthusiastic soldiers in every American war (including both sides of the Civil War). It may, therefore, be a bit of a mistake to read into that Jewish progressivism a congenital anti-war inclination. Yes, many Jews were at the fore of the anti-Vietnam movement (although perhaps not out of proportion to our representation on liberal college campuses, where the movement was centered). But it's not my sense that the leading pacifists in the Great War or World War II were Jews—they were Protestants, often of the Anabaptist or Radical Reformation stripe: Quakers, Mennonites, etc., with a smattering of Jehovah's Witnesses, and some more mainline Protestants.

So while it would be nice if there were a strong, identifiably Jewish foreign-policy left today, and in the run-up to the Iraq War, I am not sure that that was ever likely, or that there was a historical precedent, and I don't think its absence is as clearly related to the 1967 war as George Scialabba seems to think.

Letter from New Orleans

Thinking about it now, I pause to consider the ramifications of moving from one new place to another over the past five years—from New York to New Haven and now to New Orleans. After years of banging around various locales in and around New York City, it wasn't too long after I moved to the Elm City that I was schooled by two locals on the question of emphasis when it comes to how you actually pronounce New Haven. New Yorkers, it seems, tend to put the emphasis on the NEW! and not the York. "I live in New York, not to be confused with Old York." But here, as Ideat Village impresarios Bill Saunders and Nancy Shea counseled one night, repeatedly, the emphasis, generally speaking, is on the Haven. Not NEW! Haven but New HAAAY-VEN. After a while I got it; you want to linger on the Haven a good long while, since it is a town that will grow on you. So I started emphasizing the Haven part of New Haven and was therefore able to live here for over four years. When it comes to New Orleans, hey, I'm having a hard enough time pronouncing some of these street names without worrying too much about where the emphasis ought to lie. Post-Katrina you could argue for NEW! Orleans, but that sounds like Chamber of Commerce-approved marketing of the most vanilla-cynical variety. Besides, the blessed and endemic lassitude of the Big Easy begs for a lingering over the Orleans. On the other hand, from a fact-checking point of view, there is an original Orleans from which this one must be differentiated. So I have been worrying about it, but not too much.

The other day I was walking in the Marigny neighborhood of New Orleans, which, I have learned, is not pronounced "ma-RIG-knee," and came upon one of those Volvo sub-wagons you see in every town, festooned with an abundance of bumper stickers citing various locales, pols and causes. Amid the Hillary! Stop War! and Peace is Patriotic stickers was one trumpeting the glories of . . . New Haven!—complete with an accompanying icon, a slice of pizza. The nudgenik in me thought, "Hey, that's wrong! None of the legendary places in town sells pizza by the slice! Outrageous!"

But seeing that bumper sticker, indeed the whole trove of them, did evoke yet another question of where to put one's emphasis, this time when it comes to the old canard, "Everything in moderation." It's one of the great cautionary aphorisms of all time, but that's only because most of us put the emphasis on the moderation. Embedded within lies the stunning and deeply gratifying notion that if you can swing the moderation, you can have everything!

And so there we were, the winter of 2008, myself and Bill and Nancy, and the writer Todd Lyon. By then I knew how to tell people I lived in New Haaaay-ven and it had stopped bothering me that my friends from New York would always want to know how I was enjoying Hartford. In any case, we had an agenda. Oh, such a one as it was! We would hit four pizzerias in one night. We'd start at Zuppardi's in West Haven, get the double-dose at Sally's and Pepe's, and then grab a capper pie at Modern.

A slice here, a slice there, no gastro-problems would ensue if we paced ourselves; that was the plan. Alas. Immoderation won out in Wooster Square and we never made it to Modern. As for BAR, we kept it off the list. Too new, relatively speaking, to qualify for the tour.

Boston's Neat Graffitist vs. New Haven's.... Random Acts of Text

A Short Tribute to Selected Artiness I Remember from the 1980s, and a Hearty Recommendation of a Novel by Eric Kraft

When I was in high school, someone—I have no idea who—went around town putting up posters that said "New Haven is the Paris of the 1980s."

This was completely untrue, but it just slayed me and my friends. Every few years or so, I end up in a conversation with someone who was around then, and we go, "Remember the 'New Haven is the Paris of the 1980s' guy? Who the hell was that?" and then we laugh and have another beer.

In the 1990s I read all the available work by the sadly underrecognized literary genius Eric Kraft. I'd read his Herb n' Lorna, fallen madly in love with it, and begun to eat my way through the rest of his books. The one I liked best, and which is probably still my second favorite, is called Reservations Recommended. It is a sad comic novel (I know that sounds impossible, but trust me, it isn't) about a guy who lives a boring life working for a big company, but has an alter ego who is a restaurant critic in Boston. A lot of the novel is this guy's observations about Boston in general, and many of those observations focus on someone he calls the Neat Graffitist. The Neat Graffitist, actual identity unknown, goes around Boston neatly magic markering the town with random statements, "in small, precise capital letters," such as:

NEVER FEAR PAIN. TIME DIMINISHES IT. BUT AVOID BOSTON CITY HOSPITAL. NURSES THERE WEAR USED UNIFORMS PURCHASED FROM BURGER KING, TREAT PATIENTS WITH FATALISTIC DETACHMENT.

There are many parts of Reservations Recommended that I reread with deep pleasure, just reveling in the wonderfulness of it, but the words of the Neat Graffitist are some of my favorite parts of the book. My husband and I are especially fond of this one:

TO HERBERT: YOU WERE BORN ONCE AND NOT TWICE AND WHEN YOU ARE DEAD YOU WILL BE DEAD FOREVER. GIVE ME BACK MY WATCHES. THEY WILL NOT MAKE YOU HAPPY. THEY ARE NO DEFENSE AGAINST DEATH.

The Neat Graffitist belongs in a category, I feel, with the "New Haven is the Paris of the 1980s" creator, along with the person who spent time writing pithy little sweet nothings on masking tape and putting them on parking meters around downtown around 1984–85. I vividly remember strolling around downtown with a friend who noticed this and chortled: "Uh-oh—someone's getting arty with the parking meters." It was almost certainly a Yalie, but still pretty entertaining.

So whoever the "New Haven is the Paris of the 1980s" guy (or, okay, girl) is, thanks and hats off to him/her; also to the Masking Tape Artist of 1984. Where are you now? Do you even remember doing these things?

Meanwhile, Eric Kraft's been getting rave reviews for his recent fiction, but it's not likely you've actually acted on your fleeting thought, "Gee, I should pick that up sometime and read it." Listen to me. Start with Herb n' Lorna if you can; if you can't, with Reservations Recommended. Act on the fleeting thought. Fleeting thoughts are our friends.

Summary Observations: The Movie

Aside from intellectual property attorneys, who really knows where to get good movie ideas? Julie & Julia, due in theaters this August, is Nora Ephron's movie of Julie Powell's memoir (originally a blog) of the year she devoted to making every recipe in Julia Child's famous cookbook, Mastering the Art of French Cooking. Starring Amy Adams as Powell and Meryl Streep as Child, it is said to be the first wide-release movie developed from a book developed from a blog developed from a cookbook. And it just goes to show that potential entertainment properties are lurking everywhere. What most interests me, though, is its implied confidence in the supremacy of storytelling. If this film succeeds, it might inaugurate a whole new cinematic subgenre of movies dramatizing the doing of things described in instructional books.

Is an adaptation of Harold Bloom's How to Read and Why finally at hand? If so, what would it require? Perhaps the enterprising screenwriter might invent some twenty-something everyman, poised on the brink of self-actualization, and cross cut his intellectual development with telling formative vignettes from the life of Bloom?

Already I can picture our young, book-addled hero, sitting in an uncomfortable chair and contemplating “the vagaries of our current counter-Puritanism,” with the camera swirling and the music swelling around him; or standing by his apartment window, gazing out into the dusk and bearing in mind that “Irony will clear your mind of the cant of the ideologues, and help you to blaze forth as the scholar of one candle.” It began with Jason Schwartzman in contention for the part, but now I’m seeing Michael Cera.

So OK, it’s looking like this will be a Ron Howard picture, dumbed down just enough for mainstream safety and perhaps controversial in its casting of Tom Bosley as Bloom (certain members of the blogorati having lobbied in vain for Martin Landau). A box-office success? Maybe. An Oscar magnet? Well, sure, as long as it gets across the notion that “We read not only because we cannot know enough people, but because friendship is so vulnerable, so likely to diminish or disappear, overcome by space, time, imperfect sympathies, and all the sorrows of familial and passional life.”

And if that doesn’t work out, there must still be a good movie to be made from How to Complain for Fun & Profit: The Best Guide Ever to Writing Complaint Letters, by Bruce Silverman. Or at least from The Garden Primer, by Barbara Damrosch.

Finding the War

It is common to hear that part of what contributed to victory in World War II, and the overwhelming sense that it was the right thing to do, was that nobody at home knew how awful it was for the soldiers fighting it abroad. For many years now, historians, writers, and filmmakers have been editing the popular story about the war, revealing its singular brutality and the myriad motivations that led the powers that be to fight it as they did. This has led to a bit of a crumbling of World War II's image as America's last good war, due both to those hoping to complicate its simple popular moral equation and to those hoping to give a clearer picture of just what the soldiers' sacrifice entailed. Yet the idea that, from 1939 to 1945, the people were sheltered from the war's horrors persists—an idea that I found myself questioning when I read Ernie Pyle's Brave Men, a collection of the war correspondent's dispatches from 1943 to 1944, when he covered the war in Italy and then the Allied push through France. The 1944 edition of the book is a fascinating artifact of the time period: Along with the copyright, an eagle-emblazoned seal states that this is a Wartime Book: "books are weapons in the war of ideas," the sash fluttering from the eagle's beak proclaims. The seal goes on to state that "this complete edition is produced in full compliance with the government's regulations for conserving paper and other essential materials," and it's easy to see the result: It's printed on very thin paper (that nonetheless has held up remarkably well—even wartime cost-cutting seems to have produced a better-quality book than today's mass-market paperback printers do) and with a clear regard for cramming as much text onto the page as possible without rendering it illegible. For me, a grandchild of those who lived through World War II, the effect of the book is to recall the stories I'd heard of who in my family served in the war, what they did, and where; and also, what the lives of the women who stayed behind were like, waiting for their husbands to come home, raising small children who had no recollection of their fathers.

Pyle, like the cartoonist Bill Mauldin, is celebrated for his honest yet dignifying depictions of the soldiers that he met, and Brave Men certainly gives you a lot of that. But Pyle's vision of the war does more than that: By giving us what he saw in Europe at the height of combat, and by making the soldiers he met human—naming them, talking about the meals he shared and the combat he experienced with them—Pyle undermines the idea of World War II, or any war, as good. Consider his description of an air battle he witnessed from the ground:

Someone shouted that one of the planes was smoking. Yes, we could all see it. A long faint line of black smoke stretched straight for a mile behind one of them. And as we watched there was a gigantic sweep of flame over the plane. From nose to tail it disappeared in flame, and it slanted slowly down and banked around the sky in great wide curves, this way and that way, as rhythmically and gracefully as in a slow-motion waltz. Then suddenly it seemed to change its mind and it swept upward, steeper and steeper and ever slower until finally it seemed poised motionless on its own black pillar of smoke. And then just as slowly it turned over and dived for the earth—a golden spearhead on the straight black shaft of its own creation—and disappeared behind the treetops. But before it was down there were more cries of, "There's another one smoking—and there's a third one now." Chutes came out of some of the planes. Out of some came no chutes at all. One of white silk caught on the tail of a plane. Men with binoculars could see him fighting to get loose until flames swept over him, and then a tiny black dot fell through space, all alone.

Or his description of soldiers approaching a firefight:

The men didn't talk amongst themselves. They just went. They weren't heroic figures as they moved forward one at a time, a few seconds apart. You think of attackers as being savage and bold. These men were hesitant and cautious. They were really the hunters, but they looked like the hunted. There was a confused excitement and a grim anxiety on their faces.

They seemed terribly pathetic to me. They weren't warriors. They were American boys who by mere chance of fate had wound up with guns in their hands, sneaking up a death-laden street in a strange and shattered city in a faraway country in a driving rain. They were afraid, but it was beyond their power to quit. They had no choice. They were good boys. I talked with them all afternoon as we sneaked slowly forward along the mysterious and rubbled street, and I know they were good boys. And even though they weren't warriors born to the kill, they won their battles. That's the point.

He goes on to describe all of them—their names, their addresses, and an epigram about them that makes them suddenly, startlingly real ("his New England accent was so broad I had to have him spell out 'Arthur' and 'Auburn' before I could catch what he said"; "Eddie was thirty, he was married, and used to work in a brewery back home; he was a bazooka man, but his bazooka was broken that day so he was just carrying a rifle."). Always giving them their dignity.

I like to think it was this dignifying impulse, and not the work of censors, that made Pyle use a kind of synecdoche when he described the beaches at Normandy just after the fighting was over. He lets the soldiers tell you what the actual assault was like themselves, the things that happened to them then, but he never pretends he was there with them. And when he takes a walk along the beach himself, he tells us that "men were sleeping on the sand, some of them sleeping forever. Men were floating in the water, but they didn't know they were in the water, for they were dead." But he lingers much longer on the mangled machinery, "empty life rafts and soldiers' packs and ration boxes, and mysterious oranges," and at last, "in a jumbled row for mile on mile were soldiers' packs." He goes on:

There were socks and shoe polish, sewing kits, diaries, Bibles, hand grenades. There were the latest letters from home, with the address on each one neatly razored out—one of the security precautions enforced before the boys embarked.

There were toothbrushes and razors, and snapshots of families back home staring up at you from the sand. There were pocketbooks, metal mirrors, extra trousers, and bloody, abandoned shoes. There were broken-handled shovels, and portable radios smashed almost beyond recognition, and mine detectors twisted and ruined.

There were torn pistol belts and canvas water buckets, first-aid kits, and jumbled heaps of life belts. I picked up a pocket Bible with a soldier's name in it, and put it in my jacket. I carried it a half a mile or so and then put it back down on the beach. I don't know why I picked it up, or why I put it back down again.

There are other things he sees that day, ironic and funny and pitiful and heartbreaking. And though Pyle himself never questions why they fought—if no war is good, fighting against Nazi Germany was certainly just—the overwhelming impression you get from the soldiers is that they're just trying to do their jobs and then get back home. If they believe in the cause, it's not so much for the lofty reasons that came out of the mouths of politicians, but because what they went through damn well better have meant something.