Monday, August 15, 2011
That is, the natural impetus of the blog format towards novelty — something I've been complaining about for five years now — makes it easier to link to one more story, with a few words of commentary, than to stop and think matters through at greater length and with greater rigor. I don't think that's been good for me.
This format served me well, I believe, when I was working on my book The Pleasures of Reading in an Age of Distraction. It was deeply helpful to me to try out ideas here and get comments on them — some of which made their way into the book — but now that that project is done I am feeling the need to move on. And you know, blogs have natural lifespans, or so I think; few of them can continue indefinitely without diminishment. Especially when they are issue-based blogs. Who wants to watch someone ride the same old hobby-horses year after year?
So I am stepping away. I will continue to write about the issues I wrote about here, but in longer formats and elsewhere. I am hoping to write a good deal more for The New Atlantis itself, if Adam and Ari and Caitrin and the crew will have me. But much of my energy in the rest of 2011 and all of 2012 will go to my two book-length projects: a critical edition of Auden's long poem For the Time Being and a "biography" of the Book of Common Prayer (both for Princeton University Press).
I am by no means abandoning the online world. I have resumed posting to my good old online commonplace book here, where the full range of my interests is represented; and I will continue to be active on Twitter, for which I harbor a strange and ongoing affection. I will also still use the #textpatterns hashtag, and I hope you will also. But for now, and from here: So long, and thanks for all the fish.
Sunday, August 14, 2011
Nicholas Christenfeld and Jonathan Leavitt of UC San Diego gave several dozen undergraduates 12 different short stories. The stories came in three different flavors: ironic twist stories (such as Chekhov’s “The Bet”), straight up mysteries (“A Chess Problem” by Agatha Christie) and so-called “literary stories” by writers like Updike and Carver. Some subjects read the story as is, without a spoiler. Some read the story with a spoiler carefully embedded in the actual text, as if Chekhov himself had given away the end. And some read the story with a spoiler disclaimer in the preface.
Here are the results: . . . almost every single story, regardless of genre, was more pleasurable when prefaced with a spoiler.
Lehrer accepts the results without question, but that may be because, as he says in the post, it's his habit as a reader to spoil endings for himself. But he also knows that that's not common, that most of us try to avoid learning the endings of stories (though of course sometimes we succumb to temptation). So I'm wondering how much these results are dependent on the novelty of the experience: that is, it's at least possible that the greater pleasure taken in "spoiled" stories is a temporary phenomenon, resulting at least in part from the pleasure of deviating from habitual practice. Would the pleasure of knowing the ending in advance hold up over time for the majority of readers? I wonder.
It might, though. I have noticed that sometimes, when reading stories of suspense, I think too much about "how it comes out" and therefore have trouble focusing on the story as it develops: my mind is overly focused on the conclusion that I don't know and am constantly anticipating. This might not be the best frame of mind in which to enjoy a story, and it's certainly possible that knowing the ending could liberate me to enjoy all of the story, not just its ending. But I have my doubts.
I also wonder whether people's feelings about spoilers might be relative to the length of the narrative, or, if we're going to include other forms of art, the time we invest in it. The stakes are lower when you're taking half an hour to read a story than when you need two hours or more to devote to a movie — Would The Sixth Sense be better if you knew the ending in advance? That's hard for me to imagine — or when you need a dozen hours or more to read a big novel. Maybe there'll be future studies that will explore these variables.
Friday, August 12, 2011
Sarah Werner offers a thoughtful, informed take on some issues I've raised here in, well, thoughtless and uninformed ways: "The digitization folks talk about access and the book folks talk about being in the presence of the object. Neither side tends to present a more nuanced sense of how they might each have something to offer the other, or to recognize that there might be other considerations and uses at stake."
Read the whole thing, as they say, and then read the follow-up post. Great stuff.
And then just as I'm complaining about the invocation of the great god Relevance, here comes Mark Bauerlein to preach Against Relevance:
As any trainer in sports, in the military, or in martial arts will report, however, to make the experience successful, training for it has to go well beyond it. In martial arts, for instance, one goal is to prepare someone to handle a confrontation wisely, with proportional force and self-awareness. Some confrontations may require physical defense, blocks and punches and kicks, and so the student has to be trained for them. The training program, however, asks of students much more than the confrontation will demand. Training involves high kicks, but rarely is it effective to throw one in a confrontation. A low kick will suffice. But in order to make that low kick effective, the student has to master high kicks.
The pattern applies to the cultural materials on the syllabus. If teachers want students to discern the implicit meanings in commercial images, they should have students study images of more complexity and subtlety. A few days with images taken from great photography and film will equip them to “read” music videos much more effectively than will a few days with those videos themselves. Poetry by Alexander Pope and Edna St. Vincent Millay will do more for students’ verbal cognizance than will political advertisements and Twitter tweets.
This is the immediate virtue of anti-relevance. If teachers want to raise critical thinking about contemporary mass culture, they should expose students to past high culture. The language of Romantic poetry exercises critical thinking about language better than does the language of billboard jingles. It’s a paradox, but it’s true. If teachers want students to know the present and all its coarse enticements, they should immerse them in the best expressions of the past.
Mark is often too curmudgeonly to suit me, but there's a lot of wisdom in this. I similarly argue, in my recent New Atlantis essay on McLuhan, that his success in limning new media stemmed largely from his thorough training in the old.
This kind of thing just makes me sad. Sad, sad, sad. It seems that whenever any event causes people to think about how "young people today don't read" — in this case, bizarrely, it's the failure of English looters to break into Waterstone's — the worn old words get dragged out and dusted off, as Nikesh Shukla drags them here:
We need to . . . create a culture that lasts the entirety of young adult life. The people who will want to read will read. Those who might stand to, as Waterstone's put it, "learn something", need to be engaged more.
How does that start? It starts in-house, in the publishing industry. We need to produce more books that relate to these kids and their lives, offering something relevant or aspirational. We need to market these books directly to these audiences, make young people feel included and empowered to read. We need to deliver these books in relevant and contemporary ways. Maybe the rioters would have BBM'd less if they had other stuff to read on their phones.
Engaged, relate, relevant, aspirational, included, empowered, relevant (again), contemporary. It's hard for me to believe that anyone really, truly believes that if the publishing industry just published more books featuring dark-skinned characters a whole new culture of readers would miraculously spring up.
The young people Shukla is rightly concerned about have, for the most part, grown up in homes with few or no books, and at school their overworked and often underprepared teachers struggle to inculcate basic literacy. Nothing about this situation "starts in-house, in the publishing industry"; such a claim simultaneously elevates publishers' importance far beyond what's warranted and creates pointless guilt (since the imposed expectations can never be fulfilled). Whether you want to blame the political Left or Right, or secularism, or media culture, or capitalism, or whatever for the recent riots, the problems go far, far deeper than a lack of appropriate reading material.
And while I'm complaining, one more point: this whole post assumes that reading is an intrinsically pacifying experience. But it isn't. Maybe — maybe — fiction can work that way, but what if these young people gained the necessary literacy to read and really absorb The Autobiography of Malcolm X or The Wretched of the Earth? Reading gives birth to revolutionaries, too.
Thursday, August 11, 2011
There's something satisfying about this development — in this case anyway — as people discover that they have ways to participate in the clean-up of their community, in more ways than one. But there's also something obviously scary about it.
And that's the way it goes: wherever our burgeoning information technologies touch our lives, they magnify, dramatically, already existing tendencies. It would be a mistake to think only about what's cool in that or what's disturbing. It's cool and disturbing alternately, or all at once. If you're feeling disoriented by these magnifications, get used to it. There's a lot more where that came from.
Wednesday, August 10, 2011
Well, we've been around this block a few times here at Text Patterns, but as I read that post a little thought experiment occurred to me. Imagine a person who comments regularly on certain blogs under a pseudonym, and writes her own blog under a different pseudonym, and then of course "IRL" or "offline" has an official legal name. Not an especially unusual situation, I imagine, and one that few of us would find upsetting or even noteworthy.
But what if that same person applied the principle of contingent self-naming to her regular in-person social circles? What if she told one group of friends that her name is Carol Watson and another that it is Tamar Weinberg, while at work and to her family she's known as Jennifer Esposito? Do we have a problem with that? My sense is that most of us would find that kind of creepy — even those of us who find the use of various online names perfectly acceptable. But why? If it's okay online why wouldn't it be okay offline? Is there just a residual "yuck factor" there that we ought to dispense with? Or what?
P.S. After writing this and putting it in the queue, I discovered that the estimable Alexis Madrigal has a different take on our expectations for everyday self-identification: "in real life, we expect very few statements to be public, persistent, and attached to your real identity. Basically, only people talking on television or to the media can expect such treatment." I'm not sure precisely what Alexis has in mind here, but our public identities are attached to a lot of what we do every day: not just our written and oral exchanges with friends and co-workers, but most of our purchases now that we increasingly use cards rather than cash. More than at any point in human history, the average person's everyday life is made up of statements and actions that are "public, persistent, and attached to [his or her] real identity." So in that sense our expectations for online conversation are outliers. The question is whether they should be.
Tuesday, August 9, 2011
Last spring, I spent a few hours looking at the autograph manuscript of “Dorian Gray,” at the Morgan Library. When Dorian attempts to destroy his portrait, the manuscript has him “ripping the thing right up”; Wilde then adds the phrase “from top to bottom.” Nicholas Frankel, the editor of the new Harvard edition of “Dorian Gray,” notes that the eviscerating gesture evokes Jack the Ripper, whose crimes had filled the papers two years earlier.
I think Frankel is quite wrong: the line Wilde added has nothing to do with Jack the Ripper. The genuine and key reference here is to the Passion narrative in the Gospels:
Jesus, when he had cried again with a loud voice, yielded up the ghost. And, behold, the veil of the temple was rent in twain from the top to the bottom; and the earth did quake, and the rocks rent; And the graves were opened; and many bodies of the saints which slept arose, And came out of the graves after his resurrection, and went into the holy city, and appeared unto many.
At times Wilde tried to downplay his Biblical knowledge. (One of my favorite anecdotes about him concerns his taking a viva voce examination at Oxford to demonstrate his competence in New Testament Greek, during which he fluently translated at sight a passage from the Passion narrative. After a few lines the examiners, thoroughly satisfied, told him he could stop, but Wilde replied, "Oh do let me continue. I want to see how it comes out." I testify not to the truth of this story.) But his writings, like those of most of his contemporaries, are saturated with Biblical allusion.
The tearing of the picture "from top to bottom" is an especially powerful and rich one. Wilde would have known that this event in the Passion story was widely thought to indicate that the sacrificial death of Jesus ends the separation between God and humanity (represented in the Temple by the veil separating the Holy of Holies from the rest of the world, the veil that only the High Priest could cross) and effects a reconciliation that makes further sacrifices at the Temple unnecessary. Here, then, Dorian's slashing of his own portrait — which brings about his own death, as he surely knows it was likely to do — ends his own bifurcation. It makes him whole again, though at the cost of his own life.
A hundred years ago most readers of The Picture of Dorian Gray, who were educated much as Wilde was, would have caught the reference; now even the experts are likely to miss it. Jack the Ripper is the kind of thing we are interested in, thus we see Jack the Ripper — even though what Dorian does is a deeply guilty man's self-mutilation, not a monstrous killer's preying upon innocent victims; while the Bible is not the sort of thing we are interested in, thus even a direct quotation can be invisible to us. Given that this alteration has happened in little more than a century, it makes one wonder how much else we scholars have managed to lose sight of. Perhaps every English department should keep a Christian around just to catch Biblical allusions that his or her colleagues won't recognize.
Monday, August 1, 2011
The problem with precision, though, is that it can often be discouraging. Let’s say you want to lose 10 pounds. After following a strict diet for a few days, you then decide to weigh yourself. The good news is that you have lost weight. The bad news is that you’ve only lost 4 pounds. While that represents progress, it probably feels pretty disappointing, since you’ve already worked hard and you’re not even half way to the goal. As a result, you might become a little less motivated, which means that you start to cheat on your diet. Before long, those pounds are back – you’ve been undermined by the precise feedback. The larger point is that the exactitude of the scale made it impossible to ignore the lack of success, which makes us more likely to surrender. And this is where vagueness comes in: when information is ambiguous we typically settle on more generous interpretations – Perhaps we’ve lost eight pounds! Perhaps we’re just retaining water! – which means that we stay motivated. In this sense, vagueness is a useful delusion, a nifty means of remaining committed to long-term goals. Reality is a deterrence.
Hang on — is it true that "when information is ambiguous we typically settle on more generous interpretations"? And is it true that "generous interpretations" mean that we "stay motivated"? I'm not sure I buy either of those completely unsupported assertions.
Let's suppose that I do interpret generously: if all I know is that I've lost some weight but less than ten pounds of it, what's to keep me from saying, "Hey, maybe I've lost eight or nine pounds, in which case this milkshake won't set me back too far"? What makes that assumption any less likely than "I'm probably nearly at 10 pounds, so I'll push harder to get there"?