Text Patterns - by Alan Jacobs

Monday, May 31, 2010

The Mongoliad

Now here’s a really interesting idea — from Neal Stephenson and others — for a book . . . or rather an app . . . or a service . . . sort of a wiki or a game . . . oh heck, just read it:

The Mongoliad stands out as a possible way forward for post-print publishing. PULP makes this book into something that's truly the product of our collective imaginations. When you're reading a chapter of the book, you always have the option to pull up an interactive discussion window and leave a note or enter a discussion about the book. You can write your own additional storyline. Or add to the 'pedia to explain more about the historical setting. You can also rate every aspect of the book, rating any page on a scale of one to five stars.

The Mongoliad isn't just a story; it's a platform for collaborative worldbuilding. The question is, how do you prevent such an endeavor from degenerating into chaos? "We have the concept of canonicity – if we like it we'll tag it as canon," Bornstein says. "We'll have ways of reflecting people's community standing in our forums. So some people will be able to help curate the canon, and we'll be the ultimate arbiters." The book will become a thicket of fanfic, but there will be a clear, canonical path marked through it by the creators of the story.

So how will you buy The Mongoliad? It won't be like getting a traditional ebook, which is usually wrapped in some kind of format or digital rights management software designed to prevent people from sharing it. Subutai is devoted to selling this book without DRM. You'll get it in an app store, and you'll pay what Bornstein calls "a relatively low price" for it as a six-month service, where you get new content every week. At the end of those six months you can renew for "a lower price." Bornstein hinted that the book will eventually contain "a few games too."

I have to say, this sounds really interesting — except I don't like the idea of having the experience locked in to the iPhone/Pod/Pad world. That wouldn't be the venue I’d choose anyway.

(Thanks to Adam Keiper for the link.)

Sunday, May 30, 2010

why aren't you more emphatic about my lack of empathy?

Psychology Today:

College students who hit campus after 2000 have empathy levels that are 40% lower than those who came before them, according to a stunning new meta-analysis presented at the annual meeting of the Association for Psychological Science by University of Michigan researchers. It includes data from over 14,000 students. . . .

Previous research done by psychologist Jean Twenge had measured what she labeled a "narcissism epidemic," with more students showing selfish qualities and with increases in traits that can lead to a diagnosis of narcissistic personality disorder. That is a condition in which people are so self-involved that other people are no more than objects to reflect their glory.

But I was less than convinced by that data because some of the measures of narcissism — statements like "I am a special person" — might reflect a lifetime spent in classrooms aimed at raising self-esteem rather than a true increase in self-centeredness.

The survey on empathy used in this study — which you can take for yourself here — however, is another matter. While it so obviously measures empathy that you could easily game it to make yourself look kinder and nicer, the fact that today's college students don't even feel compelled to do that suggests that the study is measuring something real. If young people don't even care about seeming uncaring, something is seriously wrong.

Online life gets a lot of blame. Plausibly?

UPDATE: Via Robin Sloan at Snarkmarket, a curious and thoughtful essay on this topic by Roberto Greco.

Friday, May 28, 2010

the problem of abundance

The Quintessence of Ham:

Roger Chartier identifies eighteenth-century concerns about scarcity and abundance which closely parallel the challenges faced by digital humanists. For example, he notes that the “fear of obliteration obsessed the societies of early modern Europe.” According to Chartier, the eighteenth century compounded the problem of scarcity with unexpected abundance. He describes the scenario as one of “uncontrollable textual proliferation, of a discourse without order or limits. The excess of writing piled up useless texts and stifled thought beneath the weight of accumulating discourse, creating a peril no less ominous than the threat of disappearance.” Anyone who has studied printed materials during the late seventeenth and early eighteenth centuries is acutely aware of the explosion of new titles and reeditions that transformed the literary landscape of early modern Europe. This revolution in supply was matched by equally transformative growth in demand, with literacy rates spiking and growing especially fast among women. In France, for example, the percentage of women who could read doubled, and the overall rate among men and women rose from about 1/3 to 1/2 of the population. Authors developed new strategies to differentiate their works, and readers had to develop new filters to determine what was worth reading.

The eighteenth century was thus a time when people grappled, often uneasily, with the problem of abundance. One response was the Enlightenment fascination with taxonomy and system-building. The eighteenth century gave us enduring systems for ordering living things (Linnaeus) and physical matter (Lavoisier), but it also attempted to systematize more or less the entire material world with a spate of projects tackling language, arts, cooking, hair styles, whatever. All were designed not only to impose order but also to solve the problem of abundance.

Courtesy @tcarmody.

reforming the humanities PhD

Views: A New Humanities Ph.D. - Inside Higher Ed:

Humanities education needs to do more than change the shape of the dissertation, legitimate non-academic jobs, or validate academic jobs that are not tenure-track teaching posts. The crisis in academic humanities, brought on by years of focus on nothing but turning out professor-wannabes, has to be addressed long before the job-placement stage. Long before the dissertation stage. We need to train Ph.D. students differently from the first day of graduate school.

If we value the humanities enough to teach them at the undergraduate level, if we believe that humanities education produces thoughtful, critical, self-aware global citizens, then we need to recognize that advanced training in the humanities cannot be simply the province of aspiring tenure-track faculty members. If there’s no prospect of a tenure-track job in the humanities, and humanities graduate programs train students for nothing but tenure-track jobs, how long can these programs be sustainable?

The current job crisis may be just the impetus graduate humanities education needs in order to recognize that what it has to offer is essential to this democracy, and essential to training leaders in a whole range of fields, far beyond academics.

scritic on reading habits

Reflections on Cog Sci: How we read (articles and magazines) now: "In this post, I want to compare the two modes of reading: how we read before the internet and how we read now. I will be limiting the analysis only to news articles and magazines since I believe, pace Nicholas Carr, that it still isn't clear how the internet has changed book-reading habits. But today, the internet IS the platform for delivering news and I believe it has changed our reading habits substantially."

dragged back into the maw of the Beast

I think I'm re-Googled. My escape attempt has, I fear, failed.
Mail is the main issue. Fastmail is a fine email service, but I need more email organizational-fu than I can get via their web interface. That means using Apple Mail, tricked out with some plugins . . . but Mail is, frankly, a mess of an application. It has never worked well for me: it's often unresponsive, and will sometimes spin its wheels for an hour without managing to open a folder, eating crazy amounts of CPU as it does so. Also, I don't use Microsoft products, and the Mozilla-based alternatives (Thunderbird, Postbox, etc.) aren't sufficiently integrated into the OS to be attractive.
Meanwhile, Gmail is super-fast and has a suite of organizational tools that I have used for years and have fine-tuned to my needs: labels, filters, and Superstars allow me to classify every email I get in useful ways.
So I think I'm going back. Which probably means going back to Google Reader as well, since it's so much faster than Fever and handles all my feeds flawlessly. (Fever tends to lose some.)
You gotta hand it to those Google people. They make some first-rate products. And they have responded much more appropriately than Facebook to privacy concerns. I think I can control my privacy settings in my Google services sufficiently well to salve my conscience . . . I think . . . I hope . . .

Thursday, May 27, 2010

the dichotomy

Clay Shirky:

There are two principal effects of the Internet on privacy. The first is to shrink personal expression to a dichotomy: public or private. Prior to the rise of digital social life, much of what we said and did was in a public environment — on the street, in a park, at a party — but was not actually public, in the sense of being widely broadcast or persistently available.

This enormous swath of personal life, as we used to call it, existed on a spectrum between public and private, and the sheer inconvenience of collecting and collating theoretically observable but practically unobserved actions was enough to keep those actions out of the public sphere.

That spectrum has now collapsed — data is either public or private, and the idea of personal utterances being observable but unobserved is becoming as quaint as an ice cream social.

I can't improve on these statements

Chadwick Matlin:

The purgatory scenes are a symptom of what, in retrospect, was Lost's greatest flaw. It refused to follow its own advice and let dead be dead. In the early seasons, Lost prided itself on its willingness to kill off any character it wanted. This, we were told, was proof that on the island the stakes were high. But then Lost's writers fell in love with their characters, and people started wearing bulletproof vests, recovering from harpoons to the heart, and returning as Demon Spawn. By granting the characters' souls eternal life, in purgatory or elsewhere, the writers diminished our interest in their actual lives — the ones audiences spent six years watching. Lost's writers should have taken a lesson from their characters and learned to let go.

Will Wilkinson:

It is the sloppy promiscuity of our undiscerning sentimentality that allows us to project our feelings from one character across worlds to his or her non-identical counterpart. . . . Now, I don’t know about you, but I’d like to think I’m not such a pushover. I don’t want to marry a bundle of repeatable attributes. I say I’m in love with an individual, a solid substance and its singular quiddity. I could give f•••-all if her counterpart in some untouchable precinct of the multiverse wears an eyepatch, wins the Pulitzer Prize, or is torn limb from limb by cannibal dwarves. None are my beloved. The finale of Lost pretended to be about the ultimacy and redemptive power of love, or something like that, but it exemplified instead the incoherent ruinous mess of our needy scattershot attachments, our whorish readiness to be doped by the dull, warm, indeterminate golden light. Speak not to me of love, Lost, if you know not love.

I’m not a fan of large sweeping apocalyptic statements, but here’s one for you: the current fascination with possible worlds, an infinite number of alternative universes, is death to narrative. Death to narrative because our stories draw their power chiefly from the limits of our lives. If death is the mother of beauty, limit is the mother of story. I’m not sure why or how the makers of Lost got caught up in this — in the recent reboot of Star Trek it seems that J. J. Abrams glommed on to it because it offers infinite expansion of the franchise: one world in which Kirk and Spock are enemies, another in which they are best friends, several in which they die young, a few in which they live to a ripe old age. . . .

But whatever one’s reasons for embracing this model, it renders every particular story vacuous. Why weep when Lear enters, bearing the dead Cordelia in his arms, or when Juliet awakens from her drug-induced sleep to find Romeo dead? Much easier to turn our eyes to those alternate worlds in which Lear and Cordelia crush their enemies, and Romeo and Juliet unite the houses of Montague and Capulet, world without end, amen. Or, rather, world that goes on until we get bored again and decide we want a bloodier cosmos, just for a change.

thesis for disputation

"There must always be two kinds of art: escape-art, for man needs escape as he needs food and deep sleep, and parable-art, that art which shall teach man to unlearn hatred and learn love." — W. H. Auden, 1932.

Wednesday, May 26, 2010

cars are bad

Chuck Klosterman: "I think that most technology is positive in the short term, and negative in the long term. I wonder, if somebody looked back at the 20th and 21st centuries a thousand years from now, what their perception of the car would be. Or of television. I wonder if over time, they’ll be seen as this thing that drove the culture, but ultimately had more downside than upside. When you think about it, cars are the most central thing in America, in a lot of ways. They’ve probably influenced the way we live more than anything else, and yet every really big problem—whether it’s the environment or who dictates the international economy because of oil—is all tied to cars. Ultimately, cars are bad for civilization. I don’t know if they’ll end us. That’s always the thing when somebody asks you if something is good or bad. You say something is bad, they’ll be like, ‘Oh, you think that’s going to end society?’ No, but something can be bad without ending society!"

Thoughts on DIY U

Dean Dad: "If you're serious about education for the non-elite, you need institutions. The institutions need to be accountable, and open to creativity, and efficient, and changed in a host of ways that I spend most of my waking hours obsessing over and probably more that I've never even thought of. But you need them. Every serious social movement of the past two centuries has understood this. The internet has changed a lot of things, but it hasn't changed that. The rich kids may experience unbundling as liberation, and to some degree, it can be. But for the vast majority, the issue isn't that their individuality is being squelched by The Man and his distribution requirements. It's that without effective educational institutions from preschool on up, they will never get the chance to develop their skills in the first place."

goblins

What's the best poetry to learn by heart?

Alison Flood:
I am in genuine awe of John Basinger, who has learned the whole of Paradise Lost by heart – all 12 books, 10,565 lines and 60,000-odd words. He completed his feat in 2001 and can still recite it today; his achievement is so astonishing that the journal Memory recently conducted a study on him. Testing Basinger by giving him two lines from the beginning or middle of each book, the academics found he could recall the next 10 lines each time. He achieved it, they believe, by 'deeply analysing the poem's structure and meaning over lengthy repetitions'. They suggest that 'exceptional memorisers such as [Basinger] are made, not born, and that cognitive expertise can be demonstrated even in later adulthood'.
As well as awe, I'll admit to feeling a little jealous of Basinger, because I hardly know any poetry by heart. When my mum was at school, it was something they were made to do, and she can recite scads and scads; I just called her to check, and she could reel off Upon Westminster Bridge, Ozymandias, Adlestrop and lots of Shakespeare. And she says her grandmother, brought up in the west of Ireland, knew hundreds of poems by heart.
In Milton's own time, when memory development was a major art form, how unusual would this have been? In any case, whenever I teach a class in which I assign poetry, I require students to memorize fifty lines and recite them to me. It's the most valuable assignment I use, and I would probably do better to throw out much of the writing I assign and just do more memorization.

only connect!

Shreeharsh Kelkar has emailed with some questions that I thought it might be interesting to answer here. Here are the first two:

1) Just briefly, how do you decide if something is worthwhile ("clippable") while browsing the web? Obviously, the easiest is when it relates to a particular project you're doing. But what about the others which you think may be useful some day but can't really say? How often do you end up going back to them? Or even better, using them in a project?

2) Finally, do you "clip" anything that you think will be relevant at some point in the future or are you more discriminating?

Over the last couple of years I have developed, gradually and not altogether intentionally, a three-stage method of organizing and responding to what I read online. It works like this:

1) If I see something long enough and complex enough that I need to read it with care, but don't have time to read at the moment, I send it to Instapaper. And by the way, Instapaper’s “mobilizer” — its built-in tool for extracting the text from a webpage — provides the ideal way to read anything on the iPhone. It’s now built into Tweetie — um, Twitter for iPhone — so clicking on links in tweets takes me to the mobilized version rather than the original web page, which means that it loads fast and is easy to read. Brilliant.

2) If, having read something, I think I might want to come back to it later, I clip an excerpt and send it to Pinboard. I used to use Delicious for this, and Delicious is still a fine tool, and free, but Pinboard is more elegant. I then use Pinboard’s tag cloud to browse the relevant clippings when I’m working on a particular project. (If you like to script such things, see the sketch following step 3.)

3) But if I know (or think I know) that a particular article or story or blog post is going to be important for something I’m doing, and I can't take the chance on it disappearing behind a paywall or just plain disappearing, then I convert it to a PDF and send it to my preferred Everything Bucket, Together.
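
For the script-minded among you: Pinboard also offers a simple Delicious-style API, so the tag browsing in step 2 can be automated. What follows is a minimal, unofficial sketch in Python, not anything Pinboard endorses. It assumes the documented v1 posts/all endpoint with HTTP Basic authentication, and the credentials and the "anthology" tag are placeholders you would replace with your own.

    import base64
    import urllib.parse
    import urllib.request
    import xml.etree.ElementTree as ET

    USER, PASSWORD = "username", "password"  # placeholders, not real credentials

    def clippings_for_project(tag):
        """Fetch every bookmark carrying a given project tag."""
        url = "https://api.pinboard.in/v1/posts/all?tag=" + urllib.parse.quote(tag)
        req = urllib.request.Request(url)
        token = base64.b64encode(f"{USER}:{PASSWORD}".encode()).decode()
        req.add_header("Authorization", "Basic " + token)
        with urllib.request.urlopen(req) as resp:
            root = ET.fromstring(resp.read())
        # each <post> element carries the page URL ("href"), its title
        # ("description"), and the clipped excerpt ("extended")
        return [(p.get("href"), p.get("description"), p.get("extended"))
                for p in root.iter("post")]

    for href, title, excerpt in clippings_for_project("anthology"):
        print(title, "->", href)

The point is not the particular script but the principle: because the clippings sit behind an open, documented interface, they are not locked into any one reading tool.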

By the way, it has become clear to me that I save too much, both to Pinboard and to Together. I go through and purge from time to time, but I would like to develop habits that make me more thoughtfully selective at the point of reading.

Here’s a funny thing: in general, I dislike having more apps open at once than absolutely necessary — I like to streamline my workflow, and will even at times use a second-best tool in order to keep things simple. And yet, even though I could bypass Pinboard and keep all my clippings in Together, I don't. Similarly, I could gather everything in Zotero, but I can't stand the way Firefox looks. I’m weird that way.

Similarly, I could keep all my notes and jottings in Together, but I don't: I use the brilliant Notational Velocity instead. I am not sure why I violate my own principles here, but I think it’s because I’m keeping tasks that make different cognitive demands on me in different environments: note-jotting in one place, articles that need skimming in another, articles that require serious attention in a third.

Here is Shreeharsh’s third question:

3) I use Evernote to make notes while I browse the web and one of the things I find is that while I do clip extracts from web-pages there, when the time comes for me to go back to the article, I often just google the article rather than searching for it on Evernote (I usually remember some keywords from the article). Does this happen to you too?

It does! — and that’s interesting, no? It’s often faster to Google something than to look through the materials I have so painstakingly filed (especially when I’m not sure whether I’ve put something in Pinboard or Together). “Search, don't sort” is the Gmail motto, and it seems to work here too. But I don't always remember what I need to remember to do a good Google search; and the sorting and filing itself is cognitively useful, I think — it helps me to organize my thoughts and keep them in good marching order, even when I don't go back to the materials I’ve collected later.

Tuesday, May 25, 2010

simpler = privater?

Technology Review: "On stage this morning at TechCrunch Disrupt in New York City, Facebook vice president of product, Chris Cox, promised 'drastically simplified' privacy controls, which should be available on the site on Wednesday. Though he declined to give any details of how they will work, the company is clearly feeling the sting of the recent storm of criticism.
"It will be interesting to see how the privacy controls actually look. Simpler controls don't necessarily mean more privacy. It's entirely possible that it will remain difficult for users to keep the company from making large portions of their data public."

a change is gonna come

Dear readers, in my ongoing and mostly futile attempt to simplify my life, I have put the kibosh on my tumblelog, or online commonplace book — it used to be mentioned in the bio to your right, but has just been ruthlessly excised. From now on the random quotations, photos, and links that once populated that site will populate this one. If it's only a link, or a link that needs a very brief description, I'll post it via the Text Patterns Twitter feed; otherwise it will show up on this very page. Now I will have fewer decisions to make. Whew.
I should add that when I post something without comment, tumblelog-style, that should not be taken as agreement or endorsement or anything else except this: it's something that I find interesting, worth noticing, worth thinking about.

London letters

In March my friend Brett Foster and I spent a few days in London (joined briefly by actor, director, theatrical impresario, and boon companion Mark Lewis) and, in Heathrow awaiting our return flight, decided that it would be fun to write a series of letters about our experience. John Wilson of Books & Culture was game to run them, so here they are: One, Two, Three, Four, Five, and Six. (I'm odd, Brett even.)

This was great fun to do, and both Brett and I would love to be able to find another venue for these essayistic epistolary exchanges. I may know London a little better than Brett does, but he has spent much time in Rome and knows it very well, so wouldn't some letters featuring the young expert and the elderly novice be illuminating and entertaining? John, what do you say? Surely B&C has an enormous travel budget for its writers. . . .

Here's one of Brett's poems from a few years ago; here's a newer one. Mark leads, guides, and directs the consistently excellent Arena Theater.

more on sharing and oversharing

Tim O’Reilly takes a line similar to that of Steven Johnson:

The essence of my argument is that there's enormous advantage for users in giving up some privacy online and that we need to be exploring the boundary conditions - asking ourselves when is it good for users, and when is it bad, to reveal their personal information. I'd rather have entrepreneurs making high-profile mistakes about those boundaries, and then correcting them, than silently avoiding controversy while quietly taking advantage of public ignorance of the subject, or avoiding a potentially contentious area of innovation because they are afraid of backlash. It's easy to say that this should always be the user's choice, but entrepreneurs from Steve Jobs to Mark Zuckerberg are in the business of discovering things that users don't already know that they will want, and sometimes we only find the right balance by pushing too far, and then recovering.

To his credit, O’Reilly goes on to say that Facebook isn't acting out of any commitment to the public good, and he endorses a recent Bill of Rights for Facebook users. But his approach troubles me in a number of ways.

First, he sees virtue only in “pushing the boundaries” in one direction, towards the reduction or elimination of privacy — but he never explains why that direction is the one that counts. Some of the people striving to create alternatives to Facebook are exploring the idea that tighter privacy controls are more generally desirable.

Second, O’Reilly talks about companies like Facebook “correcting” their invasions of user privacy and then “recovering” — but, as we learned from the Google Buzz fiasco, information doesn't go back in a bottle. You can stop sharing, but you can't unshare what’s already out there. Facebook may “correct,” but users can't “recover” privacy they’ve lost.

Finally, he’s failing to make an important distinction. It’s absolutely true that “entrepreneurs . . . are in the business of discovering things that users don't already know that they will want,” but historically they have discovered those “things” and then sold them to users. But what Google and Facebook have recently done is different: they have sold (or given away) a particular service and then altered it significantly without consulting or even warning users. Imagine getting up one morning and, as you’re leaving for work, discovering that your sports car has overnight been transformed into a minivan — there’s a note on the windshield from the car manufacturer saying that they know you’ll really enjoy all the features this new model has that the old one lacked. Would you say that the car company was being “entrepreneurial”?

Monday, May 24, 2010

"neglected"? "masterpiece"?

Robert McCrum lists a few "neglected masterpieces," among them, oddly, Lincoln's Second Inaugural Address and Melville's "Bartleby the Scrivener." Or perhaps not oddly: perhaps this is just an indication of the difference between American and British schooling. Generations of American schoolchildren would have loved to neglect old Bartleby, but didn't get the chance.
I have always been uneasy with the term "neglected masterpiece" — such a thing is really quite rare, whereas the phrase isn't. There aren't many books that genuinely deserve to be called "masterpieces." Also, what counts as "neglected"? Often books so designated aren't neglected at all, but rather are just given less attention, and less universal attention, than they deserve.
However, I will admit that "neglected masterpiece" has a certain flair to it that "pretty good book that hasn't gotten the attention it deserves" lacks.
And I will say this: the best historical novel I have ever read, and one of my very favorite books, is almost completely forgotten now, less than thirty years after its publication: The Succession, by George Garrett.

altruistic oversharing

Steven Johnson writes:

In our house, we have had health issues . . . that we have chosen not to bring to the public sphere of the valley. We have kept them private not because we're embarrassed by them, but because some things we already think about enough and would frankly rather think less about, and we don't need the extra prodding of 1,000 Facebook friends thinking alongside us. Every revelation sends ripples out into the world that collide and bounce back in unpredictable ways, and some human experiences are simply too intense to let loose in that environment. The support group isn't worth the unexpected shrapnel. Most of us, I think, would put the intensities of sex and romantic love in that category: the intensity comes, in part, from the fact that the experience is shared only in the smallest of circles.

But no doubt something is lost in not bringing that part of our lives to the valley. Somewhere in the world there exists another couple that would benefit from reading a transcript of your lover's quarrel last night, or from watching it live on the webcam. Even a simple what-I-had-for-breakfast tweet might just steer a nearby Twitterer to a good meal. We habitually think of oversharers as egoists and self-aggrandizers. But what Jarvis rightly points out is that there is something profoundly selfish in not sharing.

Really? Wow, every day brings new evidence of my moral corruption. Someone out there is eating Pop-Tarts because of my failure, on so many mornings, to describe my wife Teri’s homemade muffins. A couple’s marriage is crumbling because I neglected to tell the world about the last fight Teri and I had — they would have “benefited” from that account . . . somehow. In fact, how could I have gone all these years without keeping a 24-hour-a-day webcam on myself?? Just think of the opportunities for enriching society I have missed! I have absolutely no idea what those opportunities could be, of course, but that just makes me feel worse. And greater still my guilt for assuming that people like Darnell Dockett are being exhibitionistic when they create live video streams of themselves taking showers. Why are you apologizing, Darnell? You should have said, “People, I wasn’t doing this for me, I was doing it for you!”
Seriously, I can understand and sympathize with the argument that what some call "oversharing" is defensible and even in some cases valuable. But to suggest that people who fail to expose their lives online are selfish is the height, or depth, of absurdity.

Saturday, May 22, 2010

in the shallows

I will have more to say about all this at some later point, but for now let me refer you all to Russell Arben Fox’s excellent response to Nick Carr’s forthcoming book The Shallows:

The book's arguments are broad, but its overarching thesis is a simple two-pronged one. First, that the internet has an "intellectual ethic," just as every "intellectual technology" does, "a set of assumptions about how the human mind works or should work" (p. 43). The assumptions at work in and through the internet, as it happens, are ones which revolve around snap judgments, thin generality, shifting attention spans, multitasking, flexibility, and most of all, distractions: the ethic of "the juggler" in other words (p. 112). Second, that in becoming jugglers of information we are actually making it — neurologically, psychologically, structurally — harder and harder for our own brains to do anything otherwise. The "deep reading" made possible by the ethic of the book — the way we could learn to enter into, identify with, creatively work through and embrace or reject a written argument (pp. 71-72) — has a neurological reality to it, and when our brains mold themselves to a different environment of reading, basic cognition, long-term memory, and learning are all dramatically affected, and not in a positive way. On the contrary, we become habituated to viewing all information — literature, science, journalism, scholarship, whatever — as something to be "strip-mined [for] relevant content" (p. 164), and rather than thereby supposedly refining our ability to recognize (in classic marketplace of ideas fashion) good information from bad, in fact our capacity to make learned judgments is physically undermined.

Lots here to think about and respond to!

Thursday, May 20, 2010

admonitory image

Via Margaret Soltan comes the story of a Russian controversy: images from the fiction of Dostoevsky in the Moscow subway. People seem to be especially freaked out by one image in particular.

The Moscow Times story Margaret links to doesn't note it, but the guy with the pistol to his head is surely Svidrigailov, who kills himself near the end of Crime and Punishment after announcing that he is "going to America." Svidrigailov is Raskolnikov's doppelgänger, his evil twin, and illustrates the path Raskolnikov is headed down until his almost-too-late swerve towards Sonia, repentance, and Christianity. In short, he's a delightfully appropriate object lesson for Russians who have sold their souls to commerce and need to recover their spiritual inheritance. I vote to keep him.

the broken system of grading

Cathy Davidson, a professor of English at Duke, announced last year that for one of her classes, “Your Brain on the Internet” — yes, that’s an English class — she would outsource the grading to the students.

I’m trying out a new point system supplemented, first, by peer review and by my own constant commentary (written and oral) on student progress, goals, ambitions, and contributions. Grading itself will be by contract: Do all the work (and there is a lot of work), and you get an A. Don’t need an A? Don’t have time to do all the work? No problem. You can aim for and earn a B. There will be a chart. You do the assignment satisfactorily, you get the points. Add up the points, there’s your grade. Clearcut. No guesswork. No second-guessing ‘what the prof wants.’ No gaming the system. Clearcut. Student is responsible.

But what determines meeting the standard required in this point system? What does it mean to do work “satisfactorily”? And how to judge quality, you ask? Crowdsourcing. Since I already have structured my seminar (it worked brilliantly last year) so that two students lead us in every class, they can now also read all the class blogs (as they used to) and pass judgment on whether the blogs posted by their fellow students are satisfactory. Thumbs up, thumbs down. If not, any student who wishes can revise. If you revise, you get the credit. End of story. Or, if you are too busy and want to skip it, no problem. It just means you’ll have fewer ticks on the chart and will probably get the lower grade. No whining. It’s clearcut and everyone knows the system from day one. (btw, every study of peer review among students shows that students perform at a higher level, and with more care, when they know they are being evaluated by their peers than when they know only the teacher and the TA will be grading).

I have a few comments.

1) Several of the phrases here give us a pretty good idea what motivated Davidson to make this change: “Clearcut. No guesswork. No second-guessing ‘what the prof wants.’ No gaming the system. Clearcut. Student is responsible.” I.e., “I’m sick of grade-obsessed students who just want A’s and complain bitterly if they get anything else.”

2) Note that you can blow off at least some of the work and still get a B. Not only do you not have to do good work to get a B, you don’t even have to do it all. And the dread letter “C” is never invoked.

3) Davidson is naïve if she thinks she can come up with a system that students can’t game. Her announcement received many responses, including this one: “Never underestimate grade orientation . . . I tried something similar several years ago at Buffalo. My mistake was to make it a ‘curved’ class (though only a positive curve). Two ‘gangs’ (one a group of fraternity brothers, the other just people who met and formed up) reached an agreement that they would vote up each others’ work no matter what, and non-members’ work down, no matter what, in order to increase their own grade in the class favorably, and hurt others’ grades. . . . When I intervened, I got complaints: I had set up the rules, several said, if I didn’t like the outcome, how was it their fault.” And of course even without the curve there can be problems. . . .

4) Result: “Davidson, the Ruth F. Devarney Professor of English, said that of the 16 students in the course, 15 already have earned an A and she expects the remaining student to soon finish an assignment that will earn an A as well.” How does Davidson feel about this? “It was spectacular, far exceeding my expectations. It would take a lot to get me back to a conventional form of grading ever again.” So maybe not a problem after all.

I actually have a lot of sympathy for Davidson: she is, I think, genuinely trying to recover for her students an experience of actual learning that the grading system, and students’ obsession with it, systematically undermines. But it’s impossible to make the argument that these grades are legitimate as grades — as serious evaluations of the quality of student work. Students lack the knowledge, the training, the experience, and the motivation to evaluate their peers’ work responsibly and accurately. When Davidson sends those grades along to the Duke registrar’s office, she is collaborating with her students in “gaming the system” — gaming it massively and wholly. And she may well be avoiding some of her own responsibilities as a teacher, which, as Leonard Cassuto points out, will not be good for our profession in the long run.

To be sure, in one sense the system deserves to be gamed — it’s fundamentally broken — and what Davidson is doing is only slightly more extreme than what most professors, enablers of grade inflation, do every day. But the system needs to be faced and critiqued more straightforwardly, more honestly. What Davidson has set up is an elaborate piece of academic theater, and Lord knows the academy is theatrical enough already. She would do better, I think — not well, but better — to decide before even planning a class that everyone in it will receive an A, and then ask: How shall I teach this material, how do I think it really ought to be taught, now that every thought of grading has been banished from my mind?

Wednesday, May 19, 2010

Facing Facebook lock-in

A long time ago (in internet terms anyway) I explained why I was an early adopter and then an early abandoner of Facebook. Given the path Facebook has followed in its treatment of its users — this chart tells you everything you need to know about that — I’m really glad I got out when I did, because I know what it’s like to feel locked into a digital environment I have serious qualms about: thus my ongoing Google emancipation project. If I had been using Facebook regularly for the past few years, I’m sure it would be hard for me to figure out how to do without it, because there are no alternatives to Facebook. Not right now at least, though some folks are hoping and others are trying.

But even if a legitimate alternative emerges, you can't just make the unilateral decision to move there — you have to get your friends to move also, and to move to the same alternative that you’ve chosen. My rejection of Google has been difficult enough, and it doesn't pose that problem: email is email — based on a set of open protocols, thanks be to God — so while I may experience some annoyance at losing my favorite Gmail features, I can communicate with all the same people I communicated with before, and in just the same way. The transition is seamless. Similarly, thanks to the assistance and server-provision of my friend Matt Frost, I’m getting my RSS feeds via Fever — and RSS and Atom are similarly open standards, so that it’s trivial to shift from one client to another.
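
To make the openness point concrete: an RSS or Atom feed is just a documented XML format, so any conforming client, or even a few lines of script, reads exactly the same data. Here is a minimal sketch using the third-party Python library feedparser; the feed URL is a placeholder.

    # feedparser is a third-party library: pip install feedparser
    import feedparser

    # the URL is a placeholder; any valid RSS or Atom feed will do
    feed = feedparser.parse("http://example.com/feed.atom")
    print(feed.feed.get("title", "untitled feed"))
    for entry in feed.entries[:5]:
        # every conforming client sees these same fields, which is what
        # makes switching readers (Google Reader, Fever, whatever) painless
        print("-", entry.get("title", "untitled"), entry.get("link", ""))

Nothing in that sketch depends on which reader produced or consumed the feed, and that is the whole point of an open standard.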

Facebook doesn't work that way: everything about it is closed and proprietary, and since it’s fundamentally social, it doesn't even make all that much sense to talk about taking your own data out of Facebook: the value of the service lies in the relation of your data to other people’s data, and the only way for that value to be ported elsewhere is for all your friends’ data to move along with yours. But that would likely be a case of “meet the new boss, same as the old boss”: any other company that held that much social information would be unlikely to wield its power any less crassly than Facebook has. Power corrupts, and lots of power corrupts a lot. To coin a phrase.

The only answer to this difficulty — since danah boyd’s idea that Facebook should be regulated as a utility is manifestly ridiculous — is for everyone’s social presence online to become more widely distributed among multiple services. But of course that would mean the end of the convenience of having a one-stop social shop.

So Facebook users are presented with a choice: they can have more privacy, more control over their own personal information, or they can have convenience. I bet I know what 95% of them will choose — and again, I might be making the same choice if I had been putting stuff into Facebook for the past three years.

Moral of this story: before buying into an online service, always make sure you know where the exits are. And that they’re unlocked.

(By the way, I wonder if Farhad Manjoo still thinks there’s no legitimate reason for not using Facebook?)

(Also by the way, much of what I say here about Facebook is also true of Twitter — but some of it isn't, which is why I’m not getting off Twitter anytime soon. Maybe I’ll find time to explain this later.)

Tuesday, May 18, 2010

poor academic tools

This is not surprising:

The Kindle isn't doing as well in academic environments as Amazon—and educators—had originally hoped. The Darden Business School at the University of Virginia is near the end of its Kindle "experiment," already concluding that students are not into the Kindle when it comes to classroom learning. They are, however, fans of the Kindle when it comes to using it as a personal reading device.

Darden is one of a handful of schools that decided to give the larger-screened Kindle DX a trial run in select classes to see how well it fared in the academic environment. And, it's not the first to conclude that the Kindle isn't quite right for its students. Arizona State University recently completed its own pilot program for the Kindle DX and wasn't particularly impressed — the university also settled a lawsuit with the American Council of the Blind, agreeing to use devices that were more accessible to the blind in the future. Princeton was also underwhelmed by its Kindle test; one student described the device as a "poor excuse of an academic tool" in an interview with the Daily Princetonian.

Most Darden students seem to agree. When asked to fill out a midterm survey on whether they would recommend the Kindle DX to incoming MBA students, 75 to 80 percent answered "no," according to Darden director of MBA operations Michael Koenig. On the flip side, 90 to 95 percent answered "yes" to whether they would recommend it to an incoming student as a personal reading device.

And I think it’s probably wise to ditch the Kindle as an academic reading environment, at least for now — though as I have recently commented, it’s getting better. I don't think this story is over.

Monday, May 17, 2010

bleg!

Gracious readers, I need some help. I have a vast compendium of stories and quotes about reading that I'm drawing on for my book, but there is one story I can't find — I may not have saved it. And though I have mad Googling skillz they have let me down this time.
Here's what I remember: it was a newspaper article that quoted a college student saying that he didn't see any point in reading books because he could get the necessary information more efficiently online. I found that interesting because it reflects a certain idea about reading that I want to contest, i.e., that it's fundamentally a way of uploading information to the brain.
And here's what I think I remember: the guy was class president, and maybe even student body president, at a university in Florida, and was a philosophy major. (That last item really caught my attention.)
You'd think with all that information I'd be able to track down the story . . . but no. So I would be thankful for any help y'all can give me.

getting off on the wrong foot

Brandon Sanderson's novel Mistborn: The Final Empire begins with a brief italicized passage, spoken by the protagonist, which contains this sentence: "They say I will hold the future of the entire world on my arms." Wait — shouldn't that be either "in my arms" or "on my shoulders"? In idiomatic English people don't hold things on their arms: they might have tattoos or mosquito bites on their arms, but that's about it. What mental image arises when you hear the phrase "She held her young daughter on her arms"? Nobody goes to fantasy novels for literary style, of course, but still!

Lord knows I have perpetrated greater errors, but this kind of thing annoys me, especially when it comes at the beginning of a book, because it compromises my confidence in the writer’s attentiveness to his task — and readers need that confidence, especially when they're starting books by writers new to them.

Something similar happened to me a couple of years ago when I picked up the one-volume abridged edition of William T. Vollmann's Rising Up and Rising Down. Here are the first two sentences of the book: “Death is ordinary. Behold it, subtract its patterns and lessons from those of the death that weapons bring, and maybe the residue will show what violence is.” Okay, let me work this through: Vollmann is asking me to take death tout court, death altogether, and subtract that from deaths that are brought about by the use of weapons. That is, he wants me to subtract a complete set from one of its subsets. Doesn't this leave the conceptual equivalent of a negative number? He can't mean what he’s saying here. He can only mean that if you subtract “the patterns and lessons” of nonviolent death from the patterns and lessons of death altogether (the whole set) you will be able to learn something about violent death. (Note also that he is equating “violent death” with “death caused by weapons,” which is wrong but at least is a comprehensible statement.) In other words, Vollmann didn’t even come close to saying what he meant. Didn't get within a mile of it. And this is how he starts his book!

How much farther did I get into Rising Up and Rising Down? That’s it. No further. Which is probably foolish of me. But I just didn't have any confidence that a guy who can so completely butcher the first sentences of his book would take significant care with the rest of it.

I’m moving ahead with Sanderson, though. A hundred pages in, the story has real potential, even if he writes in that wooden way that’s so common in fantasy and science fiction. I’m not totally unforgiving. Besides, in the first few pages a character was introduced who has a curious network of scars . . . um . . . on his arms. So maybe that fifth sentence of the book wasn't a slip after all? Maybe it's possible to pass judgment too quickly . . . ?

Friday, May 14, 2010

a boon to the annotator

Thanks to one of my wise and learned commenters, I discovered the pretty-much-wholly-unadvertised Your Reading page on Amazon's Kindle site. This has made me think, for the first time, that the Kindle really could be used for serious and scholarly reading: I can see all of my highlighted passages and all of my notes on a single screen, and can copy and paste all of that text into my own manuscripts. (Though I believe there are limits on the amount you can copy at any one time.) I have been using this feature recently, and what a wonderful time-saver it is — as well as offering a great deal of information, and information I have already decided is highly relevant, on a single screen. This is potentially a game-changer for me.
And yes, I know that Amazon is gathering this information, anonymizing it, and giving users a look at the "most highlighted" passages in various books. Not only is that not a problem for me, I consider it an additional benefit: it can be very interesting to see what other readers have annotated in books I myself annotated. Do we see the same passages as significant? Or do we have significant disagreements? Surely there are scholars who want to study these patterns — if there aren't, there ought to be.
(I might add that that page contains notes and highlighted passages only for books I have bought from Amazon: any public-domain texts, or documents of my own, that I have annotated on my Kindle don't show up there — which is a minor inconvenience, since I can get those elsewhere. Indeed I did get them elsewhere or could not have put them on the Kindle.)
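
A further note for tinkerers: the highlights also live on the device itself, in a plain-text file usually named My Clippings.txt, so they can be pulled into a manuscript without going through the website at all. Here is a minimal, unofficial Python sketch; the ten-equals-sign separator and the BOM-tolerant encoding are assumptions about the common format, which may vary across Kindle models and firmware.

    from collections import defaultdict

    def parse_clippings(path="My Clippings.txt"):
        """Group highlight text by book title; the device writes a line of
        ten equals signs between clippings."""
        by_book = defaultdict(list)
        with open(path, encoding="utf-8-sig") as f:  # the file often starts with a BOM
            entries = f.read().split("==========")
        for entry in entries:
            lines = [ln.strip() for ln in entry.strip().splitlines() if ln.strip()]
            if len(lines) >= 3:
                title, text = lines[0], " ".join(lines[2:])  # lines[1] is metadata
                by_book[title].append(text)
        return by_book

    for title, highlights in parse_clippings().items():
        print(title, "->", len(highlights), "clippings")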

The Age of Anxiety

My critical edition of W. H. Auden’s long poem The Age of Anxiety will be out later this year, I hope, though it hasn't shown up on Amazon yet . . . but hey, while I’m handing out excerpts, how about one more? This is from my Introduction to the poem:

The Age of Anxiety begins in fear and doubt, but the four protagonists find some comfort in sharing their distress. In even this accidental and temporary community there arises the possibility of what Auden once called “local understanding.” Certain anxieties may be overcome not by the altering of geopolitical conditions but by the cultivation of mutual sympathy — perhaps mutual love — even among those who hours before had been strangers.

The Age of Anxiety is W. H. Auden’s last book-length poem, his longest poem, and almost certainly the least-read of his major works. (“It’s frightfully long,” he told his friend Alan Ansen.) It would be interesting to know what fraction of those who begin reading it persist to the end. The poem is strange and oblique; it pursues in a highly concentrated form many of Auden’s long-term fascinations. Its meter imitates medieval alliterative verse, which Auden had been drawn to as an undergraduate when he attended J. R. R. Tolkien’s lectures in Anglo-Saxon philology, and which clearly influenced the poems of his early twenties. The Age of Anxiety is largely a psychological, or psychohistorical, poem, and these were the categories in which Auden preferred to think in his early adulthood (including his undergraduate years at Oxford, when he enjoyed the role of confidential amateur analyst for his friends).

The poem also embraces Auden's interest in, among other things, the archetypal theories of Carl Gustav Jung, Jewish mysticism, English murder mysteries, and the linguistic and cultural differences between England and America. Woven through it is his nearly lifelong obsession with the poetic and mythological “green world” he variously calls Arcadia or Eden or simply the Good Place. Auden’s previous long poem had been called “The Sea and the Mirror: A Commentary on Shakespeare’s The Tempest,” and Shakespeare haunts this poem too. (In the latter stages of writing The Age of Anxiety Auden was teaching a course on Shakespeare at the New School in Manhattan.)

But it should also be noted that this last long poem ended an era for Auden; his thought and verse pursued new directions after he completed it.

It's an amazing poem, I think, but an extraordinarily difficult one. If you plan to read it, as surely you should, you might want to look for a carefully annotated edition with a detailed contextual introduction. Just sayin'.

Thursday, May 13, 2010

old man Jacobs

Following the lead of Old Man Stewart here, I’m going to take a moment to shake my fist at Bill Simmons.

I needed something nearly mindless to read at the end of a long, hard, illness-infested semester, so I thought I would try Simmons’s The Book of Basketball: The NBA According to The Sports Guy. I got about a third of the way through it — it’s 736 pages in the hardcover edition, by the way — and then deleted it from my Kindle. Too much porn. Yes, porn — Simmons’s jokes have a pretty limited range: penises, breasts, porn flicks, a movie about porn flicks (Boogie Nights), penises, gambling, other movies, penises, and porn flicks. After a while this became . . . not offensive so much as numbingly repetitive and just plain sad.

It’s sad that porn is so mainstream now that Simmons can assume that pretty much everyone who reads his book knows as much about it as he does. And perhaps he assumes rightly, since hardly any of the customer reviews on Amazon mention this prominent, um . . . feature of the book. Nor does Amazon’s own review, or Booklist’s. Apparently porn has moved from being taboo to disreputable to risqué to defensible to invisible. Just part of the scenery.

Notice, by the way, that I did what we all do these days, which is to soften or even undercut moral disapproval with a self-deprecating joke. (See this post’s title?) Roger Ebert did something similar recently in the first paragraph of his review of Kick-Ass:

Shall I have feelings, or should I pretend to be cool? Will I seem hopelessly square if I find “Kick-Ass” morally reprehensible and will I appear to have missed the point? Let's say you're a big fan of the original comic book, and you think the movie does it justice. You know what? You inhabit a world I am so very not interested in. A movie camera makes a record of whatever is placed in front of it, and in this case, it shows deadly carnage dished out by an 11-year-old girl, after which an adult man brutally hammers her to within an inch of her life. Blood everywhere. Now tell me all about the context.

Yeah. Do please tell me about it. I think the self-deprecation, the near-apology, comes in because we know that there is simply no point in arguing with someone who's happy with a world in which porn is thoroughly mainstream and there's some value in watching films that depict children being beaten and then killing (and, by the way, conscripting actual children to act out those fantasies). I cannot discern any point of commonality that would allow me to formulate an argument that such people would recognize as valid, or perhaps even be able to make sense of. I sympathize with Ebert's simple statement — "You inhabit a world I am so very not interested in" — but I doubt its sufficiency. I may be "so very not interested" in a particular world, yet still have to live in it and experience its consequences.

new directions

Victor Keegan:

My second book, Big Bang, was published as a real book but launched in the virtual world Second Life amid an animated conversation by avatars. Since those early days Second Life has developed an innovative culture of creating three dimensional virtual books including poems which is taking literature in a new direction, albeit for a minority audience.

We’ll know the rules have really changed when a professor gets tenure based on a book published in Second Life, or some future equivalent thereof. Tenure in this world, I mean, not in Second Life.

Wednesday, May 12, 2010

disputed sovereignty

For me, ‘tis the season of proofs. Recently I got proofs for my contribution to the forthcoming Cambridge Companion to C. S. Lewis, which is an essay on the Narnia books. Here’s an excerpt:

Lewis once suggested that literary critics are, and have always been, neglectful of ‘Story considered in itself’. They have been so focused on themes and images and ideological commitments that they have failed to notice the thing that decisively differentiates stories from articles or treatises. If we then try to consider the seven Narnia stories as a single story, what is that story about? I contend that the best answer is disputed sovereignty. More than any other single thing, the story of Narnia concerns an unacknowledged but true King and the efforts of his loyalists to reclaim or protect his throne from would-be usurpers.

This theme is announced early in The Lion, the Witch and the Wardrobe: when the four Pevensies first enter Narnia as a group, their first action is to visit the house of Mr Tumnus. There they discover the house ransacked and a notice of Tumnus’s arrest that concludes with the words ‘LONG LIVE THE QUEEN!’ – to which Lucy replies, ‘She isn’t a real queen at all.’ . . .

Among the first facts established about Narnia are these: it is a realm in which authority is contested, in which the present and visible Queen of This World ‘isn’t a real queen at all’ but rather a usurper, while the rightful King is frequently absent and invisible – but liable to return and assert his sovereignty. . . .

This theme is repeated in several forms in the later books, and is never wholly absent from them. [Here I give examples from each book in the series.] . . .

In short: there is a King of Kings and Lord of Lords whose Son is the rightful ruler of this world. Indeed, through that Son all things were made, and the world will end when he ‘comes again in glory to judge the living and the dead’, though ‘his kingdom will have no end’, in the words of the Nicene Creed. Meanwhile, in these in-between times, the rulership of Earth is claimed by an Adversary, the Prince of this world. And what is asked of all Lewis’s characters is simply, as the biblical Joshua put it, to ‘choose this day’ whom they will serve.

Tuesday, May 11, 2010

books for the ages

Recently a meme flitted around the internet for a few days — a meme about books: “What,” whispered the meme, “are the Ten Books That Have Most Influenced You?” Or something like that; sometimes I have trouble hearing memes, because of the whispering and all. Also because they tend to bore me.

I don't know how to answer the meme’s question, but the question did get me thinking, for once, and what it got me thinking about is this: what books were most important to me at different stages of my life? That one I believe I can answer, at least up until fifteen years ago or so — this is the kind of thing that’s best assessed in retrospect (which is one reason why I’m not answering the meme’s original question). So check out this list:
Age 6: My favorite book then, and for years after, was The Golden Book of Astronomy — how I loved that book. It influenced me so deeply that until I was sixteen I knew that I would be an astronomer. What happened at age 16? Calculus.

Age 10: Robert A. Heinlein, Tunnel in the Sky. An interplanetary survivalist manifesto. I was ten. Enough said.

Age 14: Arthur C. Clarke, Childhood's End. My first tutor in philosophy, comparative religion, comparative mythology, and dystopian futurism. Also a ripping good read. Roughly contemporaneous with my discovery of Dark Side of the Moon. Not since has my mind been so thoroughly blown.

Age 16: Loren Eiseley, The Night Country. My discovery that the essay could be an art form, and that interests in the sciences and in literature could be profitably and brilliantly combined. I read Eiseley’s complete works that year, I think, but the melancholy humor of The Night Country remained with me more strongly than anything else. The (widely anthologized) essay “The Brown Wasps” just devastated me.

Age 20: William Faulkner, Absalom, Absalom! “The past is never dead. It’s not even past.” History is tragedy. “Why do you hate the South?” — No, Shreve, you damned Canadian, you don't understand. You don't understand at all.

Age 22: The Philosophy of Paul Ricoeur. Theology, theory, phenomenology, hermeneutics — all in the mind of one person?? Then anything is possible. Boundless intellectual vistas. I may make it through graduate school after all.

Age 24: W. H. Auden, The Dyer's Hand. Theology, poetry, history, myth — the poet as thinker. The poet as Christian. The Christian who, because he is a Christian, is thinking decades ahead of everyone else about the collapse of psychoanalysis, the end of Christendom, the dead-ends of late modernity. . . . A lifetime’s study commences now.

Age 30: Mikhail Bakhtin, The Dialogic Imagination. He knows everything — the whole of history. As he wrote near the end of his life, “There is neither a first nor a last word and there are no limits to the dialogic context (it extends into the boundless past and the boundless future). . . . At any moment in the development of the dialogue there are immense, boundless masses of forgotten contextual meanings, but at certain moments of the dialogue’s subsequent development along the way they are recalled and invigorated in new form (in a new context). Nothing is absolutely dead: every meaning will have its homecoming festival.” What Auden is as critic and poet for me, Bakhtin is as theorist and thinker.

Age 35: Lesslie Newbigin, The Gospel in a Pluralist Society. If I could have every young and thoughtful Christian read one book, it would be this one. There’s no one quite like Newbigin — or perhaps it would be better to say that there have been few like him since Augustine: the bishop-missionary-theologian.

Since then . . . well, I’ll tell you in a few more years.

UPDATE: I've decided I'd be remiss if I didn't add one more:
Age 38: W. H. Auden, “Horae Canonicae.” I had read these poems several times over the years, but it was only in my late 30s, as I was writing a book on Auden, that their true greatness began to dawn on me. They have permanently and profoundly shaped my understanding of what it means to be a human being living historically, and being accountable for one's own history; and it is through these poems more than through anything else that I have come to understand the meaning of Good Friday.

Monday, May 10, 2010

children, stories, and tears

Here’s a meditation on children’s books that can still make adults cry. The list is largely what you would expect: The Velveteen Rabbit, The Giving Tree, Charlotte’s Web, etc. Also mentioned is Oscar Wilde’s “The Happy Prince,” which Lynne Sharon Schwartz has called “the saddest story ever written,” a judgment that may well be true. I was also pleased to see the story link to this strange and epic thread at my old stomping grounds The American Scene.

I didn't read any of these when I was a child, but I read The Velveteen Rabbit as a teenager. I was a receiving clerk at a bookstore, and while opening a box of books I paused, for some reason, to read that one. Did it extract a tear or two from my callow adolescent eyes? Indeed it did. But primarily it made me angry: I saw it as a crassly blatant attempt to manipulate the emotions of children, who, I thought, are the softest of targets for this kind of thing. Not long afterward I read Charlotte’s Web and had the same response.

And I’ve never been able to dismiss that initial response: I still find myself annoyed by, if not actually angry at, children’s books that end in death or other catastrophic loss. This is probably not rational — a good case can be made for the need to introduce children to the fact of death — but I have always had this lurking feeling that some of these writers enjoy the task just a little too much.

Saturday, May 8, 2010

comparative reading

Michael Grothaus:
After finishing my read of the novel on both mediums — the iPad and the paperback — I am more convinced than ever that the iPad and its iBookstore will not usurp traditional print editions. A paperback is just too versatile and easy a device to read from. It's cheap and replaceable and takes a very low level of care to keep it in working order. It's much lighter and the page turns flow more naturally in your hands than page swipes do on the iPad. Also, a paperback offers no distractions from the printed words. Despite my love for the traditional medium, I do want to say that I love the iBookstore. I use it now to browse for books I think would be interesting and use the "Get Sample" feature to explore the first few chapters. If I like what I read, I'll purchase the print edition from Amazon or at my local bookseller.

Friday, May 7, 2010

B. A. F. C.

Long before Brian Phillips did me the great honor of allowing me to post a thought or two on his site, I expressed my great admiration for his marvelously evocative (and often really, really funny) writing on his blog about soccer, The Run of Play.

In a turn of events that's pretty close to being too good to be true, Brian is giving us a serial novel on the site, B. A. F. C. — which, you'll be pleased to know, stands for Brooklyn Asylum Football Club. Start with the Prologue, and if you want to see a list of all posts (including some preparatory material), go here.

This is going to be great. (And check out Brian's beautiful design for the site while you're at it.)

neuro, cogno, evo, devo

My New Atlantis colleague Ari Schulman has already called our attention to the neuro lit crit debate. All of the entries in that little NYT symposium are worth reading, but there’s one significant issue that no one mentions: that literary criticism grounded in cognitive science resembles most other theoretical approaches in being uncomfortable with, or perhaps just indifferent to, the act of reading.

I am sometimes tempted to argue that all methodological approaches to the study of texts are strategies for avoiding reading. They can have great intellectual value, but there is something evasive about them as well. For instance, think back to the source criticism of the Bible that became so dominant around the turn of the twentieth century: as Robert Alter has often noted over the years, source critics were interested in the Biblical text only as putative evidence for what lay behind it, the lost and therefore magical Quelle. Gabriel Josipovici pointed out some years ago that the nineteenth century had this curious habit of believing that the truth about anything may be found only if one can find its origins.

Neuro-lit-crit approaches — and their siblings, the various evolutionary models of literary writing and reading — are like this. Attention quickly shifts from texts to the evolutionarily produced cognitive processes that create texts and then respond to them. This tends to mean that when such critics actually talk about literary texts, they often say things that aren't very interesting, as Michael Bérubé explains in a review of Brian Boyd:

Much of Boyd’s approach consists of explaining how Homer and Dr. Seuss manage to win and keep our attention, and Boyd castigates contemporary literary criticism for failing to attend to this important matter. But might it not be that “Homer organized the poem in this way so as to win and keep your attention” is the kind of thing that, in literary criticism, literally goes without saying? Similarly, readers for almost three millennia have recognized that Odysseus is one crafty fellow, and that one indication of his craftiness is that he does not act on impulse; even when he’s trapped in a cave with a one-eyed giant eating his men, he takes a deep breath and comes up with a well-considered plan. Boyd explains precisely what this means in neurological terms: “Rapid-fire reactions have to be inhibited (in the orbitofrontal cortex) so that there is time to formulate and assess new options (in the dorsolateral cortex) before acting on them.” Personally, I am tremendously pleased that my species has gotten to the point at which it understands things like this. But how much is added to the history of criticism, finally, by the realization that Odysseus was doing his crafty plotting in his dorsolateral cortex?

So while I enjoy cognitive and evolutionary accounts of literature, I enjoy them more as applications of cognitive science; as a reader, and as someone who wants to understand stories and poems better, I don't get much out of them. And I think that’s because at this point there’s not much to get.

Thursday, May 6, 2010

just thinking out loud here

Kara Swisher tells us that

personal computing is about to get a lot more personal. Internet-based television now in development will recognize a viewer and deliver customized entertainment.

And it will do this without the trusty keyboard and mouse. We're already phasing them out, thanks to the increasing popularity of touch screens — including the patron saint of all this, the Apple iPhone, and a spate of copycat smartphones. All of these devices allow users to navigate without physical buttons or input devices.

Thus, with a flick of the finger, the era of the mouse and the keyboard will soon be over.

Sounds awesome! But I have a question. Ever since I got my first laptop — the original Mac PowerBook 100 in 1992, if you must know — I have done a good deal of my writing in coffee shops, in libraries, in my living room (while other members of my family read or watch TV), and even on airplanes and in trains. I have written a lot, and plan to do a lot more writing in the future. If I’m going to keep writing in such environments, Kara Swisher, how am I going to do it without a keyboard? Am I going to make everyone around me listen as I compose my prose out loud with my voice-recognition software, sort of like Winston Churchill in his bedroom with his army of secretaries? Will “writing in public” be the new “talking on my cellphone in public”?

Please reply soonest.

books as home

Here’s a wonderful interview — alas, too brief — with Alberto Manguel. Some choice bits:

I don't think the book of paper and ink will disappear, as long as we allow for technologies to coexist. The notion that one must replace the other is simply the urge of the new to exist alone on the planet, but it doesn't happen. . . .

It used to be that readers were relegated because they considered themselves far above society, and so the metaphor of the ivory tower developed. Now there's still this idea that the reader doesn't take part in the social game and in politics, the res publica, but for other reasons: he doesn't do it because he's not making any money.

I remember, as a child, the confusion of not knowing what this place was where I was supposed to spend the night: it's a disquieting experience for a child. And what I would do was quickly unpack my books and go back to a book I knew well and make sure the same text and the same illustrations were there. There was always an immense sense of relief. That was home.

Manguel also convinces me that I have been remiss in never visiting the Warburg Institute’s Library in London.

thesis for disputation

The best interpreters of any given text are those whose ideas — whether you agree with them or not — send you back with excitement to that text.

Wednesday, May 5, 2010

gridding London

Via Strange Maps.

Tuesday, May 4, 2010

Twitter as bookclub

Is Neil Gaiman’s American Gods a good choice for the One Book, One Twitter event/bookclub/opinionfest? Gaiman himself isn’t so sure, though he will participate in good spirit, and is even willing to tweet answers to readers’ questions when it’s all done.

To my own surprise, I find that I very much like the idea of a group of people tweeting their responses to a common book — but I don't like it at this scale. It will involve too many people with no common knowledge or experience, and 140 characters aren't enough to provide helpful context. Twitter works in large part because people who follow one another tend to know one another, which means that we have contextual knowledge that helps us to understand our friends’ tweets. A brief message that might seem innocuous or even pointless to someone who doesn't know the tweeter can be hysterically funny to someone who does. Such tacit knowledge is essential to the success of the Twittersphere, a fact that can be obscured when people focus too much on the .01% of Twitter users who have thousands of followers.

But at a smaller scale the idea of a Twitter bookclub is rather appealing. The problem is that I (like most people who use Twitter regularly) have multiple sets of friends who aren't all interested in the same things. With my church friends I might want to read a book that would leave some of my other followers bored, indifferent, or hostile. This problem can be addressed: for instance, I could create a new Twitter account, encourage some friends to follow that account, and use it to initiate a conversation about a book. Then we could send all our comments on the book as replies to that account, which would keep us from cluttering up other people’s Twitter pages. And there may be simpler ways I haven't thought of. It’s a bit of a kludge, but a kludge worth the trouble, I think. May a thousand bookclubs tweet.

computational thinking

Jeannette M. Wing describes “computational thinking” in this PDF:

Consider these everyday examples: When your daughter goes to school in the morning, she puts in her backpack the things she needs for the day; that’s prefetching and caching. When your son loses his mittens, you suggest he retrace his steps; that’s backtracking. At what point do you stop renting skis and buy yourself a pair?; that’s online algorithms. Which line do you stand in at the supermarket?; that’s performance modeling for multi-server systems. Why does your telephone still work during a power outage?; that’s independence of failure and redundancy in design. How do Completely Automated Public Turing Test(s) to Tell Computers and Humans Apart, or CAPTCHAs, authenticate humans?; that’s exploiting the difficulty of solving hard AI problems to foil computing agents.

Computational thinking will have become ingrained in everyone’s lives when words like algorithm and precondition are part of everyone’s vocabulary; when nondeterminism and garbage collection take on the meanings used by computer scientists; and when trees are drawn upside down.

But wait — didn't you just say we're using "computational thinking" already, even without the fancy vocabulary? And when you express the desire that the language of computational thinking enter the general public’s word-hoard, aren't you forgetting that much of the terminology of computation was itself borrowed from everyday life? Children were “backtracking” for their mittens — and hikers to discover missed forks in their paths — long before there were computers.
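Since we're dwelling on backtracking: here is what the mitten hunt looks like once it really is computational. This is a minimal sketch in Python; the toy map of places and the function name are my own inventions, not anything from Wing's paper. It simply illustrates depth-first search with backtracking, retracing its steps at every dead end.

```python
# A toy "retrace your steps" search: depth-first with backtracking.
# The map of places is invented for illustration.
PLACES = {
    "home":     ["bus stop", "park"],
    "bus stop": ["school"],
    "park":     ["pond"],
    "school":   [],
    "pond":     [],  # the mittens, let's say, are here
}

def retrace(place, target, path=None):
    """Return a path from place to target, or None if there isn't one."""
    path = (path or []) + [place]
    if place == target:
        return path
    for next_place in PLACES[place]:
        found = retrace(next_place, target, path)
        if found:      # success somewhere down this branch
            return found
    return None        # dead end: back up and try the next branch

print(retrace("home", "pond"))  # ['home', 'park', 'pond']
```

Note that the formal version adds bookkeeping, not a new idea: the child retracing his steps for his mittens is already doing everything the function does.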

Also, what's the difference between computational thinking and, well, thinking?

(Wing's essay accessed from here.)

Monday, May 3, 2010

blogging re-reading

I’ve just discovered a cool thing at Tor.com, the website of the SF/fantasy publisher: blogs devoted to chapter-by-chapter re-readings of classic fantasy works. There’s one on The Lord of the Rings and one on Robert Jordan’s Wheel of Time series. The idea tends to be a little better than the execution, which is sometimes rather mechanical — too much time devoted to plot summary of each chapter — but I think this is great all the same. In fact, I think I may do something of the kind myself in the not-too-distant future — perhaps working off one of my classes. It’s interesting — well, it’s interesting to me, anyway — to see how my responses to books I often teach have changed over the years after many re-readings and many classroom conversations. Maybe that would be worth recording.