Text Patterns - by Alan Jacobs

Wednesday, June 30, 2010

art of letters

Peak Book

Will Self:

The above leads me to suspect that we indeed may have passed that numinous — but for all that, real — point known as "peak book". Might this mean that the ever-expanding and ever-deranging gap between what is written and what is read may be beginning to narrow at last? Don't be ridiculous! The web has put paid to that — all those petabytes, all those pages! If the consciousness of unread books was bad enough, what about the consciousness of unread web pages?

It all puts me in mind of the Cha'an meditation illness: an incontinent recall of Buddhist texts that is the symptom of a Zen pupil's overstrained psyche, and which can only be rectified by his master hitting him on the head with a stick. Otherwise, the texts proliferate across his visual field, while the meaning of every word is instantly grasped by him. At first, there are just texts the pupil knows, but soon enough these are joined by others he has only heard of — yet these, too, are comprehended in their entirety.

There is worse to come, as flying from all angles wing still more texts that the pupil is compelled to include in his screaming wits — texts he has never heard of at all, texts he didn't know could exist, texts written by alien civilisations, texts doodled on the Etch a Sketch of God by archangels peaking on acid! No stick is big enough to beat this pupil — Humanity. So the maddening and delusory library expands, while the real and useful one is shut down.

creation and consumption

From Megan Garber’s largely positive, thoughtful review of Clay Shirky’s Cognitive Surplus:

But the problem with TV, in this framing, is its very teeveeness; the villain is the medium itself. The differences in value between, say, The Wire and Wipeout, here, don’t much matter — both are TV shows, and that’s what defines them. Which means that watching them is a passive pursuit. Which means that watching them is, de facto, a worse way — a less generous way, a more selfish way — to spend time than interacting online. As Shirky puts it: “[E]ven the banal uses of our creative capacity (posting YouTube videos of kittens on treadmills or writing bloviating blog posts) are still more creative and generous than watching TV. We don’t really care how individuals create and share; it’s enough that they exercise this kind of freedom.”

The risk in this, though, for journalism, is to value creation over creativity, output over impulse. Steven Berlin Johnson may have been technically correct when, channeling Jeff Jarvis, he noted that in our newly connected world, there is something profoundly selfish in not sharing; but there’s a fine line between Shirky’s eminently correct argument — that TV consumption has been generally pernicious in its very passivity — and a commodified reading of time itself. Is the ideal to be always producing, always sharing? Is creating cultural products always more generous, more communally valuable, than consuming them? And why, in this context, would TV-watching be any different from that quintessentially introverted practice that is reading a book?

Sometimes it seems that in Shirky’s ideal world everyone is talking and no one is listening.

(I commented on the idea that not sharing is selfish here.)

Tuesday, June 29, 2010

The Long Ships

Oh, how I love commendations of neglected or forgotten books. Michael Chabon beautifully praises one called The Long Ships. Whose fault is it that this book is unknown?
The fault, therefore, must lie with the world, which as any reader of The Long Ships could tell you, buries its treasures, despises its glories, and seeks contentment most readily in the places where it is least likely to be found. Fortunately for us, it is in just those unlikely places, as Red Orm quickly learns, that the opportunities and treasures of the world may often be found. My encounter with The Long Ships came when I was fourteen or fifteen, through the agency of a true adventurer, my mother’s sister, Gail Cohen. Toward the end of the sixties she had set off, with the rest of her restless generation of psychic Vikings, on a journey that led from suburban Maryland, to California where she met and fell in love with a roving young Dane, to Denmark itself, where she settled and lived for twenty years. It was on one of her periodic visits home that she handed me a UK paperback edition of the book, published by Fontana, which she had randomly purchased at the airport in Copenhagen, partly because it was set in her adopted homeland, and partly because there was nothing on the rack that looked any better. “It’s really good,” she assured me, and I would soon discover for myself the truth of this assessment, which in turn I would repeat to other lucky people over the years to come. Gail’s own adventure came to an end at home, in America, in the toils of cancer. When she looked back at the map of it, like most true adventurers, she saw moments of joy, glints of gold, and happy chances like the one that brought this book into her hands. But I fear that like most true adventurers—but unlike Bengtsson’s congenitally fortunate hero—she also saw, looking back, that grief overtopped joy, that trash obscured the treasure, that, in the end, the bad luck outweighed the good. That is the great adventure, of course, that reading holds over what we call “real life.” Adventure is a dish that is best eaten takeout, in the comforts of one’s own home.

too much research

I couldn't agree more with this call to reduce the amount of published academic research. Too much of what is published is of poor quality, and most published research is ignored by the scholars' peers. (We can only hope that it's the poor-quality work that's being ignored.)
All of us in academia have colleagues who enjoy teaching but who have little or no interest in scholarly publication — and yet are forced to pursue scholarly publication by the tenure system. How good is work likely to be that is done, not for the love of it or out of passionate curiosity, but at metaphorical gunpoint?
The primary valid reason for mandating such published research, I think, is to ensure that the teachers in our classrooms actually know what's going on in their fields. (To be sure, that's not the reason that university administrations mandate publications, but it ought to be.) Without publish-or-perish, there's some danger that teachers will spend their careers placidly recycling what they learned in graduate school, without ever having to reckon with new knowledge and new approaches.
But is publish-or-perish the only, or even the best, solution to this problem? I tend to think that digital technologies — and especially technologies for sifting the web — offer all sorts of new ways for teachers to show their command of their disciplines, and to do so in ways that are more accessible to their students than traditional scholarly articles. A really well-curated wiki, for instance, would be useful both to administrators trying to evaluate a teacher's command of the field and to students taking that teacher's course. The sooner we abandon the current models and encourage professors to explore those new technologies, the sooner we are likely to find a lively alternative to the current cycle of scholarly futility.
I will try to give more examples of how this might work in future posts.

Monday, June 28, 2010

catacombs

“A spotlight illuminates the icon of the Apostle John discovered with other paintings in a catacomb located under a modern office building in a residential neighborhood of Rome, Tuesday, June 22, 2010. Restorers said Tuesday they had unearthed the 4th-century images using a new laser technique that allowed them to burn off centuries of white calcium deposits without damaging the dark colors of the original paintings underneath. The paintings adorn what is believed to be the tomb of a Roman noblewoman and represent some of the earliest evidence of devotion to the apostles in early Christianity.” — here

'nuff said

paywalls vs. ?

Jeff Jarvis is contemptuous of Rupert Murdoch's decision to charge for online access to his newspapers and magazines. I think that it's hard to imagine paywalls working, but what should Murdoch do? Oddly, Jarvis makes not one recommendation. If paywalls are so obviously misbegotten, what are the alternatives? Perhaps if there really were any, Jarvis would have mentioned them. Maybe building paywalls and trying to keep sites alive via advertising are just two different ways of losing money.

this is dialogue?

library ad infinitum:

Putting The Shallows into dialogue with Shirky's Cognitive Surplus, the latter book seems like the one with an actual idea. However smartly dressed, Carr's concern about the corrosiveness of media is really a reflex, one that's been twitching ever since Socrates fretted over the dangers of the alphabet. Shirky's idea — that modern life produces a surplus of time, which people have variously spent on gin, television, and now the Internet — is something to sink one's teeth into.

This is pretty typical of the technophilic reviews I’ve seen so far of Carr’s book: let’s just pretend that Carr didn't cite any research to support his points, or that the research doesn't exist. Let’s just assert that Carr made assertions. In short: Carr makes claims I would prefer to be false, so I’ll call his position an archaic “reflex.” That way I won't have to think about it.

(Steven Johnson, by contrast — see my comments a few posts back — acknowledges that the research on multitasking is there, that it’s valid, and that Carr has cited it fairly. He just doesn't think that losing 20% of our attentiveness is all that big a deal.)

It would be a wonderful thing if someone were to put Carr’s book and Shirky’s into dialogue with each other — I might try it myself, if I can find time to finish Cognitive Surplus — but saying, in effect, “this book sucks” and “this other book is awesome” doesn't constitute dialogue.

Sunday, June 27, 2010

bodies and minds

Really interesting brief essay by Linda Stone:

In our current relationship with technology, we bring our bodies, but our minds rule. “Don’t stop now, you’re on a roll. Yes, pick up that phone call, you can still answer these six emails. Follow Twitter while working on PowerPoint, why not?” Our minds push, demand, coax, and cajole. “No break yet, we’re not done. No dinner until this draft is done.” Our tyrannical minds conspire with enabling technologies and our bodies do their best to hang on for the wild ride.

With technologies like Freedom, we re-assign the role of tyrant to the technology. The technology dictates to the mind. The mind dictates to the body. Meanwhile, the body that senses and feels, that turns out to offer more wisdom than the finest mind could even imagine, is ignored.

At the heart of compromised attention is compromised breathing. Breathing and attention are commutative. Athletes, dancers, and musicians are among those who don’t have email apnea. Optimal breathing contributes to regulating our autonomic nervous system and it’s in this regulated state that our cognition and memory, social and emotional intelligence, and even innovative thinking can be fueled.

Our opportunity is to create personal technologies that are prosthetics for our beings. Conscious Computing. It’s post-productivity, post-communication era computing.

Stone is best known as the coiner of the term “continuous partial attention”.

Friday, June 25, 2010

the writer who forgot how to read

once more without much feeling

I think I may have said all I have to say about e-readers, at least until the technology changes significantly. I've written a good deal about their benefits and their limitations, and I think I've covered both categories fairly well — at least, that's how it seems to me when I read this article about six academics' responses to e-readers. Been all those places, done all those things, bought all those t-shirts.
Two months ago I was reading on my Kindle every day, and thinking it was likely to get more and more important to me; in the past month I have scarcely touched it, largely because suddenly, and for no particular reason that I can identify, its really lousy typography started to bug me. It will probably go in and out of favor with me for a while to come; but any possibility of electronic devices making up the bulk of my reading experience seems a long way off. Electronic reading has been a major topic on this blog but probably won't be in the future — not until two issues get themselves sorted out: typography and DRM.
That said, the recent 2.5 update to the Kindle software — especially the ability to create "collections" — and the relatively recent ability to look at one's notes and marks online have dramatically increased the usefulness of the device.

Thursday, June 24, 2010

re-evaluating

Ross Douthat disagrees with Stanley Fish and me, but the article AKMA linked to in his comment on my first Fish post suggests that the data may be on our side:
Professors rated highly by their students tended to yield better results for students in their own classes, but the same students did worse in subsequent classes. The implication: highly rated professors actually taught students less, on average, than less popular profs.
Meanwhile, professors with higher academic rank, teaching experience and educational experience -- what you might call "input measures" for performance -- showed the reverse trend. Their students tended to do worse in that professor's course, but better in subsequent courses. Presumably, they were learning more.
That conclusion invites another: students are, in essence, rewarding professors who award higher grades by giving them high ratings, and punishing professors who attempt to teach material in more depth by rating them poorly.
I agree with Ross that teachers need to be evaluated, and evaluated well; the question is whether student evaluations as they are currently practiced help or hinder that goal.

Fish follow-up

A generally thoughtful piece by Marc Bousquet, with some valuable considerations of the various ways — legitimate and not-so-legitimate — that teachers can get their students to rate them more highly.
But here's an odd thing:
Fish makes two arguments against the proposal. He squanders pixels bolstering his weaker point, that students aren't necessarily in a position to judge whether Fish-as-teacher-phallus has, ugh, “planted seeds that later grew into mighty trees of understanding.”
How exactly is planting trees a phallic act? Apparently Bousquet is forgetting that there's more than one meaning for the word "seed."

Stanley Fish is right again

In a follow-up to his earlier post about his gratitude for high-school education:

A number of responses to my column about the education I received at Classical High (a public school in Providence, RI) rehearsed a story of late-flowering gratitude after an earlier period of frustration and resentment. “I had a high school (or a college) experience like yours,” the poster typically said, “and I hated it and complained all the time about the homework, the demands and the discipline; but now I am so pleased that I stayed the course and acquired skills that have served me well throughout my entire life.”

Now suppose those who wrote in to me had been asked when they were young if they were satisfied with the instruction they were receiving? Were they getting their money’s worth? Would they recommend the renewal of their teachers’ contracts? I suspect the answers would have been “no,” “no” and “no,” and if their answers had been taken seriously and the curriculum they felt oppressed by had been altered accordingly, they would not have had the rich intellectual lives they now happily report, or acquired some of the skills that have stood them in good stead all these years. . . .

“Deferred judgment” or “judgment in the fullness of time” seems to be appropriate to the evaluation of teaching.

And that is why student evaluations (against which I have inveighed since I first saw them in the ’60s) are all wrong as a way of assessing teaching performance: they measure present satisfaction in relation to a set of expectations that may have little to do with the deep efficacy of learning.

This is exactly right. I have often argued over the years that if we must have such evaluations, students should be asked for their responses to a course at least one semester after completing it. Instead, they are asked for their judgments near the end of a semester, when they are probably busier and more stressed than at any other time, and when they haven't completed their final work for the class or received their final evaluations. It’s a perfect recipe for useless commentary.

By the way, colleagues typically respond to my suggestion by arguing that if students have to wait a semester before evaluating courses, they won't even remember what kind of experience they had. I counter, “If true, wouldn’t that be worth knowing?”

Wednesday, June 23, 2010

a note about comments

Gentle readers, I am thankful for the thoughtfulness of the commenters on this blog. I learn from y'all, and even when I disagree my thinking is often sharpened and clarified. I think I have an obligation to respond thoughtfully to people who have responded thoughtfully to me — but I may have some trouble doing that in the coming weeks. I am trying to finish this darn book, and that leaves me little time to do other kinds of typing. But I will respond when I can.
Another thing: I often write a week's worth of posts at once and then schedule them to appear throughout the week. When you're writing a book, multitasking is your enemy, so I try to block out relatively large chunks of time to devote to it. Writing blog posts every day doesn't fit my modus operandi very well. So I may go a couple of days without even looking at this blog, which means that I may not even see your impassioned and intelligent comment until you've forgotten that you wrote it. . . .

the uses of books

Market Day

To me, James Sturm's Market Day provides a far more compelling visual world than David Small's Stitches. It is beautiful and memorable. But even so, the story leaves something to be desired. Sturm tells the story of a rug-weaver who takes his wares to market but by the end of the day has, it seems, completely changed his life. The problem is that there just aren't enough . . . well, words to explain this change. There's no doubt that Mendleman suffers some serious jolts during what he had expected to be a commonplace visit to a nearby market town, but are those jolts really sufficient to cause him to throw over his life's work, his beloved vocation? Would any man so dedicated make so dramatic a decision so quickly? It's not impossible — but it's not at all likely. We need to learn more about Mendleman in order to decide whether his catastrophe makes sense. I think we also need more background, in Mendleman's character and in the culture, to account for the descent into obscenity that he suffers near the end of the story.
There are some things pictures do better than words: Sturm creates with remarkable power the materiality of the old world of Eastern European Jewish culture. But other things words do better than pictures: Mendleman's delicate psychological state is something that can't be rendered fully and effectively without more language. Or so it seems to me.
It's a very worthwhile book all the same. I want to emphasize that in case he sees this, because he told Amazon that he started his recent internet fast in part because of his responses to (amateur and professional) reviews:
In some ways, Market Day was the reason I went offline. I can get obsessive sometimes when I’m online, and I knew if I had a book out, I’d be looking at my Amazon ranking, and I’d be re-reading interviews, and, you know, “What does Chewbacca45 think of my book?” Like Mendleman, every one of those things would be either an ego puff, or a little arrow. As I’ve gotten older and done a few books now, I’ve realized how fleeting this moment is...and by not being online, I feel like I can enjoy this very brief window. I feel like I have a healthier relationship with the book.

Tuesday, June 22, 2010

Stitches

Text people — my tribe — tend not to get graphic stories. Or we struggle to get them. We zip through a whole book in less than an hour and feel cheated. “This could have been a short story!” “I bet the whole book doesn't have more than a couple of thousand words.”

To respond in this way is of course to miss the point. If you’re not allowing yourself to be absorbed into the book’s visual world — if your eye is passing as quickly as possible over the images in order to get to the next words — then you’re not experiencing the book as it was meant to be experienced. In a very meaningful sense, you’re not reading it at all.

That said, the artist needs to create a fully-realized visual world for the reader to enter. When that happens, you don't think about the slightness of the story, the paucity of words, because so much is being communicated by the images that the story feels full. I didn't like Persepolis all that much, largely because I find the character of the young Marjane Satrapi neither interesting nor attractive, but I thought the world she created with her images a very rich one.

By contrast, David Small’s Stitches felt slight to me, the story sketchy and without much nuance, because I didn't feel that the images conjured up a whole world. The book has been greatly praised, so it’s likely that I’m missing something, but to me the story largely felt like an act of vengeance against Small’s parents, especially against his mother. And while she very well may have deserved his anger, I think most readers tend to sympathize with a narrator more when there seems to be an attempt at fairness — even when the point of the book is to dramatize Small’s experience at the time. Small adds a note at the end, with a photograph of his mother, that suggests a more complex story, but it’s not there in the narrative itself, and I think that’s a problem.

But if the book’s visual world were more compelling to me, I’m not sure I would feel this way: I might be experiencing the terror of young David Small’s world so fully that I wouldn’t be standing back and judging. Your mileage, of course, may vary.

(You can read an excerpt from Stitches here.)

(And I'll have thoughts on another graphic story tomorrow.)

Monday, June 21, 2010

quantity and quality revisited

It would seem that Steven Johnson isn't the only advocate of the quantity-trumps-quality defense of online life. The other day I mentioned Cory Doctorow’s praise for Clay Shirky’s new book, but Jonah Lehrer has a different and considerably more skeptical take:

After Shirky introduces his argument, much of the remaining 170 pages of the book are devoted to outlining what this surplus has produced. The author begins by describing the protests in South Korea over the importation of American beef. Interestingly, a majority of the protesters were teenage girls, who had been motivated to take to the streets by their online conversations. (Many of these conversations took place on a website dedicated to a Korean boy band.) Shirky describes this protest movement in breathless terms: "When teenage girls take to the streets to unnerve national governments, without needing professional organizations or organizers to get the ball rolling, we are in new territory," he writes.

But are we really? There were, after all, a few political protests before the internet. Somehow, the students at Kent State found a way to organize without relying on the chat rooms of Bobdylan.com. While the internet might enable a bit more youthful agitprop, it seems unlikely that we are on the cusp of a new kind of politics, driven by the leisure hours of the young. . . . After getting enthralled by the opening premise of the book, I expected Shirky to have a long list of exciting new examples of our surplus at work. This is where the book gets slightly disappointing. From Wikipedia, Shirky takes us on a tour of ... lolcats. He cites ICanHasCheezburger.com as an example of what happens when our cognitive surplus is transformed into "the stupidest possible creative act." While Shirky pokes fun at the site, he still argues that it represents a dramatic improvement over the passive entertainment of television. "The real gap is between doing nothing and doing something, and someone making lolcats has bridged that gap." There are two things to say about this. The first is that the consumption of culture is not always worthless. Is it really better to produce yet another lolcat than watch The Wire? And what about the consumption of literature? By Shirky's standard, reading a complex novel is no different than imbibing High School Musical, and both are less worthwhile than creating something stupid online. While Shirky repeatedly downplays the importance of quality in creative production — he argues that mediocrity is a necessary side effect of increases in supply — I'd rather consume greatness than create yet another unfunny caption for a cat picture.

I would add that when we read books, if we’re reading them well, we’re not just consuming: our minds are deeply and fully and actively engaged, and in ways that are measurable, if you need that kind of evidence. (Carr has some good stuff about this in The Shallows.) Shirky is generally contemptuous of literary reading, but if he thinks literary reading is only passive consumption, the science doesn't bear him out.

Saturday, June 19, 2010

Steven Johnson's numbers game

Unfortunately, Steven Johnson, once one of the sharpest cultural commentators around, seems to be turning into a caricature. His recent response to the concerns about digital life articulated by Nicholas Carr and others is woefully bad. He simply refuses to take seriously the increasingly large body of evidence about the negative consequences of always-on always-online so-called multitasking. Yes, “multitasking makes you slightly less able to focus,” or, as he later says, “I am slightly less focused,” or, still later, “we are a little less focused.” (Am I allowed to make a joke here about how multitasking makes you less likely to notice repetitions in your prose?)

But what counts as a “little less”? Choosing to refer only to one of the less alarming of the many studies available, Johnson reports that it “found that heavy multitaskers performed about 10 to 20 percent worse on most tests than light multitaskers.” Apparently for Johnson losing 20% of your ability to concentrate is scarcely worth mentioning. And apparently he hasn't seen any of the studies showing that people who are supremely confident in their multitasking abilities, as he appears to be, are more fuddled than anyone else.

Johnson wants us to focus on the fabulous benefits we receive from a multitasking life. For instance,

Thanks to e-mail, Twitter and the blogosphere, I regularly exchange information with hundreds of people in a single day: scheduling meetings, sharing political gossip, trading edits on a book chapter, planning a family vacation, reading tech punditry. How many of those exchanges could happen were I limited exclusively to the technologies of the phone, the post office and the face-to-face meeting? I suspect that the number would be a small fraction of my current rate.

And then, later: “We are reading more text, writing far more often, than we were in the heyday of television.” So it would appear that Johnson has no concept whatsoever of quality of interaction — he thinks only in terms of quantity. How much we read, how much we write, how many messages we exchange in a day.

That’s it? That’s all? Just racking up the numbers, like counting your Facebook friends or Twitter followers? Surely Johnson can do better than this. I have my own concerns about Carr’s arguments, some of which I have tried to articulate here, but the detailed case he makes for the costs of connection deserves a far more considered response than Johnson is prepared to give it.

I think the Steven Johnson of a few years ago would have realized the need to make a much stronger — and probably a wholly different — case for the distracted life than this sad little counting game. He should get offline for a few weeks and think about all this some more.

these days

Tom Bissell, from his book Extra Lives, an extended defense of the art of the video game and the value of spending large chunks of your life playing them:

Once upon a time, I wrote in the morning, jogged in the late afternoon, and spent most of my evenings reading. Once upon a time, I wrote off as unproductive those days in which I had managed to put down “only” a thousand words. Once upon a time, I played video games almost exclusively with friends. Once upon a time, I did occasionally binge on games, but these binges rarely had less than a fortnight between them. Once upon a time, I was, more or less, content.

“Once upon a time” refers to relatively recent years (2001-2006) during which I wrote several books and published more than fifty pieces of magazine journalism and criticism — a total output of, give or take, 4,500 manuscript pages. I rarely felt very disciplined during this half decade, though I realize this admission invites accusations of disingenuousness or, failing that, a savage and justified beating. Obviously, I was disciplined. These days, however, I am lucky if I finish reading one book every fortnight. These days, I have read from start to finish exactly two works of fiction — excepting those I was not also reviewing — in the last year. These days, I play video games in the morning, play video games in the afternoon, and spend my evenings playing video games. These days, I still manage to write, but the times I am able to do so for more than three sustained hours have the temporal periodicity of comets with near-Earth trajectories.

Friday, June 18, 2010

nobody does it better than A. O. Scott

A. O. Scott on Toy Story 3:

Perhaps no series of movies has so brilliantly grasped the emotional logic that binds the innate creativity of children at play to the machinery of mass entertainment. Each one feeds, and colonizes, the other. And perhaps only Pixar, a company Utopian in its faith in technological progress, artisanal in its devotion to quality and nearly unbeatable in its marketing savvy, could have engineered a sweeping capitalist narrative of such grandeur and charm as the “Toy Story” features. “Toy Story 3” is as sweet, as touching, as humane a movie as you are likely to see this summer, and yet it is all about doodads stamped and molded out of plastic and polyester.

Therein lies its genius, and its uncanny authenticity. A tale that captured the romance and pathos of the consumer economy, the sorrows and pleasures that dwell at the heart of our materialist way of life, could only be told from the standpoint of the commodities themselves, those accretions of synthetic substance and alienated labor we somehow endow with souls.

today lolcats, tomorrow the world

Cory Doctorow on Clay Shirky’s new book:

Shirky is very good on the connection between trivial entertainments and serious business, from writing web-servers to changing government. Lolcats aren't particularly virtuous examples of generosity and sharing, but they are a kind of gateway drug between zero participation and some participation. The difference between "zero" and "some" being the greatest one there is, it is possible and even likely that lolcatters will go on, some day, to do something of more note together.

Can someone explain to me how the third sentence there follows from the previous two?

revisiting Barsetshire (2)

The second major impression that strikes me, on this re-reading, is Trollope’s almost metafictional refusal to play some of the typical games of the novelist. A great example comes in Barchester Towers when we see our heroine, Eleanor Harding, pursued simultaneously by the feckless and improvident Bertie Stanhope and the scheming, oily Reverend Obadiah Slope. Trollope pauses in the midst of his narration and makes this rather surprising statement:

But let the gentle-hearted reader be under no apprehension whatsoever. It is not destined that Eleanor shall marry Mr. Slope or Bertie Stanhope. And here perhaps it may be allowed to the novelist to explain his views on a very important point in the art of telling tales. He ventures to reprobate that system which goes so far to violate all proper confidence between the author and his readers by maintaining nearly to the end of the third volume a mystery as to the fate of their favourite personage. Nay, more, and worse than this, is too frequently done. Have not often the profoundest efforts of genius been used to baffle the aspirations of the reader, to raise false hopes and false fears, and to give rise to expectations which are never to be realized? Are not promises all but made of delightful horrors, in lieu of which the writer produces nothing but most commonplace realities in his final chapter? And is there not a species of deceit in this to which the honesty of the present age should lend no countenance?

He does the same thing in Doctor Thorne, when he introduces a digression on a minor character thusly: “Though, by so doing, we shall somewhat anticipate the end of our story, it may be desirable that the full tale of Mr Gazebee's loves should be told here. When Mary is breaking her heart on her death-bed in the last chapter, or otherwise accomplishing her destiny, we shall hardly find a fit opportunity of saying much about Mr Gazebee and his aristocratic bride.” Just a sly reminder that — of course — Trollope has no intention of allowing his beloved Mary Thorne to “break her heart on her death-bed.” Which is why I didn't introduce this post with the words “SPOILER ALERT.”

Thursday, June 17, 2010

revisiting Barsetshire

It’s been ten years or more since I read Anthony Trollope’s Barsetshire novels, and I am returning to them now with great delight. I have now re-read the first four, and will probably skip ahead to The Last Chronicle of Barset, because of my particular dislike for Lily Dale, the masochistic heroine of The Small House at Allington. There’s no reason why I should force myself, you know.

A number of impressions strike me this time around, and I’ll just mention a couple of them — one today, one tomorrow.

Trollope is wonderful with characters who are bad but could have been good, or good but could have been bad. This is particularly evident in Framley Parsonage. One of my favorite Trollope characters is Lady Lufton, whose pride in her family and class could easily have turned her into a snobbish tyrant — were it not for the essential goodness of her heart, her deep desire to love and be loved. Conversely, after going to considerable trouble to portray Mr. Sowerby as a scoundrel, Trollope then pivots and goes to almost equal trouble to show us that he could have been something much better:

We know not what may be the nature of that eternal punishment to which those will be doomed who shall be judged to have been evil at the last; but methinks that no more terrible torment can be devised than the memory of self-imposed ruin. What wretchedness can exceed that of remembering from day to day that the race has been all run, and has been altogether lost; that the last chance has gone, and has gone in vain; that the end has come, and with it disgrace, contempt, and self-scorn — disgrace that never can be redeemed, contempt that never can be removed, and self-scorn that will eat into one's vitals for ever? Mr. Sowerby was now fifty; he had enjoyed his chances in life; and as he walked back, up South Audley Street, he could not but think of the uses he had made of them. He had fallen into the possession of a fine property on the attainment of his manhood; he had been endowed with more than average gifts of intellect; never-failing health had been given to him, and a vision fairly clear in discerning good from evil; and now to what a pass had he brought himself!

Somewhat later in the book, Sowerby is speaking to Mark Robarts, a man whom he has entangled in financial difficulties, and is moved to tears, real tears, by Robarts's plight. He would do something to help him if he could, but there's nothing he can do — he has compromised himself too thoroughly, is too deep in debt, is so utterly discredited that he's helpless. Moreover, Robarts doesn't believe in Sowerby's good will, which grieves Sowerby but does not surprise him.
And here's one more layer: Trollope also points out that, in the midst of his self-condemnation and his attempts to help his friends, if not himself, Sowerby still dresses elegantly, still pays for cabs rather than walking anywhere. Somehow, Trollope comments, such ruined men always manage to find ready cash for life's little luxuries, to which they are so accustomed that no real choice occurs to their minds. A cab in London isn't a luxury to Sowerby, it's just what one does.
Trollope is never given enough credit for the subtlety and complexity with which he renders such points.

Wednesday, June 16, 2010

the moral lives of emergent adults

I seem to be in an academic-pedagogical vein these days, and while I’ll shift from that tomorrow, let me go at it one more time. . . .

Some people have an inexhaustible appetite for the what’s-the-matter-with-these-darn-kids subgenre of the jeremiad; others can't stand it and find it intrinsically offensive. But whatever side you’re on, and especially if you’re not on either, it would be worth your time to pay attention to Souls in Transition, the recent book by Christian Smith and Patricia Snell about “emergent adults” — basically, people between the ages of 18 and 23. Smith and Snell aren’t hectoring finger-waggers; instead, they’re primarily reporters, and it would be unjust to blame them if much of what they have to report is troubling. It’s not all bad news, by any means, but here are two representative passages from an early chapter in which they summarize their findings:

Voices critical of mass consumerism, materialistic values, or the environmental or social costs of a consumer-driven economy were nearly nonexistent among emerging adults. Once the interviewers realized, after a number of interviews, that they were hardly in danger of leading their respondents into feigned concern about consumerism, the interviewers began to probe more persistently to see if there might not be any hot buttons or particular phrases that could tap into any kind of concern about materialistic consumerism. There were not. Very many of those interviewed simply could not even understand the issue the interviewers were asking them about.

. . .

The majority of those interviewed stated . . . that nobody has any natural or general responsibility or obligation to help other people. . . . Most of those interviewed said that it is nice if people help others, but that nobody has to. Taking care of other people in need is an individual's choice. If you want to do it, good. If not, that's up to you. . . . Even when pressed — What about victims of natural disaster or political oppression? What about helpless people who are not responsible for their poverty or disabilities? What about famines and floods and tsunamis? — No, they replied. If someone wants to help, then good for that person. But nobody has to.

If nothing else, all this is a salutary reminder to me of how different my Christian students are from the American norm. Not that they’re untouched by the movements Smith and Snell describe, by any means; but by and large their characters have been formed by quite different forces. Which raises the question, not just for Christian teachers but for all teachers: what are the best ways to educate people for meaningful participation in a society which is coming more and more to look like the world of these “emergent adults”?

Tuesday, June 15, 2010

evaluating the humanities

Here, in a nutshell, is the insoluble problem with “the humanities” in the academy, by which I mean most people in English departments and a good many people in history, Continental philosophy, art history, political theory, etc.:

1) The scholarly performance of academic humanists is evaluated — by colleagues, tenure committees, etc. — using criteria developed for evaluating scientists.
2) Those criteria are built around the idea of knowledge creation.
3) But many humanists aren't sure what counts as knowledge creation for them, since they are not able to follow any agreed-upon method for testing hypotheses.
4) This problem grows more pressing as expectations for publication rise: scholars are asked to create more and more knowledge without being sure what knowledge is.
5) Thus the Cycle of Theory, in which an approach to doing humanist work arises, is deemed outrageous, is more and more generally accepted, becomes orthodoxy, is challenged by a new approach, and becomes superannuated. See: the New Criticism, archetypal criticism, structuralism, deconstruction, the New Historicism, Queer Theory, eco-criticism, etc. Lather, rinse, repeat.

But all these movements do have something in common: they generate books and articles that look quite similar. Basically, you get discursive prose with footnotes, and that’s about it (give or take a few typographical eccentricities in the Derridean tradition).

As has often been pointed out, no widely influential theoretical model has arisen since the New Historicism, about thirty years ago. This apparent end to a Cycle that has given generations of graduate students and assistant professors new stuff to do has raised anxiety to ever-higher levels.

One result is that humanists are becoming increasingly willing to look at models of scholarship that offer something other than discursive prose with footnotes. Thus the work of Franco Moretti and his students, mentioned here earlier, and also that of Brian Boyd and Jonathan Gottschall.

All of these scholars have decided — in their very different ways — that the humanities need to stop seeing themselves as radically different from the sciences, but instead need to appropriate science and learn from it. This may be a matter of incorporating scientific discoveries (Boyd) or appropriating scientific methods (Moretti, Gottschall). But either way, it creates an interesting new situation in which the problem of evaluating scholarship in the humanities is going to become more, not less, complicated.

Though I am strongly critical of some of these approaches, I think this is an exciting time to be a humanist scholar — or would be, if institutional support for the humanities weren’t evaporating. Though some of this innovation derives from the shaky place of the humanities in the university, and from attempts to shore up that place, I don't think any of that is likely to work. It would be great to see what might come of this ferment in an environment in which the humanities were well-funded and institutionally secure.

Monday, June 14, 2010

previous post, continued

From the Washington Post:

Increasingly, though, another view is emerging: that the money schools spend on instructional gizmos isn't necessarily making things better, just different. Many academics question industry-backed studies linking improved test scores to their products. And some go further. They argue that the most ubiquitous device-of-the-future, the whiteboard — essentially a giant interactive computer screen that is usurping blackboards in classrooms across America — locks teachers into a 19th-century lecture style of instruction counter to the more collaborative small-group models that many reformers favor.

"There is hardly any research that will show clearly that any of these machines will improve academic achievement," said Larry Cuban, education professor emeritus at Stanford University. "But the value of novelty, that's highly prized in American society, period. And one way schools can say they are 'innovative' is to pick up the latest device."

(I thought a whiteboard was, you know, a white board. But anyway.)

the teacher's dilemma

No thinking person can simply be for or against digital technology. You have to be able to use your critical faculties and evaluate any particular technology in an independent way, trying to balance the plusses (which there will be) against the minuses (which there will also be).

In my job as a teacher I use some recent technologies and avoid others. I assign blogs for some of my classes; I ask students to submit papers as PDFs which I then annotate. This kind of thing makes some of my colleagues think I am very cutting edge. On the other hand, I don't use Blackboard, not because I am philosophically opposed to it but because I think it is really terrible software — though admittedly not as terrible as it used to be. I think I can get at what Blackboard tries to do in other ways, some of them electronic, some not. I make a good many handouts, often with very sophisticated software like Omnigraffle, because I think such handouts are almost always better than PowerPoint. As I say, I evaluate on a case-by-case basis.

I just learned the other day that the classroom in which I usually teach — maybe three-fourths of my classes are there, the others in seminar rooms — will be transformed this summer into a “smart” classroom. This means that an enormous console will be hauled in, to enable a range of digital audio and video stuff, online and local. But the size of this console, and its accompanying projector and screen, will in turn require that the rectangular room’s seats be rotated ninety degrees, so that they will now be oriented lengthwise — that is, broad and shallow instead of long and narrow.

But this means (a) the space will be much more crowded, leaving me little room to move, and (b) it will be impossible to rearrange the seating. Now, as long as I have been teaching in that classroom I have arranged the seats in a two-desk-deep semicircle. This has enabled me to move among the students, to lecture when I need to, but also to get them talking to each other about the books we read. I have, I realize, adapted my teaching style to the characteristics of that space — I use the space as one of the tools in my pedagogical toolbox. But from now on the space will be significantly altered, and everyone in it will be in neat rows, facing the same direction, so that they can all look at pictures on the screen — and spend less time looking at books.

I think this is a bad trade-off, because what we have here is the facilitation of a particular set of technologies at the expense of others. It’s a net reduction of pedagogical options — or at best an even trade — and the pedagogical options it does enable are ones poorly suited to teaching students to read books with care.

So now I have a choice: Do I try to get my class reassigned to another room, almost certainly in another building? (One major advantage of this room is that it’s twenty feet from my office.) Do I try to adapt my teaching style to this new environment? Or do I try to persist in my old teaching style, fighting this new environment?

Friday, June 11, 2010

the humanities: once more with vagueness

James Mulholland, an English professor at the Wheaton College that Ann Curry confuses with mine, writes about the future of the humanities:

We could think of humanities centers as the beginning of a “more is more” strategy for our fields in the corporatized university. One constant complaint from humanists is that academic budgets are more devoted to financing the sciences, from expensive labs to costly science journals. In the competition for scarce resources, we need to be more aggressive in attracting research money, whether it’s through the pursuit of “big humanities” (digital projects, long-term edited collections, and the like) or through centers that can draw donors who want to see their names in lights.

Some scholars worry that such efforts would undercut departmental budgets. But I think the opposite could happen. Humanities centers would complement traditional disciplines, provide publicity for the college, and, most important, direct money back to traditional disciplines. Centers are good advertising within the college, especially for donors who can see what it is that we do.

I’ve read Mulholland’s article several times, and I can't figure out what exactly he is recommending. What counts as a “humanities center”? Who works there? What do they do? Are students involved, and if so in what ways? Also, what are “big humanities”? What kinds of “digital projects” would be likely to attract research money away from the sciences? Why would funding agencies be interested in “long-term edited collections,” and what would they be collections of?

I don't think the humanities are going to get anywhere unless we can come up with something a lot more specific than this.

And I will say one thing about my own discipline: humanities centers or no humanities centers, I do not think that the study of literature will long survive as an independent concern within universities. I think by the time I retire literature will be studied only as part of two other disciplines: rhetoric and cultural history. And while that will be unfortunate in some ways, it won't be the worst thing that ever happened to literature.

Thursday, June 10, 2010

re-enactment

Here, via Kottke. Too awesome.

writing, silence, and privacy

From a brilliant essay by Jed Perl:

Writing, before it is anything else, is a way of clarifying one’s thoughts. This is obviously true of forms such as the diary, which are inherently solitary. But even those of us who write for publication can conclude, once we have clarified certain thoughts, that these thoughts are not especially valuable, or are not entirely convincing, or perhaps are simply not thoughts we want to share with others, at least not now. For many of us who love the act of writing — even when we are writing against a deadline with an editor waiting for the copy — there is something monastic about the process, a confrontation with one’s thoughts that has a value apart from the proximity or even perhaps the desirability of any other reader. I believe that most writing worth reading is the product, at least to some degree, of this extraordinarily intimate confrontation between the disorderly impressions in the writer’s mind and the more or less orderly procession of words that the writer manages to produce on the page. . . .

I am not saying that writers need to be or ought to be isolated, either from other writers or from the reading public at large. But writers must to some degree believe that they are alone with their own words. And writers who are alone with their words will quite naturally, from time to time, conclude that some of those words should remain private. This needs to be emphasized right now, when so few people in the publishing industry understand why anything that has been written, and especially written by a well-known author, should not be published, and not published with the widest possible readership in mind.

. . . What I fear is that many readers are coming to believe that a writer who holds something back from publication is somehow acting unnaturally. Nobody understands the extent to which, even for the widely acclaimed author with ready access to publication, the process of writing can sometimes necessitate a rejection or at least an avoidance of one’s own readers. That silence is a part of writing — that the work of this day or this week or even this year might for good reason be withheld — is becoming harder and harder to comprehend.

The dominance in our culture of social networking, especially but not only Facebook, intensifies this problematic situation. Shyness and introversion, as a search for either of those words on Amazon.com will show you, are regularly seen as pathologies; Eric Schmidt thinks that if you don't want Google to know everything about you, you must have something discreditable to hide; Mark Zuckerberg believes, or says he believes, that the exposure of your life on Facebook promotes honesty and integrity. Clearly there are people who would like to see a social stigma attached to a concern for privacy: will they succeed in making it happen?

Wednesday, June 9, 2010

Hypatia and the Great Library

David Bentley Hart is doing his best to replace some long-told lies with some approximation of the truth. Well, we all know how that kind of thing works out.
Mark Twain, from his great address "Advice to Youth":
Think what tedious years of study, thought, practice, experience, went to the equipment of that peerless old master who was able to impose upon the whole world the lofty and sounding maxim that “Truth is mighty and will prevail”—the most majestic compound fracture of fact which any of woman born has yet achieved. For the history of our race, and each individual’s experience, are sewn thick with evidences that a truth is not hard to kill, and that a lie well told is immortal. There is in Boston a monument of the man who discovered anesthesia; many people are aware, in these latter days, that that man didn’t discover it at all, but stole the discovery from another man. Is this truth mighty, and will it prevail? Ah no, my hearers, the monument is made of hardy material, but the lie it tells will outlast it a million years. An awkward, feeble, leaky lie is a thing which you ought to make it your unceasing study to avoid; such a lie as that has no more real permanence than an average truth. Why, you might as well tell the truth at once and be done with it. A feeble, stupid, preposterous lie will not live two years—except it be a slander upon somebody. It is indestructible, then of course, but that is no merit of yours. A final word: begin your practice of this gracious and beautiful art early—begin now. If I had begun earlier, I could have learned how.

Tuesday, June 8, 2010

academic, interviewed

By Conor Friedersdorf at his new Ideas blog for the Atlantic. Familiar stuff to regular readers of this blog.

history men

I have mentioned elsewhere that the best work of history I have read in a long, long time is Keith Thomas's The Ends of Life: Roads to Fulfillment in Early Modern England.
But if it weren't for Thomas, I would be singing in the streets about Jonathan Rose's The Intellectual Life of the British Working Classes. I am trying to resist the temptation to put an anecdote from that book on every page of my own book about reading. One example will have to serve, concerning "an uneducated Irish laborer" who managed to acquire sufficient literacy not just to read but even to write:
When he shut himself in a bedroom to write, his anxious family held a conference and did everything to dissuade him. "There's something far wrong with a man who writes letters to himself!" his brother exploded. "If you'd just been a pouf the priest could have talked to you or one of us could have battered it out of you. But what the hell can anybody do about a writer?" When he received his first check for a short story, his mother was convinced that he had committed some kind of fraud and insisted that he return it. And when a television play of his was reviewed "his mother was shocked and said that theirs had been a respectable family until then; never once had any of their names been in the paper."
Of course, there are many other examples of working-class families who invested great hopes in education and encouraged, or even pressured, their children to academic success — a situation that has its own perils. Anyway: read this book. There's a brand-new edition out.

Monday, June 7, 2010

back in the day

Stanley Fish:

I wore my high school ring for more than 40 years. It became black and misshapen and I finally took it off. But now I have a new one, courtesy of the organizing committee of my 55th high school reunion, which I attended over the Memorial Day weekend.

I wore the ring (and will wear it again) because although I have degrees from two Ivy league schools and have taught at U.C. Berkeley, Johns Hopkins, Columbia and Duke, Classical High School (in Providence, RI) is the best and most demanding educational institution I have ever been associated with. The name tells the story. When I attended, offerings and requirements included four years of Latin, three years of French, two years of German, physics, chemistry, biology, algebra, geometry, calculus, trigonometry, English, history, civics, in addition to extra-curricular activities, and clubs — French Club, Latin Club, German Club, Science Club, among many others. A student body made up of the children of immigrants or first generation Americans; many, like me, the first in their families to finish high school. Nearly a 100 percent college attendance rate. A yearbook that featured student translations from Virgil and original poems in Latin.

UPDATE: I just noticed that when I posted this I didn't manage to include my comment on this quote. What I said was that Fish leaves out an important piece of information: while he identifies himself as the first in his family to finish high school, he doesn't describe his parents' attitude towards his education. My guess is that the parents of almost all the kids who attended Providence Classical High were deeply committed to their children's education and pressed them to do their very best, and then go on to the best college possible. I don't think schools as rigorous as that can succeed without the strongest possible backing of their strict and high standards by the parents of almost every kid in the place. At the end of his essay Fish pronounces his judgment on this kind of education: "Worked for me" — which seems to imply that it can and should be implemented widely so it can work for others as well. But that's true only where parental involvement is deep and parental standards are high.

me and the Beast

I'm still thinking about my future with or without Google, so don't jump to any conclusions if you see me wearing this. It doesn't mean a thing. Really.

every day in every way. . .

Jonah Lehrer:

There is little doubt that the Internet is changing our brain. Everything changes our brain. What Carr neglects to mention, however, is that the preponderance of scientific evidence suggests that the Internet and related technologies are actually good for the mind. For instance, a comprehensive 2009 review of studies published on the cognitive effects of video games found that gaming led to significant improvements in performance on various cognitive tasks, from visual perception to sustained attention. This surprising result led the scientists to propose that even simple computer games like Tetris can lead to “marked increases in the speed of information processing.” One particularly influential study, published in Nature in 2003, demonstrated that after just 10 days of playing Medal of Honor, a violent first-person shooter game, subjects showed dramatic increases in visual attention and memory.

Carr’s argument also breaks down when it comes to idle Web surfing. A 2009 study by neuroscientists at the University of California, Los Angeles, found that performing Google searches led to increased activity in the dorsolateral prefrontal cortex, at least when compared with reading a “book-like text.” Interestingly, this brain area underlies the precise talents, like selective attention and deliberate analysis, that Carr says have vanished in the age of the Internet. Google, in other words, isn’t making us stupid — it’s exercising the very mental muscles that make us smarter.

I wish I could believe this. And Clay Shirky too.

Also, I wanted to finish reading this story but I had to write this blog post. And tweet some.

Sunday, June 6, 2010

more from Nick Carr

Are Google Maps and GPS bad for our brains?:
Véronique Bohbot, a professor of psychiatry at McGill University in Montreal, has done extensive research demonstrating the connection between the size of the hippocampus and the degree to which we employ our navigational skills. She worries that, should our hippocampi begin to atrophy from a lack of use in navigation, the result could be a loss of memory and a growing long-term risk of dementia. 'Society is geared in many ways toward shrinking the hippocampus,' she said in an interview with journalist Alex Hutchinson last year. 'In the next twenty years, I think we're going to see dementia occurring earlier and earlier.'

Saturday, June 5, 2010

Future Fatigue

William Gibson:

Alvin Toffler warned us about Future Shock, but is this Future Fatigue? For the past decade or so, the only critics of science fiction I pay any attention to, all three of them, have been slyly declaring that the Future is over. I wouldn’t blame anyone for assuming that this is akin to the declaration that history was over, and just as silly. But really I think they’re talking about the capital-F Future, which in my lifetime has been a cult, if not a religion. People my age are products of the culture of the capital-F Future. The younger you are, the less you are a product of that. If you’re fifteen or so, today, I suspect that you inhabit a sort of endless digital Now, a state of atemporality enabled by our increasingly efficient communal prosthetic memory. I also suspect that you don’t know it, because, as anthropologists tell us, one cannot know one’s own culture.

The Future, capital-F, be it crystalline city on the hill or radioactive post-nuclear wasteland, is gone. Ahead of us, there is merely…more stuff. Events. Some tending to the crystalline, some to the wasteland-y. Stuff: the mixed bag of the quotidian.

Friday, June 4, 2010

the book of books

O'Connor investigated

Over at University Diaries, I am having a debate with Margaret Soltan about Flannery O'Connor. Come join the fun.

The Death and Life of the Book Review

The Death and Life of the Book Review:
In 1999 Steve Wasserman was three years into his tenure as the editor of The Los Angeles Times Book Review, and that July he published a review of Richard Howard's new translation of Stendhal's The Charterhouse of Parma. The reason was simple: Howard is among the best translators of French literature. As Wasserman explained several years ago in a memoir of his days at the Los Angeles Times published in the Columbia Journalism Review, the review of the book, written by Edmund White, was stylish and laudatory. The Monday after the piece ran, the paper's editor summoned Wasserman to his office and admonished him for running an article about "another dead, white, European male." But the paper's readers in Los Angeles thought otherwise. Soon after the review appeared, local sales of the book took off; national sales did too when other publications reviewed the book. The New Yorker ended up printing a "Talk of the Town" item that traced the book's unexpected success to The Los Angeles Times Book Review. In his memoir, Wasserman relates a similar story about Carlin Romano, then the books critic at the Philadelphia Inquirer, who was scolded by an editor for running as the cover story of his section a review of a new translation of Tirant Lo Blanc, a Catalan epic beloved by Cervantes. "Have you gone crazy?" the editor asked. "Perhaps the most remarkable aspect of America's newspapers in the 1990s," Romano reflected, "is their hostility to reading in all forms."
I may have to comment later on this fascinating article.

the patron saint of modern reading

This week I've spent some time thinking and writing about John Self, the protagonist of Martin Amis’s 1984 novel Money. Self is not what you’d call a reader. He may be living in the pre-internet days, but he has access to telephones, and directs television commercials for a living. He’s used to thinking in thirty-second bites. However, John Self is also enamored of a woman who won't talk to him until he reads a book she gives him. “Martina’s present was called Animal Farm and was by George Orwell. Have you read it? Is it my kind of thing?” Perhaps not, since Self runs aground on the first sentence — “Mr. Jones, of the Manor Farm, had locked the henhouses for the night, but was too drunk to remember to shut the pop-holes” — because he doesn't know what pop-holes are. (Neither did I when I read the book, I might add. But I didn't try to find out and I didn't stop reading.)

Still, Martina is immensely alluring, so he doesn't give up. “I positioned the lamp and laid out the cigarettes in a row. I then drank so much coffee that by the time I cracked the book open on my lap I felt like a murderer getting his first squeeze of juice from the electric chair.” Fearful of boredom, Self may have overdone the caffeine; for whatever reason he has trouble keeping on track.

[Orwell’s] book kicked off with the animals holding a meeting and voicing grievances about their lives. Their lives did sound rough — just work, no hanging out, no money — but then what did they expect? I don't nurse realistic expectations about Martina Twain. I nurse unrealistic ones. It’s amazing, you know, what big-earning berks can get these days. If you’re heterosexual, and you happen to have a couple of bob, you can score with the top chicks. The top prongs are all going gay, or opting for pornographic berk women. At the animals’ meeting, they sing a song. Beasts of England. . . . I went and lay down on the sack. My head was full of interference.

And things don't get much better for Self from here on. “This body of mine is a constant distraction. Here I am, trying to read, busy reading, yet persistently obliged to put the book aside in order to hit the can, clip my nails, shave, throw up, clean my teeth. . . .” Note that he is “obliged” to perform these acts by his rebellious body. “I started reading again. I went on reading for so long that I became obsessed by how long I had gone on reading. I called Selina.” But he does not call Selina to tell her that he has finished the book. After failing to reach her, he resumes his task, but:

Reading takes a long time, though, don't you find? It takes such a long time to get from, say, page twenty-one to page thirty. I mean, first you’ve got page twenty-three, then page twenty-five, then page twenty-seven, then page twenty-nine, not to mention the even numbers. Then page thirty. Then you’ve got page thirty-one and page thirty-three — there's no end to it. Luckily Animal Farm isn't that long a novel. But novels . . . they’re all long, aren't they. I mean, they’re all so long. After a while I thought of ringing down and having Felix bring me up some beers. I resisted the temptation, but that took a long time too. Then I rang down and had Felix bring me up some beers. I went on reading.

You know what’s great about John Self? He eventually finished the book. Not that he fully appreciated it: "The only thing that puzzled me was this whole gimmick with the pigs. . . . I mean, how come the pigs were meant to be so smart, so civilized and urbane? Have you ever seen pigs doing their stuff?"

However, if he had had a laptop and wireless access he wouldn't have gotten past page twenty-one. But still.

Thursday, June 3, 2010

Carr talk

At the Technology Liberation Front, Adam Thierer has a long, detailed, and helpful review of Nick Carr's The Shallows.
Meanwhile, Mr. Carr himself is pursuing a strategy of delinkification, and following up with, um, links to responses.

oppose it, I say!

Story here.

Leopard Skin Chief at Oxford

My friend and colleague Tim Larsen on A History of Oxford Anthropology:

The tone is set in a preface in which the argument is advanced that Oxford was able to lead the field because its collegiate system "provided a lived experience of 'tribal' life." This analysis is developed apparently in all seriousness, although the reader begins to wonder when it comes to lines such as the "Head of House is like the Leopard Skin Chief." The generosity of All Souls College made Oxford anthropology, and this is repaid by a chapter that explains that the college had more money than it knew what to do with and that it hoped anthropology would strengthen the British empire. Of a key founding figure, R. R. Marett, we are told "it would be difficult to identify any ideas of his that have had a lasting influence." A. R. Radcliffe-Brown is discussed in a chapter framed around the question of whether he was "a major disaster to anthropology." His successor, E. E. Evans-Pritchard, was generally liked personally and admired professionally, which prompts much vague handwringing about the "mythology" of this period and ineffectual attempts to puncture its "special aura."

John Davis, who succeeded to the chair in 1990, gleefully reports what Sir Isaiah Berlin said to him at the time: "I have known all your predecessors: two charlatans, one eccentric and one sensible man. I wonder what you will turn out to be." Then there is the anecdote about a particularly heated exchange in which one anthropologist was complaining that scholars were over-determining artifacts. Holding up one he asked, "What is the use of this lump of metal?" To which a rattled colleague menacingly replied, "Well, I could kill you with it."

thesis for disputation

"Attacking bad books is not only a waste of time but also bad for the character. If I find a book really bad, the only interest I can derive from writing about it has to come from myself, from such display of intelligence, wit and malice as I can contrive. One cannot review a bad book without showing off." — W. H. Auden

Wednesday, June 2, 2010

publishers: please cooperate!

Jacqui Cheng:

There are already several open e-book formats out there — ePub and MobiPocket are just a couple. The major e-book devices even support them; with a little bit of effort, you can get an ePub version of a book onto your Kindle or iPad in no time. The problem is the “effort” part—e-book sellers like Amazon, Barnes & Noble, and Apple heavily market their own stores and make it even easier for customers to simply buy the proprietary formats.

The downside, of course, is that customers are then locked into specific formats and devices. As noted in a recent Reuters piece, a Kindle book may be readable on a Kindle app on the iPad, but it’s still limited to the Kindle “universe” — other devices that lack Kindle apps won’t be able to handle those formats, and vice versa.

“Our fondest wish is that all the devices become agnostic so that there isn’t proprietary formats and you can read wherever you want to read,” Penguin Group CEO David Shanks told Reuters. “First we have to get a standard that everybody embraces.”

Some believe the industry itself needs to get its act together before pointing fingers at Amazon or Apple. “Indeed, there are several open formats, but the problem is that they still need work,” self-published author Cesar Torres told Ars. Torres believes that if publishers worked together to get behind a particular open format, the format would improve and device makers would be more motivated to offer wider support.

“The problem still lies with publishing houses and their inability to talk to one another. Everyone is doing their own thing without any regard for readers or customers,” Torres said. “Apple and Amazon would be toast if publishers really got their act together.”

for connoisseurs only

Ruth Franklin in The New Republic:
At one panel I attended, titled “The Next Decade in Book Culture,” Nicholas Latimore, a publicist for Knopf, waxed lyrical about the material qualities of a hardbound book. His house has always gloried in its beautiful design: the colophon page at the end that identifies and gives the history of the font the book was set in; the deckled edges on the most prized books, which encourage the reader to turn pages slowly rather than flipping through; even the attention that book designers put into choosing paper with “the right tooth”—that perfectly calibrated roughness of texture. In response, someone commented that the hardcover book might be going the way of the vinyl LP: largely obsolete, replaced by a cheaper and more convenient product, but still prized by connoisseurs for the superior quality of the aesthetic experience it offers. A collective gasp was stifled at this idea. But perhaps it’s not so crazy.
Of course, the book has been around a lot longer and is far more deeply entrenched in our vision of culture—both what it is and what we want it to be—than the LP, which turned out to be a disposable format, a means to an end. Yet what the digital revolution in the music industry shows us, I think, is that what people want is music: the format doesn’t matter nearly as much as the product. As we moved from 45s to LPs to eight-tracks to cassette tapes to CDs to MP3s, the music itself remained the constant. What we wanted, it turned out, was to have as much music as possible at our fingertips at any given moment, easily accessible. This hasn’t been an unmitigated boon for the music industry, but it’s also been far from an unmitigated disaster. And I have faith that people—who have been telling stories just as long as we’ve been singing songs—will continue to want novels, too, no matter the format. I like deckled edges and toothsome paper as much as the next person, but if they turn out to be extravagances we can no longer afford, well, I still plan to keep reading.

confusion reigns

Don Norman and Jakob Nielsen see user-interaction chaos in new gestural devices:

In Apple Mail, to delete an unread item, swipe right across the unopened mail and a dialog appears, allowing you to delete the item. Open the email and the same operation has no result. In the Apple calendar, the operation does not work. How is anyone to know, first, that this magical gesture exists, and second, whether it operates in any particular setting?

With the Android, pressing and holding on an unopened email brings up a menu which allows, among other items, deletion. Open the email and the same operation has no result. In the Google calendar, the same operation has no result. How is anyone to know, first, that this magical gesture exists, and second, whether it operates in any particular setting?

Whenever we discuss these examples with others, we invariably get two reactions. One is "gee, I didn't know that." The other is, "did you know that if you do this (followed by some exotic swipe, multi-fingered tap, or prolonged touch) that the following happens?" Usually it is then our turn to look surprised and say "no we didn't know that." This is no way to have people learn how to use a system.

Norman and Nielsen point to some real design flaws, but aren't the new conventions developing pretty rapidly, and aren't people also figuring them out pretty rapidly? (The one unforgivable sin in UI design, though, is when there's no undo for a destructive action.)

Tuesday, June 1, 2010

one vision of the digital humanities

From the Chronicle of Higher Education, an overview of Franco Moretti’s lab at Stanford, devoted to quantitative work on, especially, Victorian fiction:

The idea that animates his vision for pushing the field forward is “distant reading.” Mr. Moretti and Mr. Jockers say scholars should step back from scrutinizing individual texts to probe whole systems by counting, mapping, and graphing novels.

And not just famous ones. New insights can be gleaned by shining a spotlight into the “cellars of culture” beneath the small portion of works that are typically studied, Mr. Moretti believes.

He has pointed out that the 19th-century British heyday of Dickens and Austen, for example, saw the publication of perhaps 20,000 or 30,000 novels — the huge majority of which are never studied.

The problem with this “great unread” is that no human can sift through it all. “It just puts out of work most of the tools that we have developed in, what, 150 years of literary theory and criticism,” Mr. Moretti says. “We have to replace them with something else.”

Let’s be a touch more precise, Professor Moretti: if someone wants to study all those forgotten novels in a special sense of the word “study” that involves not reading any of them, then your approach may well be a good way to do that. But those of us who are interested in, you know, reading books will probably have to employ other intellectual tools. If you’re not into that, that’s cool.

"darkness and silence"

Robert McCrum:

For new and original books to flourish, there must be privacy, even secrecy. In Time Regained, Marcel Proust expressed this perfectly. “Real books”, he wrote, “should be the offspring not of daylight and casual talk, but of darkness and silence.”

How many “real books” enjoy “darkness and silence” today? Not many. In 2010, the world of books, and the arts generally, is a bright, raucous and populist place. The internet – and blogs like this – expose everything to scrutiny and discussion. There’s a lot of self-expression, but not necessarily much creativity.

So the question I ask is: can the secret state of creative inspiration flourish on global platforms on which everything is exposed, analysed and dissected?

I don't think this is quite right. I think there are some kinds of books — some kinds of art — that can only be made in privacy, by people who seclude themselves from other voices and work through a project without interference. But that’s not a universal rule. Many of the ideas in the book I’m writing now, on reading in a digital age, have made their first appearance on this blog: I have tried out thoughts, had readers agree or disagree or send me links to related ideas. Even when you don't get a lot of comments on an idea, just putting it before the public forces you to think about it in a different way than when it’s only in your head.

Maybe what McCrum should have written is that there must be a stage in the making of any significant work that takes place “in darkness and silence.” But even Proust gained a great deal of the knowledge and insight that fed his books from social occasions — even Proust!