Eyal Ophir, the study’s lead investigator and a researcher at Stanford’s Communication Between Humans and Interactive Media Lab, said: "We kept looking for multitaskers’ advantages in this study. But we kept finding only disadvantages. We thought multitaskers were very much in control of information. It turns out, they were just getting it all confused."— NYT
Monday, August 31, 2009
The smart guys over at Snarkmarket are thinking about how we'll be reading in 2020. I want to follow up on this later, but for now consider some of the questions they're asking:
• What kind of devices will we use to read?
• What formats will be used to deliver documents?
• What kinds of documents will be “read” — text, image, video, audio, hybrids?
• How will documents be written and produced?
• How will documents be bought, sold, and otherwise supported?
• How will contributors be compensated?
• How will reading work in different industries?
They're hoping to do a presentation at South by Southwest Interactive. I hope it works out.
Mi hart aeks, and a drouzy numnes paens
Mi sens, as tho of hemlok I had drunk,
Or emptyd sum dul oepiaet to the draens
Wun minit past, and Lethe-wards had sunk. ...
Friday, August 28, 2009
Wednesday, August 26, 2009
I am notably deficient in visual imagination — at least I think I am. It may not be my problem. I have always found it extremely difficult, if not impossible, to visualize objects, especially man-made ones, when they are described to me. I remember reading Iris Murdoch’s novel The Philosopher’s Pupil and struggling mightily to comprehend her descriptions of the complex of mineral baths that are central to the town of Ennistone and (in a sense) to the novel’s plot as well. I just couldn't manage it, though I read and re-read. But maybe this is a problem with words rather than with me; maybe words just aren't good at describing the appearance of complicated objects.
In either case, the foregoing might help you understand why one of my favorite books — one I return to again and again — is A Visual Dictionary of Architecture, by the great Frank Ching. If my house were burning down, there aren't too many things I’d rescue before Ching’s book. It is endlessly delightful and instructive. There aren't many images of it online, but here’s one:
Just yesterday I was poring over the pages on joinery, delighted to discover the astonishing variety of ways that one piece of wood can be connected to another. One of my favorite sections is on arches. And the two pages on light — on incidence, reflection, refraction — have taught me more than any of my physics classes ever did. And look at his beautiful printing, which has been adapted by Adobe and incorporated into its Tekton font family.
From this nice profile, here’s Ching’s drawing of the Campo de' Fiori in Rome:
And here’s Ching’s demonstration (I think from this book) of how to draw shadows:
All hail Frank Ching!
Name of applicant: Tolstoy, Leo (interviewed in England)
We recommend something sensible from Marks & Spencer and a different hairdresser for the candidate to cut a more reputable figure at any future interview. While he scorned the beef sandwiches (although there were cheese and pickle too, and a vegan plate could have been ordered if he had made his dietary preferences clear), his chances truly began to go off when he asked the other candidates and staff how they intended to live moral lives. Even before the questions at his personal interview, he inquired if lepers could be hired as teaching assistants. While Slavically born, educated and souled, he must surely have undertaken his doctoral work in the American Midwest, perhaps with a year abroad in the Middle East, in the company of his fellow zealots. Despite his off-putting air of nobler-by-birth and holier-than-thou, he showed an uncanny understanding of our staff's fatigue, faculty frustrations in the hiring process and the nerves of fellow candidates. Some may consider the insight into people and sweeps of history - which he covered in his 15-minute presentation on the functions of great men, impiety of marriage and destiny of Russia - possibly beneficial. But we opted to forgo a colleague who is both mind reader and messiah. When the committee called to inform him of his rejection, the front desk reported that he had left on a mission to succour British serfs, after disparaging the hotel furniture.
The fates of other applicants — names include Dostoevsky, Flaubert, Austen, and Kafka — may be discovered here.
Tuesday, August 25, 2009
ABSTRACT: Chronic media multitasking is quickly becoming ubiquitous, although processing multiple incoming streams of information is considered a challenge for human cognition. A series of experiments addressed whether there are systematic differences in information processing styles between chronically heavy and light media multitaskers. A trait media multitasking index was developed to identify groups of heavy and light media multitaskers. These two groups were then compared along established cognitive control dimensions. Results showed that heavy media multitaskers are more susceptible to interference from irrelevant environmental stimuli and from irrelevant representations in memory. This led to the surprising result that heavy media multitaskers performed worse on a test of task-switching ability, likely due to reduced ability to filter out interference from the irrelevant task set. These results demonstrate that media multitasking, a rapidly growing societal trend, is associated with a distinct approach to fundamental information processing.
(AP story here.) In short, the more you try to multitask the worse you’ll be at it; or, perhaps, the worse you are at it the more you’ll try to do it.
Contrast this to the Swiss writer Robert Walser, who understood the importance of single-minded focus on the one thing needful:
In 1933 his family had him transferred to the asylum in Herisau, where he was entitled to welfare support. There he occupied his time with chores like gluing paper bags and sorting beans. He remained in full possession of his faculties; he continued to read newspapers and popular magazines; but, after 1932, he did not write. "I'm not here to write, I'm here to be mad," he told a visitor.
(I was reminded of this anecdote by reading this post.)
Monday, August 24, 2009
Erin O’Connor, from a post that I’ve been thinking about for the past six months or so:
English teachers are mediators. They are not ends in themselves. That’s how it should be, anyway. They are training wheels that young readers ought to be able to shed once they acquire the skills they need to read purposefully and profitably on their own. But, too often, this backfires. Kids get turned off, and reading just becomes a chore they have to do for school. Or — and this pattern is less discussed, but still troubling — they become dependent. They may really enjoy reading — but they think they need a class, and spoonfed lectures, and guided discussions, in order to get anything out of what they read. They are willing and eager — but have learned from their teachers exactly what they should not have learned. They have become passive where they should be active, and the teacher becomes a crutch for laziness, fear, uncertainty, and sometimes even a creeping snobbery about reading, about choosing what to read, deciding how to read, and figuring out what one thinks about what one has read. These folks grow up into the kind of adults who answer questions about their favorite books by listing works they think should be their favorites — but that they may never have even actually read.
I enjoy most of the books I read — I admire many of them — I adore some of them — but it is not often that I think “I have no words to express how desperately I wish I had written that.” But just such a longing consumed me as I read Arika Okrent’s scholarly yet delightful In the Land of Invented Languages: Esperanto Rock Stars, Klingon Poets, Loglan Lovers, and the Mad Dreamers Who Tried to Build A Perfect Language. You must buy it, or at least borrow it from your library.
However, you should not get the Kindle version: the book has a number of fascinating charts that are reproduced on the Kindle screen at a size too small and a resolution too low to be read. Publishers need to be more responsible about this kind of thing, and Amazon too: if images can't be seen clearly, they should be left out of the Kindle editions, and perhaps made available on a website. Very annoying.
Speaking of websites, the one for the book features a shockingly long list of invented languages, from Lingua Ignota to Proto-Central Mountain, most of which are accompanied by brief translation samples and random tidbits of endlessly fascinating trivia. Consider yourself warned.
Thursday, August 20, 2009
A video of Larry Lessig's talk on the Google Book Settlement — and its legal and technological background — is here. The talk is careful and nuanced, as is typical of Lessig, and while it gives me a great deal to think about, it really doesn't help me understand what I should do.
I like the general theme of this post very much — we do pay a price, often too high a price, for adding features to our software (and other things) — but this paragraph is wrong:
A perfectly blank sheet of white paper is a tool of infinite possibility. For input you could use a pencil, a pen, a crayon, a marker, a stamp, a brush or more. You could use all of those at once. You can write or draw or paint in any direction. Even multiple directions on the same sheet. You can use any color you want. How you enter data onto it and how that information is structured seems almost limitless. That flexibility and power is available to you because of [its] lack of features. In fact, it is featureless — devoid of them.
No, a sheet of paper has many features — traits — and they are worth noting. Compared to most things in our world, it is remarkably expansive in the two dimensions of height and width, considering its lack of depth. It’s also flexible and foldable. These features are sometimes wonderful (e.g., when you want to write a detailed letter and you only have one sheet of paper, which you can fold into a small square and stick in your back pocket) and sometimes regrettable (e.g., when the letter gets all crimped from staying in your pocket, or when you can't find anything flat and solid to write on).
Most of the paper we see every day is also designed so as to receive quite easily all sorts of marks and impressions: the same sheet of paper can go through many different kinds of printers, can be typed on with a typewriter, and, as the post notes, can be marked on by pencils, pens, markers, paint brushes, and who knows what else. This is a feature, not featurelessness.
Curiously enough, a piece of paper is rarely square, and as a rectangle offers us the choice of portrait or landscape mode; however, once one of those modes has been chosen it’s not possible to change completely to the other. Unless you turn the page over.
We find the spatial proportions of an ordinary piece of paper — or a stack of such pieces, in the form of a codex or notebook — so appealing that we design electronic readers to resemble it. We like its receptiveness to marking so much that we keep hoping for someone to design a really excellent tablet-and-stylus computer.
A plain sheet of paper, then, has a great many features, and all of them are worth thinking about. Similarly, when I write in a text editor rather than a word processor, that’s not because my text editor has fewer features than Microsoft Word, but rather because the features it does have are better suited to the task of writing.
Wednesday, August 19, 2009
I don't know what to think about the Google Books Settlement, though since it affects me, I really ought to have a position.
The Author’s Guild is strongly for it. The William Morris Agency is strongly against it. Professors at the University of California accept the general outlines but suggest some changes. What “may be the most fundamental challenge to the settlement yet” comes from this attorney. My own agent, the fabulous Christy Fletcher, is leaving it up to me.
I’ve got sixteen days to decide whether or not to opt out.
This story has been around for a while, but it’s a New Atlantis kind of thing — cf. Ari Schulman’s fine essay on “Why Minds Are Not Like Computers” — so: Deena Skolnick Weisberg (along with some colleagues) has been doing some really interesting work lately on the way people — especially readers of newspapers and magazines — respond to explanations of human behavior that invoke, or claim to invoke, neuroscience. Here’s the abstract of that paper:
Explanations of psychological phenomena seem to generate more public interest when they contain neuroscientific information. Even irrelevant neuroscience information in an explanation of a psychological phenomenon may interfere with people’s abilities to critically consider the underlying logic of this explanation. We tested this hypothesis by giving naïve adults, students in a neuroscience course, and neuroscience experts brief descriptions of psychological phenomena followed by one of four types of explanation, according to a 2 (good explanation vs. bad explanation) X 2 (without neuroscience vs. with neuroscience) design. Crucially, the neuroscience information was irrelevant to the logic of the explanation, as confirmed by the expert subjects. Subjects in all three groups judged good explanations as more satisfying than bad ones. But subjects in the two nonexpert groups additionally judged that explanations with logically irrelevant neuroscience information were more satisfying than explanations without. The neuroscience information had a particularly striking effect on nonexperts’ judgments of bad explanations, masking otherwise salient problems in these explanations.
The whole paper, “The Seductive Allure of Neuroscience Explanations,” may be found on Weisberg’s website as a PDF. You can also find there a related and somewhat less technical essay, “Caveat Lector: the Presentation of Neuroscience Explanation in the Popular Media” (also PDF). And here’s another example of what happens when “nonsense dresses up as neuroscience.”
“Explanations” of human behavior that claim the authority of evolutionary psychology can work in much the same way — it’s remarkable how rarely people notice when such explanations are purely speculative. For instance, in How the Mind Works here’s the explanation Steven Pinker gave for the story of Hamlet: “Fictional narratives supply us with a mental catalogue of the fatal conundrums we might face someday and the outcomes of strategies we could deploy in them. What are the options if I were to suspect that my uncle killed my father, took his position, and married my mother?”
The best response I ever saw to this was by the philosopher Jerry Fodor: “Good question. Or what if it turns out that, having just used the ring that I got by kidnapping a dwarf to pay off the giants who built me my new castle, I should discover that it is the very ring that I need in order to continue to be immortal and rule the world? It’s important to think out the options betimes, because a thing like that could happen to anyone and you can never have too much insurance.”
Some “explanations” offer considerably less than meets the eye.
Hmmm. Response to my earlier post on my self-announced candidacy for Apple’s Board of Directors has been . . . I don't want to say “lukewarm,” but — well, choose your own term. I’m turning my attentions elsewhere.
I learned today that the notoriously touchy Alain de Botton has just been named the writer-in-residence of London’s Heathrow Airport. This interests me strangely. If the Apple Board thing doesn't work out, I’m going to seek to be named writer-in-residence at DuPage Airport. Wish me luck!
Monday, August 17, 2009
One of many pleasures to be had at the Book Cover Archive.
Friday, August 14, 2009
From Automata, which I learned about from Ministry of Type, which I learned about from Urge of the Letter, which I learned about from Snarkmarket. I don't remember where I learned about Snarkmarket. The important point is that all of those links lead you to smart posts which, in turn, link to other smart posts.
Giving y'all some extra posts today (and some good links on the Twitter feed — see right) because I’ll be away for a few days, organizing my campaign to be named to Apple’s board. Namaste.
Now that Eric Schmidt of Google has resigned from Apple’s Board of Directors, there’s a good deal of speculation about his possible replacement. I would like to nominate myself.
Yes. I would be an excellent choice. It’s true that most members of the board — all but one, in fact — are CEOs of other companies, but that one other member doesn't even have a steady job. (Gore is his name, I believe.) So between the CEOs and the unemployed guy, what do we have? Nothing, that’s what. That board needs someone to represent The Apple User: someone who knows the company’s products, who is committed but not uncritical, and who is aware of the ways that Apple’s decisions affect power users, average users, novice users — and the people who make software for Apple. (I’ve bought a lot of third-party Mac and iPhone software over the years.) Also, as a writer I could shape and edit any statements from the board so that they would come more closely to resemble statements in English.
I bought the first Mac — all 128k of it — and have used Macs ever since. I bought the first PowerBook and the first iMac. I bought the first iPod and the first iPhone. (Well, I don't mean literally the very first, I mean the first iteration of each product line. You get me.) I was present at the creation, almost. I know what Apple does well and what it does not so well. I can explain how to fix the problems with the iPhone App Store. I know what the best piece of Apple software is (Keynote) and what the worst is (Mail). No one could do this job as well as I could.
Steve. You know in your heart I’m right. Call me.
So the BBC wants you to vote for the Nation’s Favourite Poet. Or it wants some people to vote, anyway — I’m a little confused because I’m not sure what “nation” the good ol’ Beeb has in mind. England? Great Britain? The United Kingdom of Great Britain and Northern Ireland? Or perhaps something more amorphous: Albion, say, or the British Isles?
The nominated poets themselves aren't much of a guide. In addition to the many who lived all their lives in England, there’s Robert Burns (a Scot), W. B. Yeats (born in Dublin), Seamus Heaney (born in Northern Ireland, but an Irish citizen), T. S. Eliot (born in St. Louis, later became a British subject), W. H. Auden (born in York, later became an American citizen). It’s all so very confusing. And it’s odd that Shakespeare is not, for these purposes, considered a poet. He was one, you know.
Apparently the two qualifications for nomination are: (a) you must have lived in one or more of the British Isles for some significant chunk of your life, and (b) you must write in English.
I voted for Auden, of course. So should you.
Thursday, August 13, 2009
Last week I finally got around to reading Tom Standage’s The Victorian Internet, his history of the telegraph. One of Standage’s major themes is the widespread belief, in the early days of the telegraph, that the technology itself would somehow usher in a New Age of international peace and cooperation. After all, said one observer of the time, the telegraph was “transmitting knowledge of events, removing causes of misunderstanding, and promoting peace and harmony throughout the world.” “It brings the world together,” said another. “It joins the sundered hemispheres. It unites distant nations, making them feel that they are members of one great family. . . . By such strong ties does it tend to bind the human race in unity, peace, and concord.”
Standage goes on to point out that the development of the internet was accompanied by the same rhetoric: twelve years ago Nicholas Negroponte famously proclaimed that the internet would end nationalism and bring about world peace, and he was just one of many prophets preaching the same gospel.
From my editor Adam Keiper I now get this story by Jonathan Last: plus ça change, plus c'est la même chose indeed. Clay Shirky: “Twitter makes us empathize.” British PM Gordon Brown: because of Twitter, “You cannot have Rwanda again.”
Really? Twitter can keep people from taking machetes to their neighbors? And sending and receiving 140-character messages will make us empathize? The assumptions underlying all of these statements are precisely the same assumptions that underlay the praise of the telegraph a hundred and fifty years ago: that one group of people cannot have fundamentally different interests than any other group; that any conflict is the product of insufficient information; that the provision of sufficient information will immediately end any conflict; that familiarity inevitably breeds not contempt but affection and respect; that human beings are naturally filled with compassion and simply require a technology sufficiently powerful to release that compassion. But — alas — none of these assumptions is true.
Wednesday, August 12, 2009
As for the intellectual property, I try not to get too worked up about it. There’s a lot of people angsting about piracy and copying of stuff on the Internet, publishers who are very, very worried about the whole idea of ebook piracy. I like to get a little bit of perspective on it by remembering that back before the Internet came along, we had a very special term for the people who buy a single copy of a book and then allow all their friends to read it for free. We called them librarians.
David Ulin has a problem:
Sometime late last year — I don't remember when, exactly — I noticed I was having trouble sitting down to read. That's a problem if you do what I do, but it's an even bigger problem if you're the kind of person I am. Since I discovered reading, I've always been surrounded by stacks of books. I read my way through camp, school, nights, weekends; when my girlfriend and I backpacked through Europe after college graduation, I had to buy a suitcase to accommodate the books I picked up along the way. . . .
So what happened? It isn't a failure of desire so much as one of will. Or not will, exactly, but focus: the ability to still my mind long enough to inhabit someone else's world, and to let that someone else inhabit mine. Reading is an act of contemplation, perhaps the only act in which we allow ourselves to merge with the consciousness of another human being. We possess the books we read, animating the waiting stillness of their language, but they possess us also, filling us with thoughts and observations, asking us to make them part of ourselves. . . . In order for this to work, however, we need a certain type of silence, an ability to filter out the noise.
Such a state is increasingly elusive in our over-networked culture, in which every rumor and mundanity is blogged and tweeted. Today, it seems it is not contemplation we seek but an odd sort of distraction masquerading as being in the know. Why? Because of the illusion that illumination is based on speed, that it is more important to react than to think, that we live in a culture in which something is attached to every bit of time.
I’ve been there myself. A couple of years ago I was in Ulin’s condition, and it was worrying me a good deal. I thought I might have entered into a state of permanent and inescapable distraction. However, I did escape, and the prime aid to my recovery of my old long-attention-span self was . . . the Kindle. Yes, that’s right. I have written earlier on this blog about the positive reading momentum the Kindle can generate: I went through a period when I read about twenty books in a row on the Kindle, and after I did, I had my reading mojo back.
Interestingly, once I got it back I lost a good deal of interest in the Kindle, and I haven't read many books on it in the last six months or so. I'm not sure quite what to make of that, but I will always be thankful to the gadget for the restorative work it did on me.
Tuesday, August 11, 2009
Thanks to Will Benton I'm having a Translation Party: type in a phrase and the site translates it back and forth between English and Japanese until it "achieves equilibrium" — that is, until you get the same output every time. Sometimes that happens quickly, sometimes not at all.
This yields something interesting: “The judgments of the Lord are true and righteous altogether.”
Also the first sentence of Shakespeare’s Sonnet 129: “The expense of spirit in a waste of shame is lust in action.”
It gives up on this one: “And malt does more than Milton can to justify God’s ways to man.”
On this one (a line from Richard Wilbur) it seeks simplification: “The sky became a still and woven blue.”
With this one it says it has achieved equilibrium when it really hasn’t: “I repose by the sills of the exquisite flexible doors” (Whitman).
And here's a party waiting to happen: "It is a truth universally acknowledged, that a single man in possession of a good fortune, must be in want of a wife."
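The loop the site runs is a simple fixed-point iteration: re-translate until the output stops changing, or give up. Here's a minimal sketch of that idea in Python — the `round_trip` function is a hypothetical stand-in for the real English-to-Japanese-and-back machine translation, mocked with a small lookup table so the example is self-contained:

```python
def round_trip(phrase):
    # Stand-in for English -> Japanese -> English machine translation.
    # (The real site calls a translation service; this mock just maps a
    # phrase to a simplified version so the loop can terminate.)
    simplifications = {
        "The judgments of the Lord are true and righteous altogether.":
            "The Lord's judgment is true and right.",
    }
    # Phrases not in the table are returned unchanged, i.e. already stable.
    return simplifications.get(phrase, phrase)

def translation_party(phrase, max_rounds=50):
    """Re-translate until the output stops changing (equilibrium),
    or return None after max_rounds -- a fixed point isn't guaranteed."""
    for _ in range(max_rounds):
        next_phrase = round_trip(phrase)
        if next_phrase == phrase:
            return phrase  # equilibrium reached
        phrase = next_phrase
    return None  # never settled

print(translation_party(
    "The judgments of the Lord are true and righteous altogether."))
```

With real translation behind `round_trip`, the `max_rounds` cap matters: as the Milton and Whitman examples show, some phrases oscillate or drift indefinitely and never reach equilibrium.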
Somewhat outside the scope of this blog — though central to the concerns of The New Atlantis — is this interesting essay on upcoming health-care proposals by Lee Siegel. It’s worth meditating on even if you don't agree with it. Here’s a lengthy excerpt:
End-of-life treatment is still under consideration and would be a tiny sliver of Obama’s health-care package. But it is a highly volatile sliver. Betsy McCaughey, who singlehandedly killed the Clintons’ health-care initiative 15 years ago with her infamous and infamously inaccurate cover story in The New Republic, claims that this small passage in the bill “would make it mandatory—absolutely require—that every five years people in Medicare have a required counseling session that will tell them how to end their life sooner.” Not quite. But — painful as it is to concede anything to an ideological hack like McCaughey — it’s uncomfortably close.
The section, on page 425 of the bill, offers to pay once every five years for a voluntary, not mandatory, consultation with a doctor, who will not blatantly tell the patient how to end his or her life sooner, but will explain to the patient the set of options available at the end of life, including living wills, palliative care and hospice, life sustaining treatment, and all aspects of advance care planning, including, presumably, the decision to end one’s life.
The “presumably” here is a sticking point. And what is meant by “ending one’s life”? I doubt that American doctors are going to hold out suicide as an option anytime soon, though I expect it to happen eventually — and almost certainly by the time I am elderly (should I make it that far). But that doctors trained in cost-cutting measures should nudge patients in the direction of palliative care seems nearly certain:
The shading in of human particulars is what makes this so unsettling. A doctor guided by a panel of experts who have decided that some treatments are futile will, in subtle ways, advance that point of view. Cass Sunstein calls this “nudging,” which he characterizes as using various types of reinforcement techniques to “nudge” people’s behavior in one direction or another. An elderly or sick person would be especially vulnerable to the sophisticated nudging of an authority figure like a doctor.
Bad enough for such people who are lucky enough to be supported by family and friends. But what about the dying person who is all alone in the world and who has only the “consultant” to turn to and rely on? The heartlessness of such a scene is chilling. . . .
One of Obama’s most alluring traits has been what some see as a literary bent that relishes complexity, irony, and even the mystery of the human personality. Let him turn toward that part of his nature and leave the sterile precincts of utilitarian social and legal theory behind. He should immediately and publicly declare his commitment to not placing economic hurdles in the way of people who want to prolong their life, or the life of their loved ones. In that way, he would take the air out of charlatans like McCaughey. And he would calm the fears of people who, far from being right-wing fanatics, are in clear-eyed possession of perhaps the only universal truth there is. No one wants to die.
Following up on yesterday’s post about inevitability, here’s another statement along the same lines:
Teaching without digital technology is an irresponsible pedagogy. Why? The future is digital, love it or hate it. We can argue later about whether or not this is a good or a bad thing. (Hint: the answer is both.) But to educate students, or to attempt to educate students without developing their digital literacy is to leave them ill prepared for their futures. You wouldn’t think of educating a student and not teaching them how to read, digital literacy is this crucial.
(The writer of this blog, David Parry, is a media professor, but he writes terribly: run-on sentences, incoherent use of commas, etc. I sometimes think he’s doing it on purpose. But I digress.) Now, I think I’ve made it clear over the years what my views are on these pedagogical matters: while I take a great interest in technologies of knowledge myself — else I wouldn't be writing this blog — I don't think pursuing those technologies is the most important thing I can do as a teacher of literature. My focus, rather, is on making my students more thoughtful and creative users of the technology of the book. When I look at my students, I see people who need more instruction in using books than in using computers, and I apportion my pedagogical time and energy accordingly. I do this also because I think the book is a superb technology of great intrinsic and instrumental value; and because I think that students who become skilled users of (paper) books will be equipped to make a pretty smooth transition to the use of non-paper books and various text-based media. I also think that it’s easier to go from mastering paper books to mastering digital media than to move in the opposite direction.
Now, I do teach some classes in which the emphasis is on the various technologies of reading. But I don't see those as absolutely central to my calling — important, yes, but not as important as focusing on the reading of great books. Am I “irresponsible”? Would I be irresponsible if I didn't teach about the various technologies of reading? Well, I don't think so. But if I am irresponsible, it’s not because of some vague nostrums along the lines of “The future is digital, love it or hate it.” I don't know what “the future” is going to be, but I do know what I think valuable for people to know in order to flourish. I’ll keep going with that.
Monday, August 10, 2009
In an article about the (possible) end of textbooks, we hear this comment:
“Kids are wired differently these days,” said Sheryl R. Abshire, chief technology officer for the Calcasieu Parish school system in Lake Charles, La. “They’re digitally nimble. They multitask, transpose and extrapolate. And they think of knowledge as infinite.”
Okay, we’ve been over all this before. No, they’re not “wired differently” from anyone else; surprisingly few of them are “digitally nimble”; they don't multitask — no one does; and they hardly think of knowledge at all.
All that duly noted, the article is almost certainly right that there are better alternatives to the traditional textbook, at least for most subjects. And the superiority of digital alternatives to textbooks can be defended, if people will just take the time and trouble to think and explain.
But thinking and explaining can be difficult, which is why most proponents of new technologies fall back on two standard lines: (1) Human Nature Has Irreversibly Changed, and (2) Beware Lest the Wheels of History’s Juggernaut Crush You. I swear, these people are going to make technophobes of me yet.
The latter one is especially annoying, perhaps because it’s (in my experience) more common. Take this comment, for example: “Our ideas about what reading is will have to change to keep up with what is going on in a digital culture.” Which just makes me want to say: No, they bloody well won't. My ideas about what reading is don't have to change at all, and if they do change they will do so because I have discovered new ideas that interest or provoke or delight me, not because I am obliged to “keep up with” anything. Anyone who wants to read books while ignoring the internet is free to do so, and I’m not convinced that such a person will be intellectually impoverished in comparison to, for example, me. I’m online a lot, but I’m online because I want to be, not because the tides of history are compelling me to be. I can choose otherwise — we all can, and we all need to remember that we can.
(One of the most thoughtful recent reflections on the idea of the inevitable comes from Kevin Kelly, but KK is too prone to believing that whatever ends up happening must be inevitable, and he fails to make the vital distinction between the inevitability of an event or a development and the non-inevitability of any given person's participation in that event or development.)
More on this tomorrow.
Saturday, August 8, 2009
I’ve been reading a number of mysteries lately — something about which I may have more to say later on — so I was pleased to see this post by Nick Baldock on Agatha Christie, the mystery as a genre, and Christianity. But Baldock doesn't mention the single most important essay on these themes, the one that W. H. Auden wrote for Harper’s in 1948: “The Guilty Vicarage”. Everyone should read it. Excerpt:
I can, to some degree, resist yielding to these or similar desires which tempt me, but I cannot prevent myself from having them to resist; and it is the fact that I have them which makes me feel guilty, so that instead of dreaming about indulging my desires, I dream about the removal of the guilt which I feel at their existence. This I still do, and must do, because guilt is a subjective feeling where any further step is only a reduplication–feeling guilty about my guilt. I suspect that the typical reader of detective stories is, like myself, a person who suffers from a sense of sin. From the point of view of ethics, desires and acts are good or bad, and I must choose the good and reject the bad, but the I which makes this choice is ethically neutral; it only becomes good or bad in its choice. To have a sense of sin means to feel guilty at there being an ethical choice to make, a guilt which, however “good” I may become, remains unchanged. As St. Paul says: “Except I had known the law, I had not known sin.”
Thursday, August 6, 2009
Alex Rose isn’t so sure that the value of reading is straightforward and intrinsic:
A few years ago, I found myself on a blind date with an English professor. At some point after the second drink, one of us mentioned a feature in the Times that day about a recent slew of steamy, pulpy young adult novels whose sudden popularity had incurred the wrath of both protective mothers and knuckle-rapping critics.
"But at least the kids are reading," said my date, raising her glass. "That's got to count for something."
Does it? . . .
The only conceivable value of trashy books is the dubious but not unthinkable possibility that they might go some of the way towards engendering in young people a love of reading as an end in itself, which in turn might whet the appetite for better books. For many, that's the only way in. They'll read Sweet Valley High or Twilight at thirteen, lose their taste for it by fourteen and demand something richer and more challenging at sixteen. Or so the thinking goes.
If the argument applies to one form of entertainment, though, it should apply to all. Why is it that when kids become enraptured by some idiotic program, no one says, "well, at least they're watching TV?"
Wednesday, August 5, 2009
In relation to my earlier post on academic genres, here’s an assignment that I’ve been thinking about using: a critical response to a Wikipedia page. Students would be asked to read a Wikipedia page on an author or a book, say, and evaluate it for accuracy and fairness — but then they would also look into the history of that page. Such histories can be very illuminating, because they tell us what’s at stake in the conversation about that author, what debates are common. And one of the really interesting things about Wikipedia pages on literature is that they are edited by both academics and fans: there aren't many places in our culture where those two groups come together, because they so rarely have the same agendas.
Here’s an interesting case in point: on the home page of the W. H. Auden Society you may find this message: “A highly accurate, thoroughly revised version of the Wikipedia.org entry on Auden is now available. This site strongly recommends that online researchers make reference to the archived version of the page, in the link above, rather than to current versions, which may be less accurate or may be subject to vandalism.” (I’m pretty sure Edward Mendelson, Auden’s exemplary literary executor and a brilliant critic in his own right, oversaw those revisions.) So that “officially approved” page could become the standard by which the whole revision history could be evaluated — though who knows? Perhaps an enterprising student would find edits that improved upon the standard.
An assignment like this would be an example of what Gerald Graff calls “teaching the conflicts”, but it would have the advantage of going beyond the boundaries of the academy. Yes, it’s a tad meta, and an assignment like this shouldn't replace close attention to the literary texts themselves — but I think it could be very useful.
Tuesday, August 4, 2009
If you like mystery novels, be prepared for spoilers ahead — though I try not to be too explicit, I still give a lot away.
The Franchise Affair (1948) is a mystery novel by Josephine Tey, one of the most remarkable writers ever to work in that genre. (She’s also something of a mystery herself. She was a Scot whose real name was Elizabeth Mackintosh — “Josephine Tey” was just one of her pseudonyms — who worked in London for much of her adult life, and . . . not much else is known about her.) The Franchise Affair is a modernization of one of the great “true crime” stories of the eighteenth century, and is generally considered one of the classic mysteries.
However, recently in the Guardian Sarah Waters — an interesting novelist herself — offered a strong dissent from the usual view. While admitting that “in some ways Tey's retelling of the Elizabeth Canning story is a quite brilliant one,” Waters is “mystified and appalled” by the general tenor of the story, which, in her reading, is driven by a particularly ugly form of British class warfare: it’s a “bilious, bigoted” tirade against the manners and morals of the unruly working classes.
And you know what? I think Waters nails it. I read The Franchise Affair just a few months ago, and I found it much less satisfying than some of Tey’s other work, but I didn't describe it to myself in the terms that Waters uses — I was thinking more of the way that mysteries tend to work. In this novel, the protagonist is a small-town solicitor named Robert Blair who, though not a specialist in criminal law, takes up the cause of two local women, an elderly mother and her middle-aged daughter, who are accused of kidnapping a teenage girl and forcing her into slavery in their house, until after a month of misery she manages to escape.
Right from the beginning, Blair — who doesn't know any of these people at all — determines that the mother and daughter are innocent and that the girl who accuses them is a lying, scheming little bitch. He never wavers in this view of the case, even when evidence seems to be strongly against his new friends, and in the end . . . he is proven to be precisely right.
That was what threw me. Blair was so certain, I felt that Tey was — perhaps over-obviously! — setting the reader up for a reversal. But the reversal never came. All Blair’s instincts were right all along. The “mystery” of the book turns out to be the explanation for the girl’s lies, and for their plausibility. And while the late revelations of the girl’s real story are well-handled, that just wasn’t what I was expecting — or not all that I was expecting.
There are some other things that are strange about the story. At the darkest hour, Blair’s aunt says she will pray for an angel to come and set everything right, and the very next morning a man comes to Blair’s office with information that reveals all and eliminates the clients’ danger. It’s almost as though Tey is playing with her readers — almost as though The Franchise Affair is a po-faced, deeply ironic parody of the genre.
But Waters’s essay gives a much more plausible explanation for the book’s strangeness — alas, since I was enjoying my speculations about its irony. Class-based angst can make a writer do some strange things.
The Franchise Affair is very much worth reading, though — all of Tey’s novels are interesting for reasons unrelated to what mysteries usually do. I think the best of them is her last, The Singing Sands, though The Daughter of Time — in which her detective Alan Grant lies in a hospital bed and tries to figure out whether Richard III really murdered the little princes in the tower — is justly famous.
(Cross-posted at The American Scene.)
Monday, August 3, 2009
My essay on William Hazlitt — and Duncan Wu’s recent biography of the great essayist — is now up on the Books & Culture website. Excerpt:
Reading Hazlitt's essays I am rarely conscious of anything much happening to me. His prose moves in irregular rhythms, but without calling overmuch attention to itself, and in his best work he does not seem even to try to convince me that his subject is important or his treatment distinctive. And yet when I reach the end of any of his finest pieces, I find myself setting the book on my lap and raising my head from the page a while: I feel vibrations in my mind, echoes of ideas that have just been suggested to me, echoes that resonate with one another variously and strangely. No one makes me think quite the way that Hazlitt does.
Over at the Guardian’s Books blog, Philip Hall is remembering the books that meant a great deal to teenagers of his generation. He’s thinking of books that weren’t just popular, but were intellectual touchstones for smart young people — books like Catch-22 and Siddhartha and Slaughterhouse-Five. (I don't know how old Philip Hall is, but most of those books were on the unofficial lists of my intellectual friends, so we’re probably in the same generation.) It’s interesting, by the way, how many of them are American novels. And he’s wondering what those books are for the current generation of young people.
His commenters aren't giving him much help (so far, anyway). I’m inclined to say that they can't give him much help because there aren't any books that function that way for today’s adolescents in the U.S. and U.K. It’s not that young people aren't reading — though the evidence on that point is inconsistent — and in any case we’re talking about the smarter, more thoughtful, more questioning ones. Rather, I suspect that even very bright young people aren't using books to orient themselves ethically and politically to the world. At least not in ways that I can see. But if they’re not using books for orientation, what are they using?
Commentary on technologies of reading, writing, research, and, generally, knowledge. As these technologies change and develop, what do we lose, what do we gain, what is (fundamentally or trivially) altered? And, not least, what's fun?
Alan Jacobs is Distinguished Professor of the Humanities in the Honors Program of Baylor University and the author, most recently, of The “Book of Common Prayer”: A Biography and The Pleasures of Reading in an Age of Distraction. His homepage is here.