Text Patterns - by Alan Jacobs

Tuesday, May 31, 2011

curators and imitators

You know what annoys me? Well, actually, that would be a long list. You know one thing that annoys me? The way some people on the internet use the word “curator.” People find cool stuff online and put links to that cool stuff on their website, and they say that they’re “curating” the internet. When Jorn Barger invented that kind of thing he was content to call it a weblog — a record or “log” of interesting stuff he found online.

Now, one might argue that the weblog or blog has changed its character since Barger invented it: instead of logging cool things found online, it primarily logs a writer’s thoughts, feelings, and experiences (often about stuff found online). So maybe a new name is needed for the “logging” kind of site?

Maybe. But can we try for something a little less pretentious than “curator”? In the usual modern senses of the word, a curator (who often works for a museum) has a complex set of responsibilities that can only be carried out well by someone with a good deal of training, taste, experience, and intelligence. A curator plays a role in deciding what a museum will acquire, and once acquisitions have been made, will consider which objects are to be displayed, for how long they will be displayed, and in relation to what other objects they will be displayed. Curators organize objects in space and present them for public scrutiny. They also educate the public in the understanding of those objects, and of the principles of organization employed. Curators also help to care for those objects, to make sure they don’t get damaged or lost. (In ecclesiastical language, the priest who cares for the people of a parish while the rector is away is called a curate.)

Almost none of this is at work when people link to interesting things they have found on the internet. If a person whose website links to other websites is a curator, then a person who walks into the Louvre with a friend and points out the Mona Lisa is also a curator. It seems to me that if we go with that usage we’re losing a worthwhile distinction.

When I made a comment about this on Twitter recently, I got pushback from my friend Pascal-Emmanuel Gobry, and since he’s a very smart guy I have thought about this some more. His concern is that my point is unnecessarily elitist, and I don't mean for it to be that — and I don't think it is. It’s just a matter (I hope) of distinguishing among different sorts of online activity.

So I’d suggest this as the beginnings of a taxonomy:

1) The Linker: That’s what most of us are. We just link to things we’re interested in, without any particular agenda or system at work. That’s what my Pinboard page is, just a page of links.

2) The Coolhunter: People who strive to find the unusual, the striking, the amazing — the very, very cool, often within certain topical boundaries, but widely and loosely defined ones. I think Jason Kottke and Maria Popova are exemplary online coolhunters.

3) The Curator: There are some. Not many, but some. The true online curator tends to have a clear and strict focus: he or she doesn’t post just anything that seems cool, but instead is striving to illuminate some particular area of interest. The true curator also finds things that other people can’t find, or can’t easily find, which means either (a) having access to stuff that is not fully public or (b) actually putting stuff online for the first time or (c) having a unique take on public material so that images and ideas get put together that the rest of us would never think to put together. I think Bibliodyssey is a genuinely curated site; also, just because of its highly distinctive sensibility, Things magazine.

Again, I’m not saying that one of these categories is superior to the others. They’re just all different, and the difference is worth noting.

Friday, May 27, 2011

memory and forgetting

This essay by Margaret Morganroth Gullette on memory loss and Alzheimer’s disease has a provocative title: “Our Irrational Fear of Forgetting.” So, what’s irrational about a fear of forgetting? At one point Gullette says “People over 55 dread getting Alzheimer’s more than any other disease, according to a 2010 survey by the MetLife Foundation. The fact that only 1 in 8 Americans older than 65 has Alzheimer’s fails to register.” So it’s irrational to fear Alzheimer’s because you’re not likely to get it.

Yes, it’s not likely — but 1 in 8 is not a negligible number. That’s higher than I would have guessed.

Gullette goes on to point out that people commonly experience various kinds of forgetfulness that have nothing to do with Alzheimer’s and betoken no serious cognitive impairment — so there’s no need to get anxious about that kind of thing. This is a good point.

But some of Gullette’s other claims I’m not so sure about. She says that

People with cognitive impairments can live happily with their families for a long time. My mother was troubled by her loss of memories, but she discovered an upside to forgetting. She had forgotten old rancors as well as President George W. Bush’s name. We sang together. She recited her favorite poems and surprised me with new material. We had rich and loving times.

The mind is capacious. Much mental and emotional ability can survive mere memory loss, as do other qualities that make us human.

Well, yes . . . but: it’s not really a gain to lose rancor because you’ve forgotten the people who had aroused your rancor. As Montaigne said in one of his greatest essays, there’s a big difference between conquering lust and simply becoming impotent. If righteous indignation (for example) is a key element of a person’s character, the disappearance of that indignation due to forgetfulness should not be confused with the achievement of peaceableness.

I don't doubt that Gullette and her mother did indeed have “rich and loving times,” but for many families it doesn't work out that way. Many people who contract the disease are constantly agitated by the loss of their faculties, by their inability to get a grip on their condition — understandably so — and this agitation is deeply painful for them to experience and for their loved ones to watch. And I have seen the grief of friends whose parents — parents who raised them, loved them, nurtured them, consoled them — no longer recognize their children. When you think about the distinctively painful nature of these changes, you can understand why many people are more afraid of Alzheimer’s than of cancer, even though they’re far more likely to get cancer.

It may not be especially likely that any of us will go through the Alzheimer’s experience, but it’s not rare. After all, even if only one in eight contracts the disease, that one will likely have family members and friends who will suffer along with him or her. Alzheimer’s touches far more than 12.5% of Americans, and because of the changes in personality it can bring, and the loss of a history of experiences that loved ones can share, it has distinctive and significant costs. Insofar as popular culture sees a diagnosis of Alzheimer’s as a reason for suicide, and insofar as every small forgetfulness gets dramatically magnified in people’s minds, then yes, there is too much “fear-mongering” going on. But there’s nothing “irrational” about fearing the losses that Alzheimer’s brings.

Wednesday, May 25, 2011

dead and dying

Bill Pannapacker writes about the dying grandmother problem — and for college teachers it’s a big one. Every semester, just as the big end-of-term assignments come around, the grandparents start dropping like flies. I haven't known how to handle this any better than Pannapacker does, but a few weeks before his essay came out I was thinking of adding an item to my Frequently Asked Questions page along these lines: “If you tell me you can't complete an assignment because of the death of a grandparent, I will need to have confirmation of that sad loss from one of your parents.” But students have trouble remembering these policies anyway, and I can imagine the look on a genuinely grief-stricken person’s face when I tell her I need to hear from her mom so I know she’s not lying. . . .

Really, there should be a variation on the Schrödinger's Cat problem called the Student's Grandparent problem.

When I think of these matters I always remember a strange experience I had once at my local Starbucks. I arrived during the morning rush and had to stand in a long line. Gradually it crept forward, but when I was just one person away from the register a young woman in a business suit came walking up, peering at the pastry case — just checking it out, her body language seemed to say. But then when the register was free she glided right to it and quickly placed her order.

I leaned forward and said in a quiet voice, “You really shouldn't break in line like that” — at which she let out a loud cry, and wailed, “How dare you talk to me that way?” Then she threw her head down onto the counter and sobbed. After a moment she raised her head and said, “And my grandmother is dying!”

I don't quite remember what happened after that — did she get her drink and saunter out? — but when I got to the office I was still a little unnerved by the whole thing. Fortunately for me, the first person I saw was the kindest and gentlest of my colleagues, a man who is a paragon of Christian charity. I told him what had happened, and what the woman had said.

He just smiled. “Ah, I can beat that,” he said. “My grandmother’s dead.”

Tuesday, May 24, 2011

banner!

Hey everyone, look at the awesome new banner my friend Brad Cathey of Highgate Cross made for the blog. Cool, is it not? Brad, by the way, is also responsible for the beautiful design of Gospel of the Trees. I am once again in his debt.

understanding the medium

Zeynep Tufekci has written a long, thoughtful, and worthy-of-contemplation post on Bill Keller’s recent rant against Twitter. An excerpt:

And Bill Keller should understand that, at its best, Twitter is not a broadcast medium but a medium of conversation. What he has done so far on Twitter is the equivalent of walking into a party and saying a provocative sentence, followed by sitting at the corner sipping his cocktail – as in “#twittermakesyoustupid. Discuss.” Social encounters are satisfying and worth mostly to the degree that one participates in conversations, rather than announces witticisms and withdraws. Yes, I am a professor but I do not walk into random rooms and expect people to quietly take notes on what I am saying while I launch into a speech, projecting my voice to the back of the room. Keller cannot understand this medium if he treats it as something different than what it is, and to understand requires participation in its indigenous form, conversation.

I thus urge the Literati to come join the social media conversation with the understanding that some of their strengths will not be as valued, that they will need to relearn certain skills, and some parts of the experience will be annoying – but just like some good literature, it sometimes take some effort to grasp the value of a new form. I think the literate should accept that this is now an inseparable part of the public sphere and increasing numbers of people who were otherwise excluded can now be heard; yes, they don’t always think or say what I wish people thought or said but what else is new? Given the complexities of the issues facing humanity, engaging this expanded public sphere is of crucial importance to anyone concerned about how we, as humans, will continue to live our lives, socially, economically and politically.

Monday, May 23, 2011

sunk costs

There are some excellent and helpful thoughts in this post by Megan McArdle, for instance:

A lot of the reaction to any new technology is simply that many of us invested a lot of effort in learning how to use the old technology well. That's especially true of books. (It's no accident that so many of the complaints come from journalists, academics, and other writers). For years, in school and at work, we constructed increasingly elaborate personal reference systems from notes, flags, and dog-ears, and our brains are now very nimble at using them. Change is hard. Moreover, it involves recognizing that all of our previous effort was a sunk cost: we have a painfully acquired skill that is now useless. We'd much rather double down than move on.

An incisive point. I really do need to ask how much of my resistance and discomfort — when I feel those, which is not always — is just the inevitable result of old and deeply-ingrained habits.

But while I’m asking, I want to question the chief analogy of the post:


  • Books : E-readers :: Horses : Automobiles

But I don't think the analogy is quite right. What’s the task at which horses and automobiles can be said to compete? Presumably, it’s transporting people or things from one point to another as quickly and reliably as possible. (When speed is not required, but rather peace and quiet, a horse may well be superior to a car, and walking superior to either.) What’s the task at which codex books and e-readers can be said to compete? Ummmm . . . .

See, that’s the tricky question. It depends on what you’re reading and why you’re reading it. If I’m reading a novel just for fun, I’d say the e-reader does a better job; if I’m reading a seriously literary novel for study, or am discussing such a book with a class, then I think the codex is far superior. And — I’m trying to formulate an abstract point here that’s still not perfectly clear in my own mind — when I’m reading a book whose key ideas are organized around the recurrence of certain words, I like e-readers better because they’re easy to search; but some books (especially novels) organize their central themes not around repetitions of words but rather repetitions of images or thoughts or events, and e-books are not well suited to investigating that variety of coherence.

Perhaps the e-reading technologies will improve and lessen the gap; probably they will. But what I want to avoid is the temptation to stop thinking in certain ways, stop striving for certain forms of understanding, because the technology I’m employing doesn’t favor them.

Sunday, May 22, 2011

benighted

When I read this introductory paragraph to a story by John Noble Wilford

In the thousand years between the decline of Rome and the springtime of the Renaissance, science and other branches of learning took a holiday throughout Europe. It was a benighted time in the history most of us raced through in school, skipping lightly through Charlemagne and Richard the Lion-Hearted, the Norman Conquest and the Crusades, and arriving none too soon at the time of Leonardo and Michelangelo, Columbus and da Gama, Erasmus and Luther.

— I thought it was the set-up to a joke. No informed person really believes all those hoary old clichés about the “benighted” Middle Ages, right? I mean, he said “benighted” — surely that’s a dead giveaway of parody?

Apparently not. It’s really sad to see this kind of nonsense coming from Wilford, who has been writing about the history of science for a long time. If you want an absolutely clear-cut refutation of this simplistic Whiggishness, here’s a highly accessible account, and here’s a more scholarly one. (Oddly, the scholarly work is about half as long as the more popular book.) And if you want shorter accounts still, read Myths 2, 3 and 10 here.

It's always interesting to see the myths that are so deeply ingrained — so obviously true to writers and editors — that no one bothers to fact-check them.

Friday, May 20, 2011

what I love about Twitter (and Storify)


These things just surge and fade — they appear out of nowhere and then, after a flurry of exchanges, they subside. It's an insult to the intrinsic ephemerality of the thing to preserve an exchange in this way — but just for purposes of illustration I'll do it this once. I love it. I just love it.

rediscovered



Huckleberry Finn, illustrated, from a 1961 English edition.

Thursday, May 19, 2011

aloud

Erin O’Connor on the great power of a simple idea:

Teaching high school for a year at a very interesting little Berkshire boarding school got me onto shared class reading projects–the kids I was teaching were very smart, but, like most kids these days, just didn’t have much experience reading. So we read and read out loud together, stopping from time to time to talk about the language and the ideas and so on. I have very fond memories of doing that with “Song of Myself” in winter time, the whole class clustered around the wood-burning stove in our otherwise unheated classroom. When spring rolled around, we lay on the grass and read Gatsby together. Part of me felt guilty about spending class time on such a pleasant and low key activity–but you really couldn’t argue with the results. Kids got turned on to the language, read closely, loved talking about what they were reading as they were reading it, and greatly improved their comprehension and their close reading skills along the way. When the most reading-averse kids in the class are spontaneously picking out “favorite” passages in Whitman, you know something cool is happening. 

So when I returned to college teaching the next year, I imported this teaching model and adapted it to Ivy League undergrads–which actually didn’t take much adapting at all. Once every couple of weeks, we’d read something together in class, going around the room, taking turns, everyone reading as much as they felt like reading and then leaving off for the next person. I worried that Penn students might think this was “beneath” them–might find it a silly or infantilizing activity. But they never did, and in fact, I think the class dynamic benefited a great deal from the relaxed, shared, contemplative quality of those sessions. Certainly they brought the literature we were reading “to life” in a way that silent, solitary reading can never do.

attention deficits and thick descriptions

A few days late, but this is an interesting article:

It’s an assertion I’ve heard many times when a child has attention problems. Sometimes parents make the same point about television: My child can sit and watch for hours — he can’t have A.D.H.D.

In fact, a child’s ability to stay focused on a screen, though not anywhere else, is actually characteristic of attention deficit hyperactivity disorder. There are complex behavioral and neurological connections linking screens and attention, and many experts believe that these children do spend more time playing video games and watching television than their peers.

But is a child’s fascination with the screen a cause or an effect of attention problems — or both? It’s a complicated question that researchers are still struggling to tease out.

The kind of concentration that children bring to video games and television is not the kind they need to thrive in school or elsewhere in real life, according to Dr. Christopher Lucas, associate professor of child psychiatry at New York University School of Medicine. “It’s not sustained attention in the absence of rewards,” he said. “It’s sustained attention with frequent intermittent rewards.”

This is a reminder of a point that I’ve been trying to make for a long time: we can't make useful generalizations about “screens.” You have to ask, “Which screens? What’s on the screens? Who’s using the screens? What would they be doing if they weren’t using these screens?” In the same way, we can't draw sweeping generalizations about whether social media are good or bad, whether they enable revolutions or make revolutions impossible. Screens, social media, computers, digital technologies of all sorts — they just aren’t “good” or “bad.” We need thick descriptions of our online lives, and right now the available descriptions are pretty thin.

That’s not surprising; online life is new, so the serious study of online life is (necessarily) newer. But I am craving richer, more detailed, more stringently controlled, thicker studies of how we live now.

you heard it here . . . later

"Markdown is the new Word 5.1. Seriously." I could have written that.

Wednesday, May 18, 2011

down memory lane

A conversation on Twitter the other day reminded me of my earliest experiences with online life. It was in 1992 that I learned that I could have my college computer connected to something called the “Internet” — though I don't know how I learned it, or what I thought the Internet was.
I had a Mac SE/30 at the time — the first computer my employer ever bought for me — and someone from Computing Services came by, plugged me in, and installed some basic software. I know I didn’t get any training, so what puzzles me now is how I learned how to use the programs. I must have checked out some books . . . but I don’t remember checking them out.

Here’s something else I don't remember: very few people I knew had email, so how did I find out my friends’ email addresses? I must have asked when I saw them and wrote the addresses down on paper. But in any case I soon developed a small group of people that I corresponded with, using the venerable Pine — and again, I can't say how I, a Mac user from the start of my computing career and therefore utterly mouse-dependent, adjusted to a mouseless console environment. . . . But I did, not only when using Pine, but when accessing Wheaton’s library catalogue via Telnet, and when finding some rudimentary news sources via Gopher, followed a couple of years later by my first exposure to the World Wide Web, via Lynx. Pine, Telnet, and Lynx were the internet for me for several years — and they were great programs, primarily because they gave the fastest possible response on slow connections.

It was only when I got a Performa — with a CD drive! — that I began to turn away from the text-only goodness of those days. I was seduced by all the pretty pictures, by Netscape and, above all, by what must remain even today the greatest time-waster of my life.

How odd for all this to be nostalgia material. After all, the whole point at the time was to be cutting-edge. But even when Wheaton eliminated Telnet access to the library catalogue and moved it to the Web, I knew that I was losing something. To this day I’d search catalogues on Telnet if I could.

Tuesday, May 17, 2011

recent additions to the Trees

The very talented designer of the Gospel of the Trees website, Brad Cathey, has just added a page to indicate the most recent changes. Several people have asked for something like this, so your wish is our command.

Saturday, May 14, 2011

advice sought

Yesterday I got this email from Amazon:

We're writing about your past Kindle purchase of The Lord of the Rings by J.R.R. Tolkien. The version you received had missing content and typos that have been corrected. 

An updated version of The Lord of the Rings (ASIN:B0026REBFK) is now available. It's important to note that when we send you the updated version, you will no longer be able to view any highlights, bookmarks, and notes made in your current version and your furthest reading location will be lost. 

If you wish to receive the updated version, please reply to this email with the word "Yes" in the first line of your response. Within 2 hours of receiving the e-mail any device that has the title currently downloaded will be updated automatically if the wireless is on.

Hmmmm. I’d certainly like to have the corrected edition. On the other hand, the copy I currently have has lots of underlined passages, and I have some notes keyed to the locations of those passages — I’d like to keep those. One possibility: I could go to the Your Reading page, and save all my annotations as a PDF, then update the book.

What a strange situation. Can you imagine buying a book at a bricks-and-mortar bookstore, only to have the store manager call you six months later to apologize for errors in that book? And offering to bring you a brand-new corrected copy? But only on the condition that you return the first one to him? It’s all just too weird.

get 'em while they're hot

I see that my book The Pleasures of Reading in an Age of Distraction is now available on Amazon. Only sixteen left in stock!

I now must resume the spiritual discipline of not checking the book's Amazon sales rank. That way madness lies.

In celebration of this great event, my friends at The New Atlantis are hosting a big event. Be there or be square!

Friday, May 13, 2011

more about student reading

Nick Carr, who sent me the link that led to this post on how college students read on Kindles, has followed up with his own thoughts. Check them out.

meaning and responsibility

A long quotation here, from John Gray’s review of a new book by Brian Christian:

So, what human abilities did Christian exercise that the computers could not mimic? With a degree in computing and philosophy, he is also a poet, and summarises one of the book's most compelling insights when he writes: "If poetry represents the most expressive way of using a language, it might also, arguably, represent the most human." The amazing proficiency that computers display in many contexts depends on their superior ability to think digitally, using information that has been broken down into discrete bits.

In contrast, what is distinctive of poetry — and, for that matter, of human language in general — is the vital role of context and allusion, which cannot be broken down into separate units of information. Human conversations are not composed of a finite number of particular exchanges; they take place against a background of tacit understandings, which often make what is not spoken as important as what is said. That is one reason why artificial intelligence programs have failed to replicate the subtlety of natural languages. 

Christian notes that the ever more pervasive role of computers in our lives risks thinning out these tacit understandings. In a change that he regrets, Facebook has replaced the box in which people described their favourite activities with a drop-down menu. The assumption is that people can come to know one another by ticking a list. But what makes us individuals is not which of a limited set of activities we choose to engage in. When we describe the things we love to do we are telling more about ourselves than we know. By eliminating the option of entering our own description of our favourite activities, Facebook has emptied these activities of some of their meaning. 

Yet Facebook is no less popular. For many, it seems, the loss does not matter. In fact, one of the attractions of a life that is mediated through computers may be just this loss of meaning. Computers have been immensely liberating in all kinds of ways, but one of these is in opening up the possibility of a life composed of a succession of individual bits of information. Part of the charm of the wired life is the freedom from meaning it promises.

I have read excerpts from Christian’s book here, and read an interview with him here, but haven’t gotten to the book yet. I am all the more determined to do so now.

What I find most interesting about this passage from Gray’s review is the forthright claim that people can prefer a life drained of meaning, or at least drained of the responsibility for seeking meaning. Thinking is hard work; discovering what life is all about is a fraught and unpredictable activity; and people, by and large, are immensely lazy. (I certainly am.) How many of us can resist the draw of any technology that says, “Here, let me handle that for you”?

the future of reading?

Matt Henderson showed his kids Al Gore’s Our Choice:

My kids — 10 and 8 years old — are both avid readers. They absolutely devour every book they can find. And, they are both intimately familiar with the iPad. My 10 year-old, in fact, reads many of her books in the iPad’s Kindle app, and often likes to write summaries of them when finished.

I showed them Our Choice, and just observed. They quickly figured out the navigation, and discovered all the interactive features. But… they didn’t read the content. Fascinated, they skipped through the book, hunting for the next interactive element, to see how it works. They didn’t completely watch a single video.

When they finished, I asked them to tell me about the book. They described how they could blow on the screen and see the windmill turn, how they could run their fingers across the interactive map and see colors changing. How they could pinch to open and close images. But they couldn’t recall much of what the book was about. They couldn’t recall the message intended to be communicated in any of the info-graphics (though they could recall, in detail, how they worked.)

Whole story here.

Wednesday, May 11, 2011

the new Mansueto Library



At the University of Chicago. Wow. More photos here.

visual explanations!

Via Joe Carter, how to explain a Kindle to Charles Dickens:



a library is a good place to go

At the always-wonderful Letters of Note, some lovely responses from writers asked to tell the children of Troy, Michigan what’s good about a library. My favorite response is E. B. White’s, tinged as it is with that gentle melancholy that characterizes most of his writing and that children responded to surprisingly well:

A library is many things. It's a place to go, to get in out of the rain. It's a place to go if you want to sit and think. But particularly it is a place where books live, and where you can get in touch with other people, and other thoughts, through books. If you want to find out about something, the information is in the reference books — the dictionaries, the encyclopedias, the atlases. If you like to be told a story, the library is the place to go. Books hold most of the secrets of the world, most of the thoughts that men and women have had. And when you are reading a book, you and the author are alone together — just the two of you. A library is a good place to go when you feel unhappy, for there, in a book, you may find encouragement and comfort. A library is a good place to go when you feel bewildered or undecided, for there, in a book, you may have your question answered. Books are good company, in sad times and happy times, for books are people — people who have managed to stay alive by hiding between the covers of a book.

Tuesday, May 10, 2011

students and their Kindles

Via Nick Carr, a really interesting forthcoming paper on how students read using the Kindle DX. Some findings:

• Students did most of the reading in fixed locations: 47 percent of reading was at home, 25 percent at school, 17 percent on a bus and 11 percent in a coffee shop or office. 
• The Kindle DX was more likely to replace students’ paper-based reading than their computer-based reading. 
• Of the students who continued to use the device, some read near a computer so they could look up references or do other tasks that were easier to do on a computer. Others tucked a sheet of paper into the case so they could write notes. 
• With paper, three quarters of students marked up texts as they read. This included highlighting key passages, underlining, drawing pictures and writing notes in margins.
• A drawback of the Kindle DX was the difficulty of switching between reading techniques, such as skimming an article’s illustrations or references just before reading the complete text. Students frequently made such switches as they read course material. 
• The digital text also disrupted a technique called cognitive mapping, in which readers used physical cues such as the location on the page and the position in the book to go back and find a section of text or even to help retain and recall the information they had read.

One of the study’s authors “predicts that over time software will help address some of these issues.” Here’s hoping! — but I have a feeling, based on my own experiences, that this is going to be a tough technological nut to crack.

Monday, May 9, 2011

Pinboard

I've written more than once about my troubles with Tumblr, but people keep writing to tell me that they miss those links and quotes. So how about this: I've made my Pinboard bookmarks public (well, most of them anyway). This gives you a tag-based collection of all sorts of things that I'm reading, have recently read, want to save, and plan to blog about. And you can subscribe via RSS to be informed about all new posts.

Pinboard is a fantastic bookmarking service, by the way — well worth the cost of the initial sign-up.
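
(If you’d rather poll that RSS feed yourself than use a reader, here’s a minimal sketch in Python with the feedparser library. The feed URL follows what I take to be Pinboard’s public-feed pattern, and the username is a placeholder.)

```python
import feedparser  # third-party: pip install feedparser

# Assumed Pinboard public-feed pattern; "someuser" is a placeholder.
FEED_URL = "https://feeds.pinboard.in/rss/u:someuser/"

feed = feedparser.parse(FEED_URL)
for entry in feed.entries:
    # Each entry is one bookmark: its title and the bookmarked URL.
    print(entry.title, "->", entry.link)
```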

Saturday, May 7, 2011

Christianity and the Book

Next academic year I will be leading a faculty seminar here at Wheaton on Christianity and the Book: Histories and Futures. Participating in the seminar will be faculty from English, Education, Chemistry, Ancient Languages, Communications, and the Library. Oh, and our President wants to be there too.

We’ll want to start the year by acknowledging that the book is a technology, and that, therefore, we need to think well about technology in general. Here I think Albert Borgmann’s Technology and the Character of Contemporary Life will be especially helpful. We will also want to develop a specifically theological vocabulary, and might be assisted in that endeavor by Murray Jardine’s The Making and Unmaking of Technological Society. We will read Leo Marx’s terrific essay, “Technology: the Emergence of a Hazardous Concept” and sample some of the more pessimistic (Jacques Ellul, Neil Postman) and optimistic (Kevin Kelly) thinkers about technology.

Then we’ll turn to the history of the book. We’ll need an anthology: either The Book History Reader or A Companion to the History of the Book.

For the particularities of Christianity’s relation to the book, we’ll read, among other things, Roberts and Skeat’s The Birth of the Codex, Anthony Grafton and Megan Williams’s Christianity and the Transformation of the Book, and Ivan Illich’s In the Vineyard of the Text. When we get to the Reformation it’s going to be hard to know what to select: I’ll probably come up with readings from Elizabeth Eisenstein, Andrew Pettegree, Adrian Johns, and Lucien Febvre and Henri-Jean Martin. Surely there’s a place for Ann Blair’s Too Much to Know.

Then — and here's where it gets really tricky — late in the year we’ll want to think about how Western culture is shifting away from the dominance of the codex and what implications that has both for Christianity and for higher education. I’d love to discover that some brilliant sociologist is studying churches and new media, but I haven’t discovered that yet. We may need to read some McLuhan at this point. Also perhaps essays from Robert Darnton’s The Case for Books and Grafton’s Worlds Made by Words.

And then, on the embrace-the-future side of things, we’ll want to read The Future of Learning Institutions in a Digital Age, by Cathy Davidson and David Theo Goldberg. Possibly Kamenetz’s DIY U and Shirky’s Cognitive Surplus. And many blog posts by many different people.

What am I missing? What else should we read? What topics should we explore? Whom might we invite to come and speak to our group? Help me out, folks.

Friday, May 6, 2011

rage against the Twitter machine

In the New Statesman, A. C. Goodall writes that Twitter is really bad. She doesn’t make any arguments, she just says stuff. “Twitter is all about fitting in.” “Twitter functions as banally as a school hierarchy: who to like, who not to, who you're allowed to criticise, who you can't etc.” “Twitter relies on people's desire to be the same.” Like that: assertions without evidence.

To which I reply that Twitter is a platform and a medium, not an organized and coherent body — it’s not like a book, for instance, which can be said to have a single overall character. Imagine what you would think if someone said, “Email is all about fitting in.” Or “The telephone functions as banally as a school hierarchy.” Or “The telegraph relies on people’s desire to be the same.” Media platforms are what you make of them, and the history of each reveals that its makers expected it to have a relatively narrow set of uses and were surprised when people exercised their creativity to find remarkably varied uses.

In fact, it’s not enough to say that different people use Twitter in different ways: one person may use it in different ways. On Twitter I talk sometimes to my fellow literary academics, sometimes to my old American Scene friends (largely about pop culture), sometimes to my fellow soccer fans, sometimes to friends from church — and those are all different kinds of conversations, with different tonalities and shades of intimacy or distance. I find this fascinating. We need a modern Mikhail Bakhtin to write about the speech genres of Twitter.

Oh, by the way: Goodall has a Twitter account.

(P.S. Just to prove my point, take a look at what just showed up in my Twitter feed.)

Thursday, May 5, 2011

this 'n' that

Yes, not a lot of activity around here lately, I know. End of semester craziness. I'll be more active in the coming weeks — though, I must note, this blog will be quietish through most of the summer, as I will be leading (with my buddy and colleague Brett Foster) a study tour in England from mid-June through early August. 'Twill be great fun, I'm sure, but also a very busy time. Blogging will have to take a back seat.

A few notes: first, my review of Ann Blair's Too Much to Know and Anthony Grafton and Joanna Weinberg's 'I Have Always Loved the Holy Tongue' is online at the Books & Culture website — for members of the CT Library only, though. But still online. I stand by that.

Over at The American Scene, I recently posted a brief report on what happened to my sister in last week's tornadoes. Since then she and her husband have been the beneficiaries of some remarkable acts of generosity, as I have documented on Twitter.

Some things forthcoming soon: at First Things, a review of Terry Eagleton's Why Marx Was Right, and here at The New Atlantis, an essay commemorating the centenary of Marshall McLuhan. I continue to contribute, when I'm able, to The Run of Play, though I can't hold a candle to the site's creator and impresario, Brian Phillips.

And one more thing: in a couple of weeks I hope to be able to announce a new book project, one that I'm rather excited about.

Wednesday, May 4, 2011

backup strategies

Here's an odd post by Gina Barreca about a student of hers whose completely un-backed-up computer died. Everything was lost: documents of all kinds, photos, email, etc. Barreca's response: See, this is why you should print everything out. (Everything? Even the photos?)

Barreca says she pays for a backup service, though she seems to make a point of emphasizing that she doesn't understand it: "one of those companies which (I am told) will keep my stuff safe in the ether or the cloud or the memory of one really smart guy who’ll be able to recite everything I have on my hard-drive." So what she really relies on is "filled-to-overflowing filing cabinets of paper and shelves of hand-written notebooks."

Is that really the most appropriate response? About a month ago my computer died — as I mentioned in an earlier post — and I would have been completely miserable, indeed would be completely miserable for some time to come, if paper copies of everything were all I had to rely on. (Everything textual, that is: I don't think anyone could seriously suggest printing out high-resolution copies of every digital photo they've taken, and few would suggest burning every song they own to CD.) The best possible scenario for making those materials useful to me again would be to scan them to PDF and run OCR software to make at least some of the text machine-readable again. But this would take countless hours and would lead to highly imperfect results.

What I did instead: I had my whole computer backed up to an external hard drive in my office, my entire home folder backed up to Amazon S3, and my essential files backed up to Dropbox. So while I was waiting for my new computer to arrive I used Dropbox on other computers to keep working, and when the computer finally did show up I just transferred the whole contents of my previous computer to the new one. Using Apple's Migration Assistant, I set it up one afternoon when I left the office, and had everything in place when I got back the next morning.
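
(For the curious: the S3 layer amounts, conceptually, to something like the sketch below — minimal Python, assuming the boto3 AWS library and a made-up bucket name. In practice the backup services do this for you, with incremental uploads and versioning on top.)

```python
import os
import boto3  # AWS SDK for Python: pip install boto3

s3 = boto3.client("s3")
BUCKET = "my-backup-bucket"  # hypothetical bucket name

def backup_folder(root):
    """Upload every file under `root` to S3, using relative paths as keys."""
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            local_path = os.path.join(dirpath, name)
            key = os.path.relpath(local_path, root)
            s3.upload_file(local_path, BUCKET, key)

# Back up the whole home folder, as described above.
backup_folder(os.path.expanduser("~"))
```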

And Barreca thinks it makes more sense just to print everything out?