Text Patterns - by Alan Jacobs

Saturday, January 30, 2010

definitive iPad thoughts

John Gruber is one of my favorite tech commentators, but he’s not doing so well with the iPad. He has divided the world into those who “get it” (i.e., adore the iPad) and those who “don’t get it.” This is an old Andrew Sullivan move, and one of the more annoying ones. Saying “you just don't get it” is not an argument; in fact, it’s a straightforward refusal to discuss an issue rationally.

Gruber and others — Stephen Fry, for instance — have said that it’s impossible to understand just how fabulous the iPad is until you hold it in your hands. In one sense I’m sure that’s true, but let’s remember that holding a device in your hands for twenty minutes (which is how long Gruber got) is not the same as using it day in and day out. Let’s think about this one task at a time:

Typing. How much typing are you going to want to do on the iPad? Even in Apple’s promotional video the guy looks awkward, pecking with three or four fingers, the device propped on his lap. You’re only going to be able to do short emails, texts, tweets — anything more and you’ll need to use the (optional) keyboard and dock. But to do that you’ll have to sit at a desk: it won't balance on your lap the way a laptop does. And then how much typing will you want to do on a screen that’s about the size of the original Macintosh’s?

Moreover, much of my writing is done while switching back and forth between my browser — where I read things that make me think, copy and paste quotations, and post links — and my text editor. Writing this post on an iPad would be a major pain in the neck, and this is just a blog post, not a novel.

I think even the biggest iPad fans are likely to concede this point — they will presumably retort that the iPad is basically a media consumption device. But have they thought about how much time we spend on our computers typing?

Okay, then, on to the media.

Music. First of all, you’ll need to make sure that you get an iPad with enough memory to hold a good bit of your music collection, at least until Apple moves to cloud-based music storage (which is bound to happen). But even if you do, you’re probably only going to be able to listen to the music while sitting down, likely with the iPad in your lap, or right next to you. In other words, this won't be significantly different from listening on your laptop, and therefore won't be nearly as convenient as listening on an iPod.

Photos. Photos will definitely be fun to look at on the iPad. How much time do you spend doing that?

Movies. This would seem to be a strength of the iPad, except that its 4:3 aspect ratio significantly reduces the available size for widescreen films. Good, but not great. Movies are going to look a lot better on your laptop.

Web browsing. I hate Flash with a passion, but it is all over the internet, and that’s going to lead to a lot of frustration. It’s not just a matter of the games you can't play — Apple likes that, because it encourages you to buy games from their App Store — but think of all the retailers whose sites are Flash-based. (Come to think of it, maybe Apple likes that too: if you can't shop at J. Crew, that leaves you with more money to spend at the iTunes Store.)

Calendar. Apple seems to be making a big deal out of the iPad’s calendar, which is pretty sad. No doubt it looks really cool, but won't it almost always be a great deal more convenient to look at the calendar on your smartphone?

Getting work done. The one thing that I saw in the videos that I really like is the iPad version of Keynote. It would be great fun to create a Keynote presentation on the iPad — maybe when you’re traveling and didn't even know that you were going to need one — plug it into a projector, and wow people. But a great many of my Keynote presentations are made by copying and pasting text and images from my browser and from other applications on my computer. How easy is that going to be on a computer that can't have multiple applications open? And how much of the information (especially text-based information) that you need will even be on the iPad? It’s probably back home on your iMac or MacBook, which at some point in your travels you’ll probably be wishing you had with you.

So, in short: I have major doubts about the utility, for most users, of the iPad. There isn't a single thing it does that isn't done better by other products in the Apple lineup.

And one more comment: everyone who is sold on this device is trying to refute the critics — people like me who “just don't get it” — by quoting all the people who dismissed the iPhone. But let’s remember two things: first, some of us who are skeptical about the iPad were really enthusiastic about the iPhone, right from its first appearance; and second, I don't notice anyone quoting the people who predicted that the Apple TV would be a failure, or that the MacBook Air would be a fringe product. Not all of Apple’s products have done what the iPhone has done.

So for what it’s worth, my prediction: over the long haul, the iPad will be a minor success, but not a game-changer. It will be a heck of a lot more popular than the Apple TV, but nothing like the iPhone. And many of the people who buy them will within three months be setting them aside to gather dust, because they’ll discover that they’re happier with their smaller but utterly portable iPhones. I have a feeling that this time next year there will be a great many iPads available on eBay.

Carta Geografica Completa di tutti i Regni del Mondo

One of Matteo Ricci's maps. Amazing things may be seen at the Library of Congress. Thanks to Ryan H. for the tip.

Friday, January 29, 2010

the non-digital classroom

Mark Bauerlein is making a prediction:

As more kids grow up writing in snatches and conforming to the conventional patter, problems will become impossible to overlook. Colleges will put more first-year students into remedial courses, and businesses will hire more writing coaches for their own employees. The trend is well under way, and educators will increasingly see the nondigital space as a way of countering it. For a small but critical part of the day, they will hand students a pencil, paper, dictionary, and thesaurus, and slow them down. Writing by hand, students will give more thought to the craft of composition. They will pause over a verb, review a transition, check sentence lengths, and say, “I can do better than that.”

The nondigital space will appear, then, not as an antitechnology reaction but as a nontechnology complement. Before the digital age, pen and paper were normal tools of writing, and students had no alternative to them. The personal computer and Web 2.0 have displaced these tools, creating a new technology and a whole new set of writing habits. This endows pen and paper with a new identity, a critical, even adversarial one. In the nondigital space, students learn to resist the pressures of conformity and custom, to think and write against the fast and faster modes of the Web. Disconnectivity, then, serves a crucial educational purpose, forcing students to recognize the technology everywhere around them and to see it from a critical distance.

I hope he’s right, because this is already what I try to do in my classes. Though (as you can see) I blog, I tweet, I tumbl 4 ya, I set up blogs for some of my classes, I receive and respond to student writing electronically, and I even use Wikipedia, during class I focus almost all of my attention on the reading and annotation of paper codices. Because I think those are technologies worth knowing — not the only technologies worth knowing, but important ones, ones with which all college students need considerable facility.

But will educators come to recognize, as Bauerlein predicts they will, the value of these tools and their power to yield a “critical distance” on other, more recent, technologies? I would like to agree, but I doubt it. Educators by and large equate “technology” with “very recent electronic technology,” and passionately believe that all problems have technological solutions. I can't imagine many of them doing what they would call “turning back the clock.”

But I devoutly hope I’m wrong. The people best equipped for navigating our world are those who have knowledge of multiple technologies, and multiple kinds of technologies. The Luddite and the techno-celebrant alike are crippled by the narrowness of their technological equipment.

Thursday, January 28, 2010

a quick iPad roundup

Gruber has the need for speed:

Lastly, there’s the fact that the iPad is using a new CPU designed and made by Apple itself: the Apple A4. This is a huge deal. I got about 20 blessed minutes of time using the iPad demo units Apple had at the event today, and if I had to sum up the device with one word, that word would be ‘fast’. . . . It is fast, fast, fast. The hardware really does feel like a big iPhone — and a big original iPhone at that, with the aluminum back. (I have never liked the plastic 3G/S iPhones as much as the original in terms of how it feels in my hand.) I expected the screen size to be the biggest differentiating factor in how the iPad feels compared to an iPhone, but I think the speed difference is just as big a factor. Web pages render so fast it was hard to believe. After using the iPhone so much for two and a half years, I’ve become accustomed to web pages rendering (relative to the Mac) slowly. On the iPad, they seem to render nearly instantly. (802.11n Wi-Fi helps too.)

The Maps app is crazy fast. Apps launch fast. Scrolling is fast. The Photos app is fast. . . .

But: everyone I spoke to in the press room was raving first and foremost about the speed. None of us could shut up about it. It feels impossibly fast. (And our next thought: What happens if Apple has figured out a way to make a CPU like A4 that fits in an iPhone? If they pull that off for this year’s new iPhone, look out.)

AKMA makes an important point for those of us who share a Text Patterns frame of mind:

And the super-good news, if Apple doesn’t ruin everything (and I don’t trust them not to), is that the iBook app rests on the open EPUB book format. I repeat my assertion/plea that this is the moment for some university press to lay claim to a huge untapped market share.

And for Nick Carr, the iPad (I infer this, anyway) is another step in the decline of writing. Carr thinks that reading is in decent shape, and will probably continue to be, but the reign of “texting” means that “Writing will survive, but it will survive in a debased form. It will lose its richness. We will no longer read and write words. We will merely process them, the way our computers do.”

Wednesday, January 27, 2010

I'm all ears

If I ever write Text Patterns: the Book, this will absolutely be the cover image. (Here, via Culture Making.)

copyright and making sense

We interrupt this hiatus to comment that Larry Lessig's essay in The New Republic on copyright and the Google Book Settlement is by far the best thing I've read on the settlement and the many associated issues. Sample:

My wife had just given birth to our third child. On the morning of the child’s third day, doctors were worried about jaundice. By the evening, the child had fallen into a state of severe lethargy. We called the doctor. He wanted a report in two hours. If she did not improve, he wanted her taken to the emergency room. By midnight she had not improved, and so I bundled her into the car seat and raced to nearby Children’s Hospital.

As I sat waiting for the doctor, I began reading an article I had found through Google about jaundice and its dangers. Fortunately, the piece was published by the American Family Physician, which makes its articles available freely on the Internet. And so with an increasing feeling of panic, I read about the condition — hyperbilirubinemia — that the doctor feared our child had developed.

I reached a critical part of the article. It referred to a table. I turned the page to see the table. The table was missing. In its place was a notice: “The rightsholder did not grant rights to reproduce this item in electronic media.” No one had licensed the table for free distribution. Distribution was thus blocked. “Have your lawyer call my lawyer,” the article seemingly urged. “We’ll work something out.”

I sat in that waiting room chair staring in disbelief. It was a relief of sorts, to fear for the future of our culture rather than the future of my daughter. But I was astonished. I could not believe that we were this far down the path to insanity already. And that experience spurs me to ask some urgent questions. (The kid is fine, by the way.) Before we continue any further down this culturally asphyxiating road, can we think about it a little more? Before we release a gaggle of lawyers to police every quotation appearing in any book, can we stop for a moment to consider whether this way of organizing access to culture makes sense? Does this complexity get us something we would not get under the older system? Does this innovation in obsessive control produce any new understanding? Is it really progress?

The whole thing is a must-read.

Tuesday, January 26, 2010

report

Back from my travels — where I was treated wonderfully hospitably by the good folks at Baylor — but under the weather. I'll get back on the horse ASAP, but I don't know when that will be.

Saturday, January 23, 2010

Amazon's bad move

Amazon's decision to open the Kindle platform for app development is not smart. It seems obvious that Amazon is anticipating the arrival of the Great Apple Tablet and is trying to forestall its dominance by turning the Kindle into a multiple-use device. In other words, Amazon is granting Steve Jobs's argument that "general-purpose devices will win the day because I think people just probably aren’t willing to pay for a dedicated device" and trying, belatedly, to turn the Kindle into a kind of tablet.

This will never work. The Kindle, with its black-on-gray screen and slow processor, is engineered to be a "dedicated device" — dedicated to reading — and simply doesn't have the hardware to be anything else. (As anyone who has tried to use the Kindle's primitive browser knows.) And it's the fact that the Kindle just does this one thing that attracts me, and many other people, to it. I like not being able to do anything but read on it. I don't want other features competing for my attention. And the more assiduously the Kindle tries to bolt on extraneous and (necessarily) poorly-implemented features, the more obvious will be its inferiority to Apple's tablet.

Kit Eaton has argued that 2010 will be the year of the e-reader, but the only year of the e-reader, because e-readers will necessarily be supplanted by the Apple tablet and other general-purpose devices. This may be true in the sense that over time general-purpose devices will outsell e-readers, but e-readers can still be successful products that make a lot of money for their manufacturers — if those manufacturers don't try to ape the tablets, but instead focus on creating the best possible environment for reading.

Friday, January 22, 2010

new world, potentially brave

I’ve written elsewhere, once or twice, about the experience of homeschooling my son Wesley. We’re still at it, and now, in the humanities portion of his curriculum, studying Dirty London — sanitation and social class in the Victorian era. He finished reading Dickens’s Bleak House last week, and today will be wrapping up Steven Johnson’s terrific account of the conquering of cholera in London, The Ghost Map.

I read The Ghost Map last year on my Kindle, but thought I had bought a paperback copy for Wes. However, it appears that I forgot. No problem: I handed him the Kindle, re-read the book via the Kindle app on my iPhone, and then prepared a reading quiz for him that he’ll access in Google Docs. When he writes about The Ghost Map and Bleak House later, I’ll show him how to find some useful sources online, especially through Google Books, and I’ll show him how to find searchable texts of both books, for instance via Amazon’s Look Inside the Book feature. As I was writing up the quiz last night, it struck me how recently this way of doing things — teaching with these particular technologies — would have been unimaginable. And yet to both of us it all seems perfectly natural.

No big deal, I guess, and nothing original here. But sometimes the obvious suddenly strikes home, if you know what I mean.

(Off to Baylor this weekend, back on Wednesday. Light or nonexistent posting until then, but it’s possible that fascinating things will show up via the Twitter feed.)

I got Googled too

Mark Zuckerberg is probably right when he says that privacy is ceasing to be a value; and then of course there's Scott McNealy's notorious — and now long-ago — comment that "You have zero privacy anyway. Get over it." But I tend to think that there are degrees in these matters, and distinctions to be made. A great deal of personal information about me — i.e., about my character, personality, interests, friends, etc. — is available for anyone weird or bored enough to search online for it, but probably not as much as there would be if I used Facebook. But I'm sure I am as open to financial scrutiny, not by the average person googling but by people with access to the financial industries' many tools, as anyone else.

So there may not be as much privacy to preserve as there once was, but there's still some. And while I really don't believe that the particulars of my private life are of any interest to Google — they make money off aggregating data, not parsing it person by person — I can still get uncomfortable when I think about how much of my day-to-day life goes through Google's servers. That sometimes gets me thinking about the virtues of paper-based life organization, but more often what I consider is distributing my online information: moving my calendar from Google Calendar to my Backpack account, shifting from Google Reader to the gorgeous NewsFire or Fever, using a different search engine, and so on.

But here's the thing: Google gets more information from my Gmail account than from all those other sources — plus my search history — combined and multiplied several times over. So if I really want to distance myself from Google, I should probably ditch Gmail, right? And yet I haven't. Minor consideration: a quick check of All Mail yields 23,874 messages. What more can they learn? Major consideration: Gmail is so radically superior to any other email client that I can't bear the thought of going back to the Bad Old Days. And here's how Google and other smart companies defeat our concerns: by making products so valuable to us that we're willing to sell our privacy in order to get them.

Tuesday, January 19, 2010

asciimeo

Speaking of text patterns, how about these beautiful videos in text? And note the related iPhone app.

(Hat tip to Matt Frost.)

reading resolutions

Normally I think such resolutions are bad ideas, but these by Wayne Gooderham are sufficiently anti-resolutional that I like them:

My Reading Resolutions are important to me for the simple reason that if I'm not reading something in which my full interest is engaged, the feeling of disaffection tends to encroach upon all other areas of my life, rendering me a shadow of my former self, left to wander listlessly from room to room, sighing heavily and gazing wanly out of windows. Well, metaphorically, at least.

Of course, first and foremost, reading should be a pleasurable activity. Therefore, the whole point of my Reading Resolutions is to make me a better reader (thereby increasing my reading pleasure and the pleasure I get out of life, and so on). To this end, if it turns out I have misjudged a resolution and it is in fact having a detrimental effect on my reading life (and all that follows), I don't hesitate in breaking it. For example, one of my RRs for 2009 was to finish every book I started. This was a resolution I was forced to stick to at the time due to a project I was working on, and meant long and painful slogs through The Tin Drum, East of Eden and The Glass Bead Game (apologies in advance if these are your favourite books: they just weren't for me). Now, at the end of 2009, I'm happily breaking this resolution and reverting back to my old reading habit of giving up on books I'm not enjoying, on the grounds that life's too short to spend reading something you don't like.

information wants to be really, really expensive

Nick Carr is grumpy in ways I find consistently interesting. I’m going to quote a big chunk here and commend it to your thinking apparatus:

Never before in history have people paid as much for information as they do today.

I'm guessing that by the time you reached the end of that sentence, you found yourself ROFLAO. I mean, WTF, this is the Era of Abundance, isn't it? The Age of Free. Digital manna rains from the heavens.

Sorry, sucker. The joke's on you.

Do the math. Sit down right now, and add up what you pay every month for:

-Internet service

-Cable TV service

-Cellular telephone service (voice, data, messaging)

-Landline telephone service

-Satellite radio

-Netflix

-Wi-Fi hotspots

-TiVo

-Other information services

So what's the total? $100? $200? $300? $400? Gizmodo reports that monthly information subscriptions and fees can easily run to $500 or more nowadays. A lot of people today probably spend more on information than they spend on food.

The reason we fork out all that dough is (I'm going to whisper the rest of this sentence) because we place a high monetary value on the content we receive as a result of those subscriptions and fees.

Now somebody remind me how we all came to think that information wants to be free.

Of course, not all of us are on the hook for all of those, but it’s worth taking a few minutes to add up what we do pay for, and how much. Sobering.

One of Nick's commenters suggests that his point is misleading because we're not paying all that much per bit of data. That's probably true, but it may not make the point the commenter wants it to make. Consider an analogy to restaurant dining: Americans in the past twenty years have spent far, far more on eating out than any of their ancestors did, and that's a significant development even if you point out that huge portions of fat-laden food mean that they're not paying all that much per calorie. In fact, that analogy may work on more than one level: are we unhealthily addicted to information (of any kind, and regardless of quality) in the same way that we're addicted to fatty foods?

Monday, January 18, 2010

things to come

Friends, I've been traveling the past few days — visiting family in Alabama, driving around in my daddy's old pickup, eating at Chick-fil-A and Jim 'n' Nick's — yeah, they've gotten kinda fancy and above their raisin', what with slick websites and Twitter feeds and all, but they still serve darn good smoked meat — and I have now returned to a big pile of work. And then at the end of the week I'll be headed to Texas for a few days to give a talk at Baylor on "The End of the Book and the Future of Reading."

Ergo, posting will be light for a while — but take heart, interesting links will continue to appear on my tumblelog, and all those posts are accessible via the Twitter feed you will see to the right of this page. And I may post a sample passage or two from the upcoming lecture.

speak, memory

Evan Maloney writes thoughtfully about how inconsistent our memories of books can be. "Are our memories of books determined by how much we enjoy them? Not for me. I read Kelman's How Late It Was, How Late in the mid-90s. I thought it was fantastic, and I never thought of it again until someone mentioned it last year. Conversely, in 2002 I read John Irving's A Widow for One Year, and I thought very little of it, and yet I often remember the little I thought." This is true for me as well: I can't discover any pattern that would account for what I remember and what I forget.

Maloney concludes by saying "Nobody can fully understand or explain the relationship between reading and memory. And that's a wonderful thing, because the mystery of how we remember a book is something that leads us deep inside the magic of storytelling." Well, if you say so. For me it's more a testimony to the frustrating unreliability and irregularity of memory.

Thursday, January 14, 2010

Kindles and the blind

Here’s a curious story:

Three U.S. universities will stop promoting the use of Amazon.com's Kindle DX e-book reader in classrooms after complaints that the device doesn't give blind students equal access to information.

Settlements with Case Western Reserve University in Cleveland, Pace University in New York City and Reed College in Portland, Oregon, were announced Wednesday by the U.S. Department of Justice. The National Federation of the Blind and the American Council of the Blind had complained that use of the Kindle devices discriminates against students with vision problems.

The complaints about the Kindle were based on the Americans with Disabilities Act, which prohibits discrimination on the basis of disability.

The three universities were among six schools participating in an Amazon.com pilot program testing the use of the Kindle DX in classrooms. On Monday, a fourth participating school, Arizona State University, also reached a settlement with the DOJ and the two organizations representing the blind.

Three other schools announced in late 2009 they will not deploy Kindle in classrooms.

The Kindle DX has the capability to convert text to synthesized speech, but the device does not include text-to-speech functionality for its menu and navigational controls, the DOJ said in a press release. Some reviewers and users of the device's text-to-speech software have also said the speech is difficult to listen to and the conversion can be inaccurate.

These points are absolutely correct, but doesn't the logic require that the universities stop using books as well? Presumably in classes that do require paper codices, blind students do not use those editions, but rather Braille editions (or, if students just have very poor vision, large-print editions). So why not follow the same policy in this case — Kindles for sighted students and Braille editions for the blind? I'm sure I'm missing something.

(Via Slashdot.)

Wednesday, January 13, 2010

it's a comin' and it's gonna be big

Kevin Kelly seems to be confused. About the (supposedly) emerging Brave New World he calls the Technium, he says, “I acknowledge the fact that multitasking and BlackBerrys and iPods and Twitter can be distracting. But we don’t really have the option of ignoring it.” But then, immediately afterwards, he says, “I think it’s necessary and good that there will always be an opt-out option.” Isn't that what in the pre-Technium days we used to call a contradiction?

Really, all Kelly means — as the whole interview shows — is that most people won’t opt out of new technological possibilities. But of course that tells us nothing about how many will, how many should, and how many will actually never have the Technium available to them because they’re too poor.

But in these matters Kelly is an evangelist — literally:

KELLY: But I don’t think the Technium is only about humans. It’s a type of learning. It’s a type of expression. It’s a type of possibility.

The Technium works as an ecology. Just as evolution has a longterm direction as we look 4 billion years into the past, so technology increases complexity and diversity, with increasing power.

LAWLER: So technology is part of evolution or God—that which drives the universe?

KELLY: Exactly. Some people call this the Great Story. Roving preacher Michael Dowd talks at churches about this alternative creation story. It is about evolution through God, that which started from nothing, grew into particles that gained mass and complexity, and then clumped into molecules and then became dust and planets and so forth. And technology is the latest variety.

LAWLER: So the Technium is one of the ways in which the universe is getting to know itself? And by increasing complexity, the universe becomes more self-aware?

KELLY: Exactly. I think of God as the intelligence of mind that is increasing the complexity of the universe.

He goes on to say, “In a sense there is no God as yet achieved, but there is that force at work making God, struggling through us to become an actual organized existence, enjoying what to many of us is the greatest conceivable ecstasy, the ecstasy of a brain, an intelligence, actually conscious of the whole, and with executive force capable of guiding it to a perfectly benevolent and harmonious end. That is what we are working to. When you are asked, ‘Where is God? Who is God?’ stand up and say, ‘I am God and here is God, not as yet completed, but still advancing towards completion, just in so much as I am working for the purpose of the universe, working for the good of the whole of society and the whole world, instead of merely looking after my personal ends.’”

Oh wait — that isn’t Kevin Kelly, that's George Bernard Shaw, writing one hundred and three years ago. Kelly appears not to be aware how elderly his theology is — and how little its persuasive power is augmented by calling it the Technium instead of the Life Force.

(My title, by the way, comes from an old Roz Chast cartoon: those are the words spoken by one of the "Four Press Agents of the Apocalypse.")

Tuesday, January 12, 2010

Farewell, Mr. Dalton

From Christopher at Survival of the Book I learned that the last B. Dalton bookstores are closing. I can't help but feel some nostalgia about this, because B. Dalton is one of the three employers I have had in my entire life. And my first employer.

B. Dalton came to Birmingham, Alabama, in the summer of 1975, when I had just graduated from high school, as part of a brand-new mall called Century Plaza. (The mall itself, after many years of decline, closed last year.) I was sixteen, and this was my first job. For the first month, when we couldn't get into the still-under-construction building, we worked out of a mini-warehouse, lugging books and supplies down a long alley the UPS trucks were too big to get through. Then, when the store became available, we lugged them all back down the alley and onto trucks, drove half a mile, and unloaded them into the store. All this in an Alabama July. I don't know whether I’ve ever been so consistently hot, tired, and sore.

What I remember most about that month is the wonder of taking from their boxes book after book that I had never heard of before, reading their covers, and then setting many of them aside to purchase at my 30% employee discount. (We couldn't buy anything until we got into the store and got the registers set up, which meant that when I finally got to purchase the books, I used up most of my paycheck.)

What impresses me now, as I look back, is the hopefulness that underlay the original inventory of the store. It contained a remarkably full selection of literature, philosophy, science, anthropology, psychology — you name it. Of course, many of those books didn't sell, and the home office in Minnesota knew it: B. Dalton was the first bookstore chain to computerize its inventory, and every evening when we closed we had to set the registers to send the day’s data through a modem to the General Office. So after six months the store’s inventory had altered for the worse, from my point of view, even though it certainly matched our clientele better.

(Incidentally, even before the store opened we got several calls a day from people wanting to speak to Mr. Dalton. It was remarkably hard to convince some of them that there was no Mr. Dalton, that the store's name was a fiction. People are used to that sort of thing now — I doubt that anyone comes to Applebee's looking for Mr. Applebee — but it was a relatively unfamiliar practice at the time.)

Before the official opening date of the mall, the store was ours, and all the books therein. It was therefore something of a shock to have to open those gates one morning and let the rabble in. We felt violated, somehow — especially since we had been in the store until well after midnight the night before cleaning the parquet floors and then mopping them with linseed oil.

I was actually the person who opened the gate the first time, and as I did a woman walked in, approached me, and asked a question. Our first customer! “Do you have The Seagull, by Jonathan Livingston?”

I paused. “Um, you must mean Jonathan Livingston Seagull.”

“No,” she said, confidently. “I know what I want. I want The Seagull, by Jonathan Livingston.”

I paused again. “Well,” I said finally, “we don't have that.”

She turned on her heel and walked out. And thus began my career as a book salesman.

Monday, January 11, 2010

a new online review

The New Republic is launching its online book review, “The Book”, today. Looks very good indeed — I’m especially pleased by the inclusion of articles and reviews from the magazine’s illustrious archives. Isaac Chotiner is very concerned that we understand that this is not a dumbed-down or short-attention-span version of the magazine’s reviews:

The first thing to know about The Book is that it is a supplement to our print content–an attempt to apply the new technology to the old and untarnished purposes. While our online book review will certainly be lively, it will not be significantly more relaxed than our magazine itself. We are not slumming here, or surrendering to the carnival of the web. Quite the contrary. We are hoping to offer an example of resistance to it. Many of the writers you will read in The Book are the same writers you will read in the magazine. Their subjects, too, will be the same. Here you will find criticism, not blogging; pieces, not posts. Four or five times a week we will publish a new review of a new book. The length of these reviews will vary, and we will count on our readers sometimes to sustain an attention-span that is not generally required for reading online. Our main review of the day, which is the central feature of the site, will range widely over the genres. Fiction, history, art, poetry, scholarship, philosophy, children’s books, food books – all the books that, at least in our judgment, a thoughtful American should know about.

Here’s hoping it goes well. It’s hard not to think that the whole magazine (which has recently suffered significant budget cuts and layoffs) will be heading in this direction.

the absent presence

Brian Croxall wrote a paper for a session at the recent Modern Language Association Convention, but couldn't be there himself to read it. And that became, in a way, the subject of the paper:

Again, I’m not at the MLA this year because it’s not economically feasible. I had hoped to be here for job interviews—as well as to speak as a member of this panel discussion. This was my third year on the job market, and I applied to every job in North America that I was even remotely qualified for: all 41 of them. Unfortunately, I did not receive any interviews, despite having added two articles accepted by peer-reviewed journals, five new classes, and several new awards and honors to my vita. According to my records, applying to those 41 jobs cost me $257.54. I was prepared to pay the additional expenses of attending the MLA ($125 for registration, $279.20 for a plane ticket, approximately $180.00 for lodging with a roommate at a total of $584.20) out of pocket so that I could have a chance of getting one of those 41 jobs. I was even luckier than most faculty (remember, most of today’s faculty are contingent) in that my institution was willing to provide me with $200 support to attend conferences throughout the academic year. But once it became apparent that I wasn’t going to be having any interviews, I could no longer justify the outlay of $400.00 out of a salary that puts me only $1,210 above the 2009 Federal Poverty Guidelines. (And yes, that means I do qualify for food stamps while working a full-time job as a professor!)

I teach at a college that sends a great many of its graduates on to further education. I’ve written recommendation letters for students who went on to Yale, Harvard, Stanford, Chicago — you name it. Students that I’ve been particularly close to now teach at a wide range of institutions, for example Washington and Lee and Cornell, and others are doing very well in graduate school now. But it has become increasingly difficult for me to feel good about writing those letters. My default approach is to discourage, though I am always willing to help those people who persist through my discouragement. But at this point I simply cannot believe that the institutional structures of the university that we are all familiar with will last much longer. I just wish I could imagine what it is that will replace them.

Friday, January 8, 2010

"How is the Internet changing how you think?"

The “Question of the Year” at Edge is: “How is the Internet changing the way you think?” (Though some parts of the website phrase it differently: "How has the Internet changed the way you think?") I will be commenting on some of the answers — though not all 159 of them — in the coming days and weeks, but I want to start with Danny Hillis, because he makes an essential distinction that not too many others will bear in mind:

It seems that most people, even intelligent and well-informed people, are confused about the difference between the Internet and the Web. . . . The Web is a wonderful resource for speeding up the retrieval and dissemination of information and that, despite Wolfe's trivialization, is no small change. Yet, the Internet is much more than just the Web. . . . By the Internet, I mean the global network of interconnected computers that enables, among other things, the Web. I would like to focus on applications that go beyond human-to-human communication. In the long run, these are applications of the Internet that will have the greatest impact on who we are and how we think.

Today, most people only recognize that they are using the Internet when they are interacting with a computer screen. They are less likely to appreciate when they are using the Internet while talking on the telephone, watching television, or flying on an airplane. Some travelers may have recently gotten a glimpse of the truth, for example, upon learning that their flights were grounded due to an Internet router failure in Salt Lake City, but for most this was just another inscrutable annoyance. Most people have long ago given up on trying to understand how technical systems work. This is a part of how the Internet is changing the way we think.

I want to be clear that I am not complaining about technical ignorance. In an Internet-connected world, it is almost impossible to keep track of how systems actually function. Your telephone conversation may be delivered over analog lines one day and by the Internet the next. Your airplane route may be chosen by a computer or a human being, or (most likely) some combination of both. Don't bother asking, because any answer you get is likely to be wrong.

Soon, no human will know the answer. More and more decisions are made by the emergent interaction of multiple communicating systems, and these component systems themselves are constantly adapting, changing the way they work. This is the real impact of the Internet: by allowing adaptive complex systems to interoperate, the Internet has changed the way we make decisions. More and more, it is not individual humans who decide, but an entangled, adaptive network of humans and machines.

It seems to me difficult to overstress how important this is — and how much more important than the ways we interact with our personal computers.

Thursday, January 7, 2010

man of sorrows

My meditation on Samuel Johnson — on the three hundredth anniversary of his birth — is now available online.

Wednesday, January 6, 2010

suspiciously wiki

Wonderful post by Tim Carmody over at Snarkmarket about what I think of as a useful new word — or a new use of an already existing word. Someone had said of a piece of information given by someone else, “This story sounded suspiciously Wiki to me.” And as Tim points out, we all know exactly what the person means:

The obvious colloquial analogue would be “the story seemed fishy.” But note the distinction. A “fishy” story, like a “fish story,” is a farfetched story that is probably a lie or exaggeration that in some way redounds to the teller’s benefit. A “wiki” story, on the other hand, is a story, perhaps farfetched, that is probably backed up by no authority other than a Wikipedia article, or perhaps just a random website. The only advantage it yields to the user is that one appears knowledgeable while having done only the absolute minimum amount of research.

While a fishy story is pseudo-reportage, a wiki story is either pseudo-scientific or pseudo-historical. Otherwise, wiki-ness is characterized by unverifiable details, back-of-the-envelope calculations, and/or conclusions that seem wildly incommensurate with the so-called facts presented.

I’m going to start using this word in commenting on student papers.

I love Wikipedia — I use it every day — but it yields farfetched stories sometimes because people who write many of the articles rely on outdated information. Sometimes way outdated. For instance, in the generally useful article on the codex we find this passage:

The basic form of the codex was invented in Pergamon in the third century BCE. Rivalry between the Pergamene and Alexandrian libraries had resulted in the suspension of papyrus exports from Egypt. In response the Pergamenes developed parchment from sheepskin; because of the much greater expense it was necessary to write on both sides of the page.

No citation is given, and I found myself wondering where this information had come from and whether it is true. It sounded suspiciously Wiki to me. A day or two later, I happened to discover the origin of the claim: Pliny’s Natural History. Modern historians see no evidence for the story.

Tuesday, January 5, 2010

the tweet enthusiast

David Carr is pretty darn excited about Twitter. Now, I enjoy Twitter myself, as I think I’ve made clear — I’m well over 3000 tweets now — but Carr’s enthusiasm is, well, kind of annoying. And his case for the importance of Twitter just isn't substantive. He writes,

And now, nearly a year later, has Twitter turned my brain to mush? No, I’m in narrative on more things in a given moment than I ever thought possible, and instead of spending a half-hour surfing in search of illumination, I get a sense of the day’s news and how people are reacting to it in the time that it takes to wait for coffee at Starbucks. Yes, I worry about my ability to think long thoughts — where was I, anyway? — but the tradeoff has been worth it.

Okay, so what Twitter offers must be really amazing to make it worth his while to lose his ability to “think long thoughts.” So what does it do for him? Well, this is an interesting point:

The expressive limits of a kind of narrative developed from text messages, with less space to digress or explain than this sentence, has significant upsides. The best people on Twitter communicate with economy and precision, with each element — links, hash tags and comments — freighted with meaning. Professional acquaintances whom I find insufferable on every other platform suddenly become interesting within the confines of Twitter.

But here’s where I get a bit dubious:

At first, Twitter can be overwhelming, but think of it as a river of data rushing past that I dip a cup into every once in a while. Much of what I need to know is in that cup: if it looks like Apple is going to demo its new tablet, or Amazon sold more Kindles than actual books at Christmas, or the final vote in the Senate gets locked in on health care, I almost always learn about it first on Twitter.

“Need to know”? If you’re going to give examples of things that Twitter tells you that you “need to know,” you really need to do better than mention yet another in an endless series of Apple rumors. And even if you first hear about the Senate vote on health care via Twitter, how much longer would you have to wait to hear about it from some other source? Maybe half an hour, until it shows up on the New York Times home page? What does the immediacy of Twitter actually do for you in that situation?

Near the end of his piece, when Carr wants to demonstrate how “Twitter can flex some big muscles,” here’s his example: the morning after the attempted plane bombing in Detroit, someone tweeted from the Montreal airport, “New security rules for int’l flights into US. 1 bag, no electronics the ENTIRE flight, no getting up last hour of flight.” Carr then says, “imagine you or someone you loved was flying later that same day: Twitter might seem very useful.”

Really? That’s “flexing some big muscles”? If you were flying later that same day, this wouldn't even be enough lead time for you to finish that presentation you were planning to finish on the plane. There’s a case to be made for the value and importance of Twitter, but surely this ain’t it. The whole piece sounds like the self-justification of an addict. Maybe Carr needs to get away from Twitter for a while so he can think some long thoughts — long enough, anyway, to make a plausible case that he really is better off tweeting than thinking.

Monday, January 4, 2010

culling the tomes

Good advice here, largely from writers, about how to figure out which books in your personal library to get rid of. Aside from Joshua Ferris, who vehemently rejects the idea of eliminating any books from his personal library, the contributors acknowledge that sometimes people run out of room, or are moving somewhere where space will be at a premium, and therefore must cull their tomes. (“Cull the Tomes” — isn't that an old Scots folk song?)

So what should be the guidelines? The most common suggestions are that you eliminate (a) books you know you’ll never read and (b) books you have read but know you’ll never read again. These are good recommendations, but it must be noted that they require honest self-examination, which is something that not all of us are good at. And they also require predicting the future, which, again, is not a universally acquired skill.

When I was in college I acquired almost every book in the series of Latin American fiction published by Avon Books: Julio Cortázar’s Hopscotch and 62: A Model Kit, Jorge Amado’s Gabriela, Clove, and Cinnamon, everything by Gabriel García Márquez, Epitaph of a Small Winner by Machado de Assis — I could go on. I can still remember the cover designs quite vividly. But when I got to grad school and knew that I would be specializing in seventeenth-century literature, I traded in the whole collection so I could acquire more books in my chosen field. (I didn't have enough money to buy them outright, and there wasn’t room in the grad-student-sized apartment my wife and I shared.)

That made sense at the time. But what I didn't realize was that in the very last class I took in graduate school I would discover the poetry and prose of W. H. Auden, and would end up gradually shifting my scholarly interests to the twentieth century — which in turn led, some years later, to my being asked to teach a course in contemporary Latin American literature in translation. So I had to go out and buy most of those books again — the ones that were still in print, anyway.

So when it comes to getting rid of books, remember this: you never know what you’re going to need somewhere down the line.

(All that said, I'm headed out with a friend later this morning to sell some books. Or, more likely, to trade them. I always plan to come home with cash in my pocket, but oddly, it rarely works that way. . . .)