Text Patterns - by Alan Jacobs

Thursday, December 30, 2010

posts unwritten, end-of-year edition, part 2

How interesting would it be to have a writer’s every keystroke recorded and played back? Pretty interesting, perhaps, but I don't want it to happen to me.

Though I think Pynchon’s Mason & Dixon is one of the greatest novels of the twentieth century, I somehow never got around to reading his next, Against the Day — but Dale Peck makes me think I should. I could blog my way through it right here. . . .

Carlin Romano worries, intelligently, about whether professors will retain the strength of will to assign whole books, given shortening attention spans. This here professor will, but that's just one data point. And the difficulty of a tough book is, after all, a large part of the pleasure of reading it.

More than a year ago Matthew Battles warned us against un-historical invocations of Gutenberg.

C. W. Anderson has a really cool annotated syllabus for Print Culture 101.

I think Heart of Darkness is a really bad choice for a graphic novel retelling — too much of its power lies in the magnificent narrative voice, e.g.:

I looked at him, lost in astonishment. There he was before me, in motley, as though he had absconded from a troupe of mimes, enthusiastic, fabulous. His very existence was improbable, inexplicable, and altogether bewildering. He was an insoluble problem. It was inconceivable how he had existed, how he had succeeded in getting so far, how he had managed to remain -- why he did not instantly disappear. 'I went a little farther,' he said, 'then still a little farther -- till I had gone so far that I don't know how I'll ever get back. Never mind. Plenty time. I can manage. You take Kurtz away quick -- quick -- I tell you.' The glamour of youth enveloped his parti-coloured rags, his destitution, his loneliness, the essential desolation of his futile wanderings. For months -- for years -- his life hadn't been worth a day's purchase; and there he was gallantly, thoughtlessly alive, to all appearances indestructible solely by the virtue of his few years and of his unreflecting audacity. I was seduced into something like admiration -- like envy. Glamour urged him on, glamour kept him unscathed. He surely wanted nothing from the wilderness but space to breathe in and to push on through. His need was to exist, and to move onwards at the greatest possible risk, and with a maximum of privation. If the absolutely pure, uncalculating, unpractical spirit of adventure had ever ruled a human being, it ruled this bepatched youth. I almost envied him the possession of this modest and clear flame. It seemed to have consumed all thought of self so completely, that even while he was talking to you, you forgot that it was he -- the man before your eyes -- who had gone through these things. I did not envy him his devotion to Kurtz, though. He had not meditated over it. It came to him, and he accepted it with a sort of eager fatalism. I must say that to me it appeared about the most dangerous thing in every way he had come upon so far.

This can't be represented graphically any more than a Picasso can be represented textually.

Wednesday, December 29, 2010

does anything change anything?

Marshall Poe says that “the Internet changes nothing”:

The media experts, however, tell us that there really is something new and transformative about the Internet. It goes under various names, but it amounts to “collaboration.” The Internet makes it much easier for people to do things together. Look, they say, at email discussion lists, community blogs, auction sites, product rating pages, gaming portals, wikis, and file trading services. Collaboration abounds online. That’s a fair point. But “easier” is not new or transformative. There is nothing new about any of the activities that take place on the aforementioned sites. We did them all in the Old World of Old Media. As for transformative, the evidence is thin. The basic institutions of modern society in the developed world—representative democracy, regulated capitalism, the welfare net, cultural liberalism—have not changed much since the introduction of the Internet. The big picture now looks a lot like the big picture then. . . .

Following this logic, let me also affirm that the printing press changed nothing: sure, it made making books easier, but “easier” is not new or transformative. People wrote and read books before the printing press, and they continued to write and read them afterwards. What’s the big deal?

Similarly, the internal combustion engine changed nothing. Before it was invented, we went to Grandma’s house, and even traveled from New York to Chicago — it just took a little longer. And “faster” is not new or transformative, you know.

I could go on for a while. . . . But in all seriousness, Poe makes some good points along the way. He’s just generating page views with an outrageous thesis. I bet he also advocates using federal municipal bonds to forcibly bus known Communists into your homes to kill your puppies!

Tuesday, December 28, 2010

posts unwritten, end-of-year edition

Looking at my Pinboard and Instapaper pages — how I love those tools — I see so many stories I want to blog about but will probably not find time to. There’s no strict reason why there should be a statute of limitations on such things, and there remains a chance that I’ll come back to some of these stories later, but the end of the year just feels like a time for closing the books on some options and turning to others. So let me take note of a few worthwhile pursuits that I didn't manage to . . . pursue:

Robert Darnton offered “Three Jeremiads” about research libraries, concluding with a plea:

Would a Digital Public Library of America solve all the other problems—the inflation of journal prices, the economics of scholarly publishing, the unbalanced budgets of libraries, and the barriers to the careers of young scholars? No. Instead, it would open the way to a general transformation of the landscape in what we now call the information society. Rather than better business plans (not that they don’t matter), we need a new ecology, one based on the public good instead of private gain. This may not be a satisfactory conclusion. It’s not an answer to the problem of sustainability. It’s an appeal to change the system.

Natalie Binder has a really smart series of posts about the powers and limits of Google’s Ngrams. On the same subject, Geoffrey Nunberg is smart, sobering, and sardonic:

It's unlikely that "the whole field" of literary studies—or any other field—will take up these methods, though the data will probably figure in the literature the way observations about origins and etymology do now. But I think Trumpener is quite right to predict that second-rate scholars will use the Google Books corpus to churn out gigabytes of uninformative graphs and insignificant conclusions. But it isn't as if those scholars would be doing more valuable work if they were approaching literature from some other point of view.

This should reassure humanists about the immutably nonscientific status of their fields. Theories of what makes science science come and go, but one constant is that it proceeds by the aggregation of increments great and small, so that even the dullards have something to contribute. As William Whewell, who coined the word "scientist," put it, "Nothing which was done was useless or unessential." Humanists produce reams of work that is precisely that: useless because it's merely adequate. And the humanities resist the standardizations of method that make possible the structured collaborations of science, with the inevitable loss of individual voice. Whatever precedents yesterday's article in Science may establish for the humanities, the 12-author paper won't be one of them.

I can't decide how much of Jaron Lanier’s warning against “The Hazards of Nerd Supremacy” I agree with, if any, but, as he always does, Lanier provokes a great deal of thought here.

I may list a few more of these stories-I-didn’t-write-about in the coming days.

opting out, revisited

Regular readers, if I have any regular readers, will know that this is the kind of thing I strongly disagree with:

Overwhelmed by all the noise, some have simply chosen to block it out — to opt out, say, of social networks and microblog platforms like Twitter. Alternatively, others have hewn close to these social networks, counting on them to sort through all the information coming at us.

But to be informed in the distributed world we live in, opting out isn’t really an option. For better or worse, we are watching a C-Span version of our lives trying to fast-forward to the good parts.

I love this almost-always-on connected life, Lord knows I do, but of course opting out is an option even for those who want to be “informed,” at least for now. I could subscribe to and read only print magazines — even just monthly and quarterly magazines — and be fully informed about everything I need to be informed about.

We tell ourselves, by way of self-justification, that we need Twitter, need our RSS feeds, need Facebook. But no, we don't. We just like them very much. And as far as I’m concerned that’s good enough. It’s just necessary always to remember that we’re making choices and could, if we wished, make different ones about how we’re informed and what we’re informed about.

In this light it’s good to be reminded of a passage from John Ruskin’s Modern Painters that I recently quoted on my tumblelog:

To watch the corn grow, and the blossoms set; to draw hard breath over ploughshare or spade; to read, to think, to love, to hope, to pray — these are the things that make men happy; they have always had the power of doing these, they never will have the power to do more. The world’s prosperity or adversity depends upon our knowing and teaching these few things: but upon iron, or glass, or electricity, or steam, in no wise.

Thursday, December 23, 2010

decision time

Here’s a fascinating little essay by Cory Doctorow on . . . well, it’s complicated. He’s explaining why he’s happy with his decision to self-publish his new collection of stories, but he’s using that situation to explore the problem — or the “problem” — of having too much information and too many options:

I'm not sorry I decided to become a publisher. For one thing, it's been incredibly lucrative thus far: I've made more in two days' worth of the experiment than I made off both of my previous short story collections' entire commercial lives (full profit/loss statements will appear as monthly appendices in the book). And I'm learning things about readers' relationship to writers in the 21st century.

But more than ever, I'm realising that the old problem of overcoming constraints to action has been replaced by the new problem of deciding what to do when the constraints fall away. The former world demanded relentless fixity of purpose and quick-handed snatching at opportunity; the new world demands the kind of self-knowledge that comes from quiet, mindful introspection.

That last sentence is great, and worthy of much reflection. When opportunities for acquiring and disseminating knowledge were fewer, we had to act quickly to seize them: who knew when another would come by? But now, with so much we can know and so many ways to get our ideas out into the world, we need to seek time and space to filter through the options. We need, as never before, the virtues of discernment.

There’s something to think about in the holiday season. I’ll be back in a few days. In the meantime, a Merry Christmas to all, and God bless us every one!

Wednesday, December 22, 2010

portability policies

A typically thoughtful and thought-provoking essay by Jonathan Zittrain, emphasizing the need for internet users — especially those reliant on the cloud for storage of their data — to think about portability as much as they think about privacy:

We enjoy access to massive archives of our digital trail in the form of emails, chats, comments, and other bits of personal ephemera, all stored conveniently out in the cloud, ready to be called up or shared in a moment, from wherever we happen to be, on whatever device we choose. The services stowing that data owe a commitment of privacy defined by a specific policy—one that we can review before we commit. Yet if any of the cloud services we use restrict our ability to extract our data, it can become stuck—and we can become locked into those services. The solution there is for such services to offer data portability policies to complement their privacy policies before we begin to patronize them, to help preserve our freedom to choose services over the long term. By dismissing the principle of net neutrality, however, we endanger that ability not just by one cloud service provider but across the board: ISPs can perform deep packet inspection to glean whatever they can about us as we correspond with different sites across the Internet, and our data can become stranded in places as the shifting sands of our ISPs’ access policies constrict access to places they disfavor. Just as international diplomacy depends on the principles of the inviolable embassy, la valise diplomatique, and mutual reciprocity to operate in the ultimate best interests of all involved, so does the maintenance of an online environment that preserves those aspects that made it such a valuable and central part of modern life in the first place depend on net neutrality.

The analogy to international diplomatic law is especially interesting. My view of cloud storage for my own data seems to change day by day. . . .

Tuesday, December 21, 2010

the inconsistent relativist

Martin Rundkvist writes:

I'm a cultural and aesthetic relativist. This means that I acknowledge no objective standards for the evaluation of works of art. There are no definitive aesthetic judgements, there is only reception history. There is no objective way of deciding whether Elvis Presley is better than Swedish Elvis impersonator Eilert Pilarm. It is possible, and in fact rather common, to prefer Lady Gaga to Johann Sebastian Bach. De gustibus non est disputandum.

And he continues,

Conan the Barbarian, or Aragorn son of Arathorn, or Ronia the Robber's Daughter all represent something of central importance to the heritage sector and to the humanities in general. At the same time, on the one hand they embody something we must always seek to achieve, that is the wondersome excitement of discovering a fantastic past - and on the other something we must avoid if we are to fill any independent purpose at all, as these characters and the worlds they inhabit are fictional. Historical humanities, excepting the aesthetic disciplines, deal with reality. This is our unique competitive selling point that we must never lose sight of.

So here’s my question: if we have no grounds on which to say that one thing is better than another, on what grounds can we say that a particular story or character is “of central importance to the heritage sector and to the humanities in general”? That is, if it’s impossible to make “definitive aesthetic judgments,” what makes it possible to make such definitive judgments about what’s important and what isn't? I’m not sure you can consistently be a relativist about the one and a dogmatist about the other.

Rundkvist, like a lot of people, allows the word “objective” to get him off track. “Universal” is problematic in the same way. Aesthetic judgments, like moral and historical ones, are never made from nowhere and by no one; they’re made by real people in concrete situations, and the needs of both people and situations vary. But such judgments have to be made, and they can be made reliably. It’s really not that hard to make the case that Bach’s music is better than Lady Gaga’s, though there will be situations in which Bach’s music won't be the thing called for. And in much the same way one can make a case for the cultural value and historical importance of pulp fiction, though that importance will be rather different than the kind of importance that attaches itself to Bach's music. We make these kinds of judgments all the time, and only paralyze ourselves when we start invoking terms like "objective" and "universal."

Maybe more about this later. . . .

Monday, December 20, 2010

metadata and our discontents

See that? Huge spike on the word "internet" in . . . 1903. Natalie Binder explains why Google's really bad metadata is going to limit the usefulness of the word-hoard it is assembling.

But hey: it's going to get better.

getting started with Ngrams

Ben Schmidt writes the smartest thing I’ve yet seen about Google’s Ngram project:

But for now: it's disconnected from the texts. This severely compromises its usefulness in most humanities applications. I can't track evolutionary language in any subset of books or any sentence/paragraph context; a literary scholar can't separate out pulp fiction from literary presses, much less Henry James from Mark Twain. It was created by linguists, and treats texts fundamentally syntactically--as bags of words linked only by very short-term connections--two or three words. The wider network of connections that happen in texts is missing.

Don't doubt that it's coming, though. My fear right now is that all of the work is proceeding without the expertise that humanists have developed in understanding how to carefully assess our cultural heritage. The current study casually tosses out pronouncements about the changing nature of 'fame' in 'culture' without, at a first skim, at least, acknowledging any gap at all between print culture and the Zeitgeist. I know I've done the same thing sometimes, but I'm trying to be aware of it, at least. An article in Science promising the "Quantitative Analysis of Culture" is several bridges too far.

So is it possible to a) convince humanists they have something to gain by joining these projects; b) convincing the projects that they're better off starting within conversations, not treating this as an opportunity to reboot the entire study of culture? I think so. It's already happening, and the CHNM–Google collaboration is a good chance. I think most scholars see the opportunities in this sort of work as clearly as they see the problems, and this can be a good spur to talk about just what we want to get out of all the new forms of reading coming down the pike. So let's get started.

Yes, yes, yes. Let the traditional humanists stop sneering; let those on the digital frontier shun the language of “reinvention” and avoid suggesting that they have rendered other approaches to the humanities obsolete. (There aren't many that arrogant, but there are a few.) Nor does this project create a new field. “Let’s get started” is just the right note to strike.

(The article from Science that Schmidt mentions is here.)

UPDATE: If you want to get some thoughts from someone who, unlike yours truly, actually knows what he's talking about, check out Dan Cohen.

Thursday, December 16, 2010

my philosophy of life

Posting will be light to nonexistent for a few days, as I travel to Alabama to visit my family and try to wrap up the end-of-semester . . . stuff, so let me just leave you to meditate on these words of wisdom, words worthy of governing a wise man's life: You have to play the ball where the monkey drops it.

Wednesday, December 15, 2010

paginating, embedding

Two cool posts from if:book: first, a defense of the value of pagination, even in digital texts:

It's important to realise what you're doing when you're scrolling. You're gazing at the line you were reading as you draw it up the screen, to near the top. When it gets to the top, you can continue reading. You do this very quickly, so it doesn't really register as hard work. Except that it changes your behaviour -- because a misfire sucks. A misfire occurs when you scroll too far too rapidly, and the line you were reading disappears off the top of the screen. In this case, you have to scroll in the other direction and try to recognise your line -- but how well do you remember it? Not necessarily by sight, so immediately you have to start reading again, just to find where you were. . . .

Beyond this, even if you have startling accuracy, still you are doing a lot of work, because your eyes must track your current line as it animates across the screen. For sustained reading, this quickly gets physically tiring.

Pagination works for long text, not because it has a real-world analogy to printed books or whatever, but because it maximises your interface: you read the entire screenful of text, then with a single command, you request an entirely new screenful of text. There's very little wastage of attention or effort. You can safely blink as you turn.

A strong argument.

Second, I didn't realize that the Internet Archive has created a cool tool for embedding whole books in webpages. I’m still trying to decide how useful this is, and what its uses might be, but anyway, it is cool.

Tuesday, December 14, 2010

reading personally

In support of Sarah Werner’s thoughtful comment on my previous post, I’d like to cite this wonderful passage from Edward Mendelson’s book The Things That Matter:

Anyone, I think, who reads a novel for pleasure or instruction takes an interest both in the closed fictional world of that novel and in the ways the book provides models or examples of the kinds of life that a reader might or might not choose to live. Most novels of the past two centuries that are still worth reading were written to respond to both of those interests. They were not written to be read objectively or dispassionately, as if by some nonhuman intelligence, and they can be understood most fully if they are interpreted and understood from a personal point of view, not only from historical, thematic, or analytical perspectives. A reader who identifies with the characters in a novel is not reacting in a naïve way that ought to be outgrown or transcended, but is performing one of the central acts of literary understanding.

Can “identifying” with characters, or reading in order to learn more about yourself, be done badly? Of course. But that would be a poor reason for repudiating such reading altogether. Academic criticism can be done badly too. Or so I hear.

Oprah's Dickens

Hillary Kelly at The New Republic is experiencing considerable agita about Oprah’s selection of two Dickens novels for her book club:

But what can Oprah really bring to the table with these books? Oprah has said that, together, the novels will “double your reading pleasure.” But is that even true? And do the novels even complement each other? Can you connect Miss Havisham’s treatment of time to Carton’s misuse of his “youthful promise”? Well, don’t ask Oprah herself, as she “shamefully” admits she has “never read Dickens.” . . .

Even more confusingly, Oprah’s comments about Dickens making for cozy reading in front of a winter fire misinterprets the large-scale social realism of his work. It stands to reason that her sentimentalized view of Dickens might stem from A Christmas Carol — probably his most family-friendly read and one of his most frequently recounted tales. But her quaint view of Victoriana, as she’s expressed it, belies an ignorance of Dickens’s authorial intentions. Indeed, both A Tale of Two Cities and Great Expectations are dark and disturbing, with elaborate ventures into the seedy underbelly of London and the bloody streets of Paris. How can we trust a literary guide who, ignorant of the terrain ahead, promises us it will be light and easy? . . .

Indeed, Oprah’s readers have been left in the dark. They must now scramble about to decipher Dickens’s obscure dialectical styling and his long-lost euphemisms. . . .

And so on — at considerable length. Yes, we certainly can't countenance such a thing — masterpieces of literature being read by naïve people lacking certified professional instruction! Oh, the humanity! What if someone got caught up in Pip’s love for Estella, or Sydney Carton’s noble self-sacrifice, but failed to parse Dickens’s incisive critique of the Victorian social order? Can you imagine the consequences?

Kelly’s core concern is summed up here: “the sad truth is that, with no real guidance, readers cannot grow into lovers of the canon.” Cannot? Actually, that isn't a sad truth — it’s not a truth at all — though it is quite sad that someone thinks the world’s greatest works of art are so powerless to reach an audience without academic assistance. As a corrective to such dark thoughts she should read another Dickens novel, David Copperfield, especially this passage:

My father had left a small collection of books in a little room upstairs, to which I had access (for it adjoined my own) and which nobody else in our house ever troubled. From that blessed little room, Roderick Random, Peregrine Pickle, Humphrey Clinker, Tom Jones, the Vicar of Wakefield, Don Quixote, Gil Blas, and Robinson Crusoe, came out, a glorious host, to keep me company. They kept alive my fancy, and my hope of something beyond that place and time, — they, and the Arabian Nights, and the Tales of the Genii, — and did me no harm; for whatever harm was in some of them was not there for me; I knew nothing of it. . . . It is curious to me how I could ever have consoled myself under my small troubles (which were great troubles to me), by impersonating my favourite characters in them — as I did — and by putting Mr. and Miss Murdstone into all the bad ones — which I did too. I have been Tom Jones (a child's Tom Jones, a harmless creature) for a week together. I have sustained my own idea of Roderick Random for a month at a stretch, I verily believe.

“This was my only and my constant comfort,” David concludes. “When I think of it, the picture always rises in my mind, of a summer evening, the boys at play in the churchyard, and I sitting on my bed, reading as if for life.”

Reading as if for life. And with no teachers in sight. A miracle indeed; but one repeated every day. Oprah is giving many, many people an incentive to have experiences like David Copperfield’s, and by my lights that’s not a bad thing.

Francesco Franchi

For the chart lover, much to love in Francesco Franchi's Flickr photostream. Via Ministry of Type. (Click on the photo for a much larger and awesomer version.)

Monday, December 13, 2010

closed minds

Peter Conrad:

Hillier compares Chesterton to Dr Johnson, whom he physically resembled thanks to his dropsical belly and rolling gait, and whom he often impersonated in pageants. But Johnson's gruff dismissals – of Scotland, of opera, of Sterne's Tristram Shandy, and of anything or anyone who irritated him – were the expression of quirky prejudice; unlike Chesterton, he never pretended to papal infallibility. Johnson prevailed by bullying Boswell, but Chesterton threatened anathema, as when he disposed of the Enlightenment by announcing: "I know of no question that Voltaire asked which St Thomas Aquinas did not ask before him – only St Thomas not only asked, but answered the questions." There speaks a man with a closed mind, a neo-medievalist who abhorred Jews and pined for the return of an agrarian feudal economy in which every man would be allocated "two acres and a cow".

Once someone says that Johnson — a man who by his own admission “talked for victory,” and was labeled "The Great Cham [Khan] of Literature" (by Tobias Smollett) for good reason — “never pretended to papal infallibility,” you need to be on your guard when he says anything else. Though Johnson is the incomparably greater writer, he and Chesterton manifested a similarly complex balance of confidence and vulnerability. And only a very closed mind — or a very ill-informed one — could deny that Aquinas did indeed anticipate and respond to the key questions posed against the Deity by Voltaire. One may not agree with the Angelic Doctor’s answers to questions Voltaire and his admirers thought unanswerable, but they are there.

Chesterton in that passage is simply trying to point out that our ancestors did not believe as they did merely out of ignorance. They thought about many of the same issues Whiggishly self-congratulatory late-moderns think about, but often came to different conclusions. And it’s actually rather instructive to discover what those conclusions are, and how they reached them. Aidan Nichols’s Discovering Aquinas is quite helpful in this regard.

Friday, December 10, 2010

information considered and reconsidered

Here's a wonderful little post by James Gleick about the meaning of the word "information," according to the OED. A palace indeed. This reminds me of one of my favorite books, Jeremy Campbell's Grammatical Man: Information, Entropy, Language, and Life — probably the first book I read that suggested serious connections among my own work (the interpretation of texts), cognitive science, and computers. It was this book that told me who Claude Shannon was and why he matters so very, very much to the world we now inhabit.

Grammatical Man is almost thirty years old now and much of the science is clearly outdated, but it's still fascinating, and I wish someone brilliant would tell the same story again, in light of present knowledge. Maybe a job for Steven Johnson?

Thursday, December 9, 2010

adventurousness and its enemies, part 3

Nick Carr's post on Craig Mod's brief for interactive storytelling is more incisive and cogent than mine. Not that that's any great achievement in itself . . . but just read Nick's post.

sociopathy

Susan Orlean has written a beautiful, melancholy post about the challenges of dealing with her mother's physical and mental decline — and having to deal with it from hundreds of miles away. She writes,

Sometimes I’m dazzled by how modern and fabulous we are, and how easy everything can be for us; that’s the gilded glow of technology, and I marvel at it all the time. And then my mom will call, and in the course of the conversation she’ll say something disjointed that disturbs me and reminds me of her frailty, and then she’ll mention that it’s snowing hard in Ohio and I’ll wonder how she’s going to get to the grocery store, and I look at my gadgets and gizmos, and I realize none of them will help me. If anything, they’ve filled me with the unreal idea that everything is possible; that virtual is actual; that you can delete things you don’t like; that you can find and have whatever it is you want whenever you want it; but instead I’m learning that the truest, immutable facts of life are a lot harder and slower and sometimes sadder, and always mystifying.

Please do read the whole little essay, which is touching and true.

The first commenter on the post responds in this way: "Susan, why does your note seem a notch too precious to me? We're all amateurs, but we all muddle through. Perhaps it's the Manhattan lifestyle, but most of us expect to have to do these things, take care of children and parents."

Now, there are answers to this comment. One might note that it's one thing to expect to deal with suffering, another thing altogether to be thrown into the midst of it. Theory and practice, you know. One might note that this comment could be equally directed towards someone who wrote a post about being diagnosed with cancer: "Perhaps it's the Manhattan lifestyle, but most of us expect that we will suffer and die." (So quit your whining.) One might also ask what "the Manhattan lifestyle" has to do with anything.

But nobody is likely to bother, because we all know that a person who writes something like this is one of two kinds of sociopath: the simple kind, who genuinely has no compassion for someone else's pain, or the complex kind, who suppresses any compassion in order to try to hurt someone he doesn't know, just for kicks. Obliviousness or intentional cruelty, those are the options. And in either case a critique is futile: the first kind of sociopath wouldn't understand that there's a problem, and the second would just smirk with satisfaction at having gotten a rise out of someone.

Yes, I know, I come back to this over and over again. But I think it matters. This kind of response is so common in online discourse that it forces, or should force, all of us to ask just what kind of people surround us, just how many of these sociopaths there are, and what variety they tend to be. And why they're like that.

Just after I read the post by Orlean and (unfortunately) allowed my eyes to drift down to the comments, I read at Letters of Note the fifteen-year-old John Updike's commendation of the Little Orphan Annie comic strip. Among other things, Updike wrote:

I admire the magnificent plotting of Annie’s adventures. They are just as adventure strips should be—fast moving, slightly macabre (witness Mr. Am), occasionally humorous, and above all, they show a great deal of the viciousness of human nature. I am very fond of the gossip-in-the-street scenes you frequently use. Contrary to comic-strip tradition, the people are not pleasantly benign, but gossiping, sadistic, and stupid, which is just as it really is.

That about sums it up.

Wednesday, December 8, 2010

The Pleasures of Reading are coming at you! (also in a good way)

Not on Amazon yet, but at OUP's site.

The Age of Anxiety is coming at you! (in a good way)

Pre-order here.

class blogs

On Twitter this morning I asked for thoughts on how best to run a class blog, and replies are coming in. People are reminding me of Mark Sample's excellent post on "blog audits," and are tossing around other ideas too.
When I set up blogs for class I tell students that there are five kinds of participation they can engage in:
    • Offering an interpretation of something we've read;
    • Asking a question about something we've read;
    • Linking to, quoting from, and responding to online articles or essays about what we're reading;
    • Providing contextual information — biographical, historical, whatever — about the authors we're reading and their cultural and intellectual worlds;
    • Commenting on the posts of their fellow students.

One question I have is whether I should value some of these kinds of post more highly than others, and reward them accordingly. Any thoughts about that? Any other suggestions?

adventurousness and its enemies, part 2

It's not just in writing that the social can militate against innovation: it happens in teaching too. Some administrators want teachers to be willing to tweak their assignments, their syllabi, and their use of class time on a weekly, or even daily, basis, in response to student feedback — and then simultaneously insist that they want teachers to be imaginative and innovative.

But these imperatives are inconsistent with one another, because students tend to be quite conservative in such matters; and the more academically successful they are, the more they will demand the familiar and become agitated by anything unfamiliar and therefore unpredictable. It is possible for a good teacher to manage this agitation, but it's not easy, and it requires you to have the courage of your convictions.

You get this courage, I think, by being willing to persist in choices that make students uncomfortable. Now, some student discomfort results from pedagogical errors, but some of it is quite salutary; the problem is that you can't usually tell the one from the other until the semester is over — and sometimes not even then. I have made more than my share of boneheaded mistakes in my teaching, but often, over the years, I have had students tell me, "I hated reading that book, but now that I look back on it I'm really glad that you made us read it." Or, "That assignment terrified me because I had never done anything like it, but it turned out to be one of the best things I ever wrote." But if I had been forced to confront, and respond to, and alter my syllabus in light of, in-term opposition to my assignments, I don't know how many of them I would have persisted in. It would have been difficult, that's for sure.

The belief that constant feedback in the midst of an intellectual project is always, or even usually, good neglects one of the central truths of the life of the mind: that the owl of Minerva flies only, or at least usually, at night.

Tuesday, December 7, 2010

adventurousness and its enemies

Yesterday I wrote that insofar as writing becomes social, it will become less, not more, adventurous. Here’s why: imagine that James Joyce drafts the first episode of Ulysses and posts it online. What sort of feedback will he receive, especially from people who had read his earlier work? Nothing very commendatory, I assure you. By the time he posts the notoriously impenetrable third episode, with its full immersion in the philosophical meditations of a neurotic hyperintellectual near-Jesuit atheist artist-manqué, the few readers who haven't jumped ship already will surely be drawing out, and employing, their long knives. Then how will they handle the introduction of Leopold Bloom, and all the attention given to the inner life of this seemingly unremarkable and coarse-minded man? And, much later, the nightmare-fantasia in Nighttown? It doesn't bear thinking of.

Would Joyce be able to resist the immense pressures from readers to give them something they recognize? Of course he would; he’s James Joyce. He doesn't give a rip about their incomprehension. (Which is why he wouldn't post drafts online in the first place, but never mind.) But how many other writers could maintain their commitment to experimentation and innovation amidst a cacophony of voices demanding the familiar? — which is, after all, what the great majority of voices always demand.

Monday, December 6, 2010

crabwise

Umberto Eco always makes me think:

I once had occasion to observe that technology now advances crabwise, i.e. backwards. A century after the wireless telegraph revolutionised communications, the Internet has re-established a telegraph that runs on (telephone) wires. (Analog) video cassettes enabled film buffs to peruse a movie frame by frame, by fast-forwarding and rewinding to lay bare all the secrets of the editing process, but (digital) CDs now only allow us quantum leaps from one chapter to another. High-speed trains take us from Rome to Milan in three hours, but flying there, if you include transfers to and from the airports, takes three and a half hours. So it wouldn’t be extraordinary if politics and communications technologies were to revert to the horse-drawn carriage.

brave new digital world (number 3,782 in a series)

Craig Mod on how the digital world changes books:

The biggest change is not in the form stories take but in the writing process. Digital media changes books by changing the nature of authorship. Stories no longer have to arrive fully actualised. On the simplest level, books can be pushed to e-readers in a Dickensian chapter-by-chapter format — as author Max Barry did with his latest book, Machine Man. Beyond that, authorship becomes a collaboration between writers and readers. Readers can edit and update stories, either passively in comments on blogs or actively via wiki-style interfaces. Authors can gauge reader interest as the story unfolds and decide whether certain narrative threads are worth exploring.

For better or for worse, live iteration frees authors from their private writing cells; the audience becomes directly engaged with the process. Writers no longer have to hold their breath for years as their work is written, edited and finally published to find out if their book has legs. In turn they can be more brazen and spelunk down literary caves that would have hitherto been too risky. The vast exposure brought by digital media also means that those risky niche topics can find their niche audiences.

“Dickensian” in the first paragraph above gives away too much of the game; maybe all of it. In the Victorian era, books were pushed to magazines in a chapter-by-chapter format, and indeed, in that era and in all others allowing for serial publication, “Authors [could] gauge reader interest as the story [unfolded] and decide whether certain narrative threads [were] worth exploring.”

As for readers editing and updating stories, that was certainly a major feature of the publishing world before copyright laws became clear and enforceable: consider, for instance, the unknown writer who altered and continued Don Quixote, pausing only to mock Cervantes for his poverty and his war injuries (Cervantes had lost the use of an arm in the Battle of Lepanto).

There are many literary activities that digital technology makes easier; I’m not sure there are any that it makes possible. And not all the things it makes easier are worth doing. We need to consider these matters on a case-by-case basis.

As for the claim that digital technologies will make writers bolder, I think just the reverse is true, but I’ll explain that in a later post.

Friday, December 3, 2010

opting out of the monopolies

At the Technology Liberation Front, Adam Thierer has been reviewing, in installments, Tim Wu’s new book The Master Switch, and has received interesting pushback from Wu. One point of debate has been about the definition of “monopoly”: Wu wants an expansive one, according to which a company can have plenty of competition, and consumers multiple alternatives, and yet that company can still be said to have a monopoly. (Thierer responds here.)

I think Wu’s definition is problematic and not, ultimately, sustainable, but I see and sympathize with his major point. I can have alternatives to a particular service/product/company, and yet find it almost impossible to escape it because of what I’ve already invested in it. When I read stories like this, or talk to friends who work for small presses, I tell myself that I should never deal with Amazon again — and yet I do, in part because buying stuff from Amazon is so frictionless, but also because I have a significant number of Kindle books now, and all those annotations that I can access on the website. . . . I don't want to lose all that. I can feel my principles slipping away, just as they did when I tried to escape the clutches of Google.

Amazon is not, technically speaking, a monopoly, and neither is Google. But they have monopoly-like power over me — at least for now. And I need to figure out just how problematic that is, and whether I should opt out of their services, and (if so) how to opt out of them, and what to replace them with. . . . Man, modern life is complicated. These are going to be some of the major moral issues of the coming decades: ones revolving around how to deal with services that have a monopolistic role in a given person’s life. Philip K. Dick saw it all coming. . . .

Thursday, December 2, 2010

less than singular

Cosma Shalizi (click through to the original for important links):

The Singularity has happened; we call it "the industrial revolution" or "the long nineteenth century". It was over by the close of 1918.

Exponential yet basically unpredictable growth of technology, rendering long-term extrapolation impossible (even when attempted by geniuses)? Check.

Massive, profoundly dis-orienting transformation in the life of humanity, extending to our ecology, mentality and social organization? Check.

Annihilation of the age-old constraints of space and time? Check.

Embrace of the fusion of humanity and machines? Check.

Creation of vast, inhuman distributed systems of information-processing, communication and control, "the coldest of all cold monsters"? Check; we call them "the self-regulating market system" and "modern bureaucracies" (public or private), and they treat men and women, even those whose minds and bodies instantiate them, like straw dogs.

An implacable drive on the part of those networks to expand, to entrain more and more of the world within their own sphere? Check. ("Drive" is the best I can do; words like "agenda" or "purpose" are too anthropomorphic, and fail to acknowledge the radical novelty and strangeness of these assemblages, which are not even intelligent, as we experience intelligence, yet ceaselessly calculating.)

Why, then, since the Singularity is so plainly, even intrusively, visible in our past, does science fiction persist in placing a pale mirage of it in our future? Perhaps: the owl of Minerva flies at dusk; and we are in the late afternoon, fitfully dreaming of the half-glimpsed events of the day, waiting for the stars to come out.

Beautifully put. But I would argue that any Singularity that can happen without our noticing, or whose happening is a matter for debate, cannot be the Singularity that its worshippers hope for. The unrecognized eschaton is no eschaton at all.

Wednesday, December 1, 2010

Daily Lit

Here's the place to go if you would like to have books emailed to you in installments, at a frequency you set yourself. At first I strongly disliked this idea: I thought, "But what if I get to the end of an installment and still have the time and the inclination to read further?" On further reflection, though, I remembered that many of the books one might choose to read through this service — the novels of Dickens most notably — came originally on the installment plan. So in one sense this returns fiction to an early mode of delivery.

The pleasures of reading plot-driven books are often increased by such periodicity of encounter. During the Harry Potter Years I took delight in the periods between books when we fans could speculate about what was coming next. Conversely, when I recently read the massive cartoon epic Bone I found it thoroughly mediocre — but then suspected that I would have enjoyed it a good deal more if I had read it in its original installments and therefore had that time for speculation. So maybe the Daily Lit can exploit those pleasures.

Only for plot-driven books, though. Character studies require something more like immersion in their intellectual environment. Can you imagine trying to read Mrs. Dalloway at the rate of a few pages a week?