Text Patterns - by Alan Jacobs

Thursday, August 21, 2014

The Genealogy of "Carol Brown": An Intertextual Reading of Parodic-Travestying Song

The Flight of the Conchords’ “Carol Brown (Choir of Ex-Girlfriends)” is an exemplary case study in the intertextuality of the comic song, or rather, the parodic-travestying song (see Bakhtin, “From the Prehistory of Novelistic Discourse”). Its major and obvious debts are to two previous popular songs, one American and one English, which, given that the Conchords are from New Zealand, might allow us to note the ongoing generative power of the postcolonial; but those concerns may perhaps be set aside for now. The tropes of a certain masculinist discourse shall be our primary focus here. “Carol Brown” and its ancestors point to a kind of “gender trouble” (see Judith Butler’s book of that title) in parodic-travestying popular song.

When “Jermaine” — let us employ, with due reservation, his self-nomination — sings “There must be fifty ways that lovers have left me,” he’s clearly signaling a debt to Paul Simon’s 1975 song “Fifty Ways to Leave Your Lover.”

But though “Fifty Ways” is explicitly invoked by the Conchords, a perhaps more direct and substantial influence goes unremarked. This is “Song for Whoever,” by The Beautiful South (1989).

Note that each of the three songs features a list of names, hearkening back to "Madamina, il catalogo è questo” — the famous “catalogue aria” from Mozart and da Ponte’s Don Giovanni — and perhaps even to the genealogies of the Hebrew and Christian Bibles (see, e.g., Genesis 5 and Matthew 1). 

Of the three songs, “Fifty Ways to Leave Your Lover” might at first seem to be the least thoroughly captured by the masculinist rhetorical enterprise, since it features a woman listing the names of men: Jack, Stan, Roy, Gus, and Lee. But this appearance is misleading: note that no woman actually speaks in the song; she is instead spoken for by the masculine singer — and the emphasis is solely on how she relates to him: “The problem is all inside your head, she said to me.” (This is not a song that would pass any musical version of the Bechdel Test.) If a woman seems to have power in this song, it is power yielded to her by the singer, provisionally and temporarily. He remains the true decision-maker.

“Song for Whoever” is more obviously and flagrantly sexist, with its frank emphasis on using the tears of women for financial and reputational gain: “The Number One I hope to reap / Depends upon the tears you weep, / So cry, lover, cry.” Yet the song ultimately deconstructs itself, reaching its aporia in the namelessness of the singer: it is only the women who receive names, while he remains a cipher. He claims the power of speech and song — like Orpheus — but can only receive it by giving up his name, while the specificities of identity remain with the denigrated women. This reversal of power is indirectly acknowledged at the end of the song, with its narration of female vengeance — meant by the singer to be feared, but understood by the listener as a proper and indeed necessary act of retributive justice. 

This “return of the repressed,” as Freud might have called it, finds a completion and intensification in the video of “Carol Brown.” Note here the presence of the woman's name even in the song’s very title — indicative of things to come, as the singer strives unsuccessfully to control the narration of his sexual history. His crucial mistake is the decision to display images of his former lovers, with the obvious purpose of subjecting them to the masculine gaze — but to his surprise and consternation, those images come to life: an ideal instance of the feminine subaltern speaking back to masculinist power. 

Who organized all my ex-girlfriends into a choir  
And got them to sing?  
Ooh ooh ooh, shut up  
Shut up girlfriends from the past  

But — and this is the key point — they do not shut up. (He later repeats his order — “I thought I told you to shut uh-uh-up” — but they do not obey.) Through utterance they overcome their status as mere images, and take control of the song. As Baudrillard might put it, the simulacrum here becomes the hyperreal — and thereby the undoing of the Don Giovanni figure is complete. 

Let me close with one ambiguous, and ambivalent, note. The wild card in “Carol Brown,” the figure that represents and enacts an excess of signification, is “Bret” — whose evident chief trait here is silence. Unlike “Jermaine” and unlike the “Choir of Ex-Girlfriends,” he does not sing. And yet he acts: and his primary acts involve manipulation of the image of “Jermaine,” including, most notably, shrinking him. Thus through “Bret” we see the reversal of the woman-as-enlarging-mirror trope that Virginia Woolf limned so memorably in A Room of One’s Own.

One might then see Bret as a Trickster figure — see Lewis Hyde’s Trickster Makes This World, though one might also describe “Bret” as a “whiteface” version of the “signifying monkey” about which Henry Louis Gates has written so incisively — but a trickster acting in order to help liberate women from imprisonment in the image constructed by the masculine gaze. But does such behavior enact a genuine male feminism? Or does it rather re-inscribe masculinist control in the deceptive guise of the Liberator? These questions will have to be pursued at a later date. 

Wednesday, August 20, 2014

the gravitational pull of DFW

[Image: David Foster Wallace’s annotated copy of a DeLillo book]

Ever since the Harry Ransom Center acquired the papers of David Foster Wallace and started posting photos of his annotated books, there has been a great deal of fuss about them. I think I even posted a few images myself on my Tumblr and/or here. People really started going into rhapsodies when someone posted what he said was DFW’s copy of Ulysses — though eventually he revealed that it was just a prank.  

DFW has become something like a patron saint of close reading, and who knows how many young writers and would-be writers out there have started writing copiously in their books in imitatio Davei? It’s hard to regret this, since careful, attentive reading is a pretty cool thing to be the patron saint of. And even if people start annotating just to be like DFW rather than to understand their books better, chances are that the practice will indeed help them as readers if they stick with it. Fake it till you make it, as the wise men say.

Mike Miley has been working in the DFW archives, and has found it a somewhat harrowing experience, in two ways. First, there is DFW’s habit of reading everything as a commentary on his own struggles and pathologies: 

Critics and fans alike rhapsodize about identifying with David Foster Wallace’s writing as though it can only be consoling and empowering, and I used to think so too, until I got too close and discovered what may be the most important truth about literature, the true “aesthetic benefit of close reading,” though I doubt the Mellon Foundation would be all that interested in hearing about my discovery, as it is beneficial only in the most cautionary of senses: there is such a thing as reading too closely.

Wallace’s annotations suggest that he had been reading too closely, searching for too much validation, guidance, or comfort in the books he read, to the point that his reading only wound up reinforcing his worst tendencies. Wallace found no escape from himself while he was reading; rather, his personal library remained just that: personal, continually bringing him back to his own struggles and inadequacies.

But there is also the danger, the greater danger, that the devoted fan will imitate DFW not just in his moral earnestness and intellectual rigor, but in that very self-absorption: 

And I found myself in danger of following him. Yes, this begins and ends as being about me, the guy in the frosty reading room in Austin, for fandom is always about the fan; the self is always the subject. The artist is, at best, the mask fans wear to distract themselves from the fact that they are looking into a mirror. I learned far more about myself through reading Wallace reading than I learned about David Foster Wallace. I discovered I had been reading Wallace too closely. For years I looked to Wallace for answers to just about everything — how to think, how to live, what to read and how. Turns out, I got what I wanted, if what I wanted was a more erudite way to criticize myself or a higher, more crippling level of self-consciousness than I already had. I did wind up understanding myself better, if only to understand where I might be headed and what I must avoid becoming.

This is why I’ve taken over two years to finish writing this, why I’ve stalled out time and time again in search of the right voice or style or insight into something that feels both too large for me to take on and too close for me to see clearly. This “DFW” persona, this mental state of Wallace’s, was a reflection of mine as well, albeit distorted and exaggerated through a funhouse mirror darkly. Wallace’s work reads like a more articulate, insightful version of the ticker-tape running in our own skulls — this is the cliché that everyone employs to describe Wallace’s writing, and for me it is absolutely true. However, no one really interrogates what that statement means or how far something like that goes. If I keep reading Wallace this closely, will I end up resembling him even more closely? Do the devices I borrow from him here — self-aware reportage, direct interrogation, hyperbolic jokes about mundane locations — show that I have moved beyond him or simply fallen further under his influence? If I continue on this path of emulation, will I reach the same conclusions about being alive as he did?

Miley’s essay is a sobering one, and you get the sense that he reached this level of genuine self-awareness (as opposed to mere self-absorption) just in time. 

I don’t think I’ve seen, in my lifetime, a writer who has generated the kind and intensity of veneration that DFW has. We might contrast his fans to, say, Tolkien fans, who know a little bit about the author — enough to have an image of a man in a colorful waistcoat smoking a pipe — but who can’t spare much time for him because they are so fully absorbed in his legendarium. But the people I know who love every word of Infinite Jest are also fascinated by Wallace himself: they are constantly aware of him as its author, of its relations to the circumstances of his own life.

Montaigne said of his Essays that “It is a book consubstantial with its author,” and this seems to be true for everything DFW wrote. Absorption in his work seems almost necessarily to involve scrutiny of his life. And given how his life ended, it’s hard not to see this as a worrisome trend. What I wouldn’t give for a detailed and sensitive ethnography of DFW devotees — something like what Tanya Luhrmann did for charismatic evangelicals. 

Tuesday, August 19, 2014

trolls gonna troll

Here (PDF) is some interesting — or depressing, or unsurprising, or all of the above — research on how people in online communities respond to feedback from their peers. The chief emphasis here is on how the more aggressive and hostile members of such communities respond to being called out for their bad behavior, especially when that calling-out takes the form of being modded down by other members. 

Basically, the response of such folks is twofold. First, they make a point of downvoting other people. Second, they double down on their aggression. So: in online communities aggressive and hostile people respond to criticism by intensifying their aggression and hostility. 

If such people primarily want attention from their peers, then the strategy is a reasonable one. Which is, in relation to my first sentence, why I choose “all of the above” to describe the research. 

On a low-traffic site like this one, it’s feasible for all comments to be held for moderation by me. On high-traffic sites there seems to be no workable solution — except, of course, to eliminate comments altogether.

Thursday, August 14, 2014

what Facebook wants you to know (or not)

Net neutrality not an issue for you? You find Facebook’s algorithmic selectivity non-problematic?

Read Zeynep Tufekci:

And then I switched to non net-neutral Internet to see what was up. I mostly have a similar composition of friends on Facebook as I do on Twitter.  

Nada, zip, nada.

No Ferguson on Facebook last night. I scrolled. Refreshed. This morning, though, my Facebook feed is also very heavily dominated by discussion of Ferguson. Many of those posts seem to have been written last night, but I didn’t see them then.

Overnight, ‘edgerank’ — or whatever Facebook’s filtering algorithm is called now — seems to have bubbled them up, probably as people engaged them more. But I wonder: what if Ferguson had started to bubble, but there was no Twitter to catch on nationally? Would it ever make it through the algorithmic filtering on Facebook? Maybe, but with no transparency to the decisions, I cannot be sure.
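
The mechanism Tufekci describes is not hard to picture, even though Facebook’s actual ranking code is proprietary and invisible to its users. Here is a minimal sketch, in Python, of an engagement-weighted feed of the sort she suspects is at work; the Post fields, weights, and decay rule are my own invented illustration, not anything Facebook has disclosed.

    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        likes: int
        comments: int
        shares: int
        age_hours: float

    def engagement_score(post: Post) -> float:
        # Invented weights: comments and shares count for more than likes,
        # and the score decays as the post ages.
        engagement = post.likes + 2 * post.comments + 3 * post.shares
        return engagement / (1.0 + post.age_hours)

    def build_feed(posts: list[Post], n: int = 10) -> list[Post]:
        # Only the top-scoring posts are shown; everything else is simply invisible.
        return sorted(posts, key=engagement_score, reverse=True)[:n]

On a feed built this way, a breaking story with only a handful of early likes loses out to last week’s viral video, and unless engagement accumulates later (as it apparently did overnight with Ferguson) it never surfaces at all. And, as Tufekci says, the user has no way of seeing what was filtered out.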

Stay on Facebook, and you’ll know only what Facebook wants you to know. 

And if that doesn’t worry you, consider this point from a recent talk by Maciej Ceglowski:

The relationship between the intelligence agencies and Silicon Valley has historically been very cozy. The former head of Facebook security now works at NSA. Dropbox just added Condoleezza Rice, an architect of the Iraq war, to its board of directors. Obama has private fundraisers with the same people who are supposed to champion our privacy. There is not a lot of daylight between the American political Establishment and the Internet establishment. Whatever their politics, these people are on the same team.

Something to keep in mind. Always. 

report from the Luddite kingdom

What world does Michael Solana live in? Apparently, a world where Luddites have taken power and have driven our kind and benevolent technologists into some pitiful hole-and-corner existence, where no one dares to suggest that technology can solve our problems. "Luddites have challenged progress at every crux point in human history. The only thing new is now they’re in vogue, and all our icons are iconoclasts. So it follows here that optimism is the new subversion. It’s daring to care. The time is fit for us to dream again.” 

Yes! Dare to dream! But take great care — do you realize what those Luddites will do to you if you so much as hint that technology can solve our problems? 

I have to say, it’s pretty cool to get a report from such a peculiar land. Where you and I live, of course, technology companies are among the largest and most powerful in the world, our media are utterly saturated with the prophetic utterances of their high priests, and people continually seek high-tech solutions to every imaginable problem, from obesity to road rage to poor reading scores in our schools. So, you know, comparative anthropology FTW. 

And now, two serious points:

1) To quote Freddie deBoer, “Victory is yours. It has already been accomplished.” Is it really necessary for you to extinguish every last breath of dissent — even what comes to us in fiction? Relatedly:

2) Here again we see the relentless cultural policing of the pink police state. Stop reading young adult fiction! Stop writing dystopian fiction! Stop imagining what we do not wish you to imagine! 

In T. H. White’s The Once and Future King, here’s what happens when the Wart is turned into an ant: 

The place where he was seemed like a great field of boulders, with a flattened fortress at one end of it — between the glass plates. The fortress was entered by tunnels in the rock, and, over the entrance to each tunnel, there was a notice which said:

EVERYTHING NOT FORBIDDEN IS COMPULSORY

He read the notice with dislike, though he did not understand its meaning.

Welcome to the ant’s little world, where of course the contrapositive necessarily holds: Everything that is not compulsory is forbidden. In the pink police state there are no adiaphora.

first-person shooter

[Image: police in Ferguson, Missouri]
A few nights ago at the movies I saw a trailer for the last installment of The Hobbit, and caught a brief glimpse of a scene in which someone is driving a cart — pulled by mountain goats? Were those mountain goats?? — along a frozen river, sliding around and knocking into rocky walls. Oh right, I thought, that’s like the glacier track from Cro-Mag Rally.

It’s probably like many other video games as well — I don’t play many, so I couldn’t tell you — but I noted it as a reminder of the extent to which Peter Jackson’s once-excellent filmmaking instincts have been subjugated by video-game aesthetics. I say that as someone who doesn’t think there’s anything intrinsically wrong with video-game aesthetics, in video games; but movies are a different animal and need to be treated differently.

I’m being imprecise, though, and should take more care. What I’ve been calling “video-game aesthetics” is really drawn from a subset of games, primarily side-scrolling games (think of the hobbits running through the caverns of the goblins in the first installment of the series) and first- and third-person shooters. These are appropriate visual styles for certain kinds of game, but I think generally constrain and cartoonify the visuality of cinema.

But because those games are so popular and (especially the shooters) are so utterly central to the experience of, above all, males under forty, we should probably spend more time than we do thinking about how immersion in those visual worlds shapes people’s everyday phenomenology. We do talk about this, but in limited ways, primarily in order to ask whether playing violent games makes people more violent. That’s a key question, but it needs to be broadened. Ian Bogost wants us to ask what it’s like to be a thing, but maybe we need also to ask: What is it like to be a shooter? What is it like to have your spatial, visual orientation to the world shaped by thousands of hours in shooter mode?

I want to suggest that there may be a strong connection between the visual style of video games and the visual style of American police forces — the "warrior cops” that Radley Balko has written (chillingly) about. Note how in Ferguson, Missouri, cops’ dress, equipment, and behavior are often totally inappropriate to their circumstances — but visually a close match for many of the Call of Duty games. Consider all the forest-colored camouflage, for instance:

[Photo: police in camouflage in Ferguson. Credit: AP/Jeff Roberson]
It’s a color scheme that is completely useless on city streets — and indeed in any other environment in which any of these cops will ever work. This isn’t self-protection; it’s cosplay. It’s as close as they can come to Modern Warfare 3:

[Image: screenshot from Call of Duty: Modern Warfare 3]

The whole display would be ludicrous — boys with toys — except the ammunition is real. The guns are loaded, even if some of them have only rubber bullets, and the tear gas truly burns. And so play-acted immersion in a dystopian future gradually yields a dystopian present.

What is it like to be a first-person shooter? It’s awesome, dude.

Wednesday, August 13, 2014

our new robo-reader overlords

“Robo-readers aren’t as good as human readers — they’re better,” the headline says. Hmmm. Annie Murphy Paul writes,

Instructors at the New Jersey Institute of Technology have been using a program called E-Rater in this fashion since 2009, and they’ve observed a striking change in student behavior as a result. Andrew Klobucar, associate professor of humanities at NJIT, notes that students almost universally resist going back over material they’ve written. But, Klobucar told Inside Higher Ed reporter Scott Jaschik, his students are willing to revise their essays, even multiple times, when their work is being reviewed by a computer and not by a human teacher. They end up writing nearly three times as many words in the course of revising as students who are not offered the services of E-Rater, and the quality of their writing improves as a result. Crucially, says Klobucar, students who feel that handing in successive drafts to an instructor wielding a red pen is “corrective, even punitive” do not seem to feel rebuked by similar feedback from a computer….

When critics like Les Perelman of MIT claim that robo-graders can’t be as good as human graders, it’s because robo-graders lack human insight, human nuance, human judgment. But it’s the very non-humanness of a computer that may encourage students to experiment, to explore, to share a messy rough draft without self-consciousness or embarrassment. In return, they get feedback that is individualized, but not personal — not “punitive,” to use the term employed by Andrew Klobucar of NJIT.

There are some serious conceptual confusions and evaded questions here. The most obviously evaded question is this: When students are robo-graded, the quality of their writing improves by what measure?

Les Perelman's objections are vital here. He has written,

Robo-graders do not score by understanding meaning but almost solely by use of gross measures, especially length and the presence of pretentious language. The fallacy underlying this approach is confusing association with causation. A person makes the observation that many smart college professors wear tweed jackets and then believes that if she wears a tweed jacket, she will be a smart college professor.

Robo-graders rely on the same twisted logic. Papers written under time pressure often have a significant correlation between length and score. Robo-graders are able to match human scores simply by over-valuing length compared to human readers. A much publicized study claimed that machines could match human readers. However, the machines accomplished this feat primarily by simply counting words.

And there's this:

ETS says its computer program tests “organization” in part by looking at the number of “discourse units” – defined as having a thesis idea, a main statement, supporting sentences and so forth. But Perelman said that the reward in this measure of organization is for the number of units, not their quality. He said that under this rubric, discourse units could be flopped in any order and would receive the same score – based on quantity.

Other parts of the formula, he noted, punish creativity. For instance, the computer judges “topical analysis” by favoring “similarity of the essay's vocabulary to other previously scored essays in the top score category.” “In other words, it is looking for trite, common vocabulary,” Perelman said. “To use an SAT word, this is egregious.” Word complexity is judged, among other things, by average word length…. And the formula also explicitly rewards length of essay.

Perelman went on to show how Lincoln would have received a poor grade on the Gettysburg Address (except perhaps for starting with “four score,” since it was short and to the point).
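
To make the “gross measures” point concrete, here is a toy scorer of my own devising; it is emphatically not ETS’s actual formula (the word list, weights, and scaling are invented), but it grades an essay on nothing except length, average word length, sentence count, and overlap with a stock of previously rewarded vocabulary, which is the general approach Perelman is describing:

    import re

    # Hypothetical stand-in for the vocabulary of previously top-scored essays.
    REWARDED_VOCAB = {"moreover", "furthermore", "paradigm", "plethora",
                      "myriad", "egregious"}

    def robo_score(essay: str) -> float:
        words = re.findall(r"[A-Za-z']+", essay.lower())
        if not words:
            return 0.0
        sentences = [s for s in re.split(r"[.!?]+", essay) if s.strip()]

        length_score   = min(len(words) / 500.0, 1.0)                     # longer is "better"
        word_len_score = min(sum(len(w) for w in words) / len(words) / 8.0, 1.0)
        vocab_score    = len(set(words) & REWARDED_VOCAB) / len(REWARDED_VOCAB)
        unit_score     = min(len(sentences) / 20.0, 1.0)                  # counts "discourse units"

        # A weighted sum of surface features, scaled to a 0-6 rubric.
        # Nothing in this function knows what the essay means.
        return round(6 * (0.4 * length_score + 0.2 * word_len_score
                          + 0.2 * vocab_score + 0.2 * unit_score), 1)

Reorder the sentences of a high-scoring essay and a scorer like this returns exactly the same number; feed it the Gettysburg Address, which is short and plain-spoken, and it does poorly. That is precisely Perelman's point.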

Notice, not incidentally, that Perelman's actual arguments belie Paul's statement that when “critics like Les Perelman of MIT claim that robo-graders can’t be as good as human graders, it’s because robo-graders lack human insight, human nuance, human judgment.” It's perfectly clear even from these excerpts that Perelman's point is not that the robo-graders are non-human, but that they reward bad writing and punish good. And since the software only follows the algorithms that have been programmed into it, the problem actually begins with the programmers, who may not have any real understanding of what makes writing effective, or — and this seems to me more likely — can't find algorithms that identify it.

I suspect, then, that with this automated grading we're moving perilously close to a model that redefines good writing as “writing that our algorithms can recognize.” So why would any teachers ever adopt such software? That one has a simple answer: because the students are happier when they interact with the machines about their writing than when they have to respond to human teachers. If you read Paul's whole essay, you'll see that that's all the system has to commend it: it pacifies the children, while the teachers just stand by and watch. The software really is teaching the children, and what it's teaching them is to do what the software tells them to do. The achievement here is not improved writing, but improved obedience to algorithmic machines.

Welcome to the future of education.

Friday, August 8, 2014

the broken-glass mystery

Thomas Cranmer by Gerlach Flicke

What you see above is the portrait of Thomas Cranmer, Archbishop of Canterbury, painted in 1545 by Gerlach Flicke. It’s now in the National Portrait Gallery in London. 

My friend Betsy Childs was recently looking at this picture and noticed something curious: tiny pieces of broken glass, or perhaps chipped glass-coating, in the windows behind Cranmer. Here’s a close-up: 

[Image: close-up of the glass in the windows behind Cranmer]

(You can see a high resolution version of the painting here.) Now, this painting is a very detailed one. For instance, Cranmer is holding a copy of the letters of St. Paul and one of the books on the table is Augustine’s On Faith and Works, which together illustrate Cranmer’s commitment to the core Reformation principle of justification by faith. Other elements of the painting have obviously been executed with great care but yield no clear meaning. For instance, what are we to make of the carving on the left — right next to the little strip of paper giving the date of the painting and Cranmer’s age — featuring a naked woman whose private parts are obscured by the face of some strange beast? (The Whore of Babylon, perhaps, against whom Cranmer contended? But why in a carving, and why there?) 

But what might the broken or chipped glass mean? Betsy wondered if I knew, and I don’t have even a guess. I checked Diarmaid MacCulloch’s magisterial biography of Cranmer, and while he discusses this painting at some length (pages 338-42), he doesn’t say anything about the glass. 

So Betsy wrote to the National Portrait Gallery. One of the curators there responded that the problematic glass was only discovered when the painting underwent restoration in the 1990s, and that it is definitely part of the original composition — but they don’t know what it means either. "Artists and patrons at this time had a very refined symbolic vocabulary, much of which has been lost to modern scholars. The painting is laden with Cranmer’s personal iconography and this device could relate to that. Alternatively, there might be an as-yet undiscovered theological interpretation, or a reference to Cranmer’s own works.” 

So: a mystery! Anyone have any guesses? 

Tuesday, August 5, 2014

"Officer, this man stole my authenticity!"

Freddie deBoer is exactly right about how annoying this Hillary Kelly post is. As Freddie points out, it assumes the absoluteness of a distinction that just doesn’t apply absolutely in many places — and, I might add, even when it does apply it doesn’t always apply in the same way. In America we tend to think of the suburbs as refuges for the economically comfortable, while the poor are confined to inner cities; but the relationship between Paris and its banlieues is almost the opposite.  

But what interests me about Kelly’s post is how intensely moralistic her language is. No doubt she would say that she’s exaggerating for effect, but I don’t think that she’d deny that she’s perfectly serious about her point. It’s a classic example of pink-police-state-style boundary policing — but in this case with actual (if highly artificial) boundaries. People from, say, Media who tell folks they’re from Philadelphia are not just simplifying for conversational ease, they are liars. They are fabulists.  

As a great man once said, Why so serious? It can only be because for Kelly being from the city is a mark of authenticity — and being from the burbs is necessarily and tragically inauthentic. Therefore to claim to be from the city when you’re not is an attempt to surreptitiously and dishonestly appropriate urbanite charisma. Being urban is gritty, it’s real; being from the suburbs is vacuous, bland — or so we’re told, even though we know that at best that’s a vague generalization. Kelly elevates a statistical probability into an ontological principle. Which is just silly.  

I was born and raised in the city of Birmingham, Alabama; my wife was raised mainly in one of the over-the-mountain white suburbs. Both of us have always told strangers we’re from Birmingham, and the idea that my wife could be called a liar or a fabulist for saying that strikes me as utterly bizarre. It provides the necessary information without burdening the people we meet with pedantic detail. If we get to know them better we can explore the differences in our upbringing.  

Of all the things to get outraged about! 

advice about advice about the scholarly life

[Image: presentation slide]

I’m going to disagree, a bit, in a way, with Robert George’s Advice to Young Scholars:

Although it is natural and, in itself, good to desire and even seek affirmation, do not fall in love with applause. It is a drug. When you get some of it, you crave more. It can easily deflect you from your mission and vocation. In the end, what matters is not winning approval or gaining celebrity. Your mission and vocation is to seek the truth and to speak the truth as God gives you to grasp it.

There is a particular danger for those who dissent from the reigning orthodoxies of a prevailing intellectual culture. You may be tempted to suppose that your willingness to defy the career-making (and potential career-breaking) mandarins of elite opinion immunizes you from addiction to affirmation and applause and guarantees your personal authenticity and intellectual integrity. It doesn’t. We are all vulnerable to the drug. The vulnerability never completely disappears. And the drug is toxic to the activity of thinking (and thus to the cause of truth-seeking).

To me, the reality of this temptation, no less than any other temptation, should keep us mindful of the need constantly to tend the garden of one’s interior life. If anything can immunize us against the temptation to love applause above truth, it is prayer.

There is obvious and important truth in this, for the Christian scholar, and yet I think George has phrased the matter in terms that are too individualistic. (I doubt that George would disagree with any of what I’m about to present, and would probably say that it is implicit in what he wrote. But I think it needs to be more explicit.) 

The prayerfulness that George rightly emphasizes needs to be not just private prayer — which is what at least some of his readers will think he means — but communal prayer, within a body of faithful believers. I agree that my "vocation is to seek the truth and to speak the truth as God gives you to grasp it,” but I need to test my own self-understanding against that of the larger body. I need people to whom I am accountable to tell me when they think my discernment is flawed. They won’t always be right; but without them I am sure to be usually, if not invariably, wrong. 

I think we should keep that point in mind when we reflect on George’s absolutely vital point about the dangers of congratulating ourselves on our “dissent from the reigning orthodoxies”; even when we do so dissent we can still, without realizing it, crave “applause more than truth.” Jamie Smith recently articulated a very sharp version of this point: 

Those "courageous" progressives don't really value the opinions or affirmations of conservative evangelicalism anyway. What they really value, long for, and try to curry is the favor of "the Enlightened” — whether that's the mainstream academy or the progressive chattering class who police our cultural mores of tolerance. Sure, these "courageous" progressives will take fire from conservative evangelicals — but that's not a loss or sacrifice for them. Indeed, their own self-understanding is fueled by such criticism. In other words, these stands don't take "courage" at all; they don't stand to lose anything with those they truly value.  

Similarly, "courageous" conservatives who "stand up" to the progressive academy aren't putting much at risk because that's not where they look for validation and it's not where their professional identities are invested. They are usually "populists" (in a fairly technical sense of the word) whose professional lives are much more closely tethered to the church and popular opinion. And in those sectors, "standing up to" the academy isn't a risk at all — it's a way to win praise. When your so-called contrarian stands win favor from those you value most... well, it's hard to see how "courage" applies. 

To this I reply with a warm Yes — and also an Ouch, because I’m sure I have done just this: patted myself on the back for my “courage” in standing up to people whose approval I didn’t want anyway. The key point is: We always want someone’s applause, someone’s approval. 

The question is: Whose? Here I want to go back to my recent post on the Righteous Mind and the Inner Ring, and C. S. Lewis’s distinction between Inner Rings, which draw people in to their destruction, and communities which offer the kind of genuine membership that contributes to our flourishing. Lewis’s best treatment of these issues is his novel That Hideous Strength, which is flawed in many ways (I think) but absolutely brilliant on this point. 

The image at the top of this post, and the one that’s about to follow, come from a talk that I’ve given a couple of times on this theme. In THS Lewis portrays with great skill the rhetorical differences, which are also moral differences, between Mark Studdock’s recruitment into N.I.C.E. and Jane Studdock’s invitation to join the company at St. Anne’s. 

[Image: presentation slide]

Jane, like Lewis himself, wants to be left alone, and resists incorporation into any social body. Mark craves affirmation — applause — and is undiscerning about who it comes from. Jane has to learn the value of belonging to people who are trustworthy and want her to flourish; Mark has to find the courage to resist being assimilated by a voracious social machine that wants to consume him. Their paths fork; then converge. 

I think moral maturity for all of us involves learning what our temptations are. Do we (falsely) think we can go it alone? Are we tempted to go along to get along? Do we even understand what groups we want to belong to — whose applause we desire — and why? I can’t imagine anybody to whom these questions are not relevant, but they have a particular importance for scholars, because scholarship is something that we learn within really powerful socializing institutions. (I don’t know of any institution that socializes more thoroughly than graduate school.)

Simply to give in to these forces is unwise; to be completely independent of them is impossible. This is why I have always insisted on the importance for Christian scholars of serious commitment to a church community: by participating in a different body, with different priorities and participants, you are better able to put the demands of the scholarly world into proper perspective. You don’t escape them or ignore them or rise triumphantly above them; but you can learn to give them the conditional and limited allegiance they deserve. 

Monday, August 4, 2014

the most annoying thing you'll read today

Funny, the little things that annoy a person. For example: when someone tweets a link and prefaces it with “The best thing you’ll read today.” Well, first of all, bub, what makes you think I’ll read it at all? Just because you link to it? I don't think so. But second, even if I do read it, how do you know what else I might be reading today? If I’m in the middle of Anna Karenina, do you really think that this blog post about Edward Snowden or Beyoncé or Gary Shteyngart is going to be better than that?

I’m exaggerating my annoyance just for fun. But still, what underlies the “best thing you’ll read today” meme is the genuine if unconscious expectation that we’re all just reading stuff published in the past 48 hours. When someone says that article X or blog post Y is the best thing I’ll read today, what they really mean is that it’s the best thing that they found this morning on Flipboard or in their RSS feed or on Tumblr. It’s the best thing that just now showed up. Which is not the best thing simpliciter.

Just saying. 

cruel to be kind

[Image: Ripil, a kindness-tracking app]
A kindness tracker. That’s what this is. You can keep track of all of your good deeds, and — here’s the key thing — you can compete with others in kindness competitions. And then when you kick their stupid loser butts you can do this:


Friday, August 1, 2014

the end of intellectual property?

From the conclusion of Adrian Johns’s remarkable book Piracy: The Intellectual Property Wars from Gutenberg to Gates:

The confrontation between piracy and the intellectual property defense industry is perhaps set to trigger a radical transformation in the relation between creativity and commercial life. That idea is not as inconceivable as it may seem. Such turning points have happened before — about once every century, in fact, since the end of the Middle Ages. The last major one occurred at the height of the industrial age, and catalyzed the invention of intellectual property. Before that, another took place in the Enlightenment, when it led to the emergence of the first modern copyright system and the first modern patents regime. And before that, there was the creation of piracy in the 1660s-1680s. By extrapolation, we are already overdue to experience another revolution of the same magnitude. If it does happen in the near future, it may well bring down the curtain on what will then, in retrospect, come to be seen as a coherent epoch of about 150 years: the era of intellectual property.

A remarkable book, indeed, but not without its longueurs — Johns likes to tell his stories in great detail, and while my scholarly-completist side admires this trait, my readerly side sometimes wished for less exhaustive treatments. 

But it’s a very rich book full of remarkable events, which Johns shrewdly analyzes. It deserves careful reading by people in a wide range of disciplines, from the history of science to the history of law to political philosophy to the history and theory of technology. I have sometimes thought about inaugurating a Text Patterns Book Club, and this seems like a great candidate. Another one might be Nick Carr’s forthcoming The Glass Cage: Automation and Us. Thoughts? 

How Uninformed Critiques of Digital Humanities Are Taking Over Journalism!

This essay by Catherine Tumber is disappointingly empty, but also indicative of a certain and all-too-common mode of thought. It seems that Tumber has read almost nothing in the digital humanities except Adam Kirsch’s recent critique of that multifaceted movement, and — remarkably enough! — she agrees with Kirsch, "whom we can thank for reading these books so we don’t have to,” adding nothing of her own to his arguments, except the evidence of what appears to be half an hour of web browsing.

She assures us that in his treatment "Kirsch does not cherry pick; he plucks work by leading theorists in the field.” But one of the most common modes of intellectual cherry-picking is taking passages or ideas out of their context, and Tumber, who as we have just seen has not read the books in question, is scarcely in a position to judge whether Kirsch has done that or not. Some of the leading figures in DH — in a response which, though it was published in the same journal that published Kirsch’s critique, Tumber seems unaware of — make it clear that his treatment of their ideas grossly misrepresents them: 

Third, the notion that so called “digital humanities” is characterized by an urge “to accelerate the work of thinking by delegating it to a computer” is patently nonsensical. Throughout Digital_Humanities we argue not “to throw off the traditional burden” but, on the contrary, for a critical and transformative engagement that is rooted in the very traditions of humanistic inquiry. If Kirsch did some close-reading of the book, he would find it to be a celebration not of the digital—as some starry-eyed salvific or materialist ideology—but of the vitality and necessity of the humanities.

Having read the book, I think their statement is quite accurate. But don’t take my word for it: read it yourself. You’ll be a big step ahead of Catherine Tumber.

Here’s what we could use more of in this debate: 

1. Reading a lot before critiquing, in the spirit of intellectual responsibility. 

2. Remembering that many of the approaches to literary study we’re familiar with were themselves attacked as anti-humanistic just a couple of decades ago. 

Here’s what we could use less of in this debate: 

1. Critiquing without doing much reading. 

2. Presenting your lack of interest in a particular intellectual approach, or set of approaches, as a sign of virtue or humanistic integrity. It’s okay not to be interested in everything that everyone else is doing; we don’t need so to exalt our preferences for something else. 

3. Stupid clickbaity headlines. “Technology is Taking Over English Departments”? “Bulldozing the Humanities”? Give me a break. 

relating and identifying

Rebecca Mead writes about The Scourge of "Relatability":

What are the qualities that make a work ‘relatable,’ and why have these qualities come to be so highly valued? To seek to see oneself in a work of art is nothing new, nor is it new to enjoy the sensation. Since Freud theorized the process of identification—as a means whereby an individual develops his or her personality through idealizing and imitating a parent or other figure—the concept has fruitfully been applied to the appreciation of the arts. Identification with a character is one of the pleasures of reading, or of watching movies, or of seeing plays, though if it is where one’s engagement with the work begins, it should not be where critical thought ends. The concept of identification implies that the reader or viewer is, to some degree at least, actively engaged with the work in question: she is thinking herself into the experience of the characters on the page or screen or stage.  

But to demand that a work be ‘relatable’ expresses a different expectation: that the work itself be somehow accommodating to, or reflective of, the experience of the reader or viewer. The reader or viewer remains passive in the face of the book or movie or play: she expects the work to be done for her. If the concept of identification suggested that an individual experiences a work as a mirror in which he might recognize himself, the notion of relatability implies that the work in question serves like a selfie: a flattering confirmation of an individual’s solipsism.

While sharing Mead’s frustration with the rise of this stupid word, I don’t follow her argument about how “relatability” differs from “identification.” Is wanting the work to be a mirror really so different from wanting it to be a selfie? Aren’t those just two slightly different ways of describing the same impulse? 

People, especially young people, used to say, when explaining their dislike of a book, “I just couldn’t identify with it” or “I just couldn’t identify with the characters.” Now they say, “it just wasn’t relatable.” Both of these are just shorthand ways of saying “This work bored me and I think it’s the work’s fault, not mine.” And that is a shorthand way of describing … well, what? Probably a wide range of experiences, all of which have one thing in common: they’re not interesting enough to readers or viewers for them to inquire seriously into the causes of boredom.

I think what the language of relatability and the language of identification typically, if not invariably, connote — and they do this whether used positively or negatively — is weakness of response. And this is why the terms remain so vague, maddeningly so for those of a verbally critical bent. When people really love a work, or really hate it, they enjoy explaining why. When they sorta kinda like it, or sorta kinda dislike it, they say that it was or wasn’t relatable, or that they could or couldn’t identify with the characters. “Relatable” and “identify” are words that ought to come with a shrug pre-attached. 

 

UPDATE: All this said, I think what got this conversation started, the mini-uproar over Ira Glass’s saying that King Lear is unrelatable, is pretty silly. It was merely an off-the-cuff remark, as Glass later said — which I think supports the point I’m making in this post. Casual remarks usually deserve no more than casual responses. 

Thursday, July 31, 2014

a revolution I can get behind!

The Power of the Doodle: Improve Your Focus and Memory:

Recent research in neuroscience, psychology and design shows that doodling can help people stay focused, grasp new concepts and retain information. A blank page also can serve as an extended playing field for the brain, allowing people to revise and improve on creative thoughts and ideas.

Doodles are spontaneous marks that can take many forms, from abstract patterns or designs to images of objects, landscapes, people or faces. Some people doodle by retracing words or letters, but doodling doesn't include note-taking. 'It's a thinking tool,' says Sunni Brown, an Austin, Texas, author of a new book, 'The Doodle Revolution.' It can affect how we process information and solve problems, she says.

The Doodle Revolution! Yes!

I doodled my way through my education — always abstract patterns, usually a kind of cross-hatching — and almost never took notes. This puzzled my teachers, but I always remembered what I heard in class better when I doodled. 

When I was in college, I spent an entire semester developing an immensely intricate doodle on the back of one of my notebooks. When I finally filled in the last corner, on the last day of class, I sat back and looked with satisfaction on my achievement. Then I felt a tap on my shoulder. A guy who had been sitting behind me all term said, “I’ll give you five bucks for that.” So I carefully tore off the back cover and exchanged it for his fiver. We both went away happy. 

totem and taboo

I’ve been enjoying and profiting from James Poulos’s ongoing analysis of what he calls the “pink police state”: see installments to date here and here. This passage from the second essay strikes me as especially noteworthy: 

The new regime is not totalitarian, fascist, socialist, capitalist, conservative, or liberal, according to the accepted and common definitions of those terms. It is not even adequately described as corporatist, although corporatism is very much at home within it. The “pink police state” is not a police state in the sense that George Orwell would be familiar with, but one in which a militarized, national policing apparatus is woven into the fabric of trillions of transactions online and off. Nor is it a “pinko commie” regime in the sense of enforcing “political correctness” out of total allegiance to Party; rather, it enforces the restrictions and permissions doled out by its sense of “clean living.” To invoke Michel Foucault again, ours is an age when governance is inseparable from hygiene in the minds of the elite that rules over both the private and public sector. To them, everything is theoretically a health issue.

This hygienic impulse is indeed vital to the current regime, and has been growing in intensity for some time. It reaches into every area of culture. C. S. Lewis noted its presence fifty years ago in literary criticism, after articulating his own view of the pleasures of reading: 

Being the sort of people we are, we want not only to have but also to analyse, understand, and express, our experiences. And being people at all—being human, that is social, animals—we want to 'compare notes', not only as regards literature, but as regards food, landscape, a game, or an admired common acquaintance. We love to hear exactly how others enjoy what we enjoy ourselves. It is natural and wholly proper that we should especially enjoy hearing how a first-class mind responds to a very great work. That is why we read the great critics with interest (not often with any great measure of agreement). They are very good reading; as a help to the reading of others their value is, I believe, overestimated.

This view of the matter will not, I am afraid, satisfy what may be called the Vigilant school of critics. To them criticism is a form of social and ethical hygiene. They see all clear thinking, all sense of reality, and all fineness of living, threatened on every side by propaganda, by advertisement, by film and television. The hosts of Midian 'prowl and prowl around'. But they prowl most dangerously in the printed word. 

This idea that criticism is required to discourage people from reading (or viewing!) things that are bad for them, or not ideally good for them — or, to put it in a more pointed way, that criticism is necessary for policing cultural boundaries — has been around for a while but has become, I think, increasingly prominent. I’ve written a bit about it on this blog, for instance here. And I see it at work in my friend Ruth Graham’s critique of adults reading YA fiction. (Austin Kleon helpfully gathered some of my thoughts on the matter here.) 

So this “vigilant” attitude towards reading is just one example of the ways in which hygienic policing is intrinsic to the current cultural regime. And it strikes me that what may be needed, and what James is to some degree providing, is what I think I want to call a psycho-anthropological analysis of this policing. I am not, generally speaking, a fan of Freud, but there are passages in his Totem and Taboo that strike me as deeply relevant to the questions James raises. 

Think, for instance, of his point that taboo “means, on the one hand, ‘sacred’, ‘consecrated’, and on the other ‘uncanny’, ‘dangerous’, ‘forbidden’, ‘unclean’.” That which is taboo is automatically a matter of great fascination, simultaneously frightening and compelling. 

And this: 

Anyone who has violated a taboo becomes taboo himself because he possesses the dangerous quality of tempting others to follow his example: why should he be allowed to do what is forbidden to others? Thus he is truly contagious in that every example encourages imitation, and for that reason he himself must be shunned.

But a person who has not violated any taboo may yet be permanently or temporarily taboo because he is in a state which possesses the quality of arousing forbidden desires in others and of awakening a conflict of ambivalence in them.

Having rejected the taboos of our ancestors, especially our Christian ancestors, the current regime does not live without taboos but replaces them with others; and having created a world without gods, it places upon itself the greatest responsibility imaginable for preserving moral cleanliness. In the absence of gods, the totems and the taboos alike increase in magnitude.  

I expect James will be saying more about this kind of thing in future installments of the series, and I hope to be replying here. I want to comment especially on the totems or idols that balance out the taboos. 

Saturday, July 26, 2014

what we can claim for the liberal arts

Please read this wonderful post by Tim Burke on what liberal-arts education can and can’t do — or rather, what we who love it can plausibly claim on its behalf and what we can’t. Excerpt:


No academic (I hope) would say that education is required to achieve wisdom. In fact, it is sometimes the opposite: knowing more about the world can be, in the short-term, an impediment to understanding it. I think all of us have known people who are terrifically wise, who understand other people or the universe or the social world beautifully without ever having studied anything in a formal setting. Some of the wise get that way through experiencing the world, others through deliberate self-guided inquiry.

What I would be prepared to claim is something close to something Wellmon says, that perhaps college “might alert students to an awareness of what is missing, not only in their own colleges but in themselves and the larger society as well”.

But my “might” is a bit different. My might is literally a question of probabilities. A well-designed liberal arts education doesn’t guarantee wisdom (though I think it can guarantee greater concrete knowledge about subject matter and greater skills for expression and inquiry). But it could perhaps be designed so that it consistently improves the odds of a well-considered and well-lived life. Not in the years that the education is on-going, not in the year after graduation, but over the years that follow. Four years of a liberal arts undergraduate experience could be far more likely to produce not just a better quality of life in the economic sense but a better quality of being alive than four years spent doing anything else.

There are several important elements to Tim’s argument, the most important of which are: 

(a) It does no good to answer simplistic denunciations of liberal-arts education with simplistic cheerleading. Just as there are no books the reading of which will automatically make you a better person — thus the G. C. Lichtenberg line Auden liked to quote: “A book is like a mirror; if an ass looks in, you can’t expect an apostle to look out” — so too there is no form of education that will automatically create better people. But some forms of education, as Tim says, may “improve the odds.” That’s the point at which we need to engage the argument. 

(b) If we who practice and love the liberal arts want to defend them, we also have to be prepared to improve them, to practice them better — and this may well require of us a rethinking of how the liberal arts tradition relates to disciplinarity. As always, Tim is refusing the easy answers here, which are two: first, that the disciplinary structures created in and for the modern university are adequate to liberal education; and second, that we should ditch the disciplines and be fully interdisciplinary. Both answers are naïve. (The problems with the latter, by the way, were precisely identified by Stanley Fish a long time ago.) The academic disciplines — like all limiting structures, including specific vocabularies, as Kenneth Burke pointed out in his still-incisive essay on “terministic screens” — simultaneously close off some options and enable others. We need more careful scrutiny of how our disciplinary procedures do their work on and in and with students. 

I’m mainly channeling Tim here, but I would just add that another major element that we need to be thinking about here is desire: What are students drawn to, what do they love? To what extent can we as teachers shape those desires? My colleague Elizabeth Corey has recently published a lovely essay — alas, paywalled — on education as the awakening of desire; and while I wholeheartedly endorse her essay, I have also argued that there are limits to what we can do in that regard. 

In any event, the role of desire in liberal education is a third vital topic for exploration, in addition to the two major points I have extracted from Tim’s post — which, let me remind you, you should read. 

Friday, July 25, 2014

you must remember this

Please forgive me for ignoring the main thrust of this post by William Deresiewicz. I'm just going to comment on one brief but strange passage:

A friend who teaches at a top university once asked her class to memorize 30 lines of the eighteenth-century poet Alexander Pope. Nearly every single kid got every single line correct. It was a thing of wonder, she said, like watching thoroughbreds circle a track.

A “thing of wonder”? Memorizing a mere thirty lines of poetry?

As I've often noted, in any class in which I assign poetry I ask students to memorize at least 50 lines (sometimes 100) and recite them to me. I've been doing that for more than twenty years now, and all the students get all the lines right. If they don't, they come back until they do. It's not a big deal. Yet to Deresiewicz, who taught for years at Yale, and his friend who teaches at a “top university,” the ability to recite thirty lines of Pope — probably the easiest major English poet to memorize, given his exclusive use of rhyming couplets — seems an astonishing mental feat. What would they think of John Basinger, who knows the whole of Paradise Lost by heart? Or even a three-year-old reciting a Billy Collins poem — which is also every bit of 30 lines?

In my school days I had to memorize only a few things: the preamble to the Constitution, the Gettysburg Address, a Shakespeare passage or two. But for previous generations, memorization and recitation were an essential and extensive part of their education. Perhaps only the classical Christian education movement keeps this old tradition alive. The amazement Deresiewicz and his friend feel at a completely trivial achievement indicates just how completely memorization has been abandoned. In another generation we'll swoon at someone who can recite her own phone number.


UPDATE: Via my friend at Princeton University Press Jessica Pellien, a book by Catherine Robson called Heart Beats: Everyday Life and the Memorized Poem. Here’s the Introduction in PDF.

the right tools for the job

This talk by Matthew Kirschenbaum provokes much thought, and I might want to come back to some of its theses about software. But for now I'd just like to call attention to his reflections on George R. R. Martin's choice of writing software:

On May 13, in conversation with Conan O’Brien, George R. R. Martin, author of course of the Game of Thrones novels, revealed that he did all of his writing on a DOS-based machine disconnected from the Internet and lovingly maintained solely to run … WordStar. Martin dubbed this his “secret weapon” and suggested the lack of distraction (and isolation from the threat of computer viruses, which he apparently regards as more rapacious than any dragon’s fire) accounts for his long-running productivity.

And thus, as they say, “It is known.” The Conan O’Brien clip went viral, on Gawker, Boing Boing, Twitter, and Facebook. Many commenters immediately if indulgently branded him a “Luddite,” while others opined it was no wonder it was taking him so long to finish the whole Song of Fire and Ice saga (or less charitably, no wonder that it all seemed so interminable). But WordStar is no toy or half-baked bit of code: on the contrary, it was a triumph of both software engineering and what we would nowadays call user-centered design…. WordStar’s real virtues, though, are not captured by its feature list alone. As Ralph Ellison scholar Adam Bradley observes in his work on Ellison’s use of the program, “WordStar’s interface is modelled on the longhand method of composition rather than on the typewriter.” A power user like Ellison or George R. R. Martin who has internalized the keyboard commands would navigate and edit a document as seamlessly as picking up a pencil to mark any part of the page.

There was a time when I wouldn't have understood how Martin could possibly have preferred some ugly old thing like WordStar. I can remember when my thinking about these matters started to change. It happened fifteen years ago, when I read this paragraph by Neal Stephenson:

In the GNU/Linux world there are two major text editing programs: the minimalist vi (known in some implementations as elvis) and the maximalist emacs. I use emacs, which might be thought of as a thermonuclear word processor. It was created by Richard Stallman; enough said. It is written in Lisp, which is the only computer language that is beautiful. It is colossal, and yet it only edits straight ASCII text files, which is to say, no fonts, no boldface, no underlining. In other words, the engineer-hours that, in the case of Microsoft Word, were devoted to features like mail merge, and the ability to embed feature-length motion pictures in corporate memoranda, were, in the case of emacs, focused with maniacal intensity on the deceptively simple-seeming problem of editing text. If you are a professional writer–i.e., if someone else is getting paid to worry about how your words are formatted and printed–emacs outshines all other editing software in approximately the same way that the noonday sun does the stars. It is not just bigger and brighter; it simply makes everything else vanish. For page layout and printing you can use TeX: a vast corpus of typesetting lore written in C and also available on the Net for free.

The key phrase here, for me, was “the deceptively simple-seeming problem of editing text.” When I read those words I realized that editing text was much of what I needed to do, and that Microsoft Word wasn't very good at it. Stephenson's essay (still a delight to read, by the way, though quite outdated now) set me off on a long quest for the best writing environment, a quest that has ended not with emacs or vi but with a three-component system. I have written about these matters before, but people ask me about them all the time, so I thought I would provide a brief summary of my system.

The first component is my preferred text editor, BBEdit, which seems to me to strike a perfect balance between the familiar conventions of Macintosh software and the power typically found only in command-line text editors.

The second component is the scripts John Gruber (with help from Aaron Swartz) wrote to create Markdown, a simple and easy-to-use but powerful syntax for indicating structure in plain-text documents.

The third component is John MacFarlane's astonishing pandoc, which allows me to take my Markdown-formatted plain text and turn it into … well, almost anything this side of an ice-cream sundae. If my publisher wants an MS Word document, pandoc will turn my Markdown text into that. If I want to create an e-book, pandoc can transform that same text into EPUB. When I need to make carefully formatted printed documents, for instance a course syllabus, pandoc will make a LaTeX file. I just can't get over how powerful this tool is. Now I almost never have to write in anything except BBEdit and my favorite text editor for the iPad, Editorial.
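
To make that last step concrete, here is a minimal sketch, with hypothetical filenames, of the kind of conversion pandoc performs on a single Markdown source; it illustrates the general workflow rather than reproducing any script I actually use. Pandoc infers each output format from the extension of the file you ask for, so the same source can become a Word document, an e-book, or a LaTeX file:

    # Minimal sketch: convert one Markdown-formatted plain-text file into
    # several publication formats by calling the pandoc command-line tool.
    # The filenames are hypothetical; pandoc infers each output format
    # from the extension of the file passed to -o.
    import subprocess

    source = "essay.md"  # plain text with Markdown structure markers

    for target in ("essay.docx",   # a Word document for a publisher
                   "essay.epub",   # an e-book
                   "essay.tex"):   # LaTeX for carefully formatted print
        subprocess.run(["pandoc", source, "--standalone", "-o", target],
                       check=True)

The source file never changes; only the target format does, which is exactly what makes the plain-text-plus-pandoc arrangement so flexible.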

That's it. With a good text editor and some scripts for formatting, a writer can focus all of his or her attention on the deceptively simple-seeming problem of editing text. That makes writing less frustrating and more fun. This is what George R. R. Martin has achieved with WordStar, and he's right to stick with it rather than turn to tools that do the essential job far less well.

Wednesday, July 23, 2014

breaking the spell


[image: cows eating grass]
I just got back from a brief vacation at Big Bend National Park, and when I was packing I made sure to stick a novel in my backpack. I’m not going to name it, but it is a very recent novel, by a first-time novelist, that has received a great deal of praise. Before my departure I had already read the first chapter and found it quite promising. I was excited.

The next few chapters, I discovered while on my trip, were equally compelling; they carried me some fifty pages into the book. But in the next fifty pages the narrative energy seemed to flag. The act of reading started to feel effortful. And then, about 130 pages in (about halfway through the book), I had a sudden thought: This is just someone making up a story.

And that was it; the spell was broken, my investment in the novel over and done with. I couldn’t read another paragraph. Which is an odd thing, because of course it was just someone making up a story — that’s what novels are, and I knew when I bought the book what it was. But nothing can be more deadly to the experience of reading fiction than the thought that came (quite unbidden) to my mind.

Coleridge famously wrote of literature’s power “to transfer from our inward nature a human interest and a semblance of truth sufficient to procure for these shadows of imagination that willing suspension of disbelief for the moment, which constitutes poetic faith.” (Like most writers before the twentieth century, Coleridge used “poetic” to mean what we now call “literary.”) But really, the requisite suspension of disbelief is willing only in a peculiar anticipatory sense: it has to become unwilling, involuntary, in the actual act of reading, or else all the magic of storytelling is lost.

I have found in the past few years that this has happened to me more and more often as I read fiction, especially recent fiction. There are many possible reasons for this, including an increasing vulnerability to distraction and the return to the reading habits of my youth that I describe in this essay. But I’m inclined to think that neither of those is the problem. Rather, I think that for the last fifty years or more “literary” fiction, and a good deal of “genre” fiction as well, has recycled far too many of the same themes and tropes. Like a number of other readers, I’m inclined to assign much of the blame for this to the capture of so much English-language fiction by university-based creative writing programs, which suffer from the same pressures of conformity that afflict all academic work. (And yes, the author of the novel I abandoned is a creative-writing-program graduate, though I just now looked that up.)

In other words, I have just been around the same few fictional blocks far too many times. I’m tired of them all, and am only satisfied when I’m surprised.

Maybe that’s not the problem. But I sure feel that it is.


P.S. Something that just occurred to me: A long time ago Northrop Frye noted (I can’t at the moment recall where) Ben Jonson's frustration that Shakespeare’s plays were far more inconsistently and incoherently put together than his own and yet were, somehow, more popular. Frye’s comment was that this was just it: Jonson’s plays were put together, more like “mechanical models of plays” than the real thing, whereas Shakespeare’s plays had all the odd growths and irregular edges of organic life. This is my chief complaint about much fiction of the past fifty years, including much very highly regarded fiction, like that of John Updike: these aren’t novels, they are mechanical models of novels. Precision-engineered down to the last hidden screw, but altogether without the spark of life.

Thursday, July 17, 2014

my course on the "two cultures"

FOTB (Friends Of This Blog), I have a request for you. This fall I’m teaching a first-year seminar for incoming Honors College students, and our topic is the Two Cultures of the sciences and the humanities. We’ll begin by exploring the lecture by C. P. Snow that kicked off the whole debate — or rather, highlighted and intensified a debate that had already been going on for some time — and the key responses Snow generated (F. R. Leavis, Lionel Trilling, Loren Eiseley). We’ll also read the too-neglected book that raised many of the same issues in more forceful ways a few years before Snow: Jacob Bronowski’s Science and Human Values.

Then we’ll go back to try to understand the history of the controversy before moving forward to consider the forms it is taking today. Most of the essays I’ll assign may be found by checking out the “twocultures” tag of my Pinboard bookmarks, but we’ll also be taking a detour into science/religion issues by considering Stephen Jay Gould’s idea of non-overlapping magisteria and some of the responses to it.

What other readings should I consider? I am a bit concerned that I am presenting this whole debate as one conducted by white Western men. Are there approaches to these questions by women, or by people from other parts of the world, that might put the issues in a different light? Please make your recommendations in the comments below or on Twitter.

Thanks!

the problems of e-reading, revisited

In light of the conversation we were having the other day, here is some new information:

The shift from print to digital reading may lead to more than changes in speed and physical processing. It may come at a cost to understanding, analyzing, and evaluating a text. Much of Mangen’s research focusses on how the format of reading material may affect not just eye movement or reading strategy but broader processing abilities. One of her main hypotheses is that the physical presence of a book—its heft, its feel, the weight and order of its pages—may have more than a purely emotional or nostalgic significance. People prefer physical books, not out of old-fashioned attachment but because the nature of the object itself has deeper repercussions for reading and comprehension. “Anecdotally, I’ve heard some say it’s like they haven’t read anything properly if they’ve read it on a Kindle. The reading has left more of an ephemeral experience,” she told me. Her hunch is that the physicality of a printed page may matter for those reading experiences when you need a firmer grounding in the material. The text you read on a Kindle or computer simply doesn’t have the same tangibility.

In new research that she and her colleagues will present for the first time at the upcoming conference of the International Society for the Empirical Study of Literature and Media, in Torino, Italy, Mangen is finding that that may indeed be the case. She, along with her frequent collaborator Jean-Luc Velay, Pascal Robinet, and Gerard Olivier, had students read a short story—Elizabeth George’s “Lusting for Jenny, Inverted” (their version, a French translation, was called “Jenny, Mon Amour”)—in one of two formats: a pocket paperback or a Kindle e-book. When Mangen tested the readers’ comprehension, she found that the medium mattered a lot. When readers were asked to place a series of events from the story in chronological order—a simple plot-reconstruction task, not requiring any deep analysis or critical thinking—those who had read the story in print fared significantly better, making fewer mistakes and recreating an over-all more accurate version of the story. The words looked identical—Kindle e-ink is designed to mimic the printed page—but their physical materiality mattered for basic comprehension.

Note that the printed book is being compared here to the Kindle, which means that the distractions of connectivity I talked about in the previous post aren’t relevant here. (I’m assuming that they mean an e-ink Kindle rather than a Kindle Fire, though it would be important to know that for sure.) 

My hunch, for what it’s worth, is that it is indeed “the physicality of the printed page” that makes a significant difference — in a couple of specific senses.

First of all, the stability of the text on a printed page allows us (as most readers know) to form visual memories of where passages are located: we see the page, as it were, as divided into quadrants (upper left, lower left, upper right, and lower right). This has mnemonic value. 

Second, the three-dimensionality of a book allows us to connect certain passages with places in the book: when we’re near the beginning of a book, we’re getting haptic confirmation of that through the thinness on one side and thickness on the other, and as we progress in our reading the object in our hands is continually providing us with information that supplements what’s happening on the page. 

A codex, then, is an informationally richer environment than an e-reader. 

There are, I suspect, ways that software design can compensate for some of this informational deficit, though I don’t know how much. It’s going to be interesting to see whether any software engineers interest themselves in this problem. 

As for me, I suspect I’ll continue to do a lot of reading electronically, largely because, as I’ve mentioned before, I’m finding it harder to get eyewear prescriptions that suit my readerly needs. E-readers provide their own lighting and allow me to change the size of the type — those are enormous advantages at this stage of my life. I would love to see the codex flourish, but I don’t know whether it will flourish for me, and I am going to have some really difficult decisions to make as a teacher. Can I strongly insist that my students use codexes while using electronic texts myself? 

Wednesday, July 16, 2014

DH in the Anthropocene

This talk by Bethany Nowviskie is extraordinary. If you have any interest in where the digital humanities — or the humanities more generally — might be headed, I encourage you to read it. 

It’s a very wide-ranging talk that doesn’t articulate a straightforward argument, but that’s intentional, I believe. It’s meant to provoke thought, and does. Nowviskie’s talk originates, it seems to me, in the fact that so much work in the digital humanities revolves around problems of preservation. Can delicate objects in our analog world be properly digitized so as to be protected, at least in some senses, from further deterioration? Can born-digital texts and images and videos be transferred to other formats before we lose the ability to read and view them? So much DH language, therefore, necessarily concerns itself with concepts connecting to and deriving from the master-concept of time: preservation, deterioration, permanence, impermanence, evanescence. 

For Nowviskie, these practical considerations lead to more expansive reflections on how we — not just “we digital humanists” but “we human beings” — understand ourselves to be situated in time. And for her, here, time means geological time, universe-scale time. 

Now, I’m not sure how helpful it is to try to think at that scale. Maybe the Long Now isn’t really “now” at all for us, formed as we are to deal with shorter frames of experience. I think of Richard Wilbur’s great poem “Advice to a Prophet”:

Spare us all word of the weapons, their force and range,
The long numbers that rocket the mind;
Our slow, unreckoning hearts will be left behind,
Unable to fear what is too strange.

Nor shall you scare us with talk of the death of the race.
How should we dream of this place without us? —
The sun mere fire, the leaves untroubled about us,
A stone look on the stone’s face?

Maybe thinking in terms too vast means, for our limited minds, not thinking at all. 

But even as I respond in this somewhat skeptical way to Nowviskie’s framing of the situation, I do so with gratitude, since she has pressed this kind of serious reflection about the biggest questions upon her readers. It’s the kind of thing that the humanities at their best always have done. 

So: more, I hope, at another time on these themes. 

how problematic is e-reading?

Naomi Baron thinks it’s really problematic in academic contexts: 

What’s the problem? Not all reading works well on digital screens.

For the past five years, I’ve been examining the pros and cons of reading on-screen versus in print. The bottom line is that while digital devices may be fine for reading that we don’t intend to muse over or reread, text that requires what’s been called "deep reading" is nearly always better done in print.

Readers themselves have a keen sense of what kind of reading is best suited for which medium. My survey research with university students in the United States, Germany, and Japan reveals that if cost were the same, about 90 percent (at least in my sample) prefer hard copy for schoolwork. If a text is long, 92 percent would choose hard copy. For shorter texts, it’s a toss-up.

Digital reading also encourages distraction and invites multitasking. Among American and Japanese subjects, 92 percent reported it was easiest to concentrate when reading in hard copy. (The figure for Germany was 98 percent.) In this country, 26 percent indicated they were likely to multitask while reading in print, compared with 85 percent when reading on-screen. Imagine wrestling with Finnegan’s Wake while simultaneously juggling Facebook and booking a vacation flight. You get the point.

And maybe she’s right, but she also seems to be eliding some important distinctions. For instance, when she says that “digital reading ... encourages distraction and invites multitasking,” what she’s really referring to is “reading on a capable internet-connected device” — probably an iPad. A Kindle or Nook or Kobo, with either very limited internet access or none at all, wouldn’t provide such distractions. 

To be sure, digital reading is increasingly dominated by tablets, as their share of the market grows and that of the dedicated e-readers shrinks, but it’s still wrong to blame “digital reading” for a problem that’s all about internet connectivity. 

Also: Baron’s research is with university students, which is to say, people who learned to read on paper and did all their serious reading on paper until quite recently. What we don’t know is how kids who learn to read on digital devices — a still-small category — will feel about these matters by the time they get to university. That is, what Baron is attributing to some intrinsic difference between digital reading and reading on paper might well be a matter of simple familiarity. I don’t think we’ve yet reached the point where we can make that decision. 

I say all this as a lover of books and a believer that reading on paper has many great advantages that our digital devices have yet to replicate, much less to exceed. But, to judge only from this excerpt of a larger project, I doubt that Baron has an adequate experimental design.