Text Patterns - by Alan Jacobs

Tuesday, December 31, 2013

more on thinking

So, about having ideas worth expressing. Let me start by working my way through this passage from C. S. Lewis on the value of reading old books:

Every age has its own outlook. It is specially good at seeing certain truths and specially liable to make certain mistakes. We all, therefore, need the books that will correct the characteristic mistakes of our own period. And that means the old books. All contemporary writers share to some extent the contemporary outlook — even those, like myself, who seem most opposed to it. Nothing strikes me more when I read the controversies of past ages than the fact that both sides were usually assuming without question a good deal which we should now absolutely deny. They thought that they were as completely opposed as two sides could be, but in fact they were all the time secretly united — united with each other and against earlier and later ages — by a great mass of common assumptions. We may be sure that the characteristic blindness of the twentieth century — the blindness about which posterity will ask, "But how could they have thought that?" — lies where we have never suspected it, and concerns something about which there is untroubled agreement between Hitler and President Roosevelt or between Mr. H. G. Wells and Karl Barth. None of us can fully escape this blindness, but we shall certainly increase it, and weaken our guard against it, if we read only modern books. Where they are true they will give us truths which we half knew already. Where they are false they will aggravate the error with which we are already dangerously ill. The only palliative is to keep the clean sea breeze of the centuries blowing through our minds, and this can be done only by reading old books. Not, of course, that there is any magic about the past. People were no cleverer then than they are now; they made as many mistakes as we. But not the same mistakes. They will not flatter us in the errors we are already committing; and their own errors, being now open and palpable, will not endanger us. Two heads are better than one, not because either is infallible, but because they are unlikely to go wrong in the same direction. To be sure, the books of the future would be just as good a corrective as the books of the past, but unfortunately we cannot get at them.

This is a brilliant passage, which I want to endorse almost wholly — just with some relatively minor adjustments. When CSL speaks of “the characteristic blindness of the twentieth century” he means “the characteristic blindness of somewhat-to-highly educated Westerners in the twentieth century.” Had he met people from China or Kenya or Saudi Arabia or Indonesia who had not been educated in the Western style, he would have found, without leaving his own time, considerably greater intellectual diversity than he experienced in England.

The key point here is: get out of your comfort zone, your echo chamber. But don't do so by seeking out the crowd-pleasers and rabble-rousers from outside your typical group (unless you’re trying to understand sociological phenomena). If you’re a conservative who wants to understand liberalism, don’t bother with Michael Moore; if you’re a liberal who wants to understand conservatism, don't bother with Sarah Palin; if you’re an unbeliever who’s curious about Christianity, ignore Joel Osteen; if you’re an orthodox Christian trying to get a fix on atheism, steer clear of Bill Maher.

If you seek out what’s strange to you in its better expressions, several things will happen. First of all, you’ll court being changed by the encounter, having your views altered, perhaps in significant ways. You’ll learn that the people who disagree with you are almost certainly, taken as a whole, morally and intellectually the equal of the people you agree with. (Just like those people from the past whom Lewis commends to us.) You’ll probably come to realize that any question that is fiercely debated is fiercely debated because there aren't simple and obvious answers to it. You’ll find it harder to enjoy the simplistic, cheap pleasures of Bulverism.

Let’s be clear about this: if you follow my advice, in the short run you’ll find it harder to express your ideas because you’ll be less sure what they are. It’ll be tempting to fall back on prefabricated assumptions, intellectual clichés that do the same work as linguistic clichés. Thus George Orwell:

A scrupulous writer, in every sentence that he writes, will ask himself at least four questions, thus: 1. What am I trying to say? 2. What words will express it? 3. What image or idiom will make it clearer? 4. Is this image fresh enough to have an effect? And he will probably ask himself two more: 1. Could I put it more shortly? 2. Have I said anything that is avoidably ugly? But you are not obliged to go to all this trouble. You can shirk it by simply throwing your mind open and letting the ready-made phrases come crowding in. They will construct your sentences for you — even think your thoughts for you, to a certain extent — and at need they will perform the important service of partially concealing your meaning even from yourself.

If you’d prefer to let easy, simplistic categories and assumptions “think your thoughts for you,” then stay within the familiar orbit of those who think and say only what you’re already prepared to hear. But if you want to have thoughts worth expressing, you’re going to have to take the risk of being slowed down and even seriously altered.

But if you do take this risk, you’ll learn a lot. For one thing, if you read the best and most serious proponents of alternative views, of other cultures, of other times, you’ll likely be driven to seek out the strongest voices for your own views. You’ll find out whether your own tradition is all that you thought it was, and maybe you’ll learn that it had even greater resources than you had expected. But however that works out, you’ll gain the inestimable benefit of coming to see (and, I hope, to embrace) the full humanity of those who think very differently than you do. This is the royal road towards the greatest of virtues. As Iris Murdoch once wrote, “Love is the extremely difficult realization that something other than oneself is real.”

Monday, December 30, 2013


Austin Kleon is one of my favorite people on the internet, and I like this little reflection of his on doing something small every day. I like Steven Johnson’s writing too, and this series on writing looks like it’ll be fun and to some people quite useful.

But whenever I am asked to contribute to this genre — the writerly-tips-and-tricks genre or whatever you want to call it — I have always declined. I guess I am just skeptical that what I do will work for other people. But, being nothing if not critical, I can tell you what I think is generally missing from the genre, and why I think its absence is important.

By way of getting to my point, let me encourage you to look again at Johnson’s posts. He tells you how to “keep your hunches alive,” how to use e-book annotations, how to keep researching as you write, and so on. All very good in its way.

But: What if your ideas are crap? What good does it do — for you or the world — if you are clever and efficient in communicating thoughts that are carelessly arrived at, or ill-formed and incompletely worked through, or utterly unimaginative repetitions of what people in your would-be peer group have already said?

Now, perhaps your highest intellectual ambition is to be asked to give a TED talk, in which case all those vices I just listed will be magically transformed into virtues. But if you want to do really good work, intellectually and/or artistically substantive work, then your first question can never be “How do I express my ideas?” but rather “How can I acquire ideas that are worthy of being expressed?”

I don't have an actual answer to that question, but I have some thoughts that I'll get to in another post.

Sunday, December 29, 2013

the uses of art

John Armstrong writes about art:

The idea that art’s value should be understood in therapeutic terms is not new. In fact, it is the most enduring way of thinking about art, having its roots in Aristotle’s philosophical reflections on poetry and drama. In the Poetics, Aristotle argued that tragic drama can elevate how we experience fear and pity—two emotions that help shape our experience of life. The broad implication is that the task of art is to help us flourish, to be “virtuous,” in Aristotle’s special sense of that word: that is, to be good at living, even in challenging circumstances.

This understanding of art has been in abeyance in recent decades, but it is, I believe, the only plausible way of thinking about art’s value. Other approaches, as we have seen, must tacitly assume it, even when they deny it. To consider art from a therapeutic point of view is not to abandon profundity but to embrace it and to return art to a central place in modern culture and modern life.

I would find this argument intriguing, except … “the task of art is to help us flourish”? Not a task, or even an especially important task, but the one and only?

Why do people talk this way?

As I’ve already noted, I’ve been spending a lot of time lately in the company of Bach’s music and of some of his commentators, and in Bach: Music in the Castle of Heaven John Eliot Gardiner writes,

Expanding on the celebrated formulation by the fifteenth-century theorist Johannes Tinctoris – Deum delectare, Dei laudes decorare (‘To please God, to embellish the praise of God’) – Bach had defined music’s purpose in his Orgel-Büchlein as ‘For the highest God alone honour; for my neighbour that he may instruct himself from it.’ Beneath its flowery surface, we are shown the underlying didactic purpose of his collection, one close to the twin purposes of music in the Lutheran tradition: die Ehre Gottes und des Nechsten Erbauung – for giving honour to God (the standard Orthodox position) and for edifying one’s neighbour (the slant favoured by the Pietists).

Once Bach is ensconced in Leipzig his views begin to lean towards the more ‘enlightened’ formulations of musicians such as Friedrich Erhard Niedt, embracing aesthetic pleasure as well as devotion and edification. We now find him adopting in his Generalbasslehre of 1738 a different two-fold purpose of music: ‘zur Ehre Gottes und zulässiger Ergötzung des Gemüths’ – ‘for giving honour to God and for the permissible delight of the soul’. He explains, ‘And so the ultimate end or final purpose of all music … is nothing other than the praise of God and the recreation of the soul. Where this is not taken into account, then there is no true music, only a devilish bawling and droning.’

Glorification, instruction, edification, recreation — these are all valid “tasks of art,” and they vary in importance not just according to the artist, but also according to circumstance and, for that matter, according to the needs of a given recipient at a given moment. In general I may listen to Bach’s choral music to feel more fully the glory of God, and listen to the keyboard music for pleasure — but sometimes those functions are reversed, and they are always to some degree mixed.

Anybody who has read much of my writing knows that this is a recurrent theme: I deeply dislike convenient simplifications of the richness and diversity of human experiences. This is the heart of my critique of the critique of digital dualism: it leaves us with an even more limited vocabulary with which to describe what we do and think and feel. I am fond of quoting the philosopher Bernard Williams: “We suffer from a poverty of concepts.” Indeed we do. And it seems to me that we are especially conceptually poor when we talk about art and technology.

Thursday, December 26, 2013

a bit of review

Well, this has been quite a busy year for me, especially given my move from the suburbs of Chicago, where I lived for 29 years, to the great Republic — um, State of Texas. And for those who are interested, here are some writing projects that bore fruit this year.

I edited a critical edition of W. H. Auden's long poem For the Time Being: A Christmas Oratorio, the brief Preface to which you can find here. There's a short review of the edition here and a longer, more detailed one here.

I published The Book of Common Prayer: A Biography. You may read the introduction to the book here and reviews here and (briefly) here.

I do most of my periodical writing these days for my dear friend John Wilson at Books and Culture. This year I wrote, at some length, about

evolutionary accounts of storytelling;

Walker Percy and Carl Sagan;

a graphic memoir featuring James Joyce;

Sarah Losh's mysterious and beautiful Cumbrian church;

Francis Spufford's marvelous book Unapologetic;

American readers of C. S. Lewis; and

Thomas Pynchon's new novel Bleeding Edge.

And I published an essay on "Auden's Theology" in a collection called Auden in Context, the Kindle edition of which costs a paltry $63.20! It's not too late for an electronic stocking-stuffer!

I hope that among these various writings you'll find something that captures your interest.

Monday, December 23, 2013

the desolation of Peter Jackson

My son and I went to see The Dissolution of Smog The Desecration of Snog The Desolation of Smaug today. I am infuriated.

Let me begin by talking about what I liked. The barrels-down-the-river scene was fun and funny. Laketown was delightfully shabby. Smaug looked really cool.

That’s it. The rest was utter dreck. As my son commented, the only thing that could possibly rescue this movie would be a Mystery Science Theater 3000 version of it. (And just so you know, I really enjoyed the Lord of the Rings movies, and have frequently defended them against their detractors, especially Tolkien purists.) So let me just note a few of the many, many things I hated about this movie. Some semi-spoilers follow.

First of all, the video-game aesthetics that so afflicted the first Hobbit film are even worse here. When you combine the game-style action with the 48 fps frame rate, and then put 3D on top of that, watching this movie is like being slightly high on pot and playing a circa-2005 Xbox game while watching a 1970s sitcom marathon out of the corner of your eye. Its artifice shouts from the rooftops. The spiderwebs that looked so cool and gross when Frodo was wrapped in them in 2003 now look like cheap plastic doilies arranged on Martin Freeman’s head.

Second: speaking of Martin Freeman, who was the best thing in the first movie, he has nothing to do here. Almost no one in this movie does any real acting, but Freeman isn’t even given a chance. He has one briefly cute scene with Gandalf, and is given a few pleasant lines with Smaug, but that’s it. He’s completely wasted. Evangeline Lilly is given far more to do than Freeman — a choice that I cannot imagine any other director in the world making. The scene where Jackson has her pacing back and forth and woodenly declaiming her lines to an equally wooden Lee Pace as Thranduil would be painful in a high-school drama class.

Third: so, about Smaug. He’s awesome-looking and -sounding (Cumberbatched to the Nth degree) but seems to be highly inconsistent in his powers. For instance, whenever the dwarves and their hobbit mascot are conveniently hidden behind a wall he can blast massive shockwaves of fire in their general direction; but when they’re standing three feet away right in front of him he just chats with them. And it’s not like he alters in a discernible direction: his two moods alternate like cinematic clockwork. Chat, then blast; chat, then blast. He’s an absent-minded dragon, I guess, who can’t remember whom he wants to incinerate or why. I mean, even when people are standing right in front of him and taunting him he does nothing — but as soon as they scramble to safety he’s like the business end of a Saturn V.

Which leads me, fourth, to Gandalf. One of the problems with Jackson’s LOTR is the way Gandalf’s powers inexplicably wax and wane: in the first movie he can confront and defeat a Balrog — a Balrog, for heaven’s sake: have you seen those things? — but collapses in a heap before the leader of the Nazgul in the third one. And that was supposed to be the new and improved post-resurrection Gandalf. I guess you could argue that this movie’s pre-resurrection Gandalf is a less formidable figure, which doesn't fit the Tolkien character, but that’s okay, let’s grant PJ and his co-authors the right to do with Gandalf what they will. But, then, why does this pantywaist Gandalf stroll right into the fortress of the Necromancer as though he’s taking his daily constitutional in Wizard’s Park? Apparently he just wants to find out who the Necromancer is, but is that really the ideal way to do it? Walk into a creepy fortress saturated with black magic and shout “Who are you people?” Just because you’re a wizard, there’s no need to be a moron also, is there?

(Parenthetically: Peter Jackson seems to think that a wizard’s power resides wholly in his staff, so that when his staff is taken away he’s helpless — which, I mean, okay, but then why is Gandalf never able, in any of the Tolkien films, to do much more with his staff than shine a bright light? In this one he does poke ineffectually at some orcs, and elsewhere he smites a couple of nasties with it, but, if you look at all the Jackson Tolkien films in toto, basically it’s just a flashlight. An inconveniently enormous flashlight.)

Fifth: it’s only for a couple of seconds, but we get Radagast’s %$#@! buggy-bunny again. And Gandalf sends Radagast away to give a message to Galadriel, even though Galadriel and he can communicate telepathically.

Sixth and lastly (as Dogberry once said), I have no idea what is going on in the last few minutes as the dwarves confront Smaug. Somehow eight or nine dwarves are able to get all the mighty furnaces of their ancestors running again in two or three minutes, and the furnaces are so powerful that it takes them only another 30 seconds or so to create rivers of molten metal, and then they make a giant golden statue of a dwarf to mesmerize Smaug — or maybe they don't make it but just fill it with molten gold? — but whether they make it or pump it full of gold-syrup it doesn’t melt but rather shoots the gold-syrup out of its eyeballs — though Smaug has to conveniently stop and stare first at Thorin and then at the Great Idol long enough to make all the machinery work? I mean, the scene is completely nonsensical, in a way that no respectable video-game (the genre it’s trying to imitate) would ever allow to happen.

I could write a post three times as long as this one if I wanted to list all the absurdities and solecisms of this film. But I’ll spare you. It’s stupid and ugly, and you shouldn’t spend your money on it.


digital dualism and experiential monism

I'm going to begin by quoting only the concluding paragraph of a fairly long essay by Nathan Jurgenson, so please click through and read the whole thing to make sure I'm not misrepresenting the argument. Here's the end:

Of course, digital devices shouldn’t be excused from the moral order — nothing should or could be. But too often discussions about technology use are conducted in bad faith, particularly when the detoxers and disconnectionists and digital-etiquette-police seem more interested in discussing the trivial differences of when and how one looks at the screen rather than the larger moral quandaries of what one is doing with the screen. But the disconnectionists’ selfie-help has little to do with technology and more to do with enforcing a traditional vision of the natural, healthy, and normal. Disconnect. Take breaks. Unplug all you want. You’ll have different experiences and enjoy them, but you won’t be any more healthy or real.

First of all, I don't understand the need for an accusation of "bad faith." Perhaps if the disconnectionists are wrong they are sincerely wrong. I see no reason to attribute to them this particular moral failing.

Second, I fully endorse Jurgenson's point that the connected life is no less real than the disconnected. Our lives are always real; anything we do is as real as anything else we do. I also agree that “the disconnectionists establish a new set of taboos as a way to garner distinction at the expense of others, setting their authentic resistance against others’ unhealthy and inauthentic being” — this is indeed far too strong and too common an element of disconnectionist rhetoric.

But there are elements of Jurgenson’s argument that I can’t endorse. Let me get at them by noting that the question I would like to put to the disconnectionists is this: What are you going to do once you disconnect? You've got a lot of extra time on your hands now: how do you plan to use it?

Suppose a sedentary man who had been spending several hours a day playing World of Warcraft decided to disconnect and take up running. Wouldn't he in fact have made himself healthier by that decision? — and let’s be precise here: not by the decision to disconnect as such but by the subsequent decision to do something better, which required disconnection as a prerequisite. And doesn't that disprove Jurgenson's blunt claim that if you disconnect “you won’t be any more healthy”?

Perhaps Jurgenson didn’t mean that kind of health. But if you can become physically healthier by replacing one kind of activity with another, then why not in other areas of life as well? Perhaps also Jurgenson would say that he merely meant that one doesn't automatically become healthier by disconnecting; but that's a very, very different claim than the one that he actually made in his essay. To respond to the claim that disconnection will make you healthier by saying that disconnection won't make you healthier doesn't advance the discussion: it just replaces one highly dubious generalization with another.

What does disconnection do? It depends. It depends on why you disconnect, on what you were doing when you were connected, on what you do instead of being connected. (Now reverse the polarities and ask what connection does, and you’ll need to employ the same logic. Imagine a person who’s sedentary because she reads books all the time getting an iPhone with fitness apps that she uses to help her become more physically active and more disciplined.)

A lot of Jurgenson’s recent work has been focused on this critique of digital dualism, but my concern is that Jurgenson may just be replacing a simplistic dualism with an amorphous monism. At one point in his essay he writes, “The obsession with authenticity has at its root a desire to delineate the ‘normal’ and enforce a form of ‘healthy’ founded in supposed truth.” Note that every significant term here is placed either in literal or implicit scare quotes: normal, healthy, truth, authenticity. Not all of these terms are obviously useless, and I’d particularly like to make a case for the value — even the necessity — of thinking about our leisure-time decisions in terms of what conduces to our health.

So while Jurgenson is right to deconstruct the binaries of digital dualism, he’s wrong, I think, to believe that such a critique requires deconstruction of the values and concerns that drive digital dualism. We may agree that digital dualism is an inadequate response to the role that digital technologies play in our lives; but it does not follow that that role requires no reflection, no interrogation. All digital tools and toys — just like all non-digital ones — are prone to misuse, and in my view the categories by which we distinguish right use from misuse are precisely those of health, at least, health in the broader and richer sense of flourishing, eudaimonia.

Now, as a Christian, I’d want to steer a conversation about these matters from health to eudaimonia and ultimately to the love of God and neighbor; but I’d be happy to start with health, and remain in that conceptual ambit for a while. And I certainly don't think we’re helped to make wise technological decisions by a deconstruction of digital dualism that leaves us with even fewer means of sorting through our complicated technological experiences.

Saturday, December 21, 2013

Twitter at its worst

(1) Dim-witted person tweets, or is reported to have said, something dim-witted.

(2) Equally dim-witted people notice and tweet their outrage.

(3) After a while, brighter people note the kerfuffle and decide to weigh in, either to reinforce or to critique the outrage.

(4) These brighter people are followed by even brighter people, who begin to see the whole thing as Significant and therefore requiring their response. They tweet and blog their thoughts.

(5) Result: the intellectual agenda of some of the brightest people on Twitter is, often for several days, established and directed by some of the most dim-witted ones.

Friday, December 20, 2013

the true meaning of Mimesis

Let me just be blunt: this Arthur Krystal essay on Erich Auerbach and Mimesis is disappointingly superficial and offers no substantive insights into Auerbach or his great masterpiece.

There are two major points to keep in mind if you want to understand the real significance of Auerbach.

First, as David Damrosch has noted in the best essay I've ever read about Auerbach, Mimesis is an assertion, in the face of the Nazi demolition of culture, of the enormous humanistic, and humane, and simply human, value of philology. Damrosch:

Begun in exile in 1942 and completed in April 1945 (the very month of Hitler's death), Mimesis stands as an affirmation of the scholar's ability to rise above every obstacle that adverse historical circumstances can present. Or, to put it differently: Auerbach responds to the loss of his homeland and the collapse of his scholarly world through the recreation of European culture, both in the evocation of texts from across the tradition and by the display of humanistic scholarship at its best, with analyses at once judicious and loving, objective and deeply personal.

Philology is for Auerbach (as it was in different ways for Nietzsche, Tolkien, and A. E. Housman) the humanistic discipline par excellence, the intellectual nexus where deep historical learning and the mastery of multiple languages converge with refined taste and sensibility. It is everything Nazism is not. It is the last bastion of the Republic of Letters, where neither nationality nor ethnicity mean anything, only a passion for learning and a love of the beautiful.

That is the first point. The second is this: Mimesis is a document of hope, hope grounded in what Mikhail Bakhtin called “great time.” Great time concerns history as it unfolds over, not decades, not even centuries, but millennia. Bakhtin believed there are meanings implicit in, say, Athenian tragedy that will only be discovered in the distant future. As he wrote in a notebook near the end of his life,

There is neither a first nor last word and there are no limits to the dialogic context (it extends into the boundless past and the boundless future). Even past meanings, that is, those born in the dialogue of past centuries, can never be stable (finalized, ended once and for all) — they will always change (be renewed) in the process of subsequent, future development of the dialogue. At any moment in the development of the dialogue there are immense, boundless masses of forgotten contextual meanings, but at certain moments of the dialogue’s subsequent development along the way they are recalled and reinvigorated in renewed form (in a new context). Nothing is absolutely dead: every meaning will have its homecoming festival.

This is very much the implicit argument of Mimesis: in his brilliant chapter on the Gospel narratives, for instance, Auerbach shows how those stories upend the classical model of stylistic decorum — a high style for high things, a low style for low things — in ways that would not bear ripe literary fruit for nearly eighteen hundred years.

So even if the Nazis were winning as Auerbach wrote — in his Istanbul exile — they would not win forever. They would be unable to eradicate the great Western culture that Auerbach had devoted his life to; it would return, it would find new life, its deepest and richest meanings would eventually have their homecoming festival. In its hopefulness Mimesis is a passionate act of defiance, the defiance of the truly cultured in the face of culture's powerful defilers.

That's what Mimesis is all about, and that's why it's one of the greatest books of the twentieth century.

Thursday, December 19, 2013

on Bach

Paul Elie's Reinventing Bach and John Eliot Gardiner's Bach: Music in the Castle of Heaven are, as I'm sure many reviewers have already noted, complementary and contrasting books. Gardiner's emphasis is consistently on the performance of Bach, and on the ways in which historical scholarship has enabled us to reconstruct the great composer's intentions and purposes. Elie, on the other hand, focuses on modern recording technologies as enabling past, present, and (probably) future revivals of interest in Bach. Elie doesn't ignore performance, and in fact relates several anecdotes of Bach concerts he has attended, but for him, performance is always secondary to the individual listener's experience of the music — experience made ever more powerful by ever more sophisticated technologies of recording and playback.

I love Gardiner's book, digressive and sometimes incoherent though it is, because I'm fascinated by historical detective stories — the ways in which dogged researchers can unearth surprising pieces of information which turn out to alter how we think about a subject or a work of art. But Elie's emphasis on recording and individual listening matches my own experience much more closely. In part, this is simply because of a back problem I've had all my life: it's very difficult for me to sit upright for any length of time, which has limited my ability to attend the theater and the concert hall. I find it much easier to concentrate on music when I can get into a comfortable position and be still for an extended period.

Nevertheless, I am always aware that when I'm listening to music alone I am missing the distinctive experience that you get when you listen in the company of others. But I don't feel that loss very strongly; in fact, when I listen to music in the company of others I tend to be distracted by their presence, and it's not that often that I feel that we have had a truly shared experience. Honestly, if I could choose to be the only member of an audience I almost always would. So like Elie I am not troubled by the thought that music listening is becoming an increasingly solitary experience. There are worse things in the world. Though I do understand why musicians and lovers of the concert hall feel this retreat into solitude is a great loss, I don't feel it myself. Indeed, listening to music on headphones is perhaps my major way to cultivate a disconnected solitude that's otherwise hard for me to find.

As might be expected, given their differing views on the centrality of performance, Gardiner and Elie differ in the music of Bach they emphasize. Bach's choral music dominates the portrait drawn by Gardiner, whose conducting of the two great Passions, the B-Minor Mass, and (especially) the cantatas is justly famous. Elie, by contrast, tends to emphasize the great soloists and the music they played: Albert Schweitzer and the organ pieces, Glenn Gould and the various piano masterpieces, Pablo Casals and Yo-Yo Ma and the cello suites. Here too I find myself more sympathetic to Elie: when I listen to Bach, which is often, it's the keyboard works, the cello suites, and the violin sonatas and partitas to which I most often turn.

And along those lines: I recently discovered that the cellist Christopher Costanza has not only recorded Bach's cello suites but has created a website that embeds recordings of each suite along with Costanza's commentary on each movement. It's a very helpful way to listen to the music more thoughtfully, even if the fidelity of the recording is not the best.

Wednesday, December 18, 2013

learning attention

Here at Baylor's Honors Program, we offer first year seminars that are meant to introduce our students to the challenges and benefits of honors-level work. I'm going to be teaching one of those next year, and here are some of the possibilities I've been considering:

(1) The Two Cultures: C.P. Snow wrote fifty years ago that the sciences and the humanities were drawing farther and farther apart, and that this severance was socially dangerous. Was he right? Have things changed since then? If so, for the better or the worse? If there is a gap to be bridged, what can we do to bridge it?

(2) The Future of the Book: With the ever-increasing availability of e-books and other forms of digital reading, are we seeing the imminent demise of the bound-in-paper book, the codex? If so, what effects will the digitization of text have on the reading (and writing, and researching) experience? We'll investigate these questions with some attention to the history of the codex and especially its role in the shaping of Christian culture.

(3) Attention: Are digital technologies destroying our capacity to pay attention — or are they just changing it, in some ways for the better? In this class we will try to understand what attention is, why it has been thought important in prayer, study, and personal relationships, and how it may best be nurtured as our technological environment changes.

(4) Poetry as Theology: Can poetry not just describe spiritual experience but actually be a way of doing Christian theology? Of seriously contributing to the work of theology? Poets studied will include Dante, John Milton, T. S. Eliot, W. H. Auden, and Gjertrud Schnackenberg, among others.

I asked my current students which they would prefer, and the first option got the most votes. (They also convinced me that the last one would probably be better as an upper-level course.)

I would enjoy teaching any of these, but the one I think is most important is the third. I'm inclined to think that every college should offer a required first-year course on attention and attentiveness. Attention is costly -- there's a reason why we speak of paying attention -- and as a resource it is easy to deplete though also renewable. Simply to make students aware of the costs and the renewability would be a major service to them, I think.


Hey folks, I've decided to disable comments for this blog. I don't get a great many comments here anyway, and the constructive conversations about the issues raised here tend to happen on Twitter and in the responses people make on their own blogs. (Same as it ever was.) So check in with me tweetwise!

Tuesday, December 17, 2013

behold, thy salvation cometh

Samuel Arbesman thinks we have a problem: too many specialists, not enough generalists. The age of the polymath is over, but we can bring it back! How? Why, we just need to give people the right tools, that is, we need to “embrace the machines” — the computing machines — and teach everybody to code. “Far from being a tech-centric perspective, coding connects ideas across fields.” When tech is everything, then we won’t be tech-centric anymore.

So we see once more that technological solutionism has a response to every problem — but it’s always exactly the same response. Salvation is sola codes, by code alone. In code we trust. Code is the way, the truth, and the life; no one comes to polymathy except by it. Blessed be the knowledge of the code. At the compilation of the code every knee shall bow and every tongue confess that code is Lord. Amen.

who hacks the planet?

Eli Kintisch’s 2010 book Hack the Planet explores the rise of geoengineering as a response to global warming: Since human beings are apparently unwilling to change their behavior in order to avoid unfortunate effects on the planet’s ecosystem, why not then change the way the planet responds to our behavior?

But the chief problem with hacking the planet is that you’d be hacking the planet, and, as Kintisch pointed out in a related article, it’s hard to envision ways of testing planet-hacks before employing them. You can’t really release sunlight-blocking aerosols in one unobtrusive corner of the atmosphere to see what they do. In the end, if such strategies are deployed — as David Keith of Harvard in a new book says they must be — then someone is going to have to bite the bullet and attempt on a huge scale an endeavor whose results will be pretty unpredictable.

And as Kintisch notes in a brief review of Keith’s book, geoengineering could be the source of major international conflicts in the 21st century:

solar geoengineering could be a major geopolitical issue in the 21st century, akin to nuclear weapons during the 20th—and the politics could, if anything, be even trickier and less predictable. The reason is that compared with acquiring nuclear weapons, the technology is relatively easy to deploy. “Almost any nation could afford to alter the Earth’s climate,” Keith writes. That fact, he says, “may accelerate the shifting balance of global power, raising security concerns that could, in the worst case, lead to war.”
The potential sources of conflict are myriad. Who will control Earth’s thermostat? What if one country blames geoengineering for famine-inducing droughts or devastating hurricanes? No treaties ban climate engineering explicitly. And it’s not clear how such a treaty would operate. [...]

Accepting the concept of the Anthropocene means accepting that humans have the responsibility to find technological fixes for disasters they have created. But little progress has been made toward a process for rationally supervising such activity on a global scale. We need a more open discussion about a seemingly outlandish but real geopolitical risk: war over climate engineering.

I think here of Robert Oppenheimer's notorious line: "When you see something that is technically sweet, you go ahead and do it and argue about what to do about it only after you've had your technical success." To a lot of scientists planet-hacking looks technically very sweet indeed, and no doubt they'll be able to find politicians to agree with them. But which country will be the quickest to release its geoengineers to do their thing? Being first-to-market in planet-hacking may not be a good thing — for those of us who're getting hacked.

Friday, December 13, 2013

biased against creativity?

“People say they like creativity but they really don’t” is Slate’s summary of a new paper. Having read the paper, “The Bias Against Creativity: Why People Desire But Reject Creative Ideas” (PDF), I think Slate is right that that’s the paper’s claim. But I don't think that’s what the research actually shows.

The paper’s authors rightly say that “Creative ideas are both novel and useful,” but if I’m reading their paper rightly — and please correct me in the comments if I’m not — what they show is that the people they tested were suspicious of novelty. What the authors seem not to be taking into account is that all creative ideas are by definition novel, but not all novel ideas are creative — in fact, I think it’s fair to say that most novel ideas are pretty stupid. (Samuel Johnson is often credited — erroneously — with saying to a writer “Your manuscript is both original and good. But the parts that are good are not original, and the parts that are original are not good.” Even if Dr. Johnson didn’t say it, the quip makes a point.)

So when people prove to be skeptical about novel ideas, aren’t they just being rational? Running the numbers appropriately? That doesn't mean that they’re “biased against creativity,” only that they know from experience that the great majority of people who think they’re creative really aren’t. That’s why when my wife was in a meeting some years ago and she heard for the thousandth time someone making an appeal for “thinking outside the box” she replied, somewhat plaintively, “Can we first try finding one or two people who can think inside the box?”

in which this Anglican intervenes in a Catholic debate

Not really on-topic for this blog, but let me say an incoherent word or five about Dana Gioia’s essay on Catholic writing today and Eve Tushnet’s response to it. Since Eve just listed her points without trying to make an argument from them, I shall follow her excellent example.

(1) I think the story that Dana tells would look a good bit different if he had considered Christian writing, rather than just Catholic writing. His focus is too narrowly on the internal struggles of the Catholic Church and not enough on the larger place of Christianity in American society. The successes of the mid-twentieth-century Catholic writers he admires were attributable in part to a culture that was generally well-disposed towards stories grounded in the Christian narrative, and it’s arguable that Walker Percy and Flannery O’Connor, neither of whom were usually published by Catholics, have been more admired in the past half-century among Protestants.

(2) Dana writes, in a passage more important than it might seem, “In the literary sphere, American Catholics now occupy a situation closer to that of 1900 than 1950. It is a cultural and religious identity that exists mostly in a marginalized subculture or else remains unarticulated and covert in a general culture inclined to mock or dismiss it.” Since that earlier Catholic culture was closer in time, and indeed in spiritual formation, to the great artistic works of the Catholic past (Dante, Palestrina), I’d like to pose this question: Why should we not simply think of the generation of Percy, O’Connor, Lowell et al. as a curious aberration in the history of Catholic writing in America, one we should not expect to be repeated?

(3) Eve’s response concerns itself solely with Catholic fiction, while Dana’s concerns Catholic writing. If she looked at his whole topic she might have to readjust her thoughts a bit. For instance, in relation to her thoughts about literary cultures becoming subcultures, poets have been used to that for a long time: tiny sales, small presses, little to no representation in high-circulation periodicals. Even assuming that Eve has rightly identified the Condition of Fiction Today, it’s the Permanent Condition of Poetry.

(4) Eve writes, “Telling a young Catholic writer to go have a career like Flannery O’Connor’s is like telling a young Catholic father to get a good stable union job at the Chrysler plant.” I think this is exactly wrong: O’Connor’s career is exactly the kind of career a young writer today might plausibly have. After getting her MFA she moved back in with her mother, who lived in a low-cost-of-living area and did not need her daughter to make a full-time income. Flannery could therefore devote herself to writing, corresponding with friends (through the U.S. Mail rather than through blogs and Twitter, but that’s a minor detail), and ordering books that she had delivered to her home. Gradually her career and her thought developed along their own distinctive lines, story by story and letter by letter, though she never sold many books and would have been hard-pressed to make a living wage had she lived in a big city. O’Connor ought to be the patron saint of today’s young writers. Instead they all think they have to move to Brooklyn and do everything that all the sad young literary men (and women) do.

(5) I shall now ride a hobby-horse. Eve writes,

Donna Tartt’s new novel hits almost all of Gioia’s criteria for a Catholic novel. (The exceptions: For the sacramentality of nature, substitute that of art; and the meaning of suffering is an anguished question in the book, so it isn’t presented as redemptive.) Christianity itself does not appear, but — does it have to? Anyway, I’m just going to close with the recommendation that The Goldfinch is the best thing this extraordinary (Catholic!) author has written so far.

“The sacramentality of nature” is not a Catholic (is not a Christian) idea, it is a pagan idea. The sacramentality of the sacraments is a Catholic idea. What Eve means, I think, is “nature as a means of conveying grace,” but anything from time to time can be a means of conveying grace. What makes something sacramental is the covenantal promise, by Christ himself or by the Church speaking on his behalf, that grace shall be conveyed through it.

Climbing down from my hobby-horse, I’ll adjust my hunting jacket and add this: I have been thinking for about thirty years about what it means for a work of art to be “Christian,” to have the adjective “Christian” rightly applied to it, and I have pretty much decided that it’s a useless term. Some people argue that such an artwork needs to embody, more or less explicitly, some element of Christian teaching or belief. For others it’s enough that the writer is a Christian. For still others it’s sufficient that the work contain ideas or themes that are generally consistent with some Christian teaching or belief. “Christian art” is an almost infinitely malleable wax nose. It’s not a term I use.

“Christian writer” and “Catholic writer” are scarcely better. There’s a vague uneasy general recognition that every writer who is a Catholic is not really a Catholic writer, and that many writers who were raised Catholic but have left it behind retain some significant residual Catholic sensibilities. (Re-run that sentence and replace “Catholic” with “Christian.”) I’m not sure that when we talk this way we ever really know what we mean. If writers aren’t dealing explicitly (or implicitly but strongly) with Christian themes and ideas, and sometimes even if they are, I don't think we could ever really know how much of a mark Christianity has left on them without rewinding their lives and re-raising them without a Christian upbringing, or with a very different one. I sometimes doubt whether, if Flannery O’Connor had been raised a lukewarm Methodist, she would have written any differently. James Joyce’s Jesuit education may not have shaped his mind as profoundly as many critics believe.

So discussions of this kind seem malformed to me, and therefore relatively fruitless. To Dana Gioia I want to say, “Let’s strip away all the peripheral questions and ask the really key one: How is the Church forming, or failing to form, its children, and how can a stronger power of formation be cultivated?” And to Eve Tushnet I want to say, “Let’s talk about why we like what we like, especially when we discern theological and spiritual resonances that are important to us as readers, however the authors happen to be placed in relation to Christianity.” Maybe if we took those routes we would be less likely to bog down in implacably foggy terminology.

Thursday, December 12, 2013

more about JSTOR (etc.)

The other day I wrote a post for the Atlantic’s Tech Channel arguing that JSTOR and other academic content aggregators and distributors will be hard to displace — and not just because they wield financial power. They wield that power in part because they offer something that we academics and our students think of as a genuine service: the filtering and credentialing of scholarship. My claim, in brief, is that it’s so difficult and time-consuming to teach students how to perform triage on the firehose of information — to mix metaphors more wildly than you’ll see them mixed elsewhere today — that many of us will find it irresistible simply to defer to the judgments implicitly made by JSTOR and Project Muse when they include journals in their databases.

I just want to emphasize here a point that I only made glancingly in that post: that these temptations will surely be stronger for the many adjuncts teaching in American colleges and universities today, who rarely can meet with their students as often or for as long as they would like — often they have no offices in which to do so — who may be juggling courses at multiple schools, and who may have limited contact with the academic librarians who could help. This is a reminder of the multiple entanglements that afflict the American academy — the various forces that work together to impede the achievement of meaningful learning.

So let me make the pitch once more for the rise of colleges with financial and educational priorities different from those that are the norm today. And, while I’m at it, I’m still waiting for the VCs to show up and make me the founding president of Cassiodorus College.

Wednesday, December 11, 2013

beyond snark and smarm

Just a brief couple of comments on the whole snark-vs.-smarm Ultimate Revenge Cage Match that was kicked off by Tom Scocca in this article:

One: Scocca’s way of distinguishing between snark and smarm is completely incoherent. “Smarm,” he says, “would rather talk about anything other than smarm. Why, smarm asks, can't everyone just be nicer?” Smarm is about avoiding the hard work of honest criticism in order to cultivate an atmosphere of positive reinforcement, like the one Isaac Fitzgerald says he wants to cultivate at Buzzfeed. And yet when people aren’t nice at all to Edward Snowden, when they’re deeply critical of him, when they say “Edward Snowden is an unstable, sensation-seeking narcissist” and “Edward Snowden is a traitor” — well, it turns out that for Scocca that’s smarm too. So smarm, I guess, is being nice to people Tom Scocca thinks you ought to be mean to and being mean to people he thinks you ought to be nice to.

Two: Incoherent though Scocca’s portrait of smarm is, it’s having the effect of further solidifying an already common and utterly pernicious idea, which is that the critic must choose between being “nice” and being “snarky.” Thus Malcolm Gladwell responds to Scocca by arguing, in effect, that given such a choice it’s better to be nice than to be snarky:

What defines our era, after all, is not really the insistence of those in authority that we all behave properly and politely. It is defined, instead, by the institutionalization of satire. Stephen Colbert and Jon Stewart and “Saturday Night Live” and, yes, Gawker have emerged, all proceeding on the assumption that the sardonic, comic tone permits a kind of honesty in public discourse that would not be possible otherwise. This is the orthodoxy Scocca is so anxious to defend. He needn’t worry. For the moment, we are all quite happy to sink giggling into the sea.

But what if neither snark nor smarm is adequate to the critical task?

Almost a hundred years ago Rebecca West wrote of “the duty of harsh criticism” in a period that suffered from “the vice of amiability”:

The mind can think of a hundred twisted traditions and ignorances that lie across the path of letters like a barbed wire entanglement and bar the mind from an important advance. For instance, there is the tradition of unreadability which the governing classes have imposed on the more learned departments of literature, such as biography and history. We must rebel against the formidable army of Englishmen who have achieved the difficult task of becoming men of letters without having written anything. They throw up platitudinous inaugural addresses like wormcasts, they edit the letters of the unprotected dead, and chew once more the more masticated portions of history; and every line they write perpetuates the pompous tradition of eighteenth century “book English” and dissociates more thoroughly the ideas of history and originality of thought. We must dispel this unlawful assembly of peers and privy councillors round the wellhead of scholarship with kindly but abusive, and, in cases of extreme academic refinement, coarse criticism.

And more than two thousand years before West the writer of Ecclesiasticus taught us the other pole of our duty:

Let us now praise famous men, and our fathers that begat us.

The Lord hath wrought great glory by them through his great power from the beginning.

Such as did bear rule in their kingdoms, men renowned for their power, giving counsel by their understanding, and declaring prophecies:

Leaders of the people by their counsels, and by their knowledge of learning meet for the people, wise and eloquent are their instructions:

Such as found out musical tunes, and recited verses in writing:

Rich men furnished with ability, living peaceably in their habitations:

All these were honoured in their generations, and were the glory of their times.

There be of them, that have left a name behind them, that their praises might be reported.

And some there be, which have no memorial; who are perished, as though they had never been; and are become as though they had never been born; and their children after them.

But these were merciful men, whose righteousness hath not been forgotten.

With their seed shall continually remain a good inheritance, and their children are within the covenant.

Their seed standeth fast, and their children for their sakes.

Their seed shall remain for ever, and their glory shall not be blotted out.

Their bodies are buried in peace; but their name liveth for evermore.

The people will tell of their wisdom, and the congregation will shew forth their praise.

To praise (unstintingly) what is praiseworthy, and to expose (charitably but firmly and even, when necessary, harshly) what is false and what leads people astray: these are indispensable functions of criticism.

Monday, December 9, 2013

theological theses on technologies of knowledge

These thoughts were originally formulated in the context of a faculty seminar at Wheaton College on theology and technology, but I thought it might be worthwhile to share them here. The “we” invoked at several points here may be taken to mean, generally and non-exclusively, “Christians in the academy,” but there are many others to whom these thoughts apply. Or so I think.

(1) No particular technology can be usefully evaluated in isolation from the whole network of technologies to which it is inevitably related.

(2) Technologies have become the chief means by which those networks — what Michel Foucault called “power-knowledge regimes” — are sustained.

(3) Brian Brock refers to the current regime as “technological modernity,” while Neil Postman calls it “Technopoly,” but both of them help us identify the key features of the regime: a commitment to rationalization and regularization of human behavior; a confidence that tools can direct human will into proper channels, with what is “proper” being wholly accessible to autonomous human reason; a belief in the inevitability of progress; and an insistence that technologies are always neutral, equally capable of being used for good or ill.

(4) Stanley Hauerwas has rightly said that to be a Christian is to work “with the grain of the universe,” but none of those core commitments of Technopoly run wholly (or at all) with the grain of the Christian story.

(5) Therefore, while working against the grain of anything is wearisome, to that we are called. Any thought that we can create a form of life that will allow us to live always with the grain, and therefore without struggle and tension, amounts to a wish-fulfillment fantasy. Paul’s images of “running the race” and “fighting the good fight” do not just concern internal “spiritual” struggles.

(6) One branch or department of technological modernity is called “higher education.” As individual teachers and scholars, we have, in order to be in the world though not of it, chosen some degree of accommodation to the standards of this regime in order to achieve its accreditation and recognition. Most Christian colleges and universities have done the same.

(7) Walter Ong points out that no society that has achieved literacy has ever voluntarily returned to a condition of primary orality. This is a specific example of a general rule: When a once-dominant technology yields to a new one, it does not disappear, but its cultural role changes and it never becomes dominant again. Similarly, there is no point in imagining that either we as individuals or Wheaton as an institution can fully, or even mostly, extricate ourselves from the regime of higher education. This is the specific form the burden identified in point 5 takes for us.

(8) So our daily vocational prayer should be something like this: “Lord, I want to work with the grain of your story, your Creation, your Way. Teach me to discern the direction that grain runs, and help me to identify what runs against it. Through your indwelling Spirit and the common life of your Church, give me strength and courage to follow the path that you set before me.”

(9) Any thoughts about the role of the book in our professional lives and more generally our lives as Christ-followers must be pursued within the parameters of the reflections above. We should not revere the book or any other technology in itself, but value it insofar as it helps us to work with the grain of God’s universe.

(10) As a corollary, any decision to stick with paper codices instead of digital texts will be a trivial decision if in most other respects we are unreflective participants in Technopoly.

(11) The codex certainly seems to have been especially well-suited to the preservation and transmission of the Gospel, but we don’t have a control group for purposes of comparison, nor can we roll back the tape of history and replay it with the codex taken out. A Church without codices might have been worse than the one that arose; it might have been better; it surely would have been different, with a different mix of virtues and vices. There’s no way for us to know.

(12) There is no power-knowledge regime under which the Gospel cannot be preached and the Christian life practiced. God will not leave us comfortless, even if He allows our codices to be taken away.

(13) If we value codices, we should strive to preserve them. But we can only do this if we first think critically and seriously about why we love them — what virtues they embody that we do not want to lose.

(14) If we do this, then we will be better prepared to adapt if codices (or other technologies that we like) decline. Being practiced in working against the grain of our social order, we will remember that even unfamiliar and unpromising technologies can be turned to godly purposes.

(15) This does not mean that those technologies are neutral, only that they are to some degree redeemable. Remember, our decision to accept, even if only provisionally, the rules and standards of our disciplines and institutions means that we have given up a full range of choices about our technologies.

(16) “I am neither an optimist nor a pessimist. Jesus Christ is risen from the dead!” — Lesslie Newbigin

the dissenters

You know how I wrote that blog post a while back about how much I hate writing on my iPad? Apparently not everyone feels that way:

Most people only use the iPad's on-screen keyboard for tapping out emails, tweets or Facebook updates. 

But Patrick Rhone of St. Paul wrote a book that way -- with his Apple tablet at a slight incline on a desk or table at a variety of locations, and his index fingers flying across the virtual keys. 

This isn't Rhone's only feat of mobile productivity. The technology consultant and prolific blogger customarily composes lengthy blog posts -- sometimes nearing 1,000 words each -- on his iPhone screen in horizontal orientation. 

"The majority of the blog posts I write these days, I write in landscape, using my iPhone, typing with my thumbs," he said. "Why? Well, because it's what I have on hand all the time, and when inspiration hits me, I could be anywhere."

What a nightmare. But to each his own.

Saturday, December 7, 2013

on Colin Wilson

At some point in my senior year of high school I told my parents that I wanted to go to college, and they shrugged. It wasn’t a choice they had much sympathy with, and they were not inclined to offer any financial support — indeed, they were probably unable: my mother’s job was not a high-paying one and my father worked irregularly. Since none of us knew anything about scholarships or student loans, we ultimately agreed that if I paid for my university education they would allow me to continue living at home for a while longer. This was a good thing for me, since I was only sixteen.

Tuition at the University of Alabama at Birmingham was low enough that I could work in a bookstore about 25 hours a week — full-time during breaks and over the summer — and keep my head above water, but I was always tired, and after a while my grades started to slip. So I took a year away from school and just worked at the bookstore and read. In those days I wrote the title of every book I read in small neat letters in the squares of a Sierra Club Calendar, which I would pick up in early January when the unsold calendars were deeply discounted, and at the end of my year away from school I counted them up and discovered that I had read 250 books.

One of the writers I discovered that year was Colin Wilson, England’s very own self-educated bohemian existentialist, who has just died at the age of 82. Much of the atmosphere of Wilson’s life and writing is captured in the photograph above, and when I encountered that atmosphere I (briefly) found it intoxicating. Wilson’s interest in the occult led me to other occult writers — Carlos Castaneda, predictably enough in that period, but also some weirder and more obscure figures like the so-called T. Lobsang Rampa — but all that left me untouched. Though I am a convinced Christian, I do not have a religious or even a “spiritual” temperament, for which, when I think about what I was reading then, I am thankful. Nor could I take seriously Wilson’s image of himself as a man who radiated such powerful psycho-sexual energy that women (like the woman on the sofa in the photo above, I suppose) were helpless before him.

But it was the Wilson who slept in parks at night and read all day in the British Library that captured my imagination. Wilson’s first and most famous book was called The Outsider, and like him I felt myself an outsider to the life of the mind: given my family’s relative poverty and indifference to education, and given what seemed to me the dreary dutifulness of most of my professors, I had no chance of intellectual achievement unless I taught myself, and since I had few principles by which I might be guided, I fell back on the one that I knew: omnivorous reading.

I don’t remember one word or one thought from the Colin Wilson books I read that year, but it may be that, for all his blustery self-assertion, his under-educated overconfidence, he had more influence on my intellectual formation than I know. Certainly I have retained all my life a sense of outsidedness to the institutions I have participated in, and a somewhat perverse determination to take my own path, especially when it’s one that wiser and more prudent people assure me I should not take. Maybe I need to thank Colin Wilson for that.

Thursday, December 5, 2013

Sinclair and Davies

I'm a big fan of Iain Sinclair's work — which I've written about at some length here — and I think Ray Davies is one of the most interesting people in the history of rock-and-roll, for reasons I explain a bit here and here. So how happy was I to see this joint profile of the two of them, focused on their common interest in an imagined America? Really happy.

Wednesday, December 4, 2013

e-readers for the world

Here’s a brief but interesting story about Worldreader:

A former Amazon executive who helped Jeff Bezos turn shopping into a digital experience has set out to end illiteracy. David Risher is now the head of Worldreader, a nonprofit organization that brings e-books to kids in developing countries through Kindles and cellphones. 

Risher was traveling around the world with his family when he got the idea for Worldreader. They were doing volunteer work at an orphanage in Ecuador when he saw a building with a big padlock on the door. He asked a woman who worked there what was inside, and she said, "It's the library." 

"I asked, 'Why is it locked up?' And she said it took too long for books to get there," says Risher. "[The books] came by boat and by the time they got there, they were uninteresting to the kids. And I said, 'Well, can we take a look inside? I'd like to see this.' And she said, 'I think I've lost the key.' " 

This, Risher thought, can be fixed. If it's so hard to give kids access to physical books, why not give them e-books and the digital devices they would need to read them? Risher had joined Amazon at its beginning, helping it grow into the dominant online retailer it is today. He felt he could apply some of the lessons he had learned at Amazon to the problem of illiteracy.

Here’s the Worldreader homepage. I really hope this project takes off.

And, as I have written in this New Atlantis essay, I am watching closely to see how these developments, and the very widespread and still increasing use of cellphones in Africa, will affect Christianity. Many of the people who get e-readers from Worldreader will be Christians, and perhaps the majority of those will download Bibles to their e-readers. What translations will they choose? Will those who encounter the Bible primarily or exclusively in digital form read it in discernibly different ways than those who read it in codices? This inquiring mind really wants to know.

writing for young people revisited

Jonathan Myerson has standards. Not for him the craven apologies of the Creative Writing Program at the University of Kent, their admission of wrongdoing at having suggested that children’s literature isn’t really literature at all. Myerson hoists his literary flag:

Come on, University of Kent, why the grovelling retreat? Your creative writing website got it right first time. You know perfectly well that when you made a distinction between "great literature" and "mass-market thrillers or children's fiction", you were standing up for something. That Keats is different from Dylan, or, in this instance, that Philip Roth does say something rather more challenging than JK Rowling, that Jonathan Franzen does create storylines more ambiguous and questioning than Stephanie Meyer's. What's so wrong with that? I'll go forward carrying the banner even if you won't.

Like Kent, we at City University take on creative writing MA students specifically to write literary novels – so we are quite ready to define what's required to write for adults as opposed to children. It isn't about the quality of the prose: the best children's books are better structured and written than many adult works. Nor is it about imaginary worlds – among the Lit Gang, for instance, Kazuo Ishiguro, Cormac McCarthy and Michael Chabon have all created plenty of those. It's simpler than that: a novel written for children omits certain adult-world elements which you would expect to find in a novel aimed squarely at grown-up readers.

The problem here is that Myerson fails to see that self-consciously “adult” novels, while they are indeed open to experiences, and to techniques, that children’s lit doesn't reckon with, also have blind spots, vast areas of human experience of which they are apparently ignorant. The estimable Adam Roberts covered this just a couple of months ago in an absolutely brilliant blog post that I wrote about here. The elaboration of "ambiguous and questioning ... story lines" may be a literary virtue — though perhaps not one that Jonathan Franzen possesses — but it is certainly not the only literary virtue or an indispensable one. The novel that is self-consciously for adults isn’t more comprehensive than the novel that is self-consciously for young people; it just covers different things. And, as Roberts makes clear, it habitually omits some of the most important experiences of life.

Tuesday, December 3, 2013

reduplicated Hamlets

After all the agitation and bipolar oscillation of the first four acts of Hamlet, by the end the prince seems to be at peace. “There's a divinity that shapes our ends,” he has learned; and when faced with the possibility of death, he knows that “the readiness is all.” And yet that readiness doesn't actually answer any of his questions or make him resolved to take action against the usurping Claudius. “Is it not perfect justice to quit him with this arm?” he asks, but he seems to have no plan to pursue such justice, perfect though it may be.

Is he perhaps waiting on the “divinity that shapes our ends” to provide the opportunity? Or is it simply not in him to be an agent of vengeance, the stereotypical avenging role Laertes is so manifestly comfortable with (to Hamlet's disgust)? It is noteworthy that when called upon by the Ghost to take vengeance he instead (a) writes and directs a play and (b) serves as his mother's ad hoc priest/confessor. Theatrical impresarios and priests alike work behind the scenes, pulling strings, directing and ordering others: they take no public action of their own.

But what if that's God's job? It's interesting that everyone in the end pays for their sins without Hamlet's planning or deciding to make them do so. There's a divinity that shapes their ends too. So if the play begins by bringing in a Purgatorial ghost whose existence suggests that “there are more things in heaven and earth than are dreamt of in [Horatio's Lutheran] philosophy,” it ends with what seems to be an affirmation of a key principle of the magisterial Reformation: the meticulous providence of God.

In short, the evidence in this play for life's meaning and purpose points in multiple directions — as does all evidence for what human beings are like. Thus Hamlet's words to Rosencrantz and Guildenstern:

I have of late — but wherefore I know not — lost all my mirth, forgone all custom of exercises, and indeed it goes so heavily with my disposition that this goodly frame, the earth, seems to me a sterile promontory; this most excellent canopy, the air — look you, this brave o'erhanging firmament, this majestical roof fretted with golden fire — why, it appears no other thing to me than a foul and pestilent congregation of vapors. What a piece of work is a man! How noble in reason, how infinite in faculty! In form and moving how express and admirable! In action how like an angel, in apprehension how like a god! The beauty of the world. The paragon of animals. And yet, to me, what is this quintessence of dust?

So this is indeed “a play in the interrogative mood” (as Harry Levin wrote long ago in a still-incisive book) and it seems determined to dramatize our questions rather than offer us any answers to them. In this sense it is very much a document of its time. Not for this period the bold attempt by Thomas Aquinas to sort out and systematize virtually the whole of human knowledge; there was already too much to know, and no one capable of ordering it all. Instead the watchword is, and had to be, Montaigne's questioning motto, Que sçay-je? What do I know?

Long ago Northrop Frye wrote that if Hamlet was the definitive Shakespearean play of the nineteenth century, and King Lear, with its mixture of tragedy and absurdity, of the twentieth, the play for the twenty-first century might well be Antony and Cleopatra. Why? Because, Frye argued, it is about how deeply personal relations are usurped by world-historical events. That seemed plausible at the time, but maybe, as it turns out, we're back in the world of Hamlet: confronted constantly by evidence of what we know, what we're capable of, and at the same time faced with what we can't master and can't understand. We look into the ever-more-technologically-sophisticated future and we can't tell whether it points to the apotheosis of humanity or its utter abrogation, and we're not even sure we know how we might tell the difference. We may all be Hamlets — “reduplicated Hamlets,” in Auden's phrase — after all. What do we know?

Saturday, November 30, 2013

"Cheat the Prophet" revisited

I was having some fun on Twitter this morning with this piece of prophetic silliness — silly even for the a-scientist-predicts-the-future genre, which is saying a lot. Computers will disappear! — because they will be ubiquitous, and I’m sure there’s no need even to wonder if ubiquitous computing could be useful to ubiquitous governments, because we’re told later in the piece that technology is bad for dictators. Capitalism will be perfected! — which means that there will no longer be any possibility of sales resistance, of saying No to the capitalists. That silly “digital divide” people used to talk about never happened! — which I know is the object of constant gratitude for all those kids in Bangladesh and Mozambique with their iPads. And since there’s no mention of global warming in the piece, or the provision of electricity to places and people that don’t have it, or the availability of clean water to places and people who currently don’t even have that, I’m sure all those little glitches in the March of Progress will have been straightened out by 2050, probably with a few lines of elegant code.

You know the kind of thing. So here I just want to make one comment: that whenever I read this kind of thing I find myself recalling the first chapter of Chesterton’s The Napoleon of Notting Hill, from 1904, which begins with these still-utterly-relevant words:

The human race, to which so many of my readers belong, has been playing at children’s games from the beginning, and will probably do it till the end, which is a nuisance for the few people who grow up. And one of the games to which it is most attached is called “Keep to-morrow dark,” and which is also named (by the rustics in Shropshire, I have no doubt) “Cheat the Prophet.” The players listen very carefully and respectfully to all that the clever men have to say about what is to happen in the next generation. The players then wait until all the clever men are dead, and bury them nicely. They then go and do something else. That is all. For a race of simple tastes, however, it is great fun.

Friday, November 29, 2013

a broken spell

Recently I was reading a lovely autobiographical essay by Zadie Smith about — well, in the way of the true essay, it’s about several things: gardens, civility, grief, memory. Much of it concerns her travels with her late father, and those scenes are beautifully rendered.

And then I came to her description of the Borghese Gardens in Rome, and read this:

For our two years in Rome, the Borghese Gardens became a semiregular haunt, the place most likely to drag us from our Monti stupor. And I always left the park reluctantly; it was not an easy transition to move from its pleasant chaos to the sometimes pedantic conventionality of the city. No, you can’t have cheese on your vongole; no, this isn’t the time for a cappuccino; yes, you can eat pizza on these steps but not near that fountain; in December we all go to India; in February we all ski in France; in September of course we go to New York. Everything Romans do is perfect and delightful, but it is sometimes annoying that they should insist on all doing the same things at exactly the same time. I think their argument is: given that all our habits are perfect and delightful, why would anyone stray from them?

And in an instant all my interest and sympathy evaporated. Those are the things that “Romans” do, yes? Travel to India and New York City, ski in France? These are the habits of “Romans”? But of course Smith means the tiny, tiny fraction of Romans who have the extravagant wealth to do these things — the .01 percent, the absolute elite. These are the “Romans” she knows.

To this one might reply, well, Smith herself was not born into privilege: a biracial woman from London who grew up in straitened circumstances if not absolute poverty, she knows what it’s like to struggle. Exactly: all the more reason for her not to take privilege — extraordinary privilege — as the norm. “Romans” indeed.

I am willing, seriously willing, to consider that this response may well be a failure of charity on my part, so I record it not as a confident judgment but as a snapshot of readerly experience. Whether I was right or wrong to respond as I did, I think it noteworthy that with that paragraph my involvement in the essay — which until that point had been complete, I had been absorbed — ended. I listlessly cast my eyes over the last few paragraphs. The voice that had so delighted me a few moments before now seemed to me almost precious in its complacency. A lovely little spell had broken and could not be brought back. Whether it was Smith or I who broke it I leave as an exercise for you, my readers.

Wednesday, November 27, 2013

the rich are different

In his great autobiographical essay “Such, Such Were the Joys,” George Orwell remembers his schooldays:

There never was, I suppose, in the history of the world a time when the sheer vulgar fatness of wealth, without any kind of aristocratic elegance to redeem it, was so obtrusive as in those years before 1914. It was the age when crazy millionaires in curly top-hats and lavender waistcoats gave champagne parties in rococo house-boats on the Thames, the age of diabolo and hobble skirts, the age of the ‘knut’ in his grey bowler and cut-away coat, the age of The Merry Widow, Saki's novels, Peter Pan and Where the Rainbow Ends, the age when people talked about chocs and cigs and ripping and topping and heavenly, when they went for divvy week-ends at Brighton and had scrumptious teas at the Troc. From the whole decade before 1914 there seems to breathe forth a smell of the more vulgar, un-grown-up kind of luxury, a smell of brilliantine and crème-de-menthe and soft-centred chocolates — an atmosphere, as it were, of eating everlasting strawberry ices on green lawns to the tune of the Eton Boating Song. The extraordinary thing was the way in which everyone took it for granted that this oozing, bulging wealth of the English upper and upper-middle classes would last for ever, and was part of the order of things. After 1918 it was never quite the same again. Snobbishness and expensive habits came back, certainly, but they were self-conscious and on the defensive. Before the war the worship of money was entirely unreflecting and untroubled by any pang of conscience. The goodness of money was as unmistakable as the goodness of health or beauty, and a glittering car, a title or a horde of servants was mixed up in people's minds with the idea of actual moral virtue.

What follows is purely subjective and impressionistic, but: I think in America in 2013 we’re back to that point, back, that is, to an environment in which “the worship of money [is] entirely unreflecting and untroubled by any pang of conscience.”

We have plenty of evidence that the very rich are deficient in generosity and lacking in basic human empathy, and yet there seems to be a general confidence in the very rich — a widespread belief that those who have amassed great wealth, by whatever means, can be trusted to fix even the most intractable social problems.

Consider in this light, and as just one example, the widespread enthusiasm for the rise of the MOOC. The New York Times called 2012 The Year of the MOOC in an article composed almost wholly of MOOC-makers’ talking-points, and even when the most prominent advocate of MOOCs abandons them as a lost cause, he still gets reverential puff-pieces. Some people can do no wrong. They just have to have enough money — and to have gotten it in the right way.

I think this “entirely unreflecting” “worship of money” is sustained by one thing above all: wealth-acquisition in America today, in comparison to wealth-acquisition in the Victorian age or across the Pacific in China, feels clean. Pixel-based and sootless. No sweatshops in sight — those are well-hidden in other parts of the world. We may happen to find out that Amazon’s warehouses aren’t that different from sweatshops, but that doesn't seem to make much of a difference, in large part because our own dealings with Amazon are so frictionless and, again, clean: no handing over of cash, not even credit cards after you enter your number that first time, just pointing and clicking and waiting for the package to show up on your porch. Oh look, there it is. Not only are the actual conditions of production hidden, but even the nature of the transaction is invisible, de-materialized. (I could be talking about MOOCs here as well: they work the same way.)

It’s almost impossible to think of Jeff Bezos or Steve Jobs or Sebastian Thrun as robber baron industrialists or even as captains of industry, even if the occasional article appears identifying them as such, because what they do doesn't fit our imaginative picture of “industry.” They seem more like the economic version of the Mr. Fusion Home Energy Reactor in Doc’s DeLorean: you just throw any old crap in and pure hi-res digital money comes out.

Cue Donald Fagen:

Just machines to make big decisions
Programmed by fellas with compassion and vision
We’ll be clean when that work is done
We’ll be eternally free, yes, and eternally young

Happy Thanksgiving, everybody.

Monday, November 25, 2013

on reading and flux

Please read this lovely reflection by Frank Chimero on “what screens want” — a gloss on Kevin Kelly’s what technology wants — though Chimero makes this important and (to my mind) necessary pivot near the end: “Let me leave you with this: the point of my writing was to ask what screens want. I think that’s a great question, but it is a secondary concern. What screens want needs to match up with what we want.”

It’s a rich and subtle essay that covers several key topics, and thinks in appropriately large terms; I’ll be returning to it. But just for now I want to zero in on an especially intriguing part of the essay in which Chimero meditates on Eadweard Muybridge’s early moving pictures of a running horse.


Of these images Chimero writes,

And you know, these little animations look awfully similar to animated GIFs. Seems that any time screens appear, some kind of short, looping animated imagery of animals shows up, as if they were a natural consequence of screens. 

Muybridge’s crazy horse experiment eventually led us to the familiar glow of the screen. If you’re like me, and consider Muybridge’s work as one of the main inroads to the creation of screens, it becomes apparent that web and interaction design are just as much children of filmmaking as they are of graphic design. Maybe even more so. After all, we both work on screens, and manage time, movement, and most importantly, change. 

So what does all of this mean? I think the grain of screens has been there since the beginning. It’s not tied to an aesthetic. Screens don’t care what the horses look like. They just want them to move. They want the horses to change. 

Designing for screens is managing that change. To put a finer head on it, the grain of screens is something I call flux.

He then goes on to define high, medium, and low flux, and to describe some situations in which one or the other might be called for.

All this has me thinking about the degree of flux appropriate to different reading experiences. This seems to me highly variable according to genre and purpose. For instance, the New Republic’s iPad app is designed to offer higher flux than other magazine apps I’ve seen, which are minimally interactive: here you have poems that you can use your finger to slide into view, taps that activate deeper levels of content, and so on. Sometimes it’s too much, and at other times it takes too long to figure out how a given story works — they vary more than they ought to — but in general I like it. A good deal of thought has gone into the design, and more often than not the interactions are appropriate to the particular story and help me to engage more fully with it.

But I would never want to read Anna Karenina this way. The kind of concentration demanded by a long, complex, serious novel cannot bear much, if any, flux. And unnecessary flux can readily be avoided by reading it in a codex — hooray for that! But if people do gradually shift more and more towards reading on some kind of screen or another, and screens become increasingly capable of variable degrees of flux (as e-ink screens currently are not), then we readers will be ever more dependent on designers who possess a deep sensitivity to context and purpose — pixel-based designers who are, as a matter of basic professional competence, as flexible and nuanced in their design languages as the best print-based designers are today. Or, at the very least, they’ll need to build in the possibility of opting out of their fluxier interfaces. As someone who’s headed for a more screen-based reading future, I’m a little nervous about all this.

disrupting journalism!

(Nah, not really. Just wanted to try out that language for size.)

But: I was talking with some people on Twitter this morning about my frustrations with what has now become a very familiar set of experiences: the whole merry-go-round of publicity that accompanies the appearance of a book.

Before I go any further, I should note that my adventures on this merry-go-round amount to nothing in comparison with what people-who-make-their-living-by-writing go through. Only once in my career have I written a book that generated perceptible media attention, and doing the publicity for that absolutely exhausted me — which probably accounts for my dyspeptic attitude towards even small bouts of book-promoting exercises today. I can't even begin to imagine what it must be like to be Neil Gaiman: "I’m currently dealing with how to go back to being a writer. Rather than whatever it is that I am. A traveller, a signer, a promoter, a talker, a lecturer."

So here's how it goes: a journalist writes or calls to ask for an interview, and wants to do the interview by phone. If I agree — in violation of my profound dislike of the telephone — then commences the awkward dance of trying to find a time when we can both talk, and, when that's finally worked out, I am permitted to try to improvise, on the spot, answers to questions that I have already answered, with considerably greater care, in the book itself. Then I just have to hope — though the years have almost cured me of hoping — that the journalist transcribes what I say accurately and in its proper context. And, for dessert, I get to be annoyed by the way I put things and wish I could go back and express myself more clearly.

(By the way, no belief is more sacrosanct among journalists than the belief that it would be profoundly unethical to let me rewrite my comment about, say, nineteenth-century controversies over the Ornaments Rubric — even though I've yet to find anyone who can explain to me why that would be so. They always invoke politicians and political controversy, without explaining why the same rules should apply to interviewing politicians and interviewing scholars or other writers.)

Perhaps you can tell that I'm not thrilled about this way of doing things? So my common practice now is to decline phone interviews and ask to do things by email instead. Sometimes I am told that this is not permissible, in which case, Oh well. (When I've been given a reason, that reason has always been "because in email you don't get the give-and-take," which always makes me wonder whether there are email clients without Reply buttons.) But when people agree, then I sit down to answer the questions and realize, wait a minute, I'm writing the article! I'm going to do all the work and they're going to get the byline and the paycheck! Well, it was my choice, after all....

I'm supposed to be willing to do all this because it gets my book "exposure," it has "publicity value," and I suppose that once may have been true, but I wonder to what extent it now is? Certainly publishers believe in it, and promote the model; but I have my doubts that a model formed by a kind of handshake agreement among publishers (who want to get the word out about their books) and journalists (who need ever-new "content") is all that it needs to be when we all have the internet and its social media at our fingertips.

I'm just wondering — genuinely wondering — whether there might be models of doing ... this kind of thing ... don't know what to call it ... that might be more flexible and generous and less taxing to everyone concerned. Especially, of course, The Author, but I've been on both sides of this fence: I have interviewed people for articles — almost always by email, though once I bought lunch for a well-known musician for an Oxford American piece that never saw the light of day — and I've written for dailies, weeklies, bimonthlies, monthlies, quarterlies, the whole show, so I know those challenges as well. There's drudgery for journalists in the usual way of doing business, and maybe it could be made more fun for them as well.

Even small adjustments could help: Alex Massie suggested to me the value of IM interviews, and that made me remember the few times I've done those — I really enjoyed them. They have the spontaneity of conversation but also allow you to take a moment to get your thought into shape before committing to the Enter key. In another exchange that happened almost simultaneously — I like that about Twitter — Erin Kissane emphasized just this value of conversation, and I suppose that's one reason why I have always enjoyed talking with Ken Myers for his wonderful Mars Hill Audio Journal: the dialogue gradually and naturally unfolds, and while Ken always edits with care and skill to make me sound smarter than I am, he never eliminates that conversational tone. If doing publicity were always like that....

Anyway, I'd love to hear some good — disruptive! innovative! — ideas in the comments, especially from journalists. And thanks to those of you who, over the years, have helped to put my ideas before the public.

And by the way: if you don't subscribe to the Mars Hill Audio Journal, you should consider it. It's great.