Text Patterns - by Alan Jacobs

Monday, September 4, 2017

redirecting to Pinboard

I had thought that I might be able to resume at least some blogging here, but it's just not going to happen. Too much to do — and soon I will begin blogging in support of my forthcoming book How to Think. Please keep an eye on that site, if thinking is the kind of thing you're into.

I am still of course fascinated by all the themes I have written about here, and am always reading about them. I keep track of much of my reading on my Pinboard page, and will try to remember to use the textpatterns tag to note things that might be of interest to my readers here.

Also, I have essays on Textpatterns-related topics coming out in the journal that hosts this blog, the good old New Atlantis, and also in The Hedgehog Review. I'll point to those, when the time comes, on Twitter.

Friday, September 1, 2017

Can anyone help me understand Ephraim Radner?

While I’m on the twofold subject of (a) reading outside my speciality and (b) asking for help, I want to say something about the theologian Ephraim Radner. Several people I know and admire very much have encouraged me to read Radner, whom they in turn admire very much, and for a good many years now I have tried, repeatedly. But there’s a problem. The problem is that I simply cannot understand what he is saying. I do not know that I’ve ever come across a writer — not even Jacques Lacan — who has defeated me as thoroughly as Radner has. And this genuinely worries me, because while most of these people will acknowledge that Radner is not the most elegant writer, none of them seem to have any trouble making sense of his writing, and seem befuddled by my befuddlement.

Let me take some illustrative examples from Radner’s recent book A Time to Keep: Theology, Mortality, and the Shape of a Human Life. Here is what he describes as his “central argument”:

To have a body and deploy it is bound up with the fact that we are born and we die within a short span of years. And this being born and dying is itself — in all its biology of connection, memory, and hope — a mirror of and vehicle for the truth of God’s life as our creator.

The first sentence there seems clear enough: we know our bodies only as dying bodies. That doesn’t seem like a controversial point, but assuming I have read it correctly, I move on to the next sentence — and immediately run aground. “Being born and dying” has, or is accompanied by, a “biology of connection,” but I have absolutely no idea what might be meant by “biology of connection.” I am not even able to hazard a serious guess: maybe something like, we are biologically wired to be connected to … each other? Or maybe to the rest of the created order, in that we eat other living things? And all this confusion comes before we get to the idea of a biology of memory and hope, which I find even more inscrutable.

But then it gets even tougher. Because this being born and dying, with its accompanying biologies, is a “mirror” of … it would be difficult enough if the rest of the sentence were “God’s life as our creator,” but the phrase “the truth of” comes first, so I am once more wholly at sea. Let’s try to unpack this. God has a life “as our creator,” which I assume must mean something like the life God experiences in relation to Creation, as opposed to the internal life of the Trinitarian godhead. The “truth of” this life is distinguished, I suppose, from false ideas about it? It is, then, the character of that life truly perceived? So that if we perceive the life of God-as-creator truly we will then see that it is a mirror of our lives? — but if so, is it a mirror in the sense of being its opposite, its reversal? And then the brevity of our lives is the “vehicle” by which we perceive the eternal life of our God as creator? Probably not, because God is eternal in himself, not just as our creator … but I’m out of guesses. I cannot make any sense out of this passage, or indeed out of Radner’s writing as a whole.

One might say that all this becomes clearer if you read the whole book. But I have read the whole book — my eyes have passed over every word, I have scribbled thoughts and queries in the margins — and I am no better off.

At the end of the book Radner comments that “the argument of this book has been that thinking about who we are as created human beings comes down to numbering our days,” and while the phrases “numbering our days” and “day-numbering” occur frequently in the book, I’m afraid I don’t know what they mean either. It sometimes seems to me that the whole book does not say anything more or other than what a priest whispers to me each Ash Wednesday, as he inscribes an ashy cross on my forehead: Remember that thou art dust, and to dust thou shalt return. But there must be more to this book than that. Can anyone help me understand?

how to mark your participation in an academic guild

This is just a brief follow-up to last night’s post on my personal blog about my experience reading biblical scholars. All scholarly guilds have their characteristic markers of valid participation, but they vary considerably. For biblical scholars those markers seem to be, as far as I can tell, largely structural: that is, as I explained in that post, monographs are expected to begin with a methodological introduction and a literature review. But in my field, literary study, the markers tend not to be structural but terminological. We can organize our monographs in a good many ways, but we need to signal our deference to guild sensibilities by deploying certain terms: in one era we needed to point to aporias in the texts we studied, while later on we needed to acknowledge our complicity in the very structures we sought to critique, or to speak with appropriate regretfulness about the power of patriarchy; later on still it was heteronormativity that needed to be acknowledged.

The rules were never specific, and we could always neglect certain terms if we made use of others that were equally au courant; but terminological markers have to be there for a book to be a guild book. It would be interesting to hear from various academics about what they perceive to be the key scholarly markers of membership in their own guilds. Comment below, perhaps? (I'm also glad to hear challenges to my thoughts on these matters.)

Monday, August 28, 2017

The History of Disenchantment

Here's a brief description of a course I'll be teaching next semester:

In a wonderful early poem, "Merlin Enthralled," Richard Wilbur describes the way that magic drains from the Arthurian world when the wizard is no longer around to generate it:

Fate would be fated; dreams desire to sleep.
This the forsaken will not understand.
Arthur upon the road began to weep
And said to Gawen, "Remember when this hand

Once haled a sword from stone; now no less strong
It cannot dream of such a thing to do."
Their mail grew quainter as they clopped along.
The sky became a still and woven blue.

A hundred years ago the great sociologist Max Weber wrote that “The fate of our times is characterized by rationalization and intellectualization and, above all, by the disenchantment of the world” (Entzauberung der Welt). We experience this, he added, as an “iron cage” of rationalization. The purpose of this course is to explore Weber’s great thesis. Is it correct? If so, what are its consequences? What intellectual strategies have we formed to deal with this disenchantment, to break the bars of this iron cage? And if Weber’s thesis is not right, in what forms has an enchanted world persisted?

Major readings:

  • Weber, selected writings on the rationalized social order
  • Charles Taylor, A Secular Age
  • Susanna Clarke, Jonathan Strange and Mr Norrell
  • Neil Gaiman, American Gods
  • Jason Josephson-Storm, The Myth of Disenchantment

Supplementary readings:

  • various essays on the “secularization thesis”
  • Hans Blumenberg, The Legitimacy of the Modern Age
  • Owen Chadwick, The Secularization of the European Mind in the Nineteenth Century (selections)
  • Leon Kass, The Beginning of Wisdom: Reading Genesis (selections)
  • Keith Thomas, Religion and the Decline of Magic (selections)
  • C. S. Lewis and J. R. R. Tolkien on the “enchantment of worldliness”
  • selected essays and excerpts by Marina Warner

The logic behind many of these choices should be clear — it's obvious why Taylor's magnum opus will be the central text here — but a few may need explanation. Gaiman's novel is a great case study in various culturally particular forms of enchantment and disenchantment, and a profound meditation on how technology affects both. Jonathan Strange and Mr Norrell explores the conditions and consequences of re-enchantment. Josephson-Storm's book puts some hard questions to Weber's thesis and to narratives of secularization more generally. Kass presents Genesis as an intrinsically disenchanting text, one that from the outset demotes the sun, moon, and stars from the status of deities to that of mere created things — big lights in the sky, worthy neither of worship nor of terror.

Comments and suggestions welcome.

Saturday, August 26, 2017

let joy be unconfined ...

... because there's a new Adam Roberts novel!

No one has yet said to me, “Of course you praise Adam Roberts’s novels, you’re his friend.” But if anyone ever did say that to me I’d reply that Adam and I have become friends in large part because I admire his novels — and his criticism as well. A few months back I commented to Adam that I couldn't remember how we first connected, and he reminded me that it was in the comments section of a now-silent website called The Valve. His posts there intrigued me, I commented, he replied, I decided to read one of his novels — and a friendship was born, one that I greatly value.

So ... am I prejudiced in his favor? Only in what we might call a Hazlittian sense, I would argue: I am prejudiced in favor of Adam’s writing because what I have read by him has consistently given me pleasure. And that is the right kind of prejudice to have.

Which brings me to The Real-Town Murders, which is a really good novel — you should read it. You should buy it. You also should support Adam's completion of Anthony Burgess's idea for a book, The Black Prince. Please do, or the book won't be published and I won't get to read it.

But back to The Real-Town Murders. It’s a fantastic read, fast-paced, edge-of-your-seat stuff — but it’s also sometimes disorienting, and I want to emphasize the disorientation it produces, because that’s related to something Adam has written about before — with, I think, disarming honesty — which is, not to put too fine a point on it, his neglect by the SF world. Not complete neglect, mind you, but significant neglect, especially of a book with the ambition, profound intelligence, and emotional depth of The Thing Itself, about which I have written here. It’s in light of this neglect, and Adam’s understandable puzzlement at it, that I want to say something about The Real-Town Murders.

As I say, it’s a terrific thriller — Roberts writes masterful chase scenes — most of his books have chase scenes, and they’re always great — but it’s also kinda weird. For instance, it can be really hard in The Real-Town Murders to know if someone is dead. Sometimes you think people must be dead but they turn out not to be. At the risk of a spoiler, here’s an example:

‘You see,’ Pu said. ‘You see, you can’t reinvigorate the Real simply by decreeing it. You can’t make it happen by fiat. You have to make it more attractive than the Shine. More intriguing to the people who … Who …’ Her weight slumped away from Alma, and she struggled to continue holding her upright. But she had gone, and Alma was not strong enough. As slowly as she could she lowered Pu Sto’s body to the ground. The aircars banked overhead and came down into the turf fifteen metres away.

Pu Sto had fainted.

Fainted?? You said “she had gone,” and we know what that means! You said “her body”! Damn you, Roberts! So a little later on, when someone else seems to have been killed, I the reader am waiting, waiting, waiting for the revelation that my assumption was wrong ... again ... but no. This time the assumption is correct. (Isn’t it?)

Here’s another thing: sometimes in this book human language goes awry. That is, certain characters temporarily lose the ability to speak grammatically coherent sentences. Sometimes this happens to automated systems, bots, as in this exchange:

‘Relevant company documentation and answer any question to podscrip pending in your legally permitted break for lunch,’ said the receptionist. It had been prodded into a less secure margin of its response algorithm.

‘Furious green ideas?’ Alma asked.

‘Profitability supersedes itself in a company atmosphere of positivity and,’ said the receptionist, smiling.

‘Realising that nothing changes,’ Alma tried, ‘change everything.’

‘Happy to leverage all options and drill down to the next level.’

‘Let me ask you a direct question: are you, in fact, not the Ordinary, but rather the Extraordinary Transport Consultancy?’

‘Thank you for your input,’ beamed the receptionist.

‘Teleportation?’ Alma tried. ‘Instant transportation devices?’

‘No comment,’ the receptionist replied, rather too rapidly, and shut down.

But it happens to human beings too — and Roberts never explains why. He just throws us into this weird world where sometimes humans, like digital machines, develop linguistic glitches — and perhaps for the same reasons, given that the future society he describes draws human beings closer and closer to as purely digital a world as can be managed. And there are people who just speak oddly, by my standards, for reasons that might be related to the online world called the Shine or because they have a regional accent that I don't know about or...?

‘I know he works in the world, but his free time is all online. All of it! And you need to own dare stand – I make sure he eats. He has always ate. He used to weight a hefty number. Loves his food. He comes to mine, and I feed him till his stomach bulges. Then it’s o mama and gut-ache mama and I see it shrink down.’

There’s even a (relatively minor) character — one whose language is still more distorted — whose name seems to change: for most of the book he’s called “Lester” but there’s a period where he’s called “Ernest,” and I don't know what to make of that, because, though I’m tempted to say that it’s just a copy-editing oversight, there's another minor character whose name changes repeatedly through the handful of pages in which we see her. (So I’m thinking: is Ernest someone different? Did I miss something? Surely I missed something. But I’m caught up in this story here and don't want to go back to be sure.)

With Roberts, you never know — this is my point. Roberts likes making fictional knight’s moves, which is another way of saying that he is a perverse rather than an accommodating writer. To me, this is endlessly delightful; I enjoy having my legs taken out from under me, from time to time, as I read. I laugh at how Roberts sneaks in a line from Pynchon here, a line from Shakespeare there. I love this novel’s extended, multi-faceted homage to and riff on Hitchcock — another guy who was good at chase scenes — who makes an uncredited appearance here, as he typically did in his own films, but whose name is never mentioned except as the provider of an epigraph for the novel’s second part: “Puns are the highest form of literature.”

It’s all enormous fun. But I suspect that there is a kind of reader — a quite common kind of reader — for whom it would be rather too much. Many readers like their fictional moves to be straight, like those of a rook, or (when they’re in an adventurous mood) on a diagonal, like those of a bishop. This starting out on one path and then suddenly veering off — well, it’s rather disorienting, isn't it? Rather perverse. I say: let Roberts do his thing! Take a ride! But many readers will simply prefer writers who are willing to do more to accommodate the most typical readerly expectations. It’s the way of the world. And this, I think, is why Adam Roberts hasn’t won a major SF award, though he has written several of the very finest SF novels of this millennium.

In addition to the pleasures it provides, The Real-Town Murders is also an extremely thoughtful meditation on one of the classic forms of literary pleasure. People have long asked “Why does tragedy give pleasure?”, which is a very good question — but one might equally well ask why thrillers give pleasure, why mysteries do — why death does, the sudden appearance of death in the midst of life. (In tragedies the most important death comes at the end; in mysteries it comes at the beginning.) It’s a question you might expect both Adam Roberts and Alfred Hitchcock to have some thoughts about. And they do. We could talk about those thoughts once you’ve read the novel.

Tuesday, June 27, 2017

a great silence cometh

We’re going to have radio silence here at Text Patterns for a while — I’m coming to the end of a year of research leave and have a number of projects, small and large, that I need to wrap up before school resumes in August.

One of those projects is related to my previous post: Our social media put us in an odd situation in relation to words, our own and others’. People say things they don't mean, or that they declare afterwards that they don't mean; they retweet or report or link to statements that they later (when challenged) say they don't endorse. Every day we see people positioning themselves in oblique relation to language. Mikhail Bakhtin — early in his career, which is to say, early in the history of the Soviet Union, when people’s words were already landing them in prison — wrote of the moral imperative of being answerable for one’s language; Wendell Berry, in one of his most important essays, wrote of the inestimable value of “standing by words,” standing by what we say. It’s hard to imagine concepts more foreign to the way language is used on social media today. So I’m writing an essay about the restoration of answerability.

I’m also in the early stages of writing about the changing fortunes of the concept of nature, something that I discussed briefly in a few recent posts.

As usual, I’ll be posting images, links, and very brief reflections (often on Christian matters) at my personal blog.

I’m continuing to think about Anthropocene theology, but that’s going to be a very long-term project.

The next book I write, by the way, will be called Auden and Science.

And then I have various responsibilities concerning the books I’ve already written. How to Think: A Guide for the Perplexed — that’s the subtitle for the U.K. edition, which I like better than the one for the American edition — will be out this fall, and I’ll be involved in the publicity for that. Also, I’ll be blogging about some of the ideas of the book on that site.

The Year of Our Lord 1943: Christian Intellectuals and Total War will be out next year from Oxford UP, and I have final revisions to do for that, plus all the questionnaire-answering and form-filling that accompany publication.

AND: I’ll need to start prepping for my fall classes! By the time I return I’ll have been fifteen months out of the classroom, and that’s a long time. I am itching to reconnect with students, believe it or not.

But the research leave was great. I got a lot of work done and have been refreshed by the opportunity to read and think and reflect. This past year also marked the end of a decade of illness in my family, which means that for the past few months I have had the strange and wonderful experience of not having to spend a lot of time caring for a sick loved one. (That’s taken me a while to get used to, oddly enough. I still wake up in the morning casting my mind around for the first thing I need to do for ... oh wait, everyone’s fine!) So I’ll be headed back into teaching with a lot of energy. Lord, may it please you that the next few years are like this one — for me personally, I mean. I’d prefer not to have any more years like this one in American politics.

Ciao for now!

so many questions

I have many questions — real, deep, sincere questions — about this.

  • Does Katherine Dettwyler really believe that a person deserves torture and death for stealing a poster?
  • Or does she, rather, believe that a person deserves torture and death for being a clueless privileged culturally-imperialist white male?
  • Or does she, perhaps, believe that a person deserves not torture and death but maybe arrest for being a clueless privileged culturally-imperialist white male, and just wrote carelessly?
  • Is she right that a significant number of her white male students "think nothing of raping drunk girls at frat parties and snorting cocaine, cheating on exams, and threatening professors with physical violence"?
  • How do any or all of these beliefs affect her ability to do her job as a teacher?
  • How many college teachers share these beliefs?
  • Is this a situation in which no "beliefs" as such are involved, but Dettwyler's Facebook post was rather an unfocused spur-of-the-moment venting arising from frustration with a lousy job or lousy working conditions?
  • If your answer to the previous question is "Yes" or "Probably yes," how do you account for the fact that Dettwyler seems to have made very similar comments on blogs?
  • How might a tendency to go off on unfocused spur-of-the-moment venting arising from frustration with a lousy job or lousy working conditions affect a person's ability to do her or his job as a teacher?
  • Did the University of Delaware ask Dettwyler for an explanation of her post and/or comments?
  • Did the university ask her to apologize for them?
  • Suppose that she did apologize — would that be sufficient for her to keep her job?
  • What would the University of Delaware have done if a tenured faculty member had made precisely the same comments?
  • As those outside the academy go apoplectic over these matters, those inside the academy shrug. Is shrugging enough?
  • What does it mean that so many people these days wish death on strangers whom they dislike or disagree with?
  • Should we feel better when we're told that people don't really mean it when they, for instance, respond to a tweet expressing a view about English grammar by wishing an entire generation of Americans dead?
  • Like, if you don't really in your heart of hearts want those people you disagree with to die in a fire or be raped and tortured, then we don't have a problem? Is that the argument?
  • Presumably all of the above, and worse, has been said to Katherine Dettwyler since her Facebook post went public — does that help?
  • Does vigilante vengeance have limits?
  • Even if it's just verbal vengeance?
  • Is forgiveness a social good?

Monday, June 26, 2017

historical knowledge and world citizenship

Few writers have meant as much to me, as consistently, over many years as Loren Eiseley — I say a bit about my teenage discovery of him in this essay. I am now writing a piece on the new Library of America edition of his essays for Education and Culture, and, man, is it going to be hard for me to keep it below book-length. I keep coming across little gems of provocation and insight. In lieu of buttonholing my family and making them listen to me read passages aloud — though I’m not saying I’ll never do that — I may post some choice quotations here from time to time.

Here’s a wonderful passage from the early pages of The Firmament of Time, Eiseley’s lapidary and meditative history of geology, or rather of what the rise of modern geology did to the human experience of time. I like it because it illuminates certain blind spots of today’s academics — and people more generally — and because it reminds us just how essential the study of history is.

Like other members of the human race, scientists are capable of prejudice. They have occasionally persecuted other scientists, and they have not always been able to see that an old theory, given a hairsbreadth twist, might open an entirely new vista to the human reason. I say this not to defame the profession of learning but to urge the extension of education in scientific history. The study leads both to a better understanding of the process of discovery and to that kind of humbling and contrite wisdom which comes from a long knowledge of human folly in a field supposedly devoid of it. The man who learns how difficult it is to step outside the intellectual climate of his or any age has taken the first step on the road to emancipation, to world citizenship of a high order.

He has learned something of the forces which play upon the supposedly dispassionate mind of the scientist; he has learned how difficult it is to see differently from other men, even when that difference may be incalculably important. It is a study which should bring into the laboratory and the classroom not only greater tolerance for the ideas of others but a clearer realization that even the scientific atmosphere evolves and changes with the society of which it is a part. When the student has become consciously aware of this, he is in a better position to see farther and more dispassionately in the guidance of his own research. A not unimportant by-product of such an awareness may be an extension of his own horizon as a human being.

Topsy-turvy, Tono-Bungay

In his blog-through of the works of H. G. Wells, Adam Roberts has reached Tono-Bungay, and there’s much food for thought in the post. Real food, not patent medicine like Tono-Bungay itself. Much of the novel, in Adam’s account, considers just that relationship: between the real and the unreal, the health-giving and the destructive, the truly valuable and mere waste — all the themes that Robertson Davies explores in The Rebel Angels and that are also, therefore, the chief concern of my recent essay on Davies, “Filth Therapy”.

Here I might quote Adam quoting some people who quote some other person:

Patrick Brantlinger and Richard Higgins quote William Cohen’s Introducing Filth: Dirt, Disgust, and Modern Life to the effect that ‘polluting or filthy objects’ can ‘become conceivably productive, the discarded sources in which riches may lie’, adding that ‘“Riches” have often been construed as “waste”’ and noting that ‘the reversibility of the poles — wealth and waste, waste and wealth — became especially apparent with the advent of a so-called consumer society during the latter half of the nineteenth century’ [‘Waste and Value: Thorstein Veblen and H. G. Wells’, Criticism, 48:4 (2006), 453].

This prompts me to want to write a sequel to “Filth Therapy,” though I clearly need to read Introducing Filth first.

It occurs to me that these are matters of longstanding interest to Adam, whose early novel Swiftly I have described as “excresacramental” — it was the first novel by Adam that I read, and given how completely disgusting it is, I’m rather surprised that I kept reading him. But he’s that good, even when he’s dirty-minded, as it were.

These themes make their way into fiction, I think, because of an ongoing suspicion, endemic now in Western culture if not elsewhere, that we have it all wrong, that we have valued what we should not have valued and vice versa, that we have built our house only having first rejected the stone that should be the chief cornerstone. As the old General Confession has it, “We have left undone those thinges whiche we ought to have done, and we have done those thinges which we ought not to have done, and there is no health in us.” This suspicion, which is often muted but never quite goes away, is perhaps the most lasting inheritance of Christianity in a post-Christian world: the feeling that we have not just missed the mark but are utterly topsy-turvy.

Christianity is always therefore suggesting to us the possibility of a “revaluation of all values,” a phrase that Nietzsche in The Antichrist used against Christianity:

I call Christianity the one great curse, the one great intrinsic depravity, the one great instinct for revenge for which no expedient is sufficiently poisonous, secret, subterranean, petty — I call it the one immortal blemish of mankind… And one calculates time from the dies nefastus on which this fatality arose — from the first day of Christianity! Why not rather from its last? From today? Revaluation of all values!

But Nietzsche issues this call because he thinks that Christianity itself has not set us right-side-up, but rather turned us upside-down. It was Christianity that first revalued all values, saying that the first shall be last and the last first, and he who seeks his life will lose it while he who loses his life shall find it, and blessed are the meek, and blessed are the poor in spirit…. Nietzsche’s call is therefore a call for restoration of the values that Christianity flipped: rule by the strong, contempt for the weak. It is, when considered in the long historical term, a profoundly conservative call.

Whether or not Nietzsche’s demand for a new paganism is right, surely it is scarcely necessary: for rule by the strong and contempt for the weak is the Way of the World, always has been and always will be; Christianity even at its most powerful can scarcely distract us from that path, much less set us marching in the opposite direction. Because that Way is so intrinsic to our neural and moral orientation, because we run so smoothly along its well-paved road, it is always useful to us to read books that don’t suggest merely minor adjustments in our customs but rather point to the possibility of something radically other. Such books are at the very least a kind of tonic, and a far better one than the nerve-wracking stimulation of Tono-Bungay.

Saturday, June 24, 2017

the big impediment to going iOS-only

At Macdrifter, Gabe Weatherhead makes a vital point:

But Apple has a blind spot that I think might mean the iPad never has parity with the Mac. The App Store just doesn’t encourage big powerful app development.

The price point on the iOS App Store is too low for many indie developers to succeed. I look at the most powerful apps I use and they come from a handful of companies dedicated to craft but supported by Mac revenue. Omnigraffle, OmniFocus, and OmniOutliner are great. But there are few competitors in this space that can reach the same level of quality and still make a profit. I suspect that in an iOS-only world these apps would end.

Transmit and Coda by Panic are top-tier software, but even they seem to be struggling to justify their existence.

This doesn’t mean that great apps don’t still get released on iOS. They just don’t keep getting supported. My favorite text editors, the best personal databases, the variety of bookmark managers. They might keep rolling out, but the majority aren’t actively developed.

This is exactly right. Even the pro-quality apps that remain in development tend to be updated inconsistently and (in comparison to Mac apps) rarely. Every time I think about going iOS-only, I realize that too many of the apps I rely on are apps … I can’t rely on.

Wednesday, June 21, 2017

Darwin's mail

I’ve just read Janet Browne’s two-volume biography of Charles Darwin, and it’s a magnificent achievement — one of the finest biographies I’ve ever read. I especially admire Browne’s judgment in knowing when to stick with the events of the life and when to pull back her camera to reveal the larger social contexts in which Darwin worked.

One of the interesting subthemes of the book concerns the way Darwin gravitated towards technologies that would allow him to pursue whatever aroused his curiosity — and whenever his curiosity was aroused he tended to become obsessive until he satisfied it. For instance, though he hated having his photograph taken, he made extensive use of photographs in writing his peculiar book on The Expression of the Emotions in Man and Animals.

Also: if you think of the various scientific institutions and journals of the Victorian era as a kind of network, he and his chief supporters (Thomas Henry Huxley above all) skillfully exploited that network to spread the news of natural selection far more quickly than it could have been expected to spread otherwise.

But as someone who has a long-standing interest in the postal service, I find these passages from the early pages of the second volume especially provocative:

Systematically, he turned his house into the hub of an ever-expanding web of scientific correspondence. Tucked away in his study, day after day, month after month, Darwin wrote letters to a remarkable number and variety of individuals. He relied on these letters for every aspect of his evolutionary endeavour, using them not only to pursue his investigations across the globe but also to give his arguments the international spread and universal application that he and his colleagues regarded as essential footings for any new scientific concept. They were his primary research tool. Furthermore, after the Origin of Species was published, he deliberately used his correspondence to propel his ideas into the public domain—the primary means by which he ensured his book was being read and reviewed. His study inside Down House became an intellectual factory, a centre of administration and calculation, in which he churned out requests for information and processed answers, kept himself at the leading edge of contemporary science, and ultimately orchestrated a transformation in Victorian thought.

Maybe it wasn't the telegraph that was the Victorian internet but rather the penny post — even if it was slower.

And Darwin was utterly unashamed to use letters to get other people (friends, family, and often strangers) to do research for him:

He also hunted down anyone who could help him on specific issues, from civil servants, army officers, diplomats, fur-trappers, horse-breeders, society ladies, Welsh hill-farmers, zookeepers, pigeon-fanciers, gardeners, asylum owners, and kennel hands, through to his own elderly aunts or energetic nieces and nephews. Many of his letters went to residents of far-flung regions — India, Jamaica, New Zealand, Canada, Australia, China, Borneo, the Hawaiian Islands — reflecting the increasing European domination of the globe and rapidly improving channels of communication.

It’s a good thing that Darwin was a wealthy man: “In 1851 he spent £20 on ‘stationery, stamps & newspapers’ (nearly £1,000 in modern terms) ... By 1877 Darwin’s expenditure on postage and stationery had doubled to £53 14s. 7d, a sum roughly equal to his butler’s annual salary.”

And here’s Browne’s incisive summary of this method:

If there was any single factor that characterised the heart of Darwin’s scientific undertaking it was this systematic use of correspondence. Darwin made the most of his position as a gentleman and scientific author to obtain what he needed. He was a skilful strategist. The flow of information that he initiated was almost always one-way. Like countless other well-established figures of the period, Darwin regarded his correspondence primarily as a supply system, designed to answer his own wants. “If it would not cause you too much trouble,” he would write. “Pray add to your kindness,” “I feel that you will think you have fallen on a most troublesome petitioner,” “I trust to your kindness to excuse my troubling you.” ...

Alone at his desk, captain of his ship, safely anchored in his country estate on the edge of a tiny village in Kent, he was in turn manager, chief executive, broker, and strategist for a world-wide enterprise. Once, in a passing compulsion, he attached a mirror to the inside of his study window, angled so that he could catch the first glimpse of the postman turning up the drive. It stayed there for the rest of his life.

Tuesday, June 20, 2017

Nature

Bruno Latour shares with Timothy Morton a determination to overthrow the concept of “nature” because he believes that that concept “makes it possible to recapitulate the hierarchy of beings in a single ordered series.” Therefore any genuine (non-anthropocentric) “political ecology” — of the sort I briefly described in a previous post — requires “the destruction of the idea of nature” (Politics of Nature, p. 25).

As for Morton, he thinks the idea of nature does one basic thing: since nature is, fundamentally and always, “a surrounding medium that sustains our being,” it follows that “Putting something called Nature on a pedestal and admiring it from afar does for the environment what patriarchy does for the figure of Woman. It is a paradoxical act of sadistic admiration” (Ecology without Nature, pp. 4, 5).

We might notice that these two reasons for rejecting the concept of Nature are incompatible. The problem for Latour is that Nature places humans within “a single ordered series,” but at the top of it; for Morton, human beings are outside the order of Nature, which “surrounds” us. I think Morton is closer to being correct than Latour is: the way that Latour describes Nature is actually more appropriate to the concept that preceded it within the discourses of Western thought, Creation. It is from the Jewish and Christian account of Creation — over which human beings have been given “dominion” — that the “single ordered series,” at the top of which humans stand, emerges. But Morton’s critique applies better to the term that has recently succeeded Nature within our talk about such matters, “the environment.”

It’s therefore tempting to say that Nature is the term that bridges the historical gap between Creation and “the environment.” But that would oversimplify the story. And the deficiency that Latour and Morton share is an ahistorical oversimplification of what Raymond Williams calls “perhaps the most complex word in the language” — and even Williams’s account seems condensed in comparison with the one that C. S. Lewis gives in his long essay in Studies in Words. But Williams gets at the really key point here when he issues this caution: “since nature is a word which carries, over a very long period, many of the major variations of human thought — often, in any particular use, only implicitly yet with powerful effect on the character of the argument — it is necessary to be especially aware of its difficulty.” This is just what Morton and Latour fail to do.

And it would do no good to claim to be interested in only one of the many meanings of the word “nature,” because, as Williams points out, they tend to bleed into one another. Nature in the sense of “that which surrounds humans and with which we interact” is always in complex, confusing relation with Nature as the whole show, all that there is, as when Pope writes: “All are but parts of one stupendous whole / Whose body Nature is and God the soul.” Here human beings are clearly part of that “body” (thus Nature is here still a synonym for Creation). But of course we can lose our awareness of our place in that “one stupendous whole,” through pride or simply through participation in technological modernity, and can then come to see our lives as somehow “unnatural”; which can then lead us in turn to seek ways to become “one with Nature.” At the very least we seek what Morton and Donna Haraway both refer to as kinship, and the words “kin” and “kind” are Anglo-Saxon equivalents of Nature: the old name for Mother Nature is Dame Kind.

But kinship is not identity, and this is precisely why “Nature,” “nature,” “natural,” “unnatural,” all drift in and out of focus and shift their meanings like holograms. To understand this you need only read King Lear, where all these notions are ceaselessly deployed and redeployed — where Lear wonders what his nature is while cursing his unnatural daughters, and Edmund reckons with the consequences of being Gloucester’s natural (i.e. bastard) son. We do not know what we are kin to or how close the kinship is; we don’t know what we are like, which means we do not understand our true nature. And so the narrow and historically insensitive definitions offered by Latour and Morton simply won’t do; they are manifestly inadequate to the situation on the ground.

But this we know: kinship is not identity. We have very good reasons to doubt that our creaturely cousins, or siblings — St. Francis’s Brother Sun and Sister Moon, among others — ask these questions. Which means that from them we can learn much but perhaps not everything we want and need to know. As Auden wrote,

But trees are trees, an elm or oak
Already both outside and in,
And cannot, therefore, counsel folk
Who have their unity to win.

Monday, June 19, 2017

Q&A

Q: Why are you reading all this stuff, anyway? It seems pretty obvious that you don’t have much sympathy for it.

A: That’s a good question. I could give several answers. For one thing, I’m not as lacking in sympathy as you think. I have respect for any good-faith attempts to reckon with the immensely vexed question of what it means to be human, and the corollary questions about how we are most healthily related to the nonhuman, and I think Morton and Haraway are really trying to figure these things out. There’s a moral urgency to their writing that I admire.

Q: Is that so? Sure doesn’t sound like it.

A: Well, yeah, I guess that last post was kind of negative. As I was reading Morton I realized that some pretty important intellectual decisions had been made before he even began his argument, and I wanted to register that protest.

Q: But if you feel that a particular philosophical project has gone astray from the start, why not just move along to thinkers and lines of thought you find more fruitful, more resonant with potential?

A: Remember that this is a work in progress: as I said in that post, I’m currently reading Morton, I’m not done. (And in a sense I’m still reading Haraway, even though I put her book aside months ago.) When you’re blogging your way through a reading project, any one post is sure to give an incomplete picture of your response and likely to give a misleading one.

Q: Fair enough, I guess, but there does seem to be a pattern to your writing about a lot of recent work. You read it, think about it, and then declare that there are resources in the history of Christian thought that address these questions — whatever the questions are — better than the stuff you’ve been reading does. So why not just read and think about those Christian figures who always seem to do it better?

A: Because those non-Christian (or non-religious, or anti-Christian, or anti-religion) thinkers often raise important questions that Christians tend to neglect, and I have to see whether there are in fact such adequate resources from within Christianity to address the questions raised by others. So far I have found that my tradition is indeed up for those challenges, but its resources are augmented and strengthened by having to address what it never would have asked on its own. I truly believe that Christianity will emerge stronger from a genuinely dialogical encounter with rival traditions, in part because it will (as it has so often in the past) adopt and adapt what is best in those traditions for its own purposes. It doesn’t always work out that way; Hank Hill was right when he said to the praise band leader “You’re not making Christianity better, you’re just making rock-and-roll worse!” But most of the time the genuinely dialogical encounter more than pays for itself.

Q: Maybe. But you often seem out of your depth with the kind of stuff you’re reading these days — and often in the mood to kick over the traces. Wouldn’t you be better off sticking with the stuff you actually have a professional level of knowledge of? Auden? Other twentieth-century religiously inclined literary figures?

A: Honestly, you may be right. I often wonder about that very point. And that’s one of the reasons — that’s the main reason, I guess — why I talk about writing books on the technological history of modernity and the Anthropocene condition but end up writing them about the stuff I have spent most of my career teaching.

Q: So why are you even here, man? Why not drop this blog and get back to work in your own field?

A: Because this is a place where I can exercise my habitual curiosity about things I don’t know much about. Because this is a kind of Pensieve for me, a way to clear away thoughts that otherwise would clog up my brain. Because every once in a while something of value coalesces out of all this randomness. I have very few readers and still fewer commenters, so I’m not getting the thrill of regular feedback, but hitting the “Publish” button offers an acceptable simulacrum of accomplishment. Those are probably not very good reasons, but they’re the reasons I have.

But I’m not gonna lie: spending so much time reading stuff with which at a deep level I’m at odds is wearing, it really is. Especially since I know that the people I’m reading — and working so hard to read fairly — are highly unlikely to treat serious Christian thinkers with comparable respect. With any respect. They don’t know that Christian theology that’s deeply and resourcefully engaged with the modern world exists, and if they knew they wouldn’t care. What I’m doing when I read thinkers like Morton and Haraway is an engagement on my part, but it’s not a conversation. That’s just what it’s like if you want to bring Christian thought to bear on modern academic discourse. You only do it if you believe you’re called to do it.

Saturday, June 17, 2017

in responsibilities begin dreams

Lately I've been reading the philosopher Timothy Morton, who has a lot to say about living in the Anthropocene, and I see that he has a forthcoming book called Humankind: Solidarity with Non-Human People. On his website the book is described thus:

What is it that makes humans human? As science and technology challenge the boundaries between life and non-life, between organic and inorganic, this ancient question is more timely than ever. Acclaimed Object-Oriented philosopher Timothy Morton invites us to consider this philosophical issue as eminently political. It is in our relationship with non-humans that we decided the fate of our humanity. Becoming human, claims Morton, actually means creating a network of kindness and solidarity with non-human beings, in the name of a broader understanding of reality that both includes and overcomes the notion of species. Negotiating the politics of humanity is the first and crucial step to reclaim the upper scales of ecological coexistence, not to let Monsanto and cryogenically suspended billionaires to define them and own them.

The book isn't out yet, but I find this description worrying. The idea that "becoming human ... actually means creating a network of kindness and solidarity with non-human beings" sounds wonderful, in the most abstractly theoretical terms, but I doubt we can solve our and the world's problems simply by "negotiating the politics of humanity" — at least if Morton means, as I suspect he does based on what I have read so far, redefining the sphere of the political to include the whole range of nonhuman creatures, including the vast and ontologically complex phenomena he calls hyperobjects. Because we don't have a great track record of treating one another well, do we? I'm all for "kindness and solidarity with non-human beings," but first things first, you know? A good many people out there can't even manage kindness and solidarity with parents whose children were murdered in school shootings.

I'm reminded here of a comment made by Maciej Ceglowski in a recent talk. Responding to the claims of Silicon Valley futurists that we're just a few decades away from ending the reign of Death and achieving immortality for at least some, Ceglowski said, "I’m not convinced that a civilization that is struggling to cure male-pattern baldness is ready to take on the Grim Reaper." Similarly, I'd encourage those who plan to achieve kinship with all living things to call me back once they can have rational and peaceable conversations with people who live on their block.

I have the same concern with Morton's project as I do with Donna Haraway's theme of "making kin," which I wrote about here. I suspect that much of the appeal of seeking communion with pigeons, plutonium, and black holes (to use examples taken from Haraway and Morton) is that pigeons, plutonium, and black holes don't talk, tweet, or vote. If projects like Haraway's and Morton's don't reckon seriously with this problem, then they are likely to be in equal parts frivolous and evasive.

Such projects raise, for me, a further question, which is whether the language of kinship and solidarity is the right language to accomplish what its users want. Because you can achieve a feeling of kinship or solidarity without taking on any particular responsibility for the well-being of another creature. Here the old Christian language of "stewardship" seems to me to have greater force, and a force that is especially applicable to the Anthropocene moment: We do not own this world, but it has been entrusted to our care, and only if we seriously strive to live up to the terms of that trust will we have a chance of achieving true kinship and solidarity with all that we care for. It seems to me that Yeats had it backwards: it is not the case that "in dreams begin responsibilities," but in responsibilities begin dreams.



After posting this I realized that I'm not done. Morton, Haraway, Graham Harman, and others working along similar lines are keen to bridge the gaps between humans and nonhumans — or, perhaps it would be better to say, deny the validity of the gaps that human beings perceive to exist between themselves and the rest of the world. They thus conclude that we require a new philosophical orientation to the nonhuman world, though one that employs quite familiar concepts (kinship, solidarity, intention, purpose, desire) — those concepts are just deployed in relation to beings/objects which formerly were thought to be outside the scope of such terms. We're not used to thinking that hammers have desires and black holes have consciousness.

This strategy of employing familiar language in unfamiliar contexts gives the appearance of being radical but may not be quite that. It strikes me as being largely a reversal of Skinnerian behaviorism: the behaviorists said that human beings are nothing special because they're just like animals and plants, responding to stimuli in law-governed ways; now the object-oriented ontologists say that human beings are nothing special because animals and plants (and hammers and black holes) all possess the traits of consciousness and desire that we have traditionally believed to be distinctive to us. The goal of the philosophical redescription seems to be the same: to dethrone humanity, to get us to stop thinking of ourselves as sitting at the pinnacle of the Great Chain of Being.

And underlying this goal is the assumption (often stated explicitly by all these figures, I think) that our belief in our unique and superior status among the rest of the beings/objects in the world has led us to abuse those beings/objects for our own enrichment or amusement.

I think this whole project is unlikely to bear the fruit it wants to bear, and I have several reasons for thinking so, which I will just gesture at here and develop in later posts.

(1) I doubt the power of philosophical redescription. Changes in our practices will lead to changes in description, not the other way around. The failure to recognize the direction that the causal arrow points is the signal failure of people who, being symbol manipulators by profession, think that the manipulation of symbols is the key to All Good Things. (I have written about this often, for instance, here.)

(2) I don't think we have taken the role of Apex Species on the Great Chain of Being too seriously; I think we have failed to take it seriously enough.

(3) I believe that all of these difficulties can best be addressed by living into certain ancient ways of thinking — which, in our neophilic age, is a hard sell, I know.

People will say, "Go back to Christianity? We tried that and it got us into this situation." To which the obvious rejoinder is the Chestertonian one that Christianity hasn't been tried and found wanting, it has been found difficult and left untried. But perhaps more to the point: everything has failed. Every day I hear lefties say that capitalism has been tried and didn't work, and righties say that socialism has been tried and didn't work — to which each side retorts that its preferred system hasn't really been tried, hasn't been implemented properly and thoroughly.

And all of these people are correct. Every imaginable system has been put into play with partial success at best, and the problems result from incomplete or half-hearted implementation of that system and from flaws inherent to it — which flaws are precisely what make people half-hearted or incomplete in their implementation of it. Everything has been tried and found wanting, and found difficult and left untried. This is the human condition. Attempts to remedy social and personal ills always run aground on both the sheer complexity of our experience and our mixed and conflicting desires (mixed and conflicted both within ourselves and in relation to one another).

New vocabularies, or even the deployment of old vocabularies in supposedly radical new ways, won't fix that. Which is not to say that improvements in conditions are impossible.

Much more on all this later.

Friday, June 16, 2017

digital culture through file types

This is a fabulous idea by Mark Sample: studying digital culture through file types. He mentions MP3, GIF, HTML, and JSON, but of course there are many others worthy of attention. Let me mention just two:

XML: XML is remarkably pervasive, providing the underlying document structure for things ranging from RSS and Atom feeds to office productivity software like Microsoft Office and iWork — but secretly so. That is, you could make daily and expert use of a hundred different applications without ever knowing that XML is at work under the hood.
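
To make the "under the hood" point concrete, here is a minimal sketch, using only Python's standard library, of what any RSS reader quietly does with the XML it fetches. The two-item feed is invented for illustration; a real reader would download the document over HTTP before parsing it.

    # A minimal sketch of the XML hiding under an RSS feed.
    # The feed string below is invented for illustration only.
    import xml.etree.ElementTree as ET

    feed = """<?xml version="1.0"?>
    <rss version="2.0">
      <channel>
        <title>Text Patterns</title>
        <item><title>Nature</title><link>http://example.com/nature</link></item>
        <item><title>Q&amp;A</title><link>http://example.com/qa</link></item>
      </channel>
    </rss>"""

    root = ET.fromstring(feed)
    for item in root.iter("item"):
        # Each "post" the user sees is just an <item> element.
        print(item.findtext("title"), "->", item.findtext("link"))

The friendly list of posts a feed reader displays is the output of a loop like this one; the XML itself never surfaces, which is exactly what makes its pervasiveness invisible.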

Text: There's a great story to be told about how plain text files went from being the most basic and boring of all file types to a kind of lifestyle choice — a lifestyle choice I myself have made.

If you have other suggestions, please share them here or with Mark.

men ignoring (as well as interrupting) women

The New York Times is wrong about a great many things these days, but it's certainly right about this: men really do interrupt women All. The. Time. (And the NYT has covered this story before.) I have seen the phenomenon myself in many faculty meetings over the years, and it's especially painful when a woman sits in silence through 45 minutes of a meeting, finally decides to say something — and is instantly cut off.

I have often talked too much in meetings, but I don't think I do this — women who have worked with me, please let me know if I'm wrong. Please. (Could you do it in an email instead of in the comments below, though? That would be a kindness.) But interrupting is just one of many ways confident and articulate men — or confident men who just think they're articulate — can sideline their female colleagues.

Once, some years ago now, a younger colleague asked me to join her for lunch. She wanted to talk to me about something: the fact that I had not expressed interest in or support of her scholarship, even though it overlapped with my own in some areas. My first thought was that I really did admire her work and thought; but that was immediately followed by the realization that I had never told her so. I had completely failed to offer the support and encouragement that would have meant a lot to her as someone making her way in our department and our institution. So I apologized, and asked if she would forgive me, which of course she did.

In the aftermath of that lunch meeting, I thought a lot about why I had so manifestly failed my colleague, and I've continued to think about it since. I don't fully understand the complexities of the situation, and I may be looking for self-exculpation here, but I do think I've identified one element of the problem, and it involves sexually-segregated socializing.

A number of my younger male colleagues had expressed gratitude for my support of them, and when I thought about how I had expressed that support — the advice I had given, the responses to their work — I realized that that had rarely happened on campus, in our offices or hallways, but rather in coffee shops and pubs. When we met on free mornings for coffee to chat as we got through some grading or diminished the size of our inboxes, or met in the evenings after work for a pint or two — that's when I got the chance to say some supportive things.

But while we often asked our female colleagues to join us for such outings, they rarely did. I am honestly not sure whether they just weren't interested, or had conflicting obligations, or never heard enough from us to make it perfectly clear that their presence was really wanted and that we had no desire to create a Boys' Club. But I do know that I should have been aware of these dynamics and found other ways to let the women in my circles know that I valued their work. Once that single colleague had the boldness to call my attention to my shortcomings in this area, I made an effort to compensate — though I don't know that I ever did enough.

I especially want to ask my fellow academics: What do you think about the account I've given? Does it sound plausible? What am I missing, either about myself or about the general social dynamics?

frequency of citation does not equal quality of research

Google Scholar has just added a set of what it calls Classic papers: "Classic papers are highly-cited papers in their area of research that have stood the test of time. For each area, we list the ten most-cited articles that were published ten years earlier." The problem here is the equating of frequent citation with "standing the test of time." As it happens, many scholarly papers retracted by the journals where they were published continue to be widely cited anyway. Frequency of citation is not a good proxy for "classic" status.

Thursday, June 15, 2017

the oven bird imagines the future

This provocative post by Alec Ryrie asks an important question: Why is our culture’s dystopian imagination so absolute? Drawing on a recent history thesis by Olive Hornby, Ryrie describes outbreaks of plague in early-modern England during which between a third and half of the people in some communities died. Not all but a handful, not 99%, but a little less than half, maybe. Enough to inflict profound damage on the emotional, spiritual, and economic life of a place — but not enough to destroy it altogether. Ryrie writes:

Most disasters are not absolute. They are real, devastating, and consequential, but they do not wipe the slate clean. Human beings are resilient and are also creatures of habit. You can panic, but you can’t keep panicking, and once you’ve finished, you tend to carry on, because what else is there? The real catastrophes of the West in the past century (world wars, the Spanish flu) have been of this kind: even as the principal imagined one (nuclear war) is of the absolute variety.

We need to learn to be better at imagining serious but non-terminal disasters, the kind which are actually going to hit us. (For a recent cinematic example, the excellent and chilling Contagion.) That way, when we confront such things, we will be less tempted simply to say ‘Game over!’ and to attempt to reboot reality, and will instead try to work out how to deal with real, permanent but not unlimited damage.

In such a case you can’t say “Game over” — he’s quoting Aliens there — because the game isn’t over. The game goes on, in however damaged a form, leaving us all forced to confront the truth taught us by the oven bird:

There is a singer everyone has heard,
Loud, a mid-summer and a mid-wood bird,
Who makes the solid tree trunks sound again.
He says that leaves are old and that for flowers
Mid-summer is to spring as one to ten.
He says the early petal-fall is past
When pear and cherry bloom went down in showers
On sunny days a moment overcast;
And comes that other fall we name the fall.
He says the highway dust is over all.
The bird would cease and be as other birds
But that he knows in singing not to sing.
The question that he frames in all but words
Is what to make of a diminished thing.

Wednesday, June 14, 2017

literary fiction and climate change, revisited

Here we have Siddhartha Deb making precisely the same inexplicable error that Amitav Ghosh, whom he quotes, made last year — a mistake on which I commented at the time. The thought sequence goes like this:

1) Declare yourself interested only in “literary” fiction;

2) Define literary fiction as a genre concerned only with the quotidian reality of today;

3) Complain that literary fiction is deficient in imaginative speculation about the realities and possibilities of climate change.

But if you have already conflated “literary fiction” and “fiction” — note how Deb uses the terms interchangeably — and have defined the former as having a “need to keep the fluky and the exceptional out of its bounds, conceding the terrain of improbability — cyclones, tornadoes, tsunamis, and earthquakes — to genre fiction,” then you have rendered your thesis unfalsifiable. Any story that engages with “the fluky and the exceptional” (or, riskily, the future) ipso facto becomes “genre fiction” and therefore falls outside the bounds of your inquiry.

This self-blinkering leads Deb into some very strange statements:

In the United States too, even well meaning liberal fiction, often falling under the rubric of cli-fi, reveals itself as incapable in grappling with [our steadfast rapaciousness]. This is perhaps because to think of modern life as a failure, and to question the idea of progress, requires an extremism of vision or a terrifying kind of independence. An indie bestseller like Emily St. John Mandel’s Station Eleven, set in an eco-apocalypse, features rhapsodies on the internet and electricity. Marcel Theroux in Far North includes a paean to modern flight as one of the finest inventions of “our race,” even though the effect of air travel on carbon emissions is quite horrific.

Let me just pause to note that Deb has a rather expansive notion of “the United States,” given that Emily St. John Mandel is Canadian and Marcel Theroux was born in Uganda and educated wholly in England. Setting that aside, Deb’s description of Mandel’s book is farcically inaccurate. It is true that there are characters in the book, some among the handful of people who have survived a plague that killed 99.9% of humanity, who miss the internet and electricity. Does Deb think that in such a world nobody would miss those technologies? Or is it his view that a truly virtuous writer should make a point of suppressing such heretical notions?

Either position is silly. Of course people in such a world would miss technological modernity, for good reasons and bad. At one point we get “an incomplete list” of what’s gone:

No more diving into pools of chlorinated water lit green from below. No more ball games played out under floodlights. No more porch lights with moths fluttering on summer nights. No more trains running under the surface of cities on the dazzling power of the electric third rail. No more cities. No more films, except rarely, except with a generator drowning out half the dialogue, and only then for the first little while until the fuel for the generators ran out, because automobile gas goes stale after two or three years. Aviation gas lasts longer, but it was difficult to come by.

No more screens shining in the half-light as people raise their phones above the crowd to take pictures of concert stages. No more concert stages lit by candy-colored halogens, no more electronica, punk, electric guitars.

No more pharmaceuticals. No more certainty of surviving a scratch on one's hand, a cut on a finger while chopping vegetables for dinner, a dog bite....

No more countries, all borders unmanned.

No more fire departments, no more police. No more road maintenance or garbage pickup. No more spacecraft rising up from Cape Canaveral, from the Baikonur Cosmodrome, from Vandenberg, Plesetsk, Tanegashima, burning paths through the atmosphere into space.

No more Internet. No more social media, no more scrolling through litanies of dreams and nervous hopes and photographs of lunches, cries for help and expressions of contentment and relationship-status updates with heart icons whole or broken, plans to meet up later, pleas, complaints, desires, pictures of babies dressed as bears or peppers for Halloween. No more reading and commenting on the lives of others, and in so doing, feeling slightly less alone in the room. No more avatars.

Again: Does Deb think people in a devastated world wouldn't think this way? Or does he think it wrong to give voice to such memories and reflections?

Does he think that such a list offers nothing but regret?

The central figures of Station Eleven are the members of a group called the Traveling Symphony. They play classical music and perform plays.

All three caravans of the Traveling Symphony are labeled as such, THE TRAVELING SYMPHONY lettered in white on both sides, but the lead caravan carries an additional line of text: Because survival is insufficient.

When I first read Station Eleven I had mixed feelings about it, but in the two years since I have thought often about the Traveling Symphony and what it achieved, what it reminded people of, what it made possible. The book offers, especially through the Symphony, a moving and at times profound meditation on the complex relationships that obtain among technology, art, and human flourishing. I’d strongly recommend that Siddhartha Deb read it.

And he should read some Kim Stanley Robinson while he’s at it.

play as work


Peter Suderman writes about playing the video game Mass Effect: Andromeda,

The game boasts an intricate conversation system, and a substantial portion of the playtime is spent talking to in-game characters, quizzing them for information (much of which adds color but is ultimately irrelevant), asking them for assignments, relaying details of your progress, and then finding out what they would like you to do next.

At a certain point, it started to feel more than a little familiar. It wasn't just that it was a lot like work. It was that it was a lot like my own work as a journalist: interviewing subjects, attempting to figure out which one of the half-dozen questions they had just answered provided useful information, and then moving on to ask someone else about what I had just been told.

Eventually I quit playing. I already have a job, and though I enjoy it quite a bit, I didn't feel as if I needed another one.

But what about those who aren't employed? It's easy to imagine a game like Andromeda taking the place of work.

You should read the whole article, because it’s a fascinating and deeply reflective account of the costs and benefits of a world in which “about three quarters of the increase in leisure time among men since 2000 has gone to gaming.” What I love about Peter’s narrative is that it is sure to make video-game alarmists less alarmed and video-game enthusiasts less enthusiastic.

I have a thousand ideas and questions about this essay, but I’ll mention just one line of thought here: I find myself wondering how, practically speaking, video games got this way. Did game designers learn through focus groups and beta testing that games with a significant work-like component were more addictive? Or were they simply answering to some need in their own psyches? I’m guessing that the correct answer is: some of both. But in any case, there’s a strong suggestion here that human beings experience a deep need for meaningful work, and will accept meaningfulness in small quantities or in fictional form rather than do without it.

Tuesday, June 13, 2017

Penguin Café

Another music post...

Nearly thirty years ago now I bought a CD on pure impulse, knowing almost nothing about the performers: When in Rome, by the Penguin Café Orchestra. You’ve probably heard some of their songs: “Perpetuum Mobile” — in 15/8 time! — or “Telephone and Rubber Band”, though maybe not my favorite of their songs, “Dirt.” The style is difficult to describe and definitely doesn't work for everyone. Simon Jeffes (who founded the PCO, wrote its songs, and played whatever instruments needed playing for a given tune) called their work “modern semi-acoustic chamber music” and, in a different context, “imaginary folklore.” I like that latter description: I imagine a hidden land somewhere populated by people of English, Celtic, Portuguese, and Venezuelan descent, playing away on instruments they found in their grandparents’ attics. As I say, not for everyone, but I loved it from the start.

When Simon Jeffes died of a brain tumor in 1997, at the age of 48, it seemed that the story of the PCO was over. But a one-off reunion concert on the tenth anniversary of his death, featuring his son Arthur, left a great many people wanting more. So Arthur Jeffes (an archeologist by training) got some musicians together and founded Penguin Café to play his father’s music and some of his own. The results are getting more interesting — for instance, in “Cantorum,” an attempt to reproduce some of the characteristic rhythms and repetitions of electronic music with analog instruments. Check it out:

Monday, June 12, 2017

Nils Frahm



A few years ago the German pianist/composer/producer Nils Frahm fell out of bed and broke his thumb. As he later recalled,

All of a sudden I had so much time, an unexpected holiday. I cancelled most of my schedule and found myself being a little bored. Even though my doctor told me not to touch a piano for a while, I just couldn’t resist. I started playing a silent song with 5 fingers on my right and the remaining 4 on my left hand. I set up one microphone and recorded another tune every other night before falling asleep.

If you click on the link above, you’ll see that you can download the resulting recording for free. It's called Screws in honor of what held his thumb together as it was healing.

I like Frahm’s electronic music very much, but it’s his solo piano work that really captivates me. He often uses an upright piano that he has modified slightly by adjusting the size and texture of the felts, though his wonderful 2015 record Solo was recorded on a unique 12-foot-tall piano called the Klavins M370. He can play loud and fast, but his best music is slow and contemplative, and has reminded many people of Erik Satie’s Gymnopédies, though when his improvisations get chordal they remind me a bit of Keith Jarrett’s quieter moments.

Maybe the most important predecessor to Frahm, though, is Glenn Gould — not in pianistic technique, but in recording technique. In his recording sessions, Gould famously insisted that the microphones be placed as close to the piano strings as possible, yielding a very intimate sound — one which was intensified, I think, by his spare use of pedals. Try listening to a random piece from Gould’s version of Bach’s Well-Tempered Clavier and then compare it to, say, Sviatoslav Richter’s (equally great) version, and you’ll immediately envision Richter playing on a big stage in a great concert hall. Gould’s music is for the private listener.

And Frahm takes this emphasis on privacy even further. He has fitted one of his pianos with a pickup that sits inside the instrument, so that you can hear the mechanism moving as the hammers lift and drop and as the pedals engage and disengage. You’re reminded that pianos are made largely of wood — Frahm seems to be playing a living creature rather than a thing. I am not certain that in recording Screws he had the mic inside the piano, but it sounds like it to me; and there are ambient noises from the room in which he recorded it too. In an interview a few years back he commented: “There is something very beautiful about a mono recording of a piano. ‘Screws’ which I just recorded was with one microphone, an old condenser, fed through an EMT stereo reverb and that was it. That was the whole process.” Simple, analog, warm, quiet, private. (Similarly, here’s Nils with one of his favorite toys.)

However: the benefits of such simplicity and warmth are not so easily accessed by the listener. Listening to Frahm’s solo music on a bog-standard pair of earbuds will not allow you to discern many of the subtleties that make it beautiful, and will reveal none of them if there’s any noise at all in the room where you’re listening. My hearing is not nearly as good as it once was, thanks to a youth misspent in too much rock-and-roll played at far too high a volume, but I’ve found that to get the most out of Frahm’s music I benefit from the lossless 24-bit versions he offers on his site, played through a DAC headphone amp and a very good set of headphones. So, as so often in our world today: simplicity and warmth are expensive, and increasingly available only to a privileged few.

But in the best way available to you, check out Nils Frahm’s music. It’s truly remarkable.

Tuesday, May 30, 2017

alert: latency in posting


Friends: My beloved and I are about to take a road trip to Southern California, where next week I'll be leading a faculty seminar at Biola University. We'll take our time driving out there and driving back, because I've never seen the desert Southwest and plan to enjoy taking some of it in. Blogging will resume soon after our return.

Monday, May 29, 2017

iOS users and meta-users

The most recent episode of Canvas — the podcast on iOS and "productivity" (a word I hate, but never mind that for now) hosted by Federico Viticci and Fraser Speirs — focused on hopes for the upcoming iOS 11. Merlin Mann joined the podcast as a guest, and the three of them went around and talked about features they'd like to see introduced to iOS.

Some examples: Viticci wants the ability to record, in video and sound, actions performed on the iPad; Speirs imagines having a digital equivalent of a transparent sheet to draw down over the iPad screen on which he could write with an Apple Pencil, thereby marking up, as it were, things that are happening in an app; and Merlin Mann, who has 450 apps on his iOS devices, wishes for the ability to batch-delete apps, for example, ones that he hasn't used in two years or more.

Listening to the episode, I thought: These aren't iOS users, not even power users, they're meta-users. Viticci writes and talks about iOS for a living; Speirs teaches students how to use iPads; Mann makes his way in life talking about productivity, especially (though not only) on digital devices. Their iOS wish-lists make them the edgiest of edge-cases, because their uses are all about the uses of others.

As for me, a user neither power nor meta, many of my wishes for iOS involve things that Apple can't do on its own. For instance:

  • I wish Bluetooth worked better, but Bluetooth is a standard Apple doesn't control. No matter how well Apple handles its implementation of the standard, they can't control how well device manufacturers handle their implementations. But in any case, given how long Bluetooth has been around, it really, really ought to work better than it does.
  • This site is on Blogger (sigh), and Google has withdrawn their iOS Blogger app and made sure that the Blogger UI doesn't render properly on Safari for iOS — it seems that they're trying to drive iOS users towards Chrome. (Also, there are no good blogging apps for iOS: some are abandonware, some have hideously ugly and non-intuitive UIs, and one, Blogo, demands that you sign up for an account and turn over your data to its owners.)
  • Many, many websites just don't render properly on an iPad, and I expect will never do so. Which makes me wonder what Apple can do on its end (besides enabling Reader View, which is great) to improve poor rendering. E.g.: One of the most lasting problems in iOS involves selecting text, which can be extremely unpredictable: sometimes when you touch the screen nothing selects, while at other times when you're trying to select just one word the whole page gets selected instead. But these problems almost always happen on websites, and are a function, I think, of the poor rendering in Safari for iOS. Is there anything that Apple can do about this, I wonder?

Among the things that Apple can definitely do something about, here are a few wishes from me:

  • When you're connected to a wi-fi network and the signal gets weak or intermittent, and there's another known network with a stronger signal available, your iOS device should switch to that better network automatically. Optimize for best connection. (A toy sketch of what I mean follows this list.)
  • Apple should strongly push developers to implement Split View.
  • Apple should strongly push developers of keyboard-friendly apps to implement keyboard shortcuts — and if they have Mac apps, the same shortcuts on both platforms (the people at Omni are great at this).
  • This is perhaps pie-in-the-sky, but I crave extensive, reliable natural-language image searching in Photos. But I expect we'll get this from Google before we get it from Apple.
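On the wi-fi wish: what I have in mind is nothing fancier than the following toy sketch in Python. It is purely illustrative — iOS exposes no such API, and the signal threshold is an assumption of mine — not a description of anything Apple actually does:

    # Hypothetical rule: hop to the strongest *known* network once the
    # current signal is no longer good enough. The threshold is an assumption.
    USABLE_RSSI = -75  # dBm

    def pick_network(current, visible, known):
        """current: (ssid, rssi); visible: list of (ssid, rssi); known: set of ssids."""
        ssid, rssi = current
        if rssi >= USABLE_RSSI:
            return ssid  # current connection is fine; don't churn
        better = [(s, r) for s, r in visible if s in known and r > rssi]
        # Stay put if no known, stronger network is in range;
        # otherwise take the strongest known network.
        return max(better, key=lambda sr: sr[1])[0] if better else ssid

    # e.g. pick_network(("Home-2.4", -82),
    #                   [("Home-2.4", -82), ("Home-5", -58)],
    #                   {"Home-2.4", "Home-5"})  ->  "Home-5"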

Sunday, May 28, 2017

"major collegiate disorders"

A follow-up to yesterday's post...

Of course it's possible to reach too far into the past to get context for current events in the university, but this book certainly offers some interesting food for thought:


I love the fact that there was something called the Conic Section Rebellion.

Anyone who said that nothing like this could happen today would, I think, be correct; but I leave as a potentially illuminating exercise for my readers this question: Why couldn't it happen today?

Saturday, May 27, 2017

getting context, and a grip

Several long quotations coming. Please read them in full.

James Kirchick writes,

Of the 100 or so students who confronted [Nicholas] Christakis that day, a young woman who called him “disgusting” and shouted “who the fuck hired you?” before storming off in tears became the most infamous, thanks to an 81-second YouTube clip that went viral. (The video also — thanks to its promotion by various right-wing websites — brought this student a torrent of anonymous harassment). The videos that Tablet exclusively posted last year, which showed a further 25 minutes of what was ultimately an hours-long confrontation, depicted a procession of students berating Christakis. In one clip, a male student strides up to Christakis and, standing mere inches from his face, orders the professor to “look at me.” Assuming this position of physical intimidation, the student then proceeds to declare that Christakis is incapable of understanding what he and his classmates are feeling because Christakis is white, and, ipso facto, cannot be a victim of racism. In another clip, a female student accuses Christakis of “strip[ping] people of their humanity” and “creat[ing] a space for violence to happen,” a line later mocked in an episode of The Simpsons. In the videos, Howard, the dean who wrote the costume provisions, can be seen lurking along the periphery of the mob.

Of Yale’s graduating class, it was these two students whom the Nakanishi Prize selection committee deemed most deserving of a prize for “enhancing race and/or ethnic relations” on campus. Hectoring bullies quick to throw baseless accusations of racism or worse; cosseted brats unscrupulous in their determination to smear the reputations of good people, these individuals in actuality represent the antithesis of everything this award is intended to honor. Yet, in the citation that was read to all the graduating seniors and their families on Class Day, Yale praised the latter student as “a fierce truthteller.”

Let's look at these episodes at Yale in relation to something that happened at Cornell nearly fifty years ago. Paul A. Rahe was an undergraduate at Cornell then, and tells the story:

At dawn on April 18, 1969 — the Saturday of Parents’ Weekend and the day after the student conduct tribunal issued a reprimand (as minor a penalty as was available) to those who had engaged in the “toy-gun spree” — a group of black students, brandishing crowbars, seized control of the student union (Willard Straight Hall), rudely awakened parents sleeping in the guest rooms upstairs, used the crowbars to force open the doors, and ejected them from the union.

Later that day, they brought at least one rifle with a telescopic sight into the building. On Sunday afternoon, the administration agreed to press neither civil nor criminal charges and not to take any other measures to punish those who had occupied Willard Straight Hall, to provide legal assistance to anyone who faced civil charges arising from the occupation, and to recommend that the faculty vote to nullify the reprimands issued to those who had engaged in the “toy-gun spree.” Upon hearing that this agreement had been reached, 110 black students marched out of Willard Straight Hall in military formation to celebrate their victory, carrying more than seventeen rifles and bands of ammunition.

The next day, when the faculty balked and stopped short of accepting the administration’s recommendation, one AAS leader went on the campus radio and threatened to “deal with” three political science professors and three administrators, whom he singled out by name, “as we will deal with all racists.” Finally, on Wednesday, April 23, the faculty met at a special meeting and capitulated to the demands of the AAS, rescinding the reprimand issued by the student conduct tribunal and calling for a restructuring of the university.

At the very least, the Cornell story should give us some context for thinking about what happened at Yale last year. More generally, we should remember that the ceaseless hyperventilation of social media tends to make us think that American culture today is going through a unique process of dissolution. Rick Perlstein is one of my least favorite historians, but he does well to set us straight on that:

“The country is disintegrating,” a friend of mine wrote on Facebook after the massacre of five policemen by black militant Micah Johnson in Dallas. But during most of the years I write about in Nixonland and its sequel covering 1973 through 1976, The Invisible Bridge, the Dallas shootings might have registered as little more than a ripple. On New Year’s Eve in 1972, a New Orleans television station received this message: “Africa greets you. On Dec. 31, 1972, aprx. 11 pm, the downtown New Orleans Police Department will be attacked. Reason — many, but the death of two innocent brothers will be avenged.” Its author was a twenty-three-year-old Navy veteran named Mark James Essex. (In the 1960s, the media had begun referring to killers using middle names, lest any random “James Ray” or “John Gacy” suffer unfairly from the association.) Essex shot three policemen to death, evading arrest. The story got hardly a line of national attention until the following week, when he began cutting down white people at random and held hundreds of officers at bay from a hotel rooftop. Finally, he was cornered and shot from a Marine helicopter on live TV, which also accidentally wounded nine more policemen. The New York Times only found space for that three days later.

Stories like these were routine in the 1970s. Three weeks later, four men identifying themselves as “servants of Allah” holed up in a Brooklyn sporting goods store with nine hostages. One cop died in two days of blazing gun battles before the hostages made a daring rooftop escape. The same week, Richard Nixon gave his second inaugural address, taking credit for quieting an era of “destructive conflict at home.” As usual, Nixon was lying, but this time not all that much. Incidents of Americans turning terrorist and killing other Americans had indeed ticked down a bit over the previous few years — even counting the rise of the Black Liberation Army, which specialized in ambushing police and killed five of them between 1971 and 1972.

In Nixon’s second term, however, they began ticking upward again. There were the “Zebra” murders from October 1973 through April 1974 in San Francisco, in which a group of Black Muslims killed at least fifteen Caucasians at random and wounded many others; other estimates hold them responsible for as many as seventy deaths. There was also the murder of Oakland’s black school superintendent by a new group called the Symbionese Liberation Army, who proceeded to seal their militant renown by kidnapping Patty Hearst in February 1974. Then, in May, after Hearst joined up with her revolutionary captors, law enforcement officials decimated their safe house with more than nine thousand rounds of live ammunition, killing six, also on live TV. Between 1972 and 1974 the FBI counted more than six thousand bombings or attempted bombings in the United States, with a combined death toll of ninety-one. In 1975 there were two presidential assassination attempts in one month.

Let's pause for a moment to think about that: More than six thousand bombings or attempted bombings in two years.

So, is the country disintegrating? In comparison with the Nixon years: No. Not even with Donald Ivanka Kushner Trump in charge. Which is not to say that it couldn't happen, only that it hasn't yet happened, and if we want to avoid further damage we would do well to study the history of fifty years ago with close attention. For the national wounds that were opened in the Sixties may have scabbed over from time to time in the decades since, but they have never healed.

And in relation specifically to the university, we might ask some questions:

  • How significant is it that most of the people running our universities today were undergraduates when things like the Cornell crisis happened?
  • If it is significant, what is the significance?
  • To what extent are the social conflicts that plague some universities today continuations of the conflicts that plagued them fifty years ago?
  • If universities today seem, to many critics, to have lost their commitment to free speech and reasoned disagreement, have they abandoned those principles any more completely than they did at the height of those earlier student protests?
  • What happened in the intervening decades? Did universities recover their core commitments wholly, or partially, or not at all?
  • How widespread are protests (and the "coddling" of protestors) today in comparison to that earlier era?
  • What needs to be fixed in our universities?
  • Are universities that have gone down this particular path — praising and celebrating students who confront, berate, and in some cases threaten faculty — fixable? (A question only for those who think such behavior is a bug rather than a feature.)

Vital questions all, I think; but not ones that can be answered in ignorance of the relevant history.