Text Patterns - by Alan Jacobs

Tuesday, June 27, 2017

a great silence cometh

We’re going to have radio silence here at Text Patterns for a while — I’m coming to the end of a year of research leave and have a number of projects, small and large, that I need to wrap up before school resumes in August.

One of those projects is related to my previous post: Our social media put us in an odd situation in relation to words, our own and others’. People say things they don't mean, or that they declare afterwards that they don't mean; they retweet or report or link to statements that they later (when challenged) say they don't endorse. Every day we see people positioning themselves in oblique relation to language. Mikhail Bakhtin — early in his career, which is to say, early in the history of the Soviet Union, when people’s words were already landing them in prison — wrote of the moral imperative of being answerable for one’s language; Wendell Berry, in one of his most important essays, wrote of the inestimable value of “standing by words,” standing by what we say. It’s hard to imagine concepts more foreign to the way language is used on social media today. So I’m writing an essay about the restoration of answerability.

I’m also in the early stages of writing about the changing fortunes of the concept of nature, something that I discussed briefly in a few recent posts.

As usual, I’ll be posting images, links, and very brief reflections (often on Christian matters) at my personal blog.

I’m continuing to think about Anthropocene theology, but that’s going to be a very long-term project.

The next book I write, by the way, will be called Auden and Science.

And then I have various responsibilities concerning the books I’ve already written. How to Think: A Guide for the Perplexed — that’s the subtitle for the U.K. edition, which I like better than the one for the American edition — will be out this fall, and I’ll be involved in the publicity for that. Also, I’ll be blogging about some of the ideas of the book on that site.

The Year of Our Lord 1943: Christian Intellectuals and Total War will be out next year from Oxford UP, and I have final revisions to do for that, plus all the questionnaire-answering and form-filling that accompany publication.

AND: I’ll need to start prepping for my fall classes! By the time I return I’ll have been fifteen months out of the classroom, and that’s a long time. I am itching to reconnect with students, believe it or not.

But the research leave was great. I got a lot of work done and have been refreshed by the opportunity to read and think and reflect. This past year also marked the end of a decade of illness in my family, which means that for the past few months I have had the strange and wonderful experience of not having to spend a lot of time caring for a sick loved one. (That’s taken me a while to get used to, oddly enough. I still wake up in the morning casting my mind around for the first thing I need to do for ... oh wait, everyone’s fine!) So I’ll be headed back into teaching with a lot of energy. Lord, may it please you that the next few years are like this one — for me personally, I mean. I’d prefer not to have any more years like this one in American politics.

Ciao for now!

so many questions

I have many questions — real, deep, sincere questions — about this.

  • Does Katherine Dettwyler really believe that a person deserves torture and death for stealing a poster?
  • Or does she, rather, believe that a person deserves torture and death for being a clueless privileged culturally-imperialist white male?
  • Or does she, perhaps, believe that a person deserves not torture and death but maybe arrest for being a clueless privileged culturally-imperialist white male, and just wrote carelessly?
  • Is she right that a significant number of her white male students "think nothing of raping drunk girls at frat parties and snorting cocaine, cheating on exams, and threatening professors with physical violence"?
  • How do any or all of these beliefs affect her ability to do her job as a teacher?
  • How many college teachers share these beliefs?
  • Is this a situation in which no "beliefs" as such are involved, but Dettwyler's Facebook post was rather an unfocused spur-of-the-moment venting arising from frustration with a lousy job or lousy working conditions?
  • If your answer to the previous question is "Yes" or "Probably yes," how do you account for the fact that Dettwyler seems to have made very similar comments on blogs?
  • How might a tendency to go off on unfocused spur-of-the-moment venting arising from frustration with a lousy job or lousy working conditions affect a person's ability to do her or his job as a teacher?
  • Did the University of Delaware ask Dettwyler for an explanation of her post and/or comments?
  • Did the university ask her to apologize for them?
  • Suppose that she did apologize — would that be sufficient for her to keep her job?
  • What would the University of Delaware have done if a tenured faculty member had made precisely the same comments?
  • As those outside the academy go apoplectic over these matters, those inside the academy shrug. Is shrugging enough?
  • What does it mean that so many people these days wish death on strangers whom they dislike or disagree with?
  • Should we feel better when we're told that people don't really mean it when they, for instance, respond to a tweet expressing a view about English grammar by wishing an entire generation of Americans dead?
  • Like, if you don't really in your heart of hearts want those people you disagree with to die in a fire or be raped and tortured, then we don't have a problem? Is that the argument?
  • Presumably all of the above, and worse, has been said to Katherine Dettwyler since her Facebook post went public — does that help?
  • Does vigilante vengeance have limits?
  • Even if it's just verbal vengeance?
  • Is forgiveness a social good?

Monday, June 26, 2017

historical knowledge and world citizenship

Few writers have meant as much to me, as consistently, over many years as Loren Eiseley — I say a bit about my teenage discovery of him in this essay. I am now writing a piece on the new Library of America edition of his essays for Education and Culture, and, man, is it going to be hard for me to keep it below book-length. I keep coming across little gems of provocation and insight. In lieu of buttonholing my family and making them listen to me read passages aloud — though I’m not saying I’ll never do that — I may post some choice quotations here from time to time.

Here’s a wonderful passage from the early pages of The Firmament of Time, Eiseley’s lapidary and meditative history of geology, or rather of what the rise of modern geology did to the human experience of time. I like it because it illuminates certain blind spots of today’s academics — and people more generally — and because it reminds us just how essential the study of history is.

Like other members of the human race, scientists are capable of prejudice. They have occasionally persecuted other scientists, and they have not always been able to see that an old theory, given a hairsbreadth twist, might open an entirely new vista to the human reason. I say this not to defame the profession of learning but to urge the extension of education in scientific history. The study leads both to a better understanding of the process of discovery and to that kind of humbling and contrite wisdom which comes from a long knowledge of human folly in a field supposedly devoid of it. The man who learns how difficult it is to step outside the intellectual climate of his or any age has taken the first step on the road to emancipation, to world citizenship of a high order.

He has learned something of the forces which play upon the supposedly dispassionate mind of the scientist; he has learned how difficult it is to see differently from other men, even when that difference may be incalculably important. It is a study which should bring into the laboratory and the classroom not only greater tolerance for the ideas of others but a clearer realization that even the scientific atmosphere evolves and changes with the society of which it is a part. When the student has become consciously aware of this, he is in a better position to see farther and more dispassionately in the guidance of his own research. A not unimportant by-product of such an awareness may be an extension of his own horizon as a human being.

Topsy-turvy, Tono-Bungay

In his blog-through of the works of H. G. Wells, Adam Roberts has reached Tono-Bungay, and there’s much food for thought in the post. Real food, not patent medicine like Tono-Bungay itself. Much of the novel, in Adam’s account, considers just that relationship: between the real and the unreal, the health-giving and the destructive, the truly valuable and mere waste — all the themes that Robertson Davies explores in The Rebel Angels and that are also, therefore, the chief concern of my recent essay on Davies, “Filth Therapy”.

Here I might quote Adam quoting some people who quote some other person:

Patrick Brantlinger and Richard Higgins quote William Cohen’s Introducing Filth: Dirt, Disgust, and Modern Life to the effect that ‘polluting or filthy objects’ can ‘become conceivably productive, the discarded sources in which riches may lie’, adding that ‘“Riches” have often been construed as “waste”’ and noting that ‘the reversibility of the poles — wealth and waste, waste and wealth — became especially apparent with the advent of a so-called consumer society during the latter half of the nineteenth century’ [‘Waste and Value: Thorstein Veblen and H. G. Wells’, Criticism, 48:4 (2006), 453].

This prompts me to want to write a sequel to “Filth Therapy,” though I clearly need to read Introducing Filth first.

It occurs to me that these are matters of longstanding interest to Adam, whose early novel Swiftly I have described as “excresacramental” — it was the first novel by Adam that I read, and given how completely disgusting it is, I’m rather surprised that I kept reading him. But he’s that good, even when he’s dirty-minded, as it were.

These themes make their way into fiction, I think, because of an ongoing suspicion, endemic now in Western culture if not elsewhere, that we have it all wrong, that we have valued what we should not have valued and vice versa, that we have built our house only having first rejected the stone that should be the chief cornerstone. As the old General Confession has it, “We have left undone those thinges whiche we ought to have done, and we have done those thinges which we ought not to have done, and there is no health in us.” This suspicion, which is often muted but never quite goes away, is perhaps the most lasting inheritance of Christianity in a post-Christian world: the feeling that we have not just missed the mark but are utterly topsy-turvy.

Christianity is always therefore suggesting to us the possibility of a “revaluation of all values,” a phrase that Nietzsche in The Antichrist used against Christianity:

I call Christianity the one great curse, the one great intrinsic depravity, the one great instinct for revenge for which no expedient is sufficiently poisonous, secret, subterranean, petty — I call it the one immortal blemish of mankind… And one calculates time from the dies nefastus on which this fatality arose — from the first day of Christianity! Why not rather from its last? From today? Revaluation of all values!

But Nietzsche issues this call because he thinks that Christianity itself has not set us right-side-up, but rather turned us upside-down. It was Christianity that first revalued all values, saying that the first shall be last and the last first, and he who seeks his life will lose it while he who loses his life shall find it, and blessed are the meek, and blessed are the poor in spirit…. Nietzsche’s call is therefore a call for restoration of the values that Christianity flipped: rule by the strong, contempt for the weak. It is, when considered in the long historical term, a profoundly conservative call.

Whether or not Nietzsche’s demand for a new paganism is right, surely it is scarcely necessary: for rule by the strong and contempt for the weak is the Way of the World, always has been and always will be; Christianity even at its most powerful can scarcely distract us from that path, much less set us marching in the opposite direction. Because that Way is so intrinsic to our neural and moral orientation, because we run so smoothly along its well-paved road, it is always useful to us to read books that don’t suggest merely minor adjustments in our customs but rather point to the possibility of something radically other. Such books are at the very least a kind of tonic, and a far better one than the nerve-wracking stimulation of Tono-Bungay.

Saturday, June 24, 2017

the big impediment to going iOS-only

At Macdrifter, Gabe Weatherhead makes a vital point:

But Apple has a blind spot that I think might mean the iPad never has parity with the Mac. The App Store just doesn’t encourage big powerful app development.

The price point on the iOS App Store is too low for many indie developers to succeed. I look at the most powerful apps I use and they come from a handful of companies dedicated to craft but supported by Mac revenue. Omnigraffle, OmniFocus, and OmniOutliner are great. But there are few competitors in this space that can reach the same level of quality and still make a profit. I suspect that in an iOS-only world these apps would end.

Transmit and Coda by Panic are top-tier software, but even they seem to be struggling to justify their existence.

This doesn’t mean that great apps don’t still get released on iOS. They just don’t keep getting supported. My favorite text editors, the best personal databases, the variety of bookmark managers. They might keep rolling out, but the majority aren’t actively developed.

This is exactly right. Even the pro-quality apps that remain in development tend to be updated inconsistently and (in comparison to Mac apps) rarely. Every time I think about going iOS-only, I realize that too many of the apps I rely on are apps … I can’t rely on.

Wednesday, June 21, 2017

Darwin's mail

I’ve just read Janet Browne’s two-volume biography of Charles Darwin, and it’s a magnificent achievement — one of the finest biographies I’ve ever read. I especially admire Browne’s judgment in knowing when to stick with the events of the life and when to pull back her camera to reveal the larger social contexts in which Darwin worked.

One of the interesting subthemes of the book concerns the way Darwin gravitated towards technologies that would allow him to pursue whatever aroused his curiosity — and whenever his curiosity was aroused he tended to become obsessive until he satisfied it. For instance, though he hated having his photograph taken, he made extensive use of photographs in writing his peculiar book on The Expression of the Emotions in Man and Animals.

Also: if you think of the various scientific institutions and journals of the Victorian era as a kind of network, he and his chief supporters (Thomas Henry Huxley above all) skillfully exploited those technologies to spread the news of natural selection far more quickly than it could have been expected to spread otherwise.

But as someone who has a long-standing interest in the postal service, these passages from the early pages of the second volume are especially provocative:

Systematically, he turned his house into the hub of an ever-expanding web of scientific correspondence. Tucked away in his study, day after day, month after month, Darwin wrote letters to a remarkable number and variety of individuals. He relied on these letters for every aspect of his evolutionary endeavour, using them not only to pursue his investigations across the globe but also to give his arguments the international spread and universal application that he and his colleagues regarded as essential footings for any new scientific concept. They were his primary research tool. Furthermore, after the Origin of Species was published, he deliberately used his correspondence to propel his ideas into the public domain—the primary means by which he ensured his book was being read and reviewed. His study inside Down House became an intellectual factory, a centre of administration and calculation, in which he churned out requests for information and processed answers, kept himself at the leading edge of contemporary science, and ultimately orchestrated a transformation in Victorian thought.

Maybe it wasn't the telegraph that was the Victorian internet but rather the penny post — even if it was slower.

And Darwin was utterly unashamed to use letters to get other people (friends, family, and often strangers) to do research for him:

He also hunted down anyone who could help him on specific issues, from civil servants, army officers, diplomats, fur-trappers, horse-breeders, society ladies, Welsh hill-farmers, zookeepers, pigeon-fanciers, gardeners, asylum owners, and kennel hands, through to his own elderly aunts or energetic nieces and nephews. Many of his letters went to residents of far-flung regions — India, Jamaica, New Zealand, Canada, Australia, China, Borneo, the Hawaiian Islands — reflecting the increasing European domination of the globe and rapidly improving channels of communication.

It’s a good thing that Darwin was a wealthy man: “In 1851 he spent £20 on ‘stationery, stamps & newspapers’ (nearly £1,000 in modern terms) ... By 1877 Darwin’s expenditure on postage and stationery had doubled to £53 14s. 7d, a sum roughly equal to his butler’s annual salary.”

And here’s Browne’s incisive summary of this method:

If there was any single factor that characterised the heart of Darwin’s scientific undertaking it was this systematic use of correspondence. Darwin made the most of his position as a gentleman and scientific author to obtain what he needed. He was a skilful strategist. The flow of information that he initiated was almost always one-way. Like countless other well-established figures of the period, Darwin regarded his correspondence primarily as a supply system, designed to answer his own wants. “If it would not cause you too much trouble,” he would write. “Pray add to your kindness,” “I feel that you will think you have fallen on a most troublesome petitioner,” “I trust to your kindness to excuse my troubling you.” ...

Alone at his desk, captain of his ship, safely anchored in his country estate on the edge of a tiny village in Kent, he was in turn manager, chief executive, broker, and strategist for a world-wide enterprise. Once, in a passing compulsion, he attached a mirror to the inside of his study window, angled so that he could catch the first glimpse of the postman turning up the drive. It stayed there for the rest of his life.

Tuesday, June 20, 2017


Bruno Latour shares with Timothy Morton a determination to overthrow the concept of “nature” because he believes that that concept “makes it possible to recapitulate the hierarchy of beings in a single ordered series.” Therefore any genuine (non-anthropocentric) “political ecology” — of the sort I briefly described in a previous post — requires “the destruction of the idea of nature” (Politics of Nature, p. 25).

As for Morton, he thinks the idea of nature does one basic thing: since nature is, fundamentally and always, “a surrounding medium that sustains our being,” it follows that “Putting something called Nature on a pedestal and admiring it from afar does for the environment what patriarchy does for the figure of Woman. It is a paradoxical act of sadistic admiration” (Ecology without Nature, pp. 4, 5).

We might notice that these two reasons for rejecting the concept of Nature are incompatible. The problem for Latour is that Nature places humans within “a single ordered series,” but at the top of it; for Morton, human beings are outside the order of Nature, which “surrounds” us. I think Morton is closer to being correct than Latour is: the way that Latour describes Nature is actually more appropriate to the concept that preceded it within the discourses of Western thought, Creation. It is from the Jewish and Christian account of Creation — over which human beings have been given “dominion” — that the “single ordered series,” at the top of which humans stand, emerges. But Morton’s critique applies better to the term that has recently succeeded Nature within our talk about such matters, “the environment.”

It’s therefore tempting to say that Nature is the term that bridges the historical gap between Creation and “the environment.” But that would oversimplify the story. And the deficiency that Latour and Morton share is an ahistorical oversimplification of what Raymond Williams calls “perhaps the most complex word in the language” — and even Williams’s account seems condensed in comparison with the one that C. S. Lewis gives in his long essay in Studies in Words. But Williams gets at the really key point here when he issues this caution: “since nature is a word which carries, over a very long period, many of the major variations of human thought — often, in any particular use, only implicitly yet with powerful effect on the character of the argument — it is necessary to be especially aware of its difficulty.” This is just what Morton and Latour fail to do.

And it would do no good to claim to be interested in only one of the many meanings of the word “nature,” because, as Williams points out, they tend to bleed into one another. Nature in the sense of “that which surrounds humans and with which we interact” is always in complex, confusing relation with Nature as the whole show, all that there is, as when Pope writes: “All are but parts of one stupendous whole / Whose body Nature is and God the soul.” Here human beings are clearly part of that “body” (thus Nature is here still a synonym for Creation). But of course we can lose our awareness of our place in that “one stupendous whole,” through pride or simply through participation in technological modernity, and can then come to see our lives as somehow “unnatural”; which can then lead us in turn to seek ways to become “one with Nature.” At the very least we seek what Morton and Donna Haraway both refer to as kinship, and the words “kin” and “kind” are Anglo-Saxon equivalents of Nature: the old name for Mother Nature is Dame Kind.

But kinship is not identity, and this is precisely why “Nature,” “nature,” “natural,” “unnatural,” all drift in and out of focus and shift their meanings like holograms. To understand this you need only read King Lear, where all these notions are ceaselessly deployed and redeployed — where Lear wonders what his nature is while cursing his unnatural daughters, and Edmund reckons with the consequences of being Gloucester’s natural (i.e. bastard) son. We do not know what we are kin to or how close the kinship is; we don’t know what we are like, which means we do not understand our true nature. And so the narrow and historically insensitive definitions offered by Latour and Morton simply won’t do; they are manifestly inadequate to the situation on the ground.

But this we know: kinship is not identity. We have very good reasons to doubt that our creaturely cousins, or siblings — St. Francis’s Brother Sun and Sister Moon, among others — ask these questions. Which means that from them we can learn much but perhaps not everything we want and need to know. As Auden wrote,

But trees are trees, an elm or oak
Already both outside and in,
And cannot, therefore, counsel folk
Who have their unity to win.

Monday, June 19, 2017


Q: Why are you reading all this stuff, anyway? It seems pretty obvious that you don’t have much sympathy for it.

A: That’s a good question. I could give several answers. For one thing, I’m not as lacking in sympathy as you think. I have respect for any good-faith attempts to reckon with the immensely vexed question of what it means to be human, and the corollary questions about how we are most healthily related to the nonhuman, and I think Morton and Haraway are really trying to figure these things out. There’s a moral urgency to their writing that I admire.

Q: Is that so? Sure doesn’t sound like it.

A: Well, yeah, I guess that last post was kind of negative. As I was reading Morton I realized that some pretty important intellectual decisions had been made before he even began his argument, and I wanted to register that protest.

Q: But if you feel that a particular philosophical project has gone astray from the start, why not just move along to thinkers and lines of thought you find more fruitful, more resonant with potential?

A: Remember that this is a work in progress: as I said in that post, I’m currently reading Morton, I’m not done. (And in a sense I’m still reading Haraway, even though I put her book aside months ago.) When you’re blogging your way through a reading project, any one post is sure to give an incomplete picture of your response and likely to give a misleading one.

Q: Fair enough, I guess, but there does seem to be a pattern to your writing about a lot of recent work. You read it, think about it, and then declare that there are resources in the history of Christian thought that address these questions — whatever the questions are — better than the stuff you’ve been reading does. So why not just read and think about those Christian figures who always seem to do it better?

A: Because those non-Christian (or non-religious, or anti-Christian, or anti-religion) thinkers often raise important questions that Christians tend to neglect, and I have to see whether there are in fact adequate resources from within Christianity to address the questions raised by others. So far I have found that my tradition is indeed up for those challenges, but its resources are augmented and strengthened by having to address what it never would have asked on its own. I truly believe that Christianity will emerge stronger from a genuinely dialogical encounter with rival traditions, in part because it will (as it has so often in the past) adopt and adapt what is best in those traditions for its own purposes. It doesn’t always work out that way; Hank Hill was right when he said to the praise band leader “You’re not making Christianity better, you’re just making rock-and-roll worse!” But most of the time the genuinely dialogical encounter more than pays for itself.

Q: Maybe. But you often seem out of your depth with the kind of stuff you’re reading these days — and often in the mood to kick over the traces. Wouldn’t you be better off sticking with the stuff you actually have a professional level of knowledge of? Auden? Other twentieth-century religiously inclined literary figures?

A: Honestly, you may be right. I often wonder about that very point. And that’s one of the reasons — that’s the main reason, I guess — why I talk about writing books on the technological history of modernity and the Anthropocene condition but end up writing books about the stuff I have spent most of my career teaching.

Q: So why are you even here, man? Why not drop this blog and get back to work in your own field?

A: Because this is a place where I can exercise my habitual curiosity about things I don’t know much about. Because this is a kind of Pensieve for me, a way to clear away thoughts that otherwise would clog up my brain. Because every once in a while something of value coalesces out of all this randomness. I have very few readers and still fewer commenters, so I’m not getting the thrill of regular feedback, but hitting the “Publish” button offers an acceptable simulacrum of accomplishment. Those are probably not very good reasons, but they’re the reasons I have.

But I’m not gonna lie: spending so much time reading stuff with which at a deep level I’m at odds is wearing, it really is. Especially since I know that the people I’m reading — and working so hard to read fairly — are highly unlikely to treat serious Christian thinkers with comparable respect. With any respect. They don’t know that Christian theology that’s deeply and resourcefully engaged with the modern world exists, and if they knew they wouldn’t care. What I’m doing when I read thinkers like Morton and Haraway is an engagement on my part, but it’s not a conversation. That’s just what it’s like if you want to bring Christian thought to bear on modern academic discourse. You only do it if you believe you’re called to do it.

Saturday, June 17, 2017

in responsibilities begin dreams

Lately I've been reading the philosopher Timothy Morton, who has a lot to say about living in the Anthropocene, and I see that he has a forthcoming book called Humankind: Solidarity with Non-Human People. On his website the book is described thus:

What is it that makes humans human? As science and technology challenge the boundaries between life and non-life, between organic and inorganic, this ancient question is more timely than ever. Acclaimed Object-Oriented philosopher Timothy Morton invites us to consider this philosophical issue as eminently political. It is in our relationship with non-humans that we decided the fate of our humanity. Becoming human, claims Morton, actually means creating a network of kindness and solidarity with non-human beings, in the name of a broader understanding of reality that both includes and overcomes the notion of species. Negotiating the politics of humanity is the first and crucial step to reclaim the upper scales of ecological coexistence, not to let Monsanto and cryogenically suspended billionaires to define them and own them.

The book isn't out yet, but I find this description worrying. The idea that "becoming human ... actually means creating a network of kindness and solidarity with non-human beings" sounds wonderful, in the most abstractly theoretical terms, but I doubt we can solve our and the world's problems simply by "negotiating the politics of humanity" — at least if Morton means, as I suspect he does based on what I have read so far, redefining the sphere of the political to include the whole range of nonhuman creatures, including the vast and ontologically complex phenomena he calls hyperobjects. Because we don't have a great track record of treating one another well, do we? I'm all for "kindness and solidarity with non-human beings," but first things first, you know? A good many people out there can't even manage kindness and solidarity with parents whose children were murdered in school shootings.

I'm reminded here of a comment made by Maciej Ceglowski in a recent talk. Responding to the claims of Silicon Valley futurists that we're just a few decades away from ending the reign of Death and achieving immortality for at least some, Ceglowski said, "I’m not convinced that a civilization that is struggling to cure male-pattern baldness is ready to take on the Grim Reaper." Similarly, I'd encourage those who plan to achieve kinship with all living things to call me back once they can have rational and peaceable conversations with people who live on their block.

I have the same concern with Morton's project as I do with Donna Haraway's theme of "making kin," which I wrote about here. I suspect that much of the appeal of seeking communion with pigeons, plutonium, and black holes (to use examples taken from Haraway and Morton) is that pigeons, plutonium, and black holes don't talk, tweet, or vote. If projects like Haraway's and Morton's don't reckon seriously with this problem, then they are likely to be in equal parts frivolous and evasive.

Such projects raise, for me, a further question, which is whether the language of kinship and solidarity is the right language to accomplish what its users want. Because you can achieve a feeling of kinship or solidarity without taking on any particular responsibility for the well-being of another creature. Here the old Christian language of "stewardship" seems to me to have greater force, and a force that is especially applicable to the Anthropocene moment: We do not own this world, but it has been entrusted to our care, and only if we seriously strive to live up to the terms of that trust will we have a chance of achieving true kinship and solidarity with all that we care for. It seems to me that Yeats had it backwards: it is not the case that "in dreams begin responsibilities," but in responsibilities begin dreams.

After posting this I realized that I'm not done. Morton, Haraway, Graham Harman, and others working along similar lines are keen to bridge the gaps between humans and nonhumans — or, perhaps it would be better to say, deny the validity of the gaps that human beings perceive to exist between themselves and the rest of the world. They thus conclude that we require a new philosophical orientation to the nonhuman world, though one that employs quite familiar concepts (kinship, solidarity, intention, purpose, desire) — those concepts are just deployed in relation to beings/objects which formerly were thought to be outside the scope of such terms. We're not used to thinking that hammers have desires and black holes have consciousness.

This strategy of employing familiar language in unfamiliar contexts gives the appearance of being radical but may not be quite that. It strikes me as being largely a reversal of Skinnerian behaviorism: the behaviorists said that human beings are nothing special because they're just like animals and plants, responding to stimuli in law-governed ways; now the object-oriented ontologists say that human beings are nothing special because animals and plants (and hammers and black holes) all possess the traits of consciousness and desire that we have traditionally believed to be distinctive to us. The goal of the philosophical redescription seems to be the same: to dethrone humanity, to get us to stop thinking of ourselves as sitting at the pinnacle of the Great Chain of Being.

And underlying this goal is the assumption (often stated explicitly by all these figures, I think) that our belief in our unique and superior status among the rest of the beings/objects in the world has led us to abuse those beings/objects for our own enrichment or amusement.

I think this whole project is unlikely to bear the fruit it wants to bear, and I have several reasons for thinking so, which I will just gesture at here and develop in later posts.

(1) I doubt the power of philosophical redescription. Changes in our practices will lead to changes in description, not the other way around. The failure to recognize the direction that the causal arrow points is the signal failure of people who, being symbol manipulators by profession, think that the manipulation of symbols is the key to All Good Things. (I have written about this often, for instance, here.)

(2) I don't think we have taken the role of Apex Species on the Great Chain of Being too seriously; I think we have failed to take it seriously enough.

(3) I believe that all of these difficulties can best be addressed by living into certain ancient ways of thinking — which, in our neophilic age, is a hard sell, I know.

People will say, "Go back to Christianity? We tried that and it got us into this situation." To which the obvious rejoinder is the Chestertonian one that Christianity hasn't been tried and found wanting, it has been found difficult and left untried. But perhaps more to the point: everything has failed. Every day I hear lefties say that capitalism has been tried and didn't work, and righties say that socialism has been tried and didn't work — to which each side retorts that its preferred system hasn't really been tried, hasn't been implemented properly and thoroughly.

And all of these people are correct. Every imaginable system has been put into play with partial success at best, and the problems result from incomplete or half-hearted implementation of that system and from flaws inherent to it — which flaws are precisely what make people half-hearted or incomplete in their implementation of it. Everything has been tried and found wanting, and found difficult and left untried. This is the human condition. Attempts to remedy social and personal ills always run aground on both the sheer complexity of our experience and our mixed and conflicting desires (mixed and conflicted both within ourselves and in relation to one another).

New vocabularies, or even the deployment of old vocabularies in supposedly radical new ways, won't fix that. Which is not to say that improvements in conditions are impossible.

Much more on all this later.

Friday, June 16, 2017

digital culture through file types

This is a fabulous idea by Mark Sample: studying digital culture through file types. He mentions MP3, GIF, HTML, and JSON, but of course there are many others worthy of attention. Let me mention just two:

XML: XML is remarkably pervasive, providing the underlying document structure for things ranging from RSS and Atom feeds to office productivity software like Microsoft Office and iWork — but secretly so. That is, you could make daily and expert use of a hundred different applications without ever knowing that XML is at work under the hood.
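
A toy illustration of the point (my own, not Sample's — the feed content here is invented for the example): it takes only a few lines of XML to make a valid RSS document, and only a few more to read one.

```python
import xml.etree.ElementTree as ET

# A minimal RSS feed: one of the many formats that are XML under the hood.
rss = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Text Patterns</title>
    <item><title>digital culture through file types</title></item>
  </channel>
</rss>"""

root = ET.fromstring(rss)
print(root.tag)                          # rss
print(root.find("channel/title").text)   # Text Patterns
```

Every feed reader you have ever used does some version of this parse — which is exactly the sense in which XML is pervasive "but secretly so": the angle brackets never reach the user's eyes.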

Text: There's a great story to be told about how plain text files went from being the most basic and boring of all file types to a kind of lifestyle choice — a lifestyle choice I myself have made.

If you have other suggestions, please share them here or with Mark.

men ignoring (as well as interrupting) women

The New York Times is wrong about a great many things these days, but it's certainly right about this: men really do interrupt women All. The. Time. (And the NYT has covered this story before.) I have seen the phenomenon myself in many faculty meetings over the years, and it's especially painful when a woman sits in silence through 45 minutes of a meeting, finally decides to say something — and is instantly cut off.

I have often talked too much in meetings, but I don't think I do this — women who have worked with me, please let me know if I'm wrong. Please. (Could you do it in an email instead of in the comments, below, though? That would be a kindness.) But interrupting is just one of many ways confident and articulate men — or confident men who just think they're articulate — can sideline their female colleagues.

Once, some years ago now, a younger colleague asked me to join her for lunch. She wanted to talk to me about something: the fact that I had not expressed interest in or support of her scholarship, even though it overlapped with my own in some areas. My first thought was that I really did admire her work and thought; but that was immediately followed by the realization that I had never told her so. I had completely failed to offer the support and encouragement that would have meant a lot to her as someone making her way in our department and our institution. So I apologized, and asked if she would forgive me, which of course she did.

In the aftermath of that lunch meeting, I thought a lot about why I had so manifestly failed my colleague, and I've continued to think about it since. I don't fully understand the complexities of the situation, and I may be looking for self-exculpation here, but I do think I've identified one element of the problem, and it involves sexually segregated socializing.

A number of my younger male colleagues had expressed gratitude for my support of them, and when I thought about how I had expressed that support — the advice I had given, the responses to their work — I realized that that had rarely happened on campus, in our offices or hallways, but rather in coffee shops and pubs. When we met on free mornings for coffee to chat as we got through some grading or diminished the size of our inboxes, or met in the evenings after work for a pint or two — that's when I got the chance to say some supportive things.

But while we often asked our female colleagues to join us for such outings, they rarely did. I am honestly not sure whether they just weren't interested, or had conflicting obligations, or hadn't heard enough from us to make it perfectly clear that their presence was really wanted and that we didn't mean to create a Boys' Club. But I do know that I should have been aware of these dynamics and found other ways to let the women in my circles know that I valued their work. Once that single colleague had the boldness to call my attention to my shortcomings in this area, I made an effort to compensate — though I don't know that I ever did enough.

I especially want to ask my fellow academics: What do you think about the account I've given? Does it sound plausible? What am I missing, either about myself or about the general social dynamics?

frequency of citation does not equal quality of research

Google Scholar has just added a set of what it calls Classic papers: "Classic papers are highly-cited papers in their area of research that have stood the test of time. For each area, we list the ten most-cited articles that were published ten years earlier." The problem here is the equating of frequent citation with "standing the test of time." As it happens, many scholarly papers retracted by the journals where they were published continue to be widely cited anyway. Frequency of citation is not a good proxy for "classic" status.

Thursday, June 15, 2017

the oven bird imagines the future

This provocative post by Alec Ryrie asks an important question: Why is our culture’s dystopian imagination so absolute? Ryrie draws on a recent history thesis by Olive Hornby that describes outbreaks of plague in early-modern England during which between a third and half of the people in some communities died. Not all but a handful, not 99%, but a little less than half, maybe. Enough to inflict profound damage on the emotional, spiritual, and economic life of a place — but not enough to destroy it altogether. Ryrie:

Most disasters are not absolute. They are real, devastating, and consequential, but they do not wipe the slate clean. Human beings are resilient and are also creatures of habit. You can panic, but you can’t keep panicking, and once you’ve finished, you tend to carry on, because what else is there? The real catastrophes of the West in the past century (world wars, the Spanish flu) have been of this kind: even as the principal imagined one (nuclear war) is of the absolute variety.

We need to learn to be better at imagining serious but non-terminal disasters, the kind which are actually going to hit us. (For a recent cinematic example, the excellent and chilling Contagion.) That way, when we confront such things, we will be less tempted simply to say ‘Game over!’ and to attempt to reboot reality, and will instead try to work out how to deal with real, permanent but not unlimited damage.

In such a case you can’t say “Game over” — he’s quoting Aliens there — because the game isn’t over. The game goes on, in however damaged a form, leaving us all forced to confront the truth taught us by the oven bird:

There is a singer everyone has heard,
Loud, a mid-summer and a mid-wood bird,
Who makes the solid tree trunks sound again.
He says that leaves are old and that for flowers
Mid-summer is to spring as one to ten.
He says the early petal-fall is past
When pear and cherry bloom went down in showers
On sunny days a moment overcast;
And comes that other fall we name the fall.
He says the highway dust is over all.
The bird would cease and be as other birds
But that he knows in singing not to sing.
The question that he frames in all but words
Is what to make of a diminished thing.

Wednesday, June 14, 2017

literary fiction and climate change, revisited

Here we have Siddhartha Deb making precisely the same inexplicable error that Amitav Ghosh, whom he quotes, made last year — a mistake on which I commented at the time. The thought sequence goes like this:

1) Declare yourself interested only in “literary” fiction;

2) Define literary fiction as a genre concerned only with the quotidian reality of today;

3) Complain that literary fiction is deficient in imaginative speculation about the realities and possibilities of climate change.

But if you have already conflated “literary fiction” and “fiction” — note how Deb uses the terms interchangeably — and have defined the former as having a “need to keep the fluky and the exceptional out of its bounds, conceding the terrain of improbability — cyclones, tornadoes, tsunamis, and earthquakes — to genre fiction,” then you have ensured the infallibility of your thesis. Because any story that engages with “the fluky and the exceptional” (or, for that matter, the future) ipso facto becomes “genre fiction” and therefore falls outside the bounds of your inquiry.

This self-blinkering leads Deb into some very strange statements:

In the United States too, even well meaning liberal fiction, often falling under the rubric of cli-fi, reveals itself as incapable in grappling with [our steadfast rapaciousness]. This is perhaps because to think of modern life as a failure, and to question the idea of progress, requires an extremism of vision or a terrifying kind of independence. An indie bestseller like Emily St. John Mandel’s Station Eleven, set in an eco-apocalypse, features rhapsodies on the internet and electricity. Marcel Theroux in Far North includes a paean to modern flight as one of the finest inventions of “our race,” even though the effect of air travel on carbon emissions is quite horrific.

Let me just pause to note that Deb has a rather expansive notion of “the United States,” given that Emily St. John Mandel is Canadian and Marcel Theroux was born in Uganda and educated wholly in England. Setting that aside, Deb’s description of Mandel’s book is farcically inaccurate. It is true that there are characters in the book, some among the handful of people who have survived a plague that killed 99.9% of humanity, who miss the internet and electricity. Does Deb think that in such a world nobody would miss those technologies? Or is it his view that a truly virtuous writer should make a point of suppressing such heretical notions?

Either position is silly. Of course people in such a world would miss technological modernity, for good reasons and bad. At one point we get “an incomplete list” of what’s gone:

No more diving into pools of chlorinated water lit green from below. No more ball games played out under floodlights. No more porch lights with moths fluttering on summer nights. No more trains running under the surface of cities on the dazzling power of the electric third rail. No more cities. No more films, except rarely, except with a generator drowning out half the dialogue, and only then for the first little while until the fuel for the generators ran out, because automobile gas goes stale after two or three years. Aviation gas lasts longer, but it was difficult to come by.

No more screens shining in the half-light as people raise their phones above the crowd to take pictures of concert stages. No more concert stages lit by candy-colored halogens, no more electronica, punk, electric guitars.

No more pharmaceuticals. No more certainty of surviving a scratch on one's hand, a cut on a finger while chopping vegetables for dinner, a dog bite....

No more countries, all borders unmanned.

No more fire departments, no more police. No more road maintenance or garbage pickup. No more spacecraft rising up from Cape Canaveral, from the Baikonur Cosmodrome, from Vandenberg, Plesetsk, Tanegashima, burning paths through the atmosphere into space.

No more Internet. No more social media, no more scrolling through litanies of dreams and nervous hopes and photographs of lunches, cries for help and expressions of contentment and relationship-status updates with heart icons whole or broken, plans to meet up later, pleas, complaints, desires, pictures of babies dressed as bears or peppers for Halloween. No more reading and commenting on the lives of others, and in so doing, feeling slightly less alone in the room. No more avatars.

Again: Does Deb think people in a devastated world wouldn't think this way? Or does he think it wrong to give voice to such memories and reflections?

Does he think that such a list offers nothing but regret?

The central figures of Station Eleven are the members of a group called the Traveling Symphony. They play classical music and perform plays.

All three caravans of the Traveling Symphony are labeled as such, THE TRAVELING SYMPHONY lettered in white on both sides, but the lead caravan carries an additional line of text: Because survival is insufficient.

When I first read Station Eleven I had mixed feelings about it, but in the two years since I have thought often about the Traveling Symphony and what it achieved, what it reminded people of, what it made possible. The book offers, especially through the Symphony, a moving and at times profound meditation on the complex relationships that obtain among technology, art, and human flourishing. I’d strongly recommend that Siddhartha Deb read it.

And he should read some Kim Stanley Robinson while he’s at it.

play as work


Peter Suderman writes about playing the video game Mass Effect: Andromeda,

The game boasts an intricate conversation system, and a substantial portion of the playtime is spent talking to in-game characters, quizzing them for information (much of which adds color but is ultimately irrelevant), asking them for assignments, relaying details of your progress, and then finding out what they would like you to do next.

At a certain point, it started to feel more than a little familiar. It wasn't just that it was a lot like work. It was that it was a lot like my own work as a journalist: interviewing subjects, attempting to figure out which one of the half-dozen questions they had just answered provided useful information, and then moving on to ask someone else about what I had just been told.

Eventually I quit playing. I already have a job, and though I enjoy it quite a bit, I didn't feel as if I needed another one.

But what about those who aren't employed? It's easy to imagine a game like Andromeda taking the place of work.

You should read the whole article, because it’s a fascinating and deeply reflective account of the costs and benefits of a world in which “about three quarters of the increase in leisure time among men since 2000 has gone to gaming.” What I love about Peter’s narrative is that it is sure to make video-game alarmists less alarmed and video-game enthusiasts less enthusiastic.

I have a thousand ideas and questions about this essay, but I’ll mention just one line of thought here: I find myself wondering how, practically speaking, video games got this way. Did game designers learn through focus groups and beta testing that games with a significant work-like component were more addictive? Or were they simply answering to some need in their own psyches? I’m guessing that the correct answer is: some of both. But in any case, there’s a strong suggestion here that human beings experience a deep need for meaningful work, and will accept meaningfulness in small quantities or in fictional form rather than do without it.

Tuesday, June 13, 2017

Penguin Café

Another music post...

Nearly thirty years ago now I bought a CD on pure impulse, knowing almost nothing about the performers: When in Rome, by the Penguin Café Orchestra. You’ve probably heard some of their songs: “Perpetuum Mobile” — in 15/8 time! — or “Telephone and Rubber Band”, though maybe not my favorite of their songs, “Dirt.” The style is difficult to describe and definitely doesn't work for everyone. Simon Jeffes, who founded the PCO, wrote its songs, and played whatever instruments needed playing for a given tune, called their work “modern semi-acoustic chamber music,” and, in a different context, “imaginary folklore.” I like that latter description: I imagine a hidden land somewhere populated by people of English, Celtic, Portuguese, and Venezuelan descent, playing away on instruments they found in their grandparents’ attics. As I say, not for everyone, but I loved it from the start.

When Simon Jeffes died of a brain tumor in 1997, at the age of 48, it seemed that the story of PCO was over. But a one-off reunion concert on the tenth anniversary of his death, featuring his son Arthur, caused a great many people to say that they wanted more. So Arthur Jeffes (an archeologist by training) got some musicians together and founded Penguin Café to play his father’s music and some of his own. The results are getting more interesting — for instance, in Cantorum, an attempt to reproduce some of the characteristic rhythms and repetitions of electronic music with analog instruments. Check it out:

Monday, June 12, 2017

Nils Frahm

A few years ago the German pianist/composer/producer Nils Frahm fell out of bed and broke his thumb. As he later recalled,

All of a sudden I had so much time, an unexpected holiday. I cancelled most of my schedule and found myself being a little bored. Even though my doctor told me not to touch a piano for a while, I just couldn’t resist. I started playing a silent song with 5 fingers on my right and the remaining 4 on my left hand. I set up one microphone and recorded another tune every other night before falling asleep.

If you click on the link above, you’ll see that you can download for free the resulting recording, called Screws in honor of what held his thumb together as it was healing.

I like Frahm’s electronic music very much, but it’s his solo piano work that really captivates me. He often uses an upright piano that he has modified slightly by adjusting the size and texture of the felts, though his wonderful 2015 record Solo was recorded on a unique 12-foot-tall piano called the Klavins M370. He can play loud and fast, but his best music is slow and contemplative, and has reminded many people of Erik Satie’s Gymnopédies, though when his improvisations get chordal they remind me a bit of Keith Jarrett’s quieter moments.

Maybe the most important predecessor to Frahm, though, is Glenn Gould — not in pianistic technique, but in recording technique. In his recording sessions, Gould famously insisted that the microphones be placed as close to the piano strings as possible, yielding a very intimate sound — one which was intensified, I think, by his spare use of pedals. Try listening to a random piece from Gould’s version of Bach’s Well-Tempered Clavier and then compare it to, say, Sviatoslav Richter’s (equally great) version, and you’ll immediately envision Richter playing on a big stage in a great concert hall. Gould’s music is for the private listener.

And Frahm takes this emphasis on privacy even further. He has fitted one of his pianos with a pickup that sits inside the instrument, so that you can hear the mechanism moving as the hammers lift and drop and as the pedals engage and disengage. You’re reminded that pianos are made largely of wood — Frahm seems to be playing a living creature rather than a thing. I am not certain that in recording Screws he had the mic inside the piano, but it sounds like it to me; and there are ambient noises from the room in which he recorded it too. In an interview a few years back he commented: “There is something very beautiful about a mono recording of a piano. ‘Screws’ which I just recorded was with one microphone, an old condenser, fed through an EMT stereo reverb and that was it. That was the whole process.” Simple, analog, warm, quiet, private. (Similarly, here’s Nils with one of his favorite toys.)

However: the benefits of such simplicity and warmth are not so easily accessed by the listener. Listening to Frahm’s solo music on a bog-standard pair of earbuds will not allow you to discern many of the subtleties that make it beautiful, and will reveal none of them if there’s any noise at all in the room where you’re listening. My hearing is not nearly as good as it once was, thanks to a youth misspent in too much rock-and-roll played at far too high a volume, but I’ve found that to get the most out of Frahm’s music I benefit from the lossless 24-bit versions he offers on his site, played through a DAC headphone amp and a very good set of headphones. So, as so often in our world today: simplicity and warmth are expensive, and increasingly available only to a privileged few.
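
(A back-of-the-envelope aside of my own, not anything Frahm has said: each bit of resolution in linear PCM audio buys roughly 6 dB of theoretical dynamic range, which is what separates those 24-bit downloads from CD-quality 16-bit audio.)

```python
import math

def dynamic_range_db(bits: int) -> float:
    """Theoretical dynamic range of linear PCM audio: 20 * log10(2**bits)."""
    return 20 * math.log10(2 ** bits)

print(round(dynamic_range_db(16), 1))  # 96.3  (CD-quality)
print(round(dynamic_range_db(24), 1))  # 144.5 (24-bit "lossless" files)
```

Whether any ears, headphones, or listening rooms can actually use 144 dB is another question — which is part of why the quiet room and the good headphones matter at least as much as the file format.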

But in the best way available to you, check out Nils Frahm’s music. It’s truly remarkable.