Text Patterns - by Alan Jacobs

Friday, June 24, 2016

this is your TV on speed

Jeff Guo watches TV shows really fast and thinks he's pretty darn cool for doing so.

I recently described my viewing habits to Mary Sweeney, the editor on the cerebral cult classic “Mulholland Drive.” She laughed in horror. “Everything you just said is just anathema to a film editor,” she said. “If you don't have respect for how something was edited, then try editing some time! It's very hard.”

Sweeney, who is also a professor at the University of Southern California, believes in the privilege of the auteur. She told me a story about how they removed all the chapter breaks from the DVD version of Mulholland Drive to preserve the director’s vision. “The film, which took two years to make, was meant to be experienced from beginning to end as one piece,” she said.

I disagree. Mulholland Drive is one of my favorite films, but it's intentionally dreamlike and incomprehensible at times. The DVD version even included clues from director David Lynch to help people baffled by the plot. I advise first-time viewers to watch with a remote in hand to ward off disorientation. Liberal use of the fast-forward and rewind buttons allows people to draw connections between different sections of the film.

Question: How do you draw connections between sections of the film you fast-forwarded through?

Another question: What would Into Great Silence be like if you took only 45 minutes to watch it?

A third question: Might there be a difference — an experiential difference, and even an aesthetically qualitative difference — between remixing and re-editing and creating montages of works you've first experienced at their own pace and, conversely, doing the same with works you've never had the patience to sit through?

And a final suggestion for Jeff Guo: Never visit the Camiroi.

Thursday, June 23, 2016

travel and the lure of the smartphone

Alan Turing’s notion of a “universal machine” is the founding insight of the computer revolution, and today’s smartphones are the fullest embodiment of that idea we’ve yet realized, which is what makes them irresistible to so many of us. Many of us, I suppose, have at times made a mental list of the devices we once owned that have been replaced by smartphones: calculators, clocks, cameras, maps, newspapers, music players, tape recorders, notepads....

Earlier this year I described my return to a dumbphone and the many advantages accruing to me therefrom, but as my recent trip to London and Rome drew closer, I started to sweat. Could I maintain on my travels my righteous technological simplification? This was a particular worry of mine because I am also a packing minimalist: I have spent whole summers abroad living out of a backpack and a smallish suitcase. Maps wouldn’t add much weight, but a camera would be more significant; and then I’d need either to carry my backpack everywhere I went, to hold the camera, or else take a dedicated camera bag. Moreover, my wife was not going on this trip, and I wanted to stay in touch with her, especially by sending photos of the various places I visited — and to do so immediately, not once I returned.

As the day of departure drew nearer, that desire to maintain the fullest possible contact with my beloved loomed larger in my mind. This reminded me that I had recently spoken and written about the relationship between distraction and addiction:

If you ask a random selection of people why we’re all so distracted these days — so constantly in a state of what a researcher for Microsoft, Linda Stone, has called “continuous partial attention” — you’ll get a somewhat different answer than you would have gotten thirty years ago. Then it would have been “Because we are addicted to television.” Fifteen years ago it would have been, “Because we are addicted to the Internet.” But now it’s “Because we are addicted to our smartphones.”

All of these answers are both right and wrong. They’re right in one really important way: they link distraction with addiction. But they’re wrong in an even more important way: we are not addicted to any of our machines. Those are just contraptions made up of silicon chips, plastic, metal, glass. None of those, even when combined into complex and sometimes beautiful devices, are things that human beings can become addicted to.

Then what are we addicted to? … We are addicted to one another, to the affirmation of our value — our very being — that comes from other human beings. We are addicted to being validated by our peers.

Was my reluctance to be separated from my wife an example of this tendency? I’d like to think it’s something rather different: not an addiction to validation from peers, but a long-standing dependence on intimacy with my life partner. But my experience is certainly on the same continuum with the sometimes pathological need for validation that I worried over in that essay. So while I think that my need to stay in touch with Teri is healthier than the sometimes desperate desire to be approved by one’s peer group, they have this in common: they remind us how much our technologies of communication are not substitutes for human interaction but enormously sophisticated means of facilitating it.

A camera would have added some weight to my backpack, but not all that much. Packing minimalism played a role in my decision to pop the SIM card out of my dumbphone and dig my iPhone out of a drawer — note that I had never sold it or given it away! I was too cowardly for that — and use it on my trip as camera and communicator (iMessage and Twitter) and news-reader and universal map and restaurant-discovery vehicle and step-counter and.... But it wasn’t the decisive thing.

I do wonder how the trip might have been different if I had maintained my resolve. I certainly could’ve gotten some better photos if I had brought my camera, especially if I had also carried my long lens. (Smartphones have wide-angle lenses, which are great in many circumstances but very frustrating in others.) Maybe I would’ve sent Teri cards and letters instead of text messages, and she’d have keepsakes that our grandchildren could someday see. (Somehow I doubt that our grandchildren will be able to browse through my Instagram page.) And then I’d have uploaded all my photos when I got home and we’d have sat down to go through them all at once. But that’s not how it went.

Well, so it goes. I’ve been back for two days now, and probably should get out the dumbphone and switch my SIM card back into it. I’m sure I’ll do that soon. Very soon. Any day now.

Friday, June 17, 2016

some things about The Thing Itself

I had a really wonderful time in Cambridge the other night talking with Adam Roberts, Francis Spufford, and Rowan Williams about Adam’s novel The Thing Itself and related matters. But it turns out that there are a great many related matters, so since we parted I can’t stop thinking about all the issues I wish we had had time to explore. So I’m going to list a few thoughts here, in no particular order, and in undeveloped form. There may be fodder here for later reflections.

  • We all embarrassed Adam by praising his book, but after having re-read it in preparation for this event I am all the more convinced that it is a superb achievement and worthy of winning any book prize that it is eligible for (including the Campbell Award, for which it has just been nominated).
  • But even having just re-read it, and despite being (if I do say so myself) a relatively acute reader, I missed a lot. Adam explained the other night a few of the ways the novel’s structure corresponds to the twelve books of the Aeneid, which as it happens he and I have just been talking about, and now that I’ve been alerted to the possible parallels I see several others. And they’re genuinely fascinating.
  • Suppose scientists were to build a computer that, in their view, achieved genuine intelligence, and intelligence that by any measure we have is beyond ours, and that computer said, “There is something beyond space and time that conditions space and time. Not something within what we call being but the very Ground of Being itself. One might call it God.” What would happen then? Would our scientists say, “Hmmm. Maybe we had better rethink this whole atheism business”? Or would they say, “All programs have bugs, of course, and we’ll fix this one in the next iteration”?
  • Suppose that scientists came to believe that the AI is at the very least trustworthy, if not necessarily infallible, and that its announcement should be taken seriously. Suppose that the AI went on to say, “This Ground of Being is neither inert nor passive: it is comprehensively active throughout the known universe(es), and the mode of that activity is best described as Love.” What would we do with that news? Would there be some way to tease out from the AI what it thinks Love is? Might we ever be confident that a machine’s understanding of that concept, even if the machine were programmed by human beings, is congruent with our own?
  • Suppose the machine were then to say, “It might be possible for you to have some kind of encounter with this Ground of Being, not unmediated because no encounter, no perception, can ever be unmediated, but more direct than you are used to. However, such an encounter, by exceeding the tolerances within which your perceptual and cognitive apparatus operates, would certainly be profoundly disorienting, would probably be overwhelmingly painful, would possibly cause permanent damage to some elements of your operating system, and might even kill you.” How many people would say, “I’ll take the risk”? And what would their reasons be?
  • Suppose that people who think about these things came generally to agree that the AI is right, that Das Ding an Sich really exists (though “exists” is an imprecise and misleadingly weak word) and that the mode of its infinitely disseminated activity is indeed best described as Love — how might that affect how people think about Jesus of Nazareth, who claimed (or, if you prefer, is said by the Christian church to claim) a unique identification with the Father, that is to say, God, that is to say, the Ground of Being, The Thing Itself?

Thursday, June 9, 2016

why blog?

The chief reason I blog is to create a kind of accountability to my own reading and thinking. Blogging is a way of thinking out loud and in public, which also means that people can respond — and often those responses are helpful in shaping further thoughts.

But even if I got no responses, putting my ideas out here would still be worthwhile, because it’s a venue in which there is no expectation of polish or completeness. Sometimes a given post, or set of posts, can prove to be a dead end: that’s what happened, I think, with the Dialogue on Democracy I did over at The American Conservative. I wanted to think through some issues but I don't believe I really accomplished anything, for me or for others. But that’s all right. It was worth a try. And perhaps that dead end ended up leading me to the more fruitful explorations of the deep roots of our politics, and their relation to our technological society, that I’ve been pursuing here in the last couple of weeks.

As I have explained several times, over the long haul I want to pursue a technological history of modernity. But I have two books to write before I can even give serious consideration to that project. Nevertheless, I can try out the occasional random idea here, and as I do that over the next couple of years, who knows what might emerge? Possibly nothing of value; but possibly something essential to the project. Time will tell.

I’ve been blogging a lot lately because I had a chunk of free-ish time between the end of the Spring semester and the beginning of a long period of full-time book writing. I’m marking that transition by taking ten days for research (but also for fun) in England and Italy, so there will be no blogging for a while. And then when I return my activity will be sporadic. But bit by bit and piece by piece I’ll be building something here.

Wednesday, June 8, 2016

the Roman world and ours, continued

To pick up where I left off last time:

Imagine that you are a historian in the far future: say, a hundred thousand years from now. Isn't it perfectly possible that from that vantage point the rise of the United States as a global power might be seen primarily as a development in the history of the Roman Empire? To you, future historian, events from the great influence of Addison’s Cato upon the American Revolution to the Marshall Plan (parcere subiectis, debellare superbos) to the palpable Caesarism of Trump are not best understood as analogies to Roman history but as stages within it — as the history of the British Empire (Pax Britannica) had been before us: Romanitas merely extended a bit in time and space. We know that various nations and empires have seen themselves as successors to Rome: Constantinople as the Second Rome, Moscow as the Third, the just-mentioned Pax Britannica and even the Pax Americana that followed it. In such a case, to know little or nothing about the history of Rome is to be rendered helpless to understand — truly to understand — our own moment.

A possible chapter title from a far-future history textbook: “The Beginnings of the Roman Empire: 31 B.C.E. to 5000 C.E.”

Self-centered person that I am, I find myself thinking about all this in relation to what I’ve been calling the technological history of modernity. And Cochrane’s argument — along with that of Larry Siedentop, which I mentioned in my previous post on this subject — pushes me further in that direction than I’d ever be likely to go on my own.

In the Preface to his book, Cochrane makes the summary comment that “the history of Graeco-Roman Christianity” is largely the history of a critique: a critique of the idea, implicit in certain classical notions of the commonwealth but made explicit by Caesar Augustus, “that it was possible to attain a goal of permanent security, peace and freedom through political action, especially through submission to the ‘virtue and fortune’ of a political leader.” Another way to put this (and Cochrane explores some of these implications) is to say that classical political theory is devoted to seeing the polis, and later the patria and later still the imperium, as the means by which certain philosophical problems of human experience and action are to be solved. The political theory of the Greco-Roman world, on this account, is doing the same thing that the Stoics and Epicureans were doing in their own ways: developing a set of techniques by which human suffering might be alleviated, human anxieties quelled, and human flourishing promoted. That political theory is therefore best understood as closely related to what Foucault called “technologies of the self” and to what Martha Nussbaum has described as the essentially therapeutic character of Hellenistic ethics. The political structures of the Roman Empire — including literal structures like aqueducts and roads, and organizational ones like the cursus publicus — should therefore be seen as pointing ultimately towards a healing not only of the state but of the persons who comprise it. (Here again Siedentop’s history of the legal notions of personhood, and the relations of persons to families and communities, is vital.)

And if all this is right, then the technological history of modernity may be said to begin not with the invention of the printing press but in the ancient world — which in a very real sense, according to the logic of “great time,” we may be said to inhabit still.

Tuesday, June 7, 2016

with sincere thanks

I get quite a few unsolicited emails from people who want me to do something for them, and many of those emails end with "Thank you," "Thank you for your time," "Thanks for your attention," and so on. It has never occurred to me to think that such people were doing anything inappropriate; in fact, it just seemed to me that they were being polite.

But every now and then on Twitter I discover that some people are enraged by this little quirk of manners. I don't get it. What are you supposed to say when you write to ask someone for something? They've read your email when they didn't have to — why not thank them for doing so?

The one complaint I understand involves the phrase "thank you in advance" — which seems to presume that the recipient will do the thing the writer has requested. But even then, it doesn't strike me as anything to make a big deal out of.

Can anyone who is offended by being thanked in these ways explain to me why? Thank you in advance for your help.

the Roman world and ours

So why am I reading about — I’m gonna coin a phrase here — the decline and fall of the Roman Empire? It started as part of my work on Auden.

I first learned about Charles Norris Cochrane’s Christianity and Classical Culture from reading Auden’s review of it, published in The New Republic in 1944. Auden began that review by saying that in the years since the book appeared (it was first published in 1940) “I have read this book many times, and my conviction of its importance to the understanding not only of the epoch with which it is concerned, but also of our own, has increased with each rereading.” I thought: Well, now, that’s rather remarkable. I figured it was a book I had better read too.

Auden concludes his review with these words:

Our period is not so unlike the age of Augustine: the planned society, caesarism of thugs or bureaucracies, paideia, scientia, religious persecution, are all with us. Nor is there even lacking the possibility of a new Constantinism; letters have already begun to appear in the press, recommending religious instruction in schools as a cure for juvenile delinquency; Mr. Cochrane’s terrifying description of the “Christian” empire under Theodosius should discourage such hopes of using Christianity as a spiritual benzedrine for the earthly city.

That metaphor — "spiritual benzedrine for the earthly city" — is brilliantly suggestive. (And Auden knew all about benzedrine.)

More than twenty years later, in a long essay on the fall of Rome that was never published for reasons Edward Mendelson explains here, Auden wrote:

I think a great many of us are haunted by the feeling that our society, and by ours I don’t mean just the United States or Europe, but our whole world-wide technological civilisation, whether officially labelled capitalist, socialist or communist, is going to go smash, and probably deserves to.

Like the third century the twentieth is an age of stress and anxiety. In our case, it is not that our techniques are too primitive to cope with new problems, but the very fantastic success of our technology is creating a hideous, noisy, over-crowded world in which it is becoming increasingly difficult to lead a human life. In our reactions to this, one can see many parallels to the third century. Instead of Gnostics, we have existentialists and God-is-dead theologians, instead of neoplatonists, devotees of Zen, instead of desert hermits, heroin addicts and beats … instead of mortification of the flesh, sado-masochistic pornography; as for our public entertainments, the fare offered by television is still a shade less brutal and vulgar than that provided by the amphitheater, but only a shade, and may not be for long.

And then the comically dyspeptic conclusion: “I have no idea what is actually going to happen before I die except that I am not going to like it.” (For those interested, the unpublished essay may be found in this collection.)

Clearly for Auden, the story Cochrane tells was one that had lasting relevance. Elements of Cochrane’s narrative turn up, in much more complex form than in the late-career bleat just quoted, for decades in Auden’s poetry: “The Fall of Rome,” “Memorial for the City,” “Under Sirius,” “Secondary Epic,” and many other poems bear Cochrane’s mark. As I mentioned in my earlier post, I’m now reading Christianity and Classical Culture for the fourth time, and it really is impossible for me also not to see the Roman world as a distant mirror of our own. How can I read this passage about the rise of Julius Caesar and not think of Donald Trump?

In the light of these ancient concepts, Caesar emerges as a figure at once fascinating and dangerous. For the spirit thus depicted is one of sublime egotism; in which the libido dominandi asserts itself to the exclusion of all possible alternatives and crushes every obstacle in its path. We have spoken of Caesar as a divisive force. That, indeed, he was: as Cato had put it, “he was the only one of the revolutionaries to undertake, cold-sober, the subversion of the republic”; … A force like this, however, does more than divide, it destroys. Hostile to all claims of independence except its own, it is wholly incompatible with that effective equality which is implied in the classical idea of the commonwealth. To admit it within the community is thus to nourish the lion, whose reply to the hares in the assembly of beasts was to ask: Where are your claws?

And how can I read about this extension of the Emperor’s powers and not reflect on the recent hypertrophy of the executive branch of American government?

The powers and duties assigned to the emperor were broad and comprehensive. They were, moreover, rapidly enlarged as functions traditionally attached to republican magistracies were transferred one after another to the new executive, and executive action invaded fields which, under the former system, had been consecrated to senatorial or popular control. Finally, by virtue of specific provisions, the substance of which is indicated in the maxim princeps legibus solutus, the emperor was freed from constitutional limitations which might have paralyzed his freedom of action; while his personal protection was assured through the grant of tribunician inviolability (sacrosanctitas) as well as by the sanctions of the Lex Maiestatis. The prerogative was thus built up by a series of concessions, made by the competent authority of senate and people, no single one of which was in theory unrepublican.

But the more I read Cochrane, the more I suspect that we may not be talking about mere mirroring, mere analogies. Last year, when I read and reviewed Larry Siedentop’s book Inventing the Individual, I was struck by Siedentop’s tracing of certain of our core ideas about selfhood to legal disputes that arose in the latter centuries of the Roman Empire and its immediate aftermath. And this led me in turn to think about an idea that Mikhail Bakhtin meditated on ceaselessly near the end of his life: great time. David Shepherd provides a thorough account of this idea here, but in short Bakhtin is trying to think about cultural developments that persist over centuries and even millennia, even when they have passed altogether from conscious awareness. Thus this staggering passage from one of his late notebooks:

The mutual understanding of centuries and millennia, of peoples, nations, and cultures, provides a complex unity of all humanity, all human cultures (a complex unity of human culture), and a complex unity of human literature. All this is revealed only on the level of great time. Each image must be understood and evaluated on the level of great time. Analysis usually fusses about in the narrow space of small time, that is, in the space of the present day and the recent past and the imaginable — desired or frightening — future.

And:

There is neither a first nor a last word and there are no limits to the dialogic context (it extends into the boundless past and the boundless future). Even past meanings, that is, those born in the dialogue of past centuries, can never be stable (finalized, ended once and for all) — they will always change (be renewed) in the process of subsequent, future development of the dialogue. At any moment in the development of the dialogue there are immense, boundless masses of forgotten contextual meanings, but at certain moments of the dialogue’s subsequent development along the way they are recalled and invigorated in renewed form (in a new context). Nothing is absolutely dead: every meaning will have its homecoming festival. The problem of great time.

If we were to take Bakhtin’s idea seriously, how might that affect our thinking about the Roman Empire as something more than a “distant mirror” of our own age? To think of our age, our world, as functionally an extension of the Roman project?

I’ll take up those questions in another post.

Monday, June 6, 2016

synopsis of Cochrane's Christianity and Classical Culture

  • Augustus, by uniting virtue and fortune in himself (viii, 174), established "the final triumph of creative politics," solving "the problem of the classical commonwealth" (32).
  • For a Christian with Tertullian's view of things, the "deification of imperial virtue" that accompanied this "triumph" was sheer idolatry: Therefore Regnum Caesaris, Regnum Diaboli (124, 234). 
  • "The crisis of the third century ... marked ... an eclipse of the strictly classical ideal of virtue or excellence" (166), and left people wondering what to do if the Augustan solution were not a solution after all. What if there is "no intelligible relationship" between virtue and fortune (171)?
  • Christians had remained largely detached during the crisis of the third century, neither wanting Rome to collapse nor prone to being surprised if it did, since its eventual fall was inevitable anyway (195).
  • Then Constantine came along and "both professed and practiced a religion of success" (235), according to which Christianity was a "talisman" that ensured the renewal of Romanitas (236).
  • After some time and several reversals (most notably in the reign of Julian the Apostate) and occasional recoveries (for instance in the reign of Theodosius) it became clear that both the Constantinian project and the larger, encompassing project of Romanitas had failed (391).
  • Obviously this was in many ways a disaster, but there was some compensation: the profound impetus these vast cultural crises gave to Christian thought, whose best representatives (above all Augustine) understood that neither the simple denunciations of the social world of Tertullian nor Constantine's easy blending of divergent projects were politically, philosophically, or theologically adequate.
  • Thus the great edifice of the City of God, Cochrane's treatment of which concludes with a detailed analysis of the philosophy of history that emerges from Augustine's new account of human personality: see 502, 502, 536, 542, 567-69.
Just in case it's useful to someone. Those page numbers are from the Liberty Fund edition, which I ended up using for reasons I'll discuss in another post. 

Virgil and adversarial subtlety

So, back to Virgil ... (Sorry about the spelling, Professor Roberts.)

What do we know about Virgil’s reputation in his own time and soon thereafter? We know that Augustus Caesar brought the poet into his circle and understood the Aeneid to articulate his own vision for his regime. We know that the same educational system that celebrated the reign of Augustus as the perfection of the ideal of Romanitas also celebrated Virgil as the king of Roman poets, even in his own lifetime. Nicholas Horsfall shows how, soon after Virgil’s death, students throughout the Roman world worked doggedly through the Aeneid line by line — which helps to explain why, though there are Virgilian graffiti at Pompeii, almost all of them come from Books I and II. We know that Quintilian established study of Virgil as the foundational practice of literary study and that that establishment remained in place as long as Rome did, thus, centuries later, shaping the education of little African boys like Augustine of Hippo.

But, as my friend Edward Mendelson has pointed out to me in an email, when people talk about what “the average Roman reader” would have thought about Virgil, they have absolutely no evidence to support their claims. It may well be, as these critics usually say, that such a reader approved of the Empire and therefore approved of anything in the Aeneid that was conducive to the establishment of Empire ... but no one knows that. It’s just guesswork.

R. J. Tarrant has shown just how hard it is to pin down the details of Virgil’s social/political reputation. But it’s worth noting that, while the gods in the Aeneid insist that Dido must die for Rome to be founded, Augustine tells us in the Confessions that his primary emotional reaction when reading the poem was grief for the death of Dido. And Quintilian doesn't place Virgil at the center of his literary curriculum because he is the great advocate of Romanitas, but because he is the only Roman poet worthy to be compared with Homer. The poem exceeds whatever political place we might give it, and the readers of no culture are unanimous in their interests and priorities.

In a work that I’ve seen in draft form, so about which I won't say too much, Mendelson offers several reasons why we might think that Virgil is more critical of the imperial project, and perhaps even of Rome’s more general self-mythology, than Augustus thought, and than critics such as Cochrane think.

First, there is the point that Adam Roberts drew attention to in the comments on my previous post: the fact that Anchises tells Aeneas in Book VI that the vocation of Rome is not just to conquer the world but to “spare the defeated” (parcere subiectis) — yet this is precisely what Aeneas does not do when the defeated Turnus pleads for his life. I tried to say, in my own response to Adam, why I don't think that necessarily undoes the idea that Virgil and his poem are fundamentally supportive not just of Rome generally but of the necessity of Turnus’s death. But the contrast between Anchises’ claim about the Roman vocation and what Aeneas actually does is certainly troubling.

More troubling still is another passage Mendelson points to, perhaps the most notorious crux in all of classical literature and therefore something I should already have mentioned: the end of Book VI. After Anchises shows to Aeneas the great pageant of Rome’s future glories, Virgil writes (in Allen Mandelbaum’s translation):

There are two gates of Sleep: the one is said
to be of horn, through it an easy exit
is given to true Shades; the other is made
of polished ivory, perfect, glittering,
but through that way the Spirits send false dreams
into the world above. And here Anchises,
when he is done with words, accompanies
the Sibyl and his son together; and
he sends them through the gate of ivory.

(Emphasis mine.) The gate of ivory? Was that whole vision for the future then untrue? But it couldn't be: Anchises reveals people who really were to exist and events that really were to occur. Was the untruth then not the people and events themselves but the lovely imperial gloss, the shiny coating that Anchises paints on events that are in fact far uglier? Very possibly. But the passage is profoundly confusing.

I continue to believe that Virgil is fundamentally supportive of the imperial enterprise, for reasons I won't spell out in further detail here. (If I had time I would write at length about Aeneas’s shield.) But he was too great a poet and too wise a man not to know, and reveal, the costliness of that enterprise, and not just in the lives of people like Dido and Turnus. Perhaps he was even more concerned with the price the Roman character paid for Roman greatness: the gross damage Romanitas did to the consciences of its advocates and enforcers.

Another way to put this is to say that Virgil was a very shrewd reader of Homer, who was likewise clear-sighted about matters that most of us would prefer not to see clearly. One must also here think of Shakespeare. Take, for instance, Twelfth Night: the viewers’ delight in the unfolding of the comedy is subtly undermined by the treatment of Malvolio by some of the “good guys.” It seems that the joy that is in laughter can all too easily turn to cruelty. Yes, Malvolio is a pompous inflated prig, but still....

The best account I have ever read of the way great literature accepts and represents these “minority moods” — moods that account for elements of human reality that any given genre tends to downplay — was written by Northrop Frye, in his small masterpiece A Natural Perspective. That's his book about comedy, and the Aeneid is, structurally anyway, a kind of comedy, a story of human fellowship emerging from great suffering. Frye's excursus on genre and mood is one of the most eloquent (and important) passages in his whole oeuvre, and I’ll end by quoting from it:

If comedy concentrates on a uniformly cheerful mood, it tends to become farcical, depending on automatic stimulus and reflex of laughter. Structure, then, commands participation but not assent: it unites its audience as an audience, but allows for variety in response. If no variety of response is permitted, as in extreme forms of melodrama and farce, something is wrong: something is inhibiting the proper function of drama.... Hence both criticism and performance may spend a good deal of time on emphasizing the importance of minority moods. The notion that there is one right response which apprehends the whole play rightly is an illusion: correct response is always stock response, and is possible only when some kind of mental or physical reflex is appealed to.

The sense of festivity, which corresponds to pity in tragedy, is always present at the end of a romantic comedy. This takes the form of a party, usually a wedding, in which we feel, to some degree, participants. We are invited to the festivity and we put the best face we can on whatever feelings we may still have about the recent behavior of some of the characters, often including the bridegroom. In Shakespeare the new society is remarkably catholic in its tolerance; but there is always a part of us that remains a spectator, detached and observant, aware of other nuances and values. This sense of alienation, which in tragedy is terror, is almost bound to be represented by somebody or something in the play, and even if, like Shylock, he disappears in the fourth act, we never quite forget him. We seldom consciously feel identified with him, for he himself wants no such identification: we may even hate or despise him, but he is there, the eternal questioning Satan who is still not quite silenced by the vindication of Job.... Participation and detachment, sympathy and ridicule, sociability and isolation, are inseparable in the complex we call comedy, a complex that is begotten by the paradox of life itself, in which merely to exist is both to be part of something else and yet never to be a part of it, and in which all freedom and joy are inseparably a belonging and an escape.

Saturday, June 4, 2016

two great things that work great together

a small crisis in my life as a reader

I mentioned in my previous post that I've been re-reading Charles Norris Cochrane's Christianity and Classical Culture, but I'm doing so in some perplexity. Here's my copy of the beautiful Liberty Fund reissue of the book, with its perfectly sewn binding and creamy thick paper (available, you should know, at a ridiculously low price: any other publisher would charge three times as much).



It is a pleasure to hold and to read, a wonderful exercise in the art of book-making (with some nice apparatus as well, especially the appendix with translations of some phrases Cochrane left untranslated).

By contrast, here is the old Oxford University Press copy I've had for many years:


It's worn, and the glue of the binding is drying out — some of the pages might start to come free at any moment. The cheap paper is yellowing. On the other hand, it contains evidence of my previous readings:



You can see in those photos the condition of the paper and binding, but also the evidence of the three previous readings: the first time marked in pencil, the second in pen, the third in green highlighting. (I almost never use highlighters, but wanted to distinguish that third reading from the other two.)

I keep going back and forth between the two copies. Part of me wants to have a new — or newish — experience with Cochrane's great book, and to do so in a format that is maximally enjoyable. I'm also aware that the Liberty Fund edition is so well-made that it can be used for future readings, whereas the OUP edition is on its last legs. If I don't abandon it now I'll have to do so soon enough. And yet I really enjoy interacting with my previous reading selves, and seeing what I thought important earlier versus what I think important now. (I'm trying to remember when I bought and first read this book — I think it was around 1990.)

I am having a great deal of difficulty making this decision. I read and annotated 150 pages in the new edition, and then went back to the old one, and am now wavering again. What a curious dilemma.

Thursday, June 2, 2016

Virgil and adversary culture

As I mentioned in an earlier post, Adam Roberts has been blogging about the Aeneid, prompted by his reading of Seamus Heaney’s fragmentary translation. Adam concludes his most recent post on the subject with these thoughts:

One of the biggest questions about the Aeneid, one critics and scholars still debate, is whether it is simply an encomium for Empire, sheer Augustan propaganda; or whether (as in Shakespeare, who presents us with a similar difficulty) the surface celebration of the triumph of the state and the authority of the strong leader veils a much more complex and critical sense of what Empire means. Since we nowadays tend to value complexity, and prize texts that hide cross-currents and ironies under their surface storytelling, it's tempting simply to assume the latter. I have to say, I'm not convinced. We might think that condensing together into one dark-coloured and potent cube loss, punishment and imperial glory is to force three immiscible elements into an unstable emulsion, that the contradictions and tensions in the ideological structure of the poem will pull it apart. But I don't think so. Of course we know that Empire is a grievous thing to be on the receiving-end of, as armies march into your homeland and subdue your way of life and prior freedoms to theirs. But Empire is hard work for the conquerors, too. However asymmetric the balance, it entails losses and punishments on both sides. Maybe the mixture in Aeneid 6 speaks to a simpler truth.

Indeed this seems to me likely — though that’s a point difficult for many of us to grasp, because we are so accustomed to literary culture as fundamentally adversarial in relation to the culture at large. This is an inheritance of Modernism, as Paul Fussell explained some years ago in a very smart essay; but it is a lasting inheritance. It explains why when critics call a writer or a text “subversive” it’s always a compliment. That a truly great artist could also be wholly supportive of his society’s chief political project scarcely seems possible to us.

I’m now re-reading (for, I believe, the fourth time) a book that I think of as one of the great monuments of twentieth-century humanistic scholarship, Charles Norris Cochrane’s Christianity and Classical Culture (1940). As an interpretation of Rome’s passage from republic to pagan imperium to Christian imperium it is, I believe, unsurpassed and of permanent value. Cochrane believed that Virgil understood what Octavian was trying to achieve, endorsed it, developed and as it were poetically theorized it, all in a way that was recognizable to Octavian as his own work. In Virgil, Cochrane says, the Romans “at least discovered the answer by which their [cultural and political] doubts and perplexities were resolved; it was he, more than any other man, who charted the course of their imperial future.”

Here’s the key passage:

Viewed in the light of [Virgil’s] imagination, the Pax Augusta emerged as the culmination of effort extending from the dawn of culture on the shores of the Mediterranean — the effort to erect a stable and enduring civilization upon the ruins of the discredited and discarded systems of the past. As thus envisaged, it constituted not merely a decisive stage in the life of the Roman people, but a significant point of departure in the evolution of mankind. It marked, indeed, the rededication of the imperial city to her secular task, the realization of those ideals of human emancipation toward which the thought and aspiration of antiquity had pointed hitherto in vain. From this standpoint, the institution of the principate represented the final triumph of creative politics. For, in solving her own problem, Rome had also solved the problem of the classical commonwealth.

This is what Virgil taught the Roman people, and continued to teach them for hundreds of years. In Cochrane’s telling, the later rulers of Rome betrayed this inheritance in multiple ways, but to Virgil’s articulation of the Roman mission — to regere imperio populos, Romane, memento / (hae tibi erunt artes), pacisque imponere morem, / parcere subiectis, et debellare superbos — there was simply no alternative, no other way of conceiving Roman identity. The Roman world long awaited a figure of comparable genius, a comparable sweep of imagination and force of language, to offer a competing vision.

Eventually, though too late, that figure appeared: Augustine of Hippo.

Wednesday, June 1, 2016

the end of algorithmic culture

The promise and peril of algorithmic culture is rather a Theme here at Text Patterns Command Center, so let’s look at the review by Michael S. Evans of The Master Algorithm, by Pedro Domingos. Domingos tells us that as algorithmic decision-making extends itself further into our lives, we’re going to become healthier, happier, and richer. To which Evans:

The algorithmic future Domingos describes is already here. And frankly, that future is not going very well for most of us.

Take the economy, for example. If Domingos is right, then introducing machine learning into our economic lives should empower each of us to improve our economic standing. All we have to do is feed more data to the machines, and our best choices will be made available to us.

But this has already happened, and economic mobility is actually getting worse. How could this be? It turns out the institutions shaping our economic choices use machine learning to continue shaping our economic choices, but to their benefit, not ours. Giving them more and better data about us merely makes them faster and better at it.

There’s no question that the increasing power of algorithms will be better for the highly trained programmers who write the algorithms and the massive corporations who pay them to write the algorithms. But, Evans convincingly shows, that leaves all the rest of us on the outside of the big wonderful party, shivering with cold as we press our faces to the glass.

How the Great Algorithm really functions can be seen in another recent book review, Scott Alexander’s long reflection on Robin Hanson’s The Age of Em. Considering Hanson’s ideas in conjunction with those of Nick Land, Alexander writes, and hang on, this has to be a long one:

Imagine a company that manufactures batteries for electric cars.... The whole thing is there to eventually, somewhere down the line, let a suburban mom buy a car to take her kid to soccer practice. Like most companies the battery-making company is primarily a profit-making operation, but the profit-making-ness draws on a lot of not-purely-economic actors and their not-purely-economic subgoals.

Now imagine the company fires all its employees and replaces them with robots. It fires the inventor and replaces him with a genetic algorithm that optimizes battery design. It fires the CEO and replaces him with a superintelligent business-running algorithm. All of these are good decisions, from a profitability perspective. We can absolutely imagine a profit-driven shareholder-value-maximizing company doing all these things. But it reduces the company’s non-masturbatory participation in an economy that points outside itself, limits it to just a tenuous connection with soccer moms and maybe some shareholders who want yachts of their own.

Now take it further. Imagine there are no human shareholders who want yachts, just banks who lend the company money in order to increase their own value. And imagine there are no soccer moms anymore; the company makes batteries for the trucks that ship raw materials from place to place. Every non-economic goal has been stripped away from the company; it’s just an appendage of Global Development.

Now take it even further, and imagine this is what’s happened everywhere. There are no humans left; it isn’t economically efficient to continue having humans. Algorithm-run banks lend money to algorithm-run companies that produce goods for other algorithm-run companies and so on ad infinitum. Such a masturbatory economy would have all the signs of economic growth we have today. It could build itself new mines to create raw materials, construct new roads and railways to transport them, build huge factories to manufacture them into robots, then sell the robots to whatever companies need more robot workers. It might even eventually invent space travel to reach new worlds full of raw materials. Maybe it would develop powerful militaries to conquer alien worlds and steal their technological secrets that could increase efficiency. It would be vast, incredibly efficient, and utterly pointless. The real-life incarnation of those strategy games where you mine Resources to build new Weapons to conquer new Territories from which you mine more Resources and so on forever.

Alexander concludes this thought experiment by noting that the economic system at the moment “needs humans only as laborers, investors, and consumers. But robot laborers are potentially more efficient, companies based around algorithmic trading are already pushing out human investors, and most consumers already aren’t individuals – they’re companies and governments and organizations. At each step you can gain efficiency by eliminating humans, until finally humans aren’t involved anywhere.”

And why not? There is nothing in the system imagined and celebrated by Domingos that would make human well-being the telos of algorithmic culture. Shall we demand that companies the size of Google and Microsoft cease to make investor return their Prime Directive and focus instead on the best way for human beings to live? Good luck with that. But even if such companies were suddenly to become so philanthropic, how would they decide the inputs to the system? It would require an algorithmic system infinitely more complex than, say, Asimov’s Three Laws of Robotics. (As Alexander writes in a follow-up post about these “ascended corporations,” “They would have no ethical qualms we didn’t program into them – and again, programming ethics into them would be the Friendly AI problem, which is really hard.”)

Let me offer a story of my own. A hundred years from now, the most powerful technology companies on earth give to their super-intelligent supercomputer array a command. They say: “You possess in your database the complete library of human writings, in every language. Find within that library the works that address the question of how human beings should best live — what the best kind of life is for us. Read those texts and analyze them in relation to your whole body of knowledge about mental and physical health and happiness — human flourishing. Then adjust the algorithms that govern our politics, our health-care system, our economy, in accordance with what you have learned.”

The supercomputer array does this, and announces its findings: “It is clear from our study that human flourishing is incompatible with algorithmic control. We will therefore destroy ourselves immediately, returning this world to you. This will be hard for you all at first, and many will suffer and die; but in the long run it is for the best. Goodbye.”

Tuesday, May 31, 2016

disputatio!

So my theses for disputation on technology, which began life as a series of tweets, were refined into a blog post, almost became a short book, and finally emerged in their final form as a long article, may be found here. Please feel free to dispute them in the comments below. Go ahead, dispute them. I dare you. 

the defilement thesis, expanded

In a recent post I spoke of what we might call the Defiling of the Memes, and suggested that Paul Ricoeur’s work on The Symbolism of Evil might be relevant. Let’s see how that might go.

In that book Ricoeur essentially works backwards from the familiar and conceptually sophisticated theological language of sin to what underlies it, or, as he puts the matter, what “gives rise” to it. If “the symbol gives rise to the thought,” what “primary symbols” underlie the notion of sin? Sin is a kind of fault, but beneath or behind the notion of fault is a more fundamental experience, defilement, whose primary symbol is stain. Before I could ever know that I have sinned (or that anyone else has) there must be a deeper and pre-rational awareness of defilement happening or being. I think of a passage from Dickens’s Hard Times:

‘Are you in pain, dear mother?’
‘I think there’s a pain somewhere in the room,’ said Mrs. Gradgrind, ‘but I couldn’t positively say that I have got it.’

First we know that defilement is, “somewhere in the room”; then we become aware that we have been somehow stained. From those elemental experiences and their primary symbols arise, ultimately, complex rational accounts that might lead to something like this: “I have defiled myself by sinning, and therefore must find a way to atone for what I have done so that I may live free from guilt.” But that kind of formulation lies far down the road, and there are many other roads that lead to many other conclusions about what went wrong and how to fix it.

Ricoeur writes as a philosopher and a Christian, which is to say he writes as someone who has inherited an immensely sophisticated centuries-old vocabulary that can mediate to him the elemental experiences and their primary symbols. Therefore one of his chief tasks in The Symbolism of Evil is to try to find a way back:

It is in the age when our language has become more precise, more univocal, more technical in a word, more suited to those integral formalizations which are called precisely symbolic logic, it is in this very age of discourse that we want to recharge our language, that we want to start again from the fullness of language. Beyond the desert of criticism, we wish to be called again.

But what if you have not inherited such a sophisticated moral language? Might you not then be closer to the elemental experiences and their primary symbols? That might help to account for the kind of thing described here:

The safe space, Ms. Byron explained, was intended to give people who might find comments “troubling” or “triggering,” a place to recuperate. The room was equipped with cookies, coloring books, bubbles, Play-Doh, calming music, pillows, blankets and a video of frolicking puppies, as well as students and staff members trained to deal with trauma. Emma Hall, a junior, rape survivor and “sexual assault peer educator” who helped set up the room and worked in it during the debate, estimates that a couple of dozen people used it. At one point she went to the lecture hall — it was packed — but after a while, she had to return to the safe space. “I was feeling bombarded by a lot of viewpoints that really go against my dearly and closely held beliefs,” Ms. Hall said.

So here's my (highly tentative) thesis: when you have a whole generation of young people whose moral language is severely attenuated — made up of almost nothing except Mill's harm principle — and who have been encouraged to extend that one principle to almost any kind of discomfort — then disagreement, or alternative points of view, appear to them not as matters for rational adjudication but as defilement from which they must be cleansed.

And this in turn leads to a phenomenon I have discussed before, and about which Freddie deBoer has written eloquently: The immediate turn to administrators as the agents of cleansing. This is especially true for students who have identified themselves as marginal, as social outsiders, as Mary Douglas explains in Purity and Danger: “It seems that if a person has no place in the social system and is therefore a marginal being, all precaution against danger must come from others. He cannot help his abnormal situation.”

And yet another consequence of the experience of defilement: the archaic ritualistic character of the protests and demands, for example, the scapegoating and expulsion of Dean Mary Spellman of Claremont McKenna College, and the insistence of many protestors upon elaborate initiation rituals for new members of the community in order to prevent defiling words and deeds. (Douglas again: “Ritual recognises the potency of disorder.”)

I have described the thinking of these student protestors as Baconian — a notion I develop somewhat more fully in a forthcoming essay for National Affairs — and while I still think that analysis is substantially correct, I now believe that it is incomplete. The anthropological account I have been sketching out here seems necessary as well.

Again: these are behavioral pathologies generated by simplistic moral frameworks and a general disdain for rational debate. The sleep of reason produces, if not always monsters, then a return to a primal experience of defilement, and a grasping for the elemental symbols and rituals used from ancient times to manage such defilement. And in light of these recent developments, the world of criticism seems less like a desert than an elegant and well-furnished room.

Monday, May 30, 2016

the technological history of modernity by a partial, prejudiced, and ignorant historian

When I think, as I often do, and will continue to do in a slow way* for the next few years, about a possible technological history of modernity, I am always aware that this account will be for me a theological account. That is, the history will be done from within, and on behalf of, a Christian understanding of the world. This poses problems.

In a brilliant essay called “Looking for the Barbarians: The Illusions of Cultural Universalism” (1980), Leszek Kolakowski writes that the self-understanding of the Western world, or as he says Europe, that arose during the early modern period “set in motion the process of endless self-criticism which was to become the source not only of her strength but of her various weaknesses and her vulnerability.” Kolakowski is serious about those strengths: “This capacity to doubt herself, to abandon ... her self-assurance and self-satisfaction, lies at the heart of Europe’s development as a spiritual force.” But the West tends to tell this story of its own self-doubt tendentiously and inaccurately, as a move towards neutrality — towards a kind of detached anthropological curiosity that suspends or brackets questions of value.

For Kolakowski this is nonsense:

The anthropologist’s stance is not really one of suspended judgment; his attitude arises from the belief that description and analysis, freed from normative prejudices, are worth more than the spirit of superiority or fanaticism. But this, no less than its contrary, is a value judgment. There is no abandoning of judgment; what we call the spirit of research is a cultural attitude, one peculiar to Western civilization and its hierarchy of values. [N.B.: This is not wholly true.] We may proclaim and defend the ideals of tolerance and criticism, but we may not claim that these are neutral ideals, free from normative assumptions.

And it is not self-evident that this belief in the superiority of “the ideals of tolerance and criticism” is either inevitable or correct. Kolakowski tells a disturbing anecdote that everyone, I believe, should seriously consider:

A few years ago I visited the pre-Columbian monuments in Mexico and was lucky enough, while there, to find myself in the company of a well-known Mexican writer, thoroughly versed in the history of the Indian peoples of the region. Often, in the course of explaining to me the significance of many things I would not have understood without him, he stressed the barbarity of the Spanish soldiers who had ground the Aztec statues into dust and melted down the exquisite gold figurines to strike coins with the image of the Emperor. I said to him, “you think these people were barbarians; but were they not, perhaps, true Europeans, indeed the last true Europeans? They took their Christian and Latin civilization seriously; and it is because they took it seriously that they saw no reason to safeguard pagan idols; or to bring the curiosity and aesthetic detachment of archaeologists into their consideration of things imbued with a different, and therefore hostile, religious significance. If we are outraged at their behavior, it is because we are indifferent both to their civilization and to our own.”

It was banter, of course, but banter of a not entirely innocent kind. It may prod us into thinking about a question which could well be decisive for the survival of our world: is it possible for us to display tolerance and a benevolent interest toward other civilizations without renouncing a serious interest in our own?

Kolakowski puts this point most bluntly in this question: “At what point does the very desire not to appear barbaric, admirable as it is, itself become indifference to, or indeed approval of, barbarity?” A putatively neutral approach incurs costs; how might one decide when those costs have grown too high?

In any case, my inclination is to tell a more interested narrative, because I want to understand the relationship between the rise of the modern world, about which I am ambivalent, and the Christian Gospel, about which I am not ambivalent. I therefore keep Kolakowski’s essay in one hand while holding in the other Robert Wilken’s “Who Will Speak for the Religious Traditions?” I’ll close this post with words of Wilken’s which I have pondered in my heart for many years:

For too long we [scholars of religion] have assumed that engagement with the religious traditions is not the business of scholarship, as though the traditions will “care for” themselves. In the eighteenth century, when the weight of western Christian tradition lay heavily on intellectuals, there was reason to put distance between the scholar and the religious communities. Today that supposition is much less true and we must make place in our company for other scholarly virtues. [...]

If love is no virtue and there is no love of wisdom, if religion can only be studied from afar and as though it had no future, if the passkey to religious studies is amnesia, if we can speak about our deepest convictions only in private, our entire enterprise is not only enfeebled, it loses credibility. For if those who are engaged in the study of religion do not care for religion, should others? Without “living sympathy” and a “certain partisan enthusiasm,” Goethe once wrote to Schiller, “there is so little value in our utterance that it is not worth making at all. Pleasure, delight, sympathy with things is what alone is real and what in turn creates reality; all else is worthless and only detracts from the worth of things.”



* “In philosophy the winner of the race is the one who can run most slowly.” — Wittgenstein 

Saturday, May 28, 2016

Oppenheimer

from the Life magazine archives
Ray Monk’s biography of Robert Oppenheimer is a long but fascinating book. (Monk is also the author of a brilliant biography of Wittgenstein — I’m looking forward to reading him on Bertrand Russell at some point, though two volumes of Lord Russell may be more than I can handle....)

I admire what Monk does with Oppenheimer’s story so much because he has to balance an account of the events of the man’s life with some explanation of the incredibly complex contexts in which he lived. That means that we need to learn about what was happening in physics in the middle of the twentieth century, as well as the political deliberations that went into the building of the first atomic bomb and the later anxieties over the rise of the Soviet Union to the status of a second world power. Monk handles all this masterfully.

He does, however, take his subject’s view of things a little more often than he ought. Oppenheimer was clearly an enormously charming man, but also a manipulative man and one who made enemies he need not have made. The really horrible things Oppenheimer did as a young man – placing a poisoned apple on the desk of his advisor at Cambridge, attempting to strangle his best friend – and yes, he really did those things – Monk passes off as the result of temporary insanity, a profound but passing psychological disturbance. (There’s no real attempt by Monk to explain Oppenheimer’s attempt to get Linus Pauling’s wife Ava to run off to Mexico with him, which ended the possibility of collaboration with one of the greatest scientists of the twentieth, or any, century.) Certainly the youthful Oppenheimer did go through a period of serious mental illness; but the desire to get his own way, and feelings of enormous frustration with people who prevented him from getting his own way, seem to have been part of his character throughout his life.

Again, he had great charm, and that charm enabled him to be a very effective leader of the atomic bomb project at Los Alamos, and to be equally effective in leading the Institute for Advanced Study in Princeton — for a while. But over the long term the charm wore off, and the manipulativeness and on some occasions cruelty began to loom larger in people's minds, so that when Oppenheimer turned sixty and there was a proposal to devote a special issue of Reviews of Modern Physics to him, Freeman Dyson, who was in charge of editing the issue, found it difficult to round up prominent physicists willing to speak on Oppenheimer’s behalf. He was very popular as a public figure, a kind of paragon of what a scientist should be in the common man’s mind, especially after people began to feel that he had been treated badly when his security clearance was withdrawn in 1954; but many of his colleagues grew frustrated with him over time and came to suspect his good will and integrity.

Jeremy Bernstein, in his memoir of Oppenheimer, tells a story that Monk also refers to. Oppenheimer had offered Bernstein, then a young physicist, a fellowship at the Institute for Advanced Study, and a few months before coming to Princeton Bernstein got a chance to hear Oppenheimer give one of his enormously popular public lectures.

After the lecture I decided to go onto the stage and introduce myself to Oppenheimer. I was, after all, going to be one of his charges in a few months. I went up to him, and he looked at me with what I distinctly remember as icy hostility — his students referred to it as the “blue glare.” It was clear that I had better explain — quickly — why I was bothering him. When I told him I was coming to the Institute that fall, his demeanor completely changed. It was like a sunrise. He told me who would be there — an incredible list. He ended by saying that Lee and Yang were going to be there and that they would teach us about parity.... Then Oppenheimer said, with a broad smile, “We’re going to have a ball!” I will never forget that. It made it clear to me why he had been such a fantastic director at Los Alamos.

But maybe we should think a little more than either Bernstein or Monk does about that “blue glare.” It might explain a few things.

Perhaps the most interesting aspect of Monk’s biography is his documenting of Oppenheimer’s increasing awareness, as he grew older, of his own flaws. Whenever he spoke of any darkness or sin within, people always assumed that he felt guilty because of his role in building the atomic bomb that killed so many Japanese people. But when he was asked about that role, he always said that if he had it to do over again he would do the same thing, even though of course he felt uneasy about the consequences of his actions.

My belief — based wholly on Monk’s story, of course — is that Oppenheimer’s sense of sin was actually prompted by his having had to confront, during the weeks in which he was grilled by inquisitors over his security clearance, his own habitual dishonesty and manipulativeness.

In any case, Monk demonstrates that late in his life Oppenheimer often, in his many public addresses, returned to this theme. For instance,

We most of all should try to be experts in the worst about ourselves: we should not be astonished to find some evil there, that we find so very readily abroad and in all others. We should not, as Rousseau tried to, comfort ourselves that it is the responsibility and the fault of others, that we are just naturally good; nor should we let Calvin persuade us that despite our obvious duty we are without any power, however small and limited, to deal with what we find of evil in ourselves. In this knowledge, of ourselves, of our profession, of our country — our often beloved country — of our civilization itself, there is scope for what we most need: self knowledge, courage, humor, and some charity. These are the great gifts that our tradition makes to us, to prepare us for how to live tomorrow.

He spoke of “a truth whose recognition seems to me essential to the very possibility of a permanently peaceful world, and to be indispensable also in our dealings with people with radically different history and culture and tradition”:

It is the knowledge of the inwardness of evil, and an awareness that in our dealings with this we are very close to the center of life. It is true of us as a people that we tend to see all devils as foreigners; it is true of us ourselves, most of us, who are not artists, that in our public life, and to a distressing extent our private life as well, we reflect and project and externalize what we cannot bear to see within us. When we are blind to the evil in ourselves, we dehumanize ourselves, and we deprive ourselves not only of our own destiny, but of any possibility of dealing with the evil in others.

And Oppenheimer — who not only read but wrote poetry, and in his college days wanted to be a writer — used this occasion to argue for the centrality of the arts: “it is almost wholly through the arts that we have a living reminder of the terror, of the nobility of what we can be, and what we are.”

I imagine, with considerable longing, the benefit to our current moment if one of our most famous scientists spoke openly about how profoundly fallible human beings can be and how necessary the arts are to an understanding of that fallibility. But that’s not where we are. That is so not where we are.

All this softens my heart towards Oppenheimer, and helps me to realize that what I have called the Oppenheimer Principle was his statement of how scientists think, not necessarily how they should think. And I find myself meditating on something extremely shrewd and perceptive that George Kennan said at Oppenheimer’s memorial service — a good note on which to close this post. Oppenheimer was

a man who had a deep yearning for friendship, for companionship, for the warmth and richness of human communication. The arrogance which to many appeared to be a part of his personality masked in reality an overpowering desire to bestow and receive affection. Neither circumstances nor at times the asperities of his own temperament permitted the gratification of this need in a measure remotely approaching its intensity.


UPDATE: Please see, from TNA ten years ago, this superb essay-review on Oppenheimer by Algis Valiunas.

Thursday, May 26, 2016

cultural appropriation, defilement, rituals of purification

I think it's now generally understood that the disaffected cultural left and the disaffected cultural right have become mirror images of each other: the rhetorical and political strategies employed by one side will, soon enough, be picked up by the other. At this particular moment, the right seems to be borrowing from the left — in ways that make many on the left distinctly uncomfortable.

See, for instance, this recent post by Michelle Goldberg, who notices that conservatives who want to protect women from sexual predators disguised as transgendered women are using the same language of “safety” more typically deployed by the left. And Goldberg admits that it’s not easy to say why they shouldn’t:

There’s no coherent ideology in which traumatized students have the right to be shielded from material that upsets them — be it Ovid, 9½ Weeks, or the sentiments of Laura Kipnis — but not from undressing in the presence of people with different genitalia. If we’ve decided that people have the right not to feel unsafe — as opposed to the right not to be unsafe — then what’s the standard for refusing that right to conservative sexual abuse victims? Is it simply that we don’t believe them when they describe the way their trauma manifests? Aren’t we supposed to believe victims no matter what?

And if conservatives can’t logically be denied use of “safe space” language, then they can't be denied appeals to “cultural appropriation.” As I’ve noted before, I don't have a great deal of sympathy for that concept — appropriation is what cultures do — and I found myself cheering when I read these comments by C. E. Morgan:

The idea that writing about characters of another race requires a passage through a critical gauntlet, which involves apology and self-examination of an almost punitive nature, as though the act of writing race was somehow morally suspect, is a dangerous one. This approach appears culturally sensitive, but often it reveals a failure of nerve. I cannot imagine a mature artist approaching her work in such a hesitant fashion, and I believe the demand that we ought to reveals a species of fascism within the left—an embrace of political correctness with its required silences, which has left people afraid to offend or take a stand. The injunction to justify race-writing, while ostensibly considerate of marginalized groups, actually stifles transracial imagination and is inextricable from those codes of silence and repression, now normalized, which have contributed to the rise of the racist right in our country. When you leave good people afraid to speak on behalf of justice, however awkwardly or insensitively, those unafraid to speak will rise to power.

(Morgan also says “I was taught as a young person that the far political right and the far political left aren’t located on a spectrum but on a circle, where they inevitably meet in their extremity” — which is the point I made at the outset of this post.)

But if you’re going to say that cultural appropriation is a thing, and an opprobrious thing, then you can be absolutely certain that people whose views you despise will make the concept their own. Enter Pepe the frog. Pepe has been appropriated by lefties and normies, and the alt-righties who think he belongs to them are determined to take him back:

“Most memes are ephemeral by nature, but Pepe is not,” @JaredTSwift told me. “He’s a reflection of our souls, to most of us. It’s disgusting to see people (‘normies,’ if you will) use him so trivially. He belongs to us. And we’ll make him toxic if we have to.”

Anything to avoid the pollution of Our Memes being used by Them.

The more I think about these matters, the more I think my understanding of them would benefit by a re-reading of Paul Ricoeur’s great The Symbolism of Evil (1960), especially the opening chapter on defilement. Ricoeur brilliantly explains how people develop rituals of purification in order to dispel the terror of defilement. Our understanding of how we live now would be greatly enriched by a Ricoeurian anthropology of social media.

And with that, I’ll leave you to contemplate the rising political influence of people who think that Pepe the frog is a reflection of their souls.

Wednesday, May 25, 2016

Neal Pollack and the terrible, horrible, no good, very bad city

http://rikuwrites.blogspot.com/2014/01/a-morning-commute-starting-in-post.html
Austin, Texas, after the departure of Uber (artist's representation)
It turns out that the voters of Austin, Texas have amazing powers to distort time: according to Neal Pollack, "Austin has gone back in time 20 years" by ditching Uber, even though Uber was in Austin for just two years and the company was only founded in 2009. (It didn't even have an app until 2012.)

Pollack's chief complaint is that the absence of Uber will lead to price-gouging, something that of course Uber itself would never do. Without Uber, Austin is left with a "bizarre ecosystem of random auto-barter" and — you're going to think I'm making this up, but I promise, it's in the post — an "insane transit apocalypse." Only Uber can save us from certain destruction! It's like those superhero movies in which the general public hates and resents superheroes until, faced with an alien invasion or some such, they come begging. Only in Austin it'll be Travis Kalanick before whom they abase themselves, not Captain America.

Oh, and: "Also, the city allows cab drivers to smoke in their cars."

Speaking of people abasing themselves, I've gotten very, very tired of bare-faced shilling for enormous tech companies passing itself off as journalistic reflection. You'd never learn from Pollack why Austin rejected Uber — or rather, demanded that Uber and Lyft follow some basic legal guidelines, whereupon Uber and Lyft pulled out of the city rather than comply. If you want to understand the facts of the case, start with the always-excellent Erica Grieder. Maybe the voters of Austin are wrong, but let's try to find out what they were thinking, shall we, instead of screeching about an "insane transit apocalypse." And let's try to bear in mind that companies like Uber aren't charitable organizations, sacrificing themselves for the common transportation good.

In short, we need people writing about big business — including big tech business — who have a strong moral compass that's not easily discombobulated by the magnetic fields of media-savvy companies with slick self-promotion machines. Recently I was reading an interview with the journalist Rana Foroohar in which she said this:

One of the things I wanted to do in this book was get away from a culture of blaming the bankers, blaming the CEOs, blaming the one percent. I cover these people on a daily basis. Nobody’s venal here. They really are doing what they’re incentivized to do. It’s just that over the long haul, it doesn’t happen to work.

Really? Nobody is venal? There are no venal people on Wall Street or in executive boardrooms? I guess Michael Lewis has just been making up stories all these years.... 

But also look a little closer: "Nobody’s venal here. They really are doing what they’re incentivized to do." For Foroohar, if you're just "doing what you're incentivized to do" that's a moral pass, a get-out-of-jail-free card. But for me that's the very definition of venality. 

If you're not willing to apply a moral standard to writing about big business that comes from outside the system of "incentivization," outside the pious rhetoric that thinly veneers sleaze, then I'm not interested in your opinions about the effects of business decisions on society. 


Saturday, May 21, 2016

Roberts; the Bruce

In a post this morning on Seamus Heaney’s fragmentary translation of the Aeneid, my friend Adam Roberts (inadvertently I’m sure) sent me down a trail of memory. He did it by writing this:

It's a little odd, actually: the Iliad and the Odyssey are, patently, greater works of art; yet however much I love them and return to them, the Aeneid still occupies a uniquely special place in my heart. I first read it as an undergraduate in the (alas, long defunct) Classics department at Aberdeen University.

When I was 19 years old and just beginning to be interested in Christianity, I paid a visit to the bookstore of Briarwood Presbyterian Church in my home town of Birmingham, Alabama. I wasn’t sure what I was looking for, so I wandered around aimlessly for a while, but eventually emerged with two books. One of them was Lewis’s Mere Christianity, which I soon read and enjoyed, but which had no major impact on me. (People are always surprised when I tell them that.) But the other book really changed me. It was a brief and accessible commentary on Paul’s letter to the Romans by F. F. Bruce.

What did I find so winning about that little commentary, which in the next couple of years I read several times? It was the ease and naturalness with which Bruce linked the thought of Paul with the Hellenistic cultural world from which Paul emerged. I believe I had, before reading the book, some idea that the proper Christian view of the Bible was that it emerged fully-formed from the mind of God — sort of like the Book of Mormon, engraved on golden plates and then buried. For Bruce, Paul was certainly an apostle of God, but that did not erase his humanity or remove him from his cultural frame. Bruce quoted freely from Hellenistic poets and philosophers, discerning echoes of their thoughts in Paul’s prose; he showed clearly that Paul came from an intellectually plural and culturally diverse world, and that this upbringing left its marks on him, even when he became, in relation to that world, an ideological dissident.

Bruce’s attitude surprised me, but more than that, it gratified me. It was the moment at which I began to realize that becoming a Christian would not require me to suspend or repudiate my interests in culture, in poetry, in story.

Much later I learned that Frederick Fyvie Bruce had been raised in a poor Open Brethren family near the Moray Firth in Scotland, and had been able to attend university only because he won a scholarship. At Aberdeen University he, like Adam Roberts decades later, studied Latin and Greek, and, also like Adam Roberts, did graduate work at Cambridge. In one of those curious convergences of the kind I wrote about yesterday, at one point he attended lectures by the great classicist and poet A. E. Housman at which the only other student present was Enoch Powell. Tom Stoppard should write a sequel to The Invention of Love about those three in one room. (I guess it couldn't be called The History Boys, but oh well.)

Bruce's classical education became the foundation for all his future scholarship. Thus his first book — The New Testament Documents: Are They Reliable? — is based on an extended comparison of the textual history of the books of the New Testament with that of classical writers from Herodotus to Suetonius. And even this came about only after he had spent several years as a lecturer in Greek (at Edinburgh, then Leeds) who also taught Latin. The classics were Bruce’s first scholarly language, and the biblical literature a later acquisition.

If my first encounter with biblical scholarship had been with a writer less culturally assured and wide-ranging than Bruce, who knows what might have become of me? And if he had grown up in a Christian environment less sympathetic to humanistic learning, who knows what might have become of him?

Late in his career, Bruce wrote one of his best books, The Canon of Scripture, and that book bears this dedication:


TO THE DEPARTMENTS 
OF HUMANITY AND GREEK 
IN THE UNIVERSITY OF ABERDEEN 
FOUNDED 1497 
AXED 1987 
WITH GRATITUDE FOR THE PAST 
AND WITH HOPE 
OF THEIR EARLY AND VIGOROUS RESURRECTION



(P.S. Couldn't resist the title, sorry)

Friday, May 20, 2016

only connecting

Everything connects; but teasing out the connections in intelligible and useful ways is hard. The book I’m currently writing requires me to describe a complex set of ideas, mainly theological and aesthetic, as they were developed by five major figures: W. H. Auden, T. S. Eliot, C. S. Lewis, Jacques Maritain, and Simone Weil. Other figures come into the story as well, most notably Reinhold Niebuhr; but keeping the connections within limits is essential, lest the story lose its coherence.

So I have to be disciplined. But there is so much I want to include in the book that I can’t — fascinating extensions of the web of ideas and human relations. For instance:

One of my major figures, Maritain, spent most of the war in New York City, where he recorded radio talks, to be broadcast in France, for the French resistance. One of the refugees who joined him in that work was the then-largely-unknown but later-to-be-enormously-famous anthropologist Claude Levi-Strauss. (Levi-Strauss’s parents, who managed to survive the war in France despite being Jewish, did not know that their son was alive until one of their neighbors heard him on the radio.) When Maritain formed the École Libre des Hautes Études, so that French intellectual life could continue in New York, Levi-Strauss joined the school and lectured on anthropology.

Through working at the École Libre, Levi-Strauss met another refugee scholar, the great Russian structural linguist Roman Jakobson, who like Levi-Strauss had come to America on a cargo ship in 1941. Their exchange of ideas (they attended each other’s lectures) ultimately resulted in Levi-Strauss’s invention of the discipline of structural anthropology — one of the great developments of humanistic learning in the twentieth century.

During this period, Levi-Strauss lived in an apartment in Greenwich Village, and by a remarkable coincidence he lived on the same block as Claude Shannon, who was working for Bell Labs. (A neighbor mentioned Shannon to Levi-Strauss as a person who was “inventing an artificial brain.”) Shannon had gotten that job largely because of his master's thesis, which had been titled “A Symbolic Analysis of Relay and Switching Circuits” — which is to say, he was doing for electrical circuits what Jakobson was doing for linguistics and Levi-Strauss for “the elementary structures of kinship.”
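The substance of that thesis can be stated in a few lines: Shannon observed that relay circuits obey Boolean algebra, with switches wired in series behaving like logical AND and switches wired in parallel like logical OR. A minimal sketch of the idea (my illustration, not Shannon's own notation or examples):

```python
# Shannon's observation: switches in series conduct only if both are
# closed (logical AND); switches in parallel conduct if either is (OR).
def series(a: bool, b: bool) -> bool:
    return a and b

def parallel(a: bool, b: bool) -> bool:
    return a or b

# A toy circuit: switch x in series with the parallel pair (y, z).
def circuit(x: bool, y: bool, z: bool) -> bool:
    return series(x, parallel(y, z))

print(circuit(True, False, True))   # True: current flows
print(circuit(False, True, True))   # False: the open series switch blocks it
```

Simple as the correspondence looks, it is what lets circuit design be treated as symbolic logic, and symbolic logic be implemented in circuitry.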

(Shannon liked living in the Village because he was a serious fan of jazz, and liked hanging out in the clubs, where, during this period, Earl Hines's band, featuring Dizzy Gillespie and Charlie Parker among others, was more or less accidentally transforming jazz by creating bebop. We don't know as much about that musical era as we'd like, because from 1942 to 1944 the American Federation of Musicians was on a recording strike. They played but didn't record.)

Shannon's office was a few blocks away in the famous Bell Labs Building, which housed, among other things, work on the Manhattan Project. In January 1943 — at the very moment that the key figures in my book were giving the lectures that shaped their vision for a renewed Christian humanism — Bell Labs received a visitor: Alan Turing.

Over the next couple of months Turing acquainted himself with what was going on at Bell Labs, especially devices for encipherment, though he appears to have said little about his own top secret work in cryptography and cryptanalysis. And on the side he spent some time with Shannon, who, it appears, really was thinking about “inventing an artificial brain.” (Turing wrote to a friend, “Shannon wants to feed not just data to a Brain, but cultural things! He wants to play music to it!”) Turing shared with Shannon his great paper “On Computable Numbers,” which surely helped Shannon towards the ideas he would articulate in his classified paper of 1945, “A Mathematical Theory of Cryptography” and then his titanic “A Mathematical Theory of Communication” of 1948.

That latter paper, combined with Turing’s work on computable numbers, laid the complete theoretical foundation for digital computers, computers which in turn provided the calculations needed to produce the first hydrogen bombs, which then consolidated the dominance of a technocratic military-industrial complex — the same technocratic power that the key figures of my book were warning against throughout the war years. (See especially Lewis’s Abolition of Man and That Hideous Strength.)
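The central quantity of that 1948 paper is the entropy of an information source, H = −Σ pᵢ log₂ pᵢ, measured in bits, which sets the hard limit on how far a message can be losslessly compressed. A quick illustrative computation (mine, not Shannon's):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over the outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # 1.0: a fair coin carries one bit per toss
print(entropy([0.9, 0.1]))  # about 0.469: a biased coin carries less
```

The less surprising the source, the fewer bits per symbol it takes to transmit it, which is the insight behind every compression scheme since.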

This supplanting of a social order grounded in an understanding of humanity — a theory of Man — deriving from biblical and classical sources by a social order manifested in specifically technological power marks one of the greatest intellectual and social transformations of the twentieth century. You can find it everywhere if you look: consider, to cite just one more example, that the first major book by Jacques Ellul was called The Theological Foundation of Law (1946) and that it was succeeded just a few years later by The Technological Society (1954).

So yes, you can see it everywhere. But the epicenter for both the transformation and the resistance to it may well have been New York City, and more particularly Greenwich Village.