Text Patterns - by Alan Jacobs

Tuesday, May 19, 2015

Pynchon and the "Californian Ideology"

In a recent post I wrote,

The hidden relations between these two worlds — Sixties counterculture and today’s Silicon Valley business world — are, I believe, one of the major themes of Thomas Pynchon’s fiction and the chief theme of his late diptych, Inherent Vice and Bleeding Edge. If you want to understand the moral world we’re living in, you could do a lot worse than to read and reflect on those two novels.

Then yesterday I read this great post by Audrey Watters on what she calls the “Silicon Valley narrative” — a phrase she’s becoming ambivalent about, and wonders whether it might profitably be replaced by “Californian ideology.” That phrase, it turns out, comes from a 1995 essay by Richard Barbrook and Andy Cameron. I knew about this essay, have known about it for years, but had completely forgotten about it until reminded by Watters. Here’s the meat of the introduction:

At the end of the twentieth century, the long predicted convergence of the media, computing and telecommunications into hypermedia is finally happening. Once again, capitalism’s relentless drive to diversify and intensify the creative powers of human labour is on the verge of qualitatively transforming the way in which we work, play and live together. By integrating different technologies around common protocols, something is being created which is more than the sum of its parts. When the ability to produce and receive unlimited amounts of information in any form is combined with the reach of the global telephone networks, existing forms of work and leisure can be fundamentally transformed. New industries will be born and current stock market favourites will be swept away. At such moments of profound social change, anyone who can offer a simple explanation of what is happening will be listened to with great interest. At this crucial juncture, a loose alliance of writers, hackers, capitalists and artists from the West Coast of the USA have succeeded in defining a heterogeneous orthodoxy for the coming information age: the Californian Ideology.

This new faith has emerged from a bizarre fusion of the cultural bohemianism of San Francisco with the hi-tech industries of Silicon Valley. Promoted in magazines, books, TV programmes, websites, newsgroups and Net conferences, the Californian Ideology promiscuously combines the free-wheeling spirit of the hippies and the entrepreneurial zeal of the yuppies. This amalgamation of opposites has been achieved through a profound faith in the emancipatory potential of the new information technologies. In the digital utopia, everybody will be both hip and rich. Not surprisingly, this optimistic vision of the future has been enthusiastically embraced by computer nerds, slacker students, innovative capitalists, social activists, trendy academics, futurist bureaucrats and opportunistic politicians across the USA. As usual, Europeans have not been slow in copying the latest fad from America. While a recent EU Commission report recommends following the Californian free market model for building the information superhighway, cutting-edge artists and academics eagerly imitate the post human philosophers of the West Coast’s Extropian cult. With no obvious rivals, the triumph of the Californian Ideology appears to be complete.

Putting this together with Watters’s post and with my essay on the late Pynchon… wow, does all this give me ideas. Perhaps Pynchon is the premier interpreter of the Californian ideology — especially when you take into account some of his earlier books as well, especially Vineland — someone who understands both its immense appeal and its difficulty in promoting genuine human flourishing. Much to think about and, I hope, to report on here, later.

Monday, May 18, 2015

ideas and their consequences

I want to spend some time here expanding on a point I made in my previous post, because I think it’s relevant to many, many disputes about historical causation. In that post I argued that people don't get an impulse to alter their/our biological conformation by reading Richard Rorty or Judith Butler or any other theorists within the general orbit of the humanities, according to a model of Theory prominent among literary scholars and in Continental philosophy and in some interpretations of ancient Greek theoria. Rather, technological capability is its own ideology with its own momentum, and people who practice that ideology may sometimes be inclined to use Theory to provide ex post facto justifications for what they would have done even if Theory didn’t exist at all.

I think there is a great tendency among academics to think that cutting-edge theoretical reflection is ... well, is cutting some edges somewhere. But it seems to me that Theory is typically a belated thing. I’ve argued before that some of the greatest achievements of 20th-century literary criticism are in fact rather late entries in the Modernist movement: “We academics, who love to think of ourselves as being on the cutting-edge of thought, are typically running about half-a-century behind the novelists and poets.” And we run even further behind the scientists and technologists, who alter our material world in ways that generate the Lebenswelt within which humanistic Theory arises.

This failure of understanding — this systematic undervaluing of the materiality of culture and overvaluing of what thinkers do in their studies — is what produces vast cathedrals of error like what I have called the neo-Thomist interpretation of history. When Brad Gregory and Thomas Pfau, following Etienne Gilson and Jacques Maritain and Richard Weaver, argue that most of the modern world (especially the parts they don't like) emerges from disputes among a tiny handful of philosophers and theologians in the University of Paris in the fourteenth century, they are making an argument that ought to be self-evidently absurd. W. H. Auden used to say that the social and political history of Europe would be exactly the same if Dante, Shakespeare, and Mozart had never lived, and that seems to me not only to be true in those particular cases but also to provide a general rule for evaluating the influence of writers, artists, and philosophers. I see absolutely no reason to think that the so-called nominalists — actually a varied crew — had any impact whatsoever on the culture that emerged after their deaths. When you ask proponents of this model of history to explain how the causal chain works, how we got from a set of arcane, recondite philosophical and theological disputes to the political and economic restructuring of Western society, it’s impossible to get an answer. They seem to think that nominalism works like an airborne virus, gradually and invisibly but fatally infecting a populace.

It seems to me that Martin Luther’s ability to get a local printer to make an edition of Paul’s letter to the Romans stripped of commentary and set in wide margins for student annotation was infinitely more important for the rise of modernity than anything that William of Ockham and Duns Scotus ever wrote. If nominalist philosophy has played any role in this history at all — and I doubt even that — it has been to provide (see above) ex post facto justification for behavior generated not by philosophical change but by technological developments and economic practices.

Whenever I say this kind of thing people reply, “But ideas have consequences!” And indeed they do. But not all ideas are equally consequential; nor do all ideas have the same kinds of consequences. Dante and Shakespeare and Mozart and Ockham and Scotus have indeed made a difference; but not the difference that those who advocate the neo-Thomist interpretation of history think they made. Moreover, and still more important, scientific ideas are ideas too; as are technological ideas; as are economic ideas. (It’s for good reason that Robert Heilbroner called his famous history of the great economists The Worldly Philosophers.)

If I’m right about all this — and here, as in the posts of mine I’ve linked to here, I have only been able to sketch out ideas that need much fuller development and much better support — then those of us who are seriously seeking alternatives to the typical modes of living in late modernity need a much, much better philosophy and theology of technology. Which is sort of why this blog exists ... but at some point, in relation to all the vital topics I’ve been exploring here, I’m going to have to go big or go home.

prosthetics, child-rearing, and social construction

There’s much to think and talk about in this report by Rose Eveleth on prosthetics, which makes me think about all the cool work my friend Sara Hendren is doing. But I’m going to set most of that fascinating material aside for now, and zero in on one small passage from Eveleth’s article:

More and more amputees, engineers, and prospective cyborgs are rejecting the idea that the “average” human body is a necessary blueprint for their devices. “We have this strong picture of us as human beings with two legs, two hands, and one head in the middle,” says Stefan Greiner, the founder of Cyborgs eV, a Berlin-based group of body hackers. “But there’s actually no reason that the human body has to look like as it has looked like for thousands of years.”

Well, that depends on what you mean by “reason,” I think. We should probably keep in mind that having “two legs, two hands [or arms], and one head in the middle” is not something unique to human beings, nor something that has been around for merely “thousands” of years. Bilateral symmetry — indeed, morphological symmetry in all its forms — is something pretty widely distributed throughout the evolutionary record. And there are very good adaptive “reasons” for that.

I’m not saying anything here about whether people should or should not pursue prosthetic reconstructions of their bodies. That’s not my subject. I just want to note the implication of Greiner’s statement — an implication that, if spelled out as a proposition, he might reject, but that is there to be inferred: that bilateral symmetry in human bodies is a kind of cultural choice, something that we happen to have been doing “for thousands of years,” rather than something deeply ingrained in a vast evolutionary record.

You see a similar but more explicit logic in the way the philosopher Adam Swift talks about child-rearing practices: “It’s true that in the societies in which we live, biological origins do tend to form an important part of people’s identities, but that is largely a social and cultural construction. So you could imagine societies in which the parent-child relationship could go really well even without there being this biological link.” A person could say that the phenomenon of offspring being raised by their parents “is largely a social and cultural construction” only if he is grossly, astonishingly ignorant of biology — or, more likely, has somehow managed to forget everything he knows about biology because he has grown accustomed to thinking in the language of an exceptionally simplistic and naïve form of social constructionism.

N.B.: I am not arguing for or against changing child-rearing practices. I am exploring how and why people simply forget that human beings are animals, are biological organisms on a planet with a multitude of other biological organisms with which they share many structural and behavioral features because they also share a long common history. (I might also say that they share a creaturely status by virtue of a common Maker, but that’s not a necessary hypothesis at the moment.) In my judgment, such forgetting does not happen because people have been steeped in social constructionist arguments; those are, rather, just tools ready to hand. There is a deeper and more powerful and (I think) more pernicious ideology at work, which has two components.

Component one: that we are living in an administrative regime built on technocratic rationality whose Prime Directive is, unlike the one in the Star Trek universe, one of empowerment rather than restraint. I call it the Oppenheimer Principle, because when the physicist Robert Oppenheimer was having his security clearance re-examined during the McCarthy era, he commented, in response to a question about his motives, “When you see something that is technically sweet, you go ahead and do it and argue about what to do about it only after you've had your technical success. That is the way it was with the atomic bomb.” Social constructionism does not generate this Prime Directive, but it can occasionally be used — in, as I have said, a naïve and simplistic form — to provide ex post facto justifications for following that principle. We change bodies and restructure child-rearing practices not because all such phenomena are socially constructed but because we can — because it’s “technically sweet.”

My use of the word “we” in that last sentence leads to component two of the ideology under scrutiny here: Those who look forward to a future of increasing technological manipulation of human beings, and of other biological organisms, always imagine themselves as the Controllers, not the controlled; they always identify with the position of power. And so they forget evolutionary history, they forget biology, they forget the disasters that can come from following the Oppenheimer Principle — they forget everything that might serve to remind them of constraints on the power they have ... or fondly imagine they have.

Saturday, May 16, 2015

Station Eleven and global cooling



I recently read Emily St. John Mandel’s Station Eleven, which didn’t quite overwhelm me the way it has overwhelmed many others — though I liked it. It’s good, but it could have been great. The post-apocalyptic world is beautifully and convincingly rendered: I kept thinking, Yes: this is indeed what we would value, should all be lost. But the force of the book is compromised, I think, by its chief structural conceit, which is that all the major characters in the novel’s present tense of civilizational ruin are linked in some way to an actor named Arthur Leander who died just before the Georgia Flu wiped out 99.9% of the human race. This conceit leads Mandel to flash back repeatedly to our own world and moment, and every time that happened I thought Dammit. I just didn’t care about Arthur Leander; I didn't want to read fairly conventional realistic-novel stuff. I wanted to rush through all that to get back to the future Mandel imagines so powerfully.

All that said, I have one small thought, totally irrelevant to my feelings about the book as a whole, that keeps returning to my mind. In one of the book’s first scenes, a troupe of musicians and actors (the Traveling Symphony) is walking along an old road somewhere in Michigan, and it’s very very hot, over a hundred degrees. This is twenty years after civilization died, which makes me wonder: Would the world by then be any cooler? If all of our culture’s heat sources ceased functioning today — no more air conditioners emitting hot air, no more internal combustion engines, no more factories blowing out smoke — how long would it take before there was a measurable cooling of the world’s climate?

Monday, May 11, 2015

rewiring the reading organ

Here's Gary Shteyngart on Saul Bellow:

The first time I tackled Ravelstein, back in 2000, this American mind was as open to long-form fiction as any other and I wolfed the novel down in one Saturday between helpings of oxygen and water and little else. Today I find that Bellow’s comment, ‘It is never an easy task to take the mental measure of your readers,’ is more apt than ever. As I try to read the first pages of Ravelstein, my iPhone pings and squawks with increasing distress. The delicate intellectual thread gets lost. Macaulay. Ping! Antony and Cleopatra. Zing! Keynes. Marimba! And I’m on just pages 5 and 6 of the novel. How is a contemporary person supposed to read 201 pages? It requires nothing less than performing brain surgery on oneself. Rewiring the organ so that the neurons revisit the haunts they once knew, hanging out with Macaulay and Keynes, much as they did in 2000, before encounters with both were reduced to brief digital run-ins on some highbrow content-provider’s blog, back when knowledge was actually something to be enjoyed instead of simply being ingested in small career-sustaining bursts.

Shteyngart is sort of channeling Nick Carr here. Several years ago Carr wrote:

Over the past few years I’ve had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn’t going—so far as I can tell—but it’s changing. I’m not thinking the way I used to think. I can feel it most strongly when I’m reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I’m always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.

Of course, some people have always been this way. In my book The Pleasures of Reading in an Age of Distraction, I claim that John Self, the protagonist of Martin Amis’s early novel Money, is our patron saint. Self tries reading Animal Farm in order to please a woman who bought it for him:

Reading takes a long time, though, don’t you find? It takes such a long time to get from, say, page twenty-one to page thirty. I mean, first you’ve got page twenty-three, then page twenty-five, then page twenty-seven, then page twenty-nine, not to mention the even numbers. Then page thirty. Then you’ve got page thirty-one and page thirty-three — there’s no end to it. Luckily Animal Farm isn’t that long a novel. But novels . . . they’re all long, aren’t they. I mean, they’re all so long. After a while I thought of ringing down and having Felix bring me up some beers. I resisted the temptation, but that took a long time too. Then I rang down and had Felix bring me up some beers. I went on reading.

Nothing against the Shteyngart piece, but it’s not really telling us anything new. People keep reminding themselves that this is The Way We Live Now but they just keep on living that way. Eventually they’ll either live some other way or start telling different stories, I guess.

Wednesday, May 6, 2015

near the end of my (Apple) rope

I bought my first Apple product — the original Macintosh — almost exactly thirty years ago. I have never been as frustrated with Apple products as I am now. Not even close.

A great many of these issues involve communications among machines: on the Mac, Yosemite brought a host of wifi problems; on iOS, Bluetooth has been borked for millions of people since iOS 8 was introduced, especially if you have an iPhone 6 or 6 Plus. Apple wants us to replace iPhoto with its new Photos app, but I can’t get Photos to sync all the pictures that iPhoto handles … well, fairly well, anyway — which is all just part of the larger story, which is that iCloud is a complete disaster.

Marco Arment:

Your computer can’t see my computer on the network or vice versa? The only solution that works is to reboot everything, just like using Windows fifteen years ago. Before Yosemite, I never had these issues on Macs.

Yosemite is now 6 months old, these bugs still aren’t fixed, and it feels like they probably won’t be fixed anytime soon. Yosemite is probably in minimal-maintenance mode as primary resources have likely moved on to headlining features for 10.11. This is what’s so frustrating about today’s Apple: if a bug persists past the early beta stages of its introduction, it rarely ever gets fixed. They’re too busy working on the new to fix the old.

But of course the new will have bugs too. So the bugs keep piling up — not to mention the missing features: e.g., margins in TextEdit documents can’t be changed without some sketchy hacks, items in Reminders can be ordered by date only manually, Safari lacks favicons to help you distinguish tabs by sight, and so on. Then there are inexplicably frustrating software design decisions, like regular backwards-incompatible changes in file formats for some apps (Pages, Keynote). Moreover, interoperability between OS X and iOS seems to be getting worse with time, not better, despite features like Handoff which are supposed to address that very problem but may not work.

And the arrival of the Apple Watch is surely going to exacerbate all these problems, not only with Apple software but with third-party software that companies are trying to make work on three platforms now: OS X, iOS, and the special version of iOS that Watch runs. Mo’ devices, mo’ platforms, mo’ problems.

In light of all this, here’s my current plan:

First, I will stop even trying to get either Bluetooth or iCloud to work. Pretend they don’t exist, because effectively they don’t. Assume that the only backups I have are to an external hard drive. If I want to play music in my car, I’ll either listen to the radio or burn CDs like I did back in the day. (Remember when that was the coolest thing?) Well, if burning CDs still works in iTunes….

Second, move whenever possible to non-Apple software, especially, though not only, when Apple's stuff relies in some way on iCloud. (Hardest thing to replace, for a guy who doesn’t want to use Google or Microsoft products either: Keynote. Keynote is a great, if somewhat bloated, app, but its ever-changing file format makes it a long-term loser.)

Third, if things aren’t any better by 2017 — when my iPhone 6 Plus will be old enough to trade in — switch to Linux, for my phone as well as my computer, assuming that the Ubuntu Phone is available in this country by then. Yes, Linux has plenty of problems; but I won’t be paying a premium price for a system that promises to “just work” but just doesn’t work. If I’m going to have to be in permanent fiddling/hacking mode, let me do it from within an operating system meant for fiddling and hacking.

Tuesday, May 5, 2015

a few words on Age of Ultron

A few random thoughts about Avengers: Age of Ultron:

  • It’s fun.
  • It needed two fewer massive battle set-pieces.
  • James Spader’s Ultron voice is wonderfully creepy and sleazy. (By the way, don't we live in the Golden Age of voice acting? I think Pixar is largely responsible for this.)
  • Joss Whedon knows that his job as director is primarily to give us those massive battle set-pieces, and he does that, but I have a feeling that his heart really isn’t in it — in part because, as writer, he knows that those simply ruin narrative coherence. So he always has strategies for threading the story together.
  • One way he does this is through creating themes that the characters respond to in their varying ways. Perhaps the biggest such theme in this film is: marriage and children. It’s really a wonderful stroke on Whedon’s part to create a (surprisingly and to me gratifyingly long) breathing-space in the movie set in Clint Barton’s ramshackle house in the country, with his wife and children. That sets all the major characters — except Thor, who, you know, is Thor — thinking about what value they place on such a life. It’s because of this theme that Hawkeye — the one Avenger who has no superpowers, genetic modifications, or mind-and-body-altering training — becomes possibly the most important single character in this movie. (I just wish Jeremy Renner were a better actor, because I don't think he quite brings it off.)
  • The other way Whedon builds continuity is through geeky jokes that recur throughout the movie. There are, as always with Whedon, several such here — one that starts when Captain America tells Tony Stark to watch his language, another based on characters trading the line “What, you didn't see that coming?” — but the best one is about Mjölnir, Thor’s hammer. At one point Whedon actually turns the superheroes themselves into fanboys speculating about just how the unliftability of Mjölnir works: “So if you put it in an elevator,” says Cap, only to have Tony cut in: “Elevator’s not worthy.” I just love this stuff, which nobody does better than Whedon.

Anyway, as I say, it’s good wholesome overstuffed bloated fun. Thumbs up.

Monday, May 4, 2015

notification

Matt Gemmell:

The problem with notifications is that they occupy the junction of several unhealthy human characteristics: social pressure of timely response, a need for diversion, and our constant thirst for novelty. Mobile devices exacerbate that issue by letting us succumb to all of those at any moment. That’s not a good thing. I’m constantly horrified that much of Microsoft’s advertising seems to presuppose that working twenty-four hours per day is mankind’s long-sought nirvana.

With the Watch, we’ll be waiting for a long time.

For one thing, notifications are mostly read-only. Most iPhone apps don’t have corresponding Watch apps yet, so you’re simply seeing a notification without the means to respond. Even those notifications that can be handled on the device are inherently constrained by the available screen space, and input methods. For example, responses to messages are limited to assorted emoticons, dictated text, or an audio clip.

The Watch’s size, and the need to raise your wrist, discourages prolonged reading, which automatically makes you filter what you deal with. On the iPhone, or any of its ancestors further up the tree, the default mode of response is now. On the Watch, it’s later.

Okay.... but then, why not just wait until “later” to check your iPhone? Why not just keep the iPhone in the other room, or in your pocket with the notifications turned off? (The latter is what I do: the only notifications I get on my phone are for calls and texts from my loved ones.)

insiders and outsiders

One of the stories often told by fans of the Inklings — C. S. Lewis and J. R. R. Tolkien and their friends — is that their great success is a kind of “revenge of the outsiders” story: writers whose ideas were rejected by the cultural elite end in triumph. The story’s origins lie with the Inklings themselves: so they conceived of themselves, a ragged group of oddballs tending the flame of old tales and old ways while the cultural elite went its corrupt modernist way. Lewis returns to this theme often in his letters.

But were Lewis and Tolkien really outside the mainstream? Consider:

  • Each of them was a fellow of an ancient and prestigious college in one of England’s two elite universities
  • They were the two leading authors of the English curriculum at that university (a curriculum that lasted longer than they did)
  • Each of them published books for that university’s prestigious press
  • One of them (Tolkien) shared a publisher with Bertrand Russell
  • One of them (Lewis) gave immensely popular radio talks for the BBC

Even Owen Barfield, in some ways the most culturally marginal of the major Inklings, early in his career wrote articles for the New Statesman and had a book (Poetic Diction) published by Faber. (After that he was largely self-exiled from the mainstream by his commitment to Anthroposophy.)

To be sure, there were important ways that both Lewis and Tolkien were, in the eyes of some, not quite the right thing at Oxford: neither of them attended an elite public school; Lewis was Irish; Tolkien was Catholic; each of them stood for ideas about literature that were palpably old-fashioned; and Lewis was (in addition to being generally assertive, sometimes to the point of bullying) vocal about being a Christian in ways that struck many of his colleagues as being ill-bred at best. But considering such impediments to insider status, they did amazingly well at finding their way into the midst of things, and they did so before either of them had written anything for which they’re now famous.

Saturday, May 2, 2015

Paul Goodman and Humane Technology

This is a kind of thematic follow-up to my previous post.

A few weeks ago Nick Carr posted a quotation from this 1969 article by Paul Goodman: “Can Technology Be Humane?” I had never heard of it, but it’s quite fascinating. Here’s an interesting excerpt:

For three hundred years, science and scientific technology had an unblemished and justified reputation as a wonderful adventure, pouring out practical benefits, and liberating the spirit from the errors of superstition and traditional faith. During this century they have finally been the only generally credited system of explanation and problem-solving. Yet in our generation they have come to seem to many, and to very many of the best of the young, as essentially inhuman, abstract, regimenting, hand-in-glove with Power, and even diabolical. Young people say that science is anti-life, it is a Calvinist obsession, it has been a weapon of white Europe to subjugate colored races, and manifestly—in view of recent scientific technology—people who think that way become insane. With science, the other professions are discredited; and the academic “disciplines” are discredited.

The immediate reasons for this shattering reversal of values are fairly obvious. Hitler’s ovens and his other experiments in eugenics, the first atom bombs and their frenzied subsequent developments, the deterioration of the physical environment and the destruction of the biosphere, the catastrophes impending over the cities because of technological failures and psychological stress, the prospect of a brainwashed and drugged 1984. Innovations yield diminishing returns in enhancing life. And instead of rejoicing, there is now widespread conviction that beautiful advances in genetics, surgery, computers, rocketry, or atomic energy will surely only increase human woe.

Goodman’s proposal for remedying this new mistrust and hatred of technology begins thus: “Whether or not it draws on new scientific research, technology is a branch of moral philosophy, not of science,” and requires the virtue of prudence. Since “in spite of the fantasies of hippies, we are certainly going to continue to live in a technological world,” this redefinition of technology — or recollection of it to its proper place — is a social necessity. Goodman spells out some details:

  • “Prudence is foresight, caution, utility. Thus it is up to the technologists, not to regulatory agencies of the government, to provide for safety and to think about remote effects.”
  • “The recent history of technology has consisted largely of a desperate effort to remedy situations caused by previous over-application of technology.”
  • “Currently, perhaps the chief moral criterion of a philosophic technology is modesty, having a sense of the whole and not obtruding more than a particular function warrants.”
  • “Since we are technologically overcommitted, a good general maxim in advanced countries at present is to innovate in order to simplify the technical system, but otherwise to innovate as sparingly as possible.”
  • “A complicated system works most efficiently if its parts readjust themselves decentrally, with a minimum of central intervention or control, except in case of breakdown.”
  • “But with organisms too, this has long been the bias of psychosomatic medicine, the Wisdom of the Body, as Cannon called it. To cite a classical experiment of Ralph Hefferline of Columbia: a subject is wired to suffer an annoying regular buzz, which can be delayed and finally eliminated if he makes a precise but unlikely gesture, say by twisting his ankle in a certain way; then it is found that he adjusts quicker if he is not told the method and it is left to his spontaneous twitching than if he is told and tries deliberately to help himself. He adjusts better without conscious control, his own or the experimenter’s.”
  • “My bias is also pluralistic. Instead of the few national goals of a few decision-makers, I propose that there are many goods of many activities of life, and many professions and other interest groups each with its own criteria and goals that must be taken into account. A society that distributes power widely is superficially conflictful but fundamentally stable.”
  • “The interlocking of technologies and all other institutions makes it almost impossible to reform policy in any part; yet this very interlocking that renders people powerless, including the decision-makers, creates a remarkable resonance and chain-reaction if any determined group, or even determined individual, exerts force. In the face of overwhelmingly collective operations like the space exploration, the average man must feel that local or grassroots efforts are worthless, there is no science but Big Science, and no administration but the State. And yet there is a powerful surge of localism, populism, and community action, as if people were determined to be free even if it makes no sense. A mighty empire is stood off by a band of peasants, and neither can win — this is even more remarkable than if David beats Goliath; it means that neither principle is historically adequate. In my opinion, these dilemmas and impasses show that we are on the eve of a transformation of conscience.”

If only that last sentence had come true. I hope to reflect further on this article in later posts.

American TechGnosis

Erik Davis has written a new afterword to his 1998 book TechGnosis, and it’s very much worth a read. It’s a reminder of that wing of contemporary tech culture that grows quite directly out of Sixties counterculture, with Stewart Brand’s Whole Earth Catalog as one of the chief midwives of the transition.

I think TechGnosis continues to speak despite its sometime anachronism because it taps the enigmatic currents of fantasy, hope, and fear that continue to charge our tools, and that speak even more deeply to the profound and peculiar ways those tools shape us in return. These mythic currents are as real as desire, as real as dream; nor do they simply dissipate when we recognize their sway. Nonetheless, technoscience continues to propagate the Enlightenment myth of a rational and calculated life without myths, and to promote values like efficiency, productivity, entrepreneurial self-interest, and the absolute adherence to reductionist explanations for all phenomena. All these day-lit values undergird the global secularism that forms the unspoken framework for public and professional discourse, for the “worldview” of our faltering West. At the same time, however, media and technology unleash a phantasmagoric nightscape of identity crises, alternate realities, memetic infection, dread, lust, and the specter of invisible (if not diabolical) agents of surveillance and control. That these two worlds of day and night are actually one matrix remains our central mystery: a rational world of paradoxically deep weirdness where, as in some dying earth genre scenario, technology and mystery lie side-by-side, not so much as explanations of the world but as experiences of the world.

The hidden relations between these two worlds — Sixties counterculture and today’s Silicon Valley business world — are, I believe, one of the major themes of Thomas Pynchon’s fiction and the chief theme of his late diptych, Inherent Vice and Bleeding Edge. If you want to understand the moral world we’re living in, you could do a lot worse than to read and reflect on those two novels.

I recently read George Marsden’s brief but deeply insightful Twilight of the American Enlightenment, and the most fascinating element of that book is the way Marsden traces the lines of thought and influence that start with America’s great victory in World War II, lead to a sense of spiritual crisis in the 1950s — Is America morally worthy of its leading place in the world? And have we achieved it at the cost of creating a lonely crowd made up of organization men? — and go on from there by a kind of inevitable logic to the social and sexual revolutions of the 1960s.

Those developments were made inevitable, it seems to me, by a single conflict in the American mind. Marsden:

At all these levels of mainstream American life, from the highest intellectual forums to the most practical everyday advice columns, two ... authorities were almost universally celebrated: the authority of the scientific method and the authority of the autonomous individual. If you were in a public setting in the 1950s, two of the things that you might say on which you would likely get the widest possible assent were, one, that one ought to be scientific, and two, that one ought to be true to oneself.

Not much has changed — except that today’s leading technology companies claim to have united these two authorities. They give us, they say, the very science that we need in order to be true to ourselves. Erik Davis’s TechGnosis is one way of believing in that promise, but by no means the only way.

Friday, May 1, 2015

choose your own (reading) adventure

About e-reading: a kind of Standard Model has emerged among book-lovers. For example:

One of the most imperishable notions ever set down about a personal library can be found inside Sven Birkerts’s essay “Notes from a Confession.” Birkerts speaks of “that kind of reading which is just looking at books,” of the “expectant tranquility” of sitting before his library: “Just to see my books, to note their presence, their proximity to other books, fills me with a sense of futurity.” Expectant tranquility and sense of futurity — those are what the noncollector and what the downloader of e-books does not experience, because only an enveloping presence permits them.

I’m sorry but your Nook has no presence.

That’s William Giraldi, who, despite what he says, is definitely not sorry. And here’s Dustin Illingworth:

As an unabashed sensualist, the most obvious deficiency of the digital book, to me, is the scarcity of its satisfactions: its lack of spine and alarming weightlessness, its abstract and odorless pages, the tactile sterility of the entire enterprise. It seems to me that a book’s physicality is part and parcel of its ability to convey an intimate and lasting experience. Books are meant to be handled and smelled, fingers run along worn cloth, words underlined in good black ink, dog-eared corners folded and refolded. Indeed, the materiality of books — pages, fonts, marginalia, previous owners, stains — channel, for me, a kind of literary magic, an aura of lived memory that the eBook cannot aspire to. The drops of blood in my copy of Dune (nosebleed, age 14), the wilted spots in Jude the Obscure, the profound and funny notes in Confederacy of Dunces written by a mystery reader I’ll never meet — this is where the physical book and the vitality of the reader come together, thickening with every encounter. Yes, the ideas within books, their collections of consciousness, are the important things; however, a physical book makes the conveyance itself an essential part of the endless enrichment: a monument to our relationship with the living, growing text.

I don't disagree with any of this ... well, actually, I ... Okay, let me put it this way: You have two choices.

One: You have as many beautiful books as you want. You get the worn cloth, the underlined words, the dog-eared corners — if you so desire. You can have them pristine, if that’s your preference. Even the coffee stains and, um, drops of blood that connect the materiality of your existence with the materiality of the book’s existence. All this can be yours. But you have no control over what books you get: titles, authors, text — all random. Maybe you get Tolstoy, maybe you get Danielle Steel, maybe you get Dr. Oz.

Two: You can only read on e-readers, with all the features (changeable font size, backups of annotations, etc.) such devices typically have. No codexes for you. But you get to choose the books you read.

Which way do you go?

Thursday, April 30, 2015

on Charlie Hebdo and courage



I don't know the source of the above image — it came from this tweet — but if it's accurate it makes nonsense of the claim, made by writers protesting PEN's award to Charlie Hebdo, that by giving the award "PEN is not simply conveying support for freedom of expression, but also valorizing selectively offensive material: material that intensifies the anti-Islamic, anti-Maghreb, anti-Arab sentiments already prevalent in the Western world" (emphasis mine).

If you read the rest of the letter, you'll see that the authors' chief complaint about Charlie Hebdo is in fact that the magazine is not "selectively offensive": it should, in the view of these authors, have refrained from subjecting Muslims to the same satirical scrutiny it subjects everyone else to: "To the section of the French population that is already marginalized, embattled, and victimized, a population that is shaped by the legacy of France’s various colonial enterprises, and that contains a large percentage of devout Muslims, Charlie Hebdo’s cartoons of the Prophet must be seen as being intended to cause further humiliation and suffering."

There's a lot to question here. One might ask why France's Jews are, as far as these authors are concerned, fair game. One might ask, when the authors insist that "The inequities between the person holding the pen and the subject fixed on paper by that pen cannot, and must not, be ignored," whether that logic might not also apply to the inequities between the person holding the pen and the person holding a gun and pointing it at him.

But let's set all that aside. Let's grant, per argumentum, the authors' claim that Charlie Hebdo does wrong by subjecting Muslims to the same critique — though, it appears from the chart above, less frequently — that it subjects others to. Let's even grant that Muslims in France should, uniquely, never be the objects of satire. Does it follow that Charlie Hebdo should not receive this award?

I don't think it does follow. Let's remember what the award is: the PEN/Toni and James C. Goodale Freedom of Expression Courage Award. And I don't see how you could say that the people of Charlie Hebdo are anything but courageous — exceptionally courageous. They were brave before so many of them were murdered, and those who remain and keep working are braver still.

Yes, but — someone will say — they are brave in a wicked cause! Members of the Ku Klux Klan can be brave too! This argument might have more force if Charlie Hebdo typically singled out Muslims for attack, but as we have seen, it hasn't. In the eyes of the authors who are protesting the award, the crime of Charlie Hebdo is that a magazine whose sole raison d'être is savage mockery did not choose to exempt one group and one group only from that mockery. That strikes me as a wholly inadequate reason for refusing to honor exceptional courage — especially since courage is the one virtue that simply must be exercised if freedom of expression is to survive. And freedom of expression is what PEN is all about.

Wednesday, April 29, 2015

Warren Ellis on Facebook

I've written before, in these digital pages, about Facebook, for which I have ... I started to say "an irrational hatred," but I think it's actually a very rational hatred. The great Warren Ellis helps us understand why Facebook is so eminently and rationally hateable.

Tuesday, April 28, 2015

the story of everything

A does something to B. (Maybe A shoots B; maybe A refuses to bake a cake for B. It doesn’t signify.)

Social media pick up the story of what A did to B.

Some members of Group Y are outraged at what A did to B, and demand swift retribution.

Some other members of Group Y are also outraged by what A did to B, but hesitate to commit themselves to a call for swift retribution. They think that perhaps they don’t know all the facts; or they wonder whether the retribution called for might be too extreme. But they stay silent because they don’t want to be called out by, or exiled from, their group.

Some members of Group Z are outraged by the calls for swift retribution. Either because they have a predisposition in favor of A or have been agitated by the rhetoric of the more vocal members of Group Y, they come to A’s defense, and suggest that what A did isn’t that bad after all.

Some other members of Group Z are also outraged, or at least disturbed, by those calls for swift retribution, but aren’t sure that they can come to A’s defense. They think that perhaps they don’t know all the facts; or they wonder whether the justifications of A are really warranted. But they stay silent because they don’t want to be called out by, or exiled from, their group.

The vocal members of Group Y are even more outraged by the attitudes of the vocal members of Group Z than they were by the original actions of A. They double down on their condemnations of A, and demand that the members of Group Z receive their own retribution for justifying the unjustifiable, defending the indefensible.

The vocal members of Group Z ascend to the condition of righteous wrath. Those who had originally said that A’s actions were wrong but not that wrong now say that A’s actions were completely justified, and in fact could have been much more extreme and still justified. They denounce Group Y’s calls for retribution as “McCarthyite” “witch hunts.”

The vocal members of groups Y and Z make no distinction between the aggressively vocal members of the other group and the silent ones. Any attempt to suggest that the members of the other group are not monolithic and unanimous is met with a sneering hashtag: #notallmembers.

Some of those in each group who had remained silent because of uncertainty or an instinctive desire for moderation realize that they’re being targeted just as aggressively as the most extreme members of their group. They begin to suspect that those extremists were right all along about the other group. In their shock at being so condemned, they tend to forget that there are people in the other group who are feeling just as they feel, and for the same reasons. Their silence has led the rest of the world to think that they don’t exist, and that their entire group is fairly characterized by the behavior of its most extreme members. And gradually that assumption, initially false, comes to be true.

Sunday, April 26, 2015

withdrawals and commitments

I posted this on my personal blog, but on reflection I’m thinking that it has a place here, especially with our recent thoughts about attention.

My buddy Rod Dreher writes,

What I call the Benedict Option is this: a limited, strategic withdrawal of Christians from the mainstream of American popular culture, for the sake of shoring up our understanding of what the church is, and what we must do to be the church. We must do this because the strongly anti-Christian nature of contemporary popular culture occludes the meaning of the Gospel, and hides from us the kinds of habits and practices we need to engage in to be truly faithful to what we have been given.

David French responds,

I must admit, my first response to the notion of “strategic withdrawal” is less intellectual and more visceral. Retreat? I recall John Paul Jones’s words, “I have not yet begun to fight,” or, more succinctly, General Anthony McAuliffe’s legendary response to German surrender demands at Bastogne: “Nuts!”  

In reality, Christian conservatives have barely begun to fight. Christians, following the examples of the Apostles, should never retreat from the public square. They must leave only when quite literally forced out, after expending every legal bullet, availing themselves of every right of protest, and after exhausting themselves in civil disobedience. Have cultural conservatives spent half the energy on defense that the Left has spent on the attack?

It strikes me that French is responding to something Rod didn't say: Rod writes of “the strategic withdrawal of Christians from the mainstream of American popular culture,” and French replies that Christians “should never retreat from the public square” — but “popular culture” and “public square” are by no means the same thing.

In most of the rest of his response French emphasizes strictly political issues, for instance, current debates over the extent of free speech. But Rod doesn't say anything about withdrawing from electoral politics — he doesn't say anything about politics at all, except insofar as building and strengthening the ekklesia is political (which it is — see below).

It’s not likely that French and I could ever come to much agreement about the core issues here, since he so readily conflates Christianity and conservatism. (“The surprising box office of God’s Not Dead, the overwhelming success of American Sniper, celebrating the life of a Christian warrior” — I ... I ... — “and the consistent ratings for Bible-themed television demonstrate that there remains a large-scale appetite for works of art that advance, whether by intention or by effect, a substantially more conservative point of view.”) But his response to Rod has the effect of forcing some important questions on those of us who think that the current social and political climate calls for new strategies: What exactly do we mean by “withdraw,” and how far do we withdraw? What specifically do we withdraw from? What are the political implications of cultural withdrawal?

Rod, in the post I quoted at the outset, does a fantastic job of laying out very briefly and concisely the work that needs to be done to strengthen local religious communities. But time, energy, attention, and money are all plagued by scarcity, which is why some kind of “withdrawal” is unavoidable — if I’m going to put more money into my church, that means less money available elsewhere. And if I’m going to devote more attention to active love of God and active love of my neighbor, from what should I withdraw my attention?

All of this is going to remain excessively vague and abstract until we can see specific instances of such withdrawal. But I suspect that different groups of Christians will have widely varying ideas of what needs to be withdrawn from: cable TV, New York Times subscriptions, Hollywood movies, monetary contributions to either of the major political parties, public schools, etc.

So I wonder if a better way to think about the Benedict Option is not as a strategic withdrawal from anything in particular but as a strategic attentiveness to the institutions and forms of life within which Christians can flourish. In other words, Rod’s post is the right starting place, and the language of “withdrawal” something of a distraction from what that post is all about.

My own inclination — but then I have been a teacher for thirtysomething years — is to think that our primary focus should be on the two chief modes of Bildung: paideia and catechesis. And I do not mean for either of these modes to be confined to the formation of children.

If we ask ourselves what genuine Christian Bildung is, and what is required to achieve it in our time, then we will be directed to the construction and conservation of institutions and practices that are necessary for that great task. And then the necessary withdrawals — which may indeed vary from person to person, vocation to vocation, community to community — will take care of themselves.

Thursday, April 23, 2015

things still in the saddle, still riding mankind

A few months ago Sandra Tsing Loh published an essay about the plusses and minuses of living alone — well, that’s what the essay is ostensibly about. But note the following words and phrases that I have extracted from the essay:

  • Basil-cucumber martinis
  • floaty Indian shirts
  • sundresses
  • sandals
  • Uber
  • Robyn’s cottage
  • fresh flowers, art, and pillows
  • separate studio
  • Airbnb
  • natural-wood built-ins
  • frosted-glass cabinets
  • pockets and shelves and drawers that glide
  • model wooden ships
  • guitars
  • amps
  • old Guitar Player magazines
  • Rubbermaid bins full of power cords
  • books, newspapers, and magazines
  • Sundance or IFC Channel
  • rare archival videos
  • ten-hour Ken Burns documentaries
  • medicinal marijuana
  • Sons of Anarchy
  • garage
  • sculpting studio with a kiln
  • dusty boxes of bowling shoes
  • Cassette tapes
  • Wine corks
  • Trader Joe’s single-serve Indian meals
  • microwaveable burritos
  • Kettle Chips
  • veggie bruschetta
  • cocktail
  • his personal Cessna
  • his $425 studio with a hot plate and bathroom down the hall
  • my elegant if somewhat spare (with perfect color accents) bedroom
  • my bed (in some floaty off-white or eggshell-hued peignoir)
  • a cup of perfect coffee (prepared for brewing the night before)
  • a sprawling loft in Chelsea
  • Bullshot, a Bloody Mary that substitutes those noisome 7 grams of carbs in tomato juice with zero-carb beef bouillon
  • second bowl of cereal in the day or peanut butter or yogurt
  • his 250-square-foot converted garage
  • an espresso
  • books, DVDs, appliances
  • large monitors (TV and computer, flickering)
  • an unmade futon topped with wrinkled laundry and a sleeping bag
  • a narrow landing strip of kitchen
  • a “tabletop convection oven” (big enough to bake pizza but not chicken)
  • a ten-gallon water heater
  • Syndrums
  • a poinsettia plant at Christmas and a lily at Easter
  • the bed (with the expensive mattress and hypoallergenic pillows we finally got right)
  • a quietly humming Roomba
  • rubbery vegetables
  • our computers, televisions, and even my personal girlfriend Pandora (Joni Mitchell radio! Celtic Christmas radio!)
  • Netflix
  • my tiny Apple remote (the one little bigger than a stick of chewing gum)
  • quinoa and kale
  • juice fasts

The whole piece seems to be driven by an apparently unconscious but pathologically compulsive inventorying of commodities. It turns out that the real question of the essay is not “Does Living Alone Drive You Mad?” but rather “Which Purchasable Goods Provide Adequate Substitutes for Human Beings?”



P.S. My title comes from Emerson.

Wednesday, April 22, 2015

imagining Thomas More (or not)

The good folks over at First Things are unhappy with the treatment of Thomas More in Wolf Hall, the TV series based on Hilary Mantel’s novel of the same title: here is George Weigel’s response, and here is Mark Movsesian’s. I offered some thoughts on related issues when I reviewed Mantel’s novel five years ago for Books and Culture:

Much of the material culture of the past can be known. When Cromwell describes for the women of his household the clothing of Anne Boleyn — the fabric of her gown, the cut of her headdress — we believe that indeed it was so. If Mantel did not get these details right, she could have and should have. But people's inner lives are always constructed in our imaginations, and this is true whether they are our contemporaries or figures from the distant past. The story of the courtier who finds Cromwell weeping, and to whom Cromwell expresses his fear that he will fall with Wolsey, was not invented by Mantel: it's part of the historical record. Mantel's contribution is the notion that Cromwell lied about his tears and was really thinking of his beloved dead. And this could have been the case; we cannot know. But that's not because Cromwell lived half a millennium ago. When Lord Chancellor More adds to the charges against Wolsey one that Cromwell knows to have been fabricated, Cromwell tries to imagine what went through More's mind when he made that claim — but he cannot do it; More lies always beyond the reach of his imagination, even though the two men are in frequent contact.

Such deep meditations on interior lives are, we have often been told, the fruit of the Reformation. It was Luther and his heirs who taught us to look within and see the baseness there, to be clear-eyed and unwavering in discerning our sin nature, so that we can turn to God and plead only his mercy: "We do earnestly repent, and be heartily sorry for these our misdoings; the remembrance of them is grievous unto us; the burthen of them is intolerable. Have mercy upon us, have mercy upon us, most merciful Father; for thy Son our Lord Jesus Christ's sake forgive us all that is past" — so says the General Confession written by Cromwell's contemporary Thomas Cranmer. And was not Cromwell the effectual architect of the English Reformation, the man whose policies made the emergence of the Church of England possible?

The architect, yes; but — again, this is the conventional narrative — not out of conviction, rather out of mere obedience to King Henry's wish to be freed from the authority of Rome. But, again, who really knows what Cromwell's thoughts on such matters were? The Cromwell conjured by Mantel is deeply drawn to Tyndale's Bible and Tyndale's Lutheran theology — on the deaths of his wife and daughters, he reaches there for comfort rather than to the Catholic piety of his wife, to which he also publicly assents — but "to be drawn to" is not "to be committed to." Cromwell's religious convictions are elusive to us, but Mantel would have us see that they were elusive even to himself. (The same can be said for many of us.) What this Cromwell clearly does believe is that More's theological and ecclesiastical certainties, and the fierce campaign against heresy that they engendered, are bad policy and immoral besides. He — he who is kind even to dogs and cats — flinches at More's cruelties, and sympathizes with the Protestants simply because they are hunted down and persecuted. When he rises to be Henry's chief minister, he becomes a remorseless enemy of the Church's power not because he hates the Church but because he sees how thoroughly power corrupts, and wants to limit it wherever he can.

Again, in all these ways Mantel's Cromwell is a characteristically late-modern Western man who happens to be living at the beginnings of modernity. By envisioning him so, Mantel has rendered much simpler the task of making the historical novel into a psychological novel. Could she have told the story of More, or for that matter Tyndale, in this manner? I think not. Author and protagonist merge nicely at this point: the True Believer remains inaccessible to them both.

What George Weigel, in the first FT piece I linked to above, calls “upmarket anti-Catholicism” is, in my view, simply a failure of historical imagination. Hilary Mantel could only present an admirable Thomas Cromwell by assuming, or pretending, that he’s a lot like people in her social circle: tolerant, skeptical, indulgently affectionate towards children, fond of animals, shy of violence — a typical 21st-century educated Londoner who was inexplicably born half a millennium too early. Having created Cromwell in her own image, Mantel then makes him the proxy for her own inability to make sense of someone like Thomas More.

It doesn't have to be this way. I seriously doubt that Peter Ackroyd’s beliefs are any closer to Thomas More’s than Hilary Mantel’s are, but that didn't stop him from pursuing a deep and sensitive understanding of the man in his brilliant biography. Mantel simply shirked the hard labor of trying to understand people from the distant past, and because her readers, by and large, and the people who made Wolf Hall into a television series, aren't interested in that labor either, we get the cardboard caricature of More that Weigel and Movsesian rightly protest.

Tuesday, April 21, 2015

Indexed

This post on paper clips and other “everyday things” (Henry Petroski really should have been mentioned in it) should be taken as yet another reminder of some important truths:

  • Non-electronic technologies are still technologies;
  • Technologies that have been developed (in some cases perfected) over decades or even centuries are often extremely well-optimized for the work they’re put to;
  • To slightly adapt Friedrich Kittler, “New technologies do not make old technologies obsolete; they assign them other places in the system.”

I’m thinking about these matters a lot because not long ago I made a significant change in my research methods for my book in progress. This is the largest and most complex project I’ve ever attempted, and it has, as Tolkien said about The Lord of the Rings, “grown in the telling”; keeping all the citations, quotes, information, and ideas straight has been ... well, I started to write “extremely difficult,” but I think I need to amend that to “impossible.”

Then, after reading Hua Hsu’s wonderful review-essay, I picked up a copy of Umberto Eco’s How to Write a Thesis, and when I got to his chapter on note cards, a light went on: That’s what I need, I said to myself. Index cards. So here’s what I did:

First, I bought index cards in various colors. Then I assigned a color to each of the major thinkers I’m writing about in my book: W. H. Auden, T. S. Eliot, C. S. Lewis, Jacques Maritain, and Simone Weil — and reserved white cards for general notes (ideas, tasks, etc.). Every time I add a card to any of the colored stacks I number it, so I can cross-reference cards: e.g., the seventeenth blue card (Simone Weil) would be referred to elsewhere as B17. Finally, when the date of a publication or event is relevant, I write that date in the upper right corner. Every few days I read through the cards to discern correspondences, which I can then mark by cross-reference. And when I sit down at the computer I surround myself with these cards, which I can lay out in whatever pattern seems appropriate at the time, taking in the relevant content at a glance.
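
For what it’s worth, the scheme is simple enough that it could be mirrored digitally. Here is a purely hypothetical Python sketch — the one-letter color codes, the class, and every color assignment except blue-for-Weil are my own inventions for illustration, not a description of my actual paper system:

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical mapping of card colors to subjects. Blue = Weil follows the
# B17 example above; the other assignments are invented for illustration.
COLORS = {
    "B": "Simone Weil",       # blue
    "G": "W. H. Auden",       # green (guess)
    "Y": "T. S. Eliot",       # yellow (guess)
    "P": "C. S. Lewis",       # pink (guess)
    "R": "Jacques Maritain",  # red (guess)
    "W": "general notes",     # white
}

@dataclass
class Card:
    color: str                  # one-letter color code, e.g. "B"
    number: int                 # position in that color's stack
    text: str                   # the note itself
    date: Optional[str] = None  # date written in the upper right corner, if any
    see_also: List[str] = field(default_factory=list)  # cross-references like "B17"

    @property
    def label(self) -> str:
        # The label another card would use to refer to this one.
        return f"{self.color}{self.number}"

# The seventeenth blue (Weil) card, cross-referenced from a white general-notes card.
weil_17 = Card("B", 17, "a note on Weil and attention")
general = Card("W", 3, "a general note that points back to the Weil card",
               see_also=[weil_17.label])

print(general.see_also)  # prints ['B17']
```

The paper version keeps its obvious advantage — I can spread the cards out on a desk — but the underlying data model is the same either way: color, number, optional date, and a list of cross-references.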

This is one of the best organizational decisions I have made in a long time, and I’m already thinking about ways to extend it to other kinds of tasks: class planning, for instance. If I learn anything more of interest as this project moves along, I’ll make a report here.

more on the Theses

So let’s recap. Here are my original theses for disputation. Responses:


Just a wonderful conversation — I am so grateful for the responses. The past few weeks have been exceptionally busy for me, so right now I just have time to make a few brief notes, to some of which I hope I can return later.

First, Julia Ticona is exactly right to point out that my theses presume a social location without explicitly articulating what that location is. I’ve thought about these matters before, and written relatively briefly about them: see the discussion of African Christians whose Bibles are on their phones late in this essay; and a modern Orthodox Jewish take on textual technologies here; and the idea of “open-source Judaism” here. But I haven't done nearly enough along these lines, and Ticona’s response reminds me that we are in need of a more comprehensive set of technological ethnographies.

Second, I am really intrigued by Michael Sacasas’s template for thinking about attention. I wonder if we might complicate his admirably clear formulation — hey, it’s what academics do, sue me — by considering Albert Borgmann’s threefold model of information in his great book Holding on to Reality, from the Introduction of which I’ll quote at some length here:

Information can illuminate, transform, or displace reality. When failing health or a power failure deprives you of information, the world closes in on you; it becomes dark and oppressive. Without information about reality, without reports and records, the reach of experience quickly trails off into the shadows of ignorance and forgetfulness.

In addition to the information that discloses what is distant in space and remote in time, there is information that allows us to transform reality and make it richer materially and morally. As a report is the paradigm of information about reality, so a recipe is the model of information for reality, instruction for making bread or wine or French onion soup. Similarly there are plans, scores, and constitutions, information for erecting buildings, making music, and ordering society....

This picture of a world that is perspicuous through natural information and prosperous through cultural information has never been more than a norm or a dream. It is certainly unrecognizable today when the paradigmatic carrier of information is neither a natural thing nor a cultural text, but a technological device, a stream of electrons conveying bits of information. In the succession of natural, cultural, and technological information, both of the succeeding kinds heighten the function of their predecessor and introduce a new function. Cultural information through records, reports, maps, and charts discloses reality much more widely and incisively than natural signs ever could have done. But cultural signs also and characteristically provide information for the reordering and enriching of reality. Likewise technological information lifts both the illumination and the transformation of reality to another level of lucidity and power. But it also introduces a new kind of information. To information about and for reality it adds information as reality. The paradigms of report and recipe are succeeded by the paradigm of the recording. The technological information on a compact disc is so detailed and controlled that it addresses us virtually as reality. What comes from a recording of a Bach cantata on a CD is not a report about the cantata nor a recipe (the score) for performing the cantata, it is in the common understanding music itself. Information through the power of technology steps forward as a rival of reality.

Thinking about Borgmann in relation to Sacasas, I formulate a question which I can only register right now: What if different kinds of information elicit, or demand, different forms of attention?

Finally: Most of my respondents have in some way — though it’s interesting to note the variety of ways — emphasized the need to distinguish between individual decision-making and structural analysis: between (a) whatever technologies you or I might choose to employ or not employ, when we have a choice, and (b) the massive global-capitalist late-modern forces that sustain and enforce our current technopoly. Seeing these distinctions I am reminded of a very similar conversation, that surrounding climate change.

There has been an interesting recent turn in writing about climate change. Whereas advocates for the environment once placed a great emphasis on the things that individuals and families can do — reducing one’s carbon footprint, recycling, etc. — now, it seems to me, it’s becoming more common for them to say that “being green won't solve the problem”. The problems must be addressed at a higher level — at the highest possible level. Technopoly, similarly, won't be altered by boycotting Facebook or writing more by hand or taking the occasional digital detox.

But I might be. Recycling and installing solar panels and avoiding plastic water bottles — these are actions that matter only insofar as they limit destruction to our environment; they don't do anything in particular for me, except add inconvenience. But even if sending postcards to my friends instead of tweeting to them doesn't lessen the grip of the great social-media juggernauts, it can still be a good and worthwhile thing to do. We just need to be sure we don't confuse personal culture with social critique.

Friday, April 17, 2015

Mark Greif and Mrs. Turpin

I’ve written a review of Mark Greif’s The Age of the Crisis of Man for Books and Culture, but it won’t appear for a few months. I think Greif has written a very important, deeply researched, extremely intelligent, and greatly flawed book. I want to take a few minutes here to expand on something I say in the review about its flaws but could not develop fully there.

There I write, “Greif’s belief that religion is on its way out leads him to be less than scrupulous in his research on Christian thinkers and writers, so in dealing with Christian intellectuals, he is never on firm ground — his knowledge is spotty and skimpy, and his readings of Flannery O’Connor are quite uninformed by the necessary theological context. But unlike many academics of our time, he understands that Christian writers matter to the discourse of man, and for this he deserves commendation.”

The culmination of Greif’s chapter on O’Connor is a reading of what may be her greatest story, “Revelation.” I am going to seriously spoil that story here, so if you haven’t read it, please do so before proceeding with this blog post.

Okay? All set?

The story narrates a series of revelations to one Mrs. Ruby Turpin, but here is the culminating one:

At last she lifted her head. There was only a purple streak in the sky, cutting through a field of crimson and leading, like an extension of the highway, into the descending dusk. She raised her hands from the side of the pen in a gesture hieratic and profound. A visionary light settled in her eyes. She saw the streak as a vast swinging bridge extending upward from the earth through a field of living fire. Upon it a vast horde of souls were rumbling toward heaven. There were whole companies of white-trash, clean for the first time in their lives, and bands of black niggers in white robes, and battalions of freaks and lunatics shouting and clapping and leaping like frogs. And bringing up the end of the procession was a tribe of people whom she recognized at once as those who, like herself and Claud, had always had a little of everything and the God-given wit to use it right. She leaned forward to observe them closer. They were marching behind the others with great dignity, accountable as they had always been for good order and common sense and respectable behavior. They alone were on key. Yet she could see by their shocked and altered faces that even their virtues were being burned away. She lowered her hands and gripped the rail of the hog pen, her eyes small but fixed unblinkingly on what lay ahead. In a moment the vision faded but she remained where she was, immobile.

About this passage Greif writes,

Now, one can read this as the usual O’Connor moment of grace or action of mercy. Even the just will have “their virtues ... burned away” in the last judgment. I think, rather, the change here is that there are just people, unillusioned, dignified to the end. And even up to the last, order is maintained. “[A]ccountable as they had always been for good order” is simply not ironic; where other inversions obtain (“white-trash … clean,” “black niggers in white robes”), the ordinary righteous whites are straightforward and “on key.” 

From the option to turn readers away from the worry about man, O’Connor’s last major work turns back to a vision of social order that matters more in the climax of the story than the moment in which human vanity is burned away.

There’s no gentle way to put this: Greif has misunderstood this story about as badly as it is possible to misunderstand a story. And he misunderstands it because he simply doesn’t know the biblical and theological context.

Let’s start with Greif’s belief that Mrs. Turpin and people like her are “just” — that is, righteous — people. This is to accept her at her self-valuation, and the entire point of the story is to undermine, to destroy, that self-valuation. “Revelation” is straightforwardly and openly a midrash on, nearly a retelling of, Jesus’ parable of the Pharisee and the tax-collector. Just as the Pharisee cries out, “God, I thank thee, that I am not as other men are, extortioners, unjust, adulterers, or even as this publican,” so Mrs. Turpin cries out,

“If it’s one thing I am,” Mrs. Turpin said with feeling, “it’s grateful. When I think who all I could have been besides myself and what all I got, a little of everything, and a good disposition besides, I just feel like shouting, ‘Thank you, Jesus, for making everything the way it is!’ It could have been different!” For one thing, somebody else could have got Claud. At the thought of this, she was flooded with gratitude and a terrible pang of joy ran through her. “Oh thank you, Jesus, Jesus, thank you!” she cried aloud. 

The book struck her directly over her left eye.

(In a stroke of comical over-explicitness, the book is thrown by a young woman named Mary Grace. Get it? Mary? Grace?) Like the Pharisee, Mrs. Turpin is utterly pleased with herself, satisfied in every respect, but justifies her self-satisfaction by casting it as gratitude towards God. Her constant mental theme, as she sits in the doctor’s waiting room, is her superiority to the “white-trash woman” who shares the waiting room with her. So one of the most laugh-out-loud funny but also morally incisive moments in the whole story comes when Mary Grace has been restrained and is being taken away to a hospital: “‘I thank Gawd,’ the white-trash woman said fervently, ‘I ain’t a lunatic.’”

In the sections on hope in the Summa — Flannery O’Connor’s standard nighttime reading, as Greif knows — Thomas Aquinas sees the Pharisaical attitude as an embrace of the status comprehensor, a belief that one has spiritually arrived. The proud person therefore shares with the despairing person the trait of motionlessness. ("In a moment the vision faded but she remained where she was, immobile.") The properly hopeful person, on the other hand, is the homo viator, the wayfarer, the one who is still on the road, the one who knows that she has not arrived, the one who sustains herself with the simple prayer of the tax-collector: “Lord have mercy on me a sinner.”

This is why the final vision Mrs. Turpin receives is not, as Greif declares, one of the Last Judgment but rather one of souls on pilgrimage: the pilgrimage that begins in this world and in Catholic teaching continues, for the redeemed, into Purgatory. (Mrs. Turpin can be said to receive a vision of the Last Judgment only in Kafka’s sense of the term: “It is only our conception of time that makes us call the Last Judgment by this name. It is, in fact, a kind of summary court in perpetual session.”) It is noteworthy that Greif slips up and speaks of the “moment in which human vanity is burned away,” when O’Connor says it is the virtues of Mrs. Turpin and her kind that must be burnt — or what they think of as their virtues — what they would appeal to as justifying them in the eyes of men and the eyes of God: “good order and common sense and respectable behavior.” What they must learn, and what they will learn, eventually, is that good order and common sense and respectable behavior and singing on key count for nothing in the economy of the Kingdom of Heaven — in fact, less than nothing.

Greif speaks of people like Mrs. Turpin as “unillusioned,” but this gets it backwards: they are under one of the most powerful illusions of all — that God cares about respectability and will credit the respectable with righteousness. (This is the same illusion that Kierkegaard raged against for most of his career.) Note that Mrs. Turpin is not wrong to think that she is respectable and does stand for “good order”: in that sense Greif is correct to see that the description is not ironic. Her error is to believe that to God any of that matters. It is precisely because this illusion is so pernicious that Mrs. Turpin and those like her bring up the rear of the pilgrimage — far behind the “battalions of freaks and lunatics shouting and clapping and leaping like frogs,” who understand that to sing on key in this situation is to miss the point rather spectacularly — and make it into the Kingdom by the skin of their teeth; it is precisely because this illusion is so powerful that they persist in it even as their virtues are being burned away.

You don't have to know Aquinas to understand all this; but you probably do have to know the story of the Pharisee and the tax-collector. As our cultural elites lose even the most elementary biblical literacy, this is going to happen more and more often: reading the Bible-saturated literature of the past and missing, not secondary and trivial allusions, but the entire point of stories and novels and plays and poems, and for that matter paintings and sculptures and musical compositions. The artistic past of the West will become incomprehensible, but — and this is the scary thing — no one will know that they’re misreading. Gross errors will be passed down from teacher to student, from scholar to reader, and it is difficult to imagine circumstances arising in which they can be corrected.

Wednesday, April 15, 2015

isolation and proximity

Sobering thoughts from Matthew Loftus in response to Wesley Hill's new book Spiritual Friendship:

I think a lot of this decline in human relationships can be traced to individualism and consumer culture, and I’d argue that our uncritical use of technology and social mobility make this worse by giving us more power to isolate ourselves from the unlovable. However, it’s worth noting that architecture and economics play crucial roles here as well: if we don’t design the places that we live in order to interact with one another, we’ll self-segregate until we’re just alone with our screens all the time (while driving ourselves whatever distance we can tolerate to the school, restaurant, or church of our choosing.) Thus, if we want to promote the sort of friendship that Wesley wants, we’re going to have to push back against the forces that put each of us at a comfortable distance from one another. I think that one key way to do this (amplifying the final suggestion he gives in the book) is to increase our physical proximity to each other across the board and intentionally promote the understanding that more proximity should bring with it more responsibility to those that we are close to.

Tuesday, April 14, 2015

rant of the day

Fantastic rant this morning from Maciej Ceglowski, creator of the invaluable Pinboard, about this new service:

The chief reason I keep arguing with Ned O'Gorman about whether things can want — latest installment here — is that I think the blurring of lines between the agency of animals (especially people) and the agency of made objects contributes to just this kind of thing: if we can script the Internet of Things, why not script people too? Once they're scripted they want what they've been scripted to do. (Obviously O'Gorman doesn't want to see that happen any more than I do: our debate is about the tendencies of terms, not about substantive ethical and political questions.)

Saturday, April 11, 2015

Carr on Piper on Jacobs

Here’s Nick Carr commenting on the recent dialogue at the Infernal Machine between me and Andrew Piper:

It’s possible to sketch out an alternative history of the net in which thoughtful reading and commentary play a bigger role. In its original form, the blog, or web log, was more a reader’s medium than a writer’s medium. And one can, without too much work, find deeply considered comment threads spinning out from online writings. But the blog turned into a writer’s medium, and readerly comments remain the exception, as both Jacobs and Piper agree. One of the dreams for the web, expressed through a computer metaphor, was that it would be a “read-write” medium rather than a “read-only” medium. In reality, the web is more of a write-only medium, with the desire for self-expression largely subsuming the act of reading. So I’m doubtful about Jacobs’s suggestion that the potential of our new textual technologies is being frustrated by our cultural tendencies. The technologies and the culture seem of a piece. We’re not resisting the tools; we’re using them as they were designed to be used.

I’d say that depends on the tools: for instance, this semester I’m having my students write with CommentPress, which I think does a really good job of preserving a read-write environment — maybe even better, in some ways, than material text, though without the powerful force of transcription that Andrew talks about. (That may be irreplaceable — typing the words of others, while in this respect better than copying and pasting them, doesn’t have the same degree of embodiment.)

In my theses I tried to acknowledge both halves of the equation: I talked about the need to choose tools wisely (26, 35), but I also said that without the cultivation of certain key attitudes and virtues (27, 29, 33) choosing the right tools won’t do us much good (36). I don’t think Nick and I — or for that matter Andrew and I — disagree very much on all this.

Wednesday, April 8, 2015

what buttons want

Ned O’Gorman, in his response to my 79 theses, writes:

Of course technologies want. The button wants to be pushed; the trigger wants to be pulled; the text wants to be read — each of these want as much as I want to go to bed, get a drink, or get up out of my chair and walk around, though they may want in a different way than I want. To reserve “wanting” for will-bearing creatures is to commit oneself to the philosophical voluntarianism that undergirds technological instrumentalism.

We’re in interesting and difficult territory here, because what O’Gorman thinks obviously true I think obviously false. In fact, it seems impossible to me that O’Gorman even believes what he writes here.

Take for instance the case of the button that “wants to be pushed.” Clearly O’Gorman does not believe that the button sits there anxiously as a finger hovers over it thinking o please push me please please please. Clearly he knows that the button is merely a piece of plastic that when depressed activates an electrical current that passes through wires on its way to detonating a weapon. Clearly he knows that an identical button — buttons are, after all, to adopt a phrase from the poet Les Murray, the kind of thing that comes in kinds — might be used to start a toy car. So what can he mean when he says that the button “wants”?

I am open to correction, but I think he must mean something like this: “That button is designed in such a way — via its physical conformation and its emplacement in contexts of use — that it seems to be asking or demanding to be used in a very specific way.” If that’s what he means, then I fully agree. But to call that “wanting” does gross violence to the term, and obscures the fact that other human beings designed and built that button and placed it in that particular context. It is the desires, the wants, of those “will-bearing” human beings, that have made the button so eminently pushable.

(I will probably want to say something later about the peculiar ontological status of books and texts, but for now just this: even if I were to say that texts don’t want I wouldn't thereby be “divesting” them of “meaningfulness,” as O’Gorman claims. That’s a colossal non sequitur.)

I believe I understand why O’Gorman wants to make this argument: the phrases “philosophical voluntarism” and “technological instrumentalism” are the key ones. I assume that by invoking these phrases O’Gorman means to reject the idea that human beings stand in a position of absolute freedom, simply choosing whatever “instruments” seem useful to them for their given project. He wants to avoid the disasters we land ourselves in when we say that Facebook, or the internal combustion engine, or the personal computer, or nuclear power, is “just a tool” and that “what matters is how you use it.” And O’Gorman is right to want to critique this position as both naïve and destructive.

But he is wrong if he thinks that this position is entailed in any way by my theses; and even more wrong to think that this position can be effectively combatted by saying that technologies “want.” Once you start to think of technologies as having desires of their own you are well on the way to the Borg Complex: we all instinctively understand that it is precisely because tools don’t want anything that they cannot be reasoned with or argued with. And we can become easily intimidated by the sheer scale of technological production in our era. Eventually we can end up talking even about what algorithms do as though algorithms aren’t written by humans.

I trust O’Gorman would agree with me that neither pure voluntarism nor purely deterministic defeatism is an adequate response to the challenges posed by our current technocratic regime — or the opportunities offered by human creativity, the creativity that makes technology intrinsic to human personhood. It seems that he thinks the dangers of voluntarism are so great that they must be contested by attributing what can only be a purely fictional agency to tools, whereas I believe that the conceptual confusion this creates leads to a loss of a necessary focus on human responsibility.



cross-posted at The Infernal Machine