Friday, September 18, 2015
So when Fred Appel at Princeton University Press, who commissioned and edited my biography of the Book of Common Prayer, asked if I might be interested in turning an expanded version of those theses into a short book, I jumped at the chance. And I'm working on that now.
But many of the ideas that I might normally be developing on this blog need to go into that book — which will probably mean a period of silence here, until the book is completed. (I keep reading things and thinking Hey, I want to write a post about that... wait a minute, I can't write a post about that.)
After I have completed Short But As Yet Unnamed Book Of Theses On Technology, I hope to return to my much bigger book on Christian intellectuals and World War II — which has been on hiatus because I ran into some intractable organizational problems. But having taken a few months away from the project, I can already begin to see how I might resume and reconstruct it.
In the longer term, I hope to return to this space to develop my thoughts on the technological history of modernity — and perhaps turn those into a book as well.
Those are the plans, anyway. I may post here from time to time, but not very often until the Theses are done. Please wish me well!
One vision focuses on how college can be useful — to its graduates, to employers and to a globally competitive America. When presidential candidates talk about making college more affordable, they often mention those benefits, and they measure them largely in dollars and cents. How is it helping postgraduate earnings, or increasing G.D.P.? As college grows more expensive, plenty of people want to know whether they’re getting a good return on their investment. They believe in Utility U.
Another vision of college centers on what John Stuart Mill called “experiments in living,” aimed at getting students ready for life as free men and women. (This was not an entirely new thought: the “liberal” in “liberal education” comes from the Latin liberalis, which means “befitting a free person.”) Here, college is about building your soul as much as your skills. Students want to think critically about the values that guide them, and they will inevitably want to test out their ideas and ideals in the campus community. (Though more and more students are taking degrees online, most undergraduates will be on campus a lot of the time.) College, in this view, is where you hone the tools for the foundational American project, the pursuit of happiness. Welcome to Utopia U.
Together, these visions — Utility and Utopia — explain a great deal about modern colleges and universities. But taken singly, they lead to very different metrics for success.
Appiah walks through this tired old dichotomy only in order to say: Why not both?
(To be clear: like Appiah, I am only addressing the American context. Things can be different elsewhere, as, for example, in Japan, where a government minister has just asked all public universities to eliminate programs in the social sciences and humanities, including law and economics, and to focus instead on “more practical vocational education that better anticipates the needs of society.”)
A good general rule: when someone constructs an argument of this type — Between A and B there seems to be a great gulf fixed, but I have used my unique powers of insight to discern that this is a false dichotomy and we need not choose! — it is unlikely that they have described A fairly or described B fairly or described the conflict between them accurately.
So let's try to parse this out with a little more care:
- Some colleges (mainly the for-profit ones) promise nothing but utility.
- Some colleges (say, St. John's in Annapolis and Santa Fe) promise nothing but what Appiah calls Utopia, that is, an environment for pursuing essential and eternal questions.
- Most colleges follow the example of Peter Quill, Star-Lord, and promise a bit of both.
- Most students want, or at least claim to want, a bit of both. A few are driven primarily by intellectual curiosity, but they'd love to believe that a course of study organized around such explorations can also lead to a decent job after graduation; a great many more are primarily concerned to secure good job opportunities, but also want to confront interesting ideas and beautiful works of art. (Many of my best students in the Honors College at Baylor are pre-med, but love taking literature and philosophy courses for just this reason.)
Given this general state of affairs, with its range of sometimes-complementary and sometimes-conflicting forces at work, Appiah's framing is simplistic — and also serves as a way to avoid the really key question for the coming years: Who will pay, and what will they pay for?
Tuesday, September 15, 2015
One of Sara’s models is the artist Claire Pentecost, who sees herself as a public amateur:
One of the things I’m attached to is learning. And one of the models I’ve developed theoretically is that of the artist as the public amateur. Not the public intellectual, which is usually a position of mastery and critique, but the public amateur, a position of inquiry and experimentation. The amateur is the learner who is motivated by love or by personal attachment, and in this case, who consents to learn in public so that the very conditions of knowledge production can be interrogated. The public amateur takes the initiative to question something in the province of a discipline in which she is not conventionally qualified, acquires knowledge through unofficial means, and assumes the authority to offer interpretations of that knowledge, especially in regard to decisions that affect our lives.
Public amateurs can have exceptional social value, not least because they dare to question experts who want to remain unquestioned simply by virtue of accredited expertise; public amateurs don't take “Trust me, I know what I’m doing” as an adequate self-justification. But perhaps the greatest contribution public amateurs make to society arises from their insistence — it’s a kind of compulsion for them — on putting together ideas and experiences that the atomizing, specializing forces of our culture try to keep in neatly demarcated compartments. This is how an artist and art historian ends up teaching at an engineering school.
There are two traits that, if you wish to be a public amateur, you simply cannot afford to possess. You can’t insist on having a plan and sticking with it, and you can’t be afraid of making mistakes. If you’re the sort of person whose ducks must always be in a neat, clean row, the life of the public amateur is not for you. But as the personal story Sara tells near the end of her talk indicates, sometimes life has a way of scrambling all your ducks. When that happens, you can rage vainly against it; or you can do what Sara did.
Monday, September 14, 2015
After writing today’s post I couldn’t shake the notion that all this conversation about simplifying and rationalizing language reminded me of something, and then it hit me: Gulliver’s visit to the grand academy of Lagado.
A number of the academicians Gulliver meets there are deeply concerned with the irrationality of language, and pursue schemes to adjust it so that it fits their understanding of what science requires. One scholar has built a frame composed of a series of turnable blocks. He has some of his students turn the handles and others write down the sentences produced (when sentences are produced, that is).
But more interesting in light of what Mark Zuckerberg wants are those who attempt to deal with what, in Swift’s time, was called the res et verba controversy. (You can read about it in Hans Aarsleff’s 1982 book From Locke to Saussure: Essays on the Study of Language and Intellectual History.) The controversy concerned the question of whether language could be rationalized in such a way that there is a direct one-to-one match between things (res) and words (verba). This problem some of the academicians of Lagado determined to solve — along with certain other problems, especially including death — in a very practical way:
The other project was, a scheme for entirely abolishing all words whatsoever; and this was urged as a great advantage in point of health, as well as brevity. For it is plain, that every word we speak is, in some degree, a diminution of our lungs by corrosion, and, consequently, contributes to the shortening of our lives. An expedient was therefore offered, “that since words are only names for things, it would be more convenient for all men to carry about them such things as were necessary to express a particular business they are to discourse on.” And this invention would certainly have taken place, to the great ease as well as health of the subject, if the women, in conjunction with the vulgar and illiterate, had not threatened to raise a rebellion unless they might be allowed the liberty to speak with their tongues, after the manner of their forefathers; such constant irreconcilable enemies to science are the common people. However, many of the most learned and wise adhere to the new scheme of expressing themselves by things; which has only this inconvenience attending it, that if a man’s business be very great, and of various kinds, he must be obliged, in proportion, to carry a greater bundle of things upon his back, unless he can afford one or two strong servants to attend him. I have often beheld two of those sages almost sinking under the weight of their packs, like pedlars among us, who, when they met in the street, would lay down their loads, open their sacks, and hold conversation for an hour together; then put up their implements, help each other to resume their burdens, and take their leave.
But for short conversations, a man may carry implements in his pockets, and under his arms, enough to supply him; and in his house, he cannot be at a loss. Therefore the room where company meet who practise this art, is full of all things, ready at hand, requisite to furnish matter for this kind of artificial converse.
Rationalizing language and extending human life expectancy at the same time! Mark Zuckerberg and Ray Kurzweil, meet your great forebears!
If language is bound up in living, if it is an expression of both sense and sensibility, then computers, being non-living, having no sensibility, will have a very difficult time mastering “natural-language processing” beyond a certain rudimentary level. The best solution, if you have a need to get computers to “understand” human communication, may be to avoid the problem altogether. Instead of figuring out how to get computers to understand natural language, you get people to speak artificial language, the language of computers. A good way to start is to encourage people to express themselves not through messy assemblages of fuzzily defined words but through neat, formal symbols — emoticons or emoji, for instance. When we speak with emoji, we’re speaking a language that machines can understand.
People like Mark Zuckerberg have always been uncomfortable with natural language. Now, they can do something about it.
I think we should be very concerned about this move by Facebook. In these contexts, I often think of a shrewd and troubling comment by Jaron Lanier: “The Turing test cuts both ways. You can't tell if a machine has gotten smarter or if you've just lowered your own standards of intelligence to such a degree that the machine seems smart. If you can have a conversation with a simulated person presented by an AI program, can you tell how far you've let your sense of personhood degrade in order to make the illusion work for you?” In this sense, the degradation of personhood is one of Facebook's explicit goals, and Facebook will increasingly require its users to cooperate in lowering their standards of intelligence and personhood.
Friday, September 11, 2015
First, read this post by Jonathan Haidt excerpting and summarizing this article on the culture of campus microaggressions. A key passage:
Campbell and Manning describe how this culture of dignity is now giving way to a new culture of victimhood in which people are encouraged to respond to even the slightest unintentional offense, as in an honor culture. But they must not obtain redress on their own; they must appeal for help to powerful others or administrative bodies, to whom they must make the case that they have been victimized. It is the very presence of such administrative bodies, within a culture that is highly egalitarian and diverse (i.e., many college campuses) that gives rise to intense efforts to identify oneself as a fragile and aggrieved victim. This is why we have seen the recent explosion of concerns about microaggressions, combined with demands for trigger warnings and safe spaces, that Greg Lukianoff and I wrote about in The Coddling of the American Mind.
Now, take a look at this post by Conor Friedersdorf illustrating how this kind of thing works in practice. Note especially the account of an Oberlin student accused of microaggression and the way the conflict escalates.
And finally, to give you the proper socio-political context for all this, please read Freddie deBoer’s outstanding essay in the New York Times Magazine. Here’s an absolutely vital passage:
Current conditions result in neither the muscular and effective student activism favored by the defenders of current campus politics nor the emboldened, challenging professors that critics prefer. Instead, both sides seem to be gradually marginalized in favor of the growing managerial class that dominates so many campuses. Yes, students get to dictate increasingly elaborate and punitive speech codes that some of them prefer. But what could be more corporate or bureaucratic than the increasingly tight control on language and culture in the workplace? Those efforts both divert attention from the material politics that the administration often strenuously opposes (like divestment campaigns) and contribute to a deepening cultural disrespect for student activism. Professors, meanwhile, cling for dear life, trying merely to preserve whatever tenure track they can, prevented by academic culture, a lack of coordination and interdepartmental resentments from rallying together as labor activists. That the contemporary campus quiets the voices of both students and teachers — the two indispensable actors in the educational exchange — speaks to the funhouse-mirror quality of today’s academy.
I wish that committed student activists would recognize that the administrators who run their universities, no matter how convenient a recipient of their appeals, are not their friends. I want these bright, passionate students to remember that the best legacy of student activism lies in shaking up administrators, not in making appeals to them. At its worst, this tendency results in something like collusion between activists and administrators.
This is brilliantly incisive stuff by Freddie, and anyone who cares about the state of American higher education needs to reflect on it. When students demand the intervention of administrative authority to solve every little conflict, they end up simply reinforcing a power structure in which students and faculty alike are stripped of moral agency, in which all of us in the university — including the administrators themselves, since they’re typically reading responses from an instruction manual prepared in close consultation with university lawyers — are instruments in the hands of a self-perpetuating bureaucratic regime. Few social structures could be more alien to the character of true education.
Friedersdorf’s post encourages us to consider whether these habits of mind are characteristic of society as a whole. That seems indubitable to me. When people in the workplace routinely make complaints to HR officers instead of dealing directly with their colleagues, or calling the police when they see kids out on their own rather than talking to the parents, they’re employing the same strategy of enlisting Authority to fight their battles for them — and thereby consolidating the power of those who are currently in charge. Not exactly a strategy for changing the world. Nor for creating a minimally responsible citizenry.
In a fascinating article called “The Japanese Preschool’s Pedagogy of Peripheral Participation,” Akiko Hayashi and Joseph Tobin describe a twofold strategy commonly deployed in Japan to deal with preschoolers’ conflicts: machi no hoiku and mimamoru. The former means “caring by waiting”; the second means “standing guard.” When children come into conflict, the teacher makes sure the students know that she is present, that she is watching — she may even add, kamisama datte miterun, daiyo (the gods too are watching) — but she does not intervene unless absolutely necessary. Even if the children start to fight she may not intervene; that will depend on whether a child is genuinely attempting to hurt another or the two are halfheartedly “play-fighting.”
The idea is to give children every possible opportunity to resolve their own conflicts — even past the point at which it might, to an American observer, seem that a conflict is irresolvable. This requires patient waiting; and of course one can wait too long — just as one can intervene too quickly. The mimamoru strategy is meant to reassure children that their authorities will not allow anything really bad to happen to them, though perhaps some unpleasant moments may arise. But those unpleasant moments must be tolerated, else how will the children learn to respond constructively and effectively to conflict — conflict which is, after all, inevitable in any social environment? And if children don't begin to learn such responses in preschool, when will they learn them? Imagine if at university, or even in the workplace, they had developed no such abilities and were constantly dependent on authorities to ease every instance of social friction. What a mess that would be.
UPDATE: Please see Josh's comment below.
Wednesday, September 9, 2015
While much of the post is supposed to describe what Lee learned about academe in his time as a professor, his thoughts on the State of Higher Education are disjointed and incoherent. One example will suffice: he goes in one sentence from counseling liberal-arts students to skip college in favor of watching relevant YouTube videos to declaring that “Online education isn't the solution,” and doesn't even notice the disconnect. But his prose becomes vibrant when he’s describing ... well, himself.
It turns out that Oliver Lee was absolutely fantastic as an academic. He “launched several digital humanities initiatives”; he speaks of his “effusive student evaluations”; he “coached [his] university's legal debate team to a national championship bid” — though I was sort of sad to see the vague word “bid” tacked on to the end there, since by this point I was expecting nothing less than, you know, an actual championship. But the self-celebration goes on for quite a while.
But in every Eden there’s a serpent, or several. Lee became the object of “sniping” by his colleagues; he was beset by “politics”; one of his projects was “derided as bewildering and gimmicky.” Even his students let him down: immediately after telling us what an excellent lecturer he was — “By professor standards, which admittedly aren't that high, I could rock the mic” (that apparently humble caveat isn’t really a caveat at all, since the only context of this whole essay is academia, and it just gives him a way to demean professors) — he describes how a friend visiting his class was distracted by a student watching Breaking Bad on a laptop. It seems pretty clear that if someone hadn’t told him, Lee never would have guessed that some students failed to notice his mic-rocking abilities.
By the time I got to the end of Lee’s personal narrative I had developed a very strong suspicion that what he may really have been saying was “You can’t fire me, I quit.”
But in any case, I’m reminded by this feature of the quitpiece genre: it is almost always immensely self-congratulatory. People will describe in detail their levels of commitment and energy and the superb work they elicited from their students, and will imply or say explicitly that they were targeted by colleagues or department chairs precisely because they did their work so well. If they acknowledge that they were criticized, such criticisms are invariably dismissed as motivated either by jealousy or by fantastically misplaced priorities.
Within the moral economy of the quitpiece genre — which is not, I suspect, reliably indicative of why most people who leave academia do so — to walk away from an academic job is to turn your back on an institution that doesn't deserve you, isn’t good or pure or rightly-ordered enough for you. I’m longing for a quitpiece that says “I left my job as a professor because I didn't like it and wasn’t very good at it.”
Tuesday, September 8, 2015
(a) They cost the publishers money to produce, and in many cases there is simply no possibility of a larger market, so those publishers need to recoup costs;
(b) Once those books are published they are (barring cultural catastrophe) permanently available to professors, students, and other interested parties who need them, so the publishers do provide something of a service for the scholarly community.
Of course, some academic books are certainly overpriced; but without learning more than I currently know about editing, printing, and distribution costs, I’m not sure which ones fit that bill. Anonymous Academic’s certainty that such publishers are “greedy” seems misplaced to me; I’m pretty sure that if I were driven by greed, academic publishing is not the business I’d go into....
Anyway — speaking purely autobiographically here — I have always wanted to reach the largest audience possible, consistent with being a responsible scholar. And that’s why I think it’s largely been a good thing that a number of the stronger academic publishers — including three that I have written for or am writing for, the university presses of Oxford, Princeton, and Harvard — have in recent years become hybrids: still committed to the scholarly standards embodied by peer review, but also eager to find the largest possible audience for their books. They can even produce the occasional bestseller.
Not every scholarly press can do this, and publishers aren't infallible in deciding which books can be effectively promoted and how to promote them. But the picture of greedy, price-gouging academic presses painted in that post is not, I think, a very accurate one.
Monday, September 7, 2015
Once you are well inside the world of Elena Ferrante’s just-completed quartet — what English-language reviewers are calling the Neapolitan novels but what is really a single long novel published in four volumes — you are not likely to escape. The books are utterly compelling and the world they create as real as real can be. Somewhere Iris Murdoch writes of the kind of story you read to which you simply say, “It is so.” Ferrante has written that kind of story. I had been telling myself for some time that I simply no longer have the tolerance for contemporary realistic fiction. Then I started this story and thought: Oh. I just haven’t come across anything this masterful in a while.
But one thing I find curious: the universal description of these books as being centrally the story of a friendship. I think they are much better described as the story of an overwhelmingly intense, identity-forming, lifelong hatred. Much has been made of the ambiguity of the adjective in the first installment's title: L'amica geniale, The Genius Friend, or, in Ann Goldstein's English translation, My Brilliant Friend — it is what each girl thinks of the other. But not enough attention, I think, has been paid to the deeply misleading, or at best ambivalent, character of the noun amica.
There is no doubt that Elena, the narrator, is fascinated by, obsessed with, in need of, Lila; and Lila is probably just as obsessed by Elena — though Lila's mind remains to some extent obscure to us, in part because Elena tells this story and no one can ever enter fully into the mind of another, and in part because Elena, I think, does not want us to have full access to Lila's inner world and resists entering that world herself. When Elena gets access to documents that reveal much of Lila's thinking, she describes their contents rather sketchily, and then destroys them, unable to remain any longer in their presence.
Elena's destruction of Lila's documents — though surely foreseen by Lila — is just one example of what may be the novel's chief recurring theme: that neither woman ever misses an opportunity to harm the other, to hurt as badly as she can possibly hurt without ending their relationship forever. (To act nastily enough to cause a separation of several years, that each of them will do.) Even when they help one another, such assistance serves to acquire leverage that is later used for cruelty.
Each woman is to the other what the Ring is to Gollum: “He hated it and loved it, as he hated and loved himself. He could not get rid of it. He had no will left in the matter.” So Elena: “I loved Lila. I wanted her to last. But I wanted it to be I who made her last.” This kind of relationship cannot be described simply, but I don't think there’s any meaningful sense in which it can be called a friendship.
The novel can and should be read as, among other things, an appropriately scalding, scarifying indictment of a society that made it impossible for Lenù and Lila to be genuine friends. They were made to be friends, I would say, deeply complementary personalities, helps and correctives for one another. But the world they are brought up in — with its harsh, rigid codes of masculinity and femininity untempered and uncorrected by a Christian message (despite the presence and apparent authority of the Church), with its relentlessly soul-grinding social and economic injustices that generate either defeatism or wild grasping attempts to escape — deforms their connection almost from the start, perverts it, twists it.
On the first disastrous day of Lila’s disastrous marriage, her brother says to her new husband, “She was born twisted and I’m sorry for you.” The new husband replies, “Twisted things get straightened out.” But the overwhelming message of the novels is that they don't. This is not a story of a friendship. It is the tragedy of what should have been a friendship but never was.
Saturday, September 5, 2015
This studied avoidance of the past, of the world — of anything that isn't immediate and local — is bad for the future of fiction and bad for the American mind more generally. The default assumption that our writers can be valid only when they're working in the idioms of their peers is something close to a death sentence for artistic creativity. Looking at reading lists like this, I can't help thinking that they play a significant role in maintaining the dreary sameness that is so characteristic of the fiction and poetry that come out of contemporary MFA programs.
Friday, September 4, 2015
You’re going to be exposed to stuff you don’t like at college. We will try to give you a heads up about the stuff that might upset you, but what is considered potentially offensive is an inherently political, value-laden question, and we aren’t always going to agree with your prior beliefs about that question. We cannot guarantee that everything you might be offended by will come with a warning, and we are under no obligation to attempt to provide one. We will try to work with you with compassion and respect, but ultimately it’s your responsibility to deal with the curriculum that we impose, and not our responsibility to make sure that it doesn’t bother you. If you can’t handle that, you don’t belong in college.
That’s very well said, and I agree with pretty much every word — but I think that a great many students in almost all of our universities will dissent from its premises. They may not be able to articulate their dissent clearly; they may not even consciously formulate it; but I think a dissent is implicit in much of what I read about the various trigger-warnings controversies.
It might go something like this:
You speak of “the curriculum [you] impose,” but I deny that you have the right to impose anything. I am passing through this place, headed for the next stage of my life — possibly graduate education in some form or another, more probably a job — and I am paying you to prepare me for that next stage. In short, we have a business contract in which I am your client, and it is your job to serve what I perceive my needs to be, not what you may happen to think they are. It’s not as though we’re living in that long-ago age when universities were considered repositories of timeless wisdom and professors custodians of that wisdom. You faculty are employees of an ideological state apparatus in a neoliberal regime that constitutes itself by a series of implied or explicit contracts in which goods are exchanged for fees. Please stop acting like this is the University of Paris in the age of Aquinas and we’re all seeking transcendent wisdom. I control my own values and am not even interested in yours, much less willing to be subservient to them. So do the job I am paying you to do and shut up about all that other crap.
Wednesday, September 2, 2015
Or maybe that’s not quite right. My tastes are perhaps more limited, even fixed, than they once were — I am more likely to say, of a book or a movie or a record that people are praising, “Maybe it’s as good as they say, but I pass” — and, moreover, to feel comfortable with that decision and untempted to revisit it. But I’m not inclined to think that my tastes have become increasingly precise, ever more sophisticated; rather, I’m simply aware of the passage of time, the shrinking of the years in front of me, and am less prone to devote lots of time and energy to things that (experience teaches me) I am not likely to find rewarding.
Might I miss some cool stuff? Indeed. And not just “might” but “will.” But here’s the thing about being fifty-six: I know I’ve missed lots of cool stuff. And I’m still here, and not obviously worse off for it. That makes it easier to go with my gut — to grab what looks good and to ignore what doesn’t — but not because I’m smarter than I used to be or more discerning. It’s just a matter of reckoning with the brevity of life. All in all, I’d rather read Jane Austen again.
That said, the next book on my list is the first book of Elena Ferrante’s Neapolitan tetralogy. So I’m not only re-reading the faves.
Commentary on technologies of reading, writing, research, and, generally, knowledge. As these technologies change and develop, what do we lose, what do we gain, what is (fundamentally or trivially) altered? And, not least, what's fun?
Alan Jacobs is Distinguished Professor of the Humanities in the Honors Program of Baylor University and the author, most recently, of The “Book of Common Prayer”: A Biography and The Pleasures of Reading in an Age of Distraction. His homepage is here.