And the crowd is going crazy!

Early Thursday morning, I wrote that I was going to see a friend defend her dissertation. And did she ever. She did a brilliant job; she made everyone in the room smarter. All five committee members said that the work was exciting, thorough, and showed intellectual maturity. They were offering advice about making the dissertation into a book, always a good sign. The nearly two hours of conversation were a celebration of the very best things about academic life, the perfect example of why people become fascinated with small details of life that illuminate larger regions.

And then the ref blew the call.

This dissertation, this marvel of scholarship, was not passed, and the PhD was not conferred. The work was “accepted with revisions.” In my experience, that usually means that the work is mostly sound but there’s some big methodological or theoretical hole that needs to be patched before the ship sails free. It’s an indicator of a couple of months of remaining effort, which the committee believes you capable of but is concerned enough about to reserve judgment.

But in this case… typos. Inconsistent capitalization. Bring us back a clean copy in three weeks.

It’s like calling back a towering home run because one of the batter’s cleats touched the chalk of the batter’s box. It’s like closing down the run of a Broadway show because one actor’s suit jacket had two buttons instead of the appropriate three. Inconsistency in trivia is part of every fast-moving document. We all know it, and we all prepare to do copy editing before we go to final publication. But the editors who review for acquisition discuss the work on its merits, knowing that the tuning will occur.

So what a decent human would have done in this case is to congratulate their student, shake her hand and welcome her to the doctoral community… and then privately, off-stage, say “Here’s a marked-up copy. You’re going to want to do a little proofreading before you send this in to the graduate school for the archive.”

But no. This guy, who’d already taken three or four opportunities to demean the assembled graduate students who’d come in support of their colleague, decided that inconsistent capitalization was sufficient to withhold his blessing. He had one last chance to be a decent person to those in his care, and he fucked it up.

If this were a stadium, there would be beer cups flying onto the field. The commentators would be showing replays from six different camera angles, all clearly demonstrating the umpire’s error and the success of the play. And the raucous cry would emerge from fifty thousand voices:

In the greater scheme of life, a blown call on the field or in the seminar room doesn’t really matter. My friend will have her PhD in three weeks, and the work will still be really smart. It’ll all be behind us. But this is the kind of petty exercise of power and status that academia is so deservedly mocked for. And it makes me grateful all over again to stand outside it.

(By the way… if you’re reading this and you’re wondering if it might be about you, if you’re the scholar who didn’t generously support the students whose careers literally depend on your judgment… then yes, it’s you. Even if you weren’t THIS guy in THIS room, take a moment to ask what you do, every day, to ease the lives of those who work tirelessly and intelligently to enter your community. Sometimes I think the opposite of “tyrant” is just “grown-up.” Be a grown-up, and don’t make things worse for the people around you.)


It’s been a long week, between answering tons of e-mail and doing road review duty all day Monday after the storm. But today, a treat: I get to drive to see a good friend defend her dissertation, the culmination of years of rigorous, smart work.

One of my stories begins with a dissertation defense. This is what it feels like from inside.

Every test had been not merely met but exceeded. Which brought Kurt to this day in April, as he prepared to defend his dissertation, the moment in which he would walk into the arena defended only by the shield of his scholarship, to do battle with the aging lions of his discipline. Six years of work, courses high-passed and exams high-praised, publications and prizes, the final two years in the field and in the archives, all leading to his moment in the well of the lecture hall, alone in the face of three committee members and two outside readers who would, within this afternoon, decide his worth. There would be plenty of others in attendance: fellow doctoral students, stray faculty with nothing better to do, friends from other departments, maybe three dozen or more whose presence would make the room friendlier, make it seem more like an everyday classroom lecture. But only those five in the front row would have the power of jurors, to determine his verdict.

He had paid an individual visit to each of those five offices in the past two days, on the surface a courtesy call to thank them for their guidance to his research and writing, but really to sniff out hints of their pending judgment, to build a forecast of this afternoon’s weather. The climate models looked promising, all smiles and compliments. His girlfriend Megan had reminded him that defenses didn’t get scheduled unless the dissertation chair was satisfied with the work—and Jane Clendenon wouldn’t tolerate being embarrassed by her advisee’s performance. Jane’s permission to move forward had been the real hurdle, passed back in February; her words of reassurance yesterday served as her blessing and confirmation.

He’d run through his PowerPoint deck four times that morning, delivering his research talk to his empty office until it was burned into his mind, not merely its content but its cadences, its natural points to pause, the places where an elegant turn of phrase could hang in the air for a moment’s appreciation. Like many introverts, Kurt had learned to perform, had learned how to command a room in ways that forestalled more unplanned interactions. His course lectures had a theatrical sensibility, an appreciation for what a friend had called “the rhetorical circumstance” of a lone performer standing before dozens or hundreds of people. His students would occasionally bring visiting friends and family to see the show, to see the heights to which a University of Michigan classroom could rise.

Now there was nothing left to prepare. He checked his lecturer’s toolkit—remote control, laser pointer, water bottle, pen and legal pad—one last time. Walked from his office to the bathroom, where he peed a tiny volume, the third time in an hour, just to have something to do. Then he washed his hands, combed his hair in the mirror, adjusted his tie unnecessarily, collected his kit, and walked out onto the stage.

I’ll look forward to bringing you all the good news tomorrow.

Stop On By!

Dude! What the hell are these things doing on my chair?

The book has been released to the wild, but the official launch will be next Saturday, April 20, at Northshire Books in Manchester VT. If you’re in the region, stop by, hear a fun book talk, and have a great conversation with lots of interesting friends. It’s going to be a terrific time, and I look forward to seeing you there.

Spotted in the Wild

My good friend Patty McWilliams called me this afternoon with news that the first shipment of the new book had arrived at her shop, Hermit Hill Books of Poultney VT. Some time thereafter, she sent me the first glowing review:

Mitzi the shop cat looks adoringly at the new arrivals

Support your independent booksellers. Hermit Hill Books, 802-287-5757.

Terminal Emotions

We all have those pieces of writing that completely unlocked a new terrain for us. One of mine was Jean-Paul Sartre’s The Emotions: Outline of a Theory, from 1939. To unfairly collapse it all into a single line (and perhaps also unfairly to do it from the English translation rather than the French original), Sartre proposes that emotions are predictive, that they reflect our judgments of what we foresee. So anger, for instance, would be knowing what you want but identifying a person or circumstance that blocks you from getting it. Grief would be knowing that you’re fated to live without that person or place or pet in your life. Curiosity is separated from confusion only by whether you imagine that you will or will not be able to figure something out.

I’ve used this formulation to make sense of a lot of things over the twenty or so years since I first read it; it’s been a productive way of understanding the world. But I’m increasingly frustrated (itself an emotional descriptor of not knowing how to move forward productively) by a body of emotions that seem to me not to be future-referent at all, that seem inert, terminal. Emotions like outrage, or offense, seem not to offer a forward path at all. We drop one of those, and we’re just done.

Another core text for me is Z. D. Gurevitch’s “The Dialogic Connection and the Ethics of Dialogue,” an article published in The British Journal of Sociology in 1990. His notion of the ethical circumstance of dialogue includes three responsibilities: to speak, to listen, and to respond. That sounds trivial, but it isn’t, really. Dialogue is broken if someone refuses to speak. It’s broken if someone speaks and the other isn’t able to listen. And it’s broken if both people speak and listen, but there’s no real response, no sense in which thinking or behavior is affected by what’s been said and heard. And that, to me, is where being outraged or offended is itself a stance that violates the ethics of dialogue. Those stances aren’t aimed at change or growth (being not future-oriented, they couldn’t be); they offer no possibility, only closure. They’re expressions of hopelessness, in a way, the belief that there is no meaningful future of rapprochement.

Decades ago, when I was in catechism class, I remember learning that “it is a sin to offend, but it is also a sin to take offense.” The notion was that if God could offer grace to all of us fallen, it was our duty to extend that grace to those we encountered. I’ve long since left that faith, but have increased my conviction in that attitude. Grace may be the opposite of outrage: the expression of constant hope for the powers of dialogic healing. Grace has a future. Outrage has no need for one.

The Right Conversation at the Right Time

Illustration by Cathryn Virginia, for The New Yorker

Sometimes you swim upstream for ages. Nothing feels like it has a landing point, nothing feels like it has traction or makes progress. And other times, you catch a current, ride a tailwind, your own work amplified by the pace of events and interest.

The new book is falling into that second category, which suggests to me that the broader conversations about adjunct labor in higher ed are primed to spring forth. After two weeks of churn about the excerpt in the Chronicle, there’s an extended review of the book on today’s electronic home page of the New Yorker! Hua Hsu (himself a faculty member at Vassar) has done a marvelous job comparing my work with that of John Sexton, retired president of NYU, to show that what you see depends on where you stand when you look. It’s a terrific piece of writing, which helps bring this sequestered academic conversation into broader communities.

Dogends and Cold Leftovers

Nora and I were talking this morning, and she said, “I wonder if all of these women-in-STEM and minorities-in-STEM initiatives are just a sign that STEM is finished…”

One of the arguments I make in the new book, drawing from research in economic sociology, is that once a career path is fully opened to women, it’s suddenly compensated less well, less prestigious, and more highly regulated than it had been when it was mostly guys who did it.

For instance, Hadas Mandel’s research describes what she calls the “up the down staircase” phenomenon of “declining discrimination against women as individual workers, and rising discrimination against occupations after the entry of women.”[i] Or Josipa Roksa’s research that shows college grads entering male-dominated fields starting out at salaries far greater than those of college grads entering female-dominated fields.[ii] Or Anne Lincoln’s research on the “feminization” of veterinary practice, and the ways in which male students begin to avoid academic disciplines that become the site of increasing women’s participation.[iii] Or Levanon, England and Allison’s research showing more evidentiary support for devaluation of “feminized professions” than for exclusion of women from “masculine professions.”[iv]

The Bureau of Labor Statistics says that it’s deeply uncertain whether the nation is facing a “STEM Crisis.” Certainly there isn’t one in academic science; the BLS study shows that the average doctoral faculty member in STEM produces about four new PhDs over the course of her or his career, far above the required replacement rate for academic jobs. In non-academic fields, employers in the study showed little difficulty in finding bachelor’s-degree holders for entry STEM work, but a shortfall in students with advanced degrees. But those industry demands are fluid, changing with the price of oil and Federal infrastructure and defense policy, fluctuating far more rapidly than a long educational process can predict.

(This is true, of course, for all career types. Students are asked to guess, at age 18, both what they want to do and what careers will be in demand, five or six years into the future. If they could accurately do that as a group, they’d be way, way ahead of almost every Wall Street investment house or venture capital firm, which are wrong most of the time.)

Anyway, when college was only for the sons of the well-to-do, there wasn’t much oversight, and wasn’t much handwringing about choosing the right major; now that women and students of color are there in high proportion, it’s become more vocational and subject to far more oversight. When college faculty were mostly men, there were plenty of tenured faculty jobs; now that women have succeeded enormously in faculty work, the job availability has collapsed and contingency has boomed (and the majority of adjunct teachers are women). When college was a mostly male endeavor, state governments funded more than half of the costs of public universities; now that college enrollment is nearing 60% women, state funding has collapsed almost everywhere.

So here’s an economic rule of thumb: once something that had been restricted to elite participants is now made available to a broad community, the original participants have mostly sucked the nutrients out of it, leaving a depleted landscape. Nora’s comment over breakfast makes me think that STEM is no different—we’re going out of our way to invite women and minorities in, just at the moment when the possibilities are drying up and we merely need cheap workers.

[i] Mandel, Hadas. “Up the Down Staircase: Women’s Upward Mobility and the Wage Penalty for Occupational Feminization, 1970-2007.” Social Forces 91, no. 4 (June 2013): 1183-1207. See also the Crates and Ribbons blog post, “Patriarchy’s Magic Trick: How Anything Perceived As Women’s Work Immediately Sheds Its Value.”

[ii] Roksa, Josipa. “Double Disadvantage or Blessing in Disguise? Understanding the Relationship between College Major and Employment Sector.” Sociology of Education 78, no. 3 (July 2005): 207-232.

[iii] Lincoln, Anne E. “The Shifting Supply of Men and Women to Occupations: Feminization in Veterinary Education.” Social Forces 88, no. 5 (July 2010): 1969-1998.

[iv] Levanon, Asaf, Paula England, and Paul Allison. “Occupational Feminization and Pay: Assessing Causal Dynamics Using 1950-2000 U.S. Census Data.” Social Forces 88, no. 2 (December 2009): 865-891.

On the Fading Tail of the Technological Wave

Douglas Adams once wrote (in paraphrase) that any technology around when you were a kid seems to have always been there; technology developed when you were between 15 and 35 is new and amazing and you can probably make a career using it; and technology developed after you turned 35 is against the natural order of things. And an awful lot of taken-for-granted technology has been developed since I passed that threshold 25 years ago. Cell phones were barely around then, but smart phones are new and evil. Web 1.0, where specific people worked hard to get their voices out on the internet, was the norm, and Web 2.0, which lets any anonymous idiot spread conspiracy theories on comment boards, has ruined public discourse. And so on. You get the picture.

I say this today because I’ve been trying to set up a second website through my WordPress account, and the online instructions are full of simple things like “using your WordPress aggregate management software, connect through your FTP client and modify your wp-config.php file…”

And it made me think that the ability to write software code in the 21st century is like the ability to read in the 15th… most people didn’t have it, and those who did became powerful. The rest of us remained serfs.

I also got invited today to participate in a podcast, which is kind of like radio so I approve of it. I’ll be happy to do it. But we’re going to connect via a software package called Zoom. <sigh> I’m increasingly resistant to new apps, not because the apps themselves are hard to learn or hard to use, but because each is another account with another username and another password. Passwords used to be for getting into the boys’ treehouse; now I have to have a password to shop, or to bank, or to write, or to check my mail, or to talk with someone through my laptop microphone. And the software has dumb names. Snapchat, Wufoo, Zoom… what are we, six years old? Get off my lawn.

I’m writing all this (as well as all of my books and all of my client data analysis, everything I do professionally) on a 2012 MacBook Pro with OS X 10.10.5 Yosemite; I’ve gotten past all the predatory cat OS versions, but I’m still working my way through California landscapes. I do all my work with five-year-old versions of Microsoft Office products. There’s absolutely nothing I need to do that can’t be done on this machine. I don’t download movies or play video games, so I don’t need lots of processor speed or active memory; and I’ve still got 85% of my meager 500GB hard drive open. I back up all seven years of my data files every two weeks onto a flash drive, which takes about ten minutes. (Clouds are for shade, not data storage.)

I admire all of this technological change, though I’m really not interested in using much of it. But I recognize that the landscape is changing all around me, and every day brings another instance of the same question—do I participate, or do I ignore it and go on doing what I already know how to do?

Nora is writing a history of a 19th century family that saw their economic order disrupted over and over again, not because they did anything wrong, but because other political and technological changes from hundreds or thousands of miles away made their work obsolete. Spinning mills made Rebecca Morison’s home spinning less cost effective, and rendered Samuel Morison’s knowledge of building spinning wheels useless. Mulberry trees were planted in New England to grow silkworms in the 1830s… by the 1850s, French silk had flooded the US market. Northerners lost access to cotton cloth during the Civil War, so home spinning of wool and flax fiber became momentarily useful again, lost again permanently by the 1880s.

I grew up in a city that the first half of the 20th century built to strength, and the second half tore to ruins. My hometown of 20,000 people lost 12,000 jobs in fifteen years, a generation of men and women who never learned the password to the future. Industry left them behind, as surely as Wufoo and Apple Pay have ditched me.

The prophets of a brave new world
Captains of industry
Have visions grand and great designs
But none have room for me

They see a world where everyone
Is rich and smart and young
But if I live to see such things
Too late for me they come

from Todd Rundgren, “Honest Work”

It’s not just the tools that change, but the kinds of work that the tools enable. I blog instead of tweeting and instagramming, because I value a certain kind of writing, but I know that tweets and flash fiction are meaningful work. I love 19th century buildings because you can see the labor in every brick course and every carved cherub; 21st century buildings leave me cold at least in part because the labor is hidden away under the sleek, swooping skin. I like cash, because I like to count things, and because I like privacy; I don’t buy things online much because every time you press “Pay,” it weighs the same, and every trivial purchase is memorialized forever, likely to show up as a junk email in two weeks, offering some add-on purchase or a coupon luring me to a return visit.

We all find certain things to be native, other things that never fit. Our lives are all in negotiation, weighing then against now against next.

On Being Purposefully Undeclared

Yesterday, I had a few thoughts about the importance of being an “undeclared major” for the first year or more of college, thoughts prompted by ideas from Matt Reed on Inside Higher Ed. Today, I’ve got some thoughts about what students should be doing instead, prompted by ideas from a different Inside Higher Ed commentator, John Warner. John’s a strong writer, but more importantly, a generous and thoughtful critic of the whole higher ed enterprise, in which he’s been an adjunct faculty member for a long time. He exemplifies the idea of student-centered teaching.

Anyway, a couple of months ago, he asked the question: what if every class was a version of ‘music appreciation?’ That is, what if our fundamental aim was to help students understand a new way of thinking? To help them realize that they themselves can start to think that way? And then to help them imagine which ways of thinking are most exciting, most fundamental to their emerging sense of self?

John writes movingly of his high school music appreciation class, which he took for its easy grade, but which led him to understand why music mattered, led him to be able to clearly hear (and later, to perform, though that wasn’t the course’s goal) things that otherwise would have been muted. He then talks about how he uses that in his creative writing classes: “Most of the students in that kind of class already love stories and reading, so to set aside some time for me (or one of them) to read a story out loud in class and spend a few moments afterwards marveling about how it worked its spell on us is as natural as anything.” He proposes a similar course in humor appreciation:

I could share lots of things which would engender laughter, after which we could ask why exactly we were laughing, the same way my music appreciation teacher could ask why we were bopping along to the music at our desks. Students could also bring humor to me, expanding the palate of what we discussed beyond my own preoccupations. Later, they would attempt to write their own humorous pieces, bringing their understanding from observation, to self-generated theory, to execution.

What we typically do with the “core curriculum” is (as one student put it) give everyone the opportunity to do a number of things they’ll never, ever be forced to do again. They’ll have to learn how to calculate a derivative, and memorize the 206 bones of the body, remembering for a moment which was the radius and which was the ulna. They’ll take Intro to Psych so that they can have a new flash-card vocabulary to forget—behaviorism, cognition, Skinner and Piaget, somatic and autonomic. They’ll take US history so that they can chronologically order the world up through about twenty years before they were born.

Let’s imagine another way. Let’s imagine a year of evangelism, a year in which students are introduced in thoughtful ways to a series of obsessive adults. They would get to see in detail what historical thinking looks like, what mathematical thinking looks like, what musical thinking or storytellers’ thinking or scientists’ thinking looks like, and why those differences matter. They’d have carefully structured opportunities to practice those modes of thinking themselves, to discover which ones fit natively… which ones are excitingly new… which ones remain alien even with their best efforts. At the end of that year, they’d know themselves, which is the “core” of any informed choice of major.

I taught first year writing at Duke for four years, and all of us teachers got to design our own ways into the work of teaching academic writing. The very best version I ever concocted was about stages of life, looking at the ways that our culture keeps inventing stages of life that hadn’t previously existed, and what those inventions say about us. We talked a lot about how ill-defined adulthood was, the phase that one might think would stand at the center of it all. And for the last four weeks, I asked them to imagine something about their own pending adulthoods that they were looking forward to, or afraid of, and to discover what the research literature had to say about it.

Oh. My. God.

They learned to write an academic paper in a social science mode, for sure. But they learned so, so much more. They learned about their own struggles with body image, with ethnic identity, with money and class, with family heritage and the ways that legacy can be both gift and burden. They wrote anywhere between well and brilliantly, but far more importantly, they learned more about what they wanted from the world, and developed some paths to achieve those things.

So it’s not enough to say that the first year is undeclared. It’s not even accurate, if we do it right. The first year is an opportunity to make a declaration: a declaration of self, a declaration of purpose, a declaration of the community with whom we aspire to belong.