You don’t belong here.

I got an email from a friend last night. I will identify them as “they” and not locate them anywhere except within the lower 48 United States, but I will say that they are an adjunct faculty member at a couple of colleges. That’ll narrow it down to maybe a million people, so I think we’re safe.

They teach at one college that has returned to in-person instruction.

That college has started a vaccine administration system on its campus.

Adjuncts are not offered vaccination.

Sixty percent of that college’s teaching force are adjuncts.

I mean, just from a public health standpoint, this is brick stupid. Why would you want a substantial part of your herd to not have been immunized, when you’re in a close-contact community?

And here’s the deal. I don’t think this is mean-spirited. I don’t think anybody’s standing at the gate, gleefully cackling “No Pfizer for YOU, dearie!!!” I think this is just the kind of thing that happens when a class of people have become invisible. Have become non-people.

This is the kind of thing that happens when we crow about having “a sense of community,” and then forget to actually think about who’s part of our community.

Lots of folks get upset at the idea of structural racism or structural sexism. They prefer to see a world of individual competitors who succeed or fail on the fair playing field of life. They insist that, because THEY THEMSELVES aren’t biased, biases play and have played no role in their lives. They insist that, because THEY THEMSELVES aren’t biased, bias plays no role in the lives of others.

But my friend the adjunct has no way of competing for this vaccine on this campus, even as they have given years of service to this body of students. There is nothing that they could do, or could have ever done, to make themselves eligible for a benefit that others receive freely. There is absolutely nothing that is “their fault” about this decision made outside their control and without their consideration.

If a real estate agent shows people of color houses in only one part of town (knowing that they won’t “fit in” elsewhere), then that family will buy a house that won’t appreciate as rapidly, in a neighborhood that won’t be served by great schools, and won’t leave as much wealth for the kids to inherit. This used to be part of Federal housing and lending policy—now it’s cultural, a sense of who belongs and who doesn’t, cultural norms that add up over millions and millions of iterations into something stable and stubborn and enduring. Did those kids choose to be born six steps behind on the wealth stair? Of course not, just as George W. (“Daddy got me into Yale”) Bush or Mitt (“Daddy bought me a house in Boston while I was doing my MBA”) Romney didn’t choose to be born at the very pinnacle of that staircase. But it’s intellectually dishonest to not acknowledge those starting points, and the larger cumulative weight of history that they represent.

I was raised in a working class family, came from no economic privilege. But I can name you half a dozen times that I did some dumb thing as a teenager or young adult that, because I was white, got me a stern talking-to by a cop. If I’d been Black or Mexican American, each one of those would have been far more likely to have gotten me arrested. Or left me dead. It was bad enough being a long-haired hippie with a backpack in Amarillo, Texas in 1979… if I’d been a person of color, it would have been over.

We’re good at seeing individual cases and really, really bad at seeing patterns. And we’re absolutely terrible at seeing patterns we don’t want to see. So: was Eric Garner breaking the law by selling individual cigarettes on the sidewalk for a dollar? Sure, absolutely he was. Is my white neighbor breaking the law by having fifty leaking, dead cars scattered across his property, leaching motor oil and fuel and antifreeze and lead from the batteries into the soil above the river? Absolutely he is. So which one’s dead?

See, that’s what we mean by structural. It’s just another way of saying “patterns.” Patterns that maybe we should look at more closely.

When I accepted the challenge to write The Adjunct Underclass, I was clear from the start with my editor that I didn’t want to write another “combat narrative” of evil administrators and beleaguered teachers. I believed then, and believe even more strongly now, that what we’re seeing is an ecological collapse in which the species of college teachers is dying off. A combination of demographics and state funding and co-curricular services and educational technology and transfer credits and the broad cultural abandonment of workers have all contributed their tiny component to a structural discrimination in which someone who’s dedicated years of service to their students isn’t deemed a “real person” for purposes of public health.

But we see simple cause-and-effect more easily than we see systems. We see individual cases more easily than we see patterns. And we see, and accept, what is more easily than we imagine what might be instead.

When Is It Done?

The unseen labor of beauty

I subscribe to a daily message from The Creative Independent, a Kickstarter offshoot that conducts interviews with well-known, lesser-known, and unknown artists about their origins, their processes, their ways of working through frustrations or transitions. Today’s was an interview with the musician Sarah Beth Tomberlin. And, as often happens, there was one line that stopped me.

A song is finished when you connect to it and you don’t feel like you’re lying. 

That feels right. It feels like what I see when I look at the work of my friends who are woodworkers and paper artists. You can look at their work from every direction and see no loose ends, no decisions that could have been made but weren’t. Nothing “good enough,” just good.

When we make things, there are several levels of done-ness at which we could stop.

  • There’s “I don’t really care about this,” which might prevent us from starting it at all.
  • There’s “I can’t get this to work,” which may be momentary and may be terminal. The novelist Lee Martin says that his father’s gruff, farm advice was “Can’t never got nothin’ done.”
  • There’s “good, now it’s running,” which might be enough for a couple of days while we think about next steps or wait for the new parts to arrive.
  • There’s “that’s pretty nice… but I hope they don’t turn it over…”. It’s that moment where it’s really come together except for that one wonky thing that just never got resolved.
  • And then there’s “finished.” When you connect to it, and there’s no place where you feel like you’re lying.

The premise behind The Creative Independent, behind all those writers’ talks and MasterClass sessions, is that getting to “finished” is just really, really hard. Not just technically hard, but emotionally hard, because we live most of the time in some lower level of done-ness, straining to bring the beast along in parts toward “finished,” and then knowing that the completed parts don’t yet add up to an equally-completed whole. So we take encouragement and inspiration from the occasional glimpses we get of finished work, and we want to see how. That’s why we ask writers whether they drink coffee or have sworn off cigarettes, whether they write in the morning or at night, whether they think we should enroll in an MFA program or just sit down and go. We’re hoping to borrow some strategies for finding “finished,” not about the work but about our relationship to the work. About believing that “finished” is an attainable state.

I love finding things that are “finished,” and then learning how they got to be that way. A couple of weeks ago, I spent an hour and a half watching the musician Jacob Collier demonstrate the ways in which his arrangement of the song “Moon River” came about. He walked us through it, simultaneously sounding out chords on his desktop keyboard and showing us how he arranged the 4,719 different vocal tracks in his Mac Logic software. (That image at the top of today’s post is a screenshot of Collier’s Logic display of a different song; each of those little scraps of color is him singing a vocal part a few seconds long, the scraps then assembled into an auditory mosaic.)

Yesterday, I spent some time reading about the custom 1959 Cadillac that won last year’s Ridler Award for automotive creativity. There is absolutely nothing about this car that wasn’t re-invented, re-shaped, taken four paces past reasonable… to “finished.” Two years of work and two million dollars invested, to win a ten-thousand-dollar prize. “Finished” is often an unreasonable aspiration, which is why we can recognize it when we see it.

We also need to know—our own unique, personal assessment—when “finished” is unnecessary. We only have so many hours, and I’d prefer to get a few things “finished,” so I leave lots of others “good enough.” My wood stacking technique is meager, as is my snow shoveling. But the walkway is safe, if you’re paying attention… and the wood is under cover and drying, as long as I don’t bump into that one faulty tower at the northwest corner and bring it down. That work does what’s needed, and leaves me time to aspire toward “finished” in some other area.

These blog posts aren’t “finished.” I give them an hour or two, helping me think through what’s on my mind, and (I hope) offering some encouragement or a few interesting minutes to others. But my real writing… that’s the unreasonable labor of making sure that no matter which way it’s turned, it remains integrated and legible and beautiful. The blog, in a way, is the gym where I get stronger and learn new ideas before I bring them to the performance floor.

The status of “finished” is also a unique, personal assessment. It has no bearing on whether the work speaks to others. It is, as they say, necessary without being sufficient. It is a baseline threshold for releasing the work into the world as a creative person, but what others do with it is beyond our influence. I have no place in my heart for opera or classical European ballet, even as I can recognize that it’s fully finished, elegant, thoughtful work. I have a neighbor who’s a professional, academic, renowned, award-winning poet (it’s quite a little town we’ve got here), whose work doesn’t speak to me at all. Nora doesn’t like Jacob Collier’s music, though she can recognize its level of craft. The state motto of Vermont should be “Huh… I don’t know that I’d’a done it that way…” As Martha Graham once said, “What other people think about you is absolutely none of your business.”

All we can do is to find some areas of our lives that deserve unreasonable labor, and then to dedicate ourselves to taking some piece of that work to “finished.” We all owe ourselves that much. It is our very best self, made material.

Hostility Lit

Well, that’s ONE model, I guess…

I’m going to paraphrase here, just so you know.

My writers’ group was meeting on Sunday, discussing a truly wonderful story from one of our members. She’d started with this story fifteen years before, had built it into a novel that became something of a shambling beast, and wanted to go back to the story as a stand-alone and rediscover what had drawn her to write about this specific moment.

The story still carried some remnants of its novelization, like scraps of plaster stuck to the back of a painting’s frame when it’s taken from a long mounting on the wall. Most specifically, it ended with a narrator from some long distant future, enclosing this perfectly rendered instant within a more inert historical frame. We thought about how, if that retrospective conclusion were removed, the story might otherwise end; thought about different modes of denouement that would land the characters into a modified world.

And one of our group said that he wasn’t sure that a denouement was needed at all. “A novel has to make friends with you,” he said, more or less, “but a short story just has to run up, slap you in the face, and run away.”

Ummm… okay… and we want that why, exactly?

I mean, think about that metaphor. In what other mode would we want meaningless, random aggression that we’re left to figure out on our own? Isn’t that a definition of terrorism? Domestic violence? Do we want our stories to give us micro-dosed PTSD?

Larry David, the co-creator and head writer of Seinfeld, said famously that his two rules for the show were “no hugging, no learning.” And that’s just a sad, disappointing recipe for a life.

Years ago, the Chronicle of Higher Education ran a piece about young scholars’ transition to faculty life, an editorial essay called “That Guy.” The premise was that we’ve all run into jerks in our professional lives—the dissertation adviser who never returns papers, the self-important professor who reads the same lectures off the same crumbling, handwritten notes for decades, the senior scholar now allergic to any of the new thinking of his field—and that those modes of jerkishness can act as positive motivation for our own career. We can take, as part of our developmental task, to “not be that guy.”

I feel the same every time I run into another instance of our modern fetish for hostility lit. It simply convinces me, once again, that I don’t need to do my work that way. I can hold that up as an opposed magnetic force that pushes me toward my own aspirations.

Let me go back to a couple of things that I noted in my comments about the passing of Barry Lopez a couple of months ago.

  • In the Inuktitut language, the word for “storyteller” is isumatuq, which means “the person who creates the atmosphere in which the wisdom reveals itself.”
  • Storytellers are pattern makers. If our patterns are beautiful and full of grace, they will have the power to bring a person for whom the world has become chaotic and disorganized up from their knees and back to life.
  • If stories come to you, care for them. And learn to give them away where they are needed. Sometimes a person needs a story more than food to stay alive.

Back when I was doing academic research, I devised a framework for myself that I called “narrative research,” which was the idea that everything we learned about some person or culture or community could be seen as an expression of an important collective story, and that my job was to understand and to tell that story. Part of this came from my particular field of interest, which was ethnographic work among teenagers. I came to mistrust the “hard science” of developmental psychology, with its inevitable sequential Piagetian stages from sensorimotor through pre-operational to concrete operational to formal operational cognition. I was more drawn to Piaget’s contemporary (and competitor) Lev Vygotsky, who framed youth as an apprenticeship in the ways of life that adults wanted to teach, learning the stories that mattered.

Contemporary adolescence is best understood, for me, as a time of lost story. The story of childhood has been removed, but the story of adulthood is yet withheld. Teenagers don’t have a legitimate cultural story in our structures; they’re no longer A, but not yet B.

Life is filled with those moments of narrative gap, which we often call “crises.” We move from college to career. From free single to young parent. From a house with kids to a house without kids. From fertile to menopausal. From married to divorced. From married to widowed. From employed to retired. From mostly well to mostly infirm. In every case, there will be some liminal period in which we’re no longer A, but haven’t yet figured out how to be B.

Every single one of my stories, now that I look back at them, is a story of someone attempting to build a new B now that A is no longer available to them. I write passages from one nation to another, stories of exile and new home. It doesn’t matter whether the story is 1,500 words or 95,000 words; what matters is that someone in uncertainty, in a “life of quiet desperation,” is helped to find a new community and build a new self. What matters is that they can help a reader for whom the world has become chaotic and disorganized to rise from their knees and back to life.

Sometimes a person needs a story more than food to stay alive.

The Evasion of Death

Art is one attempt to evade death: make a monument to what is true and beautiful, and hope it will outlast you. Children are another attempt, perhaps more sensible than art. And then there’s sex (not unrelated to children), which offers its participants the ability to forget about death and, in moments of self-erasure, accept it, too. But orgasm lasts a few seconds, and children a few decades. People are still reading poems from thousands of years ago.

Much like Achilles, gaming out whether he wants a short life of glory or a long life of anonymity, someone preoccupied with their artistic reputation might well ask: glory now or later or perhaps never? Would you like to be much discussed but little read, or read and treasured by a few? Of course, none of these tradeoffs guarantee an outcome; most of us do not have a goddess mother to insure our fate. You might get everything—fame now, fame later—and you might get nothing at all. Writers know that talent is real, success is arbitrary, and that the relationship these two facts bear to each other is a mystery.

B. D. McClay, “‘Divorcing’ Is Literature That Looks Beyond Life,” The New Yorker, Feb 10 2021

I started this morning by reading that last line—Writers know that talent is real, success is arbitrary, and that the relationship these two facts bear to each other is a mystery—as the pull quote from the morning’s feed from The New Yorker. I liked it quite a lot, read it to Nora, and then went back to read it in the context of McClay’s essay on the one novel from the brief life of Susan Taubes.

I think that this pair of paragraphs above is a fuller expression of the fundamental idea of the essay. People think about creativity as a way to achieve immortality: that we make something that will live beyond us. But I think that’s the partial case of what McClay expresses more fully: that art is an attempt to evade death.

Immortality is one relatively unimaginative way of thinking about that, which is why an afterlife is so easy for people to grasp. Nora and I were at a funeral service a few years ago for a local young man who’d died of a drug overdose. I was downstairs helping to organize the post-service reception, but Nora was upstairs at the funeral proper. And she talked later about how much the idea of the afterlife would serve as a relief for most people: that you have a second chance, identifiably as yourself and surrounded by your family, but with all of your flaws removed. You will be the ideal self you always knew you could be if you weren’t surrounded by all of this earthly burden and temptation.

(Hell is unimaginative, too, whether it’s the lake of fire or the Sisyphean task that can never be fulfilled. You’re still identifiably the same person, but now placed into a perfectly refined soup of suffering.)

Science has given us contemporary and agnostic versions of immortality: the brain in the jar, the cryogenically frozen and restored self brought back once the re-animation technologies are complete. Or “the singularity,” in which we become machine enough to self-repair, to rewrite DNA or to have blood-cleansing nanobots, and our machines themselves are intelligent enough to be treated as a new culture with its own goals and norms. Again, not especially creative—we’re still recognizably us, just better. (And this has its own image of hell, too: when the machines take over and enslave us.)

A somewhat more subtle version of immortality is genetic lineage—that our children and their children and their children after that will live on, somehow representing us to history. Monarchies are based on this, the idea that the royal family endures beyond any of its individual members. It’s not as direct as the afterlife, because we as identifiable selves don’t get to experience it; it’s just that we assume that our successors will give us some credit for our kids’ traits and successes.

A more complicated version of immortality is reincarnation: the idea that, after death, our souls are redeployed into the world in a new bodily form, each time in order to experience a new learning of the world and of the self. The notion of karma isn’t so much about “what goes around comes around” as it is about the things we do in this life determining the lessons we need to learn in the next. We are absolutely NOT reincarnated as the same self the next time around; we might change genders, ethnicities, caste or class, even species. The only continuity is “the essence” or soul, not the fuller sense of our identity. This is more difficult for Westerners to accept, given our drive to individuality; the foundation of western metaphysics is that we’re special little snowflakes, each of us unique and irreplaceable. And, ideally, perpetual.

Let’s go back to the larger idea of evading death, though, the general case for which immortality is one unique instance. Another instance would be the creation of a thing or a system that goes beyond us. Henry Gantt has been dead for 101 years, but every architecture and engineering firm in the world has used a Gantt chart for project management. We don’t know much about Henry, most of us, but some scrap of his thinking is with us on a regular basis. So too with the Mona Lisa or the Maple Leaf Rag, the Pythagorean theorem or Pop-Tarts. The details of their inventors’ lives are mostly unknown to each of us, a sentence or two we might remember about them at most, but the inventions continue to offer applicability or bring delight well after the passing of the individuals behind them.

Cemeteries, archives, libraries, halls of fame… all of those are efforts to attach some ongoing recognition to the lives of those gone by. The drive to write The Great American Novel is in part an attempt to have this afterlife, to become one of the marble busts in the pantheon. We’re gone, but our thing… identifiably ours, with our name attached… remains.

But let’s go beyond that, to accept what McClay calls “moments of self-erasure.” That’s a remarkable form of evading death: to simply disappear, to have our selves not matter. The philosophers have talked about this in terms of “transcendence,” of being beyond earthly concerns. Viktor Frankl wrote that “The essentially self-transcendent quality of human existence renders man a being reaching out beyond himself.” Empathy is one form of this, that we can subsume our interests by trying to more fully understand the interests of another. Psychologists, following on the work of Mihaly Csikszentmihalyi, have used the term “flow” for peak experiences in which our sense of time and self are lost to full immersion in some external force. Mob behavior is another form, in which we temporarily surrender our will to the collective hive mind. And the libertines have taken orgasm to be their foundational model of “self-erasure,” drawing on the French term la petite mort, the little death. In all cases, the mode is the same: we have utterly and entirely lost concern for ourselves and our enduring existence. The fact of our unique selves, at that small moment, is irrelevant.

And that is the mode of evading death that I find most compelling about writing. When I’m in… when I’m really dug in, listening to and watching characters, trying to see clearly… I’m just not present at all. I have no experience of my self as self, no hunger or tiredness, no aspiration, no goal, no readers, no pride or shame. I’m gone. The story has replaced me.

Reading—good reading—does the same thing. Listening to music does the same thing. We leave one world, and enter another.

I have no control over, and no particular interest in, whether my work is read after my demise. That material immortality is of no concern to me. The paradox of creativity is that the fullest life is indistinguishable from non-life. We’re just gone.


One of those funny coincidences today. (ha.)

I’m at work on a handbook of academic assessment, and spending some of the day reading principles of effective feedback. The late educational assessment leader Grant Wiggins (“professional educational troublemaker of long standing”) claimed that effective feedback has a few common characteristics:

  • It refers to the goal that someone’s trying to achieve—it helps me do what I want
  • It’s tangible and transparent—I can see and understand it immediately
  • It’s actionable—I can do something with it
  • It’s user-friendly—I can understand its terms and principles
  • It’s timely—it comes when I need to try again
  • It’s ongoing—I get feedback toward progress, not just my current condition
  • It’s consistent—the language and principles I’m aimed at remain steady

This is all good stuff, exactly what any of us would hope to provide as teachers or as informal coaches.

And as I’m reading, this other turd drops into my in-box:

Thank you for querying <agency> about your book project. We have evaluated your query and regrettably, your project is not a right fit for our agency.

We must be highly selective about the new projects we pursue. Thank you again for thinking of us. Please be well.


There are so many times in our lives when the feedback is nothing but a binary: yes you did or no you didn’t. It isn’t feedback, really, in any meaningful way at all; it’s just a denied hall pass. It meets none of Wiggins’ seven principles. It is inert.

I actually had the following passage in the “book project” that was denied, about a different version of the information-free rejection:

Just then, Gwen’s phone pinged. She glanced at it, and laughed. “Well, I’m not going to MIT, anyway,” she said. “Look at this message line.” She handed me the phone with the e-mail notification from the MIT Office of Admissions: “Decision on your application.”

“I don’t even have to open that message to know I didn’t get in,” she said. “The ones who got in probably got a message that says something like ‘Welcome to MIT!’ This just has ‘sorry, too bad’ all over it.” She took the phone back, opened the message. “Yep. Sorry, too bad.” She stuck the phone back in her pocket. 

These are the messages we receive from those who have loads of slush to clear from the doorstep before their real work can get underway. And they already have plenty of real work. The rest of us are just a nuisance to be dispelled as efficiently as can be done.

And I’ll give this agency some credit: they sent a form letter. More than half, over the course of the years, have not. Those supplications have merely vanished. As Pepé Le Pew would say, Le Pouf.

(And can I be petty and quibble with their language? Sure I can… it’s not a book project. It’s a book. It’s not made out of Elmer’s glue and popsicle sticks. It’s not pine cones spray-painted gold and hot-glued onto a plywood ring to make a wreath. Come on… if you’re supposed to be some professional-user-of-words-kind-of-person, think about what your words imply. Try harder.)

I know they pretend kindness, but I think that the agentry industry ought to adopt a standardized form for rejections. I’ll even give them the draft:

Dear author:


Why not? (choose one)

  • Inept writing in sample
  • Inept writing/formatting in query
  • Interesting, but it’s not a topic/genre I can sell among my editor contacts
  • Interesting, but nobody knows who you are

See how easy? And now I can do something with the feedback! I can work on my craft; I can rebuild and proofread my query; I can do some more market research and find a more closely allied agent; or I can hang onto it and try to build the platform some more. If I get twenty of these rejections, I can do some data analysis, find patterns in the frequencies.
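That imagined tally really would be a few lines of work. A minimal sketch, assuming each rejection were logged under one of the four checkbox reasons above (the shorthand labels are my own invention):

```python
from collections import Counter

# Hypothetical log of checkbox rejections, one label per form letter,
# using shorthand for the four reasons on the imagined form
rejections = [
    "sample-craft", "query-craft", "wrong-market",
    "wrong-market", "no-platform", "wrong-market",
]

# Tally the frequencies to surface the pattern
counts = Counter(rejections)
for reason, n in counts.most_common():
    print(f"{reason}: {n}")
# prints "wrong-market: 3" first — time for more market research
```

With twenty such letters in hand, the most frequent reason tells you which of the four remedies to spend your next month on.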

Otherwise, as my working-class relatives in Michigan were fond of saying, it’s just a turd in the punchbowl.


Gift or Burden

Here’s my book… lemme know what you think… no, no rush…

Let’s recap the past two days. On Tuesday, I wrote about how impossible it is to know how good we are at any specific thing, and how elite performance, by definition, is unavailable to almost all of us. And then on Wednesday, I wrote about how difficult it is to either offer or receive feedback on the quality of our work.

So where does that leave us? (Aside from it being Thursday.)

Thing one. These little essays are written for my own website, but they’re automatically linked to also become posts on my LinkedIn account. And LinkedIn, because it’s built for busy professionals who can’t waste a minute (except on LinkedIn), conveniently labels all of my posts by estimated reading time. “6 min read,” yesterday’s was deemed. We know enough to provide the executive summary for our giant analytical documents, because the executives have to get their talking points about our crisis and move on to the next.

Using the algorithm that seems to drive the LinkedIn estimator, we’re looking at reading 200 words per minute. That’s pretty fast, business reading rather than the immersive, engaging reading that might slow you a little. (As Sven Birkerts once wrote, Everything here ultimately originates in the private self—that of the dreamy fellow with an open book in his lap. Dreamy fellows aren’t moving at 200wpm.) So let’s estimate that my fiction might be read at a more leisurely 150wpm. That means my last novel would ask 600 minutes of your attention—ten hours.
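The arithmetic behind that estimate is simple enough to sketch; the 90,000-word figure is just back-solved from the essay’s 600 minutes at 150 words per minute:

```python
def reading_time_hours(word_count: int, words_per_minute: int = 150) -> float:
    """Hours of attention a text asks of a reader at a given pace."""
    return word_count / words_per_minute / 60

# A 90,000-word novel at a leisurely, immersive 150 wpm:
print(reading_time_hours(90_000))        # 10.0 hours
# The same novel at LinkedIn's brisk 200 wpm:
print(reading_time_hours(90_000, 200))   # 7.5 hours
```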

That’s a big ask.

Go to a great museum, spend the day. If they open at ten and close at six, that’s eight hours. Six if you have lunch and spend time in the gift shop. And over those six hours, you’ll see hundreds and hundreds of pieces, and you’ll average way less than a minute apiece with them. You’ll review the wall of paintings, choose the two or three that slow you down, spend five minutes with each of those, and then walk into the next gallery.

Here’s a test. Get a kitchen timer and set it for five minutes. And look at one painting on your computer monitor for the full five minutes. It’s a LONG time, isn’t it?

And I’ve asked you for ten hours.

It’s just an unreasonable request, in our era of three-minute YouTube videos that now seem like an eternity in the face of ten-second TikToks. Who’s got time? I think this is behind our fetish for flash fiction: character-conflict-explosion-resolution-NEXT! Give me 250 words and be done with it. We’re amending our unreasonable art form to meet the demands of our tweeted, soundbite age.

This weekend, I’ll be writing in preparation for a ten-minute play event put on by our local theater company. Can you do Death of a Salesman in ten minutes? Hamilton? The Ferryman? No. You put two or three people into one scene and that’s all you get. And that limits the kinds of themes you can write about. The producer of the Netflix adaptation of The Queen’s Gambit said that he was interested in writing a seven-episode miniseries because if it had been a two-hour movie, the only question in people’s minds would have been “does she beat the Russian guy?” It would have been a sports movie, Rocky with fewer muscles. But in six hours, they had time to explore her obsessions, and the obsessive communities who take her in.

In ten minutes, you’d have five material exchanges in the middle of one chess match, and a little dialogue over the top of it.

Thing two. When Nora and I make dinner for friends, we usually spend most of the afternoon prepping, for an evening that lasts three or four hours. The ratio of creation to enjoyment is about one to one, and Nora and I get to participate directly in that enjoyment. It’s a rapid, balanced payback.

Books aren’t like that. That book that I’m asking a reader to spend ten hours with cost me thousands. And the response from that reader back to me, if there is one, will be consumed in seconds. Thousands of hours out, seconds back.

We absolutely have to write for ourselves, if it’s going to be worth our time doing at all. The payback ratio makes absolutely no sense otherwise. The work itself has to be gratifying; that’s the payback that matters, and that balances itself out. The thousands of hours of creation must each be its own reward.

So let’s think about today’s vocabulary word: asymptote. An asymptote is a value that a mathematical function endlessly approaches but never quite reaches. Picture a curve sliding down toward the x-axis: it never touches the asymptote of zero, but gets endlessly and infinitely closer to it.
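If you’d rather poke at the idea than picture it, here’s a tiny Python sketch (purely illustrative) of a function closing in on zero without ever arriving:

```python
# f(x) = 1/x has an asymptote at zero: the values shrink
# toward zero as x grows, but never actually reach it.
for x in (1, 10, 100, 10_000, 1_000_000):
    print(f"f({x}) = {1 / x}")
```

Every value printed is positive, each smaller than the last, and none of them is zero.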

So how close can we get to zero feedback before the work no longer makes emotional sense? Or conversely, if we endlessly approach zero feedback, why not just accept zero, write the thing and put it away to start another without ever giving it away at all? If we write for ourselves, let’s just leave it there and call it good.

Thing three. Of course, we can’t do that. Each entry in this blog is an engaging writing task, and teaches me a little more about what and how I think. Even these little essays could be written and filed away in a folder on my hard drive. But I don’t do that. I format them, post them to my website, and cross-post them to LinkedIn. I must believe that the work has some external benefit. And if faith is belief in the absence of evidence, then my writing is as close to an act of faith as I’ve come in decades.

Unsolicited Advice

Don’t DO that!

One of the maxims for good living that was drilled into me in my childhood was the Midwestern guide to community cohesion: Unsolicited advice is never welcome.

We pretend kindness, and deliver cruelty. We just know that if only our friend did this one thing, their life would be so much better. But no. If you can’t say something nice…

One of the hard parts of this directive, though, is defining “unsolicited.” And we all define that very differently. I gave Nora a handbag once, one that she’d admired at a show. She was delighted with it (as was I), and she showed it to a close friend, gushing about how much she loved it. “I’d like it a lot better without the embroidery,” that friend said. Under what conditions did that friend feel warranted in offering that critique?

“Don’t you want me to be honest?” we’ll hear people say. And if the answer is that you’re honestly going to be mean-spirited about something, then no. No, I don’t.

Yelp, for instance, is nothing more than unsolicited advice. Sure, I mean, Yelp solicits it, but that’s not the same as the hapless restaurant or nail salon or community college that bears the brunt of it. (I mean, the definition of the word yelp is “a short, sharp cry, especially one of pain or alarm.” The very name of the app evokes a beaten dog.) The outcome of Web 2.0 is nothing but unsolicited advice, which sometimes can be unbelievably abusive. (At least with Web 3.0, there won’t be any people: it’ll just be data points from your fridge and your thermostat and your car, all condensing into an info fog that we won’t need to actively engage at all. Our machines will talk about us behind our backs.)

When you get a gift, you know enough to not say, “Golly, this isn’t made very well at all.” When someone makes you dinner, you don’t say, “I don’t know what you did to this salmon, but you should never do it again.” When someone shows you their grandchildren’s photos, you don’t say, “What’s wrong with that one?” You may well say those things later, to your spouse, but even then… why? To what end?

Sometimes, though, we actively do seek out critique. And even that’s a hard task.

I have a friend in my writing group who recently finished his creative writing MFA. He talks about a common and unfortunate habit in his program that they called “fan fictioning”: rather than address what the writer is trying to do with their story, the reviewer makes it into their own. “Maybe Dan should just be living with his father and not the whole family, that would show more about how isolated he is. And maybe instead of bowling, Dan’s dad could be interested in, like, modern dance, so that we’d really see how much of an outlier his obsessions make him.” And so on. It has become the critic’s story, no longer the writer’s, and the critique is no longer helpful.

All of our words for criticism (including “criticism”) are dangerous. In architecture school, we had “critiques” or just “crits,” which everybody knew were going to be aggressive and hostile. In writing programs, we have “workshop,” which just invites every participant to grab their tools and fuck around with your story while you sit helplessly with your microphone muted. We place our work before a “jury” or a “panel,” which isn’t going to go well; the best we can hope for is to be acquitted or paroled. Our work might be “reviewed blind,” which sometimes feels more apt than might have been intended. Sometimes we’re allowed to “defend” our work, which means it’s come under attack.

No matter what form criticism or advice takes, the fundamental fact is that we’ve engaged in an action that we find meaningful… we’ve done it to the best of our ability… and now we put the results on the table for the judgment of others. How can that be anything but emotionally fraught? How can we want anything other than someone to say “well done”?

The time for advice is during the making, not after. When you’re laying out a piece of work and say, “I’m not sure what I should do here…” When you’ve made a cocktail that’s pretty good but not quite, and you say, “What do you think this needs?”

I taught an informal fiction course last summer, and I gave feedback on student work every week. You just have to. If you let people go the whole eight weeks, finish a story and send it to you, then you’re left with only two possibilities: “Good job!” or “ehh…” But if the critique comes steadily during the building, then the resulting project is going to be as good as that person can make it during that time, and you can talk honestly and happily about the ways that it’s grown. If we just get the finished thing dumped on the desk, we’re faced with something more like a binary yes/no judgment, a much higher-stakes response.

I’ve written quite a few novels—nine and a half, to be exact—but have shown most of them to relatively few people. There really is something fragile about the early stages of a book, when even we don’t know yet what it’ll be; the imposition of a second sensibility could enlarge it or derail it, and we know it isn’t properly made yet, so we keep it under wraps. And that just makes the unveiling even more fraught. I’ve spent thousands and thousands of hours to make this story exactly what I want it to be. And it probably won’t be exactly what you want it to be. So where does that leave us?

For my most recent novel, I had thirty copies printed in a pleasing design; printing has gotten cheap enough, and page layout software good enough, that I could have them done for five or six bucks apiece, inexpensive enough to give to friends as something nicer than a Word file. I gave about twenty of them away three months ago. And I’ve heard back from four readers.

But really, what do I want to hear? I want to hear that it was astonishing, that they hadn’t read another novel that good in years. I mean, let’s be honest, we don’t do this work unless we care enormously about it. And we don’t give it to others unless we want them to care about it, too. We intend it as a gift, sort of… but it’s also a burden, because we’re soliciting their good review.

The notoriously conservative football coach (and enormously creative, hostile critic) Woody Hayes once said, “There’s only three things that can happen when you throw the ball, and two of ’em are bad.” Well, there’s only three things that can happen when you give somebody your book, too, and two of ’em are bad. They can love it; that’s good. They can ignore it in the swarm and swirl of life; that’s bad. Or they can tell you they didn’t like it; that’s bad, too.

More tomorrow.


How many standard deviations from the median am I?
(from the course on quantitative analysis)

I think about this a lot.

Thirty-five or so years ago, I was reading one of the annual editions of the Bill James Baseball Abstracts. Bill James was one of the first (and best) statisticians working in baseball. For instance, he analyzed the ways that the specifics of ballparks worked for and against the pitchers on their teams—for example, the Oakland Coliseum had a huge foul territory that was still in play, not in the seats. He calculated the number of free outs a pitcher would get over a set number of innings from popups that infielders could catch that would have been in the stands elsewhere. Things like that. Just a really smart, obsessive, geeky guy. And a funny and opinionated writer, too.

Anyway, I remember one excerpt of an essay—I’ll butcher it in paraphrase—in which he talked about guys in the stands at ballgames watching a shortstop misplay a ball, and saying “Man, I coulda had that.” James, in response, wrote this wonderfully scathing little piece about how nobody in the stands at a baseball game has any real way of knowing just how good those guys are on the field. And he used statistical analysis to explain it.

There were 26 teams in the major leagues in the 1980s, and each team had a season-long roster of 25 players. That’s a total of 650 major league baseball players, out of (at that time) 80 million adult American men, plus another 50 million or so throughout Latin America. So 650 players out of a hundred thirty million in the eligible pool means that those guys—even a season-long benchwarmer with the woeful Pittsburgh Pirates (who won 57 games and lost 104)—were among the top 0.0005% of baseball players in the eligible community. That is to say, there’s one major leaguer among every two hundred thousand of us. You can be awfully, awfully good… and yet not nearly good enough. One in a million is nearly literal.
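James’s arithmetic is easy to check; here’s the back-of-envelope version in Python, using the essay’s rough numbers (not independently verified):

```python
players = 26 * 25    # 26 teams, 25-man season-long rosters
pool = 130_000_000   # adult men in the US plus Latin America, roughly

print(players)                               # 650
print(f"top {players / pool:.4%}")           # top 0.0005%
print(f"one in every {pool // players:,}")   # one in every 200,000
```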

The shape of a normal population distribution takes the form of a Gaussian curve, with the median value at the center and symmetrical fall-off to both sides. One’s position within that distribution can be described by a Z-score, which is simply the number of standard deviations we sit below or above the norm. Major league baseball players represent those men at roughly Z > 4.4, more than four standard deviations above the norm, a community of stunning outliers.
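If we pretend for a moment that baseball ability really were normally distributed (a convenient fiction), the standard library can recover the Z-score that corresponds to that 1-in-200,000 rarity:

```python
from statistics import NormalDist

tail = 650 / 130_000_000            # the fraction of the pool that reaches the majors
z = NormalDist().inv_cdf(1 - tail)  # the Z-score whose upper tail holds that fraction
print(f"Z = {z:.2f}")               # a bit over four standard deviations above the mean
```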

Baseball is maybe more objective than writing, but still, as writers, we’re working to enter a community of stunning outliers. It’s estimated that about 50,000 novels are commercially published in the US each year. Literary agent Miriam Altschuler claims that 70% of those sell 2,000 copies or fewer, which means you’ll have never heard of them (and she’ll be broke if she tries to represent them). So all of us writers are applying to enter a tiny community, most of whose seats are already held by tenure. You’re not going to displace Margaret Atwood or Stephen King on any publisher’s roster. (I hope that’s not news to you…)

When we’re sitting in the stands, reading some shabby novel, we say things like, “I could do that.” But really… could we? How would we know? Who would tell us our Z-score?

Let’s think of it as a series of filters.

  • We start with the 200 million American adults, and knock it down to the ten percent who read the most fiction. That’s twenty million.
  • Now let’s take the ten percent of that group who imagine that we also could write professional-quality fiction. That’s two million.
  • Now let’s take the ten percent of THAT group who actually have the time and the commitment to produce a full-length manuscript. That’s two hundred thousand.
  • Ten percent of that is 20,000, which is way more than the number of available publishable slots after you subtract all the Atwoods and Kings and Grishams and Oateses. Probably another ten percent reduction to two thousand is closer to it for all of us wannabe “debut authors.”
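The whole chain above is just repeated division by ten, as a toy sketch makes plain (the stage labels and the ten-percent guesses are the essay’s, not data):

```python
pool = 200_000_000  # American adults
stages = [
    "read the most fiction",
    "imagine they could write professional-quality fiction",
    "have the time and commitment to finish a manuscript",
    "write at a publishable level",
    "land one of the remaining 'debut author' slots",
]
for stage in stages:
    pool //= 10  # each filter keeps roughly ten percent
    print(f"{pool:>12,}  {stage}")  # 20,000,000 down to 2,000
```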

The Z-score for a new novelist is almost as substantial as for a major league baseball player. And yet, we write.

More tomorrow.