In an Anti-Human World, Can a Pandemic Get Us to Reflect on Our Humanness?

American culture expects humans to operate like machines, not like humans, and for the most part we comply.

Photo Source: Canva

There’s a hydraulic model of San Francisco Bay in Sausalito. It’s the size of two football fields, housed inside a building where the U.S. Army Corps of Engineers used to simulate conditions in the bay to test the effects of building dams, oil spills, storms and shoreline activity. Inside the small theater at the model’s entrance, I watched a video about the bay’s history. We’re in the land of the Gold Rush, in the mid-1800s. Old photos showed miners wielding high-pressure hoses, blasting the tops off mountains, the rock and dirt washing into the rivers that fed the bay. So much sediment that it nearly filled in the bay, to a depth of only 18 feet. This bay isn’t a pond. It’s big!

Suddenly, 150 years of American progress snapped into sharp relief. “We’re like locusts!” I thought, with a shiver. In the lifespan of four generations, we’ve built transcontinental railroads, interstate highways and airports, strung electric and telephone wires across the continent, run cables under oceans, dammed rivers and diverted water to build mega-cities like Los Angeles.

Any non-human animal or plant species we saw ripping through the environment at this speed, we’d exterminate.

But we’ve not put any brakes on this behavior. Why not? Because it looks like progress, like a good thing. In many ways, it is. It’s certainly made contemporary life in developed countries convenient and comfortable. I text and talk from my cellphone, take and send pictures, any time, to anyone, anywhere in the world. Planes and ships can carry me almost anywhere on earth, and rockets can rocket me to the moon. I’m not hot or cold unless I choose to be. And if I get sick, chances are good that medicine or surgery can cure me. Modern medicine has extended our life expectancy by decades. But we’re so immersed in this view of progress that we don’t recognize it has put us in chains psychologically and emotionally.

It’s locked us into a machine mentality and culture that’s anti-human.

Many years ago, in a workshop for evaluators, someone working for an international development agency talked about the difficulty they were having introducing dairy farming to island people. The organization gave the islanders cows they were supposed to feed and milk. Instead, they killed and ate the cattle or let them run loose. Eventually the development workers asked the people why they weren’t accepting the progress being offered them. The answer: they enjoyed their fishing and foraging life, their community, and their leisure the rest of the time. They didn’t want to be driven and locked into a required routine of milking, feeding and maintaining another species, becoming dependent on that artificial system.

In our highly developed, routinized society, we hardly even notice what we’ve given up, the trade-offs we’ve made

it’s been so long since we made them, and we spend so much of our lives maintaining them. We’re bound by timelines and timeframes and milestones and outcomes. We expect to see an end product, something that’s new, however slightly, whether needed or not. We compete with each other; it’s baked into how we live. Who owns what, where they live, what they drive, who they associate with, what kind of work they do, and where they are in the hierarchy.

I’ve worked in higher education for decades, where the academy is supposed to be a collegial community of scholars. It’s not. A professor once told me the fights are so bitter in the academy because there’s so little to win. Even with fewer resources to fight over than in large companies, the competitive ethos turns into a battle for status, prestige, and control over minute academic and administrative decisions. In the last couple of decades, as large amounts of money have flowed into STEM disciplines, the divisiveness between disciplines, and between faculty, administration and governing boards, has deepened, with collegiality and community as casualties.

This competitive push for progress is a system of control and domination, of others and of the natural world.

To keep it intact and keep everyone marching along that line, culturally we devalue any human needs and traits that don’t fit this worldview. If the islanders don’t want to milk cows, they’re lazy or backward, we think. If we don’t want to rush from one meeting to the next, from one project to another, we’re labeled undependable or undisciplined. If we’re not energized by the frenzy and franticness, we must be isolates or introverts—said derisively. And we may well be those things, since reflective people are the ones who notice their lack of attunement with this cultural way of behaving and experience its abrasiveness.

By denying those parts of our human nature that don’t fit, we’re not only destroying ourselves, we’re destroying other species, and our natural environment, our home—all that’s essential—that nurtures us and needs nurturing.

But nurturing itself is devalued.

It’s as if we shouldn’t even need nurturing, at least not once we pass our early teenage years. Some of the crustiest among us are likely those who need nurturing the most, but aren’t receiving it. They may be the most sensitive among us, covering it with rancor to protect themselves in a world made hard and harsh by the demanding, controlling worldview in which we live.

A highly trained artist told me over lunch that she paints what’s inside her, and often it comes quickly and easily. But that feels wrong somehow, as if it’s not valuable if it comes too easily. This is another way the rapaciousness our Western culture cultivates bleeds over from the physical to the psychological and emotional. We can come to suspect our own talents, undermine our own contributions to the world. Unless it’s a struggle, unless it takes a lot of time and a lot of effort, perhaps even suffering, how can it be worthwhile? And why should we be paid for it?

Culturally, we seem to think that about the arts and human services professions. What do they produce that’s valuable to society and deserving of remuneration? They are their own rewards—in the case of the arts, self-indulgence and emotional or intellectual expression. In the case of human services, a sense of helping and doing good for others. If people want to follow their aesthetic or compassionate nature, American society punishes most of them financially.

But beauty, and joy, and compassion and empathy are all deep human needs and qualities that make us kinder, gentler individuals and make a society and a culture softer, richer, deeper, and more humane.

As a society, we cannot see and value those qualities and needs when rapaciousness runs amok. The positive parts of our culture of progress, the parts that have made us more physically comfortable, are like a curtain, blinding us to what lies behind it. Or if we do see that side, we’ve been willing to accept the trade-offs. But how do we know when we’ve traded too much?

© Christina Leimer 2020  Contact for reprinting permission.

Women’s Rules

A female friend told me, with astonished curiosity: “I’ve never known a woman who’s so unaffected by the rules for women. You don’t even seem to know they exist.”

“What?! There are rules?” I grinned and rolled my eyes, waving away the observation. But it did make me wonder. Two decades later, after bumping headlong into a few of those rules, I think she was right. Somehow, I never got the rule book. Or I just never read it.

Growing up on a farm, the oldest child in the family, I was driving a tractor when I was so young I could barely see over the steering wheel. Chopping wood, feeding chickens, pushing the lawnmower, hauling hay—girls did any chores they were physically capable of doing. It was an all-hands-on-deck environment: lots of work to be done, few people to do it. When I wasn’t in school or working, I played in the woods and fields, climbed trees, hunted for arrowheads, and fished and swam in the pond, by myself or with neighbor kids, until I got hungry and went home. In my first 10 years, agency and independence were solidly ingrained.

My favorite reading, in elementary school, was biographies. Mom would take me to the public library and I would bring home a stack of books, lie on the floor and soak in the lives of people doing extraordinary things, some overcoming extreme obstacles. All kinds of people. Colors. Genders. Disabilities. The message I got was: people do incredible things. Then, as I entered my second decade, along came the Mary Tyler Moore Show, Maude, and Helen Reddy singing “I Am Woman.”

These shows were groundbreaking for featuring independent women, and the song became the anthem of the 1960s and 70s feminist movement. But I didn’t know that then. Without historical or political context, the sitcom women and the music reinforced the conclusion I’d reached from the biographies: women, like other people, can do professional work, live alone and say what they think.

In this context, I took my few childhood gender-biased battles as isolated individuals’ problems, not global expectations. In the fourth grade, there were enough boys in my class to field a softball team, but most of them were not athletic. I was the only athletic girl. So I wanted to play on the boys’ softball team, arguing it was the only way our school could possibly win. When my female teacher refused to let me, I took the field anyway on game day.

She chased me inside the school building, hitting me in the back with her high-heeled shoe. Then, when the St. Louis Cardinals made the World Series, she brought a TV to school and let all the boys out of class to watch the games. Many of the boys didn’t care about sports. I did, but she wouldn’t let me watch. So I folded my arms at my desk and refused to do any school work. I didn’t know she was enforcing society’s gender rules. I just thought she was an ignorant, mean person who liked the boys better than me.

As an adult, moving to other parts of the U.S. and entering the professional world, I began to experience some jarring moments. As often happens when there’s something I need to understand, the women’s-rules question from years earlier reappeared. But where do I find these rules, I wondered. The answer: look to the broader culture. That’s where you’ll find the rule book. So, tuning my radar to magazines, clothing, politics and my own experiences in different settings, I found some of the rules and realized some were finding me. Here’s what I learned, and how I measure up.

Based on women’s magazines, I’m supposed to be interested in:

✔ Health            Not
✔ Beauty            Not
✔ Relationships     Only the one I’m in.
✔ Food              Love to eat. Not read about it.
✔ Fashion           Some
✔ Celebrity         Not

Of all the things there are to learn and think about, why would these topics pervade women’s reading material? I typically see these magazines in beauty shops and grocery store checkout lines. My mother owned a beauty shop, so these magazines must have been around our house when I was growing up. Yet I was so unconcerned with these parts of life that it took a graduate course in gender stratification to realize these topics reflect women’s traditional domain—home and family. And celebrity and gossip reflect women’s interest in people’s personal lives. Still? It’s 2016. I am an alien.

Based on clothing for adult females, I’m supposed to be:
Size  4-16

or maybe up to size 18 depending on the store or brand, or down to size 12W. Does the W mean woman, or wide? Based on an internet search where others asked this question—we’re not sure. Some say woman. But that makes no sense, since men’s and women’s clothes typically aren’t on the same rack or even in the same section of the store. Anyway, all sizes above this range are considered “plus.” I’m a plus woman. The plus size clothing is usually separate from the regular sizes, often on another floor. In one store, it was in the basement, next to the carpets and furniture. Can’t ignore the symbolism of that placement.

I’m inclined to take this “plus” moniker as meaning I’m a super woman, or more than a woman. But that’s not the feel when the salesperson silently points me to the basement. Why? The reality is there’s tremendous variation in women’s bodies. So why accept such a limited range? Clothes sizing and labeling, too, are all over the map. Some adult females’ sizes are labeled Junior or Misses. Does that make these females not yet women? Perhaps many of us are not regular women.

Based on women’s voices, I’m supposed to:
Speak softly, obliquely, in a high pitch.

But I’m loud, direct and tenor, or, to use the gendered categories of classical music, contralto. Wikipedia says a contralto female voice is rare, maybe one percent of the female population. It’s in about the same range as the male countertenor. So, the reality is, some women’s and men’s voices overlap; they’re indistinguishable. Why, then, must we assume people’s gender based on their voice? On the phone, people who don’t know me call me sir. I’ve been hung up on, told to go into a bank to get my business taken care of, and transferred to the fraud unit. Often, even after I’ve corrected the person’s mistaken gender attribution, if the conversation continues long enough, they revert to calling me sir. What’s up with that? Why does anyone even need to say yes sir, no sir, or thank you sir? Why aren’t yes, no, and thank you sufficient? Why do we need to genderize such speech?

Direct. If I were running for political office and asked about my hair, makeup, clothing, or spouse’s role, I’d respond Bernie Sanders style: “Do you have a serious question to ask?” Having received feedback, a.k.a. negative reinforcement, I’ve tried to be less direct, but then I find it annoys some men, makes me lose track of the point, allows others to miss the point, and turns my stomach at my own rambling and muddledness.

Loud. I enjoy wit and humor and laugh full-bodied when something strikes me as funny. When it’s just amusing, I only smile. When I’m impassioned, when I’m talking about something I strongly care about, my voice and bearing rise. Combine that with my plus size and contralto, and it equals, for some people—intimidating. Throttle back, was the message from a male supervisor. I’m to notice the space I take up, and take up less of it. “You could do something about your size,” was the advice of a petite female supervisor, conveyed with an undertone of, if you were only willing to. Hmm, then I guess I should get bigger.

At work, I’m supposed to:
Bring the snacks, take the notes, and fetch anyone who hasn’t shown up at the conference room on time. I don’t. I’m supposed to allow men to restate my ideas as their own and not call them on it. Sometimes, I do, just to get along.

But snacks? We’re all capable of going shopping. No one has to make food from scratch anymore, so why would this task still fall to women? Taking notes? Often, I would prefer to take notes, because I like them short and concise, and crafting the record of the meeting confers some power—but that’s not the way it’s perceived. If men take notes, they get kudos for their egalitarianism; women get perceptually demoted. Fetching someone who’s late? Start without them, and don’t recap. Next time, they’ll arrive before the starting gun. It’s risky, as a woman, to set such boundaries and hold to professional expectations, especially when the colleagues who are late are men. Women are supposed to coddle. If we don’t, you know what we’re called.

Now that I’ve read the women’s rule book, it’s clear how misaligned my interests, thinking and behavior are with the expectations for American women. But why, I wondered, was there such a contradiction between my upbringing and societal expectations? At least a partial answer came, serendipitously, when my mother bought a new car. She called me, excited about her red Camaro. I returned the excitement, went to see it and went for a ride. Then she repeated her story the following week, and the next, until finally I said, with a bit more exasperation than was tolerable on the other end of the phone, “I know, Mom, you told me! I saw it.”

She exploded. “You don’t understand!”

“Understand what?”

“This is the first car I’ve bought on MY own!! In MY own name!! With MY own money!!”

I was shocked. My strong-willed, business-owning mother, who was then nearing 50 years old, had always driven a car (my family had at least two) and earned money. I had no idea this all existed only because my dad allowed it. That U.S. financial institutions at the time would lend money to a woman only if a man signed for it.

From my child’s vantage point, my mom was strong and independent, just like my dad. Neither of them ever told me I couldn’t do things because I was a girl. My dad taught me to drive the tractor, and before I started school, I was his constant companion. My mom always told my brother, sister and me, “Be what you are, no matter what people think.”

This incident was such a revelation I started peeling back the surface. When I did, I saw a different world, one with a set of perspectives and experiences I perhaps should have known by the time I was an adult, but didn’t. Neither of my parents was politically active. I never heard them use the word “feminist,” yet that revolution was going on all around. While even today women’s parity is still far short of the goal, the impact of the 1960s and 70s women’s movement was so great that by the time I was living on my own, I could take for granted credit cards and car loans. I just got them when I needed them and could afford it. The fact that, had I been born 15 years earlier, my father, brother or a husband would have had to sign for my financial needs, makes me shudder.

Throughout my life, I’ve not identified much as female. I never doubted I was a girl. My body has the requisite feminine parts. But I thought of myself as human, or androgynous, and reveled in all the variety and contrasts that entails. Even now, if I pictured my identity as a four-slice pie, woman might be one slice, maybe a slice and a half. Yet, regardless of how I think and feel about my identity, the world identifies me as a woman, and therefore subject to women’s rules.

It’s this forced identity that chafes me. Women’s interests, women’s clothes, women’s voices, work expectations: what our culture shows us about women is so narrow. Even when your exterior is granite, frequent feedback works like acid rain: over time, either we chip away parts of ourselves or others do it for us. Some of us elbow the box, squirming and trying for more room, finding our niche within these broad social constraints. But what do we, and the world, lose in the process?

What we see and experience is how we learn. It gives us our framework for interpreting meaning, making sense of the world and our lives. In the world of my dreams, we nourish the diversity that’s in each of us. The complexity that is human. We give our daughters, and our sons, more ways of being in the world, more opportunities to express their innate talent, vision and ideas. And we celebrate and display that richer picture. If that happens, there will be different lessons. Instead of women’s rules and men’s rules, perhaps they, and we, will learn the rules for being human.

The Human Cost of Taking Care

Every time there is a mass shooting, like the one at the social services center in San Bernardino, California, politicians claim more mental health services are needed. But the community mental health system has been deteriorating for years, despite increasing need, and Congress is aware of it. For several years, I worked in community mental health, until the pressure of its collapse drove me out. This is what it’s like, and why we must fix it.

Every morning there was about a half hour of quiet before the vans started arriving, dropping off residents of group homes and halfway houses for their appointments at the community mental health clinic on the edge of downtown in a major U.S. city. Since this was their only form of transportation, patients spent most of their day at the clinic waiting for the late afternoon vans to return and pick them up. Others came by city bus, some by private car. Most came alone. Some came with a friend, family member or social worker. They were all people who had nowhere else to turn for help.

They waited in one large room for hours to see us for 30 minutes. A few waited uncomfortably among people they never expected to encounter and in a setting they never knew existed before they lost their job or insurance. With caseloads of 100 or more, we saw each person once every three months, unless there was a crisis. If there was, we got them in to see the doctor. But with such limited time and treatment options, there was no guarantee it would help.

Scenarios like these are common in community mental health.

Scenario 1: John’s mother called; she was scared of him. He broke a window and threw a knife at her. He thought airplanes were shooting bombs out of the air-conditioning vents in their house. He was off his medication. It was a crisis, so we scheduled an emergency appointment. His brother brought him to the clinic by car. John ran into the bathroom, ran out and smeared feces on the hall wall. He cursed and flailed.

People waiting in the lobby were watching, wary. We knew we had to physically restrain him, and get the doc to give him an injection that would calm him. We cleared the lobby, moving everyone outside, until we could isolate him. No hospital bed would be available for at least a week. So John would be returning home.

Scenario 2: A colleague was doing paperwork with a patient in my office, multitasking to keep up the pace, when the woman started crying. She put the papers aside to talk to her, to look at her, to listen. But that took extra time. The doctor, who had just finished with his last patient and was expecting another, came looking for his next patient. He opened the door to the clinician’s office. The patient immediately pulled herself together and they went back to the paperwork, wrapping up quickly to make up for lost time in the schedule. They never got back to why she was crying.

Scenario 3:  Some clinicians brought packages of crackers to give away because people sat in the waiting room so long, they got hungry and irritable. A colleague gave crackers to an elderly woman who wouldn’t eat them until she was assured it was not charity, but something we provided because people have to wait so long.

Scenario 4: Though caseworkers were never trained to do group therapy, we had to run groups because more people could be seen in less time and it cost less for unlicensed professionals to do the work. Everyone who was waiting for a particular doctor comprised the group. In one group of two people, neither would talk. One of them walked out. Immediately, the other convulsed in tears, clasped her arms tightly around her chest and rocked toward me, telling me, in choked bits and pieces that her son had been murdered on the street in front of her, while she begged for his life.

Scenario 5: I checked my mail cubby, usually filled with routine confirmations and notices from the central office. An appointment reminder sent to a young man who hadn’t shown up was returned to me marked DECEASED. He had a tendency toward reading dark books, and at our last meeting he shared with me his angst about the meaning of life. He had died by suicide, and we didn’t know it.

After a decade of working in systems that allow so little caregiving, I could no longer do it. I stopped liking my patients and dreaded having to see them, so I knew I had to go. Nursing homes, hospitals, mental health clinics, schools: these were the places to which I was naturally drawn. It was work I loved and found worthwhile, a way to contribute to the greater good that gave me a sense of purpose. But looking in the mirror, I saw the beginnings of an apathetic, hostile social service worker, the kind of colleague I occasionally encountered, and disdained. I was earning very little money; there were so few places to turn for the services my clients needed, and none for my own well-being. The situation created quandaries that tied me in knots.

So, for my financial and mental health, I quit the job, enrolled in graduate school and learned to work with computers and statistics. In the two-plus decades since, this career route has been financially rewarding, intellectually challenging, and respected. It distanced me from the social systems that ground down my compassion, gave me a reprieve from responsibility for helping people with their problems and allowed me time to heal from what I eventually realized was burnout.

Now in my fifties, I’m still scorched, but the personal and social costs of such destructive systems and the needs of so many Americans are too great for me to keep my distance forever. The social systems that overwhelmed me have become ever more necessary and even less nourishing, to both caretakers and people needing care. I was able to leave that system. My patients couldn’t.

The nation’s community mental health system has been floundering for years. In January 2013, the Substance Abuse and Mental Health Services Administration of the U.S. Department of Health and Human Services (HHS) gave Congress a formal report on the status of the mental health and substance abuse workforce, at the request of the Senate Appropriations Subcommittee on Labor, Health and Human Services, Education, and Related Agencies. The report references previous reports in 2006 and 2007, which led to some changes in technology, training, resources, staff recruiting and the integration of primary and behavioral healthcare.

However, the report says, these long-term concerns about staff turnover, training needs, compensation, and shortages of qualified professionals persist, and even greater demands are being placed on the system as increasing numbers of Americans need these services.

After a decade of wars, many returning veterans need mental health and substance abuse treatment. The Affordable Care Act is giving more people healthcare coverage and requiring that mental health issues be covered comparably to physical health, which will bring more people into the mental health system. So will screenings for mental illness and substance abuse by primary care physicians, as this practice becomes more common. In addition, to alleviate mass incarceration, many states are implementing re-entry programs to reduce the prison population, many of whom need mental health or substance abuse treatment.

In the U.S., if we need physical, mental, or assisted living care, we expect that care should be available and provided with some degree of compassion. But equally widespread is the fear that it won’t be.

With the persistent and severe problems that plague most of our human services systems, there is good reason to worry that these systems will not adequately help us when we need them.

By and large, people working in these systems are humane individuals, with a desire to nourish and a willingness to confront suffering and try to alleviate it. Many believe that people who enter a human service profession do so for intrinsic rewards: the emotional satisfaction of helping a child learn, contributing to the next generation, healing wounds, physical and emotional, or comforting the dying. While the work does offer such rewards at times, more often it’s grueling and demanding. People are scared, in pain, and they lash out. Their families hover and complain, uncertain about what to do and getting little attention, and sometimes their loved ones die. Because caretakers have opened themselves to the suffering, they too must deal with their own human responses to such events. But frequently, there isn’t time. They must move on to the next crisis or person who needs them. And often, there is no way to process the experience with others, so the tension and emotional pain accumulate.

Supervision and debriefing are what should have happened when I was trying to help the mother whose son was murdered. Putting people in positions to do work for which they have not been trained, that involves human lives, is irresponsible. Without expertise in trauma care, and in my shock, all I knew to do was listen and try to comfort, holding her hand and hugging her as she sobbed. She deserved better. I should have been able to call in a licensed therapist and I should have been able to debrief myself with a similarly trained supervisor. But I didn’t even know that, until long after I’d left human services work.

Sometimes, though, even when trained professionals are involved, caseloads, schedules and other organizational practices preclude them from using their skills to help people and take care of themselves. A colleague told me that after staff had to restrain a physically aggressive client, the supervisor immediately called in everyone involved to debrief the crisis. But the caretakers were anxious and had difficulty participating because other patients were still waiting for their appointments and paperwork had to be completed. The supervisor’s instinct was on target, but the system’s practices did not allow it. Caseloads were too high, and schedules too tight. For these reasons too, clinicians’ one-to-one time with psychiatrists is often taken up with routine administrative work rather than training and discussing cases.

In these systems, you have to work to keep your humanity.

Policies, practices, pressures and culture conspire to take it away. According to the HHS report, anywhere from 21% to 67% of mental health workers experience burnout. These are extremely high rates, damaging to both caretakers and those who need care. When such a large portion of the workforce is affected, strong institutional factors are at work, producing unrelenting stress.

Caregivers suffering from burnout experience emotional exhaustion, depersonalization and a reduced sense of personal accomplishment. They may feel like failures, doubt their competency or compassion, become pessimistic and feel trapped. They may develop a sense of detachment, isolate themselves or take their frustrations out on others. Absences from work increase, and they can become more susceptible to illness, changes in appetite and sleep, and drug or alcohol use as a coping mechanism. High as it already is, the percentage of mental health workers who experience burnout is expected to rise as systems are further stressed by budget cuts and rising costs, which are addressed by requiring staff to produce more billable hours in an already jam-packed work day.

Patients and the system suffer when experienced professionals leave the field.

Staff turnover rates are in this range as well, from 15% to 42%. Leaving for a better opportunity is one reason why; burnout is another. According to the HHS report to Congress, a majority of primary care physicians attempting to find mental health services for their patients were unable to do so, due in part to a lack of mental health professionals. The regions of the nation where shortages exist cover 91 million people. Providing adequate care would require 1,846 psychiatrists and 5,931 other practitioners.

In addition, the HHS projected that 12,624 child and adolescent psychologists will be needed by 2020 but only 8,312 are expected. Employee turnover disrupts continuity of care, makes it difficult to assure consistent use of evidence-based practice, severs patient-caregiver relationships if they have been formed, and creates patient dissatisfaction.

Community mental health is seriously underfunded, but the cost to our humanity is far greater than money.

Inadequate financial compensation contributes to both turnover and burnout because it puts employees’ own security and health at risk. I know clinicians who cannot afford to replace their 10-year-old cars, who put off trips to the dentist or doctor because of what it might cost, and who worry about whether they will be able to support themselves in retirement. Some even qualify for food stamps. On average, an experienced licensed social worker earns less than a fast food restaurant manager. Direct care workers in residential treatment centers may earn minimum wage or barely above it. And behavioral care professionals earn substantially less than their counterparts in physical healthcare, a phenomenon the HHS report attributes to our society’s stigma surrounding mental illness and substance abuse, which bleeds onto those who work in the field.

Market-based approaches to everything have been the reigning paradigm for the last three decades. But

if the market rewards what’s needed and is in short supply, why do we pay our caretakers so little and take so little care of them?

We cannot simultaneously destroy people’s ability to care for themselves and expect that they can care for us. In addition to needing organizational policies and practices that promote staff wellness and attract sufficient qualified professionals, we need to realistically understand and face the policies and practices in our society that create so much need for mental health and substance abuse treatment. The cost of taking care includes not only money. It requires a humanity and sense of responsibility for the common good, an ethos of looking out for and supporting one another.

In 1977, former U.S. Vice President Hubert Humphrey said, “the moral test of government is how that government treats those who are in the dawn of life, the children; those who are in the twilight of life, the elderly; and those who are in the shadows of life, the sick, the needy and the handicapped.” In today’s climate, so steeped in public-private partnerships, that responsibility falls not only on government; it falls on business and on us all.

Sources:

Hyde, P.S. Report to Congress on the nation’s substance abuse and mental health workforce issues. U.S. Department of Health and Human Services, Substance Abuse and Mental Health Services Administration, January 24, 2013.

Morse, G., Salyers, M.P., Rollins, A.L., Monroe-DeVita, M. and Pfahler, C. Burnout in mental health services: A review of the problem and its remediation. Adm Policy Ment Health, September 2012, Vol. 39, No. 5.

Thomas, M., Kohli, V. and Choi, J. Correlates of job burnout among human services workers: implications for workforce retention. Journal of Sociology & Social Welfare, December 2014, Vol. XLI, No. 4.

Restoring the Balance After Violence


This article is drawn from my research on memorials we create in response to violence and sudden death. I wrote it in 1999, but never published it. I initially posted it online after the Sept. 11, 2001 terrorist attacks in the U.S. that shook the world, in the hope that it would be of some help. But clearly, violence is still a major problem in the U.S. Mass shootings at schools, theaters, churches, concerts; no place is off limits. And inner city violence and police killings of African Americans continue unabated.

These spontaneous memorials, or shrines, spring up out of pain, fear and grief. They’re an outcry. Yet, as a society, we’ve not listened. Now that massive protests are demanding justice and removal of weapons of war from our streets, I’m once again making this research available, in the hope that it will contribute somehow to quelling the violence and healing wounds.

Like all first-time visitors to the Oklahoma City bombing site, I didn’t know what to expect. More than three years after the terrorist blast, I stood on the corner of 5th and Robinson looking down the sloping street to the spot where the bomb-loaded rental truck exploded half a block away. Evidence of disaster surrounded me. Boarded-up windows, vacant lots, twisted metal and collapsing roofs. Scrawled in black spray paint on the side of the gutted Journal-Record Building was the message “Team 5, 4-19-95, We search for truth, We seek justice. The courts require it. The victims cry for it. And God demands it!” On the next corner, at 5th and Harvey, is a statue of Jesus weeping, his back turned toward the violence.

This section of 5th Street, now closed to vehicle traffic, is what people come to see. The fence protecting the remains of the Alfred P. Murrah Federal Building has become a shrine, a spontaneous memorial, a place of pilgrimage for people from every state, from many nations, and for families of the victims.

Some people bring mementos to leave on the fence. Others, not expecting to be moved by the scene, are touched by it but are unprepared to contribute. Still, they want to give something of themselves, so they leave whatever is available. Pens, caps, neckties, belts, business cards, sunglasses, jewelry, scarves, and hair barrettes. These spontaneous gifts are threaded into the wire mesh alongside keychains, license plates, shoulder patches, stuffed animals, baby shoes, pacifiers, bells, candles, rocks, dream catchers, medicine bags, crosses, angels, religious pamphlets, ribbons, dolls, toy trucks, beads, flags, pinwheels, coffee mugs, T-shirts, banners, flowers, poems and notes. Lots of poems and notes.

Cards, wreaths, and photos mark birthdays and anniversaries that passed without the honoree. One poem and photo told the story of Kathy Silovsky, known as “Scout.” She survived the blast when her coworkers did not, then killed herself three years later. The hard hat left by a nurse EMT asks, “Why couldn’t I do more? I’m so sorry.” Another poem addresses the curious, ending with the wish that the writer too were one of the curious rather than one who lost loved ones in the bombing.

 

I first saw the site during the summer heat wave of 1998. Whether I was there at 9 a.m., 1 p.m., or 7 p.m., I was never alone. The sidewalks were always filled with curious, solemn, and tearful faces. Child faces, elderly faces, middle-aged and young adult faces. White, and brown, and black faces. Male and female faces. They stood in the smothering heat, wiping perspiration, reading the messages and looking at the mementos and photos. They whispered. They gestured. Seldom were voices raised. When a woman shouting to her mother suddenly shattered the silence, it ripped through my body, making me cringe and sealing the reality that on April 19, 1995 at 9:02 a.m., 168 people died where I was standing.

The OKC fence is America’s most elaborate and dramatic spontaneous memorial. And no wonder. Besides killing 168 of us, this mass violence violated many of our most treasured cultural values and expectations. Timothy McVeigh’s and Terry Nichols’ terrorist attack aimed a blow at the heart of our nation, at our democratic process of political and social change. Their attack occurred at a time and place where our culture tells us we will be safe. Their bomb did not explode at some international event in our nation’s capital, where risk is anticipated and security heightened. It happened on a normal Wednesday morning in a small Midwestern city where we should be able to go to work, file for social security, leave our children at a day care center, and not worry about whether we will all return home for dinner in the evening. The victims were people we can identify with in some way. They were employees, parents or children, ordinary people like most of us, people who are not supposed to be at risk for murder. And the perpetrators were not outsiders, not a foreign enemy or Middle Eastern terrorists, as we surmised. They were our own.

In contemporary American culture, death is just not supposed to happen in public, nor while we’re taking care of routine, daily tasks. People who are minding their own business, who are not elderly and not leading a risky lifestyle are not expected to die. And we are supposed to be able to tell the good guys from the bad guys. There’s supposed to be some way that we can identify people who might hurt us and people who won’t, but the killers don’t usually look like monsters. They look like, and often are, our own children, our own parents, our own spouse.

These cultural violations produce uncertainty about the continuance of our society and the extent to which we possess even the most fundamental shared values. They can also produce personal insecurity. When we feel that we are unsafe in our homes and schools, on our jobs, during daylight hours, while we’re engaging in ordinary daily tasks, the questioning and the feelings of helplessness, vulnerability, and grief can be too wrenching for words. We need to act, to do something to overcome the helplessness and counteract the message violence sends, the message that individual lives are not valuable.

Spontaneous memorials give people a way to respond immediately, urgently, to this need. Unlike funerals, which few of us would ever consider attending without having known the deceased or a family member, anyone can participate in these memorial rites. No one is automatically included or excluded. Individuals create a role for themselves and define themselves as mourners by contributing to the shrine in some way. The mementos they decide to leave, if any, and the number of times they visit is self-determined.

At the Oklahoma City bombing site fence, some of the families of the victims leave their mementos right alongside those left by people who never personally knew any of the victims. But this is often NOT the case. In the last decade, we’ve seen these shrines as part of the televised news coverage of school shootings, the BATF-Branch Davidians shootout in Texas, the Polly Klaas abduction and murder from her home in California, and the intentional drowning of the young Smith boys by their mother in South Carolina. Wherever there is high profile violence, usually murder, there are spontaneous memorials. But these memorials also spring up in places the national press never sees. On sidewalks, at storefronts, in parks and parking lots all across the nation. Wherever someone was murdered and a local community felt the loss, there are spontaneous memorials.

***

 

When Americans are murdered, their death takes on a cultural significance that is otherwise absent and the family, to some degree, loses the ability to deal with their loved one’s death in private. Spontaneous memorials, though they may be intended to show support for the family, primarily address the social threats of violence. These memorials can be very painful reminders for the victim’s family. When people began leaving flowers and mementos in the California woods where 12-year-old Polly Klaas’ body was found, her father originally felt it was gruesome, certainly not something to partake in or from which to draw comfort. When private mourning is the way to grieve a loss, and the individual is an ordinary person not generally known to the public, then the public’s sudden interest raises suspicions about motivations.

Spontaneous memorials can draw people whose interests are only tangentially related to the victim’s death or the social issues that death represents. These memorials can attract people who are trying to resolve grief over their own personal losses. Their residual grief makes it easy for them to identify with the losses of others. And since grief is only resolved by repeated opportunities to experience it and thereby diminish its power, those for whom adequate opportunities have been unavailable find in spontaneous memorials an opportunity to foster their own healing.

Spontaneous memorials, so open, unregulated and new, do not have defined norms for expression or behavior nor do we all share the same interpretation of what these shrines represent. Some feelings, like anger and the desire for revenge, are a usual part of grieving but are not ordinarily expressed in traditional funeral rites or memorials. But anger is frequently and openly expressed during social and political protests. At spontaneous memorials it is common to see notes and symbols that express a range of emotions, including anger. People are angry at the police, the killers, the parents of young killers, and laws and social policies that they feel were inadequate safeguards that contributed to the death. With very high profile violence like the OKC bombing, these memorials even attract a few people who consider the forum appropriate for expressing political or religious views, even when they’re unrelated to the tragedy.

***

When violence strikes in the suburbs, in small towns, and in rural America it grabs the media spotlight. It’s rare to hear about inner city violence in the national news. Not because there is none, but because it’s old hat. Mainstream America has come to expect that inner city Americans may die violently. That’s life in the concrete jungle with its gangs and drugs and weapons and its schools that don’t educate. But the effect of violence on inner city Americans is equally devastating and the ritual response just as compelling. Even more so perhaps because of the need to counter America’s neglect and vilification.

In some of our largest cities, inner city memorial walls commemorate the dead, most of whom died too young. Painted on sides of buildings and playground walls near where the victim died, lived or regularly met with friends, these bright, vivid, pulsating murals exude a vitality that shouts the intent to survive. They include images of favorite candies, clothes, or sports, hypodermic needles, crucifixes, handguns, roses, cartoon characters, and portraits and names of people who were intentionally gunned down, those who were in the wrong place at the wrong time, those who overdosed, and those who may have been drug dealers, robbers, or killers themselves.

Survivors sometimes write their own names on the walls to show that they visited. In these memorials, the boundaries between the living and the dead, and between perpetrators and victims are blurred, all caught up in the culture of poverty and violence that is so pervasive that even children approach artists about painting memorials for them.

Memorial walls are usually commissioned by family or friends of the deceased, but sometimes the neighborhood takes up a collection to memorialize one of their own. Some artists have created memorials to commemorate their family members and remind young people about the dangers of their world. In one, a grim reaper in hip-hop dress crouches forward in his unlaced tennis shoes and throws the dice. The inscription warns of the invisible killer, AIDS. In another, below a poem about the danger of crack, a serpent rises from the water and tempts a naked young woman.

The boundaries between the living and the dead, and between perpetrators and victims are blurred, all caught up in the culture of poverty and violence.

These memorials are intended to be permanent and accepted by the community, so artists try to get permission from property owners or the city before they paint. Still, inner city memorial walls don’t last forever. Sometimes the walls are painted over by new owners of the buildings, by city cleanup crews, or by police when the memorials charge police brutality, or they’re destroyed when buildings are torn down. Weather fades and chips the paint. And rival gangs occasionally deface the walls, trying to obliterate even the memory of the deceased. But in general the walls are places protected by the community. They are places where block parties honor and celebrate the deceased’s life, where mourners go to leave birthday cards, light candles or say prayers. They are places for organizations to discuss alternatives to violence, for police to monitor gang activity, and for personal pilgrimages that renew hope.

***

American society is so large, complex and multifaceted, and so successful at minimizing our risk of death, that most deaths can be adequately handled privately, among family and friends, because they do not affect a larger community or possess broad social ramifications. But for some Americans, a funeral and a marker in a cemetery are not enough to mark a violent death. Even though the victims are private individuals, not public figures, the nature of the death, the feelings it evokes, and the meanings it carries require that the death be recognized and commemorated in public space, usually at the site of death.

In the U.S., we’re supposed to get over death quickly. But our ability to do that is complicated when death is unexpected, unjust, and public. If a convenience store clerk is murdered, there’s a flurry of official activity and passersby look on. But after the police tape is gone, and the streets are cleaned, there’s no sign of tragedy, no reminders that human life was lost or community damage done. In a matter of hours, the scene can suggest that life there is normal, routine.

But life is not routine for people who knew the victim or those who share a strong sense of vulnerability or outrage surrounding the death. When they see the site, they know that something important happened there that can’t just be swept away, ignored, or treated as if it never existed. For them the site is no longer ordinary. The spilling of human blood, the disruption of community bonds, and the fragility of human life linger at that spot, and they must be acknowledged and soothed before the ground can again be used for ordinary purposes. Returning the ground too quickly to everyday, secular uses can feel like a sacrilege.

Most of us believe that in death everyone deserves respect, as does death itself. Violence has violated this basic element of the human spirit and that wound must be healed. Returning the site to ordinary use without first marking it, noting its significance, discounts the wound and shows that individuals are insignificant and death is routine. Marking the site lends gravity and weight to the loss and gives a nod to our respect for death, rather than allowing the person and the tragedy to disappear like smoke.

For people who feel the personal and social threats of violence but are not a part of the immediate community, like people who do not live in or around OKC, visiting the site can establish a tangible connection to the tragedy that gives concreteness and realness to an emotional connection. Some people may be sufficiently saddened or outraged after hearing about the death and feel the need to do something about it but have nowhere to go, or nothing to do, yet the feelings remain. In addition, others around them may not share or understand their intense concern.

Going to the site provides a physical anchor for their emotion that validates their feelings while simultaneously allowing them to take some action, even if the action is only to add their voice to opposing an injustice. For others, going to the site is like looking for metaphorical clues, like trying to get inside the incomprehensible, trying to grasp its reality and understand. Still others believe that the soul leaves the body the moment the person dies, and since the death happened suddenly, there was no time for last rites, so paying homage at the place of death is crucial to fulfilling religious and social obligations.

These memorials are often fiercely protected by the community. When I visited a spontaneous memorial in Houston, I noticed a neighbor looking out of the window. One man walking along the sidewalk began to slowly stroll and another driving by pulled into the parking lot and sat for a few minutes until I guess he decided I was OK. In OKC, the Memorial Foundation archivist had to be given an official badge because she was repeatedly questioned by people who saw her remove objects from the fence. On the rare occasions when someone defaces an inner city memorial wall, artists quickly restore it, determined not to allow the deceased’s memory to be erased.

These memorials in public space can also cause controversy when they’re placed on disputed property or when they impinge on others’ grief. If someone places mementos on a street corner or county easement, but neighbors are upset because the shrine is in daily view from their windows, how do the differing needs get resolved? What if the shrine, visible from home, is an unwanted reminder of one’s own child’s death? What is the right thing for all involved to do?

For some people, a memorial at the site of a loved one’s death can be unbearable. They feel that by associating the site with their deceased loved one, the accident or murder will dominate their memories, and they don’t want that, don’t want the trauma and tragedy to overshadow their living memories of the person they love. But they may not always feel that way. Grieving is a complex emotional, physical, intellectual, spiritual and social process that everyone does at their own pace and in their own way. The coping strategies we use, the things we see and do to restore hope, balance, and engagement in life are not the same the day after the death as they are in six months or five years. Nor will the activities or objects that help one person heal necessarily help another person in the same way.

Trying to create a memorial that will help the community as a whole to heal but that also respects the fact that individuals grieve differently and at different rates is a delicate act. In OKC, groundbreaking for a permanent memorial to replace the fence is about to begin but some of the victims’ family members don’t want to see the fence go. For more than three years it has been the site where they have mourned their loved ones and received support from people around the world.

Once an object is imbued with sacred or deep human significance, it cannot simply be discarded without people feeling betrayed.

On a now vacant lot across 5th Street from the fence is the Survivor Tree. It lost all of its leaves in the explosion but bloomed the following year. Survivors of the blast gravitate to the Survivor Tree, while families of the deceased prefer the fence. On the night Timothy McVeigh was convicted, families and survivors came together in a memorial service at the Survivor Tree and, for a while, it seemed that the division was bridged. But families soon returned to the fence. So OKC Memorial Foundation officials are considering whether to alter the permanent memorial’s design to include some portion of the fence. (Update Note: Part of the fence is included in the formal memorial. A ceremony was held to transfer the fence from its original location.)

Once an object is imbued with sacred or deep human significance, it cannot simply be discarded without people feeling betrayed. When inner city memorial walls are completed, usually there’s a block party with food, music, and dancing. This celebration honors the deceased, brings people together to affirm life, makes the wall part of the community and invests it with personal and collective meaning. This ritual can be compared to constructing a building to be used as a church. Once the building is consecrated, it takes on extraordinary significance. It is no longer just any building; it is a special, sacred building that deserves reverence and respect. If for some reason the building will no longer be used as a church, it must be de-consecrated. Spontaneous and roadside memorials and inner city memorial walls possess all of the shades of meaning and emotion that their contributors have placed on them. Dismantling them or disposing of the objects may require a final ceremonial gesture that validates their destruction.

For most of us, these memorials are not neutral objects in our landscape. They affect us to one degree or another. They stimulate reverence, disgust, sorrow, regret, sympathy, anxiety, curiosity, empathy, and fear. Numerous enough to notice, they’re still infrequent enough to startle. That momentary twinge they cause, that brief recognition of death in our daily life, intrudes on our ability to keep death at bay, to keep it private and controlled. The world refuses to stop for the death of a taxi driver, or a teacher, or a student, or for any of us. But an inner city memorial wall or a sidewalk covered with flowers, flags, candles and notes forces us to notice, makes us pause, if only for an instant.

 

For more information, see our groundbreaking article
Spontaneous Memorialization: Violent Death and Emerging Mourning Ritual in Omega: Journal of Death & Dying, October 1997.

Out of Class


I work at a community college, but rarely interact with students. I count them, analyze their enrollment patterns, their graduation rates and grade point averages. I keep track of the types of degrees awarded, the demographic composition of the student body, and the classes students complete so administrators can make decisions about running the college.

What I do is called institutional research. So when I answered my phone one day, the student voice surprised me. Her words sent me back decades.

“I need financial aid to go to college but my mom won’t sign the papers,” she said. “I’m 18, what can I do?”

I know that story. I never expected to hear it from anyone else though. I thought I was the only person in the U.S. whose parents refused to help their children go to college. With all the politicking and pushing today for everyone to go to college, and the cultural belief in family support and pride in each other’s success, it’s easy to feel like an alien when you have to go it alone.

It must be your own fault, too. Presidents from Clinton to Bush to Obama have, through one plan or another, intended to make college accessible to every kid who graduates from high school. So hey, all you got to do is make it through 12 years. Then, the world is your oyster as you head off to college and take your rightful place in making each generation more successful than the last. In recent years, we’ve recognized that foster kids might need extra help, but otherwise, of course we’ve all got parents who push us, prod us, and support us on to that great equalizer, college.

Every scholarly study on the benefits of higher education relates college degrees to earning power. Without a college degree, you make very little money, say the studies. Without a college degree, you can’t hope to reap the fruits of America’s orchard. Go to college. Get a degree. Maybe even two. Without it, you’re zero. You’ll amount to nothing in America. You’ll be broke and miserable and wind up on the street sleeping under cardboard. Begging for scraps from the educated. Who, by the way, will step over you on their way to a seaside restaurant. It’s your own damn fault, they’ll say. Should have gone to college.

And whether your culture values higher education or not, mainstream culture does, and that’s what counts.

For me though, money wasn’t the motivator. Every study of employee motivation tells us that, yes, money is appreciated, we can always use more of it, but money alone won’t make us happy with our job, won’t make us want to improve the widget-making process or streamline the paperwork. The paycheck boost looks good, for a couple of months, but then we start wondering how long it’ll take to get the next one.

Money isn’t enough of a motivator to get through 12 years of working toward a four-year degree. That’s how long it took me, working a full-time job and supporting myself, to earn a bachelor’s degree. College for me was about learning, about understanding the world around me, and before me. It was about how the world might be different in the future than it is today. It was about how I might change the world. Before I started the first grade, I was a reader. Every week, my mom gave me carte blanche at our small town’s public library. Into the big kids’ section I’d go. No baby books for me. And when, in the third grade, I found the adult section, the librarian had a heart to heart with my mother. My mother won. So did I.

My parents were blue collar small business people in the rural Midwest. College wasn’t necessary to make a good living then. You just needed to work hard.

To my dad, that meant physical labor. Thinking was NOT work. Writing, drawing, playing music, that was NOT work. If you didn’t sweat. If your muscles didn’t bulge and pop and occasionally get bruised, then you weren’t working. You were lazy, loafing, living off the labor of others.

Even so, as long as my mom was around, I assumed I’d go to college. My teachers encouraged it. I should be a teacher, or a nurse, they said. Ugh, I thought. A scientist maybe. An artist maybe. Or an explorer. I had a couple of distant cousins, my parents’ age, who had college degrees, so college wasn’t completely unheard of. Even though these cousins lived in big cities, far away from us, and we never saw them. One of them was crazy, my dad said. College made him that way. The other one, who’d grown up in the city, had no common sense. He thought he could live on the land. Grow his own food. Build his own house powered by the sun. Provide for himself. College made him stupid, my dad said.

When my mom left us, my parents divorced, and that was the end of my college dreams. My dad had the money. He had custody of my brother, sister and me. And he hated college. Besides making people crazy and stupid, it turned them into smart-alecky know-it-alls. At 13, I was that already.

The next several years were a rough ride for all of us. Living on my own at 17 and working the graveyard shift in a nursing home, I finished high school.

I was a writer, interested in journalism. My work was the backbone of our high school newspaper and literary magazine, and the University of Missouri’s School of Journalism was considering offering me a scholarship, as were local businesses and organizations. “Take my name off the scholarship list,” I told our high school counselor. I couldn’t see it working. Four years was eternity. And the idea of moving into a tiny dorm room with 18-year-old women who were just out of high school and living away from home for the first time, whose concerns would be different than mine, seemed unbearably isolating.

Before the summer was over, I realized my mistake. I couldn’t live without going to college. But how?

Off I went to the financial aid office at the local university asking the same question the young woman asked me on the phone. The financial aid officers needed my dad’s income information and signature. I knew he wouldn’t give it, but I asked. He didn’t. There was little the university could do. The minuscule scholarships still available weren’t even enough to pay for books. I had to work for another year then file an income tax return to prove financial independence, so I could get loans.

By that time, my old Chevy had died, so I had to borrow a few hundred dollars to buy another used car. At minimum wage, paychecks didn’t always cover the rent, and I fell behind. I was living on 19-cent boxes of macaroni and cheese from the thrift store. Debts were accumulating from everywhere. It was gonna be a long haul.

*****

I mentioned the student’s phone call to our Dean of Student Development Services. “I hear that a lot from our students,” she said. “Some parents think their job’s done when their kid turns 18. They’re on their own. They don’t seem to care if their child does better than they did.”

Some parents don’t want their children to do better. That’s a fact that our cultural myths and rhetoric about family hide.

A professor whose son entered his father’s field, and was beginning to win awards and garner attention, told me that he was proud but jealous. He wanted his son to do well, but not surpass his own achievements. In the case of a working class family, sending a child to college may, as it did for my dad, counter their own values. A college education is not considered “better.” For other working class parents, there’s the fear of losing their children if they send them to college. What will they learn? Who will they be when they finish their degree? Will they still be part of the family? Will they want to come home? Working class culture is not educated professional culture.

*****

For working class people, colleges are remote, even when they live nearby. So the first semester I went to enroll at the university, my younger sister went with me. She didn’t know any more about what we were doing than I did, but she was at least a familiar face in an otherwise foreign environment. Besides, if she decided to go to college in a few years, she’d have a bit of experience to make it less daunting. We met my advisor in a big hall where other students were doing the same thing at tables scattered around the room. My sister and I sat down across the card table from her, but she told my sister to go sit in a row of chairs by the wall until we were finished. I wonder if she would have shooed away a parent who was accompanying their child.

The advisor then tried to convince me to register for basic English, Math and Science courses. I had other plans. I knew I wasn’t going to be able to get through college in four years, so I wanted to take courses that interested me and that might help me earn more than minimum wage before I finished. She didn’t get it. There were specified courses for freshmen, she explained. There were course sequences. There were prerequisites. I insisted. She insisted. After multiple volleys, I ended up taking an English course and a History course, to satisfy the university, and Journalism courses, to satisfy me.

I don’t know how much this experience affected my sister’s view of higher education. We never commented on it; working class people don’t mention humiliations. She never finished high school. Neither did my brother, whose artistic talents shine even without nurturing. He uses them in building. Construction or carpentry, now that’s something a working class man does. He doesn’t draw or paint on a canvas. Reading and writing books, that’s not what a working class woman does. She gets married, has babies, works on the farm, in a factory, or as a waitress, hairdresser or clerk in a retail store or fast food restaurant. She doesn’t travel around the world, go to museums and film festivals, or live independently in the city.

*****

I didn’t notice the social class differences much while working on my undergraduate degree. I hit the campus for class, then left for my full-time job, so I didn’t know other students or professors. Whatever happened on campus, short of its closing down, was irrelevant to me.

In graduate school, however, three years after I finished my bachelor’s degree, social class distinctions were obvious, even pointed at times.

Attending full-time on an assistantship, I spent every day with my peers and professors. In this program, there was a not-so-subtle distinction between students who were doctoral material and those who intended to end their education with a Masters. As a student deemed doctoral worthy, I was expected to attend professional conferences and present my work. The problem was, I was actually trying to live off of my assistantship, with no supplemental money flowing in from home. Pointing out my lack of funds made some of my friends and professors uncomfortable. How could I be there, looking like them, doing credible scholarly work, yet not have the same financial resources and family support? They weren’t privileged, after all. Just middle class. Must be that I was scared, or resisting. I just didn’t want to do the things that would bolster my career. When I declined acceptance into a Ph.D. program, one professor told me, dismissively, that I could do it if it was important enough to me. I just wasn’t willing to sacrifice.

*****

That summer, I went to my dad’s house for dinner. It was the first time we’d seen each other in 20 years. His stepson, whom I remember at about age 9 chopping up the plants in my room and generally being annoying, immediately told a story about spinning his truck’s wheels to fling gravel onto the fancy car of some “city slickers” who were driving on a country road.

What do you say to each other after 20 years’ and two college degrees’ worth of separation?

Relying on the presence of food and neighbors, my dad was clearly hoping for a smooth visit. The questions were superficial, intended to be non-confrontational, but even these bland queries couldn’t avoid the class differences. “What kind of car do you drive?” he asked. When I told him a Honda, his wife laughed and hit me on the shoulder. “So aren’t you gonna talk about people who drive those foreign cars, the way you usually do?” “Nah,” my dad said, trying to get along, “that’s the kind of car professors drive.”

None of my first cousins or aunts and uncles went to college, and sometimes, in each other’s company, it’s as if they don’t know who they’re talking to. I’m a stranger they’re supposed to know, but aren’t quite sure they do. Sometimes I’m not sure myself how much I’ve really changed. I’m not a professor. I’d told my dad before, in explaining my job, that I don’t teach. But educated people are foreign, unknown entities. There are few books in my dad’s house, and that’s what educated people do, don’t they? Write books.

Their world feels foreign to me too, but in some ways it always did. Which was part of why I knew I had to go to college. Sometimes it’s difficult to separate the class distinctions from the jagged edges my family’s divorce and disruption caused. My brother and I talk, tentatively, cautiously. But eventually there’s a place where our souls connect, a place where we know and remember each other. It’s a place where life is what it is, regardless of class and distance and pain. I tell him about the trip I’m taking to Europe in the fall. He tells me, standing on the front porch of the cabin he built himself from scrap wood, that he’ll never have the money to travel, but he’d like to. I’ll never have the money to buy a house, I say. I got started too late. We’d both like to do it all. But we have to choose.

That year, finally, I paid off my student loans. But another young woman was just starting the trek.