Pandemics and ethnic violence

We begin with two questions:

  1. Under what conditions will a population persecute an ethnic minority during and after a pandemic?
  2. Who is most likely to instigate the violence?

Their answers are important. In case you haven’t heard, a deadly coronavirus has surfaced in Wuhan, China, and has now spread around the world. Some hold Asians responsible for the pandemic and act on their prejudice. A 16-year-old Asian-American boy in the San Fernando Valley was assaulted and sent to the emergency room after being accused of having the virus. In Texas, an assailant stabbed an Asian family in line at a Sam’s Club. Asians in communities like Washington D.C. and Rockville, Maryland, are purchasing firearms en masse in an attempt to protect themselves. If general answers to our questions exist, they may suggest ways to relieve ethnic tensions and prevent additional violence.

We turn to medieval Europe for guidance. It experienced a horrifying pandemic followed by decades of severe antisemitism. Our best bet to understand these questions is to examine the Black Plague and the subsequent treatment of medieval Jews.

The Black Plague

Without exaggeration, the Black Plague (also called the Black Death) was the worst pandemic in human history. From 1347, when it arrived in Italy, to 1351, between 40 and 60 percent of the entire European population perished. Aggregate figures obscure the losses of individual towns. Eighty percent of residents died in some locales, effectively wiping cities off the map (Jedwab et al., 2018). France was hit so hard that it took approximately 150 years for its population to reach pre-plague levels. Chroniclers of the time attempted to capture the devastation of the plague by describing ships sailing aimlessly on the sea, their crews dead, and towns so empty that “there was not a dog left pissing on the wall” (Lerner, 1981).

Source: (Jedwab et al., 2018)

The statistics are scary, but the plague was also visually terrifying. After catching the disease from an infected flea, an individual develops black buboes (lumps) in the groin and neck areas, vomits blood, and dies within a week. In severe cases, plague-causing bacteria circulate through the bloodstream and cause necrosis in the hands and feet, leading them to turn black and die while still attached to the body. Once these symptoms develop, victims usually die within 24 hours. Seeing most of your acquaintances develop these symptoms and die was arguably more psychologically damaging than actually contracting the disease yourself.

The scientific response to the pandemic, if it can be called one, was weak. Medieval medicine was still in the throes of miasma theory, which held that disease was spread by “bad air” emanating from decomposing matter. Still, miasma was deemed an insufficient explanation for a calamity of this size. The most educated in medieval Europe saw the plague not as a natural phenomenon beyond the grasp of the medicine of their day, but as a cosmic indicator of God’s wrath. Chroniclers claim the plague originated from sources as diverse as “floods of snakes and toads, snows that melted mountains, black smoke, venomous fumes, deafening thunder, lightning bolts, hailstones, and eight-legged worms that killed with their stench,” all ostensibly sent from above to punish man for his sins (Cohn, 2002). A common addendum was that these were all caused in one way or another by Jews, but we’ll get to that later.

Some explanations, if squinted at, bear a passing resemblance to the secular science of today. They invoked specific constellations and the alignment of planets as the instigators of the plague, drawing out an effect from a non-divine cause. How exactly distant celestial objects caused a pandemic is unclear from their accounts, though.

In general, medieval explanations of the plague reek of desperate mysticism. They had quite literally no idea of where the disease came from, how it worked, or how to protect themselves from it.

Medieval Jews

Jewish communities were widespread in Europe by the time the plague began. In the eighth century, many Jews had become merchants and migrated from what we today call the Middle East to Muslim Spain and southern Europe. Over the next two centuries, they spread northward, eventually reaching France and southern Germany before populating the rest of the region (Botticini & Eckstein, 2003). Jews were overwhelmingly located in large urban centers and specialized in skilled professions like finance and medicine. By the 12th century, scholars estimate that as many as 95% of Jews in western Europe had left farming and taken up distinctly urban occupations (Botticini & Eckstein, 2004).

Map indicating percent of neighboring towns with resident Jews by 1347 in a limited dataset. In other words, it does not account for every town with a Jewish population in Europe but should give you some idea of settlement patterns. The fact England is devoid of Jews is not a limitation of the data; they were expelled from the region in 1290. Map source: (Jedwab et al., 2018)

Unfortunately, the medieval period offered Jews little respite from persecution. Antisemitism was constant and institutionalized, as Christianity was the official religion of most states. Restrictions were placed on the ability of Jews to proselytize, worship, and marry. For example, in 1215, the Catholic Church declared that Jews should be differentiated from the rest of society via their dress to prevent Christians and Jews from accidentally having “sinful” intercourse. These rules eventually begot requirements that Jews wear pointed hats and distinctive badges, foreshadowing the infamous yellow patches of the Holocaust.

Medieval antisemitism was violent as well as bureaucratic. Pope Urban II fueled Christian hysteria by calling the First Crusade in 1095, and the following year bands of soldiers passing through Germany on their way to wage holy war in Jerusalem killed hundreds of Jews in what became known as the Rhineland Massacres. These attacks were so brutal that there are accounts of Jewish parents hearing of a massacre in a neighboring town and killing their own children, and then themselves, rather than face the crusaders. Explanations for these massacres vary. Some scholars claim they were fueled by the desire to seize the provisions of relatively wealthy Jews in preparation for travel to the Middle East. Others attribute them to misplaced religious aggression intended for Muslims but received by Jews due to their proximity and status as “non-believers.” While there are earlier recorded instances of antisemitism, the pogroms of the First Crusade are believed to represent the first major instance of religious violence against the Jews.

Strangely, the Medievals seemed to vacillate between ethnic loathing and appreciation for Jewish economic contributions. The Catholic Church had long forbidden Christians from charging interest on loans (a prohibition reaffirmed as late as 1311), allowing Jews to dominate the profession. As a result, they were often the only source of credit in a town and hence were vital to nobility and the elite. This, coupled with Jews’ predilection to take up skilled trades, gave leaders a real economic incentive to encourage Jewish settlement. Rulers occasionally offered Jews “promises of security and economic opportunity” to settle in their region (Jedwab et al., 2018).

The Black Plague and Medieval Jews

As mentioned, the plague was a rapid, virulent disease with no secular explanation, and the Jews were a minority group—the only non-Christians in town—with a history of persecution. Naturally, they were blamed. Rumors circulated that Jews manufactured poison from frogs, lizards, and spiders, and dumped it in Christian wells to cause the plague. These speculations gained traction when tortured Jews “confessed” to the alleged crimes.

The results were gruesome. Adult Jews were often burned alive in the city square or in their synagogues, and their children forcibly baptized. In some towns, Jews were told they were merely going to be expelled but were then led into a wooden building that was promptly set ablaze. I will spare you additional details, but for the curious, these events spawned a nauseating genre of illustrations. As the plague progressed, more than 150 towns recorded pogroms or expulsions in the five-year period between 1347 and 1351. These events, more so than the Rhineland massacres, shaped the distribution of Jewish settlement in Europe for centuries afterward.

Black Death Jewish pogroms on a map of the Weimar Republic (early 20th century Germany). “No data” means there are no records of Jewish settlement in the area at the time of the Black Death (Voigtländer & Voth, 2012).

If we can bring ourselves to envision these pogroms, we imagine a mob of commoners whipped up into a spontaneous frenzy. Perhaps they have pitchforks. Maybe they carry clubs. Given what we know about the economic role of medieval Jews, you might impute a financial motive to the villagers. It’s possible commoners owed some debt to the Jewish moneylenders that would be cleared if the latter died or left town. If asked what income quintile a mob member falls into, you might think it a strange question, and then answer: the lowest. Their pitchforks indicate some type of subsistence farming, and only the poorest and least enlightened would be vulnerable to the crowd mentality that characterizes such heinous acts. Or so you would think.

If so, it might be surprising to hear the Black Death pogroms were instigated and performed by the elite of medieval society. (Cohn, 2007) writes that “few, if any, [medieval historians] pointed to peasants, artisans, or even the faceless mob as perpetrators of the violence against the Jews in 1348 to 1351.” Bishops, dukes, and wealthy denizens were the first to spread the well-poisoning rumors and were the ones to legally condone the violence. Before even a single persecution in his nation took place, Emperor Charles IV of Bohemia had already arranged for the disposal of Jewish property and granted legal immunity to knights and patricians to facilitate the massacres. When some cities expressed skepticism at the well-poisoning allegations, aristocrats and noblemen, rather than the “rabble,” gathered at town halls to convince their governments to burn the Jews. Plague antisemitism, by most accounts, was a high-class affair.

Mayors and princes recognized the contagious nature of violence. If elites persecuted the Jews, they thought, the masses might join in and the situation could spiral out of control. As a result, the wealthy actively tried to exclude the poor from antisemitic activities. Prior to a pogrom, the wealthy would circulate rumors of well-poisoning. Those of means would then capture several Jews, torture them into “confessing,” and then alert the town government. Its notaries would record the accusations, and the matter would be presented before a court. After a (certain) guilty verdict, patrician leaders would gather the Jews and burn them in the town square or synagogue. Each step of the process was self-contained within the medieval gentry, giving commoners no opportunity to amplify the violence beyond what the elites intended. Mass persecutions often take the form of an entire society turning against a group, but the medieval elites sought to insulate a substantial portion of their population from the pogroms. Ironically, they feared religious violence left unchecked [1].

Persecutions were widespread, but not universal. (Voigtländer & Voth, 2012) say only 70 percent of towns with a Jewish population in plague-era Germany either expelled or killed their Jews. To be sure, 70 percent is a substantial figure, but the fact it is not 100 percent demonstrates that there were conditions under which ethnic violence would not ensue. What were these conditions?

In pursuing this question, (Jedwab et al., 2018) observed something strange. As plague mortality increased in some towns, the probability Jews would be persecuted actually decreased. A town where only 15 percent of inhabitants died was somehow more likely to persecute its Jews than one where 40 percent did. How odd. Common sense tells us that the more severe an unexplained disaster, the stronger the incentive to resolve ambiguity and blame an outgroup. Why reserve judgment when things are the worst?

It turns out economic incentives were stronger than the desire to scapegoat. Jedwab only observed the inverse relationship between mortality and probability of persecution in towns where Jews provided moneylending services. The Jews’ unique economic role granted them a “protective effect,” changing the decision calculus of would-be persecutors. It’s true there was still an incentive to persecute Jews, since debts would be cleared if the moneylenders died, but this was a short-term gain with long-term consequences. If all Jews in a town were eliminated, future access to financial services was gone. Everyone in town would be a Christian, and thus forbidden from extending credit. As mortality increased, Jews qua moneylenders became increasingly valuable, since killing or expelling them would exacerbate the economic crisis that accompanies losing a significant fraction of your population. As a result, they were spared [2].

This is consistent with the picture we developed earlier of the upper classes undertaking plague pogroms. Often, only the wealthy utilized Jewish financial services, so only they were sensitive to the financial implications of killing the sole bankers in town (Cohn, 2007). If commoners were the major perpetrators of Black Death massacres, Jedwab and colleagues would probably not have found evidence of a protective effect tied to Jewish occupations. Indeed, they looked for, and could not find, the protective effect in towns where Jews were doctors, artisans, or merchants. It only appeared when they provided financial services.

Persecution frequency also fell dramatically after the initial wave of the plague. The disease made semi-frequent appearances in the decades and centuries after 1347, but not one of them sparked as much violence as the first bout. A potential explanation is that many Jews had already been killed or expelled from their towns, leaving nobody to persecute. Plague severity was lower in later recurrences, so there might have been less of an incentive to persecute a minority for a mild natural disaster as opposed to a major one.

Jewish persecutions by date. Source: (Jedwab et al., 2018)

I’ll describe a somewhat rosier phenomenon that could have contributed to this decline in persecutions. Remember that the medieval intellectual elite was clueless when it came to the causal mechanisms of the plague. Among other theories, they believed it spread via bad air, worm effluvium, frogs and toads that fell from the sky, the wrath of God, or Jewish machinations. Because Jews were the only salient member of this list present in most towns, they had borne the brunt of popular frustration and become a scapegoat [3].

Yet science did progress, slowly. By 1389, roughly 40 years after Europe’s first encounter with the plague, doctors noticed that mortality had fallen after each successive recurrence of the disease. Instead of attributing this to fewer toads in the atmosphere or less worm stench, they settled on the efficacy of human interventions. Institutions had learned how to respond (the quarantine was invented in this period) and medicine had progressed (Cohn, 2007). Medievals had increasingly effective strategies for suppressing the plague, and none of them involved Jews. Blaming them for subsequent outbreaks would be downright irrational, as it would divert time and resources away from interventions that were known to work.

I want to be clear that this did not end antisemitism in Europe. For centuries Jews were killed, and still are, over all types of non-plague-related allegations like host desecration, blood libel, and killing Jesus. Yet they enjoyed a period of relative calm after the first wave of the Black Death, in part, I believe, because their persecutors had begun to understand the actual causes of the disease.

Generalizations

1. Under what conditions will a population persecute a minority?

Persecutions are more likely when members of the minority in question don’t occupy an essential economic niche. Jews as moneylenders provided a vital service to members of the medieval elite, so the prospect of killing or expelling their only source of credit may have made those elites think twice about doing so.

2. Who is most likely to instigate the violence?

Wealthy, high-status members of medieval society instigated and undertook the Black Plague pogroms. They were responsible for spreading well-poisoning rumors, extracting “confessions,” processing and ruling on accusations, and attacking Jewish townsfolk. Some medieval elites even conspired to insulate the poor from this process for fear of the violence escalating beyond control.


How applicable are these conclusions? Can they tell us anything specific about ethnic violence and COVID-19?

Probably not. For starters, the world today looks nothing like it did during the 14th century. The Medievals may have discovered how to attenuate the effects of the plague, but it remained more or less a mystery for centuries afterward. We didn’t get germ theory until the 19th century, and it wasn’t until 1898 that Paul-Louis Simond discovered the plague was transmitted to humans through fleas.

Perhaps as a result of scientific progress, we’re also much less religious, or the nature of our religiosity has changed. Very few believe hurricanes are sent by God to punish sinners, and we don’t supply theological explanations of why the ground trembles in an earthquake. We have scientific accounts of why nature misbehaves. As a result, we’re skeptical of (but not immune to) claims that a minority ethnic group is the source of all our problems. In short, we have the Enlightenment between us and the Medievals.

Cosmopolitanism is also on our side. Jews were often the only minority in a town and were indistinguishable without special markers like hats and badges. To this day, parts of Europe are still pretty ethnically homogeneous, but every continent has hubs of multiculturalism. 46 percent of people in Toronto were born outside Canada. 37 percent of Londoners hail from outside the UK. Roughly 10 percent of French residents are immigrants. All this mixing has increased our tolerance dramatically relative to historical levels.

Perhaps most importantly, COVID-19 is not even close to the plague in terms of severity. The medical, economic, and social ramifications of this pandemic are dire, but we are not facing local death rates of 80 percent. We do not expect 40 to 60 percent of our total population to die. COVID-19 is a challenge that is taxing our greatest medical minds, but we have a growing understanding of how it functions and how to treat it. It’s definitely worse than the flu, but it’s no Black Plague.

An investigation into the Black Plague and medieval Jews can provide historical perspective, but its results are not easily generalizable to the current situation. The best we can say is that when things are bad and people are ignorant of the causes, they will blame an outgroup they do not rely on economically. The cynical among us perhaps could have intuited this. Thankfully, things aren’t as bad now as they were in 1347, and we are collectively much less ignorant than our ancestors. We’ve made progress, but intolerance remains a stubborn enemy. What Asians have already endured, and will continue to endure, as a result of this pandemic supports this.

 

Acknowledgments

Huge thanks to Michael Ioffe and Jamie Bikales for reading drafts.

 

Footnotes

[1] I draw heavily on (Cohn, 2007) in the preceding two paragraphs. It’s definitely true some pogroms were instigated and undertaken by the poor while the elites sought to protect the Jews. For instance, Pope Clement VI issued an (unheeded) papal bull absolving Jews of blame. Yet, Cohn has convinced me (an amateur) that these cases constitute a minority.

Still, the class demographics of medieval pogroms are a matter of scholarly debate. (Gottfried, 1985) describes members of Spanish royalty unsuccessfully attempting to protect their Jewish denizens. However, he does not specify whether the antisemitism was primarily instigated by the masses or regional leaders. (Jedwab et al., 2018) mentions “citizens and peasants” storming a local Jewish quarter, but whether local antisemitism was spurred by the gentry or not is also unclear. (Haverkamp, 2002) supposedly also argues for the commoner hypothesis, but the work is written in German and thus utterly inaccessible to me.

[2] A simple alternate explanation for the inverse relation between mortality and probability of persecution is that there are fewer people left to facilitate a pogrom or expulsion at higher levels of mortality. Jedwab and colleagues aren’t convinced. They note that the plague remained in a town for an average of 5 months, and people weren’t dying simultaneously. It’s entirely possible that even at high mortality a town can muster enough people to organize a pogrom. Also, many of the persecutions were preventative. Some towns burned their Jewish population before the plague had even reached them in an attempt to avert catastrophe.

[3] God was also “present” —in the form of churches and a general religious atmosphere— in medieval Europe, so he was another popular figure to attribute the plague to. However, you can’t really blame God in a moral sense for anything he does, so adherents to this view blamed themselves. So-called “flagellants” traveled from town to town lashing themselves in an attempt to appease God’s wrath and earn salvation. This was strange and radical even by medieval standards. Pope Clement thought the movement heretical enough to ban it in 1349 (Kieckhefer 1974).

 

 

What I cited

(These are all the scholarly sources. My guideline is that if I downloaded something as a pdf and consulted it, it’ll be here. Otherwise, it’s linked in the text of the post).

Botticini, M., & Eckstein, Z. (2003, January). From Farmers to Merchants: A Human Capital Interpretation of Jewish Economic History.

Botticini, M., & Eckstein, Z. (2004, July). Jewish Occupational Selection: Education, Restrictions, or Minorities? IZA Discussion Papers, No. 1224.

Cohn, S. (2002). The Black Death: End of a Paradigm. American Historical Review.

Cohn, S. (2007). The Black Death and the Burning of Jews. Past and Present, (196).

Gottfried, R. S. (1985). The Black Death: Natural and Human Disaster in Medieval Europe. Free Press.

Haverkamp, A. (2002). Geschichte der Juden im Mittelalter von der Nordsee bis zu den Südalpen. Kommentiertes Kartenwerk. Hannover: Hahn.

Jedwab, R., Johnson, N. D., & Koyama, M. (2018, April). Negative Shocks and Mass Persecutions: Evidence from the Black Death. SSRN.

Kieckhefer, R. (1974). Radical tendencies in the flagellant movement of the mid-fourteenth century. Journal of Medieval and Renaissance Studies, 4(2).

Lerner, R. E. (1981, June). The Black Death and Western European Eschatological Mentalities. The American Historical Review, 533-552.

Voigtländer, N., & Voth, H.-J. (2012). Persecution Perpetuated: The Medieval Origins of Anti-Semitic Violence in Nazi Germany. The Quarterly Journal of Economics, 1339-1392.

Did household appliances actually save time? Does this matter?

I used to have this idea that the introduction of electric appliances circa 1960 dramatically reduced the number of hours people, especially housewives, spent on domestic tasks. The thought is pretty intuitive. It’s quicker to wash clothes with a washing machine than with a tub and washboard. Dishwashers are more efficient than washing dishes by hand. In the span of a couple of years, I thought, American housewives gained much more free time. This idea fits nicely with the fact that women’s workforce participation increased in the 1960s and that the decade marked the beginning of second-wave feminism. Just perhaps, a decrease in the domestic load allowed women time to reflect on unjust norms and mobilize against them.

Unfortunately, the data do not support me. Household appliances are not the fiery instruments of social change I imagined them to be. There is no doubt they altered American domestic life, but they did not reduce the aggregate time spent on household duties.

Several time-use studies from 1925 to 1965 corroborate this.

Source: (Ramey 2009)

As we can see, total home production, or the amount of time housewives spent on domestic tasks, remained approximately constant from the 20s to the 60s. This is surprising considering home appliances diffused rapidly during this period. In 1925, fewer than 20% of American households had a washing machine. By 1950, more than 75% of them did (Bowden and Offer 1994).

However, the allocation of time did change between the decades studied. Hours spent on food preparation and care of clothing decreased and shifted toward general managerial tasks (purchasing, management, travel, and other). We can imagine appliances, and a greater reliance on store-ready food, contributing to decreases in food-prep time for the 1960s housewife. We can also imagine frequent trips to the grocer increasing the time spent on travel, for example.

Even the presence of basic utilities didn’t seem to decrease time spent on domestic duties. Time-use studies comparing black and white families in the 1920s reach a surprising result. Black housewives at the time, most without the luxury of a kitchen sink or running water, spent just as many hours on housework as their white counterparts. Both parties averaged around 53 hours a week. As it turns out, time spent on household production is not correlated with income (which is, most likely, correlated with the amount of technology in the home). How could this be? (Ramey 2009) explains:

[Since] lower income families lived in smaller living quarters, there was less home production to be done. Reid argued that apartment dwellers spent significantly less time in home production than families in their own houses; there was less space to clean and no houses to paint or repair. Second, there is a good deal of qualitative evidence that lower-income families produced less home production output. A home economist noted during that time “if one is poor it follows as a matter of course that one is dirty.” Having clean clothes, clean dishes, a clean house, and well-cared-for children was just another luxury the poor could not afford.

There are additional historical factors that explain why household production did not decrease with the introduction of appliances. The first views appliances as a substitute for another labor-saving device whose availability was dwindling: servants. In the early 20th century, it was common for middle-class housewives to hire help. Servants and maids assisted with cooking, cleaning, and caring for children, among other things. Foreign-born residents were usually employed in this capacity, but when immigration restrictions were imposed, domestic labor became scarce. The decline in servants and maids coincided with the rise of electrification and home appliances, allowing a single woman to accomplish what used to take a staff of two or three servants to do (Ramey 2009). Appliances, in this sense, compensated for the loss of a maid. Perhaps it used to take an hour for you and two servants to do the laundry. With a laundry machine, it’s now possible to do the laundry by yourself in an hour. Time spent in home production remains the same, but there are now fewer “inputs” to the process.

The second explanation appeals to changing standards. While appliances were being introduced to American families, new ideas about sanitation and nutrition were also spreading. Housewives learned they could positively influence the health and well-being of their family through their housework, so they opted to do more of it (Mokyr 2000). Even though they could have cleaned the house or cooked dinner much quicker than before, housewives decided to keep cleaning, or cook a more demanding meal, rather than take the time saved by appliances as leisure.

Do not fear. Hours spent in home production by women did begin to decrease near the end of the 1960s. Yet, this was not due to advances in labor-saving technology.

 

Source: (Ramey 2009)

Men began to contribute more. As a result, the average woman devoted only 30 hours per week to household production in 2005, compared to the ~50 hours she may have expended in 1900. However, total hours spent in home production per week, the sum of male and female hours, has not shifted much over the century. The average person today is likely devoting as much time towards domestic tasks as their ancestors did over 100 years ago.


I think there are potential implications here for thinking about the future. Many of us imagine that advances in technology will increase productivity dramatically, and as a result, we will be able to enjoy much more leisure. If we can accomplish in 20 hours what it used to take 40 hours to do, why work the additional 20 hours?

Yet, history suggests that advancements in domestic technology do not necessarily save us time. Their benefits roll over into things like increasing living standards before we see additional leisure. Standards for health and cleanliness may increase steadily as our capacity for nutritious meals and clean homes increases as well. Perhaps something vaguely similar to the immigrant/servant situation could happen. Innovations in household technology might decrease time spent in home production, but maybe our jobs become incredibly demanding, eating up any time saved by being able to cook or clean quicker. In both scenarios, leisure loses.

I don’t think this is disheartening news. Recall the black and white families studied in the 1920s. Even though housewives of both races spent approximately the same amount of time on household production, the white families were much better off. They enjoyed a higher standard of living due to basic utilities like running water and probably also other bits of technology present in their homes. Even though inputs, in terms of hours expended, are similar, the difference in outputs is astounding.

As a result, a more realistic version of the future might still have ~50 hours of home production for each household, but living standards that are much, much higher. Our health will be fantastic, we will conform to the highest standards of sanitation and hygiene, and we will unequivocally be better off, leisure be damned. We imagine an ideal future as a place with infinite leisure, but a society in which our standard of living is ten times as high while we put in the same amount of effort is still pretty damn good.

 

Works Cited:

Bowden, S., & Offer, A. (1994, November). Household Appliances and the Use of Time: The United States and Britain Since the 1920s. The Economic History Review, 725-748.

Mokyr, J. (2000). Why “More Work for Mother?” Knowledge and Household Behavior, 1870-1945. The Journal of Economic History.

Ramey, V. (2009, March). Time Spent in Home Production in the Twentieth-Century United States: New Estimates from Old Data. The Journal of Economic History, 1-47.

 

Is becoming a lawyer a good investment? (part 3 and conclusions)

In part 1, we described the hypothetical students used in this investigation and calculated the opportunity cost of going to law school for each of them. In part 2, we factored in tuition and university fees and arrived at the amount of income a new JD would need to earn to do “better” than their non-lawyer doppelgänger. Part 3 is where we draw conclusions. In order to do so, we’re going to revisit the concept of hurdle compensation that we developed in the last post.


Step 5: hurdle compensation, attribution, and expected salary

As mentioned above, a student’s hurdle compensation is the annual salary they would need to exceed in order to do “better” than they would have without a JD. While this figure tells us something about the costs of actually being a lawyer and how much one should earn to be better off financially in light of these costs, we can interpret it in another manner. Whatever one of our students earns in excess of their hurdle compensation is income directly attributable to their law degree.

Think about how the hurdle compensation was calculated. We took the expected fourth-year non-JD income of one of the students and added overtime pay. This is pay they could have earned at their hypothetical non-lawyer jobs, meaning anything in excess is a result of their law degree. The cost of living premium is still included even though it does not represent income a student could have earned in an alternate life. It accounts for the fact that some of their salary as a lawyer will be eaten up by higher-than-average housing costs.

Now, we examine each student’s expected post-JD income and see how it compares to their hurdle compensation.

Fortunately, we have good data on salaries for recent law graduates. For more than a decade, the National Association for Law Placement (NALP) has been surveying newly employed lawyers on their earnings and publishing the results. Schlunk used their 2008 report to do his analysis and I will use the 2018 data for mine.

 

[Figure: distribution of reported first-year lawyer salaries, NALP 2018 data]

 

As we can see, the distribution of lawyer salaries is heavily concentrated in two areas. The NALP reports that 49.6% of salaries reported fall in the $45,000-$75,000 range, while just over 20% are greater than $180,000. As you can guess, the salaries at the top of the distribution are those given to new biglaw associates. The NALP also notes that low salaries are under-reported, meaning the percentage of jobs paying in the $45,000-$75,000 range is probably higher.

Based on this graph, we will assume the average non-biglaw job pays $60,000 and the average biglaw one $185,000. Using the probability we stipulated in part 1 of each student landing a biglaw job, we can calculate their expected first-year salary and subtract their hurdle compensation to see how much income is attributable to their JD.
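To make that expected-value arithmetic concrete, here is a minimal sketch in Python. The biglaw probabilities are the ones stipulated in part 1, the hurdle figures are the approximate totals from Table 7 in part 2, and the two salary points are the assumptions above; the exact entries in Table 8 may differ slightly from these rounded inputs.

    # Expected first-year income and the portion attributable to the JD.
    # Salary points are the $60,000 / $185,000 assumptions above; biglaw
    # probabilities come from part 1; hurdle figures are the approximate
    # totals from part 2.
    BIGLAW_SALARY, OTHER_SALARY = 185_000, 60_000

    students = {
        "Also Ran":        {"p_biglaw": 0.20, "hurdle": 64_000},
        "Solid Performer": {"p_biglaw": 0.55, "hurdle": 99_000},
        "Hot Prospect":    {"p_biglaw": 0.90, "hurdle": 135_000},
    }

    for name, s in students.items():
        expected = s["p_biglaw"] * BIGLAW_SALARY + (1 - s["p_biglaw"]) * OTHER_SALARY
        incremental = expected - s["hurdle"]  # income attributable to the law degree
        print(f"{name}: expected ${expected:,.0f}, incremental ${incremental:,.0f}")

Run as-is, this puts Solid Performer’s expected income just under $129,000, consistent with the figure discussed below.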

 

Table 8

 

Thankfully, the results are positive. The expected income of each student exceeds their hurdle compensation, meaning that on average their degree confers some additional financial gain. However, we must note that averages can be deceiving. Solid Performer’s expected income is ~$129,000, but this is nowhere near what he will earn in the average biglaw or non-biglaw job. If he is unfortunate enough to find employment in the $45,000-$75,000 range (as nearly half of all new lawyers do), his annual earnings will not exceed his hurdle compensation of ~$99,000. His law degree may have made him eligible for high-paying biglaw jobs, but the alternatives are dismal given the costs. In the event he doesn’t score a biglaw job (a 45% chance), he will be worse off than if he had never gone to law school in the first place.

Regardless, we are going to treat the expected first-year law income as the actual earnings of our students, and their incremental income as the income actually attributable to the law degree, even if in specific cases the numbers may be much higher or much lower.

Step 6: discount rates

As of now, we know what a law degree will net you your first year of employment, but what about all subsequent years until retirement? Even if we did know that, how can we value future income in today’s terms to compare it to the costs we’ve incurred in the present?

The first question we’ll tackle with an assumption. In his calculations, Schlunk assumes a 3.5% yearly growth in salary over a 35 year law career to account for increases in productivity. I see no reason to disagree, so I’ll do the same. This ignores the possibility of our students making partner or coming across fat bonuses or raises, though.

Techniques exist to answer the second question. Valuing future income isn’t as straightforward as summing it all and saying this is how much it’s worth (unless you’ve made some uncommon assumptions). Future earnings are discounted and expressed in present-dollar terms in order to reflect the opportunity cost of not having the money now, or to account for some risk of not actually receiving it.

The latter reason is most relevant to our discussion. The incremental income attributable to the law degree is small relative to the entire salary, and it is volatile. Schlunk says law students often do not appreciate the career instability of many attorneys. Not everyone gets raises and sometimes law firms “blow up.” Even if you’re an in-house lawyer, you’re as subject to corporate downsizing as any other employee.

Additionally, our students aren’t content with merely breaking even on this investment. They want as close to a guarantee as possible they will not only recoup their initial costs but make positive returns. We can think about this as them wanting the investment to pay off over a very short time horizon, which is the equivalent of applying a high discount rate to future income. Money closer to the present is much, much more valuable to them. The more they make now, the less doubt there is that law school was a bust.

These two factors make it reasonable to apply relatively high discount rates to the future income attributable to the degree. To do our calculations, we will treat each student’s incremental income as an annuity disbursed once a year that grows at a rate of 3.5%. Below are the results for different discount rates.

Table 9
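For concreteness, here is a minimal sketch of the present-value calculation in Python. It treats each student’s incremental income as an ordinary growing annuity (first payment one year out, growing 3.5% a year for 35 years) and uses the rounded incremental figures from the step 5 sketch, so it approximates rather than reproduces Table 9; the sample discount rates are illustrative.

    # Present value of a growing annuity: payment c arrives at the end of
    # year 1 and grows at rate g each year for n years, discounted at rate r.
    def pv_growing_annuity(c, g, r, n):
        return sum(c * (1 + g) ** t / (1 + r) ** (t + 1) for t in range(n))

    GROWTH, YEARS = 0.035, 35
    # Rounded incremental incomes from the expected-salary sketch in step 5.
    incremental = {"Also Ran": 21_000, "Solid Performer": 29_750, "Hot Prospect": 37_500}

    for rate in (0.12, 0.17, 0.22):  # illustrative sample of discount rates
        for name, c in incremental.items():
            print(f"{name} at {rate:.0%}: ~${pv_growing_annuity(c, GROWTH, rate, YEARS):,.0f}")

At a 17% discount rate, for example, this values Solid Performer’s incremental income at roughly $217,000, in the neighborhood of the $219,629 figure discussed below.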

Step 7: conclusions and limitations

For context, I’ll reproduce the cost of attendance for each student.

Table 10

For all of the given discount rates, law school is not an attractive investment. Suppose Solid Performer personally discounts future income at 17%; in other words, $100 today is worth just as much to him as $117 a year from now. If he were to attend law school under our assumptions, he would be paying $335,579 (total cost) for something that is in fact only worth $219,629 to him. Likewise, if Hot Prospect is under the impression that 12% is the correct rate (which is still significantly below Schlunk’s recommendations), she would be paying $422,140 for something that is worth $409,471. Each student would only break even if they convinced themselves that something a little under 12% was the appropriate discount rate.
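To see where that break-even point comes from, here is a small bisection sketch under the same growing-annuity assumptions as above, again using my rounded incremental-income figure, so the result is approximate.

    # Find the discount rate at which the present value of the incremental
    # income equals the total cost of attendance (same helper as in step 6).
    def pv_growing_annuity(c, g, r, n):
        return sum(c * (1 + g) ** t / (1 + r) ** (t + 1) for t in range(n))

    def break_even_rate(incremental, total_cost, g=0.035, n=35):
        lo, hi = 0.0, 1.0
        for _ in range(60):  # bisection: PV falls as the discount rate rises
            mid = (lo + hi) / 2
            if pv_growing_annuity(incremental, g, mid, n) > total_cost:
                lo = mid  # PV still too high, so the break-even rate is higher
            else:
                hi = mid
        return lo

    # Approximate incremental income and total cost for Solid Performer.
    print(f"{break_even_rate(29_750, 335_579):.1%}")  # a little under 12%

Plugging in Hot Prospect’s figures gives a similar rate, which is consistent with the conclusion above.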

Does this mean law school is a poor investment in all circumstances? Definitely not. One thing our analysis neglects is the impact of financial aid and scholarships on total cost. Many law schools offer considerable assistance to students, with more than half of their enrollment receiving some type of merit or need-based funding. Clearly, the conversation surrounding law school changes significantly when you’re on a full ride versus paying entirely out of pocket.

It’s also worth repeating how averages can be deceiving. Our expected yearly income figure used to calculate the present value of a law degree captures only the average outcome of each student. We’ve assumed Solid Performer either earns $60,000 or $185,000 out of law school, but used something in between these two numbers to arrive at the salary we claim he would have made after graduation. If Solid Performer gets the biglaw job and earns the $185,000, his degree is certainly a much better financial investment, but that is not reflected in our outcomes.

The next set of limitations has to do with your author. I am an undergraduate with some time on his hands, not Herwig Schlunk, a law professor with more degrees than everyone in my family combined. The assumptions I make in this series of posts without Schlunk’s guidance are at best educated guesses and at worst ignorant stipulations. What’s more, even though I am trying my best to follow Schlunk’s analysis closely, his paper is nearly a decade old and I do not have the expertise to know to what extent his methods and assumptions are still accurate. This is an amateur attempt at something only a professional can treat with the necessary discernment.

Lastly, I expect some to take issue with the scope of this inquiry. I’ve tried to shed light on the narrow question of whether or not law school is a good financial investment, but there are clearly other reasons to be a lawyer. As mentioned in the intro to part 1, one can be motivated to practice law by the pursuit of justice or their personal vocation. I will not speculate on the ultimate reason a reader may want to pursue a JD, but our analysis suggests it should not be to make a sound investment.


 

If you got this far, thanks for reading.

I highly recommend reading Schlunk’s original paper. While my goal was to reproduce his method with current numbers, it is almost certain I erred in some respect or missed an insightful point hidden in the original. Do give it a look for a more comprehensive understanding of how he went about answering the question.

Also, here is the Excel sheet I used to calculate everything. You can plug in different annual incomes or discount rates to get a feeling for how the results might have turned out otherwise with different assumptions.

If you have questions/concerns/feedback regarding this series of posts, feel free to leave a comment or send me an email.

riley[at]rileywilson[dot]net

Is becoming a lawyer a good investment? (part 2)

In the last post, we described the hypothetical students we are using for this investigation and considered the type of income they would be missing if they chose to go to law school. Now, we’ll add tuition to the total cost and see how much each would need to earn as lawyers to “break even,” so to speak.


Step 3: tuition and totals

Tuition varies greatly between institutions. In the most egregious of circumstances, you can expect to pay up to $77,000 a year at a fancy private law school. Yet, public law schools exist that only charge around $22,000.

From my amateur research, tuition seems to increase with the prestige of the institution. Accordingly, Also Ran, Solid Performer, and Hot Prospect will probably pay different amounts. To account for the variation, I lifted tuition information from three law schools whose standards correspond to the stated undergraduate performance of our hypothetical students. For Also Ran, a school nearly outside the top 50 in the rankings. For Solid Performer, a school hovering around the 20 mark. Hot Prospect gets the big name T14 law school with the big name prices.

I also added a couple thousand dollars extra to yearly expenses to account for books and university fees.

Table 4

These numbers are eye-popping. For context, let’s assume Solid Performer takes out loans to cover the entirety of his expenses. If he secures a 6.08% interest rate (the federal graduate fixed rate) and makes monthly payments of around $1,500, it would still take him about 15 years to pay them off.
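As a sanity check on that repayment estimate, here is a minimal amortization sketch in Python. The $175,000 principal is a hypothetical stand-in for three years of tuition and fees (the actual totals are in Table 4 above), so the exact payoff time depends on the real figure.

    import math

    def months_to_repay(principal, annual_rate, monthly_payment):
        """Months needed to retire a loan with a fixed monthly payment."""
        r = annual_rate / 12
        # Standard amortization formula: n = -ln(1 - P*r/m) / ln(1 + r)
        return -math.log(1 - principal * r / monthly_payment) / math.log(1 + r)

    # Hypothetical principal standing in for Solid Performer's borrowed tuition.
    years = months_to_repay(175_000, 0.0608, 1_500) / 12
    print(f"~{years:.0f} years")  # roughly 15 years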

Adding these totals to the opportunity cost figures is truly frightening.

Table 5

We’ll soften the blow a bit by taking into account summer employment. Many law students are able to land lucrative summer positions that can lower the net cost of law school. In 2009, Schlunk estimated that Also Ran, Solid Performer, and Hot Prospect could each earn approximately $5,000, $7,500, and $10,000 respectively during a summer.

Rather than plug these figures into an inflation calculator, I decided to look for current data. ZipRecruiter has this helpful table on their website.

[Table: ZipRecruiter estimates of summer earnings by percentile]

I think it’s safe to assume Also Ran, Solid Performer, and Hot Prospect have summer earning potentials around the 25th, 50th, and 75th percentiles, respectively. Subtracting two summers of compensation yields the following total cost figures.

Table 6

Roughly, this is how much it costs to go to law school. However, and I hadn’t actually thought about this, there are also significant costs to being a lawyer. We examine those next.

Step 4: hurdle compensation

If you become a lawyer, it would be nice to earn more than what you would have without a JD. Financially, it would be a disaster if you invested $300,000 and three years of your life to end up with the same earning power as a similar individual who did not go to law school.

As a result, we will attempt to quantify the amount you would need to earn in order to be doing better than your non-lawyer doppelgänger. This is different from just looking at the fourth-year hypothetical yearly wages calculated in Table 2 of part 1. As mentioned above, actually being a lawyer entails sacrifices that ideally you would be compensated for. Imagine you make $5,000 more annually than your doppelgänger. However, you also have to live in a more expensive city than they do and pay $5,000 more in housing. Effectively, you make as much as the doppelgänger when the additional costs are considered. If you want to “actually” make $5,000 more than your twin in this scenario, you would have to be compensated for the $5,000 you spent in excess of what you would have. Thus, you would have to make $10,000 more than them.

We’ll call this figure we arrive at after adding the additional costs associated with being a lawyer “hurdle compensation.” In our toy example above, the hurdle compensation would have been whatever the non-lawyer doppelgänger had made plus the expenses tied to being a lawyer.

We will consider the cost of housing and overtime pay in calculating hurdle compensation for our hypothetical students.

Lawyers, especially high earning ones, are concentrated in a few American cities. New York, San Francisco, Los Angeles, and Washington D.C. all have ample big-shot lawyer populations and sky-high costs of living. As a result, if one of our students gets a biglaw job, they will most likely have to move to an expensive city.

Lawyers are also notoriously overworked. It’s not uncommon to only bill (charge clients for) 40 hours a week but actually work 60 or 70 hours at large law firms. Our budding lawyers should be compensated for this overtime if they so happen to get a biglaw job.

As with state taxes, I’m not even going to try and figure out an exact cost of living premium owing to the variation across cities. Schlunk assumes an additional 10% of total income is an accurate premium, so I’m going to go with that.

Getting overtime premiums involves less hand-waving and more calculation. We’ll assume the average worker puts in 2,000 hours a year (40-hour weeks), and any hours over this are overtime. We’ll also stipulate that any lawyer has to put in 200 more hours a year than average. Thus, a regular lawyer works 2,200 hours a year.

Now, it’s not unreasonable to assume that while an average lawyer is expected to work 2,200 hours, biglaw lawyers work 2,400 (46-hour weeks).

The going overtime rate is 1.5x your average hourly rate. However, each additional overtime hour is more valuable, as it is taken from an ever-decreasing pool of your free time. Thus, Schlunk and I assume the first 200 overtime hours should be valued at 1.5x one’s average hourly rate, but the next 200 should be 2x.
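As a quick sketch of that overtime valuation in Python (the base hourly rate is left as a parameter, since exactly which salary it should be derived from is a detail glossed over here):

    def biglaw_overtime_pay(hourly_rate, tier1_hours=200, tier2_hours=200):
        """Value 400 hours of biglaw overtime: the first 200 hours at 1.5x the
        average hourly rate, the next 200 at 2x."""
        return tier1_hours * 1.5 * hourly_rate + tier2_hours * 2.0 * hourly_rate

    # Example with a hypothetical $35/hour average rate:
    print(biglaw_overtime_pay(35))  # 24500.0

In other words, the 400 overtime hours are worth 700 times the hourly rate, or 35% of a 2,000-hour salary.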

Based on our assumptions about the probability of each student getting a biglaw job, Also Ran will need the cost of living premium and 400 hours of overtime pay 20% of the time, Solid Performer 55% of the time, and Hot Prospect 90% of the time.

Here are the totals:

Table 7

As we can see, our three students must make more than approximately $64,000, $99,000, and $135,000 respectively in order to “do better” than their non-lawyer counterparts.

However, these figures do not tell the entire story. In the next part, we’ll consider discount rates and conclude whether the JD was a good investment.

Is becoming a lawyer a good investment? (part 1)

Like many in unmarketable majors, I’ve briefly toyed with the idea of becoming a lawyer. Not necessarily because I have a deep interest in the law, but because going to law school is the ultimate vindication for your humanities degree.

My personal interest passed quickly, but curiosity lingered about the profession. Why do people become lawyers? Is it really as miserable as I’ve heard? Is it even a sound financial decision?

Thankfully, someone has already answered the last question. I recently came across a paper from law professor Herwig Schlunk on whether going to law school is a good investment (spoiler alert — it isn’t).

The paper is a little outdated, being written in 2009, but I wanted to try and reproduce his analysis as best I can with current figures to see if the conclusions change. This post is the first in a series where I do just that. I’ll be following his steps as closely as possible, but will not be comparing my end results to his.

Note that Schlunk, and this series of posts, is attempting to answer the question of “should I go to law school?” from a purely financial perspective. Money aside, people might attend because they feel they could be good lawyers, or they want to contribute to a more just society. Non-monetary reasons are certainly valid, even applauded, but the costs of any type of graduate school should be considered before a prospective student writes the check or takes out the loan. Schlunk acknowledges that becoming a lawyer confers numerous benefits beyond increased earning power, but as he puts it, “you can’t eat prestige.”


Step 1: the students

Because the answer to any major life decision is highly particularized, it’s foolish to perform one set of calculations trying to settle the matter and claim the results apply to everyone. In an attempt to be less foolish, Schlunk stipulates three hypothetical undergraduates with different backgrounds and considers their situations in parallel. (Note: I’m borrowing Schlunk’s names for the students out of convenience).

Let’s meet them.

Also Ran is an undergraduate at a middle-of-the-pack university. He achieves above average grades in a relatively nonmarketable major and could have earned $47,000 in a non-legal job after graduation. He, by Schlunk’s account, “claws his way” into a second/third tier law school and has about a 20% chance of getting a lucrative “biglaw” job after graduation.

Solid Performer went to a better college and made good grades in a more marketable major (think economics vs English). He makes his way into a mid-tier law school but could have earned $66,000 had he chosen to jump into the workforce. As a JD, he has a 55% chance of getting a cushy biglaw job.

Last, we have Hot Prospect. She is the most conventionally successful of the bunch, having a stellar academic record in a marketable major (CS/math/engineering) at an elite undergraduate institution. She gets into another elite university for law school but could have made $83,000 in her first year out of undergrad. However, biglaw jobs aren’t a certainty for anyone, including her. She has a 90% chance of getting one after law school.

Step 2: opportunity cost

If someone decides to go to law school, or pursue any other type of post-graduate education, they are missing out on potential wages. Of course, not all of what you make goes straight into your pocket. In the table below, I subtract various taxes from the salaries each student could have made in their first year of employment. The FICA tax rate I used was 6.2%, and all three students happened to fall in the same Federal income tax bracket given their first year salaries.

State taxes are trickier as schemes vary wildly across the nation. States like Minnesota and California have a multi-tiered system with differing marginal tax rates. Others, like Massachusetts and Utah, have a single rate for all income. To further complicate things, Texas, Nevada, Washington, and Florida have no state income tax at all.

To simplify my calculations, I’ll make a move similar to what Schlunk did and assume a flat 4.5% state tax rate. This might significantly over- or understate the amount of taxes you would pay in most states, but it’s close to accurate if you’re living in Illinois, for example, with its 4.95% flat rate.

Table 1

Based on the assumptions above, this is the opportunity cost for each student of going to law school for a single year. Note that it typically takes three years of schooling to get the degree. I could multiply each figure by three, but that wouldn’t account for raises in pay commensurate with increases in productivity. In his calculations, Schlunk accounts for this by bumping pay 3.5% a year but acknowledges this figure might be too low. I assumed a 4% growth in yearly wages and ran the numbers again.

Table 2

The hypothetical fourth year is included to illustrate what kind of earnings a potential JD could expect her fourth year post-undergrad had she joined the workforce instead of going to law school. Later, we will compare it to what she would likely earn as a freshly-minted lawyer.

Now, the (rough) financial opportunity cost can be obtained by summing the first, second, and third year after tax incomes.
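Here is a minimal sketch of that arithmetic in Python. It applies 6.2% FICA, the flat 4.5% state rate, and 2019 single-filer marginal federal rates to gross salary with no deductions, then grows pay 4% a year and sums three years; Table 1 may treat deductions slightly differently, so treat the outputs as ballpark figures.

    # After-tax opportunity cost of three years of forgone work, with 4% raises.
    # 2019 single-filer brackets, truncated above 24% (enough for these salaries).
    FEDERAL_BRACKETS = [(9_700, 0.10), (39_475, 0.12), (84_200, 0.22), (float("inf"), 0.24)]

    def federal_tax(income):
        tax, lower = 0.0, 0.0
        for upper, rate in FEDERAL_BRACKETS:
            tax += rate * max(0.0, min(income, upper) - lower)
            lower = upper
        return tax

    def after_tax(gross, fica=0.062, state=0.045):
        return gross - fica * gross - state * gross - federal_tax(gross)

    def opportunity_cost(starting_salary, growth=0.04, years=3):
        return sum(after_tax(starting_salary * (1 + growth) ** t) for t in range(years))

    for name, salary in [("Also Ran", 47_000), ("Solid Performer", 66_000), ("Hot Prospect", 83_000)]:
        print(f"{name}: ~${opportunity_cost(salary):,.0f}")

Under these assumptions, the totals land in the same ballpark as the $110,000-$185,000 range quoted below.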

Table 3

Already, law school isn’t looking good. The financial benefits must be large in order to justify passing on $110,000-$185,000. Unfortunately, this is only the opportunity cost. In the next part, we will consider the greatest explicit expense of getting a JD.