1) Really appreciated this take on Ukraine from Fareed Zakaria:
Last week, I outlined Russia’s interests and strengths in this crisis. It is vital to understand its weaknesses as well. “When Putin took Crimea in 2014, he lost Ukraine,” as Owen Matthews writes in a thought-provoking essay. After it declared independence in 1991, Ukraine was divided between an unabashedly pro-Russia segment of its population and a more nationalistic one. But by annexing Crimea and plunging eastern Ukraine into open conflict, Matthews writes, Putin has energized Ukrainian nationalism and fed a growing anti-Russia sentiment. And the math does not help. Putin took millions of pro-Russia Ukrainians in Crimea and Donbas out of the country’s political calculus. (Those in Donbas don’t vote in Ukrainian elections because the area is too unstable.) As a result, a Ukrainian politician estimated to me that the pro-Russia seats in Ukraine’s parliament have shrunk from a plurality to barely 15 percent of the total.
In retrospect, if Putin’s aim were to keep Ukraine unstable and weak, it would have made far more sense to leave those parts of Ukraine within the country, supporting the pro-Russia forces and politicians in various ways so that they could act as a fifth column within the country, always urging Kyiv to forge closer ties with Moscow. Instead, Ukraine is now composed mainly of a population that is proudly nationalist and that has become much more anti-Russia.
2) Here’s a great idea: “Buying masks should be as easy as buying sunscreen”
Imagine trying to buy sunscreen without knowing how much the product will protect you from the sun. Thankfully, your decision is made easy. All you need to look for is one number on the package — the sun protection factor, or SPF.
Buying masks to protect you from covid-19 can — and should — be just as easy. That’s why we propose a government-backed program that would evaluate and label masks for consumers like the Food and Drug Administration does for sunscreens.
Not all masks are equal, and with so many options available online and in stores it’s easy for consumers to be confused about which to select. The Centers for Disease Control and Prevention recently attempted to provide clarity, recommending high-quality face coverings, such as N95s or KN95s, to help fight the very transmissible omicron coronavirus variant…
These challenges in identifying high-quality protective face coverings underscore the urgent need for a federal consumer mask certification and labeling program, as we have previously written. The science is clear: The wearing of masks is essential for stopping the spread of the virus that causes covid-19. Any mask is better than no mask, but such a federal certification and labeling program would allow Americans to make informed choices about their mask’s protection level with ease and confidence.
Our proposed program would create benchmarks for each mask’s level of particle filtration, leakage amounts, comfort, fit and breathability, among other criteria. Establishing these standards would provide consumers with an easy-to-use rating system for comparing masks and the ability to choose products based on an evaluation of their performance.
Manufacturers would need to clearly display these rankings on their masks and respirators. Additionally, product packaging should include guidance about how to properly wear the mask, whether it can be reused and how to clean it. Without such a system in place, the public will remain unaware and possibly unprotected. The mask they have purchased might provide 95 percent protection against virus particles, or only 10 percent.
The U.S. government should also prioritize an initiative to design next-generation N95 masks. These should be comfortable to wear, have a particle filtration rate above 95 percent and low leakage levels, and be made of advanced, lightweight materials. The masks must be designed in different sizes and styles for women, men and children.
3) Not great, “Bird Flu Is Back in the US. No One Knows What Comes Next: The fast-moving pathogen, which has already invaded Europe, was found in East Coast ducks. The last outbreak that tore through the US killed 50 million birds.”
Three birds out of the millions that American hunters shoot each year might seem like nothing—but the findings have sent a ripple of disquiet through the community of scientists who monitor animal diseases. In 2015, that same strain of flu landed in the Midwest’s turkey industry and caused the largest animal-disease outbreak ever seen in the US, killing or causing the destruction of more than 50 million birds and costing the US economy more than $3 billion. Human-health experts are uneasy as well. Since 2003, that flu has sickened at least 863 people across the world and killed more than half of them. Other avian flu strains have made hundreds more people ill. Before Covid arrived, avian flu was considered the disease most likely to cause a transnational outbreak.
It is far too soon to say whether the arrival of this virus in the US is a blip, an imminent danger to agriculture, or a zoonotic pathogen probing for a path to attack humanity. But it is a reminder that Covid is not the only disease with pandemic potential, and of how easy it is to lose focus when it comes to other possible threats. The possibility of a human- or animal-origin strain of flu swamping the world once seemed so imminent that back in 2005 the White House wrote a national strategy for it. But researchers say the surveillance schemes that would pick up its movement have since been allowed to drift.
4) I really love Chloé Valdary’s approach to anti-racism training.
She has done away with unconscious bias training, segregating co-workers by race, and placing blame on abstract “systems.” Instead, she promotes stoicism and a self-love that leads to community love.
“Enchantment . . . is a state of being where you’re in a healthy relationship with yourself, which allows you to have a healthy relationship with others,” Valdary told me. “If we want to teach people how to love, we have to ask what people are already in love with. That’s why I use pop-culture references to reinforce my teachings.”
Her trainings promote three core principles: Treat people like human beings instead of political abstractions; criticize to uplift and empower rather than tear down or destroy; and root everything you do in love and compassion — harking back to the Christian principles of Martin Luther King, Jr.
5) Loved this from deBoer, "Human Capital is Real, and Some People Are Smarter Than Other People: Until we acknowledge that, there can be no coherent discussion of education"
When I set out to write my book, I knew the idea of intrinsic or inherent academic talent, an innate predisposition to succeed or fail, would be controversial, and was prepared for that controversy. The repeated reassurances that the book rejected race science, which annoyed some readers so deeply, were in part an attempt to ward off deliberate misunderstandings of what I was saying. (That is, that individual talents can vary thanks to genetics without that implying that group differences are genetic.) What I was consistently surprised by, though, was the number of people who responded to my book by insisting that there is no such thing as a summative difference in intelligence or academic ability – that is, that not only are there no inherent predispositions towards being good or bad at school, no one even becomes better or worse, no one is smarter than another. There are no measurable differences in what we know or can do intellectually. Or, in some tellings, no one knows what smart is, it’s some sort of ineffable quality we can’t pin down, or the very idea of “smart” is a racist Western imperialist hegemonic heteronormative con.
I find this all unhelpful. Narrow down as specifically as you can and no one can persist in denying that there are differences in summative ability. Can anyone really claim that I can do calculus as well as a math professor who teaches it? Because I can't do calculus at all! Of course people have different things they know and understand and can do intellectually. I'm not naturally talented at math. I don't like it but it's true. And easily quantifiable. If there were no such distinctions school would not exist.
But the will to obscure this fact is strong. In many fields, the academics at the top are busily abstracting and mystifying success, the better to insist that no one is bad at what you study. (My old field, writing studies, is filled with academics who believe there is no such thing as being better or worse at writing, which makes you wonder why anyone is paying their salaries.) Every day academics declare that grades are a capitalist plot, tests evil, and the very idea of assessment offensive. But there really are things that you can know and not know in life, and some of them, such as reading, are really important. And in fact we are very good indeed at creating instruments that measure whether you can read or write or do algebra. It's just that their results are socially inconvenient.
If the concern is saying that there are attributes and abilities in life that matter that are not academic or connected to intelligence, and that they should be taken seriously and rewarded, the news is good, as this is perhaps the core argument of my book. If the concern is saying that being smart is an unhealthy obsession in our society and too essential to having material security, the news is good, as that is perhaps the other way to state the core argument of my book. But I don’t understand why we would pretend that academic or intellectual ability doesn’t exist, and act as though that attitude is a prerequisite to be a progressive person who desires equality of rights, dignity, and human value. As I never get tired of pointing out, traditional left thinkers like Marx never pretended that all of us are equal in our abilities. (“From each according to his abilities” implies the opposite!) What the left pushes for is equality of human value, including across – perhaps especially across – differences in talent. Equal value, equal dignity, and equal right to demand the minimum conditions needed for human flourishing.
We can lawyer about the concept of intrinsic ability as much as we want. (For the record, acknowledging that genes and environment both play important roles in education, and that there are complex interactions between them, does not imply that outcomes are therefore mutable.) We live in a world where some people can do things, intellectually, that are monetarily rewarded and socially valuable, and some people can't. Our attempts to spread these abilities universally have been an abject failure. Because each of us has a nature, and while we're all good at something, we're not all good at the same things, and capitalism most certainly does not reward all gifts equally, and so much the worse for us. (Indeed, this is the very reason redistribution is necessary.) Yes, intelligence is multivariate and complex and exists in many dimensions. But so is love, and no one pretends that love therefore does not exist. We are already asking the impossible of our education system, expecting it to reward excellence and create equality at the same time. Let's not burden it even further by pretending we don't know some people are better and some are worse at school.
6a) The discourse around Maus is just ridiculous. Is it super-dumb to remove an outstanding book from the curriculum because it has a naked humanoid mouse and uses some bad language?! Yes. But that's not book banning! But liberals can be just as tribal and credulous, so my social media feed was filled with "book banning" posts and Nazi comparisons. Sorry, but dumb does not equal fascism. Perhaps even worse is to see this making its way into journalism. Here are the first two sentences from an AP story with my emphases:
A Tennessee school district has voted to ban a Pulitzer Prize-winning graphic novel about the Holocaust due to “inappropriate language” and an illustration of a nude woman, according to minutes from a board meeting.
The McMinn County School Board decided Jan. 10 to remove “Maus” from its curriculum, news outlets reported.
6b) Drum on Maus:
Our story so far: East Bumfuck County¹ in Tennessee—about 20 miles away from the site of the Scopes monkey trial—has banned Maus, a Pulitzer-winning graphic novel about the Holocaust. Outrage is universal.
But rural, conservative school districts have been doing this kind of stuff forever. They don't like sexual themes. They don't like nudity. They don't like swearing. Maus is an adult novel that features all of these things. But there are lots of other novels and nonfiction books about the Holocaust. If Maus is too raw for them, how about recommending something else instead of getting dragged into the usual pointless culture war squabble? Wouldn't that be a better use of time?
What the hell?! In what actual use of the English language is "remove from curriculum" the same as "ban"? Words have meaning, dammit! They are not just there to make liberals feel good about backwards conservatives.
7) Loved this point from Brian Beutler:
As Republicans escalate their efforts to discourage vaccine uptake, keep in mind: GOP officials turned against vaccines only after Trump lost; Murdoch media in countries with conservative governments are pro-vaccine; Rupert Murdoch himself took a convoy to become one of the world’s first vaccinated people, and Fox has an internal vaccination requirement the talent doesn’t talk about. In other words: It couldn’t be more clear that the purpose is to harm U.S. society (by spreading disease and killing their own supporters) to make the country ungovernable while a Democrat is president.
8) I missed this before, but it’s really pretty stunning and eye-opening, “The Atlantic Slave Trade in Two Minutes: 315 years. 20,528 voyages. Millions of lives.”
Usually, when we say “American slavery” or the “American slave trade,” we mean the American colonies or, later, the United States. But as we discussed in Episode 2 of Slate’s History of American Slavery Academy, relative to the entire slave trade, North America was a bit player. From the trade’s beginning in the 16th century to its conclusion in the 19th, slave merchants brought the vast majority of enslaved Africans to two places: the Caribbean and Brazil. Of the more than 10 million enslaved Africans to eventually reach the Western Hemisphere, just 388,747—less than 4 percent of the total—came to North America. This was dwarfed by the 1.3 million brought to Spanish Central America, the 4 million brought to British, French, Dutch, and Danish holdings in the Caribbean, and the 4.8 million brought to Brazil.
Wow! I need to know more about this history of Brazil.
9) This Scott Alexander post was terrific, but it was hard to pick just the right excerpt. Do yourself a favor and read the whole thing. Seriously.
There were some historians who praised the Marx article and said nice things about it. But they were all explicitly socialist historians, and they were all studying time periods other than the one containing Lincoln and Marx. So this probably doesn’t completely discredit all expertise. Meanwhile, actual statisticians and election security experts said pretty clearly they thought the election was fair, even when this was in their domain of expertise.
Finally, the Marx thing was intended as a cutesy human interest story (albeit one with an obvious political motive) and everybody knows cutesy human interest stories are always false.
All of this is a lot more complicated than “of course you can trust the news” or “how dare you entertain deranged conspiracy theories!” There are lots of cases where you can’t trust the news! It sucks! It’s completely understandable that large swathes of people can’t differentiate the many many cases where the news lies to them from the other set of cases where the news is not, at this moment, actively lying. But that differentiation is possible, most people learn how to do it, and it’s the main way we know anything at all.
10) Maybe meat-eating was not central to human evolution:
For decades, paleontologists have theorized that the evolution of humanlike features and meat eating are strongly connected.
“The explanation has been that meat-eating allowed this: We got a lot more nutrition, and these concentrated sources facilitated these changes,” Pobiner says. Large brains are phenomenal energy hogs—even at rest, a human brain consumes about 20 percent of the body’s energy. But a switch to a diet full of calorie-rich meat meant an excess of energy that could be directed to supporting larger, more complex brains. And if prehumans hunted their food, that would explain a shift toward longer limbs that were more efficient for stalking prey over great distances. Meat made us human, the conventional wisdom said. And Pobiner agreed.
But in April 2020, Pobiner got a call that made her rethink that hypothesis. The call was from Andrew Barr, a paleontologist at George Washington University in Washington, DC, who wasn’t totally convinced about the link between Homo erectus and meat-eating. He wanted to use the fossil record to check whether there really was evidence that human ancestors were eating more meat around the time Homo erectus evolved, or whether it simply appeared that way because we hadn’t been looking hard enough. Pobiner thought this sounded like an intriguing project: “I love the idea of questioning conventional wisdom, even if it’s conventional wisdom that I buy into.”
The researchers were unable to travel to Kenya for fieldwork because of the pandemic, so instead they analyzed data from nine major research areas in eastern Africa that cover millions of years of human evolution. They used different metrics to assess how well-researched each time period was, and how many bones with butchery marks were found in each site. In a new paper in the journal Proceedings of the National Academy of Sciences (PNAS), Barr and Pobiner now argue that the link between meat-eating and human evolution might be less certain than previously thought. The apparent increase in butchered bones after the appearance of Homo erectus, they conclude, is actually a sampling bias. More paleontologists went looking for bones at dig sites from this era—and as a result, they found more of them.
This doesn't rule out a link between meat-eating and evolutionary change, but it does suggest that the story might be a little more complicated. "If we want to say how common a behavior was, then we need some way to control for the fact that at some points in time and at some places we've looked harder for that behavior than we have at other points," says Barr. Because sites with well-preserved animal bones are relatively rare, paleontologists often sample them over and over again. But Barr and Pobiner's study found that other sites that date from between 1.9 and 2.6 million years ago—the era during which Homo erectus evolved—have been relatively under-studied. "We are drawn to places that preserve fossils because they're the raw material of our science. So we keep going back to these same places," Barr says.
For Barr, the new study’s results point to a gap in the paleontological record that needs to be filled in. It might be that other factors were responsible for the evolution of humanlike traits, or it might be that there was a big increase in meat-eating in an earlier period that we just haven’t been able to see yet. “At some point there is no evidence for butchery, and at some point there’s a lot of evidence. And something had to happen in between,” says Jessica Thompson, an anthropologist at Yale University.
11) This is from 2010 (Yglesias just linked to it), but lots of cool charts, “Verbal vs. mathematical aptitude in academics”
Some observations:
– Social work people have more EQ than IQ (this is not a major achievement because of the scale obviously).
– Accountants never made it into the “blue bird” reading group.
– Philosophers are the smartest humanists, physicists the smartest scientists, economists the smartest social scientists.
12) I think Michelle Goldberg is right about this, "Let Kids Take Their Masks Off After the Omicron Surge"
Elissa Perkins, the director of infectious disease management in the emergency department of the Boston Medical Center, told me she spent most of 2020 “imploring everybody I could in every forum that I could to mask.” In the beginning, she said, this was to flatten the curve, and later to protect the vulnerable. But masking, she said, “was intended to be a short-term intervention,” and she believes we haven’t talked enough about the drawbacks of mandating it for kids long-term.
“If we accept that we don’t want masks to be required in our schools forever, we have to decide when is the right time to remove them,” she said. “And that’s a conversation that we’re not really having.”
At least, people in deep blue areas weren’t having it until recently. But as the Omicron wave begins to ebb, that conversation — sometimes tentatively and sometimes acrimoniously — has begun. This week Perkins co-wrote a Washington Post essay calling for schools to make masking optional. The Atlantic published an article titled, “The Case Against Masks at School.”
“Coming off the Omicron surge, I think there’s going to be a tipping point with more and more people questioning does this need to continue in schools,” said Erin Bromage, an associate professor of biology at the University of Massachusetts, Dartmouth. Bromage worked with the governor of Rhode Island to reopen schools there, and later helped schools in southern Massachusetts reopen. He believes in the importance of Covid mitigations, but his views on school masking have evolved in recent months. There comes a point, he said, “at which the reduction in risk that comes from the mask is balanced or begins to be outweighed by the detrimental side of things that come with masking.”
The debate about masks in schools can quickly turn vicious because it pits legitimate interests against one another. Many people who are immunocompromised, or live with those who are, understandably fear that getting rid of mandates will make them more vulnerable. But keeping kids in masks month after month also inflicts harm, even if it’s not always easy to measure.
“I think it would be naïve to not acknowledge that there are downsides of masks,” said Perkins. “I know some of that data is harder to come by because those outcomes are not as discrete as Covid or not-Covid. But from speaking with pediatricians, from speaking with learning specialists, and also from speaking with parents of younger children especially, there are significant issues related to language acquisition, pronunciation, things like that. And there are very clear social and emotional side effects in the older kids.”
That’s why I believe that mandatory school masking should end when coronavirus rates return to pre-Omicron levels. I fully accept that, in future surges, masks might have to go back on, but that’s all the more reason to get them off as soon as possible, to give students some reprieve.
13) As my Twitter followers know, I've joined the Wordle crowd. I'm generally not a word game person at all. But I love the simplicity and the minimal time commitment and the fact that I'm usually pretty decent despite generally being bad at word games. This was interesting, "Twitter boots a bot that revealed Wordle's upcoming words to the game's players"
14) I just ordered a squeeze nasal irrigator rather than a neti pot, but next time I'm congested, saline irrigation here I come:
To the uninitiated, the neti pot may seem like yet another wellness trend. After all, the teapot-like vessel was popularized in the United States by the celebrity surgeon Dr. Mehmet Oz, who called it a “nose bidet” on “The Oprah Winfrey Show” and has been criticized for promoting unproven supplements and health products.
Rinsing warm saltwater through your nose — in one nostril and out the other — as an antidote for a variety of woes like sinus inflammation, congestion and allergies may seem strange and possibly scary, especially if you've heard about its links to rare but deadly brain-eating amoeba infections.
But according to ear, nose and throat doctors, nasal rinsing, which traces back thousands of years to the Ayurvedic medical traditions of India, is an unusual example of a practice that is at once ancient, trendy and evidence-based. And, it’s safe and inexpensive to boot.
It has a “very, very high level of evidence, randomized controlled trial evidence, that shows that it does work and it does help,” said Dr. Zara Patel, an associate professor of otolaryngology at the Stanford University School of Medicine. Here’s what we know.
15) Among the other impacts of having a young child… lower voter turnout:
Despite evidence that infants affect families’ economic and social behaviors, little is known about how young children influence their parents’ political engagement. I show that U.S. women with an infant during an election year are 3.5 percentage points less likely to vote than women without children; men with an infant are 2.2 percentage points less likely to vote. Suggesting that this effect may be causal, I find no significant decreases in turnout the year before parents have an infant. Using a triple-difference approach, I then show that universal vote-by-mail systems mitigate the negative association between infants and mothers’ turnout.
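The abstract's triple-difference logic can be sketched with a toy calculation. All turnout rates below are invented for illustration, not taken from the paper, and the sketch collapses the paper's third (before/after) dimension into a simple difference-in-differences to show the direction of the comparison:

```python
# Hypothetical illustration: does universal vote-by-mail (VBM)
# shrink the turnout penalty associated with having an infant?
# All turnout rates below are made up for illustration only.

turnout = {
    # (has_infant, vbm_state): turnout rate among mothers
    (False, False): 0.600,
    (True,  False): 0.565,  # 3.5-point infant penalty, no VBM
    (False, True):  0.620,
    (True,  True):  0.605,  # 1.5-point infant penalty with VBM
}

def infant_gap(vbm: bool) -> float:
    """Turnout difference: mothers with an infant minus mothers without."""
    return turnout[(True, vbm)] - turnout[(False, vbm)]

# Difference of the two gaps: how much VBM mitigates the penalty.
mitigation = infant_gap(vbm=True) - infant_gap(vbm=False)
print(f"Infant gap, no VBM: {infant_gap(False):+.3f}")
print(f"Infant gap, VBM:    {infant_gap(True):+.3f}")
print(f"VBM mitigation:     {mitigation:+.3f}")
```

The point of differencing the gaps (rather than comparing raw turnout across states) is that it nets out any baseline turnout difference between VBM and non-VBM states, isolating how the infant penalty itself changes.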
16) I honestly had no idea 50-50 shared custody was not already the legal default in custody cases.
17) I have no idea why the New Yorker would be running this extensive appreciation of Led Zeppelin right now, but I loved it. I also did not realize I'm not supposed to love both Zeppelin and the Who. But I emphatically do. I found this part about Zeppelin's musical borrowings, or plagiarism, depending on your perspective, quite interesting:
Nowadays, skeptics are likely to judge Page’s project of “narrowing the distance between genres” as entitled cultural appropriation, or even plagiarism. Extending its traditional hostility, Rolling Stone has accused the band of having a “catalog full of blatant musical swipes.” Words like “plunder” and “stolen” are thrown about online. Spitz prefers the gentler phrase “suspiciously close.” Through the years, the band has been sued or petitioned by Willie Dixon (“Whole Lotta Love” took words from Dixon’s “You Need Love”), Howlin’ Wolf (“The Lemon Song” borrowed its opening riff and some lyrics from his “Killing Floor”), Anne Bredon (who wrote the original song that Joan Baez, and then Led Zeppelin, made famous as “Babe I’m Gonna Leave You”), and the band Spirit, whose “Taurus” contains a passage that indeed sounds “suspiciously close” to the opening chords of “Stairway to Heaven” (though Spirit lost a lawsuit it brought in 2016).
Page has certainly been parsimonious with credit-sharing, and, in at least one case, shabbily slow to do the right thing—he should have credited the American performer Jake Holmes, who created the musical basis for “Dazed and Confused,” on “Led Zeppelin I.” (Holmes sued and won a settlement in 2011.) But the blues evolved as an ecosystem of borrowing and recycling. The musical form cleaves to the twelve-bar template of I-IV-I-V-IV-I. Musically, you need some or all of this chord progression to cook up anything that feels bluesy, as a roux demands flour and fat, or a whodunnit a murder; originality in this regard would be something of a category error. In the Delta-blues or country-blues tradition that flourished before the Second World War, words tended to drift Homerically free of their makers. Performers might write a couple of their own verses and then finish with lines of a borrowed formula—so-called floating verses, or, the scholar Elijah Wald writes, “rhymed couplets that could be inserted more or less at random.” In fact, the postwar Chicago blues musicians who excited a generation of English performers—Willie Dixon, Muddy Waters, Howlin’ Wolf—were themselves nostalgically repurposing, partly for a white crossover market, the Delta sound of lost prewar giants like Robert Johnson, who died in 1938. As early as 1949, the music industry cannily decided to baptize this modernized, electrified blues sound as “rhythm and blues.” In this sense, you could say that English players like Clapton and Page were double nostalgics, copiers of copiers.
Robert Plant’s tendency to lift words and formulas from old songs should be seen in this light. Plagiarism is private subterfuge made haplessly public. But to take Willie Dixon’s “You’ve got yearnin’ and I got burnin’ ” and put the words into “Whole Lotta Love” as “You need cooling / Baby, I’m not fooling”; to reverse the opening lines of Moby Grape’s 1968 song “Never,” from “Working from eleven / To seven every night / Ought to make life a drag,” and put them into “Since I’ve Been Loving You” as “Workin’ from seven to eleven every night / Really makes life a drag”; to punctuate “The Lemon Song,” which is obviously indebted to Howlin’ Wolf’s “Killing Floor,” with the repeated allusion “down on this killing floor,” while guilelessly referring to Roosevelt Sykes’s “She Squeezed My Lemon” (1937)—to make these moves, in a musical community that was utterly familiar with all the source material, testifies not to the anxiety of plagiarism but to the relaxedness of homage.
Plagiarists do what they do out of weakness, because they need stolen assistance. Does that sound like Led Zeppelin? The genius of “Whole Lotta Love” lies in its opening five-note riff, which has no obvious musical connection to Dixon’s song. “The Lemon Song” makes of “Killing Floor” something entirely new. “Since I’ve Been Loving You” is a better and richer song than Moby Grape’s “Never.” “When the Levee Breaks” is astonishingly different from Memphis Minnie’s. (It isn’t a blues song, for starters.) And, yes, “Stairway to Heaven” has more spirit, along with a few other dynamics, than Spirit’s “Taurus.” Besides, Led Zeppelin did credit many of its sources. The first album names Willie Dixon as the composer of “You Shook Me” and “I Can’t Quit You Baby.” Generally, on the matter of homage and appropriation, I agree with Jean-Michel Guesdon and Philippe Margotin, who, in “Led Zeppelin: All the Songs,” call the band’s version of the latter song “one of the most beautiful and moving tributes ever paid by a British group to its African American elders.”
18) Radiolab just did a podcast on this which led me to, “Why 536 was ‘the worst year to be alive'”
Ask medieval historian Michael McCormick what year was the worst to be alive, and he’s got an answer: “536.” Not 1349, when the Black Death wiped out half of Europe. Not 1918, when the flu killed 50 million to 100 million people, mostly young adults. But 536. In Europe, “It was the beginning of one of the worst periods to be alive, if not the worst year,” says McCormick, a historian and archaeologist who chairs the Harvard University Initiative for the Science of the Human Past.
A mysterious fog plunged Europe, the Middle East, and parts of Asia into darkness, day and night—for 18 months. “For the sun gave forth its light without brightness, like the moon, during the whole year,” wrote Byzantine historian Procopius. Temperatures in the summer of 536 fell 1.5°C to 2.5°C, initiating the coldest decade in the past 2300 years. Snow fell that summer in China; crops failed; people starved. The Irish chronicles record “a failure of bread from the years 536–539.” Then, in 541, bubonic plague struck the Roman port of Pelusium, in Egypt. What came to be called the Plague of Justinian spread rapidly, wiping out one-third to one-half of the population of the eastern Roman Empire and hastening its collapse, McCormick says.
Historians have long known that the middle of the sixth century was a dark hour in what used to be called the Dark Ages, but the source of the mysterious clouds has long been a puzzle. Now, an ultraprecise analysis of ice from a Swiss glacier by a team led by McCormick and glaciologist Paul Mayewski at the Climate Change Institute of The University of Maine (UM) in Orono has fingered a culprit. At a workshop at Harvard this week, the team reported that a cataclysmic volcanic eruption in Iceland spewed ash across the Northern Hemisphere early in 536. Two other massive eruptions followed, in 540 and 547. The repeated blows, followed by plague, plunged Europe into economic stagnation that lasted until 640, when another signal in the ice—a spike in airborne lead—marks a resurgence of silver mining, as the team reports in Antiquity this week.