As a general rule, I care much more about the consequences of extinction than the causes. Even though I work on past landscapes, my mind is firmly rooted in the present, and I strive for my work to be relevant to the world today. We know what’s causing extinctions today, for the most part. What comes next — the long-term implications of extinction — is an unknown, and something I think the paleorecord can really speak to.
Despite that, my work does touch on the causes of ice age extinctions, and I often find myself drawn into the long-standing debates on whether climate change or humans* killed off the mammoths, giant ground sloths, and other mega-beasts. It’s a topic of great interest to the general public, and one that has generated staunch advocates on either side (there are relatively few people arguing for some synergistic middle ground).
Me? I tend to fall in the pro-human category, for several reasons. The megafaunal extinctions are characterized by a few global patterns: 1) They were time-transgressive, happening first in Eurasia and Australia, then North America, then South America, and finally on islands. 2) They were taxonomically restricted, hitting land mammals for the most part and largely sparing birds, reptiles, and other groups. 3) They were size-selective, weeding out about half the animals larger than an adult German Shepherd here in North America but leaving the smaller animals relatively untouched. Plus, we’ve had a couple dozen or more ice ages in the last 2.5 million years. What’s different about this one? People.
Lots of folks disagree, which has led to some heated debates. For a long time, it was thought that humans weren’t in North America until about 12,000 years ago, and thus they didn’t overlap long enough with megafauna to have made an impact — especially with small populations (we’ve since pushed the arrival of humans to the Americas to at least 15,000 years ago). It’s understandably difficult to imagine small bands of humans wiping out large populations of megaherbivores and megacarnivores in a relatively short time (though see this classic Alroy paper).
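In the spirit of blog speculation, here’s a toy, back-of-the-envelope simulation in Python. To be clear, this is my own illustrative sketch, not Alroy’s actual model, and every parameter value in it is an assumption I picked purely for demonstration. The underlying arithmetic is simple: large mammals reproduce slowly, so the sustainable harvest from their populations is small, and a founding human population growing at even ~1% per year eventually out-harvests it.

```python
# A toy overkill simulation (NOT Alroy's model; all parameters are
# illustrative assumptions). The point: slow-breeding megafauna plus an
# exponentially growing human population can collapse within centuries.

def simulate(years=1500, prey=1_000_000, humans=100,
             prey_growth=0.02,       # low intrinsic growth: big mammals breed slowly
             carrying_cap=1_000_000,
             human_growth=0.01,      # ~1% annual growth for the founding population
             kills_per_person=0.1):  # assumed annual per-capita megafauna kills
    for year in range(years):
        # logistic growth for the prey population
        births = prey_growth * prey * (1 - prey / carrying_cap)
        # harvest scales with human numbers, limited by available prey
        harvest = min(prey, kills_per_person * humans)
        prey = max(0.0, prey + births - harvest)
        humans *= 1 + human_growth
        if prey <= 0:
            return year  # year of extinction
    return None

print(simulate())  # under these assumptions, collapse within ~1,000 years
```

The takeaway isn’t the specific numbers; it’s that under these assumptions the prey population’s maximum annual increase is about 5,000 animals, while the harvest grows exponentially and overtakes it within a few centuries. “Small bands” plus growth plus slow-breeding prey is a quantitatively plausible recipe for extinction on a millennial timescale, which is essentially the insight Alroy formalized.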
Perhaps the biggest challenge, however, is the lack of unambiguous kill sites, a line of evidence that featured prominently in one of the most intense back-and-forths in the Pleistocene extinction literature: Grayson and Meltzer vs. Fiedel and Haynes. The scarcity of kill sites is, in my opinion, the single best argument in the climate-only camp’s arsenal against a human cause. It’s something of a puzzle, because we know that the first humans in the Americas made specialized stone tools to hunt large beasts, and that they utilized megafaunal resources. So why don’t we have more than a handful of mammoth, mastodon, horse, and camel remains with distinct signs of butchering?
Some of this could be an artifact of resistance in the archaeological community to a growing body of data, similar to the long-standing resistance to accepting that humans were here before 12,900 years ago (a period known as the Clovis culture, for the distinctive stone tools found from that time). I haven’t really seen widespread evidence for this, though. And, if anything, I’m glad that fossil remains are held to really high standards — I wouldn’t want to start calling any bone with scratches or grooves a “kill site” any more than I’d want strong evidence tossed out because of overly restrictive definitions. Folks who are critical of the overkill model often cite the lack of stone tools associated with fossil bones. In other words, it’s not just that we find few bones with cut marks; it’s that we also find few stone tools alongside them.
Because these arguments come up frequently, I’ve spent a lot of time pondering, and I have some ideas. This is where I love the power of blogs — I get to speculate without having to worry about the fact that I’m not an archaeologist (though I do read the literature). I also have zero data, partly because I’m not an archaeologist and partly because I’m trying to explain the absence of evidence, which is tough. With all that in mind — that I’m a paleoecologist with no data — here’s what I think:
It actually makes sense to me that there are few kill sites, both in terms of modified bones and of sites with bones and tools together, for two reasons: taphonomy** and basic human efficiency.
Let’s talk about taphonomy first, because it’s huge: in order for an animal to become part of the fossil record, it’s got to be buried fairly quickly after death, ideally in a place without much oxygen (which inhibits decomposition) and without acidic conditions (which dissolve bone). If you’re hunting an animal for food (criterion one for a kill site: kill the animal) and then butchering it (criterion two: leave marks on the bones), you’re likely in the open. The bones you don’t take with you to use as tools, or process for marrow, are going to be scavenged by coyotes and other wild animals. Those first humans in the Americas may or may not have had domesticated wolves (there’s evidence of dogs in Europe going back 30,000 years, and in East Asia going back 16,000 years), which would have had an impact on carcasses, too.
Plus, bones on the surface don’t last very long. They get eaten, dragged off, or eventually decompose or dissolve if they’re not buried. It’s remarkable that things end up as fossils at all, when you think about it. Here in Maine, the soils are way too acidic to preserve bones long, so the only fossil bones we get are in shell middens on the coast, in offshore ocean mud, or in lakes or sandy riverbanks. If a hunter shot a moose and wasn’t able to track it down, in other words, it probably wouldn’t become a fossil. Maine’s acidic, peaty soils would have made quick work of the remains. That’s not true everywhere, obviously, but there’s a reason that a large number of ice age fossils are associated with stream beds, caves, lake sediment, marshes, or other environments where they’d be buried quickly or preserved well. Some, like Dan Fisher, have even argued that mammoth carcasses found in lakes were basically left behind by people who had deliberately sunk them to store for later, but never came back.
As for human efficiency, this includes both using as much of an animal as possible (which we see in many hunter-gatherer cultures) and the fact that tools themselves were valuable items, not easily replaced. When I build a shed in my backyard, I don’t leave my hammer on the ground when the shed is finished. Similarly, I find it hard to imagine an ice age hunter-gatherer spending a lot of time making a stone tool only to abandon it, intact, with a butchered carcass — I can see it happening in a really rare circumstance, like losing a dispute with a saber-tooth over a kill (and even then, I’d probably try to come back and get my stuff, if I survived).
Stone tools make up a huge amount of the archaeological record from the late Pleistocene, because stone preserves well. We don’t have clothing made from animal skins, or ropes, straps, baskets, or other cultural artifacts made from soft tissues — but that doesn’t mean humans never made those items (beads, needles, fish hooks, and hide scrapers all point to a regular engagement with soft materials). We may have limited evidence of direct interactions with megafauna here in North America, but that could just as easily be because people were resourceful, capable hunter-gatherers who knew what they were doing when they went on a hunt. In that light, the absence of unambiguous kill sites would be the expectation rather than an anomaly, and the fact that any exist at all is remarkable. It just makes the overkill hypothesis harder to prove with archaeological data alone, which is a nice argument for interdisciplinary studies that include paleoecology, too!
*We’re not going to talk about comets or hyper-diseases or other fringe theories.
**The study of the process of fossilization.