Why are the elderly invisible in archaeological contexts?

Comparison of adult male, child, and adult female skeletons, showing why young age is easier to estimate than old age due to growth. Via Medical Ephemera


For the past two months, I have been busy preparing my dissertation data for analysis. This means that I am taking the paper versions of my data from books, monographs, archaeological field notes, laboratory documents, and excavation reports, and creating a digital database that will allow me to conduct spatial and statistical analysis. After weeks of inputting data, I have finally finished turning all five cemeteries into digital form.

One of the things I noticed while coding the data was that there were no specific ages for individuals who were 45 years or older. This doesn't mean that there weren't people older than 45, but rather that we can't age individuals over 45 very well. So instead of stating an age range like 50-55 years, we simply code the individual as 45 or more years old. This renders the elderly invisible in a sense: we don't have a full picture of how old individuals were, how bodies change later in life, or whether people were buried with different objects at different later ages.

A new study by Cave and Oxenham (2014) argues that the ages people actually reached in the past may be much higher than previously thought. They argue that people assume individuals in the past died younger for a number of reasons:

1) The average age at death in past communities is often skewed younger by high childhood mortality.

2) Skeletal indicators of old age are less likely to survive, because they occur in areas of the skeleton that are more prone to degrade over time.

3) Age-at-death estimation is less reliable for adults. For sub-adults, we can look at the development of teeth and bones; in adults, these are already fully developed, so we must rely on degradation and wear, which are far more variable.

4) Finally, for older adults we often don't assign an age category at all: we simply use the lowest possible age and code the individual as 50 or more years old.

The goal of Cave and Oxenham (2014) is to present methods that allow the elderly to be sorted not into a catchall 50+ years category, but into decades (e.g. 50s, 60s, 70s, 80s), to better understand how the elderly in the past were treated in life and death.

The sample under examination comes from the site of Worthy Park, an Anglo-Saxon cemetery dating from the middle of the fifth century AD to the middle of the seventh century AD, located near Winchester, Hampshire, in the UK. Cave and Oxenham (2014) examined the dental wear of 48 individuals from the cemetery. While there are many methods for determining age-at-death, they focus on occlusal tooth wear because wear is significantly correlated with age. The first and second molars are examined because they are the most reliable teeth for estimating age from wear. Once wear has been scored for each individual, the individuals are seriated from least to most wear. In addition, they compare the sample against a population with known ages-at-death to better understand the relationship between dental wear and age.
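The seriate-then-anchor logic described above can be sketched in a few lines of Python. To be clear, everything here is invented for illustration: the burial IDs, the wear scores, and the reference wear-to-age mapping are hypothetical, not Cave and Oxenham's actual data or scoring system.

```python
# Illustrative sketch only: hypothetical wear scores and a made-up
# reference mapping, not Cave and Oxenham's data or scoring system.

# Hypothetical occlusal wear scores (higher = more worn), keyed by burial ID.
wear_scores = {"B01": 3.5, "B02": 7.2, "B03": 5.1, "B04": 9.0, "B05": 6.4}

# Step 1: seriate individuals from least to most wear.
seriation = sorted(wear_scores, key=wear_scores.get)

# Step 2: anchor the seriation to a known-age reference population.
# Each tuple is (wear score, age at death) from a hypothetical reference.
reference = [(2.0, 25), (4.0, 35), (6.0, 50), (8.0, 65), (10.0, 80)]

def estimate_age(score):
    """Linearly interpolate an age at death from a wear score."""
    if score <= reference[0][0]:
        return reference[0][1]
    for (s0, a0), (s1, a1) in zip(reference, reference[1:]):
        if score <= s1:
            return a0 + (a1 - a0) * (score - s0) / (s1 - s0)
    return reference[-1][1]

for burial in seriation:
    print(burial, round(estimate_age(wear_scores[burial])))
```

The key design point is that the seriation itself is only relative (least worn to most worn); ages only enter once the sample is compared against the known-age reference.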

Chart for estimating age from dental wear, via Principles of Forensic Identification (originally from Lovejoy 1985)


Based on the new dental-wear age estimates, Cave and Oxenham (2014) are able to create a more realistic mortality profile of the population. They compare their estimates to the original cemetery report, which did not place any individuals in categories over 50 years (listing them simply as 50+ years). They determined that 5 individuals were 45-54 years old, 7 were 55-64, 8 were 65-74, and 7 were 75-85. Based on this, they conclude that categories like “50+ years” hide variation in the older adult population and feed the misconception that historic populations didn’t live long. They argue that we need to stop using catchall age categories and start representing the range of ages-at-death more appropriately.
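The reporting difference at stake here, one catchall bin versus decade bins, can be shown with a toy example. The ages below are invented for illustration and are not the Worthy Park individuals.

```python
from collections import Counter

# Hypothetical estimated ages at death for older adults (illustrative only).
ages = [47, 52, 58, 61, 63, 66, 70, 74, 78, 81]

# Catchall coding: everything from 50 up collapses into a single category.
catchall = Counter("50+" if a >= 50 else "45-49" for a in ages)

def decade_bin(age):
    """Bin an age into its decade, e.g. 63 -> '60-69'."""
    low = (age // 10) * 10
    return f"{low}-{low + 9}"

# Decade coding preserves the variation the catchall hides.
decades = Counter(decade_bin(a) for a in ages)

print(catchall)   # one bin swallows most of the older adults
print(decades)    # the same people, spread across decades
```

Run on these made-up ages, the catchall puts nine of ten individuals into a single "50+" bin, while the decade coding shows them distributed across four decades, which is exactly the variation the paper argues is being lost.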

Of course, they do note that there are some issues with this type of age estimation and that it isn’t 100% accurate. Different people wear their teeth in different ways due to social status and access to different types of food. In general, however, this provides an important method for better understanding the older adult population. By better ageing older individuals, we can create more appropriate interpretations of how individuals in the past were treated based on their age. There is a major difference between a 50-year-old and an 80-year-old that cannot be ignored simply because our methods are less reliable and it is easier to code 50+. If you get the chance, I would highly suggest reading this paper for more details on their method!

Works Cited

Cave, C., & Oxenham, M. (2014). Identification of the Archaeological ‘Invisible Elderly’: An Approach Illustrated with an Anglo-Saxon Example. International Journal of Osteoarchaeology, Early View.

8 responses to “Why are the elderly invisible in archaeological contexts?”

  1. I’m aware of the sites you are discussing as they actually came up in some previous Early Anglo-Saxon England lectures. I agree that social status plays a huge part and this will skew results. However, what I find more interesting about your findings is the lack of a tapering off as the ages increase. If we look at demographic scales, as age increases the numbers decrease, but your findings appear to indicate that the upper three decades are very similar. Keep up the good work.

  2. I want to note (as an ethnographic, post-processual reflection) that our understanding of “old age” is probably not the same as that of these people. I mean, a 13-year-old is a child to our understanding, but to many social groups they are adults. There was probably a different reaction to adult and old-age funerals that could be detectable in the material culture, and I’d love to see this information cross-referenced with accurate dental wear charts. I think that is a way to catch these oldsters in their full presence at the times that they were alive.

    Also: I favourited this blog!

    • Very true: we often refer to this as life course analysis, understanding what different biological age groups mean culturally. In the Anglo-Saxon world, ‘adulthood’ was reached sometime between 13 and 18 years, and after 30 an individual was often considered an older adult. However, there were probably different reactions to a 35-year-old “old adult” and an 80-year-old “old adult” that would be interesting to explore in the archaeological record.

  3. Reading medieval texts, I’m often struck by how little their perception of youth/old age differs from ours! The biggest difference is probably that 60-ish felt like serious old age, but that was true until not long ago. (Improved dentistry probably is a big part of that.) Expectations for what you were supposed to do at certain ages varied, of course. E.g., it’s possible that in some medieval contexts (by no means all) a girl’s family would worry if she wasn’t married by her early twenties; but I’m reliably told that this was still so in Georgia and Tennessee in the 1970s. Peasant children started their working lives much sooner than we now deem normal (again, when did that stop in the US?). But that doesn’t mean they weren’t considered very young. The Bible famously says that life normally lasts 70 years, 80 if you’re lucky. Walter Map, in the late 12th century, reflecting on what would be a good cut-off point for a history of present times, decides on 100 years, because that’s the outer limit of “living memory”: there are centenarians, he says, and people who have heard first-hand accounts of 100 years ago from parents and grandparents. My impression is that people expected to live to old age (maybe not quite like now, but what was considered a normal lifespan even a generation or two ago in the US). They more or less felt young, middle-aged, and old at around the same times we do. The big difference is that the chances of not making it to middle or old age (illness, childbirth, accidents, food insecurity) were much greater than they are now. But that doesn’t mean you perceived the life cycle as all that different; only that you knew and expected life to be fragile.

  4. Up to 85 y.o. at 700 A.D.!? I see from looking at the Cave & Oxenham paper that they say it’s very easy to date relatively within the sample, but how is the sample dated in absolute terms?
    Is it possible that in 700 A.D. people were eating more abrasive foods which wore their teeth more quickly, so that a 50 y.o. might resemble a 70-80 y.o. from more recent times?
    – James.

  5. Of course, they do not that there are some issues with this type of age estimation and that it isn’t 100% accurate.

    Should be: Of course, they do NOTE

    Very interesting

  6. Pingback: Review of Mortuary Archaeology and Bioarchaeology from 2014 | Bones Don't Lie
