I’m taking a break from the ongoing Earth-regency Alternate History series this week (mostly because research has been taking more time than I’ve had available). Instead, the following is based on an email exchange between one of my players and myself, raising some serious questions about the population dynamics of longer-lived species and aging in RPGs…
One of my players asked me today about how to determine the age of his new character, an elf entering the game in the age category of “Venerable”. But the game in question – I won’t name the rules system – has no rules for character aging, and doesn’t even nominate standard lifespans for different races. He proposed, “would it be appropriate to use the 3.5 tables? If so, then my elf would be at least 350 years of age (more probably 450+) with a maximum age of *rolls 4 percentile dice* 606 years, according to 3.5 PHB ageing for elves.”
This was the first time in several years that I’d looked at the assumptions that underlie “standard aging” tables, and I’ve learned quite a lot since the last time. As a result, my thought process led me down some interesting paths, paths which showed how significant a “mere” +50%-or-so lifespan was – never mind the 400-500% suggested by the 3.5 PHB.
Demographics are not a flat line
My first problem is now, and always has been, with the notion of a flat percentage being used to determine where in a race’s lifespan a particular character’s age falls. This makes it just as probable that a character will have a high age as it is that they will have a low age – and it doesn’t take much examination of demographics to realise that the real world simply doesn’t work that way.
Demographics are not a bell curve
The next most-common approach that I’ve seen is the rolling of multiple dice to determine age. This makes a character’s age more likely to be at or around the mathematical mean, offset by any adjustment made to ensure a minimum age that’s suitable for adventuring. This makes character ages too old, on average, and – once again – looks nothing like a real demographic curve.
Either of these approaches can yield what seem to be reasonable character ages in the case of individuals; it is only when you start looking at larger populations that the answers stop making sense. The population aging approach you choose brings with it implications for knowledge of the past, acquisition of skills, birth and death rates, relative population levels, and the resulting social mechanisms.
Knowledge of the past
If your character is 500 years old, you should expect them to have a fair idea of what was going on 400 years ago, and about events between then and now. This is a cross for the GM to bear that he really doesn’t need; it would, in general, be better to have events of more than a generation ago being lost in the mists of time and the pages of history. Why? Because then the GM can bring out historical events as he needs them for maximum story gain, rather than having to prepare the history in advance.
It doesn’t matter so much in Fantasy novels, where the author can introduce an Elven character only when it suits the plotline; an Elven PC will be pestering the GM for detailed histories every time the past becomes relevant to a plotline. It adds to the Prep Burden of the GM, sometimes massively, and can totally erase a lot of the atmosphere and mystery of the past.
Acquisition Of Skills
As soon as you have a race living four or five times as long as humans, the GM has to start fudging questions concerning the acquisition of skills – or they will end up with Ubermensch who don’t need the PCs. If it takes 20 years to master a craft or skill, for example, most humans will do so at around the age of 30, and – given probable lifespans – be able to master only one or perhaps two in a lifetime (50-60 years). Your typical elf, with a 500-year lifespan, even if youth and childhood are also increased proportionately, will (in comparison) have time to master TEN to TWENTY, even without any advantages from genetic/racial predisposition. And that also ignores any compounding effects – even though, in reality, studying one subject often makes it easier to learn a related subject. That doesn’t matter so much to humans, where there’s only time for the mastery of two (perhaps three or four in exceptional cases) skills – but when you start talking about 10-20 skills, this effect goes from negligible to seriously important.
To combat it, and prevent elves from coming to dominate society, you have to start making assumptions about how easily long-lived races learn new things, about how ambitious and motivated they are, and generally adding in whole reams of additional racial profile – much of which doesn’t marry up with other source material like official adventure modules.
Heck, consider the number of diplomatic and trade contacts an even-moderately accomplished Elven trader could amass in hundreds of years, the number of secrets and confidences that one could accumulate!
Four hundred years ago, it was 1612 – how much has occurred since then? How many mysteries have arisen because every eyewitness died out before their stories could be documented?
Birth and Death Rates & Relative Population Levels
This is something that I alluded to not long ago in Sugar, Spice, and a touch of Rhubarb: That’s What Little Names Are Made Of, where I was discussing the effects of birth and death rates on population levels and how to stop long-lived races from overwhelming other races from sheer population level, and the implications for character names.
In a nutshell, the more long-lived the race, the lower the population level needs to be simply to maintain population parity with a human society. I’ll return to this subject as the discussion proceeds.
The Human Analogue In A Fantasy Campaign
Consider humans – get their aging right and then it should be possible to simply scale the answers to get elves or any other long-lived race.
Historically, in the historical timeframe on which D&D is based, 40% of children born die before reaching double-digits in age. 30% of those who get to age 10 will be dead before they reach age 20. 50% of those who get to age 20 will be dead before reaching age 30, and 70% of those who get to 30 will be dead before 40. Of those who reach 40, 80% won’t get to fifty, and of those who get to 50, 90% won’t make 60. Of those who make 60, 95% won’t get to age 70. Thereafter it’s 96% dead before 80, 97% dead before 85, 98% dead before 90, and 99% dead every 2 years thereafter – 92, 94, 96, 98, 100, 102, 104, and so on. In theory, if you make your aging save, you can keep going – the record is believed to be about 116 years, though there is a substantial error rate. There are unsubstantiated claims of a South American tribesman reaching 150 years of age, for example.
Now, factor in the availability of healing magic, and the fact that most of those who die in the 0-20 age bracket die of disease, while most of those who die in the 20-40 age bracket do so in military campaigns of one sort or another.
Then factor in the increased danger of accidental death because there are dangerous monsters and magic and what-have-you around.
You could assume that these two factors cancel each other out – the increased rate of accidental death offsetting the reduced rate of death from wounds and disease – with the implication that the younger the character, the more likely they are to encounter one of these additional dangers. That appears to make sense, but it’s still an assumption that could very well go either way. Make it anyway, for the sake of argument, and let’s look at the results:
The Population Breakdown
With our base assumptions and something vaguely approaching a historical foundation in place, we can generate a demographic breakdown:
- 40% die before age 10 (4 in 10). 60% reach 10 years old (6 in 10).
- 30% of this 60% die before 20 = 3/10 of 6 in 10 = 18 in 100. The other 70% survive = 7/10 of 6 in 10 = 42 in 100.
- 50% of the 42 in 100 die before 30, = 21 in 100. The same amount survive.
- 70% of the surviving 21 in 100 will die before 40 = 147 in 1000. 30% survive = 3/10 x 21/100 = 63 in 1000.
- 80% of the surviving 63 in 1000 will die before 50, so 20% will survive = 1/5 x 63/1000 = 63/5000.
- 90% of the 63 in 5000 die before 60, so 10% will survive = 63/50,000.
- 95% of the 63 in 50,000 die before 70, so 5% will survive = 63 in 1,000,000.
- 96% of the 63 in 1,000,000 die before age 80, so 4% will survive = 126 in 50,000,000.
- 97% of the 126 in 50,000,000 die before age 85, so 3% survive = 378 in 5,000,000,000.
- 98% of the 378 in 5,000,000,000 die before age 90, so 2% survive = 378 in 250,000,000,000.
- 99% of the 378 in 250,000,000,000 die before age 92, so 1% survive = 378 in 25,000,000,000,000.
…and so on.
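For anyone who wants to check the arithmetic, the bracket-by-bracket survival chain above is easy to reproduce. This is my own quick Python sketch of it, not anything from a rulebook – the mortality rates are simply the ones quoted in the text:

```python
# Chain the decade-by-decade mortality rates quoted above to get the
# fraction of a birth cohort still alive at each age threshold.
death_rates = {
    10: 0.40, 20: 0.30, 30: 0.50, 40: 0.70,
    50: 0.80, 60: 0.90, 70: 0.95, 80: 0.96,
    85: 0.97, 90: 0.98,
}

surviving = {}
alive = 1.0
for age, rate in death_rates.items():
    alive *= 1.0 - rate          # those who do NOT die in this bracket
    surviving[age] = alive

for age, frac in surviving.items():
    print(f"alive at {age}+: {frac:.10f}")
```

The output matches the fractions above: 0.21 alive at 30+, 0.063 at 40+, and 63 in a million at 70+.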
Application to a typical population
Now multiply those by a population base – let’s say, 100,000 people.
- 40,000 will be <10, 60,000 will be 10+.
- The 60,000 are made up of 18,000 aged 10-19 and 42,000 aged 20+.
- The 42,000 are made up of 21,000 aged 20-29 and 21,000 aged 30+.
- The 21,000 are made up of 14,700 aged 30-39 and 6,300 aged 40+.
- The 6,300 are made up of 5,040 aged 40-49 and 1,260 aged 50+.
- The 1,260 are made up of 1,134 aged 50-59 and 126 aged 60+.
- The 126 are made up of 119.7 aged 60-69 and 6.3 aged 70+. That doesn’t make a lot of sense, so round the numbers to 120 and 6 for practical usage.
- The 6.3 people are made up of 6.048 people aged 70-79 and <1 person older than 80 – though we are now well within the 0.3 in 100,000 rounding error. So leave it be at 6 people aged 70+.
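The same breakdown falls straight out of the survival fractions if you script it – again, a sketch of my own, using the fractions worked out earlier:

```python
# Multiply the survival fractions by a base population of 100,000 to get
# the head-count in each ten-year age band.
population = 100_000
# fraction of the cohort alive at the start of each band
alive_at = [1.0, 0.60, 0.42, 0.21, 0.063, 0.0126, 0.00126, 0.000063]
band_start = [0, 10, 20, 30, 40, 50, 60, 70]

band_size = {}
for i in range(len(band_start) - 1):
    label = f"{band_start[i]}-{band_start[i + 1] - 1}"
    band_size[label] = (alive_at[i] - alive_at[i + 1]) * population
band_size["70+"] = alive_at[-1] * population

for label, count in band_size.items():
    print(f"aged {label}: {count:,.1f}")
```

It yields the same 40,000 / 18,000 / 21,000 / 14,700 / 5,040 / 1,134 / 119.7 / 6.3 sequence, fractional people and all.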
The result is a population curve that is noticeably bunched up at the lower end of the scale – quite different from the bell curve or the completely flat line that either of the generation methods discussed at the outset would produce.
The Next Step Not Taken
In my youth, I would have gone on to plot these results on a graph, and then perform a mathematical analysis to derive a complex equation describing the exact percentage of the population for any given age (to fill in the missing points on the graph), then converted the results into a table for generating a randomly rolled age.
Of course, if we simply assume a flat distribution of possible results across the sub-range of ages specified, we can get a simpler answer far more quickly – a d1000 for the age band, and then a d10 for range within that age band. But for the purposes of this article, even that is going further than we have to.
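If you did want that quick-and-dirty generator, it might look something like this. The rounded per-mille band ceilings and the function name are my own framing, derived from the breakdown above, not from any published table:

```python
# A sketch of the d1000-then-d10 shortcut: the d1000 picks an age band
# using cumulative per-mille survivor figures, and a d10 places the
# character within that band.
import random

# (band's lowest age, cumulative d1000 ceiling)
BANDS = [(0, 400), (10, 580), (20, 790), (30, 937),
         (40, 987), (50, 999), (60, 1000)]

def random_human_age(rng=random):
    roll = rng.randint(1, 1000)                  # the d1000
    for low, ceiling in BANDS:
        if roll <= ceiling:
            return low + rng.randint(1, 10) - 1  # the d10 within the band
```

A flat d10 within each band is itself a simplification, of course – within the 0-9 band especially, real mortality is anything but flat.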
Elves with a 60% longer lifespan
To be honest, with all the social impacts of being long-lived, I can’t really see elves having more than a +60% lifespan over humans without the difficulties becoming insuperable. Doesn’t sound like a lot, does it? But let’s apply it and see what effects it would actually have on the demographic.
Because the dangers faced by the young would be the same for both humans and for elves, I’m not going to apply the full factor to the young. Instead, I’m going to go: Times 1, times 1.2, times 1.4, and times 1.6 thereafter.
- 10+ stays 10+.
- The ten-year gap between 10+ and 20+ becomes a 12-year gap to 22+.
- The ten-year gap between 20+ and 30+ becomes a 14-year gap – but it now starts at 22+ and runs to 36+.
- The ten-year gap between 30+ and 40+ becomes a 16-year gap, but it now starts at 36+ and runs to 52+.
- All the subsequent age brackets are also 16 years of length.
That gives a population breakdown of:
- 40,000 will be under 10, 60,000 will be 10+.
- The 60,000 are made up of 18,000 aged 10-21 and 42,000 aged 22+.
- The 42,000 are made up of 21,000 aged 22-35 and 21,000 aged 36+.
- The 21,000 are made up of 14,700 aged 36-51 and 6,300 aged 52+.
- The 6,300 are made up of 5,040 aged 52-67 and 1,260 aged 68+.
- The 1,260 are made up of 1,134 aged 68-83 and 126 aged 84+.
- The 126 are made up of 120 aged 84-99 and 6 aged 100+.
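The bracket-stretching is mechanical enough to script. Here’s my own sketch of it, taking the human band populations as given and applying the stepped multipliers:

```python
# Rebuild the elvish age brackets by stretching each human ten-year band
# with the stepped multipliers (x1.2, x1.4, then x1.6 for all later bands).
multipliers = [1.2, 1.4] + [1.6] * 5
band_counts = [18_000, 21_000, 14_700, 5_040, 1_134, 120, 6]  # aged 10+ bands

brackets = []
start = 10
for mult, count in zip(multipliers, band_counts):
    width = round(10 * mult)
    brackets.append((start, start + width - 1, count))
    start += width

for low, high, count in brackets:
    print(f"aged {low}-{high}: {count:,}")  # final bracket is open-ended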
That’s what a 60% increase in the lifespan looks like. For any given calendar age, you get more elves alive of that age than you do humans. In the bracket containing 75 years of age, for example, you have 6 humans in every hundred thousand and 1260 elves.
To reduce the population levels of both to match – 6 in both – you find that elvish communities are one 210th the size of comparable human communities – so a city of 20,000 people would be the same as a ‘city’ of 95 elves. And a town of 2000 humans would be the equivalent of a group of 9-10 elves.
Actually, that’s not quite correct. In both cases, we’re aiming for an age range – to get an absolutely correct comparison, we should divide that age range up. So the 6 humans are actually 6 aged 70+ (with, effectively, none older than 80, according to our earlier calculations). So that means 0.6 of them will be exactly 70 years of age.
The elvish age bracket containing age 70 applies to 1134 people out of 100,000, and runs from 68 to 83, a span of 16 years – so 1134 / 16 gives 70.875 people out of 100,000 aged exactly 70. To get that back to 0.6 people, we have to divide the elvish population by a factor of 70.875 / 0.6, or 118.125.
That means that a city of 20,000 humans is as common as a “city” of 20,000/118.125=169 elves. A town or village of 2,000 humans is as common as a “town” of about 17 elves.
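The corrected comparison is just a couple of divisions – this sketch (mine, using the figures above) reproduces it:

```python
# People aged exactly 70 in each population, and the scaling factor
# that makes the two counts equal.
humans_70_plus = 6                  # per 100,000, spread over ages 70-79
human_exact_70 = humans_70_plus / 10          # 0.6 per 100,000

elf_band = 1_134                    # elves aged 68-83 per 100,000
elf_exact_70 = elf_band / 16                  # 70.875 per 100,000

ratio = elf_exact_70 / human_exact_70         # 118.125
print(round(20_000 / ratio))        # elvish equivalent of a human city
print(round(2_000 / ratio))         # elvish equivalent of a human town
```

Which gives the 169-elf “city” and 17-elf “town” quoted above.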
These differentials would be even more extreme if the 400-500 year lifespan model were applied. You would end up with the average Elven city containing only a couple of people, and villages would contain fewer than one person each.
Don’t believe me? Well, let’s have a go.
The conversion factor
So, to start with, we want to graduate from x1 to x5 smoothly. The square root of 5 is 2.236, and the square root of that is 1.5, near enough. So, let’s say the factors are:
- times 1;
- times 1.5;
- times 1.5 x 1.5 = times 2.25;
- times 2.25 x 1.5 = times 3.375;
- times 5, thereafter.
Our ten-year population intervals become:
- 10 years;
- 15 years;
- 23 years;
- 34 years;
- 50 years, thereafter.
And that gives, from a standard 100,000 breakdown:
- 40,000 will be under 10, 60,000 will be 10+.
- The 60,000 are made up of 18,000 aged 10-24 and 42,000 aged 25+.
- The 42,000 are made up of 21,000 aged 25-47 and 21,000 aged 48+.
- The 21,000 are made up of 14,700 aged 48-81 and 6,300 aged 82+.
- The 6,300 are made up of 5,040 aged 82-131 and 1,260 aged 132+.
- The 1,260 are made up of 1,134 aged 132-181 and 126 aged 182+.
- The 126 are made up of 120 aged 182-231 and 6 aged 232+.
…and so on.
With 70 years being our standard of comparison, we have 6 humans in 100,000 and 14,700 elves in roughly that time-span. Dividing the 6 humans into the 10-year span gives 0.6 people in 100,000 being exactly 70 years old, while dividing the 14,700 elves into the 34 year age span gives 432-and-a-fraction elves exactly 70 years old out of every 100,000. Reducing elvish populations so that both groups have 0.6 members in 100,000 who are aged exactly 70 years gives a ratio of 720.6.
So the elvish equivalent of a city of 20,000 humans would contain about 28 elves, and the equivalent of a village of 2,000 humans would be an elvish village of… three. Most of the time; about 20% of the time, it would be only two elves.
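The same equal-at-age-70 calculation for the ×5 model, as a sketch of my own using the figures above:

```python
# Equalise the number of people aged exactly 70 under the x5 lifespan model.
human_exact_70 = 6 / 10               # 0.6 humans per 100,000 aged exactly 70
elf_exact_70 = 14_700 / 34            # the 48-81 elvish band, spread per year
ratio = elf_exact_70 / human_exact_70

print(round(ratio, 1))                # roughly 720.6
print(round(20_000 / ratio))          # elvish "city" equivalent
print(round(2_000 / ratio, 1))        # elvish "village" equivalent
```

The jump from a divisor of roughly 118 to one of roughly 720 is the real cost of moving from a +60% lifespan to a ×5 lifespan.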
Plucking numbers out of the air for lifespan is all well and good, but if you don’t know what you’re doing, the implications can overwhelm your game setting. Or, if they are not taken into account – something few people take the time and trouble to do – they can completely demolish the plausibility of the game setting when someone else hits you between the eyes with some hard questions.
One Caveat: I don’t have any actual population demographics for the calculations shown here, especially for those specified in the section The Human Analogue In A Fantasy Campaign. These are simply numbers that seem about right from the many sources and references that I have read in the past. More accurate data would yield more accurate analysis and projections – but the results ‘feel’ right, as they stand. So you can take them with a grain of salt – but I’ll use them until something more accurate presents itself.