The Age Of An Elf: Demographics of the long-lived
I’m taking a break from the ongoing Earth-regency Alternate History series this week (mostly because research has been taking more time than I’ve had available). Instead, the following is based on an email exchange between one of my players and myself, raising some serious questions about the population dynamics of longer-lived species and aging in RPGs…
One of my players asked me today about how to determine the age of his new character, an elf who enters play in an age category of “Venerable”. But the game in question – I won’t name the rules system – has no rules for character aging, and doesn’t even nominate standard lifespans for different races. He proposed: “Would it be appropriate to use the 3.5 tables? If so, then my elf would be at least 350 years of age (more probably 450+) with a maximum age of *rolls 4 percentile dice* 606 years, according to 3.5 PHB ageing for elves.”
This was the first time in several years that I’d looked at the assumptions that underlie “standard aging” tables, and I’ve learned quite a lot since the last time. As a result, my thought process led me down some interesting paths – paths which showed how significant a “mere” +50%-or-so lifespan is, never mind the 400-500% suggested by the 3.5 PHB.
Demographics are not a flat line
My first problem is now, and always has been, with the notion of a flat percentage being used to determine where in a race’s lifespan a particular character’s age falls. This makes it just as probable that a character will have a high age as it is that they will have a low age – and it doesn’t take much examination of demographics to realise that the real world simply doesn’t work that way.
Demographics are not a bell curve
The next most-common approach that I’ve seen is the rolling of multiple dice to determine age. This makes a character’s age more likely to be at or around the mathematical mean, offset by any adjustment made to ensure a minimum age that’s suitable for adventuring. This makes character ages too old, on average, and – once again – looks nothing like a real demographic curve.
The problems
Either of these approaches can yield what seem to be reasonable character ages in the case of individuals; it is only when you start looking at larger populations that the answers stop making sense. The population aging approach you choose brings with it implications for knowledge of the past, acquisition of skills, birth and death rates, relative population levels, and the resulting social mechanisms.
Knowledge of the past
If your character is 500 years old, you should expect them to have a fair idea of what was going on 400 years ago, and about events between then and now. This is a cross for the GM to bear that he really doesn’t need; it would, in general, be better to have events of more than a generation ago being lost in the mists of time and the pages of history. Why? Because then the GM can bring out historical events as he needs them for maximum story gain, rather than having to prepare the history in advance.
It doesn’t matter so much in Fantasy novels, where the author can introduce an Elven character only when it suits the plotline; an Elven PC will be pestering the GM for detailed histories every time the past becomes relevant to a plotline. It adds to the Prep Burden of the GM, sometimes massively, and can totally erase a lot of the atmosphere and mystery of the past.
Acquisition Of Skills
As soon as you have a race living four or five times as long as humans, the GM has to start fudging questions concerning the acquisition of skills – or they will end up with Ubermensch who don’t need the PCs. If it takes 20 years to master a craft or skill, for example, most humans will do so at around the age of 30, and – given probable lifespans (50-60 years) – be able to master only one or perhaps two in a lifetime. Your typical elf, given a 500-year lifespan, even if youth and childhood are also extended proportionately, will (in comparison) have time to master TEN to TWENTY, even without any advantages from genetic/racial predisposition. And that ignores any compounding effects – in reality, studying one subject often makes it easier to learn a related subject. That doesn’t matter so much for humans, where there’s only time for the mastery of two (perhaps 3 or 4 in exceptional cases) skills – but when you start talking about 10-20 skills, this effect goes from negligible to seriously important.
To combat it, and prevent elves from coming to dominate society, you have to start making assumptions about how easily long-lived races learn new things, about how ambitious and motivated they are, and generally adding in whole reams of additional racial profile – much of which doesn’t marry up with other source material like official adventure modules.
Heck, consider the number of diplomatic and trade contacts an even-moderately accomplished Elven trader could amass in hundreds of years, the number of secrets and confidences that one could accumulate!
Four hundred years ago, it was 1612 – how much has occurred since then? How many mysteries have arisen because every eyewitness died out before their stories could be documented?
Birth and Death Rates & Relative Population Levels
This is something that I alluded to not long ago in Sugar, Spice, and a touch of Rhubarb: That’s What Little Names Are Made Of, where I was discussing the effects of birth and death rates on population levels and how to stop long-lived races from overwhelming other races from sheer population level, and the implications for character names.
In a nutshell, the more long-lived the race, the lower the population level needs to be simply to maintain population parity with a human society. I’ll return to this subject as the discussion proceeds.
The Human Analogue In A Fantasy Campaign
Consider humans – get their aging right and then it should be possible to simply scale the answers to get elves or any other long-lived race.
Historically, in the timeframe on which D&D is based, 40% of children born die before reaching double digits in age. 30% of those who get to age 10 will be dead before they reach age 20. 50% of those who get to age 20 will be dead before reaching age 30, and 70% of those who get to 30 will be dead before 40. Of those who reach 40, 80% won’t get to fifty, and of those who get to 50, 90% won’t make 60. Of those who make 60, 95% won’t get to age 70. Thereafter it’s 96% dead before 80, 97% dead before 85, 98% dead before 90, and 99% dead every 2 years thereafter – 92, 94, 96, 98, 100, 102, 104, and so on. In theory, if you make your aging save, you can keep going – the verified record is 122 years, though most extreme longevity claims carry a substantial error rate. There are unsubstantiated claims of a South American tribesman reaching 150 years of age, for example.
Now, factor in the availability of healing magic, and the fact that most of those who die in the 0-20 age bracket die of disease, while most of those who die in the 20-40 age bracket do so in military campaigns of one sort or another.
Then factor in the increased danger of accidental death because there are dangerous monsters and magic and what-have-you around.
You can assume that these two factors cancel each other out – you increase the rate of accidental death and reduce the rate of death from wounds and disease, implying that the younger the age, the more likely you are to encounter one of these additional dangers – and that appears plausible. But it’s just an assumption, and it could very well go either way. Make it anyway, for the sake of argument, and let’s look at the results:
The Population Breakdown
With our base assumptions and something vaguely approaching a historical foundation in place, we can generate a demographic breakdown:
- 40% die before age 10 (4 in 10). 60% reach 10 years old (6 in 10).
- 30% of this 60% die before 20 = 3/10 of 6 in ten = 18 in 100. The other 70% survive = 7/10 of 6 in ten = 42 in 100.
- 50% of the 42 in 100 die before 30, = 21 in 100. The same amount survive.
- 70% of the surviving 21 in 100 will die before 40 = 147 in 1000. 30% survive = 3/10 x 21/100 = 63 in 1000.
- 80% of the surviving 63 in 1000 will die before 50, so 20% will survive = 1/5 x 63/1000 = 63/5000.
- 90% of the 63 in 5000 die before 60, so 10% will survive = 63/50,000.
- 95% of the 63 in 50,000 die before 70, so 5% will survive = 63 in 1,000,000.
- 96% of the 63 in 1,000,000 die before age 80, so 4% will survive = 126 in 50,000,000.
- 97% of the 126 in 50,000,000 die before age 85, so 3% survive = 378 in 5,000,000,000.
- 98% of the 378 in 5,000,000,000 die before age 90, so 2% survive = 378 in 250,000,000,000.
- 99% of the 378 in 250,000,000,000 die before age 92, so 1% survive = 378 in 25,000,000,000,000.
…and so on.
Application to a typical population
Now multiply those by a population base – let’s say, 100,000 people.
- 40,000 will be <10, 60,000 will be 10+.
- The 60,000 are made up of 18,000 aged 10-19 and 42,000 aged 20+.
- The 42,000 are made up of 21,000 aged 20-29 and 21,000 aged 30+.
- The 21,000 are made up of 14,700 aged 30-39 and 6,300 aged 40+.
- The 6,300 are made up of 5,040 aged 40-49 and 1,260 aged 50+.
- The 1,260 are made up of 1,134 aged 50-59 and 126 aged 60+.
- The 126 are made up of 119.7 aged 60-69 and 6.3 aged 70+. That doesn’t make a lot of sense, so round the numbers to 120 and 6 for practical usage.
- The 6.3 people are made up of 6.048 people aged 70-79 and <1 person older than 80 – though we are now well within the 0.3 in 100,000 rounding error. So leave it be at 6 people aged 70+.
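The cascade above is easy to mechanise. A minimal Python sketch (the death rates are the article’s assumed values, not real data) reproduces the 100,000-person breakdown:

```python
# Fraction of each cohort that dies within each ten-year bracket,
# using the article's assumed "vaguely historical" death rates.
death_rates = [0.40, 0.30, 0.50, 0.70, 0.80, 0.90, 0.95, 0.96]
bracket_starts = [0, 10, 20, 30, 40, 50, 60, 70]

population = 100_000
alive = population
breakdown = {}
for start, rate in zip(bracket_starts, death_rates):
    died = alive * rate      # deaths falling within this bracket
    breakdown[start] = died  # people whose age lies in this bracket
    alive -= died            # survivors carried into the next bracket

# breakdown[30] is the 14,700 aged 30-39, breakdown[50] the 1,134
# aged 50-59, and so on; `alive` ends as the 80+ remnant (about 0.25
# people in 100,000 - well within rounding error, as noted above).
```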
The result is a population curve which is noticeably bunched up toward the lower end of the scale – rather different from the bell curve or the completely flat line that either of the generation methods discussed earlier would produce.
The Next Step Not Taken
In my youth, I would have gone on to plot these results on a graph, and then perform a mathematical analysis to derive a complex equation describing the exact percentage of the population for any given age (to fill in the missing points on the graph), then converted the results into a table for generating a randomly rolled age.
Of course, if we simply assume a flat distribution of possible results across the sub-range of ages specified, we can get a simpler answer far more quickly – a d1000 for the age band, and then a d10 for range within that age band. But for the purposes of this article, even that is going further than we have to.
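As a sketch of that quicker method – this is my own illustration, not anything from a rulebook – the band weights below are the 100,000-person breakdown rescaled to thousandths, with the sparse 60+ tail lumped into a single band so that a d1000 covers everything:

```python
import random

# (weight in 1000, bracket start) - rescaled from the 100,000
# breakdown, with everyone aged 60+ folded into the final band.
bands = [
    (400, 0), (180, 10), (210, 20), (147, 30),
    (50, 40), (12, 50), (1, 60),
]
assert sum(w for w, _ in bands) == 1000  # a d1000 always lands somewhere

def roll_age(rng=random):
    roll = rng.randint(1, 1000)               # the "d1000" for the age band
    for weight, start in bands:
        if roll <= weight:
            return start + rng.randint(0, 9)  # the "d10" within the band
        roll -= weight
```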
Elves with a 60% longer lifespan
To be honest, with all the social impacts of being long-lived, I can’t really see elves having more than a +60% lifespan over humans without the difficulties becoming insuperable. Doesn’t sound like a lot, does it? But let’s apply it and see what effects it would actually have on the demographic.
Because the dangers faced by the young would be much the same for humans and elves, I’m not going to apply the full factor to the young. Instead, the brackets get factors of: times 1, times 1.2, times 1.4, and times 1.6 thereafter.
So:
- 10+ stays 10+.
- The ten-year gap between 10+ and 20+ becomes a 12-year gap to 22+.
- The ten-year gap between 20+ and 30+ becomes a 14-year gap – but it now starts at 22+ and runs to 36+.
- The ten-year gap between 30+ and 40+ becomes a 16-year gap, but it now starts at 36+ and runs to 52+.
- All the subsequent age brackets are also 16 years of length.
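Those stretched bracket boundaries can be generated mechanically – a minimal sketch under the graduated factors just stated:

```python
# Graduated stretch factors: x1 for the 0-10 bracket (unchanged),
# then x1.2, x1.4, and x1.6 for every bracket thereafter.
factors = [1.0, 1.2, 1.4] + [1.6] * 4
boundaries = [10]            # the 10+ boundary is shared with humans
for f in factors[1:]:
    boundaries.append(boundaries[-1] + round(10 * f))

# boundaries -> [10, 22, 36, 52, 68, 84, 100]
```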
That gives a population breakdown of:
- 40,000 will be under 10, 60,000 will be 10+.
- The 60,000 are made up of 18,000 aged 10-21 and 42,000 aged 22+.
- The 42,000 are made up of 21,000 aged 22-35 and 21,000 aged 36+.
- The 21,000 are made up of 14,700 aged 36-51 and 6,300 aged 52+.
- The 6,300 are made up of 5,040 aged 52-67 and 1,260 aged 68+.
- The 1,260 are made up of 1,134 aged 68-83 and 126 aged 84+.
- The 126 are made up of 120 aged 84-99 and 6 aged 100+.
That’s what a 60% increase in the lifespan looks like. For any given calendar age, you get more elves alive of that age than you do humans. In the bracket containing 75 years of age, for example, you have 6 humans (aged 70+) in every hundred thousand, and 1,260 elves (aged 68+).
To reduce the population levels of both to match – 6 in both – you find that elvish communities are one 210th the size of comparable human communities – so a city of 20,000 people would be the same as a ‘city’ of 95 elves. And a town of 2000 humans would be the equivalent of a group of 9-10 elves.
A correction
Actually, that’s not quite correct. In both cases, we’re aiming for an age range – to get an absolutely correct comparison, we should divide that age range up. So the 6 humans are actually 6 aged 70+ (with, effectively, none older than 80, according to our earlier calculations). So that means 0.6 of them will be exactly 70 years of age.
The elvish age bracket containing age 70 applies to 1134 people out of 100,000, and runs from 68 to 83, a span of 16 years – so 1134 / 16 gives 70.875 people out of 100,000 aged exactly 70. To get that back to 0.6 people, we have to divide the elvish population by a factor of 70.875 / 0.6, or 118.125.
That means that a city of 20,000 humans is as common as a “city” of 20,000/118.125=169 elves. A town or village of 2,000 humans is as common as a “town” of about 17 elves.
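The correction boils down to comparing per-year densities; a quick cross-check in Python, using the figures above:

```python
# Per-year density of 70-year-olds in each population (per 100,000),
# assuming ages are spread evenly within each bracket.
human_70s = 6 / 10       # 6 humans aged 70-79, spread over 10 years
elf_70s = 1134 / 16      # 1,134 elves aged 68-83, spread over 16 years

ratio = elf_70s / human_70s   # ~118.125: elvish communities this much smaller
city = 20_000 / ratio         # ~169 elves to a human city of 20,000
village = 2_000 / ratio       # ~17 elves to a human town of 2,000
```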
400-500 years?
These differentials would be even more extreme if the 400-500 year lifespan model were applied. You would end up with the average Elven “city” containing only a couple of dozen people, and villages a bare handful.
Don’t believe me? Well, let’s have a go.
The conversion factor
So, to start with, we want to graduate from x1 to x5 smoothly. The square root of 5 is 2.236, and the square root of that is 1.5, near enough. So, let’s say the factors are:
- times 1;
- times 1.5;
- times 1.5 x 1.5 = times 2.25;
- times 2.25 x 1.5 = times 3.375;
- times 5, thereafter.
Our ten-year population intervals become:
- 10 years;
- 15 years;
- 23 years;
- 34 years;
- 50 years, thereafter.
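A sketch of this graduated stretch, rounding half-up as the article does (so 22.5 becomes 23):

```python
# Factors graduate from x1 to x5 via successive square roots of 5
# (~1.5 per step), then hold at x5 thereafter.
factors = [1.0, 1.5, 2.25, 3.375, 5.0, 5.0, 5.0]
spans = [int(10 * f + 0.5) for f in factors]  # round half up: 22.5 -> 23
boundaries = [10]
for s in spans[1:]:
    boundaries.append(boundaries[-1] + s)

# spans      -> [10, 15, 23, 34, 50, 50, 50]
# boundaries -> [10, 25, 48, 82, 132, 182, 232]
```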
And that gives, from a standard 100,000 breakdown:
- 40,000 will be under 10, 60,000 will be 10+.
- The 60,000 are made up of 18,000 aged 10-24 and 42,000 aged 25+.
- The 42,000 are made up of 21,000 aged 25-47 and 21,000 aged 48+.
- The 21,000 are made up of 14,700 aged 48-81 and 6,300 aged 82+.
- The 6,300 are made up of 5,040 aged 82-131 and 1,260 aged 132+.
- The 1,260 are made up of 1,134 aged 132-181 and 126 aged 182+.
- The 126 are made up of 120 aged 182-231 and 6 aged 232+.
…and so on.
With 70 years being our standard of comparison, we have 6 humans in 100,000 and 14,700 elves in roughly that time-span. Dividing the 6 humans into the 10-year span gives 0.6 people in 100,000 being exactly 70 years old, while dividing the 14,700 elves into the 34 year age span gives 432-and-a-fraction elves exactly 70 years old out of every 100,000. Reducing elvish populations so that both groups have 0.6 members in 100,000 who are aged exactly 70 years gives a ratio of 720.6.
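The same per-year-density comparison, re-run for this lifespan model (my own cross-check of the figures above):

```python
# Per-year density of 70-year-olds (per 100,000) under the
# 400-500 year lifespan model.
human_70s = 6 / 10        # 6 humans aged 70-79, spread over 10 years
elf_70s = 14_700 / 34     # 14,700 elves aged 48-81, a 34-year bracket

ratio = elf_70s / human_70s   # ~720.6
city = 20_000 / ratio         # ~28 elves to a human city of 20,000
village = 2_000 / ratio       # ~2.8 elves to a human village of 2,000
```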
So an elvish city of “20,000 humans” would contain about 28 elves, and a village of “2000 humans” would be the equivalent of an elvish village of… three. Most of the time. Actually, 20% of the time, it would only be two elves.
Conclusion
Plucking numbers out of the air for lifespan is all well and good, but if you don’t know what you’re doing, the implications can overwhelm your game setting. Or, if they are not taken into account – something few people take the time and trouble to do – they can completely demolish the plausibility of the game setting when someone else hits you between the eyes with some hard questions.
One Caveat: I don’t have any actual population demographics for the calculations shown here, especially for those specified in the section The Human Analogue In A Fantasy Campaign. These are simply numbers that seem about right from the many sources and references that I have read in the past. More accurate data would yield more accurate analysis and projections – but the results ‘feel’ right, as they stand. So you can take them with a grain of salt – but I’ll use them until something more accurate presents itself.
May 1st, 2012 at 11:58 am
I am afraid I totally do not understand how you end up with elven “cities” of 169. Our world has hardly seen a population collapse with longer lifespans (quite the opposite). You still need a certain density of boots on the ground to tend fields, maintain buildings, and so on, and I do not see how the longer lifespan of elves changes that.
May 1st, 2012 at 12:53 pm
@Sean: I’m not sure I can explain it any differently or more clearly, but I’ll try.
The calculations show that in a population of 100,000, with age brackets adjusted for an elvish lifespan 60% longer than human, there will be 1134 elves aged between 68 and 83, and we have agreed that for simplicity, we will assume that those ages are equally represented within that age bracket. That’s a span of 16 years, so the number of elves aged 70 would be one sixteenth of the 1134, or 70.875 (out of 100,000).
A human population, with human aging, has only 6 people in the age bracket 70-79. Again, we assume for simplicity that these ages are evenly distributed, so the number of people aged exactly 70 is 0.6 out of 100,000.
I then calculate the ratio between these so that the relative population density of elves that will yield exactly the same number of people aged 70 years is revealed. Why 70? Because that’s the youngest age at which “venerable” can reasonably be applied to a human. So 70.875 divided by 0.6 gives 118.125.
That means that elvish communities that are as prevalent as a human community of size X, as measured by the number of 70-year-olds within each, will be slightly under one 118th the size, all else being equal.
So a human city of 20,000 would be the same thing as an elven city of 20,000 / 118.125 = 169.3 elves – IF the criterion is that both have the same average number of 70-year-olds.
So, let’s now assume that we don’t go for equity – that elves have cities of 20,000 citizens, the same as humans do. The same ratio should then apply to the number of cities they have, if the overall population of elves is to equal that of humans – so for every elvish city of 20,000 citizens, there should be 118.125 human cities of that size.
What the point of equity is, is up to the GM. It might be that 120-year-old elves are as numerous as 70-year-old humans. The point of the article isn’t to say that “elvish cities will only have 169 people if they live 60% longer than humans”; it’s to show GMs how to model the breakdown from a total population of X into a relative proportion of population groups, so that some estimation of the social impact can be made.
What the numbers show, with a basis of +60% lifespan and an equality of 70-year-old citizens, is that most elves would live as pairs – and, given the preponderance of children, that must mean single-parent households. How do they prepare enough food? Do they gather in a smaller number of larger communities than is normal for humans? Those are questions the GM can answer – but these calculations show how to alter the balance between the number of communities and their relative size.
I could also ask the question, “what is a city?” and get the answer: a central hub for commerce, industry, education, administration, and defence, with many more citizens than the next size of community down. So if an Elven city has a permanent population of 169, that simply means that they have fewer people fulfilling those functions on behalf of the “elvish state” – implying that more of these functions are decentralized and independent of any overall authority structure. Another way of phrasing the results might be that a community of 169 elves is of equal importance to the elvish “nation” as a city of 20,000 is to a human Kingdom.
If you wanted to, you could double the community size and make them half as common – or make them ten times the size and one-tenth as common.
Oh, and I wasn’t saying that there would be a population collapse because of the longer lives – I was saying that if the two populations are to be roughly equivalent in power and expertise and ability to exert authority, the long-lived HAVE to be reduced in number relative to the short-lived.
GMs and game systems can blithely toss off any scale of increased lifespan on the part of a non-human race that they see fit to use – but rarely if ever do they actually consider the consequences of those numbers. If there are experts out there with three or four hundred years of experience – in trade, in politics, in whatever – I can’t see how they wouldn’t be running things. At the very least, they would be the powers behind the thrones. Who would you want commanding your armies?
Only by applying the principle of “Can’t Be Everywhere” as a limitation to the effectiveness of the long-lived (i.e. lower numbers) can any sort of equity in power be reached.
Mike recently posted..The Age Of An Elf: Demographics of the long-lived
May 2nd, 2012 at 10:11 am
Who would I want commanding my armies? Someone I can trust, if I am a human king. That is unlikely to be an elf.
Experience gives an edge, yes, but even experts make mistakes. Also, who is to say that elves do not spend all of their time being the best X (be it shoemaker or sage) rather than branching into other areas? And ‘can’t be everywhere’ is a huge limitation when trying to maintain networks, oversee investments, and so on.
Further, for some things experience does not reduce the time and effort needed more than marginally. Take farming: you still need to plant, weed, and harvest; that does not change. Same with herding, weaving, and so very many components of life at the lower tech levels.
May 2nd, 2012 at 12:04 pm
Your point about trustworthiness is a good one. I should have said, “all things being equal”, implying that both would be equally trustworthy.
The problem with elves as “the ultimate expert in [X]” is that this then forms a double standard with reference to PCs vs NPCs. So, while your point regarding experience is a valid one, it’s one that creates as many problems as it solves.
I have the most trouble with your final paragraph, since it implies that all races perform such activities in exactly the same way – plant, weed, harvest, repeat. If elves do not follow the human pattern of activity – if they do not farm so labor-intensively – this argument goes out the window, which is exactly the point I was making about “implications for society” in the article. “Why does your garden not have weeds?” “I asked them not to grow there.”
Change the assumptions of society and you change the shape of that society. More importantly, if the demographics indicate that the shape of the society has to change, that simply tells you what the effect of those changed assumptions has to be.
If you have a substantial urban centre that comprises only 169 individuals, most of them children, then the society will have to change, because those numbers are not enough to support a human-style settlement. Rather than intensive farming, they might rely more on gathering wild fruit and nuts for their food supply – mandating a lower population density. The questions would then become, “why don’t they adopt the more efficient human techniques?” and “what are the other impacts on society and culture of such low population densities?”. The answers to the first could be anything from the purely philosophical to a greater awareness of living things – a sort of extreme vegetarianism.
Yes, there will be activities in which the time and effort required are not reduced significantly by experience. But the time and effort required are not fixed, because they are also proportional to population levels. In essence, fewer people equals less demand.
And if you do need more people to maintain a trade empire, you can always simply employ humans. Organizational structures do not have to maintain racial purity standards.
Lastly, “Can’t Be Everywhere” is INTENDED to be a huge limitation. Because the alternative is for the experience and expertise of the long-lived to overwhelm human societies. Because, in most fields, experience and expertise make huge differences to outcomes. The trustworthiness of a general makes no difference if he gets his head handed to him (along with those of his army) by a 400-year-old mercenary tactician.
May 3rd, 2012 at 8:26 pm
A couple of things about long-lived races and knowledge.
While it might be nice to imagine that we can store all the information we encounter and stay current enough to remain an expert in it, that just isn’t the case.
Take, for example, how much of your high school math you remember. How much do you remember of your 10th birthday, or any birthday you had 15 or more years ago? Chances are, not a lot. The same weakness applies to long-lived races. Sure, an elf may have been an expert in medical science (or its equivalent) in the 1960s, but since then he has moved on to studying philosophy. If we talked to that elf now about medical science, how much would still be relevant? How much would he actually remember?
Long-lived races are great, but let’s face it: the knowledge they have will quickly become outdated, particularly as their sense of time and urgency will differ from the ‘human’ perception. Sure, Mr Elf Trader from 1602 knew a lot about the trading of tin from Britain – he may have known every town, relevant wagoneer, customs officer, tax rate, and so on. However, his perception of time does not make it easy for him to accept that John Teamster retired after working a mere 20 years.
Age changes the perception of time. It is clear even in humans: 5 minutes to a 5-year-old is a lot longer than 5 minutes to a 30-year-old, and a year to a 19-year-old doesn’t pass as quickly as it seems to for a 60-year-old. For an elf (assuming the 1200-year lifespan from D&D) this would be much the same – human life would seem unnervingly fast.
In terms of population rates, your numbers assume that the percentage chance of accident and death remains the same over a longer lifespan. In reality, the death rate is likely to stay level until death from the effects of aging (weakening of heart muscles, etc.) becomes a factor. In that regard, elves will likely not suffer the weaknesses of age as early. They are probably just as likely to die of disease and accident before they reach 10 years of age (maybe more likely, as they will have a child’s frailties for longer). Your analysis fails to take those factors into consideration.
There are other things too, such as the lower birth rates generally attributed to elves. What if elven women only ovulate once per year? The chance of pregnancy drops dramatically. This means that any hazards that occur are likely to take a lot longer to recover from. The population of elven cities would still be the same as human ones; there would just be far fewer cities. Again, this fits with the typical fantasy meme.
There are a lot more factors to aging, and to gathering (and then retaining and keeping up to date) knowledge, than your analysis takes account of.
I do agree that demographics and game play are important, but by the same token the standard fantasy concepts about elves and other long-lived races usually get most game masters out of too much of a bind. Elves generally don’t care much for human history; it’s too fleeting, its kings change too fast, and its wars are insignificant unless elves were threatened. Even then, to the vast bulk of elves still out adventuring, the details of a war that happened 400 years ago are likely to be like the details of the Vietnam War or Korean War to me: I know a bit about them from history, and my parents lived through them, but an exact or comprehensive recollection of either war is unlikely. The 1st Gulf War took place when I was a teenager; I remember watching Patriot missiles fire off on CNN. I remember snippets, but I couldn’t for the life of me write an account beyond the basics of what happened. I would struggle with dates, events, and anything but the most basic information. It was a big deal at the time – there was the possibility of WW3. Even with mass media coverage of these events (something unlikely in a world with elves), I wouldn’t know enough to make an impact on most hypothetical plots.
Sorry got sidetracked.
May 3rd, 2012 at 11:05 pm
@Chaede: Wow, there’s a lot of meat in that contribution, thanks!
To some extent there is truth in your statement about losing the ability to work skills if you don’t maintain them. Most game systems ignore this point, however, and with good reason: such rules are almost always impractical in play. However, there is also merit in the counterpoint, which is that knowledge doesn’t go out of date in a medieval or pseudo-medieval setting with the same frequency that it does in a modern setting. The last estimate I saw had the sum of human knowledge doubling every 4 years or so, and it is this rate of advancement that characterised the 20th century. In the medieval period, it was much slower, and there might be only one advance in 200 years, making it much easier to refresh your recollection with a single well-chosen reference or a couple of days spent on a refresher course. Adding to that consideration is this: just because you have forgotten how to do something that you once knew does not mean that you are starting from scratch when the time comes to re-learn it. I don’t know of any experiments to determine how quickly such lost knowledge can be reacquired, but my own experience suggests that it is ten to one or faster relative to the original learning curve.
It is also true that memory and memory training differ between the typical modern person and those from an era when literacy was less common. Because modern man has less need to remember events, his memory is relatively feeble and undeveloped; it was not unusual for people in a medieval or pseudo-medieval setting to be able to memorise whole tracts of the bible (or other holy book) the first time they heard them. In fact, one of the Greek philosophers forewarned of this effect when the written word was first invented, and we heard the same warnings when the printing press came along, and when the internet happened. You can also see the same effect with basic arithmetic – I can still do long division with pen and paper, or work out a square root, or whatever, but most of my contemporaries can’t, and most of the next generation – my nieces and nephews and cousins, all about your age – were never taught how in the first place, so ubiquitous are electronic calculators.
Your second point relates to perceived time vs chronological time, and I would submit that this is one of those great philosophical debates that psychologists love to play with. All sorts of things go into it, from excitement levels, perceived danger, and levels of interest in the activities, to shared vs solo experiences – I think it is probably oversimplifying to compare an infant’s perception of a span of time with that of an older adult. From what little I know on the subject, this would seem to be a case in which practical game design and adventure construction has it right: there are some occasions when each period of X duration is significant (seconds, minutes, hours, days, weeks, months, years) and other occasions when life is more unhurried and the span can flow past without a ripple, seeming far shorter in retrospect than it did while actually being experienced. The nature of a fantasy game setting, with its myriad dangers that compel attention, actually works against your argument here. A bucolic, pastoral, sheltered existence permits the years to flow past without a ripple, while regular interventions of danger in different forms keep people focussed on events, and hence on the passage of time.
Your third and fourth points concern accidental death and birth rate. Since these are exactly the sort of consequences for racial and cultural design that I was pointing to as flowing from the combination of long life and the need for some level of balance between the different populations, the suggestions that you make and the assumptions that you point out are certainly relevant. The question is whether it is better to establish a lifespan and then work backwards to determine a set of base assumptions that are compatible with that lifespan, or to start with the base assumptions and determine what the resulting lifespan and demographics are going to be. I would submit that the first approach is much easier than the second – but that blindly throwing out numbers for both is unlikely to yield a satisfactory answer, and that’s what I feel most game designers do (especially in D&D). Regarding accidents and accidental death, I would add that much of the reason for an increase in accidental death as we age relates to increasing infirmity and the accumulated effects of poor health & nutrition, and that these factors are somewhat compensated for by the existence of healing magics and the like. One of the first GMs I came to know when I got started in this hobby assumed that this factor made health standards, and hence aging, more akin to those of the 20th century than the medieval, just as the existence of druids would have effects on the food supply equivalent to those of modern agriculture, and I can see his point. However, I think that his statement is too general and sweeping. In general, then, suffice it to say that people will be healthier and more robust, and hence better able to fight off disease and infection, historically the leading causes of death; that this would promote accident and violence as leading factors in mortality; and that clerical intervention would make these more all-or-nothing.
The only reasons people would die from accidents are that a cleric could not get to them in time, or that there were so many people suffering that triage – based on social importance and rank – was needed, and the character in question was one of the unlucky ones.
Your final point begins with the statement,
before summarising the points of view and resulting social perspectives made earlier. The objective is not to get the game master out of trouble, it is to provide them with the tools to improve their game so that they don’t get into trouble in the first place, and so that they can take control of their games and present a vision of the game world that is uniquely theirs. Musicians don’t aim to be “just like everybody else” when they write and compose music, they want to bring something new and interesting to each recording; athletes don’t just aim to equal their best past performance, they work on being able to better it (because all their rivals are also doing so, and if they don’t constantly improve they will be left behind). Why should GMs aspire to anything less?
May 4th, 2012 at 12:30 pm
Say you have two characters of equal level who both say they want to be the “best archer ever”. Both focus on this and spend all of their time doing it. Shouldn’t the person who’s put a century into it be better than the guy who has put in ten years? Reality says yes, game balance says no. This also breaks the assumption that elves and dwarves are just really slow at learning. That assumption would put starting characters on the same level, but it would mean that they’d advance more slowly.
As for cultural differences, you then have the problem of the elf who was raised by humans and did things with a human’s intensity. They should logically be better than a human, due to time. PCs are often different from the majority of the people of their culture, so any explanation has to cover the outliers as well.
My solution is to have all of the races age much the same into middle age – perhaps a difference of 25% for the long-lived races. After they start taking penalties, the aging of the other races slows down. And I don’t let people play older characters of long-lived races.
May 4th, 2012 at 7:05 pm
@Philo Pharynx: I don’t understand how you go from “reality says yes, game balance says no” to “This also breaks the assumption that elves and dwarves are just really slow at learning”. What breaks that assumption so far as I’m concerned is the fact that PCs from long-lived races acquire skills and feats at the same rate as human characters – and there is nothing age-related about the process. You allude to this very situation with your next sentence, so I can only presume that this is what you meant by your comment.
Regarding the cultural effects of an elf raised by a human – that’s a very good point and one that (perhaps surprisingly) had not occurred to me, prior to your comment. Suddenly, there are no easy answers, except to say “it’s broken and that’s the way it is” – or to say “the book must be wrong”.
The assumptions I made in determining a human-to-elf population ratio included the assumption that they learned at the same rate, one that was challenged by other respondents. Your comment restores the validity of the entire article – so thanks for that!
Logically, the alternative would be to impose some handicap on learning – one proportional to the extended lifespan would be ideal, but a compromise may be necessary. Why, lo and behold, that’s exactly what AD&D used to do – impose an XP penalty on certain races. Not as great a penalty as there perhaps should have been, but at least a genuflection in the direction of verisimilitude. When those penalties were abolished (and I can understand why the change was made), it fundamentally changed the nature of the elves and other long-lived species – without actually changing their descriptions at all. That was the point at which the long-lived demographics became broken. Will this be fixed in 5th edition, or will it be overlooked – again? I guess we’ll have to wait and see!
May 4th, 2012 at 9:57 pm
[…] The Age Of An Elf: Demographics of the long-lived […]
May 12th, 2012 at 3:25 am
I thought I would throw in a couple of relevant paragraphs from M.Y.T.H. Inc. Link by Robert Asprin. If I had been able to locate them at the time, I would have quoted them as part of the main article, but I guess it’s better late than never.
June 25th, 2012 at 5:25 am
[…] explores The Age of an Elf: Demographics of the long-lived, where he shows that demographics as indicated by demihuman lifespans in D&D are pretty […]
August 12th, 2012 at 6:29 am
Why do people keep making this same mistake?
That thing you did in the middle, where you made the number of 70-year-old Elves and Humans the same? That’s rubbish. You make the populations of adult Elves some fraction of the population of adult Humans to suit the setting canon, and then you set the birth rates and death rates of Elves so that works.
Yes, longer-lived species need a better annual survival rate to keep a stable population. Well spotted. Have a cookie. That thing where Elves “learn slower” is 3e D&D canon used to explain the “balanced” races, 2nd edition gave them bonus NWPs and multi-classing to express their extended learning time instead.
Also, where did you get those survival rates for medieval people? They look far too low. Life expectancy at birth for humans is historically 25-40, 30 in medieval times, and life expectancy for 20 year olds is around +45 years as most deaths occurred to infants and the elderly. Yours are ~17 years at birth and obviously just +10 years at age 20. That’s got to be wrong.
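The life-table arithmetic behind figures like these is easy to sketch. The survival rates below are hypothetical placeholders (neither Tussuck’s nor the article’s actual numbers); the point is simply that heavy infant mortality drags life expectancy at birth far below the remaining expectancy of anyone who survives to adulthood:

```python
# Sketch: life expectancy from a year-by-year survival curve.
# The rates here are invented for illustration only.

def life_expectancy(survival, from_age=0):
    """Expected further years of life at `from_age`, given a list of
    annual survival probabilities indexed by age."""
    alive = 1.0
    expect = 0.0
    for p in survival[from_age:]:
        alive *= p          # chance of also surviving this year
        expect += alive     # each year survived adds ~1 year of life
    return expect

# Hypothetical curve: 70% survive infancy, high childhood mortality,
# steady adult survival, then sharp decline in old age.
curve = [0.70] + [0.95] * 4 + [0.98] * 55 + [0.85] * 20

at_birth = life_expectancy(curve)        # dragged down by infant deaths
at_twenty = life_expectancy(curve, 20)   # remaining years for an adult
```

With this (invented) curve, the remaining expectancy at 20 exceeds the expectancy at birth – the same shape Tussuck describes for real medieval populations, even though the specific values differ.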
What else? Ah yes, maximum age works fairly well as, say, 65+2d20 for modern humans, and there’s not a lot wrong with defining starting ages similarly as you can fit the chance of “starting” at each age to fit any such curve. You don’t need a population table to pick an individual because they don’t need equal odds of being picked.
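For what it’s worth, the suggested 65+2d20 produces maximum ages bounded by 67 and 105 with a triangular distribution centred on 86 – easy to verify with a quick simulation (Python used purely as illustration; the formula is the commenter’s, not any rulebook’s):

```python
import random

# Sketch of the proposed "65 + 2d20" maximum age for modern humans.
# Summing two dice gives a triangular distribution, so extreme maximum
# ages are rarer than middling ones.
random.seed(42)

def max_age():
    return 65 + random.randint(1, 20) + random.randint(1, 20)

sample = [max_age() for _ in range(100_000)]
lo, hi = min(sample), max(sample)    # bounded by 67 and 105
mean = sum(sample) / len(sample)     # expected value is 86
```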
August 12th, 2012 at 8:42 am
@Tussuck: Errr, because it’s not a mistake?
You can’t make a comparison without a common basis. The number of elves and humans was set to the same value in order to illustrate the relative consequences of increased lifespan, with the intent of determining what multiple of that value was required to achieve a stable population – i.e. the relative population levels of humans and elves based on their lifespans.
“You make the population of adult Elves some fraction of the population of adult Humans to suit the setting canon” – so any fraction other than 100% is acceptable? “…and then you set the birth rates and death rates of elves so that works” – and how do you arrive at these numbers while still respecting the resulting average lifespan?
As you can see, I can do sarcastic as well as you can. But it’s not particularly conducive to any sort of genuine dialogue, particularly when it’s clear that you have simply skimmed the article rather than reading it fully, filling in the blanks with your own presumptions. If you had read it, you would have realised that the need for a greater survival rate was not an observation; it was defined as a necessary assumption. The question being asked was “how much better?”, given the other demographic elements that were provided or determined.
Also, as I said in the article, the survival rates for medieval peoples were something I was uncertain of. They were a compound of recollections of history studies, many other articles and books, many documentaries, and the impression that all of these sources collectively gave me. I have since found a Wikipedia page that gave far better and very different statistics. Based on the information provided there, your figures are roughly correct.
I’ve seen the “learn slower” proposal raised in discussion since the days of AD&D (1981). It’s not a 3.x invention. RPGs did exist before 2e AD&D, you know.
‘What else?’, as you put it. You can pick whatever maximum age you want and whatever starting age you want and the odds of picking the ages that match the demographics that fit your campaign canon are one in a billion or so. The whole point of the article was to try and arm GMs with the information they need to establish appropriate survival rates, associated causes of population loss, etc, to fit the campaign setting that they wanted – or to set these values as they saw fit and determine the demographic consequences for the total population in question.
Finally, having created a society that integrates with the campaign world you have created and is subject to the demographic controls that are implied as a consequence – or having evaluated the demographic controls that you want and determined what pressures the resulting population would experience – you end by suggesting that the GM completely ignore all this in the case of any given individual, rather than making them fit within the boundaries of their race and native society? Why bother creating anything in the first place if your activity is not going to more closely integrate the PCs and their adventures with the world? And why bother reading, and responding to, articles that attempt to equip GMs with the tools, processes, and understanding to achieve such creation effectively?
Would I use these analyses myself? It would depend on the campaign. I would tend to pick what I wanted to be the demographic controls on the PC members of the race and then consider the consequences, rather than defining a population level and stability for the race and then determining the impact on PC representatives – unless I had something specific in mind. Would other GMs like to have the option of choosing for themselves? Almost certainly, at least some of them would. If you’re not interested, or if the article doesn’t address your needs as a GM or the expectations that you had of it for some reason, then why not state what you were looking for, and hope to inspire a sequel article that DID give you what you wanted? That would at least have been a productive line of conversation. Or is that the reason you didn’t do it?
October 31st, 2013 at 4:29 am
First off, maybe I should apologize for resurrecting an old post – it’s just a really excellent analysis that I still feel compelled to comment on.
The reason is primarily to point out that your methodology in using age 70 as the baseline for comparison may be flawed due to exceptionalism. The values you calculate for relative settlement sizes are amplified because the human value is so small. If you take the baseline at just about any other age, the values can be dramatically different.
To demonstrate, I basically just did what you said you’d have done in your youth – I charted the results, broke them down into yearly cohorts (using your numbers – so each year was an equivalent fraction of your cohorts, rounding off to avoid fractional people), and ran the same comparison for all years for which humans had data (0-75). Picking Age 70 (or anywhere 70-75) as the baseline yields a settlement factor of 71 – so a human settlement must be 71 times the size of an elven settlement. But if you pick 68 or 69, the factor is only 5.9 (12 humans vs 71 elves). Picking any year from 10-29 or 36-39 yields a factor below 1, meaning that the human settlement is actually smaller than the equivalent elven settlement. Averaging over the entire range (0-75) yields a settlement factor of 9.71 – so your average human town should be roughly 10 times the size of an elven (+60%) town – i.e. a human village of 2000 is comparable to an elven village of 200.
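Brandon’s procedure is easy to reproduce in outline: count the survivors of each race at a chosen baseline age and take the ratio. The survivor counts below are invented placeholders rather than the article’s actual tables; only the method follows his description:

```python
# Hypothetical survivors-per-1000-births at selected ages.
# The real figures come from the article's tables and will differ.
human = {0: 1000, 20: 520, 40: 300, 60: 90, 70: 10}
elf   = {0: 1000, 20: 900, 40: 800, 60: 700, 70: 650}

def settlement_factor(baseline_age):
    """Elves per human surviving to the chosen baseline age.
    A high baseline age inflates the factor because so few humans
    survive that long - Brandon's 'exceptionalism' point."""
    return elf[baseline_age] / human[baseline_age]

old_baseline = settlement_factor(70)    # huge: almost no humans reach 70
young_baseline = settlement_factor(20)  # modest: most of both races reach 20
```

Even with made-up numbers, the factor at age 70 dwarfs the factor at age 20, which is why the choice of baseline dominates the result.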
Running the same stats on your elven extreme example – you end up with a settlement factor of 40 – so your average human village of 2000 is comparable to an elven village of 50.
Of course, these stats ignore the elves over age 75 (the highest age cohort with any representative humans), and over-emphasize the values where there are few humans (close to a “divide by zero” error). If we instead reverse the ratios (#humans/#elves) and average over those results, the Elf(+60%) ratio is only .54 – so a human village of 2000 is comparable to an elven village of 1087, while the Elf(Extreme) ratio is .3 for an elven village of 597.
Maybe I’m biased, but these numbers feel a bit more realistic to me. They especially feel more realistic if you consider armies instead of settlements – an elven army of 600 (or 1100) “feels” reasonably comparable to a human army of 2000. But an elven army of 3 or even 17 doesn’t seem at all comparable to a human army of 2000. I have a really hard time imagining such a conflict with an elven victory.
I’m still not entirely sure that “number of citizens at a given age” is the only, or even the most important, factor in comparing civilizations. I’m also not entirely sure that the age demographic profiles of two radically different races (species?) would necessarily be the same. But I don’t feel a need to debate these assumptions.
I do see the advantage of substantial experience – the elf who spends 100 years at a task is obviously going to be better than the human who spends 10. But, due to diminishing returns, the elf is not going to be 10 times better (I’d estimate he’d be lucky to be even twice as good). Toss in a modicum of skill atrophy, and these numbers seem far more accurate than the ones above.
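To put a number on that intuition: if skill grows logarithmically with practice time (purely a hypothetical model – no published system works this way), then tenfold experience yields a fixed additive gain rather than a tenfold one, which lands very close to the “lucky to be twice as good” estimate:

```python
import math

# Hypothetical diminishing-returns model: skill rises with the
# logarithm of years practised, so each tenfold increase in time
# adds a constant increment instead of multiplying skill tenfold.
def skill(years, rate=1.0):
    return rate * math.log10(1 + years)

human_skill = skill(10)           # a decade of dedicated practice
elf_skill = skill(100)            # a century at the same task
ratio = elf_skill / human_skill   # just under 2, nowhere near 10
```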
Granted, it’s a rare system that models either diminishing returns or skill atrophy (mostly to avoid unnecessary complexity) – but if you’re willing to grant that most systems are meant to model the exceptional few rather than the populace as a whole, it seems plausible to assume that the average citizen does suffer both effects, and thus dramatically extended life expectancy is not as dramatic an advantage as presented above.
(As an aside, using the original data – the human median age is 35, the elf(+60%) median age is 60, and the elf(extreme) median age is 107 – which are probably not bad estimates for life expectancy at birth for each race. Now I need to go learn more about life expectancy so I can calculate it properly…)
October 31st, 2013 at 1:46 pm
That’s why I keep comments open even on the old posts, Brandon, so there’s no need to apologize.
Like every analysis, variations of assumption and methodology can obviously yield different outcomes, and the key is always going to be finding the assumptions and methodology that yield results that “feel right” for your world. Just look at the implications and consequences that resulted from simply changing the definition of a Hit Point – which led to the All Wounds Are Not Alike series (link is to part 1) – and then making sure that those assumptions are reflected in the rest of the fantasy society.
The ultimate purpose of this article isn’t to throw out some definitive numbers, but to equip GMs with the question and the tools to resolve it to their own satisfaction. Which is exactly what you’ve done. I agree with everything you’ve said in your last four or five paragraphs.
I have an advantage over a lot of other GMs in that my family has a lot of very practical people in it, whom I have had the opportunity to observe “at work” over the years, and the biggest thing that experience seems to have yielded is best summed up as “economy of effort” vs outcome. Because they know what they are doing and how, they can easily achieve three times as much as even someone naturally that way inclined in the same time-frame, with less waste of effort, less compromise of function, and less waste of materials. My father can do in a day in his workshop what would take me weeks – assuming parity of equipment. At the same time, I don’t think for a minute that he would describe himself as a professional fabricator or furniture-maker. There aren’t many opportunities these days for people to see things like that being done; it’s all industrialized. But trying to quantify the relative differences is difficult-to-impossible.
That’s the challenge with which you are confronted in simulating any semi-realistic game world – the need to be able to quantify the imponderable. That’s one reason I think every GM should occasionally hang around SCA-type events or historical reenactment societies where there’s a demonstration of craft techniques. Simplifying it to “years of experience” is as close to a generic answer as you can get, and while it might be inadequate, or only marginal, it’s a lot better than having no standard at all.
Appreciate the effort in commenting and the contribution you’ve made to the discussion :)
April 18th, 2014 at 12:32 am
[…] Given that most systems’ Skill mechanics can be described as “using skill to limit the character’s dependence on luck” – which also can be defined as “skill limiting the uncertainty of outcome” – this can become an important issue if campaigns go on for long enough. It’s a question that I’ve had to grapple with many times, and which has also cropped up in related contexts such as the impact of extremely long lives on expertise levels, discussed in one of my previous articles, The Age Of An Elf: Demographics of the long-lived. […]