I was thinking about the perception of time and how that doesn’t match up with the mechanism of time-keeping in the standard initiative systems in games.

I mean, it’s certainly possible to design additional mechanics to take these variations into account, and reinvigorate a system that has become predictable.

More interesting AND more realistic at the same time? That certainly bears further investigation!

And so, here we are…

Physical Measurement Of Time

Let’s try to start this discussion with a level of objectivity, by looking at how we measure time.

To start with, let’s agree that there are three scales in use, and perhaps there should be more.

There’s the Planetary scale, which has always been measured by observation of objective phenomena – everything from a day up falls into this category.

There’s the human scale, which is used for everyday events, actions, and perceptions – human reaction time at its best is about 1/4 of a second, but if we play it safe, we can say that anything from 1/10th of a second up to a day falls into this category (with the longer entries a little more vague and approximate in meaning and measurement).

There’s the atomic scale, which can only really be handled in a lab, often with expensive apparatus, but which we have bent to our wills to more precisely measure the human scale.

I would argue that there should also be the Cosmic, used to lop off seemingly-endless repetitions of zeros in measuring events like stellar lifetimes and galactic interactions and planetary histories. But this doesn’t seem to have caught on, at least not yet. Maybe when such phenomena are reported daily in the newspapers…

It’s entirely possible that when we start grouping clusters of related events together at the Cosmic scale, we will need to break it into two or even three different scales just to keep things comprehensible in relative terms. That’s beyond the remit of this very broad examination of what is ultimately a side-issue – at best, a foundation.

We know how each of these scales relates to each other. We can convert one into another with relative aplomb. No-one would ever do so for practical purposes, but if we had to, we could estimate the lifetime of stars in terms of frequencies of light.

Make no mistake, each of these scales exists (or doesn’t yet exist) purely as a human convenience. Within them, various measurements are employed by humans, and have been for a very long time in most cases.

    Seasons

    Let us start with the seasons. These are obvious objective phenomena that recur every year, but whose start and end points are fuzzy, and ill-defined until you start getting into astronomical observation.

    Then someone links the winter solstice with the season whose name it bears, and the summer solstice likewise, and you have ‘pinned down’ (with an artificial definition) two of the four fuzzy markers.

    The relationship these markers have with the objective reality experienced on the ground is illustrated vividly by the fact that Groundhog Day persists, year-on-year.

    Sidebar: An improvement? A perspective.

    It might be possible, and even better from a human perspective, to define the start of a season as the first time in a cycle that one or more specific objective measurements are recorded. It might be the first night where the temperature dips below a certain level, or the first time snow falls, or whatever.

    These would be functional and practical definitions that would correlate with the experiences of the growers of crops, making them useful, too.

    But they completely lack the color of an event like Groundhog Day, which transforms the passage of time into something to be celebrated – and we humans love any excuse for a good party.

    Non-humans of more sober mind-set might well opt for the more practical approach described – something to keep in mind. In fact, since military success is often tied to the impact of the seasons, we could probably include those of a more martial mind-set on that list!

    Months

    Whew, I waffled on about seasons far longer than I intended! So let’s try and redress the balance a bit over the next few categories.

    Months come in two varieties – there are Lunar Months, which are tied to an objective physical phenomenon, and there are calendar months, which are a convenient abstraction, dividing the year into 12 roughly equal units and the seasons into a beginning, middle, and end.

    Sidebar notes

    It’s perhaps worth observing in passing that it is the extreme circularity of the Lunar Orbit that makes Lunar Months sensible. Were the orbit more elongated, they would not be even close to the same length by more objective counts – some might last for 60 days, and some for 15. With orbital eccentricity on that scale, I’m not sure the concept would even evolve.

    Things would also grow more complicated if there were two visible moons up in the sky, because we now have three visible phenomena – the periods of each moon and the subdivisions provided by the relative frequency of their interactions.

    I employed both of these in my very first fantasy campaign, in ways that are far too complicated to go into here. Suffice it to say that I ended up with 36 “Months” and ten “seasons” in a “calendar year”.

    Days

    Another obvious objective phenomenon is the rising and setting of the sun. Never mind that it happens at a different time every day, and that the period of daylight is also variable over the course of a year – daybreak-to-dusk plus the night is a reasonable definition to work from.

    Sidebar: Starting each day at zero

    Humans (these days) use an objectively-set artificial zero, sometimes called “Midnight”, as the boundary from one day to another. This is a very modern perspective that shouldn’t necessarily apply in a fantasy culture.

    A lot of it comes from a fixed and precise notion of the length of human-scale time units, which are next on my to-examine list. Take that away and replace it with arbitrary or approximate measurements, like the length of a candle burning down, and incorporate the practicalities of rural life as the locally-dominant feature, and you can end up with a quite different answer, and one that can add functionality to fantasy campaigns:

    Each day starts at zero, and is divided into 10, 12, or 16 arbitrary units of approximately equal span until dusk. Any leftover is a god-given time to relax, or to squeeze in one extra (unscheduled) chore. But what the gods may give in the warmer months, they steal back in the winter. At night, the same-sized arbitrary unit is used to approximate divisions, but there will usually be either more of them or less of them, depending on the season.

    In the Zenith-3 campaign, on the current campaign date, at their new base of operations in Arkansas, dawn currently arrives at 6:01 AM and daylight lasts 14 hours and 20 minutes.

    As the height of summer approaches, dawn will come earlier, and the day will last longer. These two changes are not equal, but the differences are measured as a rate of change of seconds per day. Come winter, and the day will be down to around 10 hours (I haven’t looked it up, I’m just subtracting from 24).

    So, if Dawn is zero each day, and we’re using divisions of 12, then each ‘division of labor’ is a shade under 1 hour and 12 minutes long – call it 80 minutes for convenience (rounding up generously keeps the mental arithmetic simple). This permits a farmer or laborer to divide his ‘day’ into functional units by which to estimate the progress of tasks and the scheduling of activity.

    The night is 24h − 14h20m = 9h40m in length; divide this by 80m, and you get a night of 7.25 ‘divisions’. Call it 7 1/4 for convenience. So, for night-time tasks, 2 divisions on and 5+ off is an equitable watch rotation for four – so long as the one shift that comes up short (only 1 1/4 divisions in length) is rotated around.
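    To make that arithmetic concrete, here’s a minimal sketch in Python (the numbers are the ones from the example above; the function name is mine):

```python
# A sketch of the 'divisions' arithmetic: split daylight into labor units,
# then measure the night in the same units. All numbers are illustrative.

def day_schedule(daylight_minutes: int, divisions: int = 12, unit: int = 80):
    """Return the true length of one division, and the night in 'unit' terms."""
    night_minutes = 24 * 60 - daylight_minutes
    exact_division = daylight_minutes / divisions   # ~71.7 min in this example
    night_divisions = night_minutes / unit          # night, measured in 80m units
    return exact_division, night_divisions

exact, night = day_schedule(14 * 60 + 20)           # 14h20m of daylight
print(f"One division is really {exact:.1f} minutes - call it 80.")
print(f"The night holds {night:.2f} divisions - call it 7 1/4.")
```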

    You can already see this having an impact on social and logistical patterns, on the ‘real world’ around the characters, and this is just the starting point. But it all stems from using arbitrary-but-meaningful units instead of absolute-and-measurable ones.

    Avoid using the terms ‘hours’ and ‘minutes’ – reserve those for the real-world objective measurements, or you’ll eventually get yourself in a hopeless tangle. “Divisions” works for comprehension (and comes naturally with “subdivisions” for a smaller unit), but is fairly flavorless.

    But once again, I’m getting off-track.

    Portions Of A Day

    Another fairly basic objectively-observable phenomenon is ‘noon’, when the sun is at its highest point in the sky. This, in fact, is where ‘midnight’ comes from, when the sun is at its (theoretical) ‘lowest point’.

    Humans have found it convenient to abstract the daylight spans on either side of this non-arbitrary point into ‘morning’ and ‘afternoon’, and then to subdivide those (‘early morning’, ‘mid-afternoon’, and so on) – generally into subdivisions of three, for no good reason that I can come up with unless you arbitrarily define a day as 12 hours long, in which case thirds become a natural subdivision.

    In practical terms, no-one has the time to stand around watching the shadow of a stick (or equivalent), so “noon” becomes fuzzy, and the divisions equally so. I would suggest that this fuzziness is the reason these concepts can survive differing lengths of day – they are approximations of convenience.

    Sidebar: More speculations, plus Dwarves

    Again, those who want to make their cultures a little more alien, take note! Dwarves, with their underground lifestyles, might have an entirely different sense of ‘convenience’ in such matters – ‘start-shift’, ‘mid-shift’, ‘short-shift’, and ‘end-shift’ (a division into four) might be more appropriate – with ‘short-shift’ called that because it’s interrupted by, and begins with, a mid-shift meal.

    Weeks & Fortnights

    Where does the concept of a week come from? And a fortnight – whose bright idea was that?

    I have always held the (uninformed) opinion that these started as subdivisions of a ‘month’ (one quarter and one-half, respectively) and then got codified into a fixed number of days because it permitted sufficient worship on the seventh day to retain religious indoctrination without compromising the productivity of the laborer too much – a compromise between religion and secular power, in other words, and so far back in history that the origins have been lost.

    But that might be just my fanciful imagination.

    Taking the ‘fancy’ and the implied criticism of theology out of it, what I am left with is that these are arbitrary subdivisions defined in terms of shorter time periods (days) that have proven useful in defining satisfactory levels of work-‘life’ balance.

    Factor in recurring market days and the like and social patterns quickly shape themselves around these intervals. It’s debatable whether longer groupings (eight days a week, anyone?) with their more complex patterns, are too much for people to tolerate, or if this is simply a human artifice to marry these periods into some semblance of integration with the longer time-units.

    Still more speculation

    I have occasionally wondered why we humans don’t use a 360-day year, with recurring days that ‘aren’t counted’ as holidays spaced throughout in order to make up the ‘natural year’ of 365 days.

    You can even add in an extra ‘non-counted day’ in leap years – except when the year ends in ’00’ and isn’t divisible by 400 – to make this system every bit as accurate as the one we do use.
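    If you ever want to drop this calendar into a campaign, the bookkeeping is trivial. Here’s a minimal sketch (the function name and the 5-day baseline are mine, and the leap rule simply mirrors the familiar real-world one):

```python
# A sketch of the 360-day calendar: how many uncounted feast days pad a
# given year out to the natural year. Names and details are illustrative.

def feast_days(year: int) -> int:
    """Uncounted holidays in the given year of the 360-day calendar."""
    days = 5                                            # the usual 365 - 360
    leap = year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
    return days + (1 if leap else 0)                    # one extra feast day

for y in (2023, 2024, 1900, 2000):
    print(y, feast_days(y))                             # -> 5, 6, 5, 6
```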

    And the convenience! Months that are exactly 30 days long. Weeks that are either 5 or 10 days long. An exact number of weeks in every month.

    Species of a more ‘precise’ or analytic bent might employ such a system, but I think it more likely to find favor amongst species that are even less mathematically-inclined than we are. Like Halflings. Something about the notion of sweeping those days of the calendar that don’t fit to one side and using them as an excuse for a feast resonates, for some reason.

    Years, Decades, Centuries, Millennia

    Until we get astronomy locked down to a reasonably high standard, ‘years’ are semi-arbitrarily defined by the rotation of the seasons. Decades, centuries, and millennia are simple base-10 groupings of years.

    That’s an important point that anyone involved in computers in the early days should appreciate.

    “10” in binary = 2 in ‘real-world’ (base-10) numbers.
    “10” in octal (base 8) = 8 in base-10 numbers.
    “10” in hexadecimal (base 16) = 16 in base-10 numbers.

    Computers ‘think’ in binary, but usually in groups of bits that form ‘words’. Octal has largely fallen out of fashion, replaced by hexadecimal, which is still in use today. If you work in raw ‘machine language’, you are almost certainly reading and writing it in hexadecimal notation.

    The implication is that if a species counts in some base other than 10, these arbitrary compilations of years will represent different tallies.
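    A quick sketch makes the point – ‘10’, ‘100’, and ‘1000’ in the local base are what a ‘decade’, ‘century’, and ‘millennium’ would be to a species counting that way:

```python
# What 'decade', 'century', and 'millennium' mean to counters in other bases.
for base in (8, 10, 12, 16):
    print(f"base {base:2}: 'decade' = {base} years, "
          f"'century' = {base ** 2}, 'millennium' = {base ** 3}")
# An octal-counting species celebrates a 'century' after only 64 years;
# a hexadecimal-counting one waits 256.
```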

    Beyond years, these are simply units of convenience.

    Seconds, Minutes, & Hours

    We need to talk about the convenience of 12, and of 30, 60, and 90. I’ll try to keep it brief.

    12 is evenly divisible by 1, 2, 3, and 4. When it comes to measuring angles, though, 12 subdivisions of a circle yields units that are too large, and that quickly become inconvenient.

    Logically, then, we fold in a factor of 5 to get 30 – evenly divisible by 1, 2, 3, and 5, with 6 thrown in as a bonus (though we lose the 4 along the way).

    So, why aren’t angles measured in degrees of 30ths of a circle?

    My best guess – and I don’t know for certain – is that the resulting margins of error were too large to make the subdivisions useful. 1 thirtieth of a circle is 12 traditional degrees, and that’s a big enough interval that a dwelling measured to that standard would be in constant danger of collapse. “Level, plus-or-minus six degrees?” Not going to work.

    Navigation – if your course is correct to within a margin of six degrees to either side, over a distance of 100 miles, you could be as much as 10½ miles away from your intended destination – that’s MORE than 10%, and enough that you might completely miss your target.
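    That figure is easy to verify with a little trigonometry – a quick sketch, treating the error as a constant straight-line bearing offset:

```python
import math

# Cross-track error for a course flown with a constant 6-degree bearing error.
distance = 100.0                                  # miles travelled
error = math.radians(6.0)                         # bearing error, one side
offset = distance * math.tan(error)
print(f"{offset:.1f} miles off course")           # -> 10.5 miles, over 10%
```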

    So someone decided to double it, to sixty and then increase that six-fold to 360° because being accurate to within 1° is a heck of a lot better in everything from carpentry to home construction to navigation.

    But the earlier unit of sixty remained for sub-subdivisions of degrees and of sub-sub-subdivisions – still known as minutes and seconds, respectively – and because these angular measurements predate accurate timekeeping, hours were subdivided the same way when clocks were invented.

    The term ‘second’ was first used around the year 1000 by the Persian scholar al-Biruni, who expressed the times between New Moons of specific weeks as a number of days, hours, minutes, seconds, thirds, and fourths after noon on the preceding Sunday.

    That’s my theory, and I’m sticking with it until some better explanation comes along.

    Minutes and seconds are, therefore, arbitrary divisions of a basic time unit (hours) that have been chosen because they can be subdivided evenly in many convenient ways – one half, one third, one quarter, one fifth, one tenth, one sixtieth.

    Attempts to change units of angular measurement have been made over the years – look up Gradians – and have foundered. Radians (there are 2π of them in a circle) have survived because they are mathematically convenient in some contexts beyond the everyday.

    Because the mathematical utility of these sub-divisions remains true, even if they are arbitrary, I would expect most civilizations to adopt them – but the precise interval of time represented with them will vary with the definition of an ‘hour’. I have posited at least one alien civilization in which the hours are divided into 100 minutes, however – even though I don’t think it would actually ever happen in real life.

    Heartbeats

    Okay, now we’re getting into intense territory. The human heartbeat varies in beats per minute quite considerably, depending on what we’re doing and on our emotional state. It also varies massively from individual to individual, as does the variability itself. To some extent, this is due to physical training, but that’s far from the whole story.

    It’s a documented fact that Formula One drivers have a far lower resting heartbeat than most people would consider normal, and a far higher heartbeat when under stress, which they are able to sustain for longer periods than almost any other type of athlete (from 40 to 200+ bpm, for up to 2 hours). The same is also documented in other forms of motor-sport, though to a lesser extent perhaps.

    Take away the sustained nature of this pattern, and you get Test Pilots and Astronauts, who can operate at absolute peak efficiency for minutes at a time. Lower the peak from there and you get other elite sportsmen and elite combat troops, and so on.

    When our hearts are pounding, though, this remains the most important timekeeper, at least subjectively. Everything else fades into insignificance in comparison.

    And that’s where I think subjective time comes into the picture, something I’ll discuss more fully a little further down the track.

    (English?) Railroads

    I’ve been informed that there was very little precision in timekeeping until national railroads began running, especially in England. Suddenly, it became vital for all the clocks in all the railway stations to read the same thing at the same time so that arrival and departure times could be precise. Anything less would soon lead to one train colliding with another, and even sooner lead to a horde of angry customers.

    From that beginning, it spread – radio broadcasts and hours of labor and television and so on. The whole concept of being punctual was fuzzy prior to this – you arrived when you got there, and so long as you didn’t waste any time or get delayed en route, that was as punctual as it got.

    Not sure of the relevance, but I’m throwing it in here, anyway.

    Crystal Oscillations

    Precision started out mechanical but became electronic, when the electrically-stimulated oscillations of certain types of pure crystal were found to produce precise frequencies – used first to fix radio wavelengths, and then adapted into clocks.

    Not that most such clocks and watches were very precise, at first. The vibrations were so sensitive to all sorts of environmental variations that such digital clocks and watches were known to gain or lose time, all the time.

    In a good one, that might be a minute or two a month; in a more typical one, that much per week; and in a bad one it could be that much in a day. The good ones therefore needed resetting every 6 months or so, the typical ones every couple of months or less, and the bad ones, weekly.

    Precision did improve somewhat over the years, but became increasingly expensive. You can get digital watches now that are guaranteed to be accurate to within one second per century – but they cost thousands of dollars.

    Sports and sporting prowess have remained one of the major drivers of precision in a relatively everyday setting. There was a time when it was sufficient to measure lap times to within a tenth of a second – and then it became necessary to do so to the nearest hundredth in order to split competitors, and then to the nearest thousandth, and now to the nearest ten-thousandth.

    You can see the same thing happening in other areas, too, like human sprint races, and swimming races. Those are eternally compromised by the need to actuate the timers with mechanical triggers, though, so there is a hard limit to the accuracy with which these things are measured, and the ‘dead heat’ still happens.

    Beats Of Atomic Light

    In physics, greater precision was needed. It came, first, in the same crystal technologies described earlier, and then in atomic clocks, and then in the counting of frequency ‘beats’ of particular wavelengths of particular atoms, under extremely controlled conditions, which is where the ultimate in precision stands now.

    The current definition of one second is 9 192 631 770 vibrations of the ‘unperturbed ground-state hyperfine transition frequency’ of the Caesium-133 atom, measured in Hertz (i.e. cycles per second). Other atomic ‘vibrations’ have been defined as secondary ways of measuring the unit of time, some of them with greater stability and hence greater practicality, but the Caesium isotope is still the standard.
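    To put that definition into human-scale terms, a back-of-the-envelope sketch:

```python
# One 'tick' of the caesium-133 standard, in more familiar units.
CS133_HZ = 9_192_631_770                 # cycles per second, by definition
cycle_ps = 1e12 / CS133_HZ               # duration of one cycle, in picoseconds
print(f"One cycle lasts {cycle_ps:.2f} ps")                  # ~108.78 ps
print(f"Cycles in a 1/4-second reaction: {CS133_HZ // 4:,}") # ~2.3 billion
```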

    Wikipedia’s article on ‘second’ (the SI standard unit) adds,

    A set of atomic clocks around the world keeps time by consensus: the clocks “vote” on the correct time, and the voting clocks are steered to that consensus, which is called International Atomic Time (TAI).

    Civil Time is defined to agree with the rotation of the Earth. The international standard for timekeeping is Coordinated Universal Time (UTC). This time scale “ticks” the same atomic seconds as TAI, but inserts or omits leap seconds as necessary to correct for variations in the rate of rotation of the Earth.

    Einstein

    It seems likely that future standards will have to specify that the measurements be taken at a specific speed of motion relative to the observer (zero, within tolerances), because Einstein complicated everything with his theories of Relativity.

    This showed that as the relative speed of motion increases, time is perceived to stretch, and that gravitational fields, by distorting the space and hence the distance through which a beam of light must pass, do likewise.

    Two of the first accepted ‘proofs’ of the theory were the solution to the anomalous precession of Mercury’s orbit – a problem that had been vexing astronomers for some time – and the observed deflection of starlight passing close to the sun during the 1919 eclipse, which demonstrated the principle behind Gravitational Lensing.

    But it also means that with motion, time stops being fixed and becomes flexible. At low speeds, the effect is trivial, even negligible – but even jet aircraft have been shown to exceed the minimum reasonable threshold for ‘trivial, even negligible’.

    They did this by sticking an atomic clock on just such a jet – a clock that had been perfectly synchronized with another on the ground. They then flew the jet around the world and compared the two clocks, finding that they no longer agreed.
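    The size of the effect is easy to estimate. Here’s a sketch of the kinematic term only, with illustrative speed and duration – the real experiment also had to account for gravitational effects and the direction of flight relative to the Earth’s rotation:

```python
# Special-relativistic lag of a clock moving at everyday speeds.
C = 299_792_458.0                          # speed of light, m/s

def lag_seconds(speed_ms: float, duration_s: float) -> float:
    """Low-speed approximation: delta-t ~= t * v^2 / (2 * c^2)."""
    return duration_s * speed_ms ** 2 / (2 * C ** 2)

jet = lag_seconds(250.0, 45 * 3600)        # ~250 m/s jet, 45 hours in the air
print(f"Jet clock lags by ~{jet * 1e9:.0f} nanoseconds")   # a few tens of ns
```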

    Precision timekeeping is needed for GPS to work, for one example, and this source of error would completely disrupt the service if it hadn’t been taken into account.

    So speed of motion creates a hard limit to the accuracy of timekeeping, and it’s not just the perception of time that is inherently variable.

    Or is it? Before falling down that rabbit hole, let’s switch to the second line of discussion – perceptions of time.

The Perception Of Time

I’ve occasionally gotten into arguments by suggesting that all time is perceived, and that we have no direct functional sense of time, and that seems like a good place to start.

We measure time using clocks and the like by observing changes over time. We have agreed that a specific amount of such mechanical change represents an hour, or a minute, or a second (to use the hands of a clock as the example).

Other techniques involve electrical currents and the electrochemical reactions that produce them. But we don’t see these electrical impulses, vibrations, or the reactions that create them; we see a counter change when a threshold count is exceeded. That counter is the readout on a digital clock; the change could be in the hours, minutes, or seconds position.

    Inferred time

    We can infer time based on a standard speed of movement, or even one that changes consistently, such as the acceleration due to gravity over a fixed distance. But, just like the swinging of a pendulum, this relies on perception of a visual change in a system.

    We might be able to infer time by the period that it takes certain chemical or physical reactions to proceed. This may be a fairly fuzzy choice, but water clocks use this principle.

    Every external measurement of time gets perceived as such a change. That doesn’t mean that it’s not an objective standard; just that we have to perceive time indirectly by the changes that occur.

    Internal perceptions

    What about our internal perceptions of time? Well, there are inherently variable phenomena like heartbeats – and persistent stories of people who can control them sufficiently to use them as timepieces. I’ve never seen any of these claims proven empirically, though.

    Beyond that and similar biochemical reactions of which we have no direct perception (since they occur at a cellular level), we have reaction times and a subjective sense of the passage of time, sometimes pegged to theoretical circadian rhythms.

    I’m not arguing that these are inherently inaccurate or accurate to a certain standard; just that they are subjective, and rely on a perceived interval of time having passed.

    Internal Alarm Clocks

    Most of us have an internal alarm clock that wakes us up with some degree of reliability – or unreliability. It might do so at the same time every day, or at the same condition of natural light, or at the first rooster-crow, or some combination.

    There’s a psychological element to these ‘clocks in our heads’, too – if I set my (external) alarm clock and really need or want to wake up then, I will often awaken five or ten minutes before the actual alarm sounds. If I don’t feel such a burning need, or don’t synchronize my internal chronometer to the time shown on the alarm clock by setting the alarm, it doesn’t happen.

    But there have been occasions when the power failed, killing the external clock – and the internal one still worked. Once, as a prank, the digital clock was reset while I slept – I still woke up at about the right time (ten minutes late, as I recall). And there have been a number of occasions when I have mis-set the alarm to PM instead of AM – but because I had perceived the applicable time as AM, I awoke at around the right time.

    This isn’t just a matter of going to bed at the right time, or of awaking after a consistent period of sleep; it would be a lot more predictable were that the case. So far as I am concerned, this is an objectively-real if unreliable phenomenon – one that most people share to some extent, and with differing reliability levels.

    But it’s still subjectively observed and interpreted, no matter how objectively real it may be.

    Other biological functions

    And the same is true of every other biological or biochemical or neurological or neurochemical process that I can think of. These are undeniably objectively real, but none of them are perceived directly, so they are all subject to subjective interpretation, and the time intervals they ‘measure’ are subjective in length.

Two Subjects

That means that there are two fascinating subjects to be analyzed in thinking about these phenomena and how to reflect them in game mechanics on behalf of the PCs and NPCs who may experience them.

There’s the phenomenon of time perception, also known as chronoception, itself; and there’s the relationship, if any, between this and ‘objective time’, which would define things like the reliability and accuracy of the perceptions.

To me, the more I thought about it, the more inextricably-linked these became, because I couldn’t think about the perceived passage of time without referring to some external perception or objective time interval.

I could subjectively perceive that ‘morning’ has become ‘afternoon’ but those terms don’t have any meaning without the perception of external reality itself.

I could perceive that it’s been about an hour since I last checked the time – but to do so, I have to have noted the time an hour ago, and have a concept of ‘an hour’ against which my subjective interpretations are compared.

The subjective perception has no meaning without the objective reality, and so everything said on the subject relates to the relationship between the two – and that leads me back to the earlier point about our only ever perceiving time indirectly, and therefore subjectively.

So, let’s talk about different subjective interpretations of time, since that’s all we’ve got.

    Past Time

    When I think back over the years, some events seem more remote than others.

    I’ve lived in two different places for a total of more than thirty years between them – but that doesn’t ‘feel’ like half my lifetime. A third, tops. That could be interpreted as my feeling 90 years old (and I do, sometimes), so let me be clear – I mean that 30+ years feels more like 20-or-so at most.

    I can still clearly remember events from my childhood (just fewer of them) – but some events that are far more recent are also far more clouded in clarity and specificity.

    There are two primary theories of time perception that could apply, according to Wikipedia:

    The strength model of time memory. This posits a memory trace that persists over time, by which one might judge the age of a memory (and therefore how long ago the event remembered occurred) from the strength of the trace. This conflicts with the fact that memories of recent events may fade more quickly than more distant memories.

    The inference model suggests the time of an event is inferred from information about relations between the event in question and other events whose date or time is known.

    I think that both of these are probably correct to some degree; the perception of recency lies in the first, while the ease of recall and perception of detail lies in the second. Thus, soldiers suffering from what we now call PTSD can experience flashbacks to events that seem contemporary and immediate and completely visceral (and will then act and react accordingly), while knowing and feeling that these are long-past events the rest of the time.

    It also seems likely that the frequency of recollection makes recollection easier and hence the memory, more immediate. Time spent without a traumatic past event being triggered helps encrust that memory with distance, creating greater resistance to it being triggered in the future, even by stimuli that would have immediately induced a full flashback.

    These mechanisms would also limit the impact of such traumatic re-visitations, so that a flashback might be a passing emotional flash and not a full reliving of the trauma – combat veterans from the Vietnam war have often said that a helicopter being heard or seen overhead or the snap of a twig often brings a flash of emotion deriving from their time of service. In some cases, these pass almost immediately, in others they last significantly longer and are far more intense and immediate because of it.

    Science has determined that different ranges of duration are processed by different areas of the brain; to me, this directly relates to the storage, processing, and recall of memories.

    Wikipedia (my primary reference source for this article, and not even consulted until I got to the modern definition of a second) lists a number of temporal illusions, or distortions in the perception of time.

    I’ll touch on some of these as they become relevant. So far, the major ones that appear applicable are:

    • Time Telescoping, in which events are recalled as nearer or further back in time than they really occurred, referred to as Forward and Backward telescoping, respectively;
    • Auditory stimuli may appear to last longer than visual stimuli, which suggests differences in how the brain handles those stimuli;
    • and one that Wikipedia doesn’t mention: that different senses may cause stronger or weaker memory accesses than others. Scent is often a much stronger stimulus than sight or sound, for example, even if it is one that has fewer significant events ‘tagged’ by it.

    But we’re not really talking about memory here, other than perhaps indirectly. So let’s move on.

    Slow Time

    “Events seemed to unfold in slow motion”. I’ve heard and read that any number of times, both from sportsmen who were in the zone, or who were about to experience a traumatic event that they could see coming, and from those experiencing violence of some kind like soldiers and police officers.

    To some extent, this is all about the brain going into hyper-drive due to adrenaline, focusing more of its resources into analyzing a situation perceived as survival-critical; it is often accompanied by a form of tunnel vision, as ‘irrelevancies’ are discarded or ignored.

    In past articles about optical illusions, I’ve talked about the Gorilla paradox, in which a brain concentrating on one task (counting the number of passes of a basketball by one team) can fail to observe a guy in a gorilla suit wandering through the field of vision, thumping his chest, and leaving. Magicians use it for misdirection, getting the audience to focus so intently on one thing that they don’t notice another.

    Slow time gives the perceiver greater time to react, and to choose between different reaction options.

    Endless Time

    “A watched pot never boils” is another aphorism, and one that describes a different mental phenomenon – that, in response to boredom, a brain can either wander off (which cuts short the time perceived to pass) or can simply shut down and rest (which prolongs the perceived passage of time without any events to trigger a sense of Slow Time).

    When I’m writing – be it an article or an adventure or whatever – and the words are flowing smoothly, they just continue to stream from thoughts into words on the page. Sometimes, it can be hard to keep up, because I can’t simply let misspellings and missed punctuation go, I have to correct those I perceive immediately.

    If ever the words stop flowing, and I have to stop and think about what to say next, or whether I’ve left anything out, or whether this should be moved there instead of appearing here, it’s easier to start writing in my head than on the page. I can rough-compose whole pages in my mind this way – and, if I’m lucky, remember them when I resume putting down actual words.

    This can encourage that smooth flow, when it occurs, but it can also mean that a snap decision turns into five minutes of reverie with nothing to show for it.

    Long Time

    When you are focused on one thing, like writing, time outside of that focus can slow or stretch. I can compose words for what seems like a few minutes, only to find that a substantial portion of an hour has passed – or I can struggle through a difficult passage for what seems to be hours, only to discover that only a few minutes have passed.

    Both of these are examples of different phenomena of Long Time – either the perceived duration of the time interval lying outside the point of focus is longer than objective measurements suggest, or the perceived duration of the focus itself is stretched relative to objective measurements.

    Intensity of focus vs distraction tends to shift perception from one to the other. Frustration of any sort pushes perception toward stretched duration of focus. There’s also a rebound effect, as perceived time shifts first one way and then the other. Intensity of concentration is a factor in both.

    Subjective awareness of the passage of time is inherently sloppy, it seems. But that brings me to Vierordt’s Law: Shorter intervals tend to be overestimated while longer periods tend to be underestimated.

    Clearly, there is a threshold in between these two perceptions, but I would contend that there is often a threshold outside the ‘longer periods’ range at which longer periods again become overestimated.

    A really long movie, for example, can seem even longer than it was. Adding ten minutes to the running time can cause a movie to seem twenty or more minutes longer. Stimulation and boredom both play a part in this, as does exhaustion – you can’t stay at 11 for the whole movie, you need to occasionally dial things down and let people catch their breaths.

    That was one of the key principles in my article on emotional pacing in RPGs (Part 1, and Part 2).

    Heinlein

    One of the key tenets of at least one of Robert A. Heinlein’s stories lay in the perception of time by characters in the story, and was summarized as “the duration of an interval is proportional to the number of learning events experienced” – more or less.

    Three of the temporal illusions referenced by Wikipedia apply to this:

    • Time intervals associated with more changes may be perceived as longer than intervals with fewer changes;
    • The perceived temporal length of a given task may shorten with greater motivation; and
    • The perceived temporal length of a given task may increase when broken up or interrupted.

    I’d actually broaden these to some extent, again by bringing in the concept of focus.

    Revised Temporal Perception propositions

    A) When you are focused (higher motivation), interruptions and distractions are (1) more easily tuned out, but (2) have a disproportionate impact on the perception of duration when they are not.

    B) When you are not focused so strongly, the perceived passage of time is more strongly measured by events that could be considered distractions and interruptions than on progress in the task at hand.

    Music In The Background

    Let’s say that I have music playing in the background while I write. This not only cues me to take regular breaks – I’ll come back to that in a moment – but it helps me monitor the passage of time, except when I’m intensely focused, when I simply tune it out and don’t hear it at all.

    The more familiar the music is, the easier it is for that tuning-out to occur. What’s interesting is that the more easily I can tune out the music, the more successful I am at tuning out other forms of environmental distraction, including awareness of the passage of time.

    I mentioned those periods of protracted reverie a little while back? They don’t derail the process of writing as frequently or for as long if there’s music playing. Note, not music videos or TV shows – it’s too easy to get distracted by the visual stimuli. I need to save that for the page, where I really need it.

    Perceived Productivity

    The other thing that plays into this perception is perceived productivity.

    We often infer the length of a time interval by considering the achievements within that interval together with an impression of the ease and efficiency with which they occurred. That’s just human nature.

    I can look at a graphic representation of the length of this article and guesstimate it as being significantly longer than average, about 7000 words so far maybe, and that since it has mostly flowed freely, I’ve been writing it for about 6 hours. So, reality check: as of THIS word, it’s 6,850 words (close enough) and I started writing it (aside from some headings and subheadings and the opening paragraph) at 10:30AM this morning, 6 1/2 hours ago.

    Notice that I overestimated the work product slightly and underestimated the time interval slightly.

    There is the perception that regular short breaks waste time. Testing has shown that this is not the case unless you are operating at the highest level – when the words are flowing freely, for example. Most of the time, though, I write at about half the pace indicated by those actual measurements, and taking regular breaks increases the productivity without increasing the perception of productivity.

    And that skews those mental assessments. In fact, it can skew them dramatically. So, unless I’m in the zone, those prompts to take a short break at semi-regular intervals can more than make up for any time lost due to the distraction factor.

    The other benefit is that it prompts me to save my work regularly – something that I haven’t done since I started. So let’s take care of that, right away!

    Quick Time

    There’s a very thin line between being in slow time and being so swamped by events demanding your attention that you are overwhelmed.

    When that happens, the natural response, as I indicated earlier, is to develop tunnel vision. You can focus only on the enemy or task in front of you, and everything else gets shunted to one side.

    Intelligence: Is More, Better?

    Clearly, a high intelligence helps you have clarity, helps you analyze situations on the fly, and helps you develop and modify clear tactics to achieve your current objectives.

    All of those sound like good things in terms of situational awareness and are easily thought of in terms of slow time.

    But consider that high intelligence frequently means high awareness or perception (for exactly the same reasons), and that means that there are more things for you to keep track of, and more possibilities for each, and more possible responses on your part to things that they might do – so it would actually be a lot easier to slip into a monomaniacal focus. Arguably, high INT/PERC should help until suddenly it doesn’t – when it becomes a liability.

The Mechanics Of Temporal Mis-perception

At this point, you should have a pretty good handle on what we want the modified game mechanics to simulate.

But it’s probably worth a nutshell review of initiative systems before we go there.

In lots of game systems, initiative can be thought of as a numeric value that expresses who acts first. So it starts low (1st character to act) and rises to N (last character to act). These values are often determined by some sort of roll or draw, which may or may not be modified by a stat value or by some sort of character ranking like class level.

In the Hero Games system, it’s a little more complicated than that, because each character gets a different number of actions in a given 12-second turn, depending on their Speed. These are distributed unevenly across the 12-second span – everyone acts in segment 12, and any remaining actions are evenly distributed over the remaining 11 segments (each lasting one second).

One of the first changes I made to the standard Hero System was to rewrite the actions table to evenly distribute actions across all 12 segments, eliminating the “Segment-12-everyone-acts” because typical combat segments could last a couple of minutes while Segment 12 took over an hour. Even distribution eliminated that problem.
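For the curious, here’s a sketch of how such an evenly-distributed table can be generated. This is my reconstruction of the idea, not the official Speed Chart, and it won’t reproduce that chart exactly for every Speed value:

```python
# Spread a character's actions evenly across the 12 segments of a Hero turn.
def phases(speed: int, segments: int = 12) -> list[int]:
    """Segments in which a character with the given SPD acts."""
    return sorted({round(k * segments / speed) for k in range(1, speed + 1)})

for spd in (2, 3, 4, 6):
    print(f"SPD {spd}: acts in segments {phases(spd)}")
# SPD 4 -> [3, 6, 9, 12]; SPD 6 -> [2, 4, 6, 8, 10, 12]
```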

In the D&D 3.x system, initiative is a numeric value that indicates in relative terms when a character acts, counting down from the highest to the lowest. This system is so much faster than the Hero Games model that it has largely replaced the superhero subsystem in my campaigns. I’m still thinking about a “last character acts then everyone recovers” model. It’s slightly complicated by the capacity to hold actions until a trigger of some sort, but by and large it works extremely well.

Between them, these are representative of most of the initiative systems that are out there, so those are what I’ll be looking to modify.

    Injecting Some Variability

    The two types of system described above can be treated as belonging to two classifications: high-sooner, or low-sooner.

    To inject some variability, we simply need each character, after they act, to roll a d6 and add it to their previous initiative value to get their next-round value.

    If you’re in slow time, that means that you have either rolled low (in a low-sooner model) or rolled high (in a high-sooner model). It’s as simple as that.
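    In code terms, the whole scheme is just a running tally. Here’s a minimal sketch for a low-sooner system (the character names and starting values are invented for the example):

```python
import heapq
import random

# Low-sooner model: the lowest running total acts next; after acting, a
# character adds d6 to their total to determine when they next act.
def run_combat(starting_init: dict[str, int], actions: int = 8) -> None:
    queue = [(init, name) for name, init in starting_init.items()]
    heapq.heapify(queue)
    for _ in range(actions):
        total, name = heapq.heappop(queue)            # lowest total acts now
        print(f"{name} acts at {total}")
        heapq.heappush(queue, (total + random.randint(1, 6), name))

run_combat({"Aria": 2, "Brond": 3, "Ogre": 5})
```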

    Variability Modifiers

    Anything else that we want to factor in can largely be treated as a modifier to that die roll – with limits on how much it can change, so that you don’t waste a lot of time dealing with lots of modifiers.

    In low-sooner systems, anything that makes your perception of time better, that aids your comprehension, subtracts 1 from your die roll. That’s anything and everything – and it’s not one per item, it’s one for anything at all.

    In high-sooner systems, you add 1 instead.

    That includes things like a simplification of the tactical situation, being a lot more capable than the enemy, outnumbering the enemy, and so on.

    More robust alternatives make it plus-or-minus 1 for each of the named factors – but the adjusted roll can’t drop below 0; anything more than that simply goes to waste.

    The point here isn’t to have a big adjustment in a given combat round, but a steady accumulation of them as different advantages add up.

    The GM can also decide that a tactical situation has worsened – the enemy get reinforcements or whatever – and impose a modifier on everyone except selected characters as a one-time thing. This covers situations in which a character is flanked and has to try to focus in more than one direction, and so on. These assessments should take place, with immediate effect, after the last character acts in a round.
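    Folding all of that in, a sketch of the ‘more robust’ per-factor variant for a low-sooner system (the parameter names are mine):

```python
import random

# Each advantage sharpens time perception (-1 per factor); each disadvantage,
# including GM-imposed situational penalties, dulls it (+1 per factor).
# The adjusted roll is clamped so it can never drop below zero.
def next_init_roll(advantages: int = 0, disadvantages: int = 0) -> int:
    roll = random.randint(1, 6)
    return max(0, roll - advantages + disadvantages)

# Outnumbering a weaker enemy in a simplified situation: three advantages.
print(next_init_roll(advantages=3))       # d6 - 3, floored at 0
```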

    Tactical Focus Vs Tactical Myopia

    Finally, we have the problem in which characters become overwhelmed, causing tunnel vision. Once this happens, the GM should impose a modifier based on the character’s intelligence score and use it to move the initiative value in the direction of ‘go slower’. If they ever reach the point of trying to take initiative points off a score of zero, these should be applied as a ‘go faster’ to everyone else – it’s all relative values.

    Once tunnel vision occurs, the character fails to be aware of anyone else doing damage to them. All attacks against the character get the usual ‘surprise’ bonus or ‘from behind’ bonus, whichever is greater.

    Trigger

    So, how does this happen? In low-slower (i.e. high-sooner) systems, it’s easy – any modifier that would push the character’s initiative value below zero instead puts the character into this condition.

    It’s a little bit trickier in a high-slower (i.e. low-sooner) system; we need to establish a triggering threshold. As a general rule, 5 + the low init value + the high init value should be a reasonable threshold. If the status seems to be triggered too often, raise this by another 5.
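    A sketch of both triggers, for bookkeeping purposes. Note that I’m reading ‘the low init value + the high init value’ as the lowest and highest initiative values currently in play, which is my interpretation rather than a certainty:

```python
# Tunnel-vision triggers for both classes of system (an interpretive sketch).

def trips_high_sooner(modified_value: int) -> bool:
    """Low-slower systems: any modifier pushing the value below zero trips it."""
    return modified_value < 0

def trips_low_sooner(new_value: int, in_play: list[int], bump: int = 0) -> bool:
    """High-slower systems: trips past 5 + low init + high init.
    Raise bump by 5 if the status triggers too often."""
    return new_value > 5 + min(in_play) + max(in_play) + bump
```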

    Exit

    Any change in condition that moves the character away from the threshold gives the character the chance to refocus – but note that the character is in a tunnel-vision state in which only the enemy right in front of them exists. In practical terms, that means that the enemy in question has to go down, or get flanked by an ally of the overloaded PC. The character can then take a round to clear their head and generate a new, unmodified, initiative score.

Finally, it would be extremely irresponsible of me to offer up such a set of game-mechanics house rules without considering the potential impacts.

Opening New Possibilities

Before I go there, though, I’d like to point out that this proposal offers more than just what you’ve seen on the face of it – a yin to the downside’s yang.

    Feats To Manipulate Initiative

    This opens up a whole new class of combat feats – you could have feats that negate a certain category of negative modifiers, feats that let you impose a negative modifier on an enemy, feats that let you add a positive modifier to an ally, feats that modify the die size, feats that force the other side to modify their die size…

    Classes That Manipulate Time

    Similarly, you can have classes or class abilities that do some or all of these things. In fact, you could have an entire class or subclass built around the concept of combat-awareness, or of creating combat confusion in their enemies.

    Spells/Magic items That Manipulate Temporal Perception

    And, of course, there can be spells that temporarily confer these effects – and magic items that permanently confer them – on those who otherwise wouldn’t have access to them.

This analysis doesn’t quite answer all the possible questions. “Is a barbarian who is Raging more susceptible to Tunnel Vision?”, for example. I like to leave some such issues open for individual GMs to resolve, because that enables them to make the concepts their own.

The Inevitable Question

Is it worth it?

That’s a difficult question to answer. Despite efforts to minimize it, there can be no doubt that these modifications will add some overhead to the system that determines who goes first – that, after all, is what it’s designed to do.

A Warning From The Past

No matter how straightforward it appears on the surface, any recurring modification to the mechanics has to be approached with an air of trepidation. If you don’t know why that should be the case, take a look at My Biggest Mistakes: The Woes Of Piety & Magic, most especially, the first of those subjects. I assure you that it remains an object lesson to this day, not only to myself, but to everyone who played in the affected campaign. Okay, so that’s just one surviving player these days, but still…

An Act Of Balance

There are a lot of benefits promised for this set of house rules – and they would all serve to bind this modification more tightly to the campaign.

My personal opinion – well, I have several of them.

  • There are very few combat systems that modify the mental state of the combatants, discounting the Sanity mechanics of Call Of Cthulhu. So this would immediately be a point of distinction for any campaign.
  • I think there’s more than enough analysis offered to show that the results would be more realistic in ways that most game systems don’t even recognize.
  • The game mechanics are designed to be as far from onerous as possible. Even so, without considering fringe benefits, it’s problematic whether or not they are worth implementing on blind faith and optimism – but they are worth trialing.
  • Those fringe benefits are huge, but come with a downside to match. If the basic modifications have passed a successful trial period, that limitation goes away and the goal-posts move.

Ultimately, these leech a little of the abstractness from an initiative system and make it a little more simulationist. That could be seen as good, or bad, depending on your perspective; you need a balance between both for a system to be practical.

If you try them and like them, you may find it necessary to further abstract some other element of the combat system to compensate.

All I can do is provide food for thought and some guidance. The rest is up to you!
