
Who Got Poker In My RPG?


At my workplace there’s a poker group. At my previous job there was one too. Players in my Riddleport campaign play in such groups. Poker games are even broadcast on prime-time TV. When did this game become so popular?

If you have a raving poker fan in your group, I thought I’d put together a few ideas today on how to include the game in some form in your campaign and give that player a thrill. It’ll be like chocolate in their peanut butter!

Meet The Villain

Losing at a card game can become clever foreshadowing. Have the PCs play against the villain or a notable minion.

Bonus points if the villain keeps their identity hidden during the game!

They’ll cheat, of course. And the loss makes for great foreshadowing when the players confront the NPC in an encounter later in the campaign.

If the PCs somehow win the hand or clean out the bad guy, the villain flips the table and announces revenge. A dish best served cold, my friend.

Get Real

Try to get a realistic feel for the culture of the game. You want to do more than deal out a few hands. You want to roleplay it.

You might visit a casino and observe. Take notes on the sights, sounds and smells. Look at the ceiling, the floor and in the nooks and crannies to collect cool little details you can add to the game scene.

You can also visit poker sites online to get a feel for the lingo, style and themes to help your encounter and NPC descriptions. You can even compete and win at real-money poker sites if you want to feel firsthand what it’s like, for all you method-actor GMs out there.

Theme It

Mash up poker with your campaign setting to create a unique version of the game for your players.

Perhaps the dealer is a cigar-chomping quasit. The chips are a special set brought back from the Goblin Lands, covered in goblin runes with accompanying teeth marks. Instead of dealing cards, you cast them. Instead of ante, it’s spit. You declare shield instead of call.

Even better, give all the hand combos new names. Full House becomes A Coup. A Straight is a Crossbow.

Check out this list of hands to help you figure out what you can rename thematically.

It’s Just Fun

First, a word of caution. Do not play for real money in the game. That seems obvious to you and me. But to someone with the fever, you bring out a deck of cards and say there’s a game in the tavern, and their eyes get squinty and they reach for their wallet out of habit.

No matter how much they ask, keep that aspect to the real game. Think I’m joking?

Combine a casual-style RPG player with a passion for poker, and you can see where their loyalty lies. They’ll cancel on you to play that other game. They’ll wear a card suit as an earring. They’ll talk the language in-character.

If you run a game within your game, they’ll want to do it for real. Stick to your policy of playing for fun only.

As a compromise, to create real stakes, gamble for GM helper roles. Play a fixed number of hands. The one with the most gold pieces at the end has no duties. The one with the least has table cleanup and garbage duty. Assign other responsibilities to everybody in between – scribe, quartermaster, mapper and so on.

That’s assuming stakes are not already high enough with characters gambling their own wealth away. :)

Find Cool Cards

You can find themed cards, chips and accessories online.

Enhance play, for example, with a fantasy deck. Maybe you have enough discarded Magic cards to create a poker deck.

If you just have regular chips or can’t find cool themed ones online, buy stickers and apply them. You could also paint them up to get the look you like. Actually, metallic spray paint + stickers gives you a fantasy or sci-fi set pretty fast.

Hinge A Game Outcome On It

Make the results of the game affect the game world in some way.

OSR folk love getting chess sets into encounters as puzzles and mini-games.

I remember in one campaign I used a chess board as the Game of Gods.

As events unfolded in the world, the chess board got updated. One of the players had the ability to scry this board to see developments. As they were mid-level, they also influenced the moves on the board, which made the scrying even more important. “Did we successfully block the King’s check!?”

Additionally, I turned the pieces into NPCs. And the scryer could see some pieces cracking, vibrating, leaning and so on. In this way, I provided plot clues.

You could give poker the same kind of real game-world impact. The cards could be NPCs. Or a mix of NPCs, locations and items. Hands dealt or played could be encounters and events.

The game players? Gods, demon lords, kings, imprisoned mega-psionicists, or unknowing rogues in a plane far away. Pick a group that would make cool epic tier NPCs for your game or future adventure opportunities.

There you have it. Six ideas for getting this real-world hobby into your RPG:

  1. Foreshadow future confrontation against the villain
  2. Observe real poker to help you roleplay it better
  3. Theme things in-game to give you better ambiance
  4. Raise the stakes with GM helper duties
  5. Find great props
  6. Base the plot on the hands

Have you ever used the game in your RPG sessions? How did it go?


The Imperial History Of Earth-Regency, Part 11: The Post-Modernist Dark Age – 1998-2015


This entry is part 11 of 12 in the series The Imperial History of Earth-Regency


Pieces Of Creation is an occasional recurring column at Campaign Mastery in which Mike offers game reference and other materials that he has created for his own campaigns.

All images used to illustrate this article are public-domain works hosted by Wikipedia, Wikipedia Commons, or derivations of such works, except as noted, and except for the Image of Big Ben as a 9/11 target, which is the author’s own work.

This article is a work of fiction and no endorsement of the content should be attributed to any of the individuals or institutions named, photographed, or credited.

Author’s Notes: It has been almost a decade since the first draft of this Alternate History was written in 2003. While some of the events forecast in these pages have come to pass, others have not; so this marks the point in the narrative where significant divergences occur, simply because my imagination went left while the real world went right. Very little in the overview below has been changed from the original text, though there have been a few clarifications here and there.

Nevertheless, the content of this series has been updated, as each part has been published through Campaign Mastery, to reflect the benefits of a further decade of life experience and perspective, and the next few chapters will be likewise enhanced with the introduction of more “real history” into the timeline.

This ‘era’ within the fictional history marked, and marks, the transition from based-on-real-history to completely fictional. All the 2012 updates do is smear that transition over a ten-year span within the history. But it will mean a gradual change in style, and possibly shorter subsequent chapters, due to the added research and development requirements.

When one era ends, it’s traditional for a new one to begin. But the seeds of the new era had been laid many years before the conclusion of the Communications Era, with the development and popularization of Home Computers and the Internet. The dominant theme of the new era would be the consequences of the events of the 15 years which preceded it.

Ted Nelson, one of the many fathers of the internet, in 2011. Photo by Gisle Hannemyr

The Popularization of the Internet

The internet began as ARPANET, a means for defense researchers at different universities to exchange technical information by computer linkup without the need for a face-to-face meeting. The scientists who used the systems found the technology so convenient that they began using it in their personal lives. From that point on, it became inevitable that a domestic equivalent would eventually emerge: e-mail had arrived. But it was when Australian researcher Ted Nelson first invented the hyperlink in the 1960s that the seeds of the phenomenon most people envisage when they refer to the internet were truly sown.

Nelson’s proposals were ignored for almost 20 years, but when they finally caught on, the internet exploded. Crucial to this was the mass-popularization of Microsoft’s Windows 95 operating system, which brought many different network protocols & content types – FTP for file transfers, HTTP for web pages, JavaScript and ASP for interactive scripts on web pages, and SMTP/POP3 for sending and receiving emails – together into one (relatively) seamless system, with one type of communications linking directly to another through a simple point-and-click user interface. People didn’t see all these different programs as independent and separate; they saw a single entity – the internet – which had all these different things that could be done with it. This was the ultimate development of the Communications Age.

Nothing this revolutionary could come into existence without widespread social consequences. At first, this was dismissed as a possibility, even while the Dot Coms were setting stock markets on fire. Email was just like regular mail, just faster. The web was just like the magazine rack in the world’s best library – except that all the magazines were by amateurs. Web Pages were regarded as a mass-production extension of the Desktop Publishing that had predominated in the late 80s and early 90s. Even as late as 2002, 70% of internet users cited email as their primary purpose for an internet connection. But all Booms must Bust, and the Dot Com collapse which began in 1998 lasted for more than five years. Despite this implosion, there were four developments that would signpost and make possible a firestorm of change in subsequent years.

Zappos, based in Henderson, Nevada, was already the largest online shoe retailer when it was purchased by Amazon in 2009. Their fulfillment centre - a small part of the Amazon online empire - shows the incredible scope of the Amazon retail juggernaut. This image is from 2006 by lizzielaroo.

Online Retailing

The first of the four was the development, through scripting languages – JavaScript & ASP in particular – of online purchasing systems. Pioneered by the online bookstore Amazon, this development challenged the concept of the storefront display, one of the greatest expenses of a startup business. As the Dot Com explosion proceeded, it was proclaimed as an absolute necessity for a business to have a web site, and for customers to be able to interact with the business through that website as though it was the actual store. Although it never reached the point of totally wiping out in-corpus purchasing, due largely to the rise in internet fraud and public wariness, the first part of the prediction was largely self-fulfilling prophecy; the more companies acted on that perception, the more the internet boomed, and the more necessary it came to appear to other companies that they too needed a web presence to remain competitive.

The most significant consequence was the erosion of trade controls and regulations at the direct customer level – if it was cheaper, with transport costs, to buy a book or a CD from the US, or Venezuela for that matter, you could – and hundreds of thousands did. While internet shopping never grew at the hysterical rate predicted by the “experts”, it did grow, year-by-year, and with it the economic basis of whole nations slowly changed. The internet had bypassed the local economy completely – or, more accurately, had put small manufacturers and multinationals on an even playing field, no matter where the small manufacturer was located.

This in turn inspired shady entrepreneurs, who realized that the internet was not only unregulated, it was almost impossible to regulate. Porn websites proliferated, bypassing local censorship regulations. The internet became the Great Leveler, reducing enforceable laws to the lowest common denominator. Gambling sites soon followed. Many countries made vain attempts to limit or restrict the content placed on servers within their jurisdictions, hoping to turn back the tide; the owners of those sites simply moved the websites to servers in other countries. They did not even have to physically move to those countries, the entire transaction and infrastructure set-up taking place electronically, over the internet. Typically, it took less than 24 hours before the site was back on the web.

A version of the famous (or notorious depending on who you ask) Napster icon, from a set of themed icons by DBGthekatu.

File-sharing & The Media Industries

The Entertainment industry learned nothing from this lesson, as shown by their reaction to the third major development of the new era, the combined arrival of MP3s and file-sharing. The first such combination, Napster, was eventually shut down, because it had centralized servers that contained the index of files available for sharing, and those centralized servers could be targeted; but even before it was defunct, new programs were available to the public free of charge that did not have this legal vulnerability.

The entertainment industry, already stung by surveys showing that the internet was beginning to eat into “traditional” recreations like television, responded in 2003 with lawsuits that were beyond any rational belief. They had already tried to gain exemption from liability for any damage they might “inadvertently” cause in invading personal computers via web-based viruses in the search for illegal files or the technology to exchange them; increasingly, their reactions were more suggestive of a state of panic than of a rational business. The irony was that they had only themselves to blame for this state of affairs; in the 1980s they had decided to eliminate relatively cheap LPs in favor of CDs, which they then priced at profit levels that were, to say the least, exorbitant. This created an unsatisfied demand which ensured that alternative (free) distribution methods would flourish. They had also neglected the MP3 phenomenon in its early stages, despite being given the opportunity to bring out a “downloadable files” option of their own. Having decided that internet distribution of movies and audio would not amount to much, because file sizes were so extraordinarily big, they were caught entirely unprepared when new file formats and improvements in internet technologies took file sizes to relatively small numbers without apparent impact on quality. It was their own unwillingness to embrace the potentials of new technology that had left them vulnerable to the problem in the first place, and their own greed that had created the demand.

The IRIA (Imperial Recording Industry Association) lawsuits of 2003 were terrorism by law; the damages sought were set so high in order to ensure that the defendants would never even contemplate fighting the case in court. An out-of-court settlement, setting a precedent favorable to the Recording Industry, was what they were sure of achieving. It didn’t work out that way, for a very simple reason – Iraq.

An F/A-18C Hornet coming in for a landing aboard the USS Constellation (CV-64) after a mission in support of Operation Iraqi Freedom, whose objectives were to liberate the Iraqi people, eliminate Iraq's supposed weapons of mass destruction, and end the regime of President Saddam Hussein. Photo by Photographer's Mate 2nd Class Daniel J McLain.

The Gulf War II Connection

In 2003, the Empire took up where it had left off in the invasion of Iraq, as officials tired of the games Saddam Hussein had been playing with inspection teams (and became wary of increasing Mao tensions over the issue). The result of that conflict was fairly predictable, given the circumstances; but there were some unexpected spin-offs.

Notably, the Imperial Government took note that many of the most patriotic pieces of music were still under copyright, meaning that they could be used on websites only at the owner’s risk. This was not an acceptable situation, given that troops were fighting a war that had deeply divided the community even as the combat began. That, in turn – and in conjunction with the IRIA’s threats to sue the universities where the alleged infractions had taken place – ensured that a coalition of forces began assembling against the Recording Industry. The Universities called their alumni, especially the politicians and politically connected, and before the IRIA knew what was happening, the entire political and economic infrastructure of the Empire was bankrolling a defense fund, one which had already acquired the world’s leading attorneys (all of whom had also graduated from universities!)

A Barrister in traditional wig, 2009. Photo by Southbanksteve, Flickr

The Copyright Quagmire

Court was, in reality, the last place on earth the IRIA wanted to go. Once there, only three outcomes were possible: the jury could return a verdict for the defendants (undermining the entire legal status of copyrights that the IRIA had worked for years to create), possibly out of revulsion at the greed displayed by the damages requested; or, worse still, they could do their duty under the law and find the defendants liable, but set a trifling damages bill, destroying the value of pursuing copyright violations; or, worse yet, they could return a finding of malicious prosecution, handing the bill for all legal costs to the IRIA – costs that were sure to be unbelievably exorbitant given the size of the legal team lined up to challenge the case. And worst of all, the verdict would be legally binding and a permanent precedent! Any appeal would place the entire matter in the hands of the High Court – which had the authority to redefine copyright or abandon it altogether if it saw fit. There were no outcomes that benefitted the IRIA – but they had never expected their bluff to be called. Nor could they afford to back down from it – the loss of credibility would do exactly the same damage as a loss in court.

Screenshot of the Chatzilla IRC software in use. Image is subject to the Mozilla Public Licence.

IRC & The Law

When the case actually reached the Court of Manchester in 2005, the legal process was in its own way being radically overhauled by the increasingly omnipresent internet, by means of a fourth technological innovation – Internet Relay Chat, or IRC. This enabled multiple people to converse by typed text with one another in utter silence save the tapping of a computer keyboard. For the first time, IRC connections between laptop computers via wireless networking enabled the lead attorneys to have their entire research staff hooked into the case, locating precedents and past arguments and rulings more quickly than the clerk of the court could.

It was noteworthy that the defense team had this technology, but that the prosecution did not; a circumstance many considered symbolic of the entire affair. It was the Progressives vs. the Luddites all over again, and the Luddites were doomed to failure. One reporter would liken it to “The One-Eyed Man in the Kingdom of the Blind”, so profound were the differences. In effect, there was a legal team of over 2000 people defending against 3 unassisted lawyers. Much to the chagrin of the IRIA, the defense steamrollered them – and managed to get the whole problem kicked up to the Supreme Court on a jurisdictional issue even before a verdict had been reached. The result was a complete redefinition of copyright within the Empire, one driven by the purpose to which the copyrighted material was being put – and one that put an end to the power of the IRIA.

Just part of the plethora of connections that made up the Internet (also known as 'The Cloud' and 'The Dreamtime') on Jan 15, 2005. Image by the Opte Project.

Mass Intercommunications & The Dreamtime

Internet Chat had other implications for society. For the first time, people living in many different countries were able to talk directly to one another – it was not uncommon for a chat room to have Europeans, Pakistanis, Arabs, Americans, Australians and South Americans all chatting at the same time – and those who participated began to forge bonds of understanding and to adopt a more cosmopolitan view. Communities of those with common interests arose, as they always do, but with the fundamental difference that Geography was irrelevant – these were neighborhood clubs whose membership happened to be scattered all over the Earth.

In the past, it had been held by some cultures that reality was more than the eye could see, that overlaying the physical world was a spiritual world containing forces, allies, and foes, a world that was invisible to all but the initiated, and in which things were possible that were either impossible outright, or impractical at the very least, in the physical world. Amongst the citizens of the Empire, the Australian Aborigines and the Inuit had the clearest views of the “spirit world”, preserved despite the influence of Modern “Education”. It was a world of unsuspected dangers and unimaginable possibilities. The Internet was similar, a Dreamtime of new dangers and new promise, an electronic web connecting 70% of the Empire together in ways considered science fiction a decade earlier. The correlation between the two would spark new religions and new analogies; by 2015, “the ‘net” had become known colloquially as “the Dreamtime”.

The DynaTAC 8000x, also known as "The Brick", was the world's first commercially available hand-held mobile phone. The version released in 1983 weighed 28 ounces and was 10 inches tall - not counting the antenna! Previous mobile telephones either had to be mounted in a vehicle or were contained in heavy briefcases.

Collaboration In The Dreamtime

IRC continued to evolve; with improvements in compression technologies and faster computer processing, it became possible first to chat using spoken words, and then to communicate with visuals. Shared electronic whiteboards enabled design teams to collaborate no matter where in the world they might be physically located. Surgical instruction and supervision was conducted live as operations proceeded, over the internet. It became increasingly common for people to work from home and “telecommute”, with obvious ramifications for the transport industry (Canada astonished the world by becoming the world leader in this social trend).

It was a French company, Yahoo – which, thanks to the power of the internet, many people thought American in origin – who first realized that, with relay servers connected to phone lines in almost every local district of every nation of the Empire, it was possible for a telephone call to be routed through an internet connection from another country to a local telephone. A single internet connection made all phone calls local, and almost free. They built the technology into their free “Messenger” IRC software as much as a proof-of-concept as an actual marketing exercise. At first, the service was only to USK telephones (from anywhere else in the world); but one after another, more districts and countries were connected.

Telephone monopolies, accustomed to reaping the bulk of their profits from long distance communications, became increasingly unprofitable, and were forced to resort to draconian business practices of the most cutthroat variety to stay ahead of the competition. It could be argued that in many ways, the various governments chose the worst time possible to privatize and deregulate the industry, as standards of service were sacrificed to the almighty bottom line of profits.

Nor was this the only challenge to confront the traditional communications monopolies; radio-based mobile telephones evolved rapidly with improvements in technology, leaping from the wildly impractical to the ubiquitous in less than a decade, and then continuing to both shrink in size and grow in capability. The majority of telephone companies had been more than willing to ignore these devices when they were bulky and unwieldy; the sudden explosion brought with it new and more modern rivals. Worse yet, in order to compete with these rivals, their entire infrastructure would need to be overhauled, even replaced, placing a substantial burden on the operating capital of the telecommunications giants. For some, the burden would prove too much; they would go under or be broken up, their assets consumed by the johnny-come-lately telcos. Others managed, leveraging the size of their existing networks in partnership arrangements with the new services in order to raise the capital needed for infrastructure development; but these could not possibly compete with the powerhouses that the newcomers became with this added advantage. Increasingly, they would begin to fade from the services sector of the industry or face imminent collapse at every turn, becoming administrators of the communications “backbone” and the hubs to which the modern service providers purchased access. Their customer base was no longer the man in the street, but the company with which the man-in-the-street did business – a wholesaler, not a retailer.

The unsustainable progression of a classic pyramid scheme from a US Securities & Exchange Commission report on the subject.

There is No Heaven without a Hell

The new technologies brought new crimes and new threats. Identity theft was one of the most prominent. The growth of pay-by-internet made it possible to steal credit card details and use them without ever possessing the actual credit card. From this beginning, enterprising criminals found that if they intercepted mail addressed to an individual, they could quickly acquire enough documentation to permit the opening of multiple credit cards in the victim’s name, enabling them to purchase goods and obtain cash advances as they saw fit – without the victim even knowing that a crime had been committed until a month after the fact, when the credit card bills arrived.

In 2004, a refinement was devised by Eldon Bartels; he redirected the account mail to a safe-house under his control, and used the proceeds of one “stolen” credit card to make the monthly payments on several others. The result combined the “best” aspects of Identity Theft and a Pyramid scheme. By the time the plot was accidentally discovered and Bartels arrested, he had managed to acquire an estimated 14 Billion dollars through credit card fraud. Hiring the best lawyers – he could afford them – he received a 6-month suspended sentence and retired to the Bahamas.

The 404 file not found page of Wikipedia on 29 Sept 2011. Image by Mover. This sort of page is what customers see when attempting to reach a web server that has been subject to a Denial of Service (DoS) attack.

The Denial of Service Threat

The other threats were more technological, designed to exploit people’s penchant for connecting their computer to a network of other computers. Usually in furtherance of idle mischief, these menaces – worms, trojans, exploits, viruses, and metaphasic threats – were all about gaining access to one person’s identity and passwords to permit more petty thefts. Obtaining the user code to a piece of software could save the perpetrator hundreds of pounds. A few used this as an adjunct to Identity Theft, but most were by petty criminals or vandals.

Somewhat more serious were denial-of-service attacks, where viruses with a time-based trigger were set to attempt to access a particular server at a given moment, or to flood a service with email – whatever that server was designed to do, these were intended to overwhelm it.

The first such attack was an accident, but several others followed, usually in support of political activism. Whenever a company with a web presence – which was virtually all of them – did something to offend someone, there was always a chance that the company would be the target of a D.O.S. attack. In many cases, these attacks had little direct impact; the fact that the website for “Joe Bloggs Tire Service” was overloaded didn’t make much difference in the scheme of things. But few companies hosted their own web servers; it was far more common for several companies to lease disk space and web access from a Hosting Service. In which case, whoever was so unhappy with Joe Bloggs had not only shut down access to Joe’s site, but had cut off everyone else who just happened to have chosen the same Hosting Service. And of course, the rest of the internet was slowed, sometimes dramatically, by all the extra traffic flowing across it. Like graffiti, the expense to the world in general that was incurred in recovering from a D.O.S. attack far outweighed the damage to the actual target, and the costs of mounting such an attack. Slowly, people – and the courts – began to take the matter increasingly seriously.

The rise of Spyware drove the development of new tools to fight the problem. HijackThis is an analysis tool targeting browser-hijacking methods; it was released to Open Source on February 16, 2012. Click on the thumbnail to visit the Wikipedia page describing the software and its use, which also has a link to the project website at SourceForge.

Spyware: The War For Privacy

Many of these attacks were in retaliation for attacks on the privacy of individuals, which came under ever-greater pressure in the post-internet world. The rise of new deployment strategies for software which funded development and support costs through advertising – Adware – led to the incorporation of sub-programs which explicitly tracked web surfing habits, name, address, online purchases, and so on.

Then someone hit on the idea of doing the same thing without the ads – so that the victim didn’t know their security was compromised. This type of software was named Spyware. Increasingly rapacious marketers developed new tricks – like tiny, invisible graphics called Web Bugs – which enabled them to determine which site people had come from and which site they went to when they left. The companies claimed that the information was aggregated and homogenized, statistical in nature – but as the Dot Com collapse continued, and mergers resulted, it quickly became possible for a company to have two separate databases, neither of which contained enough data on an individual to identify them, but which shared enough common details that it was possible to treat them as one large database – with the result that the company knew all about the person.

The 9/11 terrorist attack against Big Ben united most of the world in outrage. Restoration of the Imperial Icon would be complete in 2015. Click on the thumbnail for a larger image.

9/11 & The Hunt For Al-Qaida

This is not to say that more physical threats were a thing of the past. If anything, the new breed of terrorist was more threatening than their predecessors had been.

On September 11th, 2001, hijackers seized control of a number of fully laden passenger aircraft and flew them into key Imperial landmarks and structures. Big Ben was destroyed. Buckingham Palace was severely damaged (fortunately the Royal Family was not in residence at the time). Another plane crashed before it could strike the headquarters of the Imperial Military Planning & Intelligence Services. Imperial Citizens throughout the world were outraged and united in anger; this one act of barbarism created more solidarity on a single political issue than ever before. Even the Mao declared their support and cooperation in the hunt for those responsible.

For the first time, Mao intelligence was assisting the Empire, but that proved less beneficial than Imperial strategists had expected – largely because, through earlier alliances with the Chinese, parts of the Middle East had acquired Mao technology, and were shielded against their most effective methods. Nevertheless, the hunt for Osama Bin Laden and his Al-Qaida terrorist network proceeded with a determination and ferocity that was as much about wounded pride as it was political necessity.

Movement of the NASDAQ index from 1994 shows the inflation and collapse of the Dot Com bubble. Image made by ed g2s using publicly-available data from nasdaq.com.

The State Of The Economy

The Imperial economy represents the spending habits of billions of people, and in times of transition it is acutely vulnerable. Lasting fortunes are won and lost in such times, and the post-modernist era was no exception. Most visibly affected were the “Dot Coms” of course, many of which were financially unsound from the moment of inception. But also transformed and transfigured were a raft of other industries: Postal, Communications, Advertising, Entertainment, Transport, Education, Insurance. These in turn had knock-on effects on other industries. Restaurants that had relied on office staff found themselves unviable with so many workers not in daily attendance (due to telecommuting). Corporate offices became less necessary, but meeting rooms took their place. Even so, the average size of a corporate headquarters would shrink markedly over this period, which reduced the infrastructure costs of business, helping pave the way for greater prosperity and economic diversity in years to come.

Robotic workers photographed (in real life) at the Shanghai Science and Technology Museum by Mountain.

The Consequences of a changing Demographic

The era was always going to be revolutionary, even without the impact of the Internet. The 1950s and 60s had seen a population boom of unprecedented magnitude, with social consequences to match. With the workforce expanding faster than the economy generated jobs, it was inevitable that there would be massive unemployment – with the peak occurring in the decades 1970-1990 as hordes of better-educated workers entered the job market. This, in turn, left humane governments with little option but to adopt social policies designed to support the unemployed. Even paying a pittance – and most unemployment payments were a bare minimum sustenance level – boosted the amount of cash flowing through the economy over what had been there in previous years.

All these people needed to spend their money – generally on the necessities of life – and so supermarkets and takeaway restaurants and video rental dealerships and the like flourished. The growth in the retail sector was driven by lower-income earners, and primarily geared to satisfy their needs. This in turn generated additional employment requirements, and eventually an unstable equilibrium was reached.

As population growth had moderated after the Baby Boom, so 18 years later, workforce growth also began to moderate. Through the early 1990s, stability was at last achieved. Governments took the credit for their economic management. From about 1995 onwards, as women retired from the workforce to raise families, and the baby boomers began to drift out of the workforce through age and attrition, employment growth began to outstrip the growth in workers, and the unemployment rates slowly began to recede. Not by much – about 0.1% a month, most months – but these added up. From a high of 12% in the mid-70s, unemployment first receded to a stable 6.5% in the mid-90s. The Dot Com collapse temporarily drove it back up to almost 8%, but by 2004, it was back down to 6%, and the decline was accelerating as the first onrush of workers approached mandatory retirement and life on an old-age pension.

In some specialist, well-paid professions, or in those where working conditions were unusually poorly rewarded, such as Doctors and Nurses, there was already a greater demand than could be met, and had been for a decade. Governments took the credit for their economic management – again. In 2010, unemployment hit 1%, and even the dullest of political thinkers had realized what was going on. By 2012, there were more jobs than there were people to fill them, and the only unemployment was an irreducible minimum of people in transit from one position to the next.

Of course, the aging population, in combination with the technological developments, had tremendous impact on the type of work that people were being employed to carry out. Tourism, especially on the local/national scale, aged care, and health care of all types, skyrocketed. There were some losses in construction and similar areas, as there were in clerical staff, but these were relatively stable. The biggest losses were in “blue collar” labor, especially manufacturing, where production became increasingly automated. The ideal was one worker (now a white-collar production supervisor) to one production line, but that was never quite attainable – it was simply too rigid to adapt quickly enough to changing economic circumstances.

Although obvious in hindsight, these patterns were rarely appreciated at the time, and the economic consequences came as surprises when they shouldn’t have. The explosion in service providers to the lower end of the wage spread surprised everyone in the 1970s and 80s. Fast Food chains proliferated. Where a town might have supported one supermarket, it had two or three. The manufacturing and infrastructure demands of this active population pushed industrial systems to the limits and beyond, and had much to do with the pollution problems of the era. As the baby boomers progressed into middle and upper management positions, it came as a total shock that they chose to use their greater spending power; the industries that had initially exploded began to contract, to make way for a proliferation of mid-priced restaurants and service industries, who carried out the tasks that the workers had no time or inclination to provide.

Convenience and quality of life were the objectives of the growth industries of the 1990s. In the first decade of the new century, as the problem of unemployment receded, the problems of an aging population grew. Too many people were unable to provide for their own retirement, and where once it was unemployment support that was the critical social expenditure of national governments, the pension became dominant. The result was a resurgence of the industries that had dominated growth in preceding decades, but it was short-lived. Even as demand was growing, the number of available workers was shrinking; and in order to attract staff, wages and employee-related expenses were rising; and these forces demanded a contraction of the market. The result was a number of spectacular collapses and mergers, with concomitant economic chaos.

To meet the needs of business operations, two words became the touchstones of the economy in the late 2000s: Retraining and Subsidy. If a business was going to need a professional, it was better (and less expensive) to hire someone fresh out of school at low wages and pay their way through the education process – as well as giving them real world experience in the business practices of the company. Businesses resumed the roles of Patrons, just as they had during the Renaissance. In order to recoup their expenses, and hold onto their employees, businesses began moving to longer employment contracts. It was not unusual for a new employee to be hired on an eight-year contract – two years of part-time study and part-time work, three years of full-time study, and three years of full employment. As the working population declined, so did the causes of pollution – the number of cars on the road, for example. Although it would be decades before any sort of genuine ecological recovery was underway, the new century slowly shed the problems of the last like a used overcoat.

The Emperor William II in RAF service uniform prior to ascending the throne. Photograph by Robert Payne, Flickr.

The Politics of Empire

In the middle of all this, a new Imperial Monarch came to the throne. Elizabeth defied all predictions by remaining in power for 54 years, but over the first decade of the new era her health increasingly deteriorated. In 2006, she succumbed to a stroke that left her incapable of carrying out the duties of her office, and Prince William became the Emperor William II. Given her age at the time, it would be stating the obvious to point out that to most Imperial citizens, Elizabeth was the only Monarch they had ever known. The presence of a new Monarch with new ideas and a new style was going to shake up the Empire in ways they could barely imagine.

Since the events of 1993, there had been a quasi-stability about the Empire – the Empress had established in that year that she could override the Civil Service Policy in any matter that came to her attention, but her conduits of information were still controlled by the Civil Service, and it had not taken them long to realize that if they disrupted her attention with minor matters while obfuscating the important issues, and filled her days with social engagements, her powers were effectively neutralized. She had been unable to batter her way out of the cotton wool in which they had increasingly cocooned her, and the Civil Service machine had, after a hiccough or two, rolled majestically onward, untroubled by the demonstrations of the power of the throne. They were confident that the same now well-established and polished techniques would soon have William II firmly emplaced as a figurehead as well, leaving the running of the Empire to ‘those who knew how to do it’ – they just had to “housetrain” him. In the meantime, the search for a bride – receptions, galas, state visits – would keep him happily neutralized for months to come.

View across St Salvator's Quad at the University of St Andrews. Photo by Oliverkeenan.

Reforms Of The Anonymous Monarch

They reckoned without the impact that a technologically adept Emperor could have. William knew full well how to use the internet, and had many aliases for use in Chat Rooms, where he could find out what his citizens were really thinking. Fully capable of his own research, and knowing that this day would eventually come, he had been drawing up an agenda for sweeping reforms well in advance of actually ascending the throne. He also had the advantage of an informal second education from his Grandmother. It had not taken the Empress long to discover the political realities of the post-1993 situation, and she had spent the next 10 years studying the civil service and their techniques, and working out ploys with which to outmaneuver them. When she had her course in “Advanced Politics” ready, she handed her notes over to William. Far from being an insulated and fairly callow youth of only 24 years, he was more fully prepared than any Monarch who had previously ascended the throne.

He started by delegating all ceremonial and social matters to his father, save those few which were approved by that social guardian. He then presented his agenda both to the public and to the civil service at the same time. The principles on which he proposed to reform the Empire were simple:

  • Any bureaucrat whose sole responsibility was to the bureaucracy itself had his position placed in abeyance, pending reallocation of manpower.
  • There were to be no more than two layers of bureaucrats between himself and his subjects – one local manager of the office at which the public connected to the Civil Service, and one senior manager who reported and advised on policy.
  • All policy queries were to be directed to a working group consisting of a member of the House of Lords, the elected minister of the government, the Civil Service head of department, and an independent expert appointed by the Throne – who would have at least 24 hours and at most a week to determine how to resolve the problem – even if that solution impacted another department.

The Civil Service was horrified – it would lead to total anarchy, they predicted. There wasn’t enough time to determine what the real issues were, let alone find solutions, they forecast. Legislation could not be enacted that quickly, they moaned. A four-man committee could become deadlocked, they solemnly announced.

To which William issued a press statement conceding:

  • that there could be some confusion at first;
  • that if officials didn’t know the details of their departments, they might not be able to reach an acceptable decision in the length of time provided – but that outside of unusual circumstances, he would consider that a demonstration of incompetence, which was grounds for dismissal from the service;
  • that the government and opposition representatives had been deliberately included to permit a broader view than the isolated perspectives of the individual bureaucrat – being appointed head of a ministry or shadow ministry meant that they were placed in charge of government policy as it impacted on the subject at hand;
  • and that the potential for deadlock was the reason why he always had the deciding vote.

At the heart of William’s reforms were the concepts of exceptions processing and regression analysis, things he had learned of in a computer-programming course at University.

There was a simple rule, a principle, for each government department’s function. To that rule, a number of exceptions were generated. Each exception then became a general case. When the number of exceptions reached a certain threshold, the statement of purpose was reviewed and revised to incorporate the exceptions, and the count started over. The result was a simple set of rules which stated that “X” – whatever the member of the public had requested or required – either could or could not happen. If the member of the public disagreed, he could appeal to the staff responsible for that function of the department in question. If the basis of the appeal was an exact match for a previous appeal, and there had been no change in government policy or in circumstances, the previous ruling was used as a precedent; if it could not be judged by precedent, it would then be forwarded for review, and a new precedent would be determined to fit.
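For the programmers in the audience, that loop can be sketched in a handful of lines. To be clear, everything below – the class, the names, the review threshold – is my own invention to illustrate the shape of the idea; none of it appears in the original text.

    # Toy sketch of the exceptions-processing model described above: a base rule,
    # appeal rulings stored as precedents, and a periodic review once enough
    # exceptions have accumulated. All names and the threshold are illustrative.
    REVIEW_THRESHOLD = 10  # arbitrary: when the statement of purpose gets rewritten

    class DepartmentFunction:
        def __init__(self, principle):
            self.principle = principle   # callable: case (a dict) -> True/False
            self.precedents = {}         # case signature -> prior ruling
            self.exceptions_since_review = 0

        def decide(self, case):
            """'X' either can or cannot happen for this member of the public."""
            key = self._signature(case)
            if key in self.precedents:   # exact match for a previous appeal
                return self.precedents[key]
            return self.principle(case)  # otherwise apply the base rule

        def appeal(self, case, reviewed_ruling):
            """An appeal with no matching precedent goes to review; the review's
            ruling becomes a new precedent, i.e. a new general case."""
            self.precedents[self._signature(case)] = reviewed_ruling
            self.exceptions_since_review += 1
            if self.exceptions_since_review >= REVIEW_THRESHOLD:
                self._revise_principle()

        def _revise_principle(self):
            # In the fiction, the rule is rewritten to absorb the accumulated
            # exceptions and the count starts over; here we simply reset it.
            self.exceptions_since_review = 0

        @staticmethod
        def _signature(case):
            return tuple(sorted(case.items()))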

The Civil Service was even more horrified after all this was explained to them. It would amount to an 80% downsizing of the service. Not so, replied William – it would be a 40% downsizing because he was going to triple the number of local offices of most government departments. There would be a greater impact on senior positions, but this would be balanced by the recruiting of more of the people who actually did the work.

William then laid down the operating principles that defined each function of the various government departments, further stunning the Civil Servants and the government representatives. For example, the Social Security function of the DSS: “All Imperial Citizens earning a gross income of less than 25% of the average wage are entitled to a supplementary income of an amount set by the Elected Government in its annual budget but not less than 10% of the average wage, fixed.”

“But what about married couples where only one of them works? What about people who have millions in assets?” demanded the civil servants. “Excellent questions for review,” replied the Monarch. “Put some numbers on them – perhaps deeming an asset to have an annual income value equivalent to the current interest rate multiplied by the current value of the asset – and you will have your first exceptions. But don’t forget to include the deemed amount in the average wage.”

In general, the principles had a similar theme – all members of the public were entitled to whatever support was available unless they were specifically excluded. This sweeping reform also had the advantage of wiping out reams of legalese and casting matters into plain English. It made implementing Government policy changes so easy that there were no excuses for not carrying out an election promise – unless that promise was blocked by the combination of the Throne and the peerage for the long-term good of the Empire. At a stroke, it completely changed the power structure of the Imperial Government, and the expectations that the throne had of the branches of government.

Whitehall in London, looking south towards the Houses of Parliament. Photo by ChrisO.

The Battle Of Whitehall

The Civil Service retaliated with an attempt to drown the new system in paperwork. Every case was sent to review, every review resulted in a question, every question went to committee, every committee was deadlocked and sent to the monarch for approval. William sacked the staff responsible on the grounds of incompetence, including their managers. He had tripled the number of local offices precisely so that he would have enough redundant staff to afford to do this. The Civil Service went on strike; he imprisoned the union leaders using the essential services legislation that they had helped draft.

After 6 months, the Civil Service was reduced to about 40% of its previous size. William compromised to the extent of permitting regional and national managers, responsible for staffing levels, administration, and expenditures, restoring some opportunity for upward mobility and promotion within the service – in exchange for removing the almost automatic handing out of honors. He gave each Civil Service head the choice of additional rewards for service as index-linked pensions or honors – they could not have both. He permitted the breaking up of some old departments into smaller ones, knowing that in the process he was reducing the powers of the Mandarins at the top of each, and the creation of a number of new departments. The resulting government structures were very similar to a bank’s business model – a loans officer (appeals), a branch manager, some clerical staff, the tellers (who dealt with the public), regional and national managers, and so on.

The National Governments also underwent their own upheaval. The new structure transferred power from the party rooms into the hands of the ministers appointed by the local Prime Minister. At first, there was some risk that they would join with the House of Lords to veto the whole programme; but it was argued by William in meetings with the Parliamentary leaders that the setting of Policy would still be in the hands of the party; it was simply a matter of selecting the best member for the job of implementing those policies in the working parties. Delegation of authority worked in the military and in business, so there was no reason it would not work in politics. What it also meant was that there was no longer any room for rewarding lengthy service with a promotion to the front bench, or other such corruptions; Ministries would have to be handed out to those judged competent to handle them. Besides, how did the Government think the public would react at the next election if they blocked the plans? The voting margin was narrow, but the decree was passed by the Lower House, and by 2010 the new system was operating smoothly.

The Economic Revolution

Having reinvented Government, William turned his attention to business. The profit-at-any-cost mentality had brought the economy to the edge of collapse, and could no longer be tolerated. He decreed a charter of social obligations for various “essential industries” to achieve within 5 years, or the relevant institutions would be nationalized and run by the Government.

His first targets were the Banks, Insurance Companies, Entertainment, and Telecommunications giants. The result was an immediate sharp recession – but one in which, for a change, levels of public service rose – resulting from the affected industries selling the shares they had bought as investments. The public eagerly snapped these up, ensuring that the recession was brief.

It was the most tumultuous period of change ever recorded outside of a time of War, and it freed William to deal with what he considered his real life’s task – the political problems of international relations.

Seventeen years of transformation

These, then, were the themes of the new age – privacy under attack, basic freedoms under attack, the legal system in disarray and threatening a total breakdown as courts became increasingly computerized, terrorists conducting precision attacks calculated to cause the maximum loss of pride, prestige, and innocent lives, dictators rattling sabers and threatening with weapons of mass destruction, new crimes and new pastimes and a social revolution which empowered individuals as never before – only for the empowered to become lost in the crowd of every other individual so empowered, except to those who were seeking to take advantage of them, new business models and practices which were ignored at the owner’s peril, upheaval on every front. It was a short but intense Dark Age, in which the “barbarians” emerged to smash the machinery of society, and then to build something new on the foundation stones that remained.

Next time, we’ll get into the year-by-year chronology of this period and examine some of these events in detail…


Exceeding the Extraordinary: The Meaning Of Feats



From time to time, I like to look behind the curtain – to see what makes the mechanics of the games that I play tick, and what the implications are. Sometimes this leads down unexpected byways, and at other times it yields a nugget or two of insight. And sometimes, it just goes nowhere. So: in a d20 system (whether it be D&D, Pathfinder, d20 Modern, or whatever), what are Feats?

What Are Feats?

The 3.5 PHB doesn’t define them in its glossary. Chapter 5 of the PHB describes them as “a special feature that either gives your character a new capability or improves one that he or she already has.” The PHB then goes on to define Feats in terms of the differences between Feats and Skills, and then confuses the issue by dividing Feats up into several different types, with different rules applying to each type. Bonus Feats, Class-restricted Feats, Racial Bonus Feats and Feat Slots, Special Feat Lists… It’s a very flexible game mechanic, and it’s been used in all sorts of different ways as a result.

The Pathfinder SRD is even more vague: “A Feat is an ability a creature has mastered. Feats often allow creatures to circumvent rules or restrictions. Creatures receive a number of feats based off their Hit Dice, but some classes and other abilities grant bonus feats.”

One website defines them in terms of character options, used to customize a character. Another speaks of them as a metagame mechanic used to alter the way a character interacts with the rules. A third suggests that they are a way to change the rules of the game as characters become more powerful. A fourth describes them as a way to differentiate the capabilities of representatives of the same race/class combination at a metagame level. Still another talks of evolving characters from a generic common standard to a customized state that is more tightly integrated with the campaign world. And a sixth describes them as a way to give characters a bonus beyond what the normal character gets.

A fellow GM I was chatting to about this a few months back described them as “a way to customize classes or races as a means of adjusting game balance between these conceptual entities”.

And one of my players talks about them in terms of restoring the balance between humans and non-humans with racial abilities, and between Fighters and Mages.

These definitions run the gamut from the hypothetical to the min-maxing character crunch, from the simulationist to the pure roleplaying, from the campaign perspective to the metagame. And in any given campaign, any or all of them may be true – and there are some serious implications and repercussions buried beneath the surfaces of some of them.

Innate vs. Learned

One of the more interesting ideas that I came across in researching this article was the suggestion that all feats represented an innate natural skill or talent while class abilities were all things that the character learned, or learned to do, in the course of their professional development. I’m not entirely sure that I buy the notion, but it certainly raised an interesting question for consideration: is a Feat something that you learn or something that you can do? Or do Feats encompass both? And is that simply because later writers didn’t understand what a Feat was supposed to be any more than I do? In other words, has the original concept been contaminated – and if so, should feats that violate the definition that we arrive at be banned from the game?

It didn’t take very long to get into radical, even controversial, territory, did it?!

Because we have not defined exactly what a Feat is, no answer to these questions is possible. They are just something for us to keep in mind as we examine possible definitions.

If this were the purpose of Feats, progression would be as shown on the left - not as on the right

Balancing Acts

The notion that Feats are a means of tweaking the game balance between races and character classes doesn’t hold water, in my book – though this can be a secondary usage of merit. If this were the concept, only Humans and Fighters would receive feats, and they are not so limited. Of course, it’s possible that this was the original intention, and that the designers decided to raise the bottom line from zero to the default HD-based allocation mechanism.

I’m afraid that this theory doesn’t match up to reality. If it were correct in terms of race, we would have to disregard the standard Feat every X levels and look only at what the two factors contribute. Can anyone seriously argue that one Bonus Feat is enough to counterbalance all the racial advantages and abilities that Elves receive, or Dwarves? And, if it were correct in terms of class balance, then the receipt of Feats would be more in line with the geometric increases in power that Mages experience instead of a bonus feat every X levels. Either the workmanship is slipshod, and no-one’s ever noticed – yeah, right – or the actual distribution of Feats doesn’t match either pattern.

Nevertheless, the fact remains that there IS a bonus Feat for humans, and there ARE regular bonus Feats for Fighters. So these are either manipulations of the original intent, or the actual definition of what a Feat is must encompass this usage.

Character Options

The notion that Feats are a way of customizing characters by presenting them with a palette of choices and options is one that has played no small part in my approach to the subject, and one that several of the tentative definitions offered by different websites and GMs also touch on. But in order for this definition to work properly, it has to be assumed that all, or almost all, feats are of roughly equal value, and also that there are an arbitrarily large number of Feats for any given character to draw on.

The first condition is problematic. There is a standard employed in setting the effectiveness of Feats in the PHB, though it is one that I have inferred from the details instead of something that has been explicitly stated. That standard is:

  • +4 to one skill or roll
  • +2 to two related skills
  • +1 to four related skills
  • +2 to a type of saving throw
  • +1 to a combat-related numeric value eg Critical threat range
  • An ability that is normally useful once per round in combat
  • A more powerful combat ability that is only useful under specific conditions or is otherwise constrained

Metamagics, of course, fit into the last category, and introduce a sub-mechanic to the game – level adjustment – that is designed to contain the relative effectiveness of the Feat to something close to the appropriate standard. Unfortunately, not everyone has followed this standard – possibly due to poor analysis of the standard, or error – and there are some outright violators out there, some of them in “official” WOTC publications. By and large, though, the worst offenders tend to be home-brewed feats.
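
If you like to sanity-check homebrew feats against this benchmark before approving them, the skill-bonus lines reduce to a simple budget. Here’s a minimal sketch in Python – the four-point “budget” is just my shorthand for the standard above, nothing official, and saving throws and combat abilities still call for judgment rather than arithmetic:

    # A back-of-envelope encoding of the skill-bonus lines of the benchmark:
    # +4 to one skill, +2 to two related skills and +1 to four related skills
    # all "spend" the same four points, so a value-times-breadth cost works.
    FEAT_BUDGET = 4

    def feat_cost(bonuses):
        """bonuses: list of (bonus_value, number_of_skills_or_rolls) tuples."""
        return sum(value * breadth for value, breadth in bonuses)

    def within_standard(bonuses):
        return feat_cost(bonuses) <= FEAT_BUDGET

    print(within_standard([(2, 2)]))   # +2 to two related skills -> True
    print(within_standard([(2, 4)]))   # +2 to four skills -> False, over budget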

Side-Note: Class Abilities
It’s worth noting that most class abilities also meet this standard, but there are some even more outrageous exceptions. Possibly the worst offenders are “Evasion” and “Improved Evasion”, which can make a character all but immune to damage-dealing magic regardless of the circumstances.

Where such magic has to target the character, evasion seems fair enough, but when it comes to Fireballs and other area-effect spells the logic starts to get shaky. Add to that the fact that the check involved is one in which those characters with access to these abilities are naturally good, and that the ability is absolute and not relative to the class levels of the characters involved, and it becomes a right mess.

Evasion and its improved counterpart virtually force the GM into the use of “Save-Or-Die” spells, which I loathe.
 

Fixing “Evasion”

“Evasion” and “Improved Evasion” are not that difficult to fix.

  • First, the DC for an evasion check should be increased by double the Spell Level, to make it a little bit harder to save against, especially with higher-level spells;
  • “Evasion” should be amended to read “Half the damage otherwise indicated on a successful Evasion save, Full damage otherwise”; and
  • “Improved Evasion” should be amended to read “One Quarter the damage otherwise indicated on a successful Evasion save, Half damage otherwise”.

These changes have the effect of permitting these abilities to continue making a major difference to the character’s capacity for surviving such spells while not making them a complete “get out of jail free” card. They also establish a differential in which high-level spells are harder to Evade than low-level spells.

Finally, I would rule that using evasion leaves the character prone, requiring them to spend a move action to get back to their feet – unless they have an appropriate feat to let them do so more quickly.
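
To see the arithmetic of the revised rule at a glance, here’s a minimal sketch, taking “the damage otherwise indicated” to mean the spell’s rolled damage – that’s my reading of the wording, so treat it as an illustration rather than gospel, and the function names are mine:

    def evasion_save_dc(base_dc, spell_level):
        # First bullet: the DC of the Evasion check rises by double the spell level.
        return base_dc + 2 * spell_level

    def damage_taken(rolled_damage, save_made, ability=None):
        if ability == "evasion":
            return rolled_damage // 2 if save_made else rolled_damage
        if ability == "improved evasion":
            return rolled_damage // 4 if save_made else rolled_damage // 2
        return rolled_damage // 2 if save_made else rolled_damage   # ordinary Reflex-half

    # A DC 17 Fireball (3rd-level spell) becomes DC 23 to Evade...
    print(evasion_save_dc(17, 3))                       # 23
    # ...and a rogue with Improved Evasion who makes that check takes a quarter:
    print(damage_taken(33, True, "improved evasion"))   # 8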

 

Side-Note: Class Abilities and Feats
Another pet peeve that is also relevant to this discussion is the example of character customization offered by the DMG (Sidebar p175). Many players have interpreted this passage to mean that if the character has a class ability that is identical in description to a Feat, that class ability can be swapped out for a different feat with automatic approval by both the rules and the DM. Those who customarily wear medium or heavy armor are, according to these players, better off swapping out all armor-use feats bar the category of the armor they are actually using.

In theory, this is fine, but in practice, it plays hob with low-level game balance by giving characters as many as 4 additional feats (3 armor types and shield use). The result, when applied to archery-oriented characters or dual-weapon characters, is a hugely disproportionate capacity for inflicting mayhem.
 

Fixing Armor Proficiencies

This peeve is also susceptible to an easy fix. Simply designate Light Armor Proficiency as a prerequisite for Medium Armor Proficiency and Medium Armor Proficiency as a prerequisite for Heavy Armor Proficiency.

Further refinements are possible, such as:

  • Ruling that characters cannot fail to take an armor Proficiency when it is offered and then take it again at a later point. In the long run, this would make no difference; but at low levels in a campaign, the effects can be considerable. And/or,
  • Ruling that Magical Armor counts as one Proficiency Type less unless used with a shield, i.e. no armor proficiency is needed to use Magical Light Armor, Light Armor Proficiency is sufficient to permit use of Magical Medium Armor, and Medium Armor Proficiency is sufficient to permit use of Magical Heavy Armor – unless the character wants to use a shield with this armor, in which case they not only need the Shield Proficiency, they need the correct Armor Proficiency.

These changes not only expand the available choices of armor for a character (a little honey to make the restrictions more palatable), they restrict the benefit that can be achieved by trading away proficiencies. The character is forced to choose between being restricted to a lighter armor permanently and having more feats, or having access to the heavier armor types and the full range of protections, with only the standard number of feats. There’s even a middle ground, for those who like to compromise!

They also make the nature of an encounter less prone to telegraphing by means of the armor being worn. That guy in scale mail – is he a poorly-equipped fighter, a fighter wearing some fantastical enchanted armor, or a mage or rogue in enchanted armor? The beefy guy next to him in full plate – is he a Fighter, a Paladin, or a Cleric? If it’s not quite so obvious what character class an NPC is, it is also not quite so obvious what his vulnerabilities are, or what threat he poses!
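
For anyone who wants the proficiency chain and the magical-armor refinement laid out explicitly, here’s a quick, purely illustrative sketch (the names and data structure are mine):

    # The proficiency chain: Light -> Medium -> Heavy, with a "none" placeholder,
    # plus the optional "magical armor counts as one category lighter" refinement.
    CHAIN = ["none", "light", "medium", "heavy"]

    def meets_prerequisite(new_proficiency, held):
        # Medium requires Light, Heavy requires Medium: you must already hold
        # the proficiency one step down the chain.
        if new_proficiency == "light":
            return True
        return CHAIN[CHAIN.index(new_proficiency) - 1] in held

    def proficiency_needed(armor_category, magical=False, with_shield=False):
        # Magical armor counts as one category lighter - unless a shield is used.
        index = CHAIN.index(armor_category)
        if magical and not with_shield:
            index = max(index - 1, 0)
        return CHAIN[index]

    print(meets_prerequisite("heavy", {"light", "medium"}))             # True
    print(proficiency_needed("heavy", magical=True))                    # -> medium
    print(proficiency_needed("heavy", magical=True, with_shield=True))  # -> heavy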

 

Side-Note: Feats from multiple sources

A third (relevant) pet peeve that I might as well get off my chest while I’m talking about them is the assumption that players can draw feats from any published, compatible, sourcebook without GM approval, and that two different feats that do the same thing but have different names can automatically stack unless the bonus is of a Named Type.

This opens up all sorts of game-unbalancing possibilities. Two feats that are perfectly satisfactory in isolation can combine to create and exploit a rules loophole through which all sorts of game-unbalancing effects can crawl.

In 99% of cases, there is no problem, but that last percent – which min-maxers always seem to locate – annoys the heck out of me.
 

Fixing Multisource Feat Problems

Thankfully, yet again, this is easy to fix.

  • Any Feat that affects a given subsection of the rules, e.g. Flanking or Charge Maneuvers, is deemed to be mutually exclusive to feats from other sources that affect the same subsection of rules except with GM permission. Such permission is given on a case-by-case basis and never as a blanket ruling.
  • Any feat that confers a bonus to a given ability or score is deemed to be the same as any other feat that confers the same bonus, and therefore the benefits do not stack with that feat.
  • The “Flavor Text” that describes a feat, including any personality traits, is considered as much a part of the rules as the game mechanics are; the referee is entitled to force it to be applied in roleplay, to take it into account in relations with NPCs etc, and/or to require the feat to be replaced with another if the character acquires a feat or other ability that inhibits or controls the non-game-mechanic consequences of a feat.
  • No feat or character class is permitted unless the referee also gains access to a copy of same for use with NPCs.

These four simple house rules permit the characters to utilize any game supplement that the players might have, from any source – within appropriate and reasonable limits.
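
If you track feat approvals in a spreadsheet or a script, the first two of these house rules boil down to a couple of simple checks. An entirely hypothetical sketch – the two feats named are invented for the example:

    # First two house rules: feats that touch the same subsection of the rules
    # are mutually exclusive without GM sign-off, and identical bonuses from
    # differently-named feats do not stack.
    def conflicts(feat_a, feat_b, gm_approved_pairs=frozenset()):
        """Each feat is a dict with 'name', 'subsection' and 'bonus', e.g. ('AC', 1)."""
        pair = frozenset((feat_a["name"], feat_b["name"]))
        if pair in gm_approved_pairs:
            return False                                # case-by-case GM permission
        if feat_a["subsection"] == feat_b["subsection"]:
            return True                                 # same subsection of the rules
        return feat_a["bonus"] == feat_b["bonus"]       # same bonus under a different name

    # Two hypothetical feats from different sourcebooks:
    nimble = {"name": "Nimble Sidestep", "subsection": "flanking", "bonus": ("AC", 1)}
    aware = {"name": "Combat Awareness", "subsection": "flanking", "bonus": ("AC", 1)}
    print(conflicts(nimble, aware))   # True - needs explicit GM approval to combine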

Getting Back To Our Knitting

We’re still trying to figure out exactly what a feat is. I’ve been looking at the various definitions I could find, in reverse order, so far without finding a complete one. The best we have so far is “a way to customize characters”, which is at roughly a 2nd-grade level so far as definitions go!

The next one is “A way to give characters a bonus beyond what the normal character gets”.

This immediately raises the question, what is meant by the term, ‘the normal character’?

Two possible interpretations come to mind:

  • An NPC without character levels
  • A character without the feat

NPCs without character levels

This assumes that characters don’t automatically get class levels, that there is something extraordinary about those who do. The majority of NPCs encountered will be 1HD peasants, in other words, with no extraordinary capabilities whatsoever.

This interpretation actually stems from older versions of the D&D game system, especially AD&D, which stated outright that most NPCs never gain class levels. The problems with this notion are that it’s hard to set up social infrastructures for the advancement of PCs when they are so rare, and that it becomes difficult to explain where the villains come from.

If the campaign world is set up along these lines, it becomes something very different from the majority of modern campaigns. When creating the world, it becomes necessary for the GM to spell out exactly who the high-level characters are because they would be famous figures throughout the civilized world – and the same goes for the bad guys. PCs become tethered to the base of operations because that is where their character class has its resources – they can go out and adventure but must periodically return to home base to utilize those resources. If characters need to train in order to go up levels – another element that has fallen by the wayside in modern campaigns – that can only occur in locations where there is the infrastructure for such training. This is an elitist model in which the PCs are the elite.

More modern campaigns are more generalist. Removing the need to return for training before a character can go up a level permits more flowing narratives (the need to train being a constant handicap and interruption to an ongoing storyline). The result is a more “novelized” approach. It also means that many more characters have class levels, which in turn makes the integration of their class infrastructure more ubiquitous. It doesn’t completely solve the problem – for that you need something like the Shadow Levels approach that I offered in Shadow Levels: A way to roleplay the acquisition of Prestige Classes in D&D 3.x, one of the more popular articles here at Campaign Mastery.

Removing the restriction on the acquisition of class levels makes adventure easier to come by, but it makes a profound difference to the game world. It also necessitates the creation of character “classes” such as “Noble”, for those characters who don’t go out and adventure to maintain some level of authority over those who do. Without it, the NPCs can quickly be forced to dance to whatever tune the PCs call. (In my original AD&D campaign, I ensured that every nobleman had at least 10 class levels purely to justify their positions of authority – and higher rank had higher-level requirements. That campaign was the tale of the son of the King going out into the world to ‘prove himself worthy’ of his right of succession).

So, there are advantages and disadvantages to both, and a very different campaign flavor. This one assumption takes a modern game and gives it a very ‘old-school’ flavor. But the very fact that it is necessary to distinguish between the two indicates that this is not a correct base interpretation, though it is one that can viably be made in a campaign’s house rules.

That leaves us with:

NPCs without the feat

Restating the proffered definition of a feat as “A way to give characters a bonus beyond what a character without the feat gets” doesn’t seem to get us very far. It seems to be a tautology, adding up to “a bonus is an advantage over anyone who doesn’t have it”.

But there is a hidden implication there, one that removes the tautological overtone. In order to make this definition sensible, the assumption has to be made that not everyone receives everything that’s on offer.

If everyone has access to class levels, but NOT everyone has access to feats, we have a blended compromise between the ‘old school’ approach suggested in the previous section and a fully-democratized model in which everyone has at least theoretical access to everything – class levels and feats.

To be honest, I’ve never seen anyone write up such a campaign structure, in which Feats are the difference between elite (PCs & their Arch-nemeses) and ‘mundane’. But it makes a certain amount of sense, in terms of preserving the infrastructure benefits of the world and still making the PCs elite. It’s not much of an edge, especially at low levels, but it would become massive as the characters progressed.

The only problem with this approach is that I haven’t seen it done anywhere before. In fact, the opposite is true – the trend has been to make Feats more ubiquitous, not less.

Monsters gain feats just by increasing their hit dice, for example – preserving some semblance of balance between PC capabilities and the difficulty of encounters. Can it be seriously suggested as logically-consistent that Monsters can get feats and a 14th level NPC cleric can’t? I’m not saying it can’t be done, but the campaign setting would need to justify this discrepancy.

Examination of this possible definition of a Feat has led us down some interesting byways, but the very fact that standard usage – even within official publications like adventure modules – doesn’t fit the resulting models demonstrates that this is NOT the correct definition.

From Generic To Unique

The definition that I actually use is the next one to be considered: “A means of evolving characters from a generic state to a uniqueness that is more tightly integrated with the campaign world.”

Again, there are some implications and hidden nuances to this definition that are worth taking the time to explore, contained within a couple of loaded phrases in the definition.

A generic state

The suggestion here is that all characters start out being carbon-copies of every other character (other than differences in statistics). That being the case, character stats become the primary differential, in game mechanics terms, between suitability for this career vs that, without actually blocking characters from undertaking a career for which they are unsuited in ability.

The implications for roleplay are enormous. A character whose personality draws them to a career in a particular character class – any character class – can adopt that class regardless of ability. There will be especially pious clerics with a Wisdom of six who think they have heard “the call”. There will be fighters with glass jaws. There will be rangers who couldn’t follow a ploughed furrow in the ground, and Wizards who can barely light a candle, and Thieves who can trip over their own shadows.

Naturally, few of these will progress beyond 1st level, and many may die trying – but from time to time there will be some who survive by sheer luck or by virtue of more capable companions.

What’s more, this raises the prospect of differentials between temporal authority and levels of expertise – knowing the right people, or having the right relatives, can get people promoted to positions of authority their abilities do not warrant. And of members who go ‘bad’ (which means different things when you’re talking about Rogues and Paladins, of course). The result is more akin to the real world with which we are all familiar, where nepotism and low levels of corruption are routinely expected by the populace (whether they occur or not).

If the GM recognizes these implications, they can build a consistent world around them. Eventually, that will lead to the players becoming aware of the implications (assuming the GM doesn’t tell them outright as part of the campaign briefing) – and once they know, they can begin employing them as tools in their interactions with the society around them. For example, identifying a discrepancy between demonstrable capability and level of temporal authority implies that the position was achieved by virtue of something more than competence – something that’s useful to know when characters are seeking an avenue to political influence.

A uniqueness

To a min-maxer, there is an ‘ideal solution’ that maximizes the power of any given character class by stacking slight differences in relative effectiveness of every nuance open to the character in their favor. If every character class and feat and class ability is equal in effectiveness to every other, their constructions are no better than anyone else’s. The more players subscribe to this philosophy, the more the characters at higher levels become carbon-copies of all others within their class and character level.

Equality in diversity is naturally the enemy of min-maxers and the ally of the GM who wants the PCs and NPCs in his campaign to be more interesting than this carbon-copy approach. The more viable options, and combinations of options, that are available, the more diverse and distinctive characters become even with exactly the same race, stats, character class, and class levels.

One of the many definitions of “game balance” – or, more properly, game imbalance – can be “the capacity for successful min-maxing within the system”. A less negative definition of game balance might be “the capacity for reflecting character persona in ability options without detrimental effects to the character’s abilities relative to characters who have made different choices.”

As a general statement, the more “equality in diversity” there is, the more character construction becomes an adjunct to roleplay, as opposed to “rollplay”. Clearly, the two types of activity in synergy produce a greater effect than if they are in opposition. Ideally, you would want players to be able to identify what a character can (reportedly) do and be able to extrapolate to a personality.

In practice, that might be an unachievable ideal; but the more closely it can be achieved, the healthier a campaign will be in many respects.

So the sine-qua-non of this definition, in practical terms, and with respect to feats specifically, is the type of parity standards that were offered in “Character Options” above. Without such a standard, the capacity for inequalities exists – which undermines the potential for uniqueness by defining “must have” feats for any given character class.

A plurality of equality

The implications go further. Another is that there be many feats available for characters to choose from – at least five times as many as a single character has the capacity to receive, and the more, the better.
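
As a back-of-the-envelope illustration – assuming I have the standard 3.5 progression right (a feat at 1st level and every third level thereafter, plus the human bonus feat and the Fighter bonus feats):

    # Feats a 20th-level human Fighter accrues under the standard 3.5 progression,
    # and the size of the feat list the "five times as many" yardstick implies.
    standard = 1 + 20 // 3        # 1st level, then every 3rd level: 7 in total
    human_bonus = 1
    fighter_bonus = 11            # 1st, 2nd, and every even level to 20th
    total = standard + human_bonus + fighter_bonus
    print(total, 5 * total)       # 19 feats earned, so roughly 95 viable feats to pick from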

To some extent, this actually undermines the arguments and justifications I employed in my “pet peeves” boxed sections above. Every ability that is traded out makes a character more different from the standard, by definition. A character who has expended a feat slot doubling up on an advantage – an initiative modifier, for example – has not used that slot to gain a different advantage or ability.

So, which “Pet Peeve” solutions do I actually use?

  • Evasion/Improved Evasion
     
    I’d love to implement this solution but my players won’t hear of it. They argue that it is one of the few mechanisms counterbalancing the excessive power of high-level Wizards – which is true. So, until I develop some other counterbalancing influence over the spell-slingers, this is a non-starter. It’s ironic that one game-unbalancing element’s removal should be countermanded by another game-unbalancing element.
     
  • Exchanging Class Abilities for feats
     
    The only reason that this fix is not in place in my campaigns is that I hadn’t thought of it at the time! Right now, there’s a blanket ban on the practice in my campaigns – but that’s subject to change without notice if my players approve (it’s their campaign, too).
     
  • Open-sourcing of feats
     
    The actual restriction in place in my campaigns is more strict than this proposal. While I couldn’t always put my finger on the source of the problem, I felt that certain feats caused game balance issues, and so set up an approvals process† that let me approve, reject, or modify feats – and prestige classes, and spells, and so on. I have given ground (reluctantly) in the latter case – the Spell Compendium is just too convenient a resource – but in other areas, the process remains.
     
    At the same time, this solution is partially implemented – the “flavor text” part, to be specific. It’s actually taken quite some time to derive a general statement of what I look for in that approvals process. I’d love to introduce this fix in its entirety, but it’s probably too late for the current campaigns.
     
    Besides, if there is one “pet peeve” that is undermined by the argument given in “A plurality of equality” above more than any other, it’s this one.

I’ll write a separate blog post on the approvals process some other time.

Campaign Integration

The final loaded phrase in this prospective definition is “more tightly integrated with the campaign”. What does that mean? In practice, it means that some feats can be designated “only available to race X” or to “Class X” or to “characters of level X or more”, or combinations thereof – in other words, manipulating and extending the requirements list for a feat to suit the particular campaign. It can also permit certain feats to be made available for free to all members of Race X or Class X – sometimes in addition to, and sometimes as a replacement for, racial or class abilities.

You’re really only limited in this area by the amount of time and effort you can put into the campaign ahead of time (it’s generally too late once play starts).
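
In data terms, it amounts to little more than extending each feat’s prerequisite list with campaign-specific entries and flagging any free grants. A hypothetical sketch – the feat and the field names are invented for illustration:

    # A campaign-customized feat entry with extended requirements and a free grant.
    feat = {
        "name": "Runes of the Goblin Lands",                         # invented example
        "prerequisites": {"races": ["gnome", "dwarf"], "min_level": 6},
        "granted_free_to": ["goblin"],       # supplements or replaces a racial ability
    }

    def may_take(character, feat):
        pre = feat["prerequisites"]
        races = pre.get("races")
        if races and character["race"] not in races:
            return False
        return character["level"] >= pre.get("min_level", 1)

    pc = {"race": "dwarf", "level": 7}
    print(may_take(pc, feat))    # True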

Counter-skinning

I refer to this practice as “Counter-skinning” because it really is the opposite of the technique of “skinning” one race or monster to create another whose capabilities just happen to match.

It’s when you combine the two that you really develop a powerful tool. “Monster X is exactly the same as monster Y except…”

I employed this technique extensively to give the different races unique (additional) skills in the House Rules for my “Shards Of Divinity” campaign – another series of blog posts to be presented in the future – and it worked a treat.

Differentiation

The next potential definition of a feat to be examined – “differentiation between stock examples of a given race/class combination” – has turned out to be just another way of stating the same thing as the previous one, but one with fewer tools for the GM to employ. If the GM employs no manipulations – no Counter-skinning – that affect PC races, the “campaign integration” phrase of the preceding definition goes away and we are left with what is essentially a recapitulation of this definition. So it’s a functional definition, but one that’s less useful than the more verbose one already examined.

Changing Rules with Power Levels

This implies a greater structure to feats than is actually the case, though some feat dependency chains lend it an air of plausibility. Unfortunately, these dependencies are too haphazard for this definition to be correct in a general sense.

Could a more formalized, structured, hierarchy of feats be developed? Of course. Is there any benefit in doing so that outweighs the expense in time and effort of doing so? Are there any lurking downsides?

The potential upside is in fighting min-maxing fire with fire – because that’s what we’re really talking about, here. The downside is the usual one that comes with rampant min-maxing: cookie-cutter assembly-line characters become ubiquitous, the common standard.

This is pandering to the min-maxing crowd – either a GM who has reached his wits’ end and decided “if you can’t fight ’em, join ’em”, or a GM who thinks this is the way the game is supposed to be. I’m sure there are some out there who fall into the latter category.

Unfortunately, simply because the GM is restricted in development time and scatters his efforts over many different characters, he can never compete with the focused (almost obsessive) attention lavished on their characters by the dedicated min-maxer. The GM is on a hiding to nothing, to use an apt Australian expression. (Actually, the phrase comes from the UK but it has fallen into relative disuse there, so far as I can tell, while it remains a common part of the ‘Ocker’ parlance).

The problem is that min-maxing – or “power gaming” to put a more friendly face on the practice – is fairly addictive, something that everyone falls prey to now and then. Weaning players off it can be exceptionally difficult, if not completely impossible. The original premise behind The Knights Of The Dinner Table is that most of the players are incurable power gamers – much to the frustration of the GM and the representative of the genuine roleplayers at their gaming table.

The final definitions

The last couple of definitions to consider don’t actually tell us much more than those already examined – “a metagame mechanic used to alter the interaction between character and rules” (which implies that Feats aren’t part of the rules) and “Character options used to customize a character” – which ignores that Feats are available to more than just characters.

So, where does that leave us?

A feat can be many different things, in many different campaigns – and subtle changes to the definition can have massive effects on the underlying precepts of a campaign. There IS no one “right” answer; instead, we have a tool for manipulating campaign and game elements at an almost primal level, if these definitions are carried through to their logical conclusions.

Make the choice that’s right for the campaign you and your players want to play, and you strengthen that desired style’s hold on the campaign. Make the choice that’s wrong and you’ll be fighting the game system all the way – and wondering why you can’t get it to work for you.

You can even change from one definition to another to reflect some subtle but fundamental change in the game world, changing the tone and texture without your players being able to put their finger on exactly how you’ve worked the magic.

Have fun…


The Imperial History of Earth-Regency, Part 10: The Crumbling Of Icons – 1980-1997 continued


This entry is part 10 of 12 in the series The Imperial History of Earth-Regency


Pieces Of Creation is an occasional recurring column at Campaign Mastery in which Mike offers game reference and other materials that he has created for his own campaigns.

All images used to illustrate this article are public-domain works hosted by Wikipedia, Wikipedia Commons, or derivations of such works, except as noted.

This article is a work of fiction and no endorsement of the content should be attributed to any of the individuals or institutions named, photographed, or credited.

Author’s Notes: This Alternate History continues right from where it left off last time. The Civil Service has become a millstone around the metaphoric neck of the Empire, and the Empress has embarked on a plan to regain control of her Empire that is composed of equal parts inspiration, determination, and desperation…

1987

The new offensive against the Peerage bore unexpected fruit one year into the Empress’ four-year plan. By forcing businesses to focus on environmental issues, expenses began to rise dramatically, eating into profits, as expected. This reduced the stock value of the companies affected, leaving them ripe for hostile takeovers by the New Entrepreneurs, also as expected. Politically, this reduced the desirability of the Civil Service/Peerage/Big Business path, and reduced their ability to sway public opinion, also as expected, while elected politicians gained in power and influence, also as expected. As the peerage lost their grip on elected officials, so the Empress regained lost ground – all according to plan. In her political planning, she anticipated that the Lower House would eventually become representative not of the peerage who had previously funded their election campaigns but of the New Entrepreneurs who would be funding them henceforth – so that her decrees would only be blocked on those few issues on which both agreed. She had a whole raft of civil service reforms prepared and ready to go as soon as they became viable.

The movement of the Dow Jones showing the effect of Black Monday; image by Edward.

Black Monday

But on October 19th, the lesson of History that the Empress had not taken into account produced a shattering reminder of its importance, as a series of profit-estimate revisions in “blue chip” stocks brought about a massive fall in the Dow Jones index – some 23 per cent – creating a panic that threw the Empire into recession. Hundreds of thousands of jobs were placed in immediate jeopardy across the globe as businesses reacted sharply to their loss of value. Interest rates rose sharply as Banks tried to minimize the risks they faced when issuing loans, putting further pressure on business profitability. Many went under, further driving up unemployment, increasing the perceived risk of other loans, prompting further rises in Interest Rates.

The Derailing Of Reform

Instead of being able to concentrate on her agenda for civil service reform, the Empress, politicians and employers alike combined to try and restore sanity to the economy, while the unions began bitter wars for redundancy benefits, retraining, and other reforms. In order to prevent starvation, the government was forced to introduce welfare payments on a massive scale; to prevent a total collapse of the health-care system, they were forced to provide a medical rebate scheme providing free health care. All of this meant that the government was spending money that economic growth could not support. That produced massive inflation levels, driving prices up – and weakening the value of every pound the Government was providing. For every pound of expenditure, by year’s end, the government was providing only 89p of value – or, more accurately, for every pound of value that had to be provided to maintain a minimum, marginal, existence for those affected, the government had to spend £1.13. The Government had unintentionally become the Empire’s greatest “employer” – paying people to do nothing more than look for work – and the Civil Service was more entrenched than ever.

So massive were the consequences that they overshadowed all other news that year – not that there was much to tell. There were the usual assortments of calamities, calumnies, and catastrophes; the usual pointless bloodshed continued with nothing gained on any side; and so on. None of it mattered very much by year’s end, though it seemed important enough at the time.

1988

The political balance within the Empire had changed as a result of the Empress’ manipulations, though, and she had regained much of the throne’s ability to rule by decree. There was an ongoing momentum toward change, which she was able to harness. Although she had not been able to achieve her long-term goals of civil service reform, she was at least able to influence events. She started by dismissing for incompetence the senior civil servants of the Empire, and promoting younger public servants to positions of high authority – people with fresh ideas, who had not yet been fully indoctrinated into the culture of the Peerage. Using the New Year’s Day honors list, the Empress demonstrated clearly that the game of Imperial Rule had changed, and that the player whom many thought defeated was staging a thunderous re-emergence.

David Copperfield, noted magician, in a publicity photograph for his 1977 Television Special 'The Magic Of ABC Featuring David Copperfield', photo by ABC Television. Copyright may persist in some countries.

The Honor List of 1988

For many years, the awarding of honors had been under the control of senior civil servants. They were the ones who drew up the lists of potential honorees, who controlled the choices (frequently through a “magician’s force”, where two choices were offered for each post, the one the peerage wanted and one who was patently unsuitable). While there had been the scope for the occasional “extra” to recognize some common citizen who had achieved extraordinary things on behalf of the Empire, these titles were ceremonial and titular, conferring no authority within the peerage.

With the New Year’s Honors of 1988, that changed. Not having had the chance to be taught all the long-established tricks of the trade, many of the new Department Heads had provided two reasonable candidates; where they had not, it was immediately apparent, and the Empress was able to refuse to accept the nominations, insisting on two viable candidates. The result was a weakening of the influence of the conservatives even within the Peerage; another 10 years of the same, and some sort of equilibrium would be reached, though that was hoping for too much, and the Empress knew it, as shown by her memoirs (posthumously published in 2025).

A window of urgency

The Empress knew that visible change was a political necessity, and that it had to come quickly. Firstly, there was the need to restore some optimism in the future of the Empire, to end the economic distress that was an unintentional by-product of her power struggle with the peerage. Secondly, there was the need to recapture public confidence in the ability of the Government to improve the welfare of the common man, lest the rioting and endless succession of coups continue. Thirdly, while the new “Big Businesses” were currently progressive in their attitudes, time and growth would make them increasingly conservative as the political and economic landscape grew to their liking. They would support changes they perceived as being in their best interests, but once those were achieved they would want to maintain the new status quo as strongly as the previous crop had done, so her window of opportunity was limited. And finally, the senior civil servants she had dismissed were still around, as were their predecessors; given time, the new Heads of the Civil Service would learn from them all the old tricks; only the headlong rush of events had left them floundering so far. Having pushed them off balance, she had won a short span of time in which to make changes; if it were squandered, she would soon find that nothing but the names had changed.

So this would inevitably be a year of transition and rapid changes within the Empire.

Symbol of the League of Nations, original image by Mysid and refined by others. Symbol may be trademarked.

100 days of Chess moves

The Empress started by ending the military farce in Afghanistan. The invasion had never come close to achieving its goals, and by providing an ongoing reason for hostility amongst the Arabian population, had encouraged acts of terrorism.

She then rearranged the Civil Service, creating a number of new bureaus and departments to deal with the new technologies and their applications.

She decreed tax benefits for the ongoing training of employees, especially those who would otherwise be retrenched, and changed the priorities of the Space Programme to a more pro-environmental stance.

Finally, she established a new organization, the League Of Nations, to provide a political counterpoint to one of the Peerage’s more subtle but far-reaching advantages, the “Family” network.

The League Of Nations
The latter requires some further description of the political structure of the Empire, as it had evolved over time.

The Empire had a monarch, an elected lower house, an appointed upper house, a civil service, and a military force – the latter mostly consisting of forces contributed by each member nation.

Each member nation also had a monarch, an elected lower house, an appointed upper house, a civil service, and a military force, as described in earlier sections of this history.

In theory, the latter were restricted to dealing with internal issues, and were subservient to their Imperial Equivalents. In practice, there had been unification, over the years, of the Peerages and Civil Services. The Civil Service of Spain, for example, could be considered nothing more than the “local branch” of the Imperial Civil Service. In particular, intermarriage amongst the traditional peerage meant that any given member could trace some sort of relationship to almost any other member; they were effectively one large family, related by blood to the Empress (The exceptions being the peerages of Africa and the Middle East, who had shown considerable disinclination to marry outside of ethnic bounds).

This of course totally broke down the theoretical independence of the various national peerages, and shows quite clearly why the same problems were experienced virtually simultaneously in all corners of the Empire. It hadn’t mattered much before the rise of modern communications, but coordination of strategies, policies, and planning had become increasingly easy as technology developed.

The elected politicians had no equivalent relationships, a significant contribution towards their ineffectualness.

The “League Of Nations” was intended to rectify that lack and to provide a new channel for diplomatic coordination amongst the member nations. Furthermore, it was inherently self-limiting; if politicians assumed positions of real power over their nations, they would inevitably develop agendas which favored their nation over others, which would generally lead to the development of opposing power blocs within the League.

The Avalanche Of Reforms

Many of the Empress’ moves in the 100 days of reform had been designed to dissolve – or at least, disrupt – the uniformity of structure in the peerage and the Civil Service. The real coup came with a reform of the Peerage.

Elizabeth decreed that the membership of the Peerage should be restricted in number according to the growth of industry within the Empire, rather than by population and socio-economic regionality.

At the same time, she changed the structures of the civil services in many of the key members, on the pretext of trialing various solutions to the problems facing the Empire to determine the best one – but “inadvertently” making them harder to relate to one another except through the Imperial Civil Service. While this shifted power from the locals to the overall Civil Service, it also meant that the flow of information through those offices rose by an even greater ratio. Of course, in a time of economic distress, it was easy to refuse any requests for the expansion of the civil service.

The net effect was to ensure that the Imperial Civil Service had no time to use their theoretically-greater powers – unless they neglected their primary tasks. Any Civil Servant who exercised their authority thus became eligible for dismissal on the grounds of incompetence. The theory was that over time, as the National civil services came to recognize that they had greater effective powers than their Imperial Counterparts, they would begin to assert that authority to their own benefit over those in rival nations, fracturing the overall unity of the Civil Service.

They’re all Domestic Issues

In the meantime, the National politicians were using the disarray of their local peerages to effect their own changes.

The South African government announced harsh new restrictions to the Apartheid policy, clearly making the first step in abandoning it altogether.

Ethiopia and Somalia used the avenue provided by the newly-formed League Of Nations to arrive at a peace settlement arbitrated by Denmark, who were seen by both as having absolutely no stake in the outcome, and hence as being as impartial as it was possible to get. This ended 11 years of ongoing border disputes.

The military forces released by the Afghan withdrawal undertook a series of lightning strikes in the Middle East, designed to force various hostilities into a pause for reflection, and began to blockade various nations in the region whose politics were opposed to that of the Empire – Iran and Iraq, Israel, Afghanistan, Syria, and so on.

This caused an immediate escalation of the ongoing Oil shortages, but those nations suffered swift economic collapse. Within months, peace talks were underway between previously intractable opponents. At the end of the year, PLO leader Yasser Arafat renounced terrorism as an effective means of instituting change and recognized the state of Israel in a speech to the League, in a largely-successful attempt to win political support in the new venue for his claim to “Dispossessed Nation” status.

The emblem of the 'Empire Games' (Commonwealth Games) until supplanted by the Five Olympic Rings. Image by Bill William Copton.

The Last Empire Games: Symbol Of Change

The Empire games of 1988 were especially significant, not only for the sudden wave of optimism that was sweeping the world as a consequence of these changes, but because of a new participant. For the first time, Chinese athletes participated, at the personal invitation of the Empress. Although not overwhelmingly competitive in the events, the Chinese nevertheless began discovering common ground with the population of the rest of the world, and the event was adjudged a magnificent success. Certainly, the Mao reacted to the humiliation they experienced with a determination to succeed that would ensure the process of bridge-building would continue.

This was the last Empire Games per se to be held; at the conclusion of the event, it was announced that it would be renamed the Olympic Games thereafter, and that all nations who were willing to attend the Barcelona Games in four years’ time would be welcome. India, Pakistan, Central America, Japan, and many others could rejoin the international community. This was viewed as potentially the greatest step in achieving peaceful relations with the Mao in decades.

Only one occurrence marred the Games; Canadian Ben Johnson won the 100m, but was subsequently disqualified for taking steroids. Although it prompted global outrage, this occurrence was not seen for the ominous portent that it would eventually prove…

The Exxon Valdez, three days after the vessel ran aground and shortly before the fateful storm. Photo by the US National Oceanic and Atmospheric Administration.

1989

The peerage rallied, as predicted, in 1989. Their chosen battleground was a legal challenge to the reported likelihood of an ecological catastrophe, alleging that the campaign was designed to restrain trade and force them into unprofitable business practices.

This was unfortunate, because 2 months after the lawsuit was announced, the Exxon Valdez, a fully-laden oil tanker, ran aground, spilling more than 40 million liters of oil along the Alaskan coastline. While mathematicians pointed out that probability just measures how unlikely something was to happen, what little public support they had wilted, and many of their allies chose the better part of discretion and abandoned them. No-one was fooled; the timing may have forced the traditional businesses further onto their back foot, but they would wait a while and regroup.

OPEC Headquarters Building, Vienna, Austria. Photo by Priwo.

The Coming Of OPEC

The Middle East continued to slowly edge towards a fragile peace. In June, one of the most divisive figures in the region, the Ayatollah Khomeini, died in Iran. Beloved of the secular hard-liners, he fought, as even his many enemies conceded, for what he believed in – usually, they added, to the point of obsession. No matter how earnest his beliefs, it was his unwillingness to compromise – and his willingness to treat any who did as an enemy – that had been the cause of failure of many initiatives intended to heal the wounds. Ultimately, others would take his place as clerical spokesman and intractable fundamentalist, but without his political authority.

The change of government brought about a domino effect, leading the oil-producing nations of the region into a trade coalition designed to regulate oil prices and availability – OPEC. This was the biggest indicator to date that the Arabian nations were serious about peace; in order to be effective, OPEC pretty much presumed a lack of hostilities.

Frederik de Klerk at the annual meeting of the World Economic Forum, January, 1992. Photo and copyright by World Economic Forum.

The End Of Apartheid

Nor was this the only government to change direction completely in response to the changes in global political atmosphere. The next to fall was the Botha administration in South Africa; although it had moved towards reform in the course of 1988, the administration had not moved far enough, fast enough, to satisfy the advocates for change who were growing in authority under the reform umbrella. Botha was succeeded by F. W. de Klerk, who immediately set about dismantling Apartheid and ending years of political repression.

Imperial Prime Minister John Major in 1996, Photo by PFC Tracey L. Hall-Leahy, Courtesy US Department Of Defense

The Winds Of Change

The same sort of thing was happening all over. Poland, Germany, Hungary, Yugoslavia, Bulgaria, Czechoslovakia, Romania, and Russia all moved away from previously hard-line conservative governments towards more progressive representatives. At year’s end came the ultimate expression of the reform movement, as sufficient changes were made in the different member nations governments to trigger a change of government at the Imperial level. John Major was suddenly the Prime Minister of the Empire – right in line with the decreasing average age of politicians, and ending decades of conservative administration.

1990

The nineties felt like a new beginning, in a lot of ways. The problems of the past were falling away, one after another, and in their place, new conundrums were emerging to trouble the policymakers. The three iconic figures of the 80s were now at their lowest ebb to date; Michael Jackson was a recluse haunted by media allegations of strange lifestyles and pedophilia; Sir Bob Geldof was virtually penniless, divorced, and an often-forgotten man; and the Princess Diana, while still a public favorite, was beginning to experience the bloodthirsty downside of what was generally known as the “media circus” or “paparazzi”, as her marriage to Prince Charles began its public disintegration.

Economic Recovery

The economy had staggered back to its feet after the blows dealt it in the 80s and the same mood of cautious optimism was pervading the stock markets and boardrooms, driven more by colorful entrepreneurs than by faceless men in corporate grey.

There was something of the underdog about these flashy moneymen, battling with the corporate greed of the peerage, which lent many of them public support and market share that they would otherwise not have captured. The 90s would reveal the economic consequences of this new economy and the fates of a second generation of new entrepreneurs, but at the time, they were riding the crest of a wave, a looming boom and groundswell of confidence not seen since the end of the Third Global War.†

†That’s WWII in our history.

The Middle East: Musical Ideologies

The Middle East continued to experience a transition that had seemed unthinkable only a decade earlier, as the architects of much of the violence moved closer to a moderate position, while nations who had become members of unstable alliances against them reacted by becoming more extremist and distant from the Empire. In particular, Iraq and Israel would start the decade as staunch members of the Empire and end it as mistrusted agitators, not far removed from hostilities. The first part of this transition had become a clear trend in the late 80s; 1990 saw the introduction of the second.

Saddam Hussein as Prime Minister of Iraq, photo by Iraqi State Television.

Iraq: A disintegrating friendship

The concern was all about weapons of mass destruction under the control of nations in an unstable region, and had been since the Libyan flirtation with a nuclear weapons programme in the late 70s. It had been known for years that Iraq had built up vast stockpiles of nerve and biological agents, flouting the Imperial treaty with the Mao, but because they had no delivery systems for these weapons, because the weapons had never been used, and because the Empire needed staunch allies so badly in the region, there had been no urgency in addressing the situation. Furthermore, one of the reasons for the Iraqi regime’s ability to be such a staunch ally was the security that having these weapons under their control conferred. In 1990, that began to change. As peace continued to grow within the region, the Empire’s need for Iraqi support eased; but that alone was insufficient to prompt military opposition to Prime Minister Saddam Hussein, now entering his 12th year as supreme political power in Iraq.

In 1990 the second “great excuse” – lack of a delivery system with sufficient range – vanished. In April, Imperial Security agents intercepted components of a “supergun” bound for the regime. Subsequent investigation revealed that the Iraqis had been quietly acquiring Russian-made SCUD missiles for much of the last decade; these potentially had ranges roughly double those of the Nazi V2 of the Third Global War, which is to say that London was just inside their theoretical range. The SCUDs were conventional high-explosive devices, but a review of intelligence revealed that they could be modified to carry any warhead desired – and that the greatest concentration of the relevant expertise was Iraqi in nature.

None of these were newly-discovered facts; the failure was not one of intelligence gathering, but one of analysis, as different departments of the Civil Service attempted to protect their sources of information and gain an advantage over the heads of rival departments. The Ministry Of Trade knew of the purchase of the missiles, but because Iraq was a member in good standing and one of the few nations in the region allied to the Empire, they knew of no reason to bring the purchases to the attention of the Intelligence Department; the Ministry of Science knew that the expertise in converting SCUD missiles to alternate payloads was Iraqi, but because they had no missiles, saw no need to stress the fact to the Intelligence analysts; and so on.

All this put a new context on a number of side-comments made in speeches and in diplomatic talks with Saddam over the previous decade, in which the Iraqi Leader had repeatedly emphasized the “rich rewards” that would follow from the national support of the Empire without enumerating exactly what those rewards were. Analysts had dismissed this as rhetoric, or had assumed that the rewards in question were the same ones that the Empire foresaw – peace and prosperity, stability and trade. The question of what rewards Saddam believed would be forthcoming had never been asked, let alone answered. Now for the first time, intelligence analysts focused their attention not on the enemies of the Empire in the region, but on one of their allies, and they did not like what they found.

Saddam Hussein had privately expressed the opinion at one point that the conflicts of the Middle East would not be resolved until the region was brought under the control of a single political force; this had been interpreted at the time as meaning “Imperial Control”, but now it was speculated that he believed that Iraq would be rewarded for its support of the Empire by being given control of the dissident nations. He had similarly suggested that placing the control of the disputed Palestinian West Bank into the hands of a third party might be the most viable solution to the problems there – again, not mentioning Iraq specifically, but in this new context, the implications were clear. Saddam had been expecting to receive his “rewards” – but with peace looming without the conquest of rogue nations being necessary, he was preparing to take what he considered his due, with or without Imperial dispensation.

US Navy F-14A Tomcat over the burning Kuwaiti oil fields - DN-SC-04-15221, photo provided by US Department Of Defense.

The Kuwait Invasion

By the time all this had been uncovered, it was late July. Even as Imperial Intelligence was reporting to an emergency joint session of the upper and lower Houses of Government, Iraqi military forces were staging. On August 2nd, 3 days after the realization of what was forthcoming, and long before a response had been determined, Iraq invaded Kuwait, and within a week had conquered the neighboring nation. Furthermore, the successful conquest had utilized nerve gas on both military and civilian targets. Iraqi forces were lining up to invade Saudi Arabia even as the Empire was being briefed on Iraq’s Kuwait campaign.

Upping The Ante

From out of nowhere (literally), a new factor strode into the centre of Imperial deliberations on a response, as a representative of the Mao appeared before the Imperial Parliament to convey a message from his government: “It now stands clearly revealed that a faction of your Empire has forsworn the agreements held between us. At the time of these treaty violations, they were considered good and faithful servants of your Empress. The Empire Of Greater Britain is clearly in breach of the agreements between our states. You now brand this faction as rebellious, but have taken no action to curb this rebellion. We will generously grant you a brief span of time in which to address this situation, before we declare your default a formal violation of the treaties between us, grounds for immediate action on our part. Understand that our agreements do not recognize individual prerogatives; they are treaties between cultures, and cultures do not change. Should you fail in this most reasonable request, your Empire will henceforth be considered untrustworthy, all agreements between us shall stand as void, and we will undertake whatever actions are required to eradicate any dangers posed by the Empire Of Greater Britain. Your Empire will forever be considered false and forsworn by ours, and we shall eradicate it without quarter or clemency. This is our first, last, and only warning.”

Having issued what amounted to a declaration of War, to be rescinded only if the Empire acted immediately against Iraq, the Mao representative vanished as suddenly as he had appeared.

This put events in Iraq into a whole new context. Within the next 72 hours, the Empire military was beginning a full mobilization, a complete embargo and blockade of Iraq had been decreed, and the Empire was officially in a state of Civil War. On August 7th, Imperial troops began to arrive in Saudi Arabia. A deadline of Jan 16th, 1991 was announced for the complete withdrawal of Iraq’s military to their former borders even as the military buildup for Operation Desert Storm continued. Imperial policies made it clear to Iraq – if chemical or biological weapons were employed against Imperial citizens, a Nuclear response would be forthcoming, with no further warning.

1991

Iraq again dominated the news in the early part of the year. January 16th came and went with no attempt to meet the Empire’s deadline; accordingly, on January 17th, hostilities commenced. It was afterwards determined that Hussein was so convinced of his sense of entitlement that he found it impossible to believe that anyone would seriously oppose him, and that reports of the Mao intervention – which left the Empire with no choice in the matter – were discounted as propaganda within Iraq.

This was the first “modern” war, unlike the Afghanistan campaign which had been fought along traditional lines, and the various civil wars that utilized whatever weaponry was on hand. Ironically, it had been Iraq’s former relationship with the Empire that had left it with a state-of-the-art military apparatus.

The conflict began with exchanges of missile barrages. While Imperial anti-missile technology proved very effective at stopping the majority of the somewhat out-of-date SCUD missiles*, the Iraqi interceptors had considerably less success at dealing with the latest generation of Smart Missiles launched by the Empire. Within 48 hours, the Iraqi airfields were severely damaged and their Air Force crippled, clearing the way for precision bombing runs over key defensive emplacements. 48 hours after these commenced, the ground invasion began, and on the fifth day after the commencement of hostilities, the Iraqis were in retreat, falling back to planned positions – only to find that those positions had been destroyed by the Imperial Air Force. After 5 weeks of military action, Kuwait had been liberated, though the departing Iraqis, in an act of economic barbarism akin to the temper tantrum of a child, had torched hundreds of Kuwaiti oil wells. On Feb 27th, Kuwait City was liberated and the Iraqis defeated.

*It emerged, two decades later, that the criteria used to define a ‘successful interception’ were generous beyond belief – simply launching a missile when an incoming launch was detected wasn’t quite enough to be called a successful interception, but having that anti-missile missile head in the general direction of the attack was. Nevertheless, the outcome was a clear victory for the Imperial forces.

The Dictation Of Terms

Politically, the conduct of the war with Iraq had been dictated by outside forces. Having achieved the objectives that the Empire had set, and unwilling to sustain the high numbers of casualties that would have resulted if Saddam had been forced to employ his weapons of mass destruction, the politicians of the Empire determined that diplomatic and trade pressures would be a more acceptable method of dealing with Iraq.

An ultimatum was issued – the Empire would not invade with the intention of deposing Saddam, provided that his regime immediately acted to destroy their stockpiles of chemical and biological weapons and any facilities capable of manufacturing more, and submitted to Imperial inspection and verification procedures.

There was widespread demand for the reclassification of Iraq’s Imperial Membership status, but that option was restricted to dealing with an inability to administer a nation effectively. Instead, trade sanctions and military isolation zones were established, essentially locking up the entire country within its own borders. It was widely anticipated that a civil war would soon depose Saddam without the need for the Empire to dirty its hands by violating its own charter and principles, especially given that Iraq was dependent on Imperial grain shipments.

The Aftermath Of Victory

News emerging through March suggested that this would indeed be the case, as the people reacted to their defeat and its consequences. The Iraqis responded in various ways; some blaming the Empire for turning against its own, others blaming their leadership for being so foolish as to challenge the might of the Empire, a fight that they were never going to win; some shouldered a burning resentment against whoever they considered ultimately responsible, others responded more actively, and a few looked around for someone to lash out at – their attention falling on the repressed Kurdish minority. Word of the resulting atrocities slowly began to filter out from the Iraqi borders even through the censorship imposed by Saddam, and in April the Imperial forces established to enforce the blockade created a number of safe havens for Kurdish refugees fleeing the regime.

Publicly, Saddam had agreed to the terms of the Imperial Ultimatum, but almost immediately he began playing games with the Imperial inspectors, hamstringing their abilities to pursue their mandate through bureaucratic interference and refusing to permit access to religious sites and his personal palaces. The Empire, for its part, was wary of any military buildup and ready to respond at once to any actual use of the banned weapons, but was otherwise prepared to starve Saddam out.

Map of the war in Yugoslavia, 1993, by Pawel Goleniowski.

An unstable stability

1991 also saw a civil war in Yugoslavia, as Croatia and Slovenia sought independence within the Empire. There was further violence aimed at achieving the same goal in Northern Ireland. Terrorism continued to evolve; where once the goal had been a perpetual wave of small attacks, the trend now was toward fewer acts of greater impact. And the first member of the Peerage fell victim to the economic climate, as Earl Robert Maxwell died under mysterious circumstances; his business & publishing empire, beset by massive debts and financial corruption, collapsed within days.

In hindsight, it is easy to see that the optimism and security felt by the Imperial Citizens of the time were a superficial coating of progress over a perilously rotten and unstable core. But, as remarkable as it now seems, no one at the time foresaw the inevitable crash.

1992

This was the year that some problems long considered “solved” within the Empire returned to haunt the administrators of the Throne, as racial issues dominated events. The year began with the Empire continuing its hypocrisy, simultaneously recognizing the independence of Croatia and Slovenia while denying Ireland the same treatment. Those setting policy were accused of bending over so far in the name of political correctness that it had become reverse discrimination – the Balkan Nations were granted recognition while the Irish were not because one population was of Slavic stock and the other was White. Although this accusation was strenuously denied, it was becoming apparent that this was a de-facto policy brought about because no-one was willing to risk appearing politically incorrect.

South Africa continued to march toward the abolition of the racial division that had marred its political landscape for decades, a referendum giving the Prime Minister of the beleaguered nation overwhelming backing for his plans to dismantle the Apartheid policies, while a Trade Embargo was imposed on the rogue state of Libya in an attempt to force it to hand over suspected terrorists.

The race riots of 1992 had an eerily-familiar feeling to those who remembered the Watts Riots of the 1960s. This photo from 1965 by New York World-Telegram, declared to be in the Public Domain by the US Government.

The Riots Of Los Angeles

On April 29th, racial issues surged to the forefront, as 5 days of rioting began in Los Angeles following the acquittal of a white policeman for the beating of a black motorist. Throughout the Empire it suddenly became clear that all sorts of racial double-standards remained in effect, regardless of the laws demanding equal treatment. Ethnic stereotyping and the natural congregation of communities of similar ethnicity had combined to overlay separate sub-nations over one another within the same geographic space, within which law, and social perspectives, were perceived differently, and handled differently. No solutions were obvious, and the problems would remain an undercurrent within Imperial society for the next two decades.

The Imperial Racial Divide

Nor was this phenomenon unique to the USK‡; it was simply more pronounced and obvious there. In London, Pakistani and Indian communities who had taken refuge from the invasion and conquest of their homelands by the Chinese developed their own branches of organized crime. In Australia, militant Aboriginal leaders began forging links with radical groups; although they would eventually back away from the terrorism route, the connections forged would eventually prove crucial to the Empire’s ongoing prosperity.

Within virtually every member nation of the Empire, there was an ethnic minority which began to feel oppressed by the majority or their public instruments. It didn’t matter how trivial some of the complaints were; the fact that there was any difference in treatment of ethnic groups at all was sufficient to arouse heated protest. Only one ethnic stock was not permitted to cry “foul” over their treatment – the Caucasian. These social developments virtually assured the introduction of exactly the type of reverse-discrimination that had set off a new wave of violence in Northern Ireland only a few months earlier.

Wanted poster for Serb leaders including Slobodan Milosevic, from the US State Department.

Serbia & Montenegro

So heated were the ethnic divisions in the Balkan regions that by the end of May, the Empire was forced to impose trade sanctions against Serbia and Montenegro following fierce attacks on Sarajevo, and the Imperial Military – hardly recovered from their Middle Eastern excursion – was poised to attack Central Europe. Within the month, diplomatic efforts at finding a resolution had largely been abandoned, and Imperial Troops had captured Sarajevo airport, permitting relief supplies to be airlifted into Bosnia.

For many of those affected, these shipments were the first substantial food they had received in months. As the Imperial hold on the region grew, stories of ethnic cleansing began to emerge which soon lost the Serbs any sympathy or support for their already unstable position. In mid-August, the Empire officially condemned the “ethnic cleansing” in Bosnia and vowed to use force if necessary to deliver humanitarian aid. For a time, the Serbs seemed to back off, but renewed acts of aggression in November caused the imposition of a Naval Blockade. On December 21, Slobodan Milosevic became Prime Minister of Yugoslavia. The Empire’s racial problems were about to get a whole lot worse.

Michael Jackson at the Cannes Film Festival 1997. Photo by Georges Biard.

The Crumbling Of Icons

At the time, the election results were hardly front-page material beyond the local area. Instead, the primary hue and cry in the media at the end of the year was the separation of the Prince and Princess of Wales. The lives of the three “anointed ones” of the 1980s had led all three into hard times by now; Michael Jackson’s popularity had ebbed away as lifestyle choices, innuendo, and a rapacious media left his music less relevant than his existence. He had been forced into a hermit-like lifestyle which was inherently abnormal, and then criticized for the abnormality of that existence. No longer the entertainer who had captivated the world, he was a parody of popularity.

Similarly, Sir Bob Geldof’s musical career had failed to survive the ramifications of Band Aid; no matter what he produced musically thereafter, it was inevitably going to seem shallow in comparison to the weighty issues and the massed icons of popular culture with which he had attacked those issues. His crusade had cost him his marriage, custody of his children, and his career; he would forever live in the shadow of what he had achieved in the mid-1980s.

In comparison, it might seem to the casual historian that Princess Diana had escaped the carnage of the paparazzi relatively whole. While headlines had screamed of the infidelities of her husband, and the partisan support of the Empress, and the betrayals of personal staff and acquaintances, she had at least managed to retain her dignity and avoid being torn down to the common level. But at last the scandal sheets had real meat to their stories, and what had once been seen as the ultimate expression of hope for the future of the Empire was reduced to an increasingly tawdry divorce proceeding. From December 9th, 1992, the Empire would have to look elsewhere for hope.

The Uffizi Gallery is one of the oldest and most famous art museums in the world. Photo by Chris Wee showing the gallery restored after the bombing.

1993 – A year of extremism

The Balkan situation continued to be a source of trouble throughout the year. The Imperial War Crimes tribunal was called into session for the first time since 1945 to seek justice for the Bosnian atrocities – when and if the people responsible were captured. Imperial Forces supervised the evacuation of civilians from Srebrenica, hoping to clear the field for combat. In mid-year, six “safe zones” for refugees were created. These immediately became targets for Serbian radicals.

It was also a year of increased terrorist activities elsewhere. A Palestinian extremist bombed the World Trade Centre in New York, killing five people and completing the transition to “mature” terrorism. The extent to which the authorities were still struggling to find a working counter to the threat posed by the new Terrorists was shown by the bungling of a raid by USK police forces on the headquarters of an extremist religious sect in Waco, Texas, after a 51-day siege. 72 people were killed, including many women and children. Just five weeks later, a bomb was detonated outside the Uffizi Gallery in Florence, Italy, killing 6 people. To this day (2055), it is uncertain who was behind the attack and what they hoped to achieve, as several credible groups claimed responsibility.

Smaller Extremism

1993 also saw the emergence of other extremist groups, fighting for smaller, better defined causes rather than one all-encompassing general manifesto. The prototype made its existence known to the world on March 11, when a doctor was murdered by an anti-abortion activist outside a Florida abortion clinic. As people despaired of being able to influence the political policies that controlled their lives, that desperation became the breeding ground for more and more extreme positions.

Battle is joined

More than anything else, this showed that despite the Empress’ success in taking back some control over the Civil Servants when it came to major and singular issues, the fundamental apparatus that was the real problem remained whole and intact. It was still the purview of petty bureaucrats and civil servants to interpret and implement government policy in most of the areas that mattered to the ordinary citizen.

To be sure, the principle of the Empress overriding or revising policies in any given case or any specific rule or regulation had been established; but unless a case came to the Empress’ attention, nothing had changed. To some extent, her victory had simply ensured that elements of the Civil Service were now covertly antagonistic to her continued reign.

And of course, they controlled the media, and the media told a significant segment of the population what to think. The peerage/industry machine had taken several months to formulate their response to the increased political control of the Empire by its head of state, but from this year forward, many stories appearing in the mass-media began to have anti-monarch undertones, as the battle lines began to be drawn.

'Internet' Icon from the Tango Project, image by warszawianka. Image from the Open Clipart Library http://openclipart.org/

Birth of The Internet

But this was also the year in which the tools which would ultimately lead to the overthrow of the civil service and the reinvigoration of the Empire by its citizens began to achieve popular acceptance. There was a new conduit for information coming into existence – one which linked people together directly, and which bypassed the media barons’ spin doctoring. Although it would take many years to mature, the age of communications had achieved its ultimate expression: The Internet.

Cropped Photograph of Srebrenica in August 2004 by Samum. At first glance, an idyllic setting, but note the missing roofs and damaged buildings in the foreground.

1994

It seemed to many that one troubled region (the Middle East) had been calmed only for the Balkans to take their place. Ongoing unrest in Bosnia dominated many of the 1994 headlines. In February, a mortar attack on a marketplace in the Capital, Sarajevo, killed 68 and wounded over 200, and in March, Serbian forces bombed the “Safe Zones” of Gorazde and Srebrenica. In April, the Imperial military retaliated with Air Strikes against the Serb forces at Gorazde, and the conflict resumed a slow boil. In December, a cease-fire accord was finally reached.

In general, it’s fair to say that the conflict had little impact on events in the Empire overall. Fighting in a relative backwater of little strategic importance was not something to overly disturb the daily routines of a culture that vast.

The key phrase, according to later criticism, was “of little strategic importance”; the Empire, they would accuse, had grown so unmanageable that there was no capacity left to fight for the ordinary citizen, only the larger ideological conflicts. This criticism overlooks the obvious – that any conflict in which there was a confluence of interests would naturally receive greater support from individuals who had that vested interest. Thus, the peerage (whose economic alliances were threatened by any disruption of oil supplies) would enthusiastically support any action aimed at achieving stability in the Middle East but such support would be far more lukewarm when confronted with an issue of relatively pure morality, with no greater economic impacts in prospect. No conspiracy is necessary when there is an overlapping of purposes.

Peace at last?

But in one particular part of the world, also renowned for its ongoing violence, the pointlessness of the unrest was noted by a few unexpected observers. The year seemed like any other in the Middle East from early on; in late February, over 50 Palestinians were killed in a Hebron mosque when an Israeli settler opened fire with an automatic weapon. A week later, six Israelis were killed by a Palestinian sniper. It all seemed so pointless to both sides, and was attacked by both as ‘unproductive’, a refreshing change of perspective. On May 4th, Israel and the PLO signed an agreement giving Palestine self-rule in the Gaza Strip and Jericho.

Former nuclear test site Maralinga, South Australia. Photo by Wayne England, who participated in the clean-up, April 2007. Note: the colors are correct.

The Thorny Issue Of Reparations

1994 saw developments in the racial issues which had been edging toward a resolution for some time. In South Africa, the African National Congress won the first free elections. The transition to black self-rule was now complete.

The Australian Government agreed to pay £7,000,000 to South Australian aborigines displaced by the nuclear tests of the 1950s and 60s, and the following week, the New Zealand government offered £400,000,000 compensation to the Maori Tribes displaced by the arrival of European Settlers.

These were disquieting developments to many, opening the door to lawsuits for compensation by many others, even as they addressed passionately-held sources of tension. Most fervently opposed to such measures was the USK, which knew that any serious attempt at reparations to its Native American and African-American populations would bankrupt not only the country but the entire Empire.

The contentious question remained: how much discounting of any reparations should take place for the acknowledged benefits of, and participation in, modern society? How much should be discounted because the events in question took place in a different time, when different behavior and practices were deemed acceptable and right, and to which a more modern standard of morality could not be applied? How responsible were modern generations for the failures in the past of those who could not know “better”? In other words, how much of the demand for reparations was the result of greed alloyed with hindsight?

Those who opposed reparations, in principle, were always going to struggle to persuade others of their position, since the events of the past few years made it clear that Racial Equality was not a “solved” problem; but those who supported it, even in principle, had an even more difficult obstacle to overcome: the limits of practicality. Realism dictated that some compromise would have to be reached, a compromise that neither side was willing to contemplate.

One proposed solution placed a statute of limitations on the offences, but no agreement could be found on where the dividing line should be located. Another proposal applied a fixed discounting rate to each year since the act of inequality was committed, on the basis that ongoing processes of social reform provided a larger share of any compensation owed; since this applied the greatest discounting to the most expensive claims, it met with considerable private support, but since it would fail to achieve any of the social ambitions of the most vocal and aggressive reformers, it failed to get traction amongst the “victims”.

In truth, neither side was willing to abandon the pulpits of exorbitant rhetoric that the issue provided. Any solution would have to be imposed from higher up the policy food-chain, but the leaders of the Imperial Government had more than enough to contend with in the modern day problems of the Empire.

The Rise Of The Internet

Communications technology continued to grow apace; by the end of the year over 15 million people were connected to the internet. From this year forwards it would be considered a mass medium. With it came a surge of excitement in the business world, as any business connected with the internet seemed to be poised for mammoth profits.

It was quite literally possible for a startup with nothing more than an idea to make its founder a multimillionaire overnight. The IT sector was exploding.

But 1994 was not without its warning bells in this area; for this was the first year in which a new subject was mentioned in the specialty press, a subject that would end the century on everyone’s lips: the millennium bug.

1995

Conflict resumed in Bosnia as the cease-fire agreement broke down. After months of pointless bloodshed an agreement was reached for the partitioning of Bosnia-Herzegovina into separate nations for the Muslims and Croats, and for the Serbs, respectively. This was the only practical solution, but at the same time it aggravated the minority populations in both of the resulting nations, who felt disenfranchised as a result.

The New Terrorism

The transition to “the new terrorism” was completed, as exemplified by the only three serious terrorist incidents to occur in the course of the year. In the first, a religious fringe group conducted a nerve gas attack on a crowded Los Angeles subway, killing 10 and injuring thousands; three months later, the leader of the Supreme Truth cult would be arrested for masterminding the attack. A month after the subway attack, a terrorist bomb in Oklahoma City killed 158 and injured hundreds. The third incident, coming late in the year, involved a radical right-wing Zionist sect which carried out a surgical strike on the Israeli cabinet. The Prime Minister and a number of aides were killed; hundreds more were injured.

The Price Of Profit

Even more significant, though less dramatic, were the consequences of a decade’s gradual transition in Economic Circles. Traditional businesses had been forced to react to the depredations of the “New Entrepreneurs” by adopting many of the same philosophies and practices.

In particular, many had converted from a policy of growth through capital acquisition and an expanding customer base into a practice of treating these as mere seed capital for investments, which would then generate incredible profits for the owners and shareholders. In the process, many of these corporations’ traditional values, like customer service, had been thrown aside as “uncompetitive business practices”. The credo had become “profit at any expense”. As the need for ever-increasing profits bred unrealistic expectations, greed, and overconfidence, it was inevitable that someone would go too far.

There had been a number of near-misses, covered up by higher authorities lest confidence in the economic system be eroded, producing a depression; but in 1995 there occurred a loss so vast that it could not be concealed. The result was the total collapse of the Barings Bank after losses by trader Nick Leeson of more than £800 million. Tragically, this warning sign was misinterpreted by the management of other corporations as a breakdown of the audit systems, the procedures that were supposed to ensure accountability and limits of losses. The corporate culture which caused the breakdown would not be examined until it was far too late.

The Privatization Fallacy

The profits to be made by listing companies on the stock exchanges were so vast that even governments had gotten into the act, privatizing many essential industries, in a direct reversal of policies that had been in place and considered incontrovertible only two decades earlier.

Industries that had been nationalized, because their continued functioning was deemed essential to society, were now privatized for vast sums to retire national debt that had accumulated over decades.

Of course, as soon as they were in Private hands, the same credo – “Profits at any cost” – came into operation, and with it, the vulnerability to all manner of economic ills. On the surface, the economy was stronger than ever; driven by booming technology stocks, it was growing at unprecedented rates – 1000% a month was not unheard of in extreme cases and sectors – but at its core, the economy was rotten, and the first stiff wind would result in the loss of branches – if not the collapse of the whole. And hurricane season was fast approaching.

The beautiful Lagoon at Mururoa Atoll, scene of a series of French Nuclear Weapons tests in the 20th century. Photo by Georges Martin, 10 May, 1972.

1996 – collapse of the house of sticks

Politically, the slow boil within the Empire continued, as Chechen Rebels seized 3,000 hostages in the Russian town of Kizlyar.

France continued a series of tests of Nuclear Weapons in the Pacific aimed at giving them a Nuclear Arsenal independent of Imperial control.

The IRA called off the cease-fire that had endured for 17 months, just as the rest of the world perceived genuine hope for a peaceful resolution of the ongoing conflict.

A series of rapid-fire suicide bombings in Israel killed 31 people and injured over 100. Following the fourth attack within the fortnight, Israel announced that all peace agreements with Palestine had been abrogated by the Palestinians, and invaded to reclaim the territories to which they had granted independence. Within a month, Israeli airstrikes had hit an Imperial base in Lebanon, killing 105 civilians, and turning the political clock back to the darkest days of the 1960s.

1996 was also the year that the Prince and Princess of Wales petitioned the courts for divorce.

Emerging Social Trends

A number of trends that had been building for years made public debuts in the course of the year. Laws aimed at controlling the Internet began to appear, but these were national laws – or worse yet, local laws – and as such, completely unenforceable. These were the first indications of one of the dominant themes of a new era in Imperial History (and as such, will be discussed more fully in a subsequent chapter of this history): the rise of internationalism.

The quality-of-life debates that had been growing in intensity for decades came to a head as the Northern Territory of Australia passed legislation permitting terminally-ill patients to instruct their doctors to end their lives, despite Imperial and National laws against assisted suicide. Significant not only in the quality-of-life domain, this was a further manifestation of the theme of the years to come, as the interaction of laws at different hierarchic levels within the Empire – and some of the fundamental assumptions of the Empire itself – were called into question.

Resistant Diseases

But the biggest themes of the news year were medical developments. The warning by the Imperial Health Authority of an imminent potential plague of antibiotic-resistant strains of tuberculosis called into question 60 years of accepted medical practices. By the year’s end, resistant strains of many other diseases would also be generating headlines, as would the arrival of new, more persistent strains of diseases long considered to be of minor importance, in particular Legionnaires disease.

Up to 10 million sheep, pigs, and cows were destroyed in Britain before the Foot-and-mouth 'epidemic' was brought under control. Similar scenes took place in most of the Empire. Photo provided by Lawrence Livermore National Laboratories, USA.

The Mad Cow Nightmare

This followed the announcement in January of an outbreak of foot-and-mouth disease in Britain and Europe, and the admission by the Imperial Health Authority in March that the “Mad Cow Disease” could be transmitted to humans through eating contaminated beef products.

This, for the first time in history, raised the specter of an epidemic that took advantage of the existence of the Empire. While there were customs laws and inspections when shipping goods from one country to another, the fact that they were all members of an active and overriding political organization meant that these were far less stringent and restrictive than would otherwise have been the case. A massive programme of testing every herd in the Empire would be announced in December; but by then, France, Germany, Spain, Portugal, Austria, Denmark, Norway, Switzerland, South Africa, Tanzania, Zanzibar, Argentina, Brazil, Mexico, and Canada would all have confirmed outbreaks.

Mercifully, thus far, Israel, the USK, Australia, and New Zealand all appeared to be free of infection; and immediate bans on the import of beef, beef products, and fodder were put in place to keep them that way, while harsh countermeasures were undertaken that had been derived from long-standing policies on Anthrax infections. A single instance was determined to be sufficient cause for the slaughter and incineration of the entire herd.

Beef prices throughout the majority of the Empire collapsed, and untainted beef became a luxury commodity. The USK reserved the bulk of its beef production for domestic usage, over considerable protest; exports from Israel were limited by both political concerns and practical difficulties; and that left the antipodean supply as the only safe source, a fact that the national governments immediately began to take advantage of. Short-sightedness squandered what could have been a huge windfall, however, when additional export charges failed to distinguish between live cattle and cattle for slaughter; some of the herds reaching affected ports were immediately diverted from the slaughterhouses to usage as breeding stock. It would take a decade, but the domestic herds would eventually be repopulated from uncontaminated sources and – aside from an occasional isolated outbreak – rebuilt.

It did not happen before public dietary patterns had been fundamentally changed, however. Lamb and Sheep production had grown quickly to occupy much of the gap left by the virtually-vanished beef industry, and Mutton and Chicken would be the dominant meat sources for most of the Empire for decades to come.

Mad Cow and the USK

Willie Nelson, one of the primary organizers of the original Farm Aid benefit concert. Photo by Larry Philpot of www.soundstagephotography.com


The Agricultural sector of the USK economy, despite being the largest employer in the nation, had been struggling for more than a decade. It is not insignificant that one of the first imitators of “Live Aid” had been the rather more topically-focused “Farm Aid”, targeting support for family farmers in the USK in danger of losing their farms through Mortgage debt. A concert was organized for Sept 22, 1985, less than a year after the event that was its direct inspiration, and quickly evolved into an annual event (missing 1988 and 1991).

Responses to the Mad Cow crisis in 1996 were consequently more varied than might be expected. Some primary producers saw the event as a vindication of the superiority of the USK over the rest of the world, and argued against any increase of exports, a view that – when stripped of the excessively-nationalistic rhetoric – would ultimately prevail. Others wanted to trade uncontaminated beef for political concessions at the Imperial Scale, while some wanted to impose additional taxes on beef exports to raise funding to be spread as relief payments throughout the agricultural sector.

Nervous commodity markets immediately discounted the value of Beef stocks, which in itself imposed new economic pressures on the farmers, but which did not go as far as the near-total collapse of prices in the realms to both the north and south of the USK, where outbreaks had been confirmed.

This immediately produced a black market in cheap – questionable – beef shipments into the USK. Rumors of these shipments began circulating almost immediately, further depressing an already deflated market and further lowering public confidence in the beef industry. It was concern that inflating the value of beef further would only encourage these unsafe practices that ultimately killed any prospects of the USK using the international demand for beef to solve its domestic agricultural problems.

1997

The “Mad Cow” catastrophe went from bad to worse as it was discovered that the soil itself could harbor the infectious agent. This discovery was made as farmers attempted to replace their herds, only to have the disease re-emerge in cattle who had been tested and certified “clean”. It was clearly necessary to not only slaughter an entire affected herd, but to sterilize the soil on which they had grazed and to quarantine the affected farm for a period of 6 months – draconian measures that aroused storms of unrest amongst the public.

Despite the produce only coming from farms tested and declared free of the disease, the domestic beef market in much of the Empire collapsed to such an extent that it would be a decade before it had fully recovered. But with these harsh measures stringently applied, the threat posed by the disease was clearly receding by mid-year.

Only then did the government begin to examine closely the causes of the original problem, seeking answers to the questions of where the infection had come from, and what could be done to ensure that it never happened again. The answers would not be as forthcoming as Imperial analysts expected, and would not become public for years.

The sea of flowers left at the gate of Buckingham Palace in memorial to Diana, former Princess of Wales, speaks to the affection in which she was held. Photo by Maxwell Hamilton.

The final bloom of the ‘English Rose’

This was the year in which the fairytales came to an end. Following her divorce of a year earlier, the former Princess of Wales began keeping company with Dodi Fayed, the son of the owner of Harrods (and many other businesses), and the shy smile that had captured the sympathies of millions returned with increasing frequency.

But the divorce had left her vulnerable to the predations of the paparazzi, and increasingly desperate measures were necessary to maintain the couple’s privacy. One rainy night in August, after the couple had been drinking at a restaurant, the press again caught up with them; the couple drove off at high speed, pursued by the reporters and photographers. On a road made greasy by the rain, the driver lost control of the powerful BMW, which struck a tunnel upright; the pair were killed instantly.

This was one of the critical moments in history; Diana died before the press were able to tear her reputation down for the sake of headlines, despite their best efforts; and in her passing, she was anointed a saint by the public. For the second time in the century, the Empire stopped for a few hours; in 1969 it had been the first landing on the moon, in 1997 it was for the funeral of the embodiment of the promised future. Even those who felt distant from the monarchy found in those days that the world was a sadder, greyer, place.

The Crown In Crisis

Had the Imperial Family behaved differently, the outpouring of support might have shored up the rule of the Empress Elizabeth, or even that of the future monarch of the country, Charles; but they were widely held to be responsible for the circumstances that led to Diana’s death, and instead found that popular support for their rule was markedly declining.

In part, this was driven by a hostile press, who were willing to attack anyone for headlines; in part it was driven by hostile media owners, who had come under attack by the Empress; and in part, it was fully deserved.

The Empress had been so busy focusing on the ongoing battles with Government, and Peerage, and emergencies, and Civil Service, that she had lost touch with her subjects. A blinkered view of the deteriorating relationship between the Imperial Family and Diana and an old-school perspective that told her to keep her feelings private had gradually led her to lose touch with what modern citizens expected of their rulers and public figures. This, more than anything, had been at the heart of many of the conflicts between her and Diana; she had considered the Princess’ behavior to be excessively demonstrative, consistently outrageous, and perpetually verging on the exhibitionist.

These problems were compounded by a situation in which it was not considered etiquette for her servants to correct or even advise her; on the contrary, she was supposed to advise them. It took someone who was not afraid to be critical of the Imperial Family, even to discard protocol completely, to correct the situation.

Prime Minister Tony Blair at the White House, 2001. Photograph by Paul Morse, made available by the Executive Office of the President Of The United States

An unlikely savior

Fortunately, there was such a person at hand – the newly-elected Prime Minister of England, Tony Blair, who had in the past been highly critical of the role of the Imperial Family. It was Blair who explained to the Empress his ‘theory’ that Princess Diana had been so popular with the public because she enabled them to identify with her; that distance and forced deference were barriers that had been erected between the Imperial Family and the public; and that, if she desired to do so, there was an opportunity to use the current climate of discontent to reconnect with them – all that was necessary was to discard an outmoded policy of presenting herself as an impersonal throne and embrace a policy of letting them see the Monarch as a person. “Of course,” he is reported to have said, “I am sure that this is nothing that has not occurred to Her Majesty,” covering the breach of protocol. Prior to this moment, the generational gap had been the cause of considerable disrespect toward Blair behind the scenes within Buckingham Palace; at this moment, that barrier fell away, and the two entered into a new and more cooperative relationship, one that would reinvigorate the connection between Ruler and Ruled.

This was a policy that Prince Charles had also been advocating, something that he, ironically, had learned from Diana herself. But it forced on the Empress a very hard choice: she could rehabilitate the Monarchy’s image by humanizing herself, but in doing so, she would entrench the public perception of her son as unworthy to inherit, a view deriving from the tawdry infidelities that had caused the marriage to Diana to break down in the first place; or she could attempt to rehabilitate his image despite this additional handicap, risking the Empire itself should they fail to win back the support of the people.

The decision of destiny

By the end of the year, it was clear to the Empress that should Charles ever succeed her, he would preside over a hollow shell of what had been. She had originally intended to retire in favor of her son on her 60th birthday; but circumstances left her no option but to continue beyond that date, until her grandson, Prince William, had reached his age of majority. On William’s 21st Birthday, he would be crowned Emperor of Greater Britain.

It was Prince Charles who had made the decision for her – pointing out that if he abdicated his right to inherit, he could follow his heart and marry for the love he felt for Camilla Parker-Bowles, a decision with which he would be more than satisfied. Given a choice between the two, he would choose happiness over the throne – a choice that Elizabeth herself might have made, but one that she had never had the opportunity to explore. Only when the Empress’ memoirs were published in 2025 would the world learn that she had already decided to act as she subsequently did.

So ended the Age of the three anointed saints of the late 20th century. What had started as an absurdly popular recording of dance music had ended in the disinheriting of the heir to the British Throne.


Grokking The Message: Naming Places & Campaigns


This entry is part 5 of 11 in the series A Good Name Is Hard To Find

So, here it is: a day late, thanks to the Easter long weekend, but better late than never! Normal Service will be restored next week… in the meantime, enjoy.

We’re still working our way through what was originally intended to be Part 4 of this series, believe it or not! Part 1 concerned itself with setting the goals for the series, identifying the characteristics of a good name and considering the value that a good name could add – and the impairments that could result from a detrimental name. Part 2 explored Name Seeds, a system for generating character names of passable-or-better quality that I have developed. In parts 3 & 4, I examined name structures, which are the framework within which a Name Seed can be employed, a subject that segued into telling a story with a name.

Logically, if I were not so focused on trying to make up for lost time, I would have left the last couple of sections of Part 4 for this segment of the Naming series; they would have been a better fit here. But hindsight is 20/20 by definition, and at the time I just wanted to get as much of it done as I could – I was so tired at the end of it that I could barely put one word in front of another, let alone see the structural forest for the narrative trees!

Looking at the rather ambitious agenda I have laid out for this post, I’m not even sure that I’ll get all the way through it in one sitting. Assuming that I do, Part 6 (the originally-intended Part 4) will look at integrating name cores and name structures; and Part 7, to follow that, will look at various name-generation tools and aids – most of which may come as some surprise. But, in this part – and the last few sections of the previous one – we are taking a minor diversion. The subject is telling a story with a name…

Naming Places

Everything happens somewhere. If you are lucky enough to have your adventures take place on Earth, or some commercially-published game setting, a lot of the work of naming things is done for you – thank your lucky stars! If this is not the case, then you have a lot of work in front of you, because any map contains a lot of things that need naming: Mountains, Valleys, Forests, Plains, Deserts, Rivers, Lakes, Waterfalls, Seas & Oceans, Roads, Cities, Towns, Streets, Inns, Banks & Lawyers, Other Business Establishments, Towers, Keeps, & Castles – even Planets, Stars, Nebulas, and Galaxies… and I’m sure I’ve missed something.

Place names always tell a story, whether it be of exploration, discovery, exploitation, nobility, greed, or whatever – they always have a tale to tell. Even a name like “New York” – the subtext being “Just like York was, only better”. Beyond this, there’s no one pattern – until we look at each type of Place in its own right…

Naming Mountains

Mountains are generally named for appearance (especially when a metaphor can be used to describe that appearance), for their climate, for the explorer who discovered the mountain or some family member, for its height (using a relative measure), for the political location, for the inhabitants, or for a famous person known to the discoverer. The combination of all these options is so broad that just about any name you can think of can be acceptable and justified later.

That’s a bad way to do business. Unless there is some obvious name (“Troll Mountain”) or something highly distinctive about the mountain’s appearance (“Dagger Point”), I prefer the name to reflect the historic activity of the region, or the type of action that I expect to hit the players with if they enter the vicinity, either symbolically or metaphorically.

These modes of assigning the name ensure that whatever name is chosen reflects the sort of things that the namer would have been thinking about. “Black Rock” (coal) – “Small Nugget Mountain” – “Long Pine” – “Bloodfreeze” – “Goat Back” – “Twisted Ally” … well, you get the idea.

I rarely explain the origins of the name, and certainly not without a high-level skill check of some kind. The more evocative the name, the better – but once you have the PCs on the hook of curiosity, you have to reel them in gently, to use a fishing metaphor! This way, the PCs are never sure whether I’m dropping hints, describing history, describing prophecy, being cryptic, or trying to mislead them – a lot of potential interpretation when the real objective was simply to get a name that sounds cool.
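If you want to churn out a batch of candidate names in this style before picking the one whose implications you are willing to pay off later, a quick script can do the grunt work. This is only a minimal sketch in Python of the lexicon approach described above; the word lists are hypothetical examples, not a canonical table.

```python
import random

# A minimal sketch of the lexicon approach: pair a descriptor drawn from the
# region's history (or the kind of trouble you plan to throw at the PCs) with a
# geographic noun. Every word below is an illustrative placeholder.
DESCRIPTORS = ["Black", "Bloodfreeze", "Long", "Goat", "Twisted", "Small Nugget"]
FEATURES = ["Rock", "Mountain", "Pine", "Back", "Point", "Spire"]

def mountain_name(rng: random.Random) -> str:
    """Return one candidate name by combining a themed descriptor with a feature."""
    return f"{rng.choice(DESCRIPTORS)} {rng.choice(FEATURES)}"

if __name__ == "__main__":
    rng = random.Random()            # pass a number to Random() for repeatable results
    candidates = {mountain_name(rng) for _ in range(10)}
    for name in sorted(candidates):
        print(name)
```

Treat the output as raw material only – the script supplies combinations, but choosing the name that hints at the right history (or misdirection) is still the GM’s job.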

The same technique works for Volcanoes, but the lexicon will usually involve anger, hostility, violence, smoke, or some other more specific reference to the nature of the mountain. These can be made quite subtle, however, if you are in the mood – “Glacier Slip” is a great name for a volcano with a frozen peak. After the eruption, the PCs will know why the Glacier Slipped – beforehand, they won’t have a clue.

One more example that they may not get even after the fact, giving you an inside joke with which to amuse yourself: “Jaggerfalls”. Don’t get it? “Jaggerfalls” = “Jagger” + “Falls” = “Rolling Stones” + “Falls” = “Landslides” – and what causes landslides? Earthquakes, i.e. tectonic activity, i.e. a volcanically active region.

The tallest peak in a region should have a name that especially reeks of Majesty. Put a little extra effort into naming it.

Naming Valleys

Valleys tend to be given optimistic names, because they will frequently be the closest thing to prime real estate in the region. Many are named for the first town to be located in the valley, or vice-versa. They may also be named for some other geographic feature in the region, such as “Three Falls Valley”. With those caveats, the same approach used for mountains usually works just fine for Valleys.

How many valleys can you think of that are named “Happy Valley”, “Pleasant Valley”, “Green Valley”, “Paradise Valley”, “Peaceful Valley”, or something similar?

This optimism can often be used to form a poignant counterpoint to whatever nastiness you have in mind for the location. The darker and more disturbing the events to take place, the more I tend to give the valley a sweetness-and-light name.

The final source of Valley names is the name of the tallest peak adjacent to the valley. When I don’t have anything especially nasty in mind for the inhabitants, I will often use this approach simply to save the more evocatively misleading names for the occasions when they will be most useful.

Of course, no pattern of this sort should be 100% consistent, or it will become predictable. Mix it up occasionally, just to keep the players on their toes.

Naming Forests

Forests are often named for the quality of light within them, or some metaphor describing that quality, though they will sometimes take their name from that of the underlying terrain. Avoid the temptation to name forests for a shape they might make on a map – not only do their perimeters change frequently (making that shape a relatively recent phenomenon), but maps were usually not that accurate when it came to forests.

The second popular source for a forest name is something related to the watercourse that feeds the forest. You HAVE figured out where all the water comes from and where it goes, right?

Finally, beware the temptation to use the actual word “Forest” too often within the names of this type of geographic feature. Pick some other descriptive quality or some metaphor for what lies within, most of the time. “The Silverdim” is a much more evocative name than “Dim Forest” – though “Dimwood” works for Tolkien.

Beyond these considerations, the same guidelines provided for Mountains work fairly well.

Naming Plains

Plains are incredibly dull places, lacking dramatic elements or geography to use in naming them. As a result, they are frequently named for the waterway into which they drain, for the color of the soil, for explorers and their families, in fact for just about anything the explorer can think of. As a result, most have very prosaic names.

An exception comes with one specific type of plain: Tundra. The climate tends to dominate the naming of such areas, often cloaked in metaphor once again.

Quite often, plains don’t receive any name at all – that’s how dull they are. The names are reserved for the towns that locate themselves on the plain.

Naming Deserts

If climate dominates the naming of Tundras, how much more common are such name derivations when it comes to Deserts? “Dry Well” works well. So does “Hazy Desert”. Naming a desert “Blue Water” after the mirages is a nasty trick.

Colour, especially of sand, is almost as common. “White Sands” is the obvious example, with the Painted Desert a close second.

Explorer names are also very common – one of the largest deserts in Australia is named the “Simpson Desert.” Sometimes these are named for the first discoverers, sometimes for the first to successfully enter and return, and sometimes for a lost expedition.

Geography & Vegetation come fourth. Mesas, cacti, isolated mountains, all these may lend their names to the desert which surrounds them.

Naming Rivers

River names are almost as broad in derivation as mountains. Frequently, the best tool you have for naming geographic features is an Atlas, but when it comes to rivers, I’m afraid an atlas of the US is mostly out of luck, because the names are those provided by the Native American inhabitants who preceded white settlers. If you can find a resource that provides literal translations of such names, however, treasure it, because these literal translations are the best bible to naming rivers and waterways that you can find. African rivers have the same problem, as do Australian rivers and those of the Pacific regions.

England & Europe are also out of luck, but for a different reason – the languages there have changed so much that the original meaning is frequently as obscure as for their North American counterparts.

Spanish speakers may have an advantage here, because the Spanish frequently renamed the rivers they discovered in places like South America – though many may retain native names of obscure derivation, so not even this guide is completely infallible.

If you can’t tell where a river name comes from on a modern map, why should things be any different anywhere else? Use the “alien languages” techniques presented later in this post (or in the next, if I run out of time) to generate a language for the original natives and use it to name the rivers by translating names derived in the usual ways.
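Until that technique arrives, here is a minimal, purely illustrative sketch in Python – not the article’s own method – of one way to fake a native tongue: build a repeatable invented word for each English descriptor from a small syllable pool, then string the results together as the “translated” river name. The syllable list and the example descriptors are assumptions for the demo.

```python
import random

# Hypothetical syllable pool for the invented tongue; swap in sounds that suit
# the culture you have in mind.
SYLLABLES = ["ka", "ri", "sho", "un", "ta", "mi", "wa", "lo", "ne"]

def invented_word(english_word: str) -> str:
    """Build a repeatable pseudo-native word: the same English word always maps
    to the same invented word, because the word itself seeds the generator."""
    rng = random.Random(english_word)
    return "".join(rng.choice(SYLLABLES) for _ in range(rng.randint(2, 3))).capitalize()

def river_name(*descriptors: str) -> str:
    """'Translate' a descriptive English name (e.g. 'swift', 'muddy') into the
    invented tongue and tack on the generic feature word."""
    return " ".join(invented_word(word) for word in descriptors) + " River"

if __name__ == "__main__":
    print(river_name("swift", "muddy"))     # deterministic for these descriptors
    print(river_name("cold", "singing"))
```

Because each descriptor always maps to the same invented word, recurring elements across several river names quietly imply a shared linguistic root – which is exactly the kind of story a name can tell.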

In fact, the only time that you really have to worry about naming rivers and waterways in general is when there are no “native speakers” to ‘solve’ the problem for you. When this occurs, take a step back and use some literally descriptive elements – Size, width, shape, depth, colour, speed.

If you don’t think those qualities, and the metaphors they engender, are going to be enough, consider the nature of rivers – sometimes willful, occasionally contrary, changing direction as they see fit – these are qualities that (rightly or wrongly) have been attributed to women by men for millennia. “It’s a female prerogative to change her mind” – is there anyone in western society who hasn’t heard that before? In modern times, it can be appreciated that this is probably the result of human biology – monthly hormonal changes, the changes and cravings of pregnancy, and so on. Nevertheless, I’ve found that giving rivers feminine names works very well.

Naming Lakes

Lakes, on the other hand, are more frequently given names of more recent derivation. That is because it takes a relatively high level of sophistication to recognize a lake for what it is, rather than any other type of large body of water.

The larger the lake, the more likely it is to have a name of ‘modern’ derivation. (“Lake Superior”, “Lake Victoria”). The smaller it is, the more likely it is to have a native-tongue-derived name. So for small lakes, use the same approach suggested for Rivers; for larger lakes, size and importance are obviously the dominant factors to consider in naming them.

Naming Waterfalls

The Waterfalls people think of are always spectacular geographic features, frequently very beautiful, and warrant naming accordingly. But there are innumerable small falls, especially in mountainous regions, and these frequently receive more prosaic names.

The road from Sydney to Katoomba, for example, is a distance of less than 48km (30 miles), but I have counted more than a dozen minor waterfalls – often little more than a trickle – in that span. Now, the cross-mountain passages around Sydney are a little unusual in that there is no access through the valleys; to get across the Blue Mountains – as people found out the hard way – you have to actually go across the top of the peaks, down into a valley, then up across the top of the next peak. So we get to see more of this phenomenon than citizens of most other countries.

You can get a better idea of the scale of the situation with a quick squizz at – and remember that these photographs are just the larger ones, there are many smaller ones not featured!

With two scales of waterfall, there are two approaches to naming, and these tend to follow a similar pattern to that of lakes – the smaller ones have minor, relatively unimportant names (though some can be quite picturesque, as the link above shows), while the larger, more spectacular ones tend to be named for more important people. The more prosaic names are often taken from the nearest township, or the waterway, or the peak.

Naming Seas & Oceans – and straits

The largest bodies have specific and unique names, derived from ancient Gods (“Atlantic Ocean”, from Atlas), from some relative characteristic (“Pacific Ocean”, from the word meaning peaceful, named for the contrast with the Atlantic), or from the dominant landmass (“Indian Ocean”, for India).

Intermediate bodies – Seas – are generally named for the local landmass, especially when there is a slightly-different archaic name for the landmass. Often, these need to be qualified with a geographic location to distinguish one from another – “South China Sea”, for example – but a quick glance over this list of seas will show that this general statement is honored almost as often in the breach as in the observance. “Red Sea”, “Cooperation Sea”, “Cosmonauts Sea”, “Black Sea”. Added to which are the bodies of water named for their explorers, also obvious on the list – “Mawson Sea”, “Drake Passage”, “Bass Strait”, even “Bismarck Sea”.

The smaller the body of water, the more likely it is to have been named either from a native source, for the discoverer (or a relative or sponsor), or for a famous explorer.

Naming Roads

It’s not uncommon for roads to have more than one name, because a road gets its significance from where it leads. Each town, then, will often have a different name for each road that leads from it.

There are very few exceptions to this general rule. Most of those are named for the explorers who mapped and surveyed the route followed by the road. The longer a region has been settled, the less likely this is. Navigational references are also reasonably common, as are roads that are named for a geographic feature that they pass – a road past “Washingoa Falls” (a waterfall, invented name) might be named Washingoa Road.

This phenomenon means that giving the geographic feature a good name is a two-and-a-half-for-one beneficial deal – not only does the feature become an iconic element of the landscape, but the road shares in that iconic status, and (this is the half), each name-checks and reminds the players of the other. (For the record, I don’t think “Washingoa Falls” is a very good name).

Naming Cities & Towns

So, if many roads get their names from population centers, the problem of naming the roads is merely deferred – and not for very long.

The names of population centers frequently follow a pattern that differs from one geographic and socio-political region to another. You can often hear the name of such a population centre and think “that sounds like a town in (region)”. Names from the US Northeast are different to names from the Midwestern US which are different to names from the Western US, which are different to names from Mexico, or Alaska, or Hawaii, or Southern England, and so on.

In part, these patterns are real, reflecting the history of settlement – Southern California names have a more Spanish flavor, for example – but in part, they are psychological.

The key to naming cities and towns is to employ generic names for places that don’t matter, and reserve the effort for the ones that do – then try to capture the iconic flavor that you wish to impart, so that hearing the name puts you into the correct mindset for the landscape.

Do this right, and a lot of other things that would be hard work become easy.

Naming Inns

Take, for example, the concept of an inn or hostel. There is a world of difference between an Irish pub and the equivalent establishment in New York, London, Las Vegas, or outback Australia. Not only will they have different names, but the appearance and flavor of the establishments will be very different.

If the town name already has the players (and yourself) in a receptive and geographically-appropriate mindset, simply referring to “an inn” conveys the right mental image right away. Reinforce this with an appropriate inn name, and the mind fills in any blanks in the details provided by the GM with an appropriate mental image.

This impression can be fragile, however, and easily disrupted if there are jarring discrepancies between the description you provide and the impression generated by the name. It’s important to get the architecture and furnishings right, or you will undo all the good work.

The easiest way of making sure that all the details match up is to identify a real-world analogue for the region. Make sure that the neighboring regions also match up.

For example, let’s say that your game setting is somewhere very much like the central Irish countryside. Use the town names from the region as models and templates for your town names, get descriptions of the local architecture from tourist sites, and so on.

You can also work from the other direction – find a book which features an inn or establishment description, and use its location to lead you to regional maps and other information of use. It’s best to avoid fantasy novels for this purpose, for two reasons:

  • There is going to be a lot less reference material available concerning a small region of a fantasy world. You can’t exactly use Google Image Search to hunt for photos, or Google’s Street View to get a look at the local architecture.
  • You don’t know how accurately the author has done his research, and hence how consistent the architectural and narrative references are.

A far better source is generic non-fiction. Find an evocative narrative description and make it your own. Use it as a starting point for your own research – and be prepared to revise, replace, or abandon parts of the original description if your research contradicts it. You can even use a keyword internet search to find the right description. For example, “smoky cantina” pulls up a number of websites on a Google search, each of which contains part of a phrase – put them together, with a few bits in-between, and you get:

“Sad mariachi songs play until dawn over the moonlit beach behind the low fence. Men roll dice in the corner, wagering nonsensical sums on the outcome and puffing blue smoke from hand-rolled cigarettes. In a back room, two sweaty men in red-checked shirts and scarves are playing pool, while at the dingy bar, a surly bartender pours shots of tequila and lime for an out-of-place figure while a hot-blooded flamenco dancer crawls over him in search of a ticket to a better tomorrow.”

Notice how little of this passage actually describes the architecture or the people in the setting; and yet, how evocatively it conveys an impression of the place. Sight, sound, taste, smell, temperature – five of the six main senses are engaged. The reference to tequila makes it clear that the scene is in Mexico or the southwestern United States – so to really ground this location, fire up Google Maps, go to the right part of the world, look at the place names and use them as a template. Translate them if necessary – no need to name somewhere “La Cereza” if the language is inappropriate. Take the English translation – “The Cherry” – and do a search for similar names in the right part of the world. It won’t take too long to find “Cherrybrook”. Just change the iconic references – the dress style, the game, the music, and the drink (tossing in one or two more for good measure) and you get:

“Sadly-plucked lute strings waft music over the moors until dawn behind the low fence of the Cherrybrook Inn. Men roll dice in the corner, wagering nonsensical sums on the outcome and puffing blue smoke from a long-stemmed pipe. In a back room, two sweaty men in faded robes are playing jacks, while at the mahogany bar, a surly bartender pours tankards of ale for an out-of-place figure while a hot-blooded barmaid crawls over him in search of a ticket to a better tomorrow.”

We’re clearly talking English Pub; the only dating references we have are to the “long-stemmed pipe” and the “faded robes”, and those place it anywhere from the early 18th century back to the dark ages – all prime fantasy eras. Even without describing boars’ heads mounted on the walls, or pennants and flags, or thickly-smoke-stained windows, a sense of the presence of such typical decorative features is created.

There’s also a subtext – mahogany isn’t cheap, so there is a hint that the present clientele is a step down the social ladder from the pub’s past.

Inns and pubs are frequently named for wildlife, for the owner, for the town or suburb in which they are located, for some local geographic feature, for famous figures, for famous battlefields or events – in fact, for just about anything you can think of. Unfortunately, it’s just as easy to get a non-evocative name as it is to create an evocative one. “Cherrybrook”, the example given above, is somewhere in between.

When naming an Inn, the best approach is to try and capture the tone of the place, and of the action you want. Whatever overtones the name projects will be added to your narrative description; the same narrative will have a slightly-different nuance if the Inn is named “The Surly Griffon” as compared to “The Bath-house Tavern”, as compared to “The Soldier’s Rest”.

Naming Streets

Streets are usually named for anything and everything else in the country – towns, famous figures, you name it. The only time to really worry about street names is when you want to cast a general impression or tone over an entire district, a subtext similar to that of an Inn’s name, but applied to many buildings.

The plebeian approach is to take that subtext and apply it directly. “Diplomat Row”, “Merchant’s Way”… you get the idea.

A far more effective approach is to employ a metaphor for whatever quality you want the region to embody, or a synonym, or even for something you associate with that quality. “Envoy’s Row” and “Barter Way” both have a touch more nuance to them, a little more style. Compare “Temple Street” (okay but dull) with “Cloister Avenue”.

There are two parts to a street name, and that last example gives some notion of the importance of each. As a general rule of thumb, your important streets should never be named “street” – unless bucolic humdrum is the mood you are trying to capture.

Remember, too, that a plebeian name will often be replaced with something descriptive by the local population – “Potter’s Road” may become “The Avenue of Smells” if there are a lot of tanneries along it, or “Tinker’s Road” if that’s where all the blacksmiths are located.

Naming Banks & Lawyers

There are times when a particular institution will want to project a particular image. Banks and Lawyers are the two institutions that reflect this most clearly; each needs to project trustworthiness and, consequently, conservatism. To some extent, in modern times, we have stepped away from that ever so slightly; but in almost every setting you can point to, the names of this type of institution will be positively dripping with formality.

The best way of expressing that formality is to take the rules for naming upper-class individuals and generate one or more, then name the institutions for those individuals.

With Banks, it is most commonly a single individual, but names that reflect the national government are also popular (Bank Of Cyprus, Bank Of England, Commonwealth Bank, Bank Of New South Wales – just to name a few that come to mind right away).

In general, the difference is that the Banks named for individuals are private banks, founded to facilitate growth and/or trade in a particular region, while the more abstract names are ‘official’ banks established by the Government.

One name is rarely enough for a law firm, however – two, three, or five seem to be the most common (for some reason, I’ve never noticed many with only 4 names. Perhaps there is some compound growth relationship that means most firms can go from three partners to five almost every time – or not at all).
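
If you need several such institutions in a hurry, the pattern is simple enough to mechanize. The Python sketch below is only an illustration – the surnames are invented placeholders (substitute whatever your upper-class naming rules produce), and the two/three/five-partner split simply mirrors my observation above rather than any researched statistic:

    import random

    # Invented 'upper-class' surnames -- replace with names generated by
    # whatever rules your setting uses for naming its nobility.
    SURNAMES = ["Ashworth", "Pemberton", "Vexley", "Harrowgate",
                "Drummond", "Calloway", "Fenwick", "Oldcastle"]

    def bank_name(region):
        """Private banks take a founder's surname; 'official' banks take the state's name."""
        if random.random() < 0.5:
            return random.choice(SURNAMES) + " Bank"
        return "Bank of " + region

    def law_firm_name():
        """Law firms seem to run to two, three, or five partners' names."""
        partners = random.sample(SURNAMES, random.choice([2, 3, 5]))
        return ", ".join(partners[:-1]) + " & " + partners[-1]

    print(bank_name("Fumanor"))   # e.g. "Pemberton Bank" or "Bank of Fumanor"
    print(law_firm_name())        # e.g. "Vexley, Drummond & Fenwick"

“Fumanor” is just standing in here for whatever kingdom or region the bank serves.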

References

The best reference I can point to for information on the origin and functions of Banks is fictional: in one passage of Time Enough For Love by Robert Heinlein, the central character, Lazarus Long, is acting the part of a Banker in a burgeoning Colony. Warning: I would rate this book as MA15.

I don’t have any singular reference to offer on the formative aspects of Law Firms. My understanding is a mélange of episodes of L.A. Law, A Civil Action (both the movie starring John Travolta and the novel by Jonathan Harr), and many novels by John Grisham. And oh, yes – throw in some Boston Legal while you’re at it.


Naming Other Business Establishments

Most other business establishments will reference the name of the owner, or the name of the settlement in which they are located, at least until the late 19th century or early 20th century. Only then do mass communications and a wide-ranging transport system permit business to start out national or international in scope.

There are exceptions, especially when it comes to trade consortia – the most famous example being the East India Company (the “East India Trading Company” that featured prominently in the second and third Pirates Of the Caribbean movies).


Naming Towers, Keeps, & Castles

There are two primary reasons for such structures to be erected: first, to defend a region or a border; and second, to control and dominate the region around them. These can be characterized as defensive and offensive functions, respectively.

Naming conventions for these structures are often differentiated by the primary function. Defensive structures take their names from the population centers quite frequently, while offensive/control structures take their name from the surname of the family who control them – with further refinements to the name necessary only if there are several belonging to the one family.

One mistake that a lot of fantasy game writers and GMs make is failing to distinguish between Keeps and Castles. A keep is a fortified tower, frequently built inside a castle; the two terms are not interchangeable. Often, this won’t matter, but as soon as someone corrects the basic terminology of your name, its credibility and all the beneficial effects that it might have had go out the window.

Until you are sure of what you’re doing, check Wikipedia – or some other appropriate reference source – any time you give a class of building a title!

Naming Planets

There’s always a story behind the naming of a planet as soon as you get beyond our solar system. There are only so many mythological references to go around, which is where the names we use in our system come from.

Take a look at the Extrasolar Planets Encyclopedia listing of the planets discovered to date beyond our solar system and you will find that not one of them actually has a name. Instead you get things like “1RXS1609 b” and “CD-35 2722 b” – clearly not names intended for everyday usage. As of this writing, 611 planetary systems containing 763 planets have been detected – and that’s not counting 158 Unconfirmed, Controversial and Retracted planets (some of which might eventually make it onto the main list).

Most authors don’t have a naming pattern for the planets with which they populate their science-fiction universes. Two of the exceptions are Larry Niven’s Known Space series, where each world has a name and a reason for that name – whether it be Down or WeMadeIt – and The Mote In God’s Eye (and its sequel) by Larry Niven, this time with Jerry Pournelle. Since the latter are set within Pournelle’s CoDominium universe, I think it fair to count these as separate examples, rather than a recurring gesture of verisimilitude by one author.

And that’s the lesson here. So long as you have a plausible reason behind the name, you can be as inconsistent as you like, except that if multiple planets are named by the same source, they will almost certainly exhibit a consistent pattern or theme.

Naming Stars, Nebulas, and Galaxies

Again, in modern times, these objects are given a user-unfriendly catalog designation that would never make the grade in regular service. The practice in Star Wars is to name each star after the inhabited planet that orbits it and append the word “System”, and that’s a definite step in the right direction.

Very few stars actually have names; the few that do received them in ancient times, because they were visible and distinctive to the naked eye. Names like Rigel, Regulus, Vega, Sirius, Polaris, and Mira. Most of these proper names derive from Arabic, with Latin a distant second. Only a handful have proper English names, such as Barnard’s Star. The problem with these names is that they are used inconsistently, often spelt in different ways with no standardization, and there are also a few cases where names have been duplicated – there is an Alnair in Grus and another in Centaurus, for example.

Another way of naming stars is by apparent brightness (as seen from Earth) and constellation, using the Greek alphabet – “Alpha Centauri”, “Epsilon Eridani”, and so on. Officially called the Bayer Designation, this system (created in 1603) quickly runs into problems because there are a LOT more stars in a constellation than there are letters in the Greek alphabet – something not really appreciated until the later 19th century.
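
If you want to mimic that convention for an invented constellation, it is easy enough to automate: sort the visible stars by apparent magnitude and hand out Greek letters in order of brightness. The Python sketch below is purely illustrative – the constellation “Drakonis” and the catalog numbers are made up, and real Bayer designations were not always assigned in strict brightness order:

    GREEK = ["Alpha", "Beta", "Gamma", "Delta", "Epsilon", "Zeta", "Eta", "Theta"]

    def bayer_designations(constellation_genitive, stars):
        """stars maps a catalog id to its apparent magnitude; lower magnitude = brighter.
        Returns Bayer-style names, brightest star first."""
        ordered = sorted(stars, key=stars.get)
        return {star: GREEK[i] + " " + constellation_genitive
                for i, star in enumerate(ordered[:len(GREEK)])}

    # Invented constellation and magnitudes:
    print(bayer_designations("Drakonis", {"HD-001": 0.4, "HD-002": 3.1, "HD-003": 2.2}))
    # {'HD-001': 'Alpha Drakonis', 'HD-003': 'Beta Drakonis', 'HD-002': 'Gamma Drakonis'}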

The Guide Star Catalog II contains 945 million stars of up to Magnitude 21 (the higher a magnitude number, the dimmer it appears to be).

That’s an awful lot of names needed. No one system will be enough. Most stars will never receive a meaningful name – to do so, a star will have to be significant, and probably lacking a name from any of the other sources. And that’s without counting Nebulas and Galaxies!

So the rule of thumb to use is the same one as for planets – but employ it sparingly.


Naming Campaigns

GMs often don’t name their campaigns, simply referring to them as “My Campaign” or by the name of the game system. You don’t have to GM for very long before this becomes inadequate. Some GMs leave it to the players to come up with a name after it’s been running for a while but that often leads to unsatisfactory results. So it’s better for the GM to come up with his own name.

General Principles

A campaign title should tell the story of the campaign – and that gets tricky if you don’t intend to railroad it. At the same time, the title has to entice and tease the players without giving too much away, while still accurately summing up the overall uniqueness of the campaign.

That’s more easily said than done. I think that the easiest way to explain how to achieve this in practice is to demonstrate with ten of my own campaigns, and a couple of Johnn’s, with which I’ll get started:

The Carnus Campaign

According to Johnn’s introduction, “A Brief Word from Johnn”, the Carnus Campaign started with the players in the City of Carnus, which was actually Ptolus with a new name.

Naming a campaign for a central adventuring location isn’t new, but the problems come when setting a second campaign in the same location. How do you distinguish between them? How do you refer to one and exclude the other?

Further, such a naming approach makes the adventuring location the central fact of the campaign, rather than simply the place where the action takes place. It’s the difference between naming the trilogy by JRR Tolkien “Middle Earth: The Fellowship Of The Ring” (etc) and naming them “The Lord Of The Rings: The Fellowship Of The Ring”. Now, if your campaign is one that’s just a lot of unconnected stuff that happens, that may be fine – but if there is a larger theme or plotline involved, such a name can detract from it (I’ll offer a counterpoint to this argument in a few paragraphs, so don’t get excited just yet).

The Riddleport Campaign

Johnn used the same approach when naming his Riddleport Campaign, but this was more appropriate since the city was/is central to the campaign premise and events, as his posts on the campaign here at Campaign Mastery, and at Roleplaying Tips make clear.

I have also seen a similar approach used in Pirate-genre and Sci-Fi campaigns, where a ship serves as the PCs’ base of operations and the central hub of the campaign, and hence the campaign is named for the vessel.

The Adventurer’s Club

The Adventurer’s Club is the Pulp Campaign that I co-referee. It is named for the club that has gathered the PCs together, and that serves as a hub for their adventures. At the same time, the club has taken on a life of its own, having its own plot arc which touches the lives of the PCs frequently, either tangentially, incidentally, or directly. A couple of years ago (real time) the Club was taken over by the FBI as a resource too dangerous to be left to its own devices, for example.

There are a couple of subtexts to the name. Putting “Adventure” up-front in the title describes the sort of scenario that we run – very much a “there and back again” with dramatic action in-between – a stylistic promise to the players. “Club” emphasizes that the collective is more important than the individual PCs who make it up, and also stresses that alliances and fellowship will be ongoing subthemes within the campaign. Lastly, the name has the right flavor for a Pulp campaign.

Fumanor: The Last Deity

The players adventured in this campaign for two years before I revealed more than the first part of the name. As a result, they still refer to the Campaign simply as “Fumanor”. I didn’t like withholding the name, but it gave away altogether too much; that said, it took the PCs a lot longer than I expected to reach a point where they could be told the name, by a good couple of years. Initially, the title referred to the quest to name the last Deity of the Pantheon (described in more detail in “The Absence Of Plot Direction” section of my article, A Potpourri Of Quick Solutions: Eight Lifeboats For GM Emergencies), but it had been designed to have a potential sequel campaign with the same characters and with exactly the same name. In this second phase of the campaign, the title referred to the last Deity not to have joined the Pantheon assembled by the PCs, or to the rise of Lolth from lesser being to a Demigod (or better), or both – and implied that it had done so throughout the campaign, since the seeds and clues to both developments had been carefully planted in the course of the first campaign.

It’s worth noting that the first part of the title is the name of the Kingdom in which most of the action takes place because the central plotline was the destiny of that Kingdom. This, of course, is in direct contradiction to my earlier comments that such a title was only useful when the campaign was undirected; this is an exception to that rule because the direction and theme of the campaign are provided by the subtitle.

Fumanor: Seeds Of Empire

This effect, in turn, permitted me to continue to use that Kingdom, and its fate, as the central connecting thread to sequel campaigns. The Seeds Of Empire campaign is about the difficult transition from Kingdom to larger political state; the Kingdom having now grown to the point where Kingdom-level administration is inadequate, and where the Kingdom is facing Imperial-scale problems – like rival contenders for control. Since that growth was a direct byproduct of PC actions in “The Last Deity” campaign, and the PCs were all from races whose political, social, theological, and personal statuses had all been radically altered by the events of that campaign, the connection was fairly obvious. One of the three contending societies that feature in this campaign WILL dictate the shape of the emerging Empire – it’s up to the PCs to make sure it’s the one they want it to be.

Fumanor: One Faith

Originally, there was only going to be one three-part sequel campaign to the original Fumanor, but when one of the players temporarily relocated to Canberra for a year or so and didn’t want to surrender his participation, I split it into two. The first part of the originally-intended campaign became the foundation for the One Faith campaign, the second and most of the third part became the basis of the Seeds Of Empire campaign, and I whipped up a new second half for the One Faith campaign. Although the events in the One Faith campaign thus far have preceded the entire Seeds Of Empire plotline, the two are gradually synchronizing; the whole shebang is intended to (eventually) climax in an epic finale featuring the PCs from both campaigns. At the moment, both campaigns are roughly half-complete.

Shards Of Divinity

When a player asks you to run a campaign so that he can learn how you do it, and how he can improve as a player, it’s hard to say no. Shannon was a player in the later stages of the “second half” of the original Fumanor campaign, but chose to drop out – the campaign was too big in scope, and he too inexperienced, for him to get a handle on it. Five years on, he felt that he had learned a lot, and was now ready to dive into something bigger. The result was the Shards Of Divinity campaign – a world in which the source of all arcane power is the shattered remains of the original creator of the Universe, and that power is now running out; and in which one PC (Shannon’s) is – through a stroke of chance – in a position to undertake a quest to restore it, having become the sole witness to the original act of creation, and the highlights of human history since.

From that description, the source of the title seems fairly obvious, but the PCs are slowly coming to realize that there are layers of hidden meaning to the name as things that originally seemed quite unrelated begin to connect – everything from Gods in extreme depression who are a mere fraction of what they are purported to be, to the nature of divinity, to the source of divine power, to the nature of the fey, to mystic circles and rituals are starting to link to each other in unexpected ways, and everything they see around them is being revealed to be both more and less than they thought.

Champions

This is the oldest campaign of mine that I’m going to mention here. It was named for the superhero team that was the focus of the campaign, which in turn was named for the rules system. That team name was chosen by the players – but I now deeply regret not having pushed them to be a little more creative, as the fact that it is a trademarked name limits what I can do with my vast stockpile of notes and adventures. I’ve written two novels and three half-novels telling the adventures of the group, with a lot more material to work from – and none of it can be published without a complete rewrite.

Zenith-3

Over a decade ago, the Champions Campaign – which had been put on hold for a few years while I ran TORG – was rebooted into a sequel campaign. The original had been heading toward Ragnarok, an epic climax; in the new campaign, that event was five years in the past. The PCs were a team of novices, recruited into a trainee program by the original, parent team and sent to an alternate dimension, D-Halo, because Earth-Prime was too dangerous for novices. They eventually discovered that they were in fact the focus of a conspiracy by a 5th agent within the ranks of the parent team, and had been sent somewhere almost as dangerous as Earth-Prime would have been. Eventually – at the end of the original Zenith-3 campaign – they overcame that threat.

The name of this campaign obviously derives from the code-name of the superhero team – even though, to the inhabitants of Dimension-Halo, they were simply known as The Champions – because this is the story of the team’s evolution and coming-of-age.

But the name carried a hidden sub-context: the team were forced to climb to the very summit of their chosen profession in order to succeed.

Warcry

I’ve described the origins of the Warcry campaign before, so I won’t go into it again. Created in a hurry as a spinoff to contain a PC that was too powerful for the main team, a minimum of effort went into looking beyond its self-evident title.

Zenith-3: The Regency Campaign

As the Zenith-3 campaign neared its climax, a miscommunication between my players and me was discovered. It had been my intent for them to return to Earth-Prime and deal with all the ongoing problems that I had seeded into the background; but they had the impression that they were to engage in a rotation programme, exchanging places with another of the Zenith teams, and they were quite looking forward to it. After some thought and discussion, a plan emerged which would see a split campaign – some adventures would take place on Earth-Prime and some on the new world to which they were assigned, Earth-Regency – whose history I have been publishing in these pages each Monday for the last few months.

I can’t give too much away at this point, but I have told the players – and so can tell you – that over time, their presence in Dimension-Regency will make that dimension a focal point for something BIG, which I have code-named Armageddon. I wrote extensively about the process that I employed in designing the campaign architecture in the series of articles on campaign and adventure structures (November-December 2011).

What I can say is that there are, once again, a couple of meanings to both the main campaign title, and to the campaign sub-title. There is the obvious reference to the team itself; but, once again, the team will have to climb to the peak of their profession – and beyond – in order to win at the end. The plot arcs and circumstances will give the characters the chance to do so, but seizing the opportunities will be up to the players; I can (and have) warned them that subtlety, cleverness, and control will be more important than raw power to the outcome. In terms of the sub-title, it is a metaphoric reference to the Dimension in which the adventures will predominantly take place; but there are at least 3 other layers of meaning as well, that I can’t reveal. Let’s just say that the campaign title is relevant for all sorts of reasons and leave it at that!

The Tree Of Life

Nor can I tell you a whole lot about this campaign yet. The basic premise, from which the campaign appears to draw its name, is that the cosmology of the prime material plane is shaped like a vast tree, with its branches running through two of the elemental planes to the outer planes, and its roots running through the other two to reach the abyss; that, for reasons they don’t understand yet, heaven is full; and that a demon prince has successfully wiped out every cleric (and virtually all the non-clerical support staff) of all the churches in the world in a simultaneous strike. Only four PCs survived, the de jure spokesmen of their faiths, and one of those has since fallen.

Once again, there are layers within layers in the campaign title.

Summing Up

Some of these campaign titles work well, for various reasons, mostly relating to a depth of meaning within the title. The rest range from acceptable to poor; these have only a straightforward meaning, of varying degrees of nuance and relevance. A great name gives a reference point and a context to the entire campaign; a poor one can detract from a campaign – or from the later usefulness of the work involved in creating and setting it up.

If I get the opportunity, I put a lot of effort into coming up with a campaign title; it serves as a touchstone to the identity of that campaign and is instrumental in shaping not only my thinking as the campaign proceeds but that of the players. In every case where I haven’t had that time (or the expertise, in the case of “The Champions”), I’ve regretted it to at least some extent.

Whew! Almost 8000 words and I am seriously out of time on this post. There’s still a lot to come; in the next part of this series, I will focus on the fine art of naming adventures, with dozens of examples. A Dozen Dozens is not out of the question…


The Imperial History of Earth-Regency, Part 9: Peter Pan, The Saint, & The Fairy Princess – 1980-1997


This entry is part 9 of 12 in the series The Imperial History of Earth-Regency


Pieces Of Creation is an occasional recurring column at Campaign Mastery in which Mike offers game reference and other materials that he has created for his own campaigns.

All images used to illustrate this article are public-domain works hosted by Wikipedia or Wikimedia Commons, or derivations of such works, except for the image of Prince Charles and Lady Diana.
 

This post was delayed for the Easter Holiday. I hope all our readers had a great break!

Photo by NASA, taken at the Kennedy Space Center.

The Communications Age: Peter Pan, The Saint, and the Fairy Princess – 1980-1997 (~60 years ago)

Author’s notes: Of all the material contained within this Alternate history, this was the section that my players found hardest to digest when it was initially presented to them. I think that this can be attributed to two factors:

  • First, they were either too young or too old and cynical to appreciate the way the public felt at the time about certain public figures; if you did not experience it, you can’t find it completely credible;
  • and Second, their view of the era is principally Australian in nature (unsurprisingly), without an appreciation of how the rest of the world, and North America & Britain in particular, responded to these individuals.

I’ll respond to their comments about specific individuals in authorial asides as they become relevant.

An Era of Transition

1982 marked the beginning of the end for the Age of Science, or so it seemed. At its beginning, scientific discovery had been seen as the answer to all problems, and the future had been perceived with optimism. Society was open and welcoming and people could leave their doors unlocked, and the Government was the people’s friend.

By its end, science had been forced to admit that it didn’t have all the answers, and might never have them. Much of the progress made had been found to have attached price tags that were unacceptably high – industrial pollution, thalidomide, the discovery of incurable social diseases like AIDS and antibiotic-resistant STDs. The drug trade threatened to tear society apart, having already driven crime rates so high that people lived in fear despite the locks on their doors and the bars on their windows.

The Government lied at best and conspired with big business and “the establishment” to keep “the system” in power, at worst. Hope for the future had been replaced with fear and greed. But when you reach rock bottom, there are only two choices: death or the long climb out of the abyss…

Author’s Notes: I personally lay much of the cynicism and mistrust of government in modern times at the feet of seven key events, and two of these did not occur in this alternate history. I thought it worth taking a moment to reflect on each of them, to provide some context for events and attitudes in the alternate history.

  • McCarthyism: If you accepted the cold-war position that you were either “with us or against us”, then the McCarthy witch-hunts made a certain amount of sense but can only be seen as having gone way too far. If you were more inclined to think that the other side was made up of human beings too, with the same desires and needs as ‘The West’, then these were an abomination and one miscarriage of justice after another. Either way, adopting an us-vs.-them attitude, and then casting popular figures with whom many people identified into the “them” camp (rightly or wrongly), forced everyone else to think about which side they were on. As the witch-hunts became more and more ridiculous in their extremes and increasingly politically biased against McCarthy’s domestic political opposition, the “them” camp looked increasingly attractive by comparison. Divisive politics at its worst, it should be no surprise that it divided the community – and put “The Government” that McCarthy represented into the opposition.
  • The Korean War: This was a conflict whose resolution never seemed to be a victory, it just kind of limped to a conclusion. It engendered a sense not of a titanic struggle between two enormous alliances but rather of leadership that seemed less than those who had come before. After all, the previous generation of leadership had won World War II fairly decisively.
  • The Kennedy Assassination: Although he was never as strongly supported as rose-colored hindsight would have us believe, it is nevertheless a fact that John F Kennedy embodied the hope of a brighter future to an awful lot of people, and that this hope seemed to die with him. In part, this is due to the contrast of the eras before and after this pivotal event – before, the Space Race seemed the dominant theme, and afterward it was the mud, muck, and flies of the Vietnam Jungle. Had Kennedy not been killed, I am sure that his all-too-human faults would have tainted his reputation; but he died, and became an icon and a legend.
  • The Vietnam War: Which, of course, brings us to the war that so many opposed so strongly that they eviscerated the servicemen and -women who fought it. I think that a lot of people resented everything about the war, from being forced to fight it through to the manner in which it was fought. Spouting slogans at the government wasn’t enough; the mob needed a symbol of the enemy they opposed – and that symbol became those who actually fought in the war, whether they wanted to or not.
  • The Watergate Scandal: With the world slowly realizing that their leaders pulled on their pants one leg at a time (the same as ordinary people), the atmosphere was ripe for the idealistic perception of government to be shattered, once and for all – and this was the watershed event that showed that the administration had feet of clay. I’m not an American, but even in Australia, the reverberations were felt. First hope, and now trust, had been destroyed; is it any wonder that cynicism and pessimism would be the hallmarks of the decades that followed?
  • The Tabloids: Feeding all of this was the inescapable conclusion of the trend that had started with Hearst and his willingness to pump up, or even fabricate outright, stories to sell his Newspapers. The tabloid mentality, pandering to the most sensationalistic urges and emotions of its readership, holds unremitting sway over the populace only so long as they believe what they read. Every time an excess of zeal in reporting is revealed, it fuels cynicism; every time the press pursue a beat-up story for the sake of sales, or ratings, it slices off a thin segment of the community who know better and who will be mistrustful thereafter. In order to reach those affected by this growing cynicism, the headlines and exaggerations have to be even stronger. The National Enquirer, in the 80s and 90s, became a byword for going to such nonsensical extremes that some people could no longer tell what was fiction and what was fact, a situation which has been lampooned mercilessly ever since. Newspaper stories were always colored by the Editorial philosophy and vested interests of their owners, I’m sure; but efforts to keep those influences at arm’s length were slowly worn away. All sense of self-restraint seemed to vanish, and people became aware of the bias that existed as it became more obvious. The current situation in the UK with The News Of The World and the phone-hacking scandal seems to me to be the ultimate expression of this trend, and hopefully the outraged reaction that has followed will be the start of a trend in the opposite direction.
  • The Dictators: And finally, providing fuel for the fire were the excesses of dictators. The revelations of the practices of Idi Amin were a bombshell to any who thought the human race had outgrown such barbaric acts, or been purged of them by the victory over the Nazis in World War II. Such events had occurred in the past; they were part of the folklore of human history, nothing new; witness the excesses of Vlad The Impaler, or the Spanish Inquisition. But that was the problem – people had thought that we had outgrown such barbarism, and when we learned that we (as a species) had not, it tarred everyone who had stood with the architects of barbarity with a little of the same brush. This problem persists even in more modern times – was Saddam Hussein’s persecution of his citizens because of their faith really all that different (if a little less systematic and extreme) from what the Nazis did to the Jewish population of Germany? And how much of the American reaction to the Gulf War was a sense of guilt over having supported his regime?

It must be remembered that the current generation will become World Leaders in three-to-five decades, and the experiences and philosophies upon which their attitudes are built will form the baseline of their politics. The attitudes of youth 30-40 years earlier are their formative experiences. Our current leaders were children in the 60s, 70s, and 80s, when Industrial Pollution became regular front-page fodder and the illusion that what benefited a large corporation was necessarily good for the community at large was shattered. Their priorities are fixing the things that they perceived to be most wrong with the world at the time, or their modern incarnations.

So, what impact do these events play in the alternate-history world of Earth Regency? Vietnam and Korea didn’t happen – but the Russian equivalent of those experiences, the invasion and occupation of Afghanistan, did – though expanded to cover a larger field of the Middle East in general. The other key events remain, though – in many cases – once removed from the centre of power. Government can thus be mistrusted, but there are the Empress and Imperial family to shield the ordinary citizen. There is thus an avenue of hope and trust that was lacking in our history. But, at this time in history, the Empress is beginning to seem more remote and distant, the representative of a past generation. The populace, and especially their youth, are looking for new figures to idealize and idolize.

Michael Jackson, 1984, derived by SpeedDemon74 from a photograph by the White House Photo Office

The Peter Pan Of Pop

Although no-one recognized it at the time, there were three socially-significant figures who exemplified the rebirth of optimism and human decency in the Empire, and it had been in 1980 that the stories of these three figures began.

The first was an American entertainer, whose showmanship made him the most popular artist in the world. Michael Jackson had very deliberately turned his entire existence into a larger-than-life circus act, selling millions of records, and carrying pop music to its zenith as an entertainment medium. Popular entertainment was transformed by the sales of his album “Thriller”; he transformed a larger-than-life cottage industry into a professionally-operated Big Business.

And then it all began to fall apart on Jackson; reports of increasingly-eccentric behavior led to the nickname Wacko Jacko, and a public made insatiable for sensation began devouring not only the product, but the people that generated it. Rumors, fiction, and outright lies were all grist for the mill – accuracy was no longer important, headlines were all that mattered. The youth countercultures of the 60s had been inherited by the firebrands of the 70s and were unified by Jackson. There was always a mythical element to the story of the “Peter Pan of Pop”, a fairy-tale element that played on people’s lack of hope, offering an escapist retreat from a world that was otherwise becoming unbearable.

To me, Michael Jackson will always be a figure of tragedy. Denied any semblance of a normal childhood, it is no surprise that his adulthood – after earning enough money to make any dream a personal reality – would be bizarre. A child-like naivety and trust lie behind virtually every decision he ever made, in my opinion – whether that be trust in the medical profession, in his preference for relating to children (who naturally shared his perspective), or in his confidence that his life of excess would be understood by his fans.

There was a time, before the advent of the tabloid headlines of his life, where Jackson could seemingly do no wrong. Everything that he touched turned to gold. Those same child-like qualities made him a repository for optimism and hope, a living idol to the inner child within all of us.

His rise and fall are a Greek Tragedy, writ large because of the way he embodied what others wanted to preserve in themselves – hope, optimism, and the ability to enjoy life to the full with no cares for tomorrow. He was successful because he appealed to the things we like about ourselves. It was all too easy to forget that he was also human, and fallible.

Like JFK, he became a popular idol; unlike JFK, he survived to be torn down from the pedestal upon which he had been placed, rightly or wrongly, by the public. Had Kennedy survived, perhaps the same thing would have happened to him; he certainly had enough opposition with whom to contend. But that’s another might-have-been, and one that doesn’t fit within the current story.

The Communications Age

Not all the lessons learned from the larger-than-life success of Jackson were good ones. It became acceptable to spend as much as necessary to achieve a blockbuster success. The same attitude began to pervade all other forms of entertainment, and then business in general. The counterculture figures from the 60s were now aged in their 20s, 30s, and 40s, and had largely been assimilated into the mainstream of the society against which they had rebelled.

The youngest amongst them achieved new levels of greed and excess, and were officially tagged with the collective nickname “Yuppies”.

Real-life prediction: A decade from now, if not sooner (5 years or so), the big international issue will be corporate responsibility and making the executives of corporations accountable to the public for their behavior. Think back over the news stories of the last couple of years and you can see the early trends in this direction.

Nor was this the only new word entering the language at this time; a more formal title for the period might well be “The Communications Age”. New language had been infiltrating for decades as a consequence of scientific and industrial progress, but the vast majority of these terms were technical, with little influence on everyday usage.

A frying pan was still a frying pan; the “Non-stick Teflon Coating” was just sales jargon. Now, however, domestic innovations began to appear with increasing regularity, and the language changed as a result. And the most fertile field for those innovations was the communications field, as ‘GPS’, ‘VCR’, ‘Mobile’, ‘Hands-free’, ‘HUD’, ‘ISP’, ‘PC’, and ‘CD’ all became everyday conversational terms.

Princess Diana at the opening ceremony of the community centre on Whitehall Road, Bristol, UK, May 1987. Photo by Rick.

The Princess

The second great figure of the era was even more strongly symbolic of the escapist / fairy tale popular appeal. Lady Diana Spencer was considered a flower, the embodiment of the shreds of hopes and dreams of the common people made manifest.

When she married the heir apparent of the Empress Elizabeth, she became the public symbol of hope. It was then widely believed that the Empress Elizabeth would abdicate on her 60th Birthday, and that her son would ascend the throne; and thus the coming generation would have a representative, an ear and a voice, at the very centre of power.

Behind the scenes, the Cinderella fairy-tale was far from reality, the combination of the weight of public expectations and a husband with adulterous inclinations overwhelming the young woman at the centre of the storm. In the public thirst for sensation, the fairy tale would be exposed, piece by piece, as a sham.

Matters were not helped by the old-world moral judgments of the Empress Elizabeth, who was placed in an impossible situation as the marriage began to fail. Hailing from an era in which loyalty, “for better or worse”, meant forever, she did not support the fragile Diana as much as the Princess was, perhaps, entitled to expect; nor was she especially successful at reining in her son’s indiscretions. As the marriage first floundered and then ended, she discovered that the sensationalist press had eroded all faith in Prince Charles as a potential Monarch, even as they had destroyed her faith in her son’s discretion and attention to duty.

Worse, he had undermined confidence in the Monarchy as an institution; Diana had been perceived as the People’s Princess, the ally of the commons – titles that were supposed to belong to the Empress – and the failure of the marriage had become perceived as the failure of the Empress to stand by the people. The entire concept of the Empire as a political institution was beginning to lose favor amongst its citizens – without whom, the Empire would be nothing at all.

Diana somehow emerged from the entire fracas with her perceived connection to the people intact; but now that she was no longer royalty, she was seen as fair game for the sensationalists, who slowly dragged her down to earth. She had been careful to maintain a public face of respectability, and (to her credit) never let her dignity escape her, and had even begun to rebuild her personal prestige through many social & charitable projects, when she was killed in a terrible automobile accident. The wolves turned on the legend and did their best to tear it asunder; but the sensationalist movement was beginning to die, and as a result, her legend survived.

We in Australia held a privileged position in terms of being able to see the entire story unfold at arm’s length. We saw the British public attitude of the era as they identified with “Lady Di”; we saw the disintegration of the fairy tale; we saw the rebirth and rise of popularity within the United States; and we saw the British public revere her as a martyr to the lust for headlines of the tabloids and paparazzi.

Part of the appeal was generational; Prince Charles was roughly the same age as my father, Queen Elizabeth roughly the same as my Grandmother. Diana was approximately my age, seemed to like the same things that people of my age liked, had similar attitudes and opinions, and so on. She embodied a hope for the future to many people, whether they were strong supporters of the monarchy, or not.

At the same time, this was the coming of the New Romantics and the tail-end of their extreme counterpoint, Punk. More than musical styles, these represented philosophies in opposition; and for those without the anger at and resentment of society to fuel a punkish attitude, Princess Diana seemed to embody the cleaner-cut image of the New Romantics.

Sir Bob Geldof at the headquarters of the International Monetary Fund, 23 April 2009. Photograph by Stephen Jaffe, courtesy International Monetary Fund.

The Saint: Sir Bob Geldof

The third of the Great figures of the 80s could not have existed without the contributions of the first two. If Michael Jackson unified youth cultures throughout the Empire, however briefly, and Lady Diana gave them optimism and hope, it was Bob Geldof, later nicknamed “Saint Bob”, who showed just what the combination could achieve, socially, when they really wanted to.

His relief project, Band Aid, and subsequent Global Live Pop Festival “Live Aid” (the Mao being a notable non-participant) raised funds in excess of 500 Million Pounds for famine relief in Africa. It spawned imitator events from across the world, most notably USA For Africa (organized by Harry Belafonte and Ken Kragen, and featuring Michael Jackson amongst others). Equally important to future generations was the revelation of the consequences of misrule by African Warlords and Dictators.

The political promise that the politicians had so feared in the late 1960s had been realized – in a socially-acceptable way. Equally importantly, the devastating pictures of mass starvation that resulted reminded people of the benefits that science, and the Empire, when used properly, could provide.

African hunger would be a recurring issue; while no-one of the time thought that any of these activities would be a lasting solution, African aid, and its management (and mismanagement in some cases) would focus attention on the causes of many of the problems in future years.

Granted an Honorary Knighthood by the Empress in 1986, Sir Bob remained active in African relief and similar projects for the remainder of his life. His plainspoken demeanor, occasional outbursts of hyperbole, and – to some extent – his naivety in terms of distribution of the proceeds of his various ventures in the cause, left his efforts open to criticism after the fact, though few doubted his sincerity and willingness to sacrifice his own personal career to the cause. His de facto position as the media spokesman for just causes and political enlightenment was eventually usurped by Bono of U2, whose activism covered a wider range of issues; but to the public at large, they were all walking in Saint Bob’s shoeprints.

Imperial Resurgence

These three people, more than any others, could be considered the prime movers behind the Imperial Resurgence. Had any one of the three not existed, it is doubtful (in retrospect) whether the Empire would have survived to the present day (2055).


Okay, so Michael was popular, Bob made social responsibility popular, and Diana’s wedding was a popular fairy tale. That doesn’t mean that people have to buy into the Deification of the Holy Trio. It can be argued that the people would have rekindled their hopes anyway – very few can live in total despair for any period of time and continue to function – and that these three, amongst others, just happened to be the figureheads anointed by the resurgence. But to Imperial Citizens, they are revered.

A return to prosperity

The increased enthusiasm on the part of the ordinary citizen generated other resurgences. In particular, the Economy, which had slowly become moribund, began to grow again, and a more wary and realistic faith in technology emerged. Technological solutions were perceived as only part of the story; the real problem with Industrial Pollution, for example, was not a scientific one, it was a social problem. People demanded the products that were being manufactured – a social phenomenon – and it was that demand that was the real cause of the environmental damage. The solution would also have to be a social one.

There was a general perception that any seemingly insoluble problem only seemed so because it was not properly understood. Crime, for example, wasn’t just a social problem; it needed scientific analysis to find the solution. The concept of prison reform, which had become popular through the 1970s, was increasingly perceived as a failure, because it promised an easy ride to criminals; the deterrent element was missing.

These changes in attitude took time. Together with changes in fundamental social concepts like ownership, and the social unit, they would slowly reinvigorate the Empire, and ultimately culminate in a new groundswell of optimism in the following generation; but it was the Communications Age that laid the foundations.

The original Sony Walkman, photo by joho345

1980

There were few developments of obvious, lasting historical significance in 1980; no doubt the days were as filled as at any time, but from a remote perspective the world seemed to be holding its breath and enduring the calm before the storm.

The Communications Age began before the end of the Age Of Science, with the launch of a portable, personal tape player, the “Walkman”. It was not recognized at the time as the harbinger of a social revolution; it was just another gadget. It would be years before many people discovered its existence, and even today many have the (false) impression that the Walkman post-dates the Personal Computer (even fewer realize that the Compact Disc predates the Walkman by two full years!)

Those false perceptions notwithstanding, the Walkman was a new concept in that it personalized the entertainment experience, elevating the individual over his surroundings. Prior to its release, music and entertainment were social activities, involving anyone within earshot. If music were played, everyone in the room heard it; there was a shared aspect, a social aspect, to the experience. Now music became a personal experience; in itself not a groundbreaking development, but one that would symbolize the coming decade and much of the decade to follow.

The Individualistic Experience

For 17 years, in fact, the dominant social trends could be symbolically cast in that one concept – the Individual over Society. Individuals worked for their own benefit first, the benefit of other individuals second, and a collective society hardly at all. Indeed, so little common ground was experienced through this period that society collectively was perceived as a faceless mass, a lowest common denominator, a generalization of individuals.

But at the time, none of this was evident. Life was dominated by day-to-day events, and only with hindsight could a trend be perceived; and many of those day-to-day events were trivial, even irrelevant in the historical sense. Which is not to say there were no significant developments….

Rhodesian Disunity

The sequence of events in Rhodesia came to an end as the South made the transition to Black Rule under the joint leadership of Prime Ministers Mugabe & Nkomo; the north continued in its state of anarchy.

Afghanistan Deadlock

It was announced in February that 90% of Afghanistan was now under direct Imperial Military Control – but that 60% of the Afghan military remained intact within the last 10% of the country.

The Iran Crisis

On April 25th, the USK took unilateral action to free the hostages in Tehran, launching a commando strike. Unfortunately, the Americans were not the equal of the Australian Special Forces, who had already ruled out a raid as too risky; the action was bungled and the hostages killed by their captors.

Less than a week later, terrorists seized the Iranian embassy in London, demanding the release of political prisoners; but unlike the Tehran situation, the layout of the London embassy was conducive to successful intervention, and Australian Special Forces successfully freed the hostages and captured the terrorists within a week of the alarm being raised.

Although Wikipedia Commons also has pictures of the eruption itself, I couldn't go past this spectacular 2004 image of the volcano crater steaming. Prior to the eruption, it looked like an ordinary mountain. Click the thumbnail for a larger image.

A Bellow Of Nature

In mid-May, the long-dormant Mount St Helens unexpectedly erupted with the force of 10,000 atomic weapons. Because of the demonstrated capability of The Mao to create and trigger volcanic events, this brought the Empire closer to global war with the Mao than at any time in the last 35 years. Tensions did not ease until specialist geologists from around the globe confirmed that the eruption was natural in origin.

Ronald Reagan, photograph courtesy the National Archives & Records Administration, ARC 558523. Photo by US Department of Defense, Department of the Navy.

Other news of the day

There were other terrorist actions through the year – bombings, assassinations, and so on – as the extremists offered the frustrated an outlet for their dissatisfaction. Israel unified Jerusalem and declared it to be the new capital of the Zionist nation.

Ronald Reagan was elected Prime Minister of the USK despite opposition by King Jeremy Washington I. And finally, Michael Jackson’s “Thriller” was released to critical acclaim and initially poor sales.

1981

1981 felt much the same as 1980. There were a few developments of lasting interest, and new trends continued to gather momentum, but it was nevertheless a year in which life was simply business-as-usual for most of the population.

The Iranian Crisis Deepens

In January, Iran released the 52 Imperial Hostages who had been held in Tehran since November 1979, carrying an offer to the Empress: Iran would rejoin the Empire, and use its influence to help persuade the other rebelling Middle Eastern states, in return for an equal voice in the governance of Jerusalem, and the eviction of the USK† from the Empire.

† The Kingdom Of The United States Of America. Refer to earlier parts of this series for explanations.

While the Empress may have been tempted to consider the offer in those moments when the United States was being especially exasperating, the peace offer failed to take into account two crucial facts:

  1. The USK was vital to the defense of the Realm; and,
  2. As a practical measure, the Empress didn’t have the power to accept or reject the proposal; that would be controlled by the Diplomatic Corps.

Iran had lost touch with the political realities of the Empire, and as such, the proposal was doomed to an inevitable failure.

Prince Charles & Princess Diana Photo © 2010 hans thijs, flickr

The Spanish Experiment & other events

The “United Leadership” experiment of King Carlos‡ came to an unhappy end, as 200 civil guards under the command of Colonel Tejero Molina attempted a coup. Carlos resigned as Prime Minister, admitting that his bold attempt to unify sufficient power to force change had failed.

‡ See “The Rules Change” in Part 7 of this series.

Heavy fighting again broke out in Beirut in April, and in June Israeli aircraft bombed a nuclear reactor under construction near Baghdad.

In July, Charles, Prince Of Wales, married Lady Diana Spencer.

The following month, USK Aircraft shot down two Libyan jets over the Gulf of Sirte, while October saw the assassination of Anwar Sadat of Egypt in protest over his peace accords with Israel.

Throughout the first few months of the year, sales of “Thriller” would grow, until it ultimately became the most popular single body of music publicly available; it would be “Top Of The Charts” worldwide for over a year.

The IBM 5150 was the first fully-assembled PC; all the previous ones came as kits which had to be assembled by the user. Photo by Biffy B.


The year also saw the arrival of the space shuttle and the recognition of AIDS as a disease. IBM launched the PC (with 64K of RAM and a single floppy disk drive); it would become the industry standard over the years to come. It certainly was not recognized as the means by which the individualism that had not yet become dominant would first achieve its full flower, and then ultimately wither.

1982

At the start of the year, it looked like it was just going to be more of the same old same old. But the strongest hurricanes start as a light breeze…

Africa

In the culmination of the Smith Plan, Southern Rhodesia established a new identity as Zimbabwe. Joint leader Nkomo, whose relations with Mugabe were always strained at best, was dismissed from office because he would not agree to Prime Minister Robert Mugabe’s intentions to establish a police state.

A Pyrrhic Defeat

Israel agreed to give the Sinai over to direct Imperial control in the interests of maintaining peace. By the end of April, all Israeli forces had withdrawn from the region. The Afghanistan advance by the Imperial Military all but ended in stalemate.

Relations between Iran & Iraq decayed and then devolved into war. The Israeli Ambassador to the Imperial Court was shot by Palestinian terrorists; in retaliation, Israel invaded Lebanon. The significance of this last development would not be recognized for over two decades, when it would revolutionize politics within the Empire.

In the meantime, the bloodshed continued unabated. It took less than a month for Israeli forces to encircle Beirut. In an effort to prevent civilian casualties, Prime Minister Begin offered to permit the PLO to withdraw from the city with their weapons.

This was the first acknowledgement by the Israelis of the earlier decision by the Civil Service to recognize the PLO as a political organization – by negotiating with them and treating them as a political authority within the region, they gained political credibility throughout the middle east as a “dispossessed nation”.

Debate raged for almost two months, but the resulting political benefits were too strong for the more moderate elements within the PLO to resist. By accepting, they would be able to claim shelter and sanction within the same laws and rulings which created the artificial national state known as “Israel” – and could thereby claim all the legal and diplomatic protections and concessions extended by the Empire toward the Zionist state.

As the only areas in the region under direct Imperial control, the Empire had only two choices in terms of a homeland for the PLO, protected by Imperial Law – Afghanistan and the Sinai. For the population to relocate to the former, they would have to march directly through the centre of the Iran-Iraq conflict; the only viable answer was for the PLO to be accorded protected status and Rule of the Palestinian region.

In permitting themselves to be ‘defeated’ and withdrawn from Beirut by the Israelis, they would ironically achieve everything that they had been fighting for. For the first time, “Success” was no longer synonymous with “Victory”. Furthermore, Israel would have to support their position or risk weakening their own political authority within the region and losing many of the concessions granted them by a sympathetic Empire in the wake of the Holocaust.

Cordoned-off street in front of the HSBC branch in Beirut, October 2005. Photo by Robysan. Click on the thumbnail for a larger image.

The Beirut Bloodbath

Although it was widely regarded as a troublemaker and an agitator, the PLO was in fact a stabilizing influence within Beirut. Within a week of their departure, and as the Israelis began to push into the city, the Lebanese Druse militia and the Lebanese Army restarted their long-standing Civil War. Three warring factions, each opposed to the others, converged, and Beirut became a bloodbath.

Each faction committed what can only be characterized as atrocities on the captured supporters of the others. On August 18th, over 800 Palestinians were executed by Christian militia in two refugee camps in western Beirut, for example.

It was slowly becoming evident that, just as a revolution in military tactics would be needed to succeed against the desert guerillas of Afghanistan, so a revolution in politics would be needed to solve the problems of the Middle East. But at the time, no-one had any idea of what shape that revolution would have to take – had no idea even of where to begin – and in any case, the Civil Service / Peerage alliance was inherently conservative and resistant to any change. Only when this political problem was solved could the search for new paradigms within the Arabian Peninsula begin.

The first reasonably portable computer was the Epson HX-20, shown here in its carrying case. Photo by sandstein.

1983

While a lot happened in ’83, most of it made little difference in the long run. Bloodshed continued in the Middle East. Apartheid continued in South Africa. Uproar continued in central Africa. Terrorism just continued. But some events sowed the seeds of future developments.

The worst drought since 1973 (!) ravaged Ethiopia, bringing famine to millions. The Laptop computer introduced the concept of portable computing.

Pioneer 10 or 11, painted by Don Davis, Image provided by NASA. Click on the thumbnail to see the fullsized image.


Pioneer 10 passed the orbit of Neptune, then the most remote planet of the solar system. The IRA destroyed Harrods in London using what “must have been” a Mao sonic bomb that was attuned to the stress-points of the steel girders; 6 people were killed and dozens injured when the building collapsed.
 

I felt that some direct terrorist attacks would be made on the heart of the Empire, simply because it was the central point of authority. Such attacks often occurred in our history, targeting the United States, but on Earth-Regency, some would have to be aimed at London, simply because London was more important on the global scale. Furthermore, because London is closer to the Middle East, there would be more capacity for such attacks. This was the first such additional attack.

The HIV retrovirus was identified. Australia stole the America’s Cup from under USK feet – the first time since the contest’s inauguration in the 1870s that there had been a non-USK victor. This was done using clever, innovative engineering. And the second round of arms limitations talks with the Mao ended in complete breakdown.

The Apple-II computer. Photo by Marcin Wichary, Flickr.

1984

This was the year in which the disparate elements that marked the decade as a turning point began to coalesce. Violence continued in the Middle East, but calm began to return to Central Africa with South African troops leaving Angola, just as internal civil violence escalated.

Diplomatic talks with the Mao resumed after the contentious issue of disarmament was removed from the Agenda; by the end of the year, a new trade agreement was in place which promised a massive economic boost. Apple Computers released the Apple II, the first computer with a graphic interface. “Thriller” sold over 37 million copies in the USK alone, while Bob Geldof’s “Band Aid” produced a chart-topping single to raise money for famine relief.

Greed Is…

Entrepreneurs and Interest Rates began to emerge as the economic patterns of the decade. The rise of the new breed of Entrepreneur, the ultimate expression of the “Yuppie” movement, was a particularly significant development, because for the first time, these were not members of the Peerage.

A new subclass of the “Working Class”, they generated unprecedented wealth through three avenues: Communications (Michael Jackson, Alan Bond), New Technology (Bill Gates, Steve Jobs), and New Products (Franklin Andrews, head of the Asia-Pacific Trading Company).

Our readers will be familiar with most of those names, except possibly Alan Bond; his Wikipedia entry is located here.

The one name that they won’t know is that of Franklin Andrews, because that was the name of the father of Lance Andrews, aka “Behemoth”, one of the characters from the original superhero campaign, from which this history is divergent. The latter is a name that will crop up a number of times in the later chapters of this text.

Differences in the history of trade within the Empire become significant at this point. Tea was an Indian product, as was rubber; when India fell to the Mao in 1914, production of these commodities shifted to South America. Many of the other Chinese products, like Silk, had never reached Western markets.

While the sale of Imperial products to China was the province of the Peerage, differences in Cultural & Economic systems ensured that Asia was a relatively small market. The sale of Chinese products within the Empire was where the Big Money was; and the most significant trader was the Australian, Franklin Andrews.

The peerage tried to stop the rise of these new competitors, but found themselves hamstrung by Common Law. The battle lines between old and new, age and youth, were now clearly established.

It was Band Aid that showed the strength of the emerging youth factor as a social force. The more that disposable income shifted toward the young, and the more of that money that was spent on products controlled by the new entrepreneurs, the more strength the established political parties gained from their policies of recruiting a younger generation.

In 1964, the average age of the members of the Lower House of the Imperial Government was 52; in 1974, it was 49; and, by 1984, it had lowered to 45. If the trend held true, the 1990s would be dominated by a Prime Minister in his 40s, and the 2000s by one in his Thirties.

Live Aid at JPK Stadium, Philadelphia, 1985. Photo by Squelle; click on the thumbnail for a larger image.

1985

“Band Aid” was no more than a band aid on the problems faced by Ethiopia and central Africa. By the end of 1984, Geldof was planning an even more ambitious project – a 48-hour-long rock concert televised globally – including (for the first time) – China.

Taking place in July, and watched by over 1,500 million people, Live Aid raised over £350 million for further famine relief.

Although this was a tiny sum in comparison with the needs of the region, it was equivalent to five years of additional disaster relief through official channels.

The most notable omission from the performance list, which included hundreds of heavyweights in the popular music industry, was Michael Jackson, who had organized his own “Band Aid” equivalent project. His reluctance to be involved in the project, and the public castigation that followed, proved to be the first cracks in the Jackson mythos.

The long slow road to peace

The slow trend towards Peace in the Middle East resumed without addressing the problems that had led to previous outbreaks of violence in the region. Israel agreed on a staged withdrawal from Lebanon, which was complete by mid-year. Libya released four Imperial civilians after negotiations by an envoy of the Archbishop of Canterbury. But that was as good as it got.

October saw the Peace again shattered as PLO extremists murdered 3 Israelis in Cyprus. Within hours, Israeli bombs were falling on the Empire-supervised PLO holding camp from which the extremists were believed to derive. While 60 hardliners within the PLO’s ranks were killed, civilian casualties were ten times this number. In addition, 23 Imperial representatives were maimed and 4 killed. The Empire responded by declaring the Israeli action an “Over-reaction”, and warning that any repeat would result in punitive action. This was not enough for some of the Arab nations, and threats of War as a result of the incident lingered for months.

Eyeball-to-eyeball ruthlessness

The Israelis insisted that demonstrating that for every Zionist killed, 200 Palestinians – including those responsible – would be executed in reply would have a deterrent effect. This position ignored the obvious facts that the fanatics, responding to what they considered oppression, would only become more fanatical in response to such “punishments”; and that dead religious fanatics frequently became martyrs to their cause. Consequently, relations between the Imperial Court and Israel became strained, and the PLO moderates gained in sympathy, which they hoped to parlay into additional support for their claims to “Dispossessed Nation” status.

Only in 2015 would it be discovered that the PLO hardliners deliberately targeted the Israelis whose deaths had triggered the retaliatory strike in anticipation of a Jewish overreaction. They viewed the deaths of over 600 of their own as a worthwhile sacrifice if it generated additional pressure for the Empire to recognize their claims over the West Bank.

The Terrorism Escalation

If peace was again in short supply in ’85, one thing the year had too much of (as had been the case of late) was acts of Terrorism. March saw the 25th anniversary of the Sharpeville massacre in South Africa; the anniversary was commemorated by fresh rioting and by the police firing into the crowd – a mirror image of the events of a quarter-century earlier.

146 were killed during Tamil separatist attacks in Sri Lanka on May 14th. One month later, Shi’ite Muslim gunmen hijacked a TWA airliner and demanded the release of 700 prisoners held by Israel, while in July French Nationalists blew up the Greenpeace ship Rainbow Warrior while it was anchored in Auckland harbor.

August saw 60 dead and 100 injured when a car bomb exploded in Christian-controlled east Beirut; two days later a retaliatory car bomb exploded in the Muslim sector, killing 50. Of course, the PLO attack on 3 Israelis and the response have already been discussed. Less than a week afterwards, Palestinian guerillas seized the Italian luxury liner Achille Lauro and murdered a USK Hostage. The grim total of over 300 deaths through acts of terrorism in the course of the year would not be exceeded for the rest of the century.

Author’s Notes: While it’s certainly possible to criticize the West Wing episode “Isaac And Ishmael”, which was written and broadcast in the week following 9/11, the one line that most strongly resonated with me at the time, and which (in hindsight) summed up what would be the US attitude in response, was the exchange involving Rob Lowe’s character: “What’s the one thing that strikes you most about terrorism?”, and the answer, “Its 100% failure rate.”

The litany of punch and counterpunch listed in the above section clearly demonstrates the utter futility and waste of such methods. Nothing makes a populace more determined to resist than poking them with a stick – and, when you’re talking about a national body, that’s what all acts of terrorism amount to. The 9/11 attack united most of the world in anger and fury, and certainly stiffened American resolve to resist any attempts to change their attitudes toward the Middle East.

It was thinking about that event and the response that it engendered that led me to the plot idea expressed in the last paragraph of the preceding section. I have no inside knowledge concerning the incident; I can only state that such an action as I have described seems consistent with the characters of the people involved.

Top portion, front face, Space Shuttle Challenger Memorial, Arlington National Cemetery, USA. Photo by Tim1965.

1986

The success of Live Aid didn’t have an immediate impact on Imperial Society. The youth movement was largely cause-driven, and not yet the outright political movement that it would become in the 1990s. They still needed a unifying trigger, a cause to rally behind. This was the year in which they gained that cause.

Undoubtedly the biggest news stories of the year were two catastrophic engineering failures. The first was the dramatic and tragic failure on launch of the Space Shuttle Challenger, which threw the Imperial space programme into disarray; the second was the explosion of the nuclear power plant at Chernobyl, which leaked substantial quantities of nuclear fallout over northern Europe.

Only marginally less significant were the concession that the peace process in Northern Ireland had failed, and the new offensive against Libya, which once again was edging towards a nuclear capability.

Reactions

While the events themselves were significant, the reactions of the general public were even more telling. For many years, the value of the Space Programme had been questioned; as diplomatic progress was made, the need for intensive development of space became less pressing. Increasingly, the demand was to shift funding away from Space and toward environmental concerns.

With the discovery of the Ozone hole over the Antarctic in September of ’85, these concerns became even more pointed. But following the catastrophic failures of the engineering of which the Empire had been so proud, the impetus became overwhelming, and the minority Green parties in the various members of the Empire became a significant political force, largely by capturing the youth vote. There was a general demand for a step back from the technological forefront and an increasing emphasis on more mundane endeavors.

Increasing concern was repeatedly and loudly voiced concerning the growing population problem and the ability of the Empire to maintain production of food – issues that were heightened by the contamination resulting from the Chernobyl incident.

New Dilemmas

There were a whole raft of new issues to be contemplated. Actually, most of them weren’t all that new; but the sense of urgency, of insistence on priority, was new.

Issues such as recognition of the native inhabitants of the colonies – Canada, Australia, the USK, and Africa – had been growing for some time, for example, spearheaded by the anti-Apartheid movement. There were suggestions that the Empire had double standards which were difficult to refute.

Awareness of the problems of Agriculture had been becoming more general long before the plight of Ethiopia brought them into the living rooms of Imperial Citizens all over the globe. Soil Salinity, Ozone, Oil spills, Nuclear Waste, Smog, Reliance on fossil fuels, Urban Sprawl, Topsoil Erosion, Rainforest restoration – none of them were new issues.

  • The most extreme position demanded that polluting industries be shut down until the environmental issues were resolved. The economic chaos that would have ensued made such demands absurd, and they were rejected out of hand.
  • A more balanced (but still extreme) proposal called for a moratorium on further research & development until the rest of the world was brought up to core Imperial engineering standards.


The Power Of Synergy: Maximizing Character Efficiency



One of my regular players and an occasional contributor here at Campaign Mastery, Ian Gray, has a simple philosophy when it comes to rewards – never ask for +5 when five +1’s will do.

The Judo Of Wishes

It’s a philosophy that has developed from his experiences with Rings Of Three Wishes and similar items. Like almost every D&D player out there, he’s seen people make outrageous demands and requests when using Wishes, and the inevitable reaction by the GM has been to do their utmost to screw the PC up as punishment for their audacity and in an attempt to keep some semblance of game balance.

The usual player reaction to this denial of their unmitigated greed has been to become amateur lawyers, attempting to make the terms and conditions of the wish ironclad in defense of the desired and exorbitant benefit they have claimed. The worst case of this that I have ever witnessed occurred when one player prepared a sixteen-page typed contract – for one wish.

This only makes the GM work harder and with more bloody-mindedness at finding and exploiting any loophole they can uncover, in my personal experience, and Ian has made the same observation. Since anything the GM says goes (short of driving his players away from the Game Table in outrage), the deck is inevitably stacked in the GM’s favor in such contests – sooner or later, they will neutralize or steal or pervert or corrupt or render unusable the Player’s ill-gotten gains.

Ian observed this happening to other players on several occasions and quickly decided that a plus-one or plus-two that he got to keep and use was infinitely better than a plus-five that the GM will move heaven and earth to turn into a plus-zero. What’s more, as soon as it is announced that a PC is using a Wish, the GM – through experience and ingrained habit – inevitably girds his mental loins, bracing himself for whatever abomination the greedy player is about to demand. Making a slightly-weaker-than-reasonable request makes the granting of that request – with no hidden catches or strings – practically automatic, turning the GM’s own determination to fight unreasonable requests against him.

The Stacking Equation

At around the same time, as I understand it, Ian was also formulating a second philosophic principle that has shaped his PC development ever since – it doesn’t really matter which came first. This states that it is more than twice as much work to get and keep a +2 bonus as it is to get and keep a +1 bonus. In other words, it’s easier to get two +1 bonuses that stack than it is to get a single +2, and much easier to get three +1 bonuses that stack than it is to get a single +3 item – but the end result is the same, in terms of character capabilities.

Extrapolating from that: it’s easier to get four +4 bonuses than it is to get a single +16 item (in fact, outside of possibly some Monty Haul campaigns, no such items even exist, and nor should they). Or six +4s instead of a single +24.

Are these numbers starting to look alarming yet?

Looking at the rules

A typical +3 weapon costs roughly 18,000gp according to both the 3.x DMG and the Pathfinder Core Rulebook. According to the NPC gear value charts for 3.x (p127, 3.5 DMG) that means that the absolute earliest that a character should be able to get his hands on that equipment is around 11th level; the Pathfinder rules are more explicit and suggest 17th level (p454, Core Rulebook). The character-wealth-by-level table brings that forward to about 7th level (3.x DMG p135); I couldn’t find an equivalent table in Pathfinder.

If you can achieve the same result from three different sources of +1 – a feat, a magic item, and a +2 stat gain or a class ability – how soon can you get there? A feat: 1st level, or perhaps 3rd if you have to wait. A +1 magic item (value approx. 2,000gp): 2nd level (3.x), 7th level (Pathfinder) – but I have seen 1st level characters for both that are so equipped. But, let’s stick with the official guideline for the moment. A stat gain of +2? You can get plus one at 4th level – and a potion or a scroll can make up the balance from 2nd level on (but again, I’ve seen 1st level characters with potions as starting equipment). A class ability that only gives +1 is pretty low-level – certainly, any such would normally be received by 4th or 5th level, and 3rd or sooner would not be unexpected.

Total: between 3rd and 5th level (3.x) a character can have the same benefits expected of a 7th level character. For Pathfinder, that’s 7th level to achieve the same effect as an 11th level character.
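To put rough numbers on the stacking equation, here is a minimal, purely illustrative sketch (in Python) comparing the cost of buying a single large enhancement bonus against stacking several +1 sources. It assumes the familiar 3.x-style pricing of weapon enhancements at roughly bonus-squared × 2,000gp, and treats a generic stackable +1 source as costing about 2,000gp; both figures are assumptions made for the sake of the comparison, not quotations from either rulebook.

```python
# Illustrative only: compares the gp cost of one large enhancement bonus
# against several stacked +1 sources. Assumes 3.x-style pricing of weapon
# enhancements (bonus squared x 2,000gp) and a notional 2,000gp price for
# a generic stackable +1 source - both are assumptions, not rulebook quotes.

def single_item_cost(bonus: int) -> int:
    """Approximate price of a single +N weapon enhancement."""
    return bonus ** 2 * 2000

def stacked_cost(bonus: int, unit_cost: int = 2000) -> int:
    """Price of reaching +N via N separate +1 sources."""
    return bonus * unit_cost

for bonus in (2, 3, 4, 5):
    single = single_item_cost(bonus)
    stacked = stacked_cost(bonus)
    print(f"+{bonus}: one item ~{single:,}gp, stacked +1s ~{stacked:,}gp "
          f"(the single item costs {single // stacked}x as much)")
```

The point of the sketch is simply that single-item prices grow with the square of the bonus while stacked +1s grow roughly linearly – which is why the same net benefit becomes available so many levels earlier when it is assembled from small pieces.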

It takes work

A lot of players just show up to play, not even looking at their character sheets away from the Game table. Ian is not like that – he works hard for his +16 or +24 or whatever. Outside of game time, he will go over his supplements and references, looking for combinations – this class ability with that feat and the other magic item and this other feat – that actually total the sum of their parts, or more.

Nor is he – despite the impression you may have received so far – a min-maxer. He carefully develops a character concept and profile, evolving it as he interacts with the game world, and every choice that he makes has to be justified in light of that character concept. If it seems right for the character, he will ignore an obviously beneficial combination (in terms of the rules) and choose an option that seems more appropriate to who the character is. All this is an expression of his role-playing, not rules-lawyering (at least most of the time).

As he puts it: The bottom line is that you get out of a game rewards equal to the effort that you put into it. Ian puts in a lot of effort, and he reaps the rewards – and he has trouble understanding those who don’t, especially if they complain about the relative power level between his character and theirs.

An Unfair Advantage?

Yet, all this single-minded attention gives Ian what many would consider an unfair advantage, simply because the GM can’t spend months or years developing and improving each encounter in advance. Heck, we’re usually lucky to find time to rub two dry words together!

GMs can live with this situation in one of three ways:

  • they can either target the lowest common denominator – matching the effectiveness of most of the party – and accept that Ian will make things look easy; or,
  • they can craft opposition that presupposes Ian-level effectiveness on the part of the PCs and accept that those characters not built to the optimum standard will suffer for their laziness; or,
  • they can try to mix-and-match – one foe of a standard suitable for confronting Ian’s PC and others of a standard appropriate for the other PCs.

Right off the bat that seems like a no-brainer, doesn’t it? When you put it that way, #3 is the obvious right answer. Unfortunately, it’s not that simple. Let’s consider the ramifications of each (in reverse order):

The Mix-and-match solution

Because the GM doesn’t have the time to build an efficient enemy (in the same way that Ian’s characters are efficient PCs), this solution equates to adding gross firepower to the encounter. Instead of (say) a CR8 creature, drop in a CR15.

But that means that the entire party gets not only the experience for defeating the CR15, but also the loot that a CR15 carries – which is a lot more than that of the typical CR8. The net result is that the characters earn more experience than is warranted at this point in the campaign, becoming more capable more quickly. And because Ian’s PC is not of a higher level than the others (or not much – something I’ll get to in a moment), he progresses just as quickly, with the progression amplified by his ability to design good characters.

This solution might work in the short-term, but it does so at the price of making the overall problem worse.

There’s a second exacerbating factor as well – using this approach means that when solo encounters occur, matching effectiveness means that Ian gets the experience for beating a CR15 while the others get the experience for beating a CR8. It doesn’t take very many such encounters before he has gained several levels over the rest of the party – which only makes the apparent disparity of power levels worse.

All this tends to create ill-feeling and jealousy amongst the other players as well, because not only do they not get anywhere near as much time in the spotlight, but that spotlight doesn’t even burn as brightly when it IS on them. So it’s not the perfect solution that it might have seemed on the surface. In fact, it’s not even close. Throw in the frustration that the GM experiences, and the genuine difficulties of coping with parties whose power levels are so disparate, and you have a recipe for disaster – and I have seen whole campaigns shut down as a result.

I have to admit, this lesson was hard-earned; for a very long time, this was my solution to the problem. It was only when I started to wonder why the problem seemed to be getting worse that I came to the realizations offered in this section.

Targeting the Optimum PC

So, what then, for the idea of using Ian’s power level as the guideline for everyone, in effect “encouraging” the other players to match his expertise in character construction?

This falls into the trap of creating an “us-vs.-him” feeling at the game table, where the “him” is the GM – the other players feeling (quite rightly) that the GM is picking on them because they aren’t as skilled, or don’t have as much time to invest, or don’t have access to the same game resources, as Ian does. There is also a growing resentment toward Ian, whose fault they often consider this to be.

Mechanically, too, this solution has its problems – in fact, these are just the same problems as the previous answer, but amplified by the fact that there are now several CR15 opponents and not just one.

This is throwing HP at the problem and hoping it goes away – but because XP and HP are connected, you are also throwing XP at the problem, which only makes it worse.

In other words, this is no solution at all.

Targeting the Lowest Common Denominator

By virtue of excluding the other proposed solutions as fundamentally flawed, this then has to be the correct answer. But there are consequences of adopting it that make life harder for the GM.

The game effectively becomes too easy for the players. You can expect them to win every straightforward encounter without great difficulty. So the trick to making this solution work is to fill the game with challenges that are not so straightforward. Build nasty little surprises into the game. Be deceptive. Be secretive. Accepting that you are overmatched on the power front, attack on a different vector. Play smart, not strong. Emphasize role-play and relationships and situations in which the shortest distance between two goals is NEVER a straight line.

In an ideal world, this is the perfect solution: it takes the power that a player like Ian confers on the characters he designs and makes it largely irrelevant. If you are at least 20 IQ points smarter than your players, this can work. If you and they are more reasonably matched – if you are a mere mortal when not ensconced behind the GM’s screen – you will need to find some other answer. Sadly, it’s never that easy.

I want to digress for a moment to emphasize that it’s not all downside, having a player like Ian in your games. What you have here is a player who pays attention to what you reveal in the game, who actively thinks about it a lot, who gets the little hints and appreciates the bigger picture and the twists and turns of the plot, who gets and appreciates more of the game than anyone else at the table. And who is a nice guy, to boot.

Everyone has a different tolerance level for the problems that players like Ian engender, but I’ll put up with an awful lot to keep those qualities at my table.

This article is not intended to be a criticism of him or his play – he’s doing nothing wrong – it’s about a GM being able to cope with a player of his caliber.

Other solutions

There are more than those three answers, of course, and it’s entirely possible that the reason none of them seem to be entirely satisfactory is that we haven’t looked hard enough for alternatives to find the solution.

  • Ian as player consultant: It’s a simple solution to the problem of disparate PC power levels: even up the playing field a bit by having the other players consult the acknowledged expert at character creation. Ian is quite happy to do so, because character creation is a skill like any other – the more you do it, the better you get at it. This also eases tensions, hostilities, and resentments amongst the other players toward Ian, producing greater harmony at the game table. Not a total solution, but a definite ameliorative.
  • Recruit Ian’s Talents: There have been a few occasions when I have needed a really top-notch NPC, and judged that the price of giving Ian some inside info about the campaign direction was less than the price of using an under-created character. Getting Ian to help in the creation of some of the top-line NPCs makes the game better for everybody, so he’s usually happy to do that, too. Again, not a complete solution, but a useful approach when you need the enemy to be top-notch.
  • Talk to him about the problem: The first character of Ian’s to really exhibit the mega-built problem in one of my campaigns was Warcry. The first thing I did was verify that Ian wasn’t cheating, and the second thing I did was to talk to him about the problem. Much of this article is a distillation of that, and many subsequent, conversations with him concerning his approach. The initial conversation led to the next solution:
  • Retire the character when it gets to be too much: In the case of Warcry, it was a good character with a lot of plot potential and I had worked up a number of interesting adventures for the character to have with the team. The obvious solution was to split the character off into his own campaign and have Ian generate a new PC for the main campaign. It worked quite well, and with greater awareness of the problems, Ian deliberately chose to create a less confrontational character the second time around; as a result, Glory was able to stick around until the first Zenith-3 campaign came to a close, even though (towards the end) she was again becoming too powerful relative to the other PCs. For the new campaign, Ian has generated another new character – one that he’s had about seven years to polish – but one that is even less directly powerful in terms of a direct confrontation.
  • Find a shortcut: The final solution is to match Ian at his own game. But wait a minute – the entire premise of this article is that no GM can spare the time from general game prep to do so, isn’t it? Well, yes, it is, but that’s not the end of the story. If a shortcut can be found that at least simulates what Ian does, then the whole problem goes away. Suddenly, that impractical answer, “Target the lowest common denominator”, becomes practical. And I think I’ve thought of just such a solution.

One Structure To Rule Them All

If it is conceded that there is one optimum construction for each character class, and that what Ian does is to winnow through the lesser options until he settles on the best one for the current circumstances of the game and the particular character that he’s created, then there is an approach that replicates his work – in less time, and without the finesse and artistry that he employs, so it will be a lesser solution, but better than nothing.

The solution is a Zwicky Morphological Box:

  • Each class has a number of functions and abilities.
  • Each of these functions will emphasize or be controlled by a particular numeric value. Sometimes there will be more than one, creating sub-variants.
  • Each sub-variant will have a particular characteristic upon which it is based.
  • Every feat will either benefit a numeric value, enhance a particular class ability (i.e. a function), or boost a characteristic. Some feats will produce a paradigm shift, altering the basis to a different characteristic.
  • The same is true for every magic item.
  • The same is true for other class abilities and Prestige Classes.
  • The same is true for the optimum tactical situation for the character to utilize their primary focus to best effect.

What I propose is a series of columns of lists, one set for each character class, one column for each character class function (i.e. class ability) and any sub-variants. Each column would be divided into sections – Class Abilities, Feats, Magic Items, Skills, Spells, Prestige Classes, Tactical Notes. In addition, there would be a simpler set of columns (no sub-variants) of lists, one for each characteristic.

  1. Go through each of the class abilities for that class. If any of them enhance the character’s primary focus AND are accessible at the same class level as the primary ability, they go on the list under “Class abilities” for that primary ability focus. You can do this at the same time as you are setting up the initial lists.
  2. Go through each feat in the Core Rulebooks for your game – find a list of them, if you can – and number them, i.e. index them numerically. Then list each one in the feats section for each primary focus or stat where it is relevant. This should take a matter of seconds per feat; you aren’t worried about all the bric-a-brac and fluff and restrictions that come with it, just with the general question of ‘does this enhance or improve this focus ability?’ If the feat has any prerequisites, these can be noted by number in brackets. Of course, you will also need a master list of indexed feats.
  3. Ditto magic items, in the Magic Items section. (Some won’t go anywhere – add them to another list, called “fluff”). Some may generate new sub-variants – Frostbrands vs. Flame Tongues, for example. Create by copying and pasting into a new column.
  4. Ditto skills, in the Skills section. Most of these will have no effects on any core functions, and can be ignored – you’re mostly looking for synergy bonuses and opportunities to enhance tactical positions. But some skills will recur often – spellcraft, and knowledge (religion), and spot, and listen, and search, for example.
  5. Ditto Class abilities from Prestige Classes.
  6. And so on, until you’ve finished with the core rulebooks. Next, grab the first game supplement that comes to hand, and do the same for what’s in that.
  7. Repeat as necessary. (It might be a good idea to keep a list of game supplements that you’ve processed, in alphabetic order, so you don’t waste time going over old ground a second time).

What you are really doing is culling all the alternatives that don’t benefit the class ability that you want to focus on in each column.

A table or spreadsheet is perfect for this work – and the implementation of tables in Open Office makes it better suited than Word for the purpose, because it lets you copy part or all of a column.
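If a spreadsheet isn’t to your taste, the same morphological box can be held in a very simple script. Here’s a minimal sketch of the shape of the data – one master list of indexed feats, plus one “column” of sectioned lists per primary focus. Every class name, focus name, feat name and index number in it is an invented placeholder, not an entry from any rulebook.

```python
# Minimal, illustrative layout for the morphological box. All names and
# index numbers below are invented placeholders, not rulebook entries.

# Master list of feats, indexed numerically (step 2 of the procedure above).
feats = {
    1: {"name": "Example Strike", "prereqs": []},
    2: {"name": "Improved Example Strike", "prereqs": [1]},
    3: {"name": "Example Focus", "prereqs": []},
}

# One "column" per (class, primary focus), divided into the sections
# described above; entries under "feats" refer back to the master list.
columns = {
    ("Fighter", "melee damage"): {
        "class_abilities": ["bonus feats"],
        "feats": [1, 2],
        "magic_items": ["generic +1 weapon"],
        "skills": [],
        "spells": [],
        "prestige_classes": [],
        "tactical_notes": ["full attack whenever possible"],
    },
}

def options_for(class_name: str, focus: str) -> dict:
    """Everything catalogued as enhancing the given class's focus."""
    return columns.get((class_name, focus), {})

# Example query: everything filed under the hypothetical Fighter melee focus.
for section, entries in options_for("Fighter", "melee damage").items():
    print(f"{section}: {entries}")
```

Whether you use a script or a spreadsheet, the payoff is the same: once an option has been filed under the focus it benefits, building (or checking) an optimized character becomes a matter of reading down one column instead of re-reading every supplement.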

The Time Factor

Will this take time? I’m afraid so – but by simplifying the questions involved, and permitting a quick skim to do the work, and making each entry as simple as possible, and using cut-and-paste with multiple lists open at the same time, it should not take very long.

The beauty of the approach is that in the long run, it actually makes your game prep more efficient, so an initial investment in time helps in the longer term.

And, of course, the results are persistent within the game system that you are using – until a new edition comes out and your campaign switches over.

Why I haven’t done the work for you

I entertained thoughts of doing just that – bit by bit, over the course of multiple articles here at Campaign Mastery. Or of putting the results in an e-book – I’m sure that it would sell! And it would have the benefits of recycling something I’d like to do for my own campaigns into something publishable – which is probably the only way I’m going to find time to do it at all for the next few months!!

But a little thought about the project gave me pause. Every campaign is different – I don’t have every game supplement that’s out there, and I don’t interpret them all the same way, and my House Rules are different to those of the next campaign over. That means that every campaign’s lists would be just a little different from each other, and the format means that it becomes a lot harder to customize them after the fact. In fact, I think it would be even more work to customize an existing list than it would be to create a new one from scratch.

I could be persuaded otherwise, if our readers demand it – once the current Monday series of the Alternate History is finished, of course – but, for the moment, the best solution is to show all of you how to do it.

So, if I have to do it myself – why no example?

Unfortunately, it would take almost as long to craft an example as it would to do the whole thing. I would still have to glance at every feat, every magic item, and so on. In fact, arguably, it is more work to do it one class at a time (because there is more redundant activity) than it is to deal with each potential entry just once for each of the lists required.

This is an all-or-nothing project – and so it isn’t possible to extract and create an example, except perhaps for the layout of the lists – and those will vary with the software each GM has available, and with their own ideas, anyway. I’ve certainly had no time to optimize the design, and have not actually done this myself yet – so there are no examples to offer. Sorry.

The Bigger Picture

A few of you may be thinking that none of this matters to you – after all, you don’t have Ian Gray in your campaigns (for good or ill)!

But the fact is that everybody does have an Ian, at least to some extent. Every player has his own unique strengths and abilities, and no two are ever going to be identically competent at character design. Some will have a ‘favored class’ that is their preference, and whose options and nuances they’ve mastered, but be a fish out of water when it comes to optimizing a different class.

So the same problems exist, to at least some extent, in every campaign out there. It’s only that Ian has gone further than anyone else I know down this path – and hence given the associated difficulties sufficient prominence to be noticed.

Fractionalizing the Differential

Can this power, this technique, be turned to the Dark Side? Can it be adopted by the players to add to the problems confronting the GM?

Of course – but it’s hardly the end of the world if it happens. In fact, by normalizing the efficiency of character construction for both players and GM, and reducing the differential between the run-of-the-mill player and the Ian Grays of the gaming world, a campaign will be a lot stronger. Opposition will be more nearly a match for the PCs, making the challenge – and the fun of meeting that challenge – better for all.

Oh yes – and it also somewhat pulls the teeth of any genuine min-maxers amongst your players.

Not a bad thing to have your name associated with, eh, Ian?


The Imperial History of Earth-Regency, Part 8: The Ascendancy Of The Peerage – 1978-1979


This entry is part 8 of 12 in the series The Imperial History of Earth-Regency

Only a short post this week, I’m afraid, and half of it is taken up with a reality check on where things stand at this point for readers who may be coming in late. I could have continued, but I would like to start each Chapter in its own post – so I’ll make up for it, next time around.


Pieces Of Creation is an occasional recurring column at Campaign Mastery in which Mike offers game reference and other materials that he has created for his own campaigns.

All images used to illustrate this article are public-domain works hosted by Wikipedia, Wikipedia Commons, or derivations of such works, except for the image of the photographers, which is governed by the SXC terms of agreement.
 

You don't appreciate how big the Pyramids of Giza really are until the skyscrapers offer some perspective. Photo by Jerzy Strzelecki, licenced under the GNU Documentation Licence version 1.2. Click on the thumbnail for a larger image.

Status Check:

As we resume this Alternate History, the Empire is beset by tumult and dissension. Many of the problems are political, some are social, and some are economic.

Politically, the Middle East is by far the least-stable corner of the Empire. Ideological conflicts have produced an unstable political landscape full of ongoing wars and temporary peaces. At the start of 1978, Lebanon was in a state of Civil War, and an attempt to invade Afghanistan had turned into a 7-year bloody standoff for the Empire. A moderate had been elected Prime Minister of Israel, leading to a negotiated peace with Egypt; currently hostile are Syria, Iraq, Libya, Algeria, South Yemen, and Afghanistan. Another of the trouble spots, Saudi Arabia, had recently had a change in head of state, with a Moderate King succeeding a Militant. Pakistan, once the most loyal of Imperial members, had slowly disintegrated politically to such an extent that it had been placed under direct Imperial Control, with neither political party trusted to conduct an honest election. Middle East-based terrorists are an ongoing problem for the Empire.

Second only to the Middle East is the balance of the African states. Idi Amin’s regime is coming under increased public attention as his record on civil rights begins to emerge. The Empire has already placed an arms embargo on South Africa in protest over the Apartheid policy. Somalia had invaded Ethiopia, while Rhodesia had commenced an attempt at creating a unified African Tribal political entity based on the Israeli model. Several other kingdoms have attempted or threatened secession or revolution, with varied results.

Early indications suggested that South America was heading down the same political path as The Middle East and Africa. Most recently, a coup in Argentina had removed Prime Minister Isabel Peron after she blocked investigation of Electoral Fraud allegations leveled against her, while an attempt by Chile to secede from the Empire had been blocked by the application of “The Pakistan Resolution”. In general, the continent is viewed as a remote backwater, of little overall significance.

Europe also teeters on the edge of political disintegration, largely resulting from the overwhelming control of daily life within the Empire by the Civil Service, which the Empress Elizabeth saw as the solution to all her problems for most of the first 25 years of her reign. This has effectively left elected representatives powerless to implement changes in policy not approved by the Civil Service, whose first rule is to protect themselves and their positions. Public unrest is at unprecedented levels as a result.

Unconventional attempts to find solutions are beginning to surface; in Spain, the King is also the Prime Minister, a situation viewed as a one-off – but one that will demand closer scrutiny should King Carlos manage to rein in the Public Service. In the meantime, Elizabeth has at least forced the Civil Service to accept the principle of dismissal for incompetence. Northern Ireland is also a trouble spot, where Mao-backed guerillas have committed a number of terrorist acts in support of demands for an independent voice within the Empire.

North America, dominated by the USK†, has become a problem of an entirely different nature to the Empire; as its strength has grown and that of Europe has waned, they have begun to dictate various aspects of Imperial Policy. The Americans are consumed by a particular arrogance that reflects their status as the strong right arm of the Empire: they have slowly become the Political and Popular Cultural leaders of the world, and they know it. Bringing them to heel has so far proven almost impossible. Only the presence of the renegade Central American Kingdom on their borders has so far kept them in check.

† USK= Kingdom Of The United States Of America. Refer to previous chapters of this Alternate History.

Socially, other problems remain unsolved throughout the Empire. Youth countercultures and the Generation Gap have opened a divide between the Middle-aged majority and their children. Gender and Racial inequities are slowly easing in most corners of the Empire, though some areas remain backwaters of discrimination. The criminalization of Narcotics has generated an escalating crime wave by addicts which society has proven helpless to control.

Even more turbulent is the Imperial Industrial sector. While the issue of union corruption has receded from prominence, it remains an ever-present background element. The union movement has become a breeding ground for politicians, just as the Civil Service has become the breeding ground for Peers. Because the Peerage also controls big business, and the Civil Service effectively controls government policy, together these groups are in an overwhelmingly strong position; only the protections of Common Law prevent their total control of the Empire.

Because the Empire was reaching saturation point in the development of known resources, the inherent weaknesses in the 20th century economic models had made inflation an ongoing crisis; this enabled the combined power of the Peerage & Civil Service to clamp down on wages, leading to industrial action on a broad front. The Empress was aware that the Lower House / Union Movement were her best weapons against the rampant power of the Peerage, but it was a weapon she dared not use, as it threatened the Empire with total economic collapse as a byproduct. Only the Coal Act, which defined industrial actions which interfered with “Essential Services” as a form of terrorism, had so far prevented the cessation of industry altogether.

Overshadowing all these internal crises was the ever-present threat posed by the Mao. The non-human rulers of Asia possessed technologies which, for all their gains in scientific knowledge, remained as unfathomable and inscrutable as ever; while science was capable of analyzing and identifying the applications to which this technology was put, as shown by the discovery of their ability to control the weather, the fundamental operating principles remained cloaked in shadows, the subject of equal parts speculation, assumption, and prejudice. Until the invasion of the rogue state of Afghanistan, the most significant wars of the last two centuries had been fought with the Chinese masters of the Asian continent or their allies. While of late, progress had been made in establishing accords and protocols with the Chinese and their shadowy ruling class in summit talks aimed at achieving specific goals to the benefit of both, they remained the most significant single threat to the ongoing existence of the Empire.

Only beginning to emerge as problems to be solved in the latter part of the 20th century were the environmental consequences of the massive industrialization of the last century. Although the full scope of the problem is not yet appreciated, some progress has already been made, with Business held liable for ecological damage resulting from their operations – in theory. In practice, these laws have just failed their first real test, following the first recorded ecological disaster, centered on the town of Seveso (near Milan) in Northern Italy, devastated by the accidental release of poisonous dioxin gas from a nearby pesticide plant. By reaching private settlements with those directly affected, the Peers involved had successfully prevented their testimony; while this effectively amounted to the commission of even more serious crimes, without the testimony of those receiving the settlements, the case was legally hopeless. In effect, a criminally-negligent administration used wealth to reduce very serious charges to a rap on the knuckles – at an expense far less than a legal defense would have cost them. The peerage had finally found a way around Common Law, the only thing that had been keeping them in check….

Joshua Nkomo. Photo by Robin Wright courtesy The Christian Science Monitor and the Alicia Patterson Foundation, licenced under the creative commons 3.0 unported licence.

1978

The Rhodesia plan for a united “Black African Nation” was rejected by black leaders Joshua Nkomo (leader & founder of the Zimbabwe African People’s Union) & Robert Mugabe (the Secretary General of that organization, who had been imprisoned as a Political Prisoner since 1964) in March, as the Empire declared it illegal under Imperial Law; immediately, guerilla warfare increased dramatically as the “Patriotic Front” attempted to force moderate Black Africans to reject the plans. In the midst of these developments, Somalia accepted defeat and withdrew from its invasion of Ethiopia.

This was a particularly bloody month; it also saw a PLO attack which killed 11 Israelis, and an invasion of southern Lebanon by Israel in response. In mid-year, Islamic fundamentalists rioted in Tehran calling for the removal of the Shah (King), whose policies of modernization were at odds with the religious fundamentalists. Civil Unrest and violent demonstrations would lead to Martial Law and a military government by the end of the year.

The following month, Ahmad al-Gashmi, the President of North Yemen, was killed by a bomb. Two days later, the same extremist faction assassinated Muhammad Ali Haitham, the Prime Minister of South Yemen. With tensions mounting, the Empress personally interceded with the leaders of Egypt & Israel; two weeks of face-to-face negotiations in Buckingham Palace led to the Buckingham Accords, which formally ended 30 years of hostility between the two.

Terrorism remained an ongoing problem. Former Italian PM Aldo Moro was kidnapped by Red Brigade terrorists; this was the first international recognition of the group, whose goal was the restoration of the Roman Empire. They did not want to be rid of the Empress so much as they wanted to be free of Her civil servants; although it had never been done previously, they had no problem with the concept of the one Empress being head-of-state of multiple Empires at the same time. The proposal was universally opposed by all concerned as inherently unstable; inevitably there would arise an occasion when the Empress would be called upon to favor one over the other, destroying the loyalty of the people slighted.

1979

Whenever a society experiences rapid expansion of knowledge, watershed years have a tendency to occur more frequently. The sum of human knowledge in the Empire was now doubling every 25 years, and even experts were finding that they could not master the entirety of their chosen general subject, but were increasingly confined to specializations. Synthesis of new approaches by collecting a disparate group of specialists in relevant fields – the think-tank – would play an increasing role over the next two decades.

Short-term consequences of this expansion of knowledge meant that paradigm shifts in perception occurred more frequently – and with each, ‘acceptable behavior’ was redefined. The Generation Gaps were widening. 1979 was recognized even before its commencement as just such a decisive year.

The Mao Summit Talks

January 1st 1979 was touted as a day of hope for all mankind, as ongoing diplomatic relations with the Mao were agreed to for the first time. The breakthrough came with the begrudging political acceptance by the Imperials that the Chinese Empire was the equal of their own system of government. It was hoped that, through greater understanding of and respect for one another, a fourth Global War could be avoided. Nor were the diplomatic concessions one-sided; the Mao had to swallow their own pride somewhat and acknowledge that the British Empire had grown to the point of achieving parity and equality with their own culture, and was worthy of respect.

However promising the achievement of mutual recognition, it did not erase the fundamental differences between the two regimes. They had different cultures, different technologies, different religious beliefs, and different philosophies. The Mao regime emphasized the comfort and security of their citizens, at the expense of their independence; while the British Empire stressed personal achievement, social mobility, and the maximum amount of freedom for its citizens, at the expense of social guarantees of prosperity. The poorest citizens of the Mao regime were incalculably better off than the homeless and destitute of the Empire, but the wealthiest of the Imperial Peers possessed a luxury unheard-of within the Chinese borders.

The Mao were slow-growing, deliberate, and methodical; already, plans were underway that would not reach fruition for centuries. The Empire, in comparison, was explosive in growth, moving into new areas long before the old were fully established. The results were a much larger Society subject to perpetual growing pains, and one which perpetually needed new areas to grow into. Many of the social and psychological problems that were beginning to emerge were analogous to cabin fever, the result of a confinement and bottling up of that drive to explore. Escapism, in many forms, became an increasingly-prominent feature of literature and mass media; in the past, the youthful vigor and drive had been marshaled and directed into exploration and colonization, but with nowhere remaining to go, new forms of diversion were needed to consume that energy, and media providers who saw this as an opportunity for profits were eager to take advantage of the need.

The Mao were not without problems of their own: slow to change, slow to react, slow to integrate new ideas and new discoveries. It was a certainty that progress of all sorts – literary, social, and scientific – was ponderously slow. If the Empire had now achieved Parity with the Mao, in a century the Mao would be as antiquated in capabilities as a Victorian Army faced with the best military capabilities of the modern day, or as the Native Americans had been against the western settlers who confronted them during the conquest of North America. These facts did not change human nature; the citizens of China were just as ambitious and desirous of luxury, just as caring for their children, as were their Western counterparts, and their youth possessed just as much excess energy. The Mao focused this energy into an obsession with precision and ritual; the average Mao citizen participated in over a dozen ceremonies and rituals each day, and dissipated the remainder through an increased reliance on manual labor. But the price of this solution was a stultification of their society, a reluctance to innovate when conventional solutions were no longer sufficient.

A few philosophers dared to suggest that both were extremist views, forced down mutually-exclusive social developments by the presence of the other; the optimum social solution would be somewhere in between, a blending of the British drive to explore new ground with the Mao ability to make maximum benefit of what resources they had available.

Donald Perisque Summerkinde, in his landmark 2032 historical and social analysis, A Romanesque Myopia, compared both societies with that of the long-past Roman Empire, finding many analogies for each to ponder.

The Roman Empire had been limited in size by the nature of its administrative and economic systems, while the limitations that faced its modern-day equivalents were essentially geographic in nature – there simply was no new territory left to gain, save by means of hostility against the other. But the consequences were the same: each had found its own form of social degeneration and decline, manifested most strongly by those with the greatest excess of energy at their disposal, as a rebelliousness against whatever had been fashionable a decade or two earlier.

This, he argued, was the true cause of the rise of The Teenager as a social and marketing force. In both societies, the excess energy was manifested and consumed by new means of artistic expression, usually condemned by the generations prior to theirs as “barbaric noise”.

Summerkinde also compared Mao society with that of the North American natives, and came to the conclusion in persuasive fashion that the two were more alike than had been generally realized; the study of Amerind culture would thereafter become an accepted part of the curriculum for the training of diplomatic personnel, and surviving tribal members who had fought so hard in the late 20th and early 21st centuries to preserve their culture suddenly found themselves rewarded with high diplomatic credentials. The irony that a people who had been lied to and deceived so often, and been subject to so many broken treaties and promises, were now the leading negotiators of such treaties and promises, was not lost on them. Some consider it Coyote’s grandest jest.

The Ayatollah Khomeini, Photo by Aleain DeJean, taken 5 February 1979. Photograph is in the public domain in Iran, its country of publication. This photo has been edited, click on the link to see the original and the terms of use.

Rise Of The Modern Theocracy

Internally, developments were far less promising. Faced with near-universal revolt, the Shah of Iran fled to Egypt even as troops were staging to arrest and imprison him. Within two weeks a Theocratic regime led by the once-exiled Ayatollah Khomeini had seized control, and Iran joined the ranks of those hostile to Imperial control.

Harsh laws, based on ideology instead of democratic principles, were implemented almost daily. For the rest of the year, Iran would be in turmoil as the new state sought to override the protests of those disenfranchised under the new regime; in November, terrorists seized the Imperial diplomatic headquarters, taking over 100 hostages, in protest at Imperial “meddling” in the Middle East.

The promise of an African Peace

African developments at least showed the possibility of peaceful outcomes to ongoing problems. Nationalist troops aided by Tanzanian soldiers drove Idi Amin from office in March, reestablishing normal relations with the Empire, while in Zimbabwe the parliament voted overwhelmingly to support the enfranchisement of a predominantly black government. The two-year plan for African Black Unity had not been accepted outside the Rhodesian borders, thanks in part to opposition from within the Empire (read: the Civil Service / Peerage), but the developments in Uganda suggested that this was more because it was ahead of its time than because of any real impracticality.

Photograph of the Three Mile Island nuclear power generation station. The reactors are in the smaller cylindrical buildings with the rounded tops. Photograph by the United States Department Of Energy, 1979. Click on the thumbnail to see the full-sized image.

The march of progress

In October, the Imperial Health Office declared that after a 22-year campaign, smallpox had at last been eradicated. In hindsight, this was the height of irony; just as the age of science appeared to be drawing to a close, it had begun delivering on the promises it had made.

Unfortunately for the increasingly polarized society, popular sentiment was more in tune with the panic created by a minor failure at the Three Mile Island nuclear reactor in the US; the radiation dose from the material that leaked was less than that received from a dental x-ray, or from 8 hours of television viewing, but these facts did nothing to quell public hysteria.

The Tabloid Media

This event was a turning point in journalism within the Empire, marking the emerging rise of sensationalism over substance as a guiding principle. While the experts recognized that the public trust won by Woodward & Bernstein and others of their ilk had been betrayed, the integrity of the news media discarded in the choice of flash over substance, this realization would be slow to come to the public at large. The media barons – Peers all – had in effect seized control of the public, and through the public, the branch of the government designed to keep them in check. The Empress’ task of regaining control of her Empire had been made that much harder.

She still controlled the courts (though the judicial process had been at least partially derailed by the application of money and the prospect of rewards of privilege and peerage), and she still controlled the Military (who were dependent on the Peerage for supplies and armaments). But without an independent Media, the Peerage would tell the public what to think – and Public Opinion would tell the Lower House to support the true Peerage position (the Upper House would often adopt a seemingly antagonistic position, arguing over trivial details, while the substance of what they wanted came to pass). With both branches of government united, policy was now the province of Big Business. The descendants of the Barons had at last won the battle with the Throne.

Or so they thought.


With The Right Seasoning: Beyond Simple Names


This entry is part 4 of 11 in the series A Good Name Is Hard To Find


Welcome to “part 3a” of this series on names and naming things – and finding the right choice. Today’s post was actually intended to be part of the previous entry in the series, but the subjects of Mononyms (got it right this time, thanks again elijah!) and bi-structured names just sort of grew… a lot.

So, we’re still talking about Name Structures, and there is still a lot of ground to cover, so let’s dive right in…

Tertiary Names

In our society, Tertiary names come in three principal varieties: Middle Names, Maiden Names, and Addenda. This barely scratches the surface of the potential value of such names.

Middle Names

With increasing populations and rising levels of communication, two names can become insufficient to identify a specific individual. How many Paul Smiths are there in the world? How many John Jones? The practice of Middle Names usually begins in those with sufficient prestige that many members of the one family are known publicly throughout the land – the aristocracy, the wealthy, and the nobility. To preserve and utilize the prestige that past family members have accumulated, these often have very similar Christian Names and (of course) the same Surname – so some means of identifying two different individuals within the family becomes necessary, especially since these groups tend to have greater longevity and hence a greater probability of two like-named individuals being alive at the same time.

Another way of looking at this trend is that as Christian name choices become relatively constrained, the flexibility and freedom that most citizens enjoy with respect to Christian names needs to be transferred somewhere. It follows that in important families, most of the advice concerning choice of Christian Names in the previous article actually applies to the Middle Name of the individual, while the Christian Name becomes an adjunct to the Surname.

I once read – and I no longer recall where, so unfortunately I can’t cite the reference – that it was only in the 20th century that middle names became routine and common. That, if true, simply speaks to the power of Christian names as a means of unique identification, especially when coupled with an address or locality. Even now, it is not all that common for people to emphasize all three of their names – though the trend would be for this to become more common in the future if the population continued to increase.

Ethnic Alternatives
Middle names are not the only solution; they are principally a Western-society approach to the problem. Chinese Names, Arabian Names, and (some) Indian Names use an entirely different approach, for example. In fact, this seems to be an excellent place to point to the excellent series of Wikipedia articles on Ethnic Names, which I wish I had discovered many years ago (assuming that it existed then)!

In particular, the Chinese approach to naming reflects the dangers inherent in using English as a cultural basis for assessing the limitations of language. Because the Chinese written language contains so many characters (3,000-4,000 in general usage), they will not reach the point of needing additional names beyond their current three-character (three-syllable) system for centuries, even if their population growth were to continue unchecked.

Nevertheless, the majority of our readers – and of game settings – are Western in derivation. So this series will continue as though the Western approach is the ‘natural’ solution, even though I – and now you all – know better.

Middle-Name emphasis

That means that, within the context of a general population level, it is possible to infer things about a character simply from the emphasis he or she places on a middle name. In any pre-20th century westernized setting, emphasizing a middle name is a mark of arrogance. Where it may be necessary as a point of identification, it would be more common for characters to reduce the middle name to an initial, and this continues in formal address to this day – my bank uses this format, for example, to refer to me. Consider the (fictitious) name of Patrick Jonathon Bellweather, which I will be using as an example throughout this section: ignoring the middle name and reducing the Christian name gives a fairly typical name, “Pat Bellweather”. Slightly more formal is “Patrick Bellweather”. More formal again (in a modern context) or – perhaps – more rebellious, is “Patrick J. Bellweather”. This same name, in a sixteenth-century setting, carries a distinct overtone that is diminished or lacking completely in the modern context.

Another approach, especially where first names are controlled by inheritance issues and eccentric demands, is to reduce the Christian name to an initial and to use the middle name as the Christian name. This conveys the same overtones of wealth and authority, but without the same level of formality. Compare “Patrick J. Bellweather” with “P. Jonathon Bellweather”. Because this particular approach is no longer as popular as it once was, modern usage carries overtones of a traditional formality, while it would not be all that remarkable 150 years ago.

In particular where one name is Unisex (or has a masculine equivalent that is only different in spelling, if at all), these approaches were often used by women to disguise their gender when participating in male-dominated fields of activity, especially literature and science at the turn of the 20th century and even all the way through to the 1950s and 60s.
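For those who like to keep such things in their prep notes, here is a quick, throwaway sketch of the formality variants just discussed, in Python. The function, its labels, and the nickname parameter are my own invention for illustration; only the renderings themselves come from the discussion above.

```python
# A quick sketch of the formality variants discussed above. The labels and the
# nickname parameter are invented for illustration; the renderings are those
# from the article ("Pat Bellweather" through "P. Jonathon Bellweather").

def name_variants(first, middle, last, nickname=None):
    """Return the renderings discussed above, keyed by rough tone."""
    return {
        "casual":    f"{nickname or first} {last}",     # "Pat Bellweather"
        "standard":  f"{first} {last}",                 # "Patrick Bellweather"
        "formal":    f"{first} {middle[0]}. {last}",    # "Patrick J. Bellweather"
        "old money": f"{first[0]}. {middle} {last}",    # "P. Jonathon Bellweather"
    }

print(name_variants("Patrick", "Jonathon", "Bellweather", nickname="Pat"))
```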

The Impact Of Culture

The preceding examples make clear the extent to which cultural attitudes can impact names and naming conventions – and hence, the capacity of a given naming convention to reflect a character’s social and cultural background. The name is all about where the character is coming from – his or her reaction to those origins is a key component of the character’s personality.

Equally importantly, by placing a group of characters and their current circumstances within a visible social context, a GM can generate naming conventions and give them an original context simply by persistent usage, adding to the uniqueness and verisimilitude of an original society within his game. The preeminent exponent of this approach remains J.R.R. Tolkien, with his many imitators walking in his footsteps. The Lord Of The Rings and The Hobbit employ this approach throughout, and so accustomed is the human mind to detecting such nuances that we don’t need to be told that “Aragorn” and “Arathorn” are related – the names themselves do most of the work, all we need is the specific relationship. This is also true of the Halflings, the Dwarves, the Elves, and even the Rohan – names are used to bind them collectively into a cohesive social entity within the stories.

In modern times, naming conventions and name sources have become so homogenized that this approach leaps out at the reader, almost hitting him over the head with the cultural indicator to make sure that he doesn’t miss it.

Maiden Names & Regnal Names

And speaking of the impact of culture, consider the practice of Maiden Names and Regnal Names. Both use a change of name to symbolize a change of social status – whether that be by marriage or ascent to a throne (Civil or Ecclesiastic).

Prior to the mid-20th century, the change of name on marriage was symbolic of the domination of women in society by men. In the course of the latter part of that century, this attitude was challenged by extremist proponents of Women’s Liberation, but even as they did so, the social convention was changing. These days it is viewed as a commitment to the union, not a gesture of submission; and some of the practices discussed below in “Decadent Naming Structures” such as hyphenating the surnames have also become accepted practice, as has the option of the woman retaining her maiden name.

It is one of the most obvious examples of using naming conventions to invert the traditional practice for a matriarchy. It was thinking about this that led, in part, to the “syllable exchange” of Ullar’s society (whose naming conventions were discussed in the previous part of this series).

Addenda as Tertiary Names

Patrimony, Lineage, and Ancestry are often displayed through tertiary names, and in far more traditional ways than simply playing around with Middle names. In fact, western society has three means of doing so, and many game cultures import a fourth from other sources.

‘Jnr’ is applied when a child has exactly the same name as a surviving or famous ancestor – most commonly father, but sometimes applied to a grandparent or older relative of sufficient fame. It is implicitly (and sometimes explicitly) coupled with ‘Snr’ for the elder – so it is possible for three generations within a single family line to have exactly the same “primary name” without confusion (Senior, no suffix, Junior). Rarely, “Senior” is used on its own, usually denoting a case in which the younger generation has become famous despite coming from (relatively) common roots.

One of the easiest (and perhaps, best) ways of giving a society a different feel is to preserve (and make more common) these practices, but translating the suffixes into the language of the society or using a synonym such as “elder” or “the elder” – effectively, using the tertiary name as a title. Titles are a subject we’ll get to in a little while!

Junior imparts a sense of youth, innocence, and even naivety to a name, while Senior imparts a sense of seniority, maturity, and even gravitas. Compare “Patrick Bellweather Jnr” to “Patrick Bellweather Snr”, picturing the image that each name brings to mind – it doesn’t matter if “Jnr” is 81 years of age (with a still-older father), the first image most people will have is of a youngster, early-20s or less – while “Snr” brings to mind a middle-aged man.

The third approach that is common is to employ numbers. Where this familial naming convention extends beyond two generations (or three at the most), this is the accepted practice. This is a technique for demonstrating lineage that avoids the immediate connotations of “Junior” and “Senior” while implying a larger family history. “Patrick Bellweather IV” could be of any age – but the emphasis placed on lineage reeks of old money and family history, even as the (relatively common) primary names indicate working class roots. The name itself is a capsule history lesson.

The final approach that fantasy cultures often assimilate from other sources is the use of ‘bridging words’ to tell the story of the family in condensed format – the equivalent of “son of”, or “of the”.

All these are useful ways to reinforce character descriptions, adding to the backstory of a character without wasting time on descriptive irrelevances – a shorthand approach, if you will.

The Lack Of Female Equivalence

Once again, the male-female social dichotomy that is part of western history has an influence, in that there are no female equivalents to Jnr and Snr. In part, that’s because the female was expected to change her name when she married, but it is also in part due to inheritance precedents, which generally favored males. Even money bequeathed to a woman was likely to be actually placed under the control of a nominated male administrator, be it a brother, an uncle, a legal representative (conservator or trustee), the local priest, a family friend – almost anyone short of a passing stranger, really. There have even been a few cases where celebrities and trusted political figures have been named as trustees without ever having met, or known of, the decedent or heir. In general, these are refused, though legend & rumor have it that a few have accepted – but it is equally possible that these are examples of Hollywood scriptwriting!

The general method of distinguishing “Marie Obatelli” from her mother remains with another change that occurs with marriage, the change of title. One is referred to as “Mrs.” while the other is “Miss”, “Ms.”, or uses no title at all. It is also relatively uncommon – for the reasons espoused in the preceding paragraph – for female children to be given the same middle name as their mother, thus using the middle name for its purest purpose. If the use of “junior” and “senior” had remained as popular in the modern day as it was 50-60 years ago, it seems virtually certain that some female equivalents to those terms would have entered the lexicon, but the fading from popularity of the masculine terms left little demand for the creation of a feminine version.

Any society with anything approaching gender equality and gerontocratic tendencies (rule by the elderly) – such as most fantasy Elven cultures – would either forbid direct name inheritance, have some other naming structure, or would need both male and female equivalents of “Junior” and “Senior”.

Bridging Words

The use of bridging words is not all that uncommon. Spanish has “de la”, which means “of the”, or just “of”. “De” and “Du” are also “of” in French, and prepended to many surnames, as is the Italian “di”. “De” also recurs in Portuguese. German for “of” is “von”, and I’m sure it is immediately recognizable as a part of names from that part of the world, as is the Dutch “van”. Finally, the Irish “O’” – as in “O’Brien”, “O’Kelly”, and so on – and the Scottish “Mac” serve the same purpose, meaning “descendant of” and “son of” respectively.

Some cultures use patronymics for daughters as suffixes – these are the syntactic equivalent of bridging words. The Scandinavian nations are especially prone to this practice (the Icelandic “-dóttir”, for example).

There are an almost-unlimited number of relationships that can be acknowledged through bridging words; the only restriction is the imagination of the GM. These should always reflect the society in which they are found (or vice-versa) – a traditional meritocracy might well have “student of” and “teacher of” as bridging words! They won’t look so strange when they are translated into an appropriate language – though these will usually yield polysyllabic results, and if there is one thing all the real-world examples have in common, it’s that they are short.

Monosyllables tend to be the early words in a language, expressing things that are fundamental to the lives of the primitive cultures from which they derive or that they judge important – so the use of bridging words in this way implies a fundamental trend in their history toward valuing the relationships described by the bridging words. Anything that is too long would be eventually “worn down” by regular usage. Take “Student” and “Teacher” – in most languages, these are immediately recognizable to English-speakers when translated. But Icelandic offers “Nemandi” and “Kennari” as translations. “Nem” and “Kari” would be entirely appropriate “condensations” of such roots after centuries of usage – “Jon nemMagnus Eriksson” would be “Jon, son of Erik, student of Magnus”, while “Magnus kariErik Vigfusson” would be “Magnus, son of Vigfus, teacher of Erik”. This usage also suggests a one-to-one apprenticeship system similar to what many fantasy games have for Wizards.

By all means, strive not to be literal. By far the easiest way to simplify a relationship to a monosyllable is to use a metaphor prior to translation – “light of”, “fire of”, and the like will work well in just about any language. “Jon Eldur Erik” works quite well (“Jon, fire of Erik”), as do “Louis foc de Vega” (Louis, fire of Vega), “Helena luz de Ruiz” (Helena, light of Ruiz), and “Marcel lumi Versoire” (a condensation of the French for “light of”).

An Orcish Diversion
Just for the practice, let’s try applying these principles to an interpretation of Orcish society – even though it means briefly skipping ahead to some of the content from later in the series on manipulating languages.

Orcish male names would tend to be simple and violent in nature, and fairly guttural. “Crush” and “Kill” and “Axe” and “Blade” and “Make bleed” and the like. The most guttural languages are things like German and Russian and Hungarian. Just because we haven’t mentioned them before, let’s go with Hungarian as the basis for our fantasy Orcish and alter the words as necessary/desirable. Orcish female names would be more prosaic, and probably related to other natural phenomena that the Orcs encountered, like “running deer” (somewhat Amerind in flavor).

There are two ways children can be perceived within most Orcish societies: As weapons to be hurled against the enemy (sons), or as shields against time that will breed more weapons (daughters). [Side-note – this immediately suggests that the women are the keepers of culture, craft, treaties, records, and the like. It is arguable whether or not – in light of this side-note – inheritance would be through the mother (the stay-at-home keeper of the culture) or the father (very Nordic, always looking for trouble somewhere). The best solution when this is the case is to try it both ways and see what looks best. Or perhaps to take a third choice: daughters acknowledge Mothers, sons acknowledge Fathers. I like this option, so that’s what I’ll choose.]

So: “Kill, blade of Sword”, becomes “Oldmeg Penge Kard”, which we can simplify to “Oldeg pengKard”. “Sunshine, shield of Flower” becomes “Napzutes Pajzsa Virag” which we can simplify to “Napzutes ZsaVirag”. Both sound like perfectly acceptable names, and furthermore, names that seem to have a cultural depth and realism behind them that is otherwise hard to convey, especially in so short a statement. You could waffle on for five or ten minutes of narrative about Orcish society without it sounding anywhere near as convincing.
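For GMs who like to automate their prep, here is a minimal Python sketch of the same compose-then-condense trick used in the last two sections, built around the Icelandic-flavored examples above. The table entries reuse the condensations from the article; the helper names, the fusing rule, and the rough patronymic are my own assumptions, not a linguistics tool.

```python
# A minimal sketch of the "compose, then condense" naming trick described above.
# "nem" (from Nemandi) and "kari" (from Kennari) are the article's own
# condensations; everything else here is invented for illustration.

BRIDGES = {
    "student of": "nem",
    "teacher of": "kari",
    "fire of": "Eldur",   # already short enough to stand alone, as in "Jon Eldur Erik"
}

def bridged(relation, other):
    """Attach a bridging syllable to the related name."""
    syllable = BRIDGES[relation]
    # condensed (lower-case) syllables fuse directly, as in "nemMagnus";
    # whole words stay separate, as in "Eldur Erik"
    return syllable + other if syllable.islower() else f"{syllable} {other}"

def patronymic(father):
    """Rough Scandinavian-style patronymic; real usage would use the genitive form."""
    return father + "son"

# "Jon, son of Erik, student of Magnus"
print("Jon", bridged("student of", "Magnus"), patronymic("Erik"))
```

Swap in whatever language and relationship words suit your own culture; the table is the part that does the world-building.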

Decadent Naming Structures

When you are a person of influence, you tend to marry other families of significance. And, when you are a person of influence, you occasionally need to remind people of the power and authority at your command. Using your name to do so is one of the more subtle techniques available, and one that is open to anyone – whereas philanthropic personalities can’t readily employ ostentatious displays of wealth. The latter also tends to work against credibility in business negotiations. So there is a continual pressure amongst the wealthy and powerful toward what I describe as “decadent naming structures”.

Hyphenated Surnames

The most common approach employed in the early-to-mid 20th century was the hyphenated surname. With better communications and the advent of the PR machines, this has become less needful in more modern eras, but prior to the rise of Television for the masses it was frequently the best approach for advertising an overt connection between two major power-blocks.

To see how effective this is, let’s try adding a couple of hyphenated names to our usual test subject, “Patrick Bellweather”. Picture the character that is so named in your mind, and then compare that image to:

  • “Patrick Bellweather-Rothschild”
  • “Patrick Bellweather-Hilton”
  • “Patrick Carnevon-Hughes-Bellweather”

Now, if confronted with one of those hyphenated names, how would your impression of the Bellweather family change? That’s right, all of a sudden the entire family is given a degree of cachet and significance that is beyond the reach of someone who is just a “Bellweather”.

The Significance of Hyphens
The hyphenation indicates that the character is important – but what does that actually mean? In our culture, that of Western Europe, the significance is attached to wealth or political power, because those are things that we value – even if that is only at the insistence of those who possess wealth or political power. In a different culture, it would be expected that those values are different. Wisdom, physical strength, athletic prowess, even seductive capacity and hedonistic appetite – choose something appropriate to the culture that you have created.

Extended Names

The wealthy of some other cultures take the principle of hyphenated names a step further, and use their names to tell a story. The use of bridging words in names is a sort of ‘watered down’ version of this practice, though I have no idea if there is an actual cultural connection. Traditional middle-eastern cultures are strong proponents of this naming practice.

For example, consider “Muhammad Saeed ibn Abd al-Aziz al-Filasteeni” – which translates to “Praised Happy, grandson of the Palestinian slave of The Magnificent”. Muhammad (Praised) is the name of the character, Saeed (Happy) is the name of his father, Abd al-Aziz (Slave of The Magnificent, which is one way of referencing God in Islamic cultures) is his grandfather, and the family are Palestinian in origin. So what we have here is the grandson of a Priest.

More than anything else, this shows the extent to which religious thought dominates the society in question – the most important thing about the character is not anything he might do, nor any achievement of his father, but that he had or has a grandfather who dedicated his life to God. This verges on the obsessive, in western eyes – which fits our perceptions of the culture in question.

Extended Names In Games
By now, you can probably predict what I’m going to say. To employ this technique for characters in a game, simply pick something that your society obsesses about, compose a relevant description, then use an appropriate language to develop names that reflect that society.

Dwarves are probably the perfect example of this approach, obsessed as they are with mining and the earth.

So, “Son of the brother of the digger of silver on the western slope of Mount Implacable”.

Trying that phrase in different languages eventually turns up Basque, where it reads “Zilarrezko Digger anaia Son mendiaren Implacable mendebaldeko malda”.

With a little tweaking, we get “Zilarrezko mugitzeko lurra anaia gizonezko umea mendiaren ez da gelditzen mendebaldeko malda”. Now, that’s a dwarvish name to reckon with!

In ordinary usage, this would be “Zilarrezko iloba gizonezko”, or “nephew of the silver man”.

Sounds pretty good to me.

Abstract and Descriptive Names

Of course, there are a lot more things – and people – that need names. Supervillains and heroes pose a particular challenge, as do things that people name – supernatural monsters, ships (both naval and star-), organizations, places, projects – and adventures. These are all slightly different problems, and some people have trouble with them. In general, they can all be categorized as ‘Abstract names’ or ‘Descriptive names’, and that is what the following sections are all about.

Naming Superheroes & Villains

Superhero and villain names are all about projecting a dramatic identity in a single word – usually a noun, sometimes a verb, and sometimes accompanied by a title. The simpler and less intelligent the character, the simpler and more straightforward the name usually is – but sometimes names are bestowed by the media, so this is not always a reliable guide.

More intelligent characters have a wider palette to draw upon, and some choose a nom-de-plume which represents a subtle in-joke (which they never explain, but which makes them smile every time they hear it – useful for endearing yourself to the media). Others like to subtly reference their powers without blatantly advertising their nature. An example from my superhero campaign is “St. Barbara”, named for the patron saint of artillerymen, rocket scientists, pyrotechnicians, and all others who deal in high explosives, and who wields explosive energy beams (amongst her other abilities).

Others reference abstract qualities that (they believe) they represent, or national values, or simply have names that sound “cool” or “threatening” or whatever the image is that they want to put forward.

And then you get the really clever ones, who deliberately use their identification to mislead others as to the nature of their powers so that their enemies will underestimate them, or prepare defenses against the wrong things, or simply be steered away from some weakness that they would really rather not see exploited. One obvious example is a former PC in the supers campaign prior to the original Zenith-3 campaign, who went by the name of Behemoth to disguise the fact that he was both the gadgeteer and the brains of the team. It didn’t work so well when they became famous, but it did give him a decided edge in his early adventures.

So, how do I choose a superhero or villain name?
Taking into account the intelligence level & creativity of the source of the name, I start by considering the various factors and approaches listed above and choose the one that seems most appropriate.

Once I know the naming philosophy that the character is to embody, I can start listing possible names that express the concept of the character in the appropriate manner.

I then employ a thesaurus to find synonyms for all those potential names which are added to the list of potential options. I’ll also do a web search and check Wikipedia for more ideas.

Once I have about 10-12 items on the list, each new possibility gets compared to those already on the list; unless it is at least as good as those I already have, it gets left off. If it’s noticeably better, it replaces one of the lesser choices.

Unless the character is being named by the media or some other English-speaking source, the next step is to translate the list of potential names into the native tongue of the character.

Finally, I’ll go through the final list of contenders, one by one, assessing them for drama, pronounceability, and “appropriate overtones” – the subtle qualities that distinguish a workable name from an inspiring one.

(I was going to insert an example at this point, but this article is running a little late, so we’ll take that as read).
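In lieu of that worked example, here is a small sketch of the bookkeeping behind the shortlist step in Python. The scoring itself stays a matter of gut judgment; the function name, the sample names, and the cap of 12 are purely illustrative assumptions, not a tool from the article.

```python
# A minimal sketch of the shortlist step described above: the first dozen or so
# candidates simply fill the list; after that, a new name only gets on by
# displacing a weaker entry. Scores are whatever gut rating (say 1-10) you give
# each name for drama, pronounceability, and overtones.

def update_shortlist(shortlist, candidate, score, max_size=12):
    """shortlist is a list of (name, score) pairs; returns the updated list."""
    if len(shortlist) < max_size:
        shortlist.append((candidate, score))
        return shortlist
    worst = min(shortlist, key=lambda pair: pair[1])
    if score > worst[1]:                # better than the current weakest? swap it in
        shortlist.remove(worst)
        shortlist.append((candidate, score))
    return shortlist

names = []
for candidate, score in [("Behemoth", 7), ("St. Barbara", 9), ("Crusher", 4)]:
    names = update_shortlist(names, candidate, score)
```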

Ships & Starships

These are named for those who have historically represented the ideals of the operators, for those who have commissioned or constructed or even designed the vessel (or their relatives or pets), or for abstract qualities that are generally or ideally symbolic of those ideals.

For a military vessel, that generally means a famous captain or admiral, a shipyard/dockyard, or other great city, a ruler or member of the aristocracy, or an abstract quality that reflects other naval traditions or ideals or the specific military role of this particular vessel.

Merchant vessels are named for famous traders or merchants, the trade routes taken by those traders, the merchandise that the vessel is to carry, a figure with whom the owners wish to curry favor, a city with a great trading history, etc. They tend not to be named for abstract qualities, as these are not considered all that attractive, but many are also named after wives or girlfriends or pets, or qualities like luck that the owners hope they will enjoy.

Ships of exploration are generally named after famous explorers, after those who have commissioned or funded the expedition, for the abstract qualities of discovery or endurance, or for a family member of the owners.

Pirates, on the other hand, like booty, bawdiness, alcohol, freedom/liberty, and intimidating others. Their ships are often named accordingly, though sometimes they will choose a name that lets them claim innocence – at least in the eyes of a nation they wish to become a privateer of. All that having been said, it’s surprising how often a ship will be named for the figure nailed to the prow! (One of our regular readers, Ian Mackinder, runs a 7th Sea campaign; and, in the past, he’s run Traveller and Star Trek and Klingons, so between them he’s worked with vessels in all of these categories. I’m sure he’ll enlighten/correct me if I’ve left anything significant out.)
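If you want those categories handy at the table, a tiny random-table script covers it. Every entry below is an invented placeholder; the structure (one list of name sources per vessel role) is the point, not the specific names.

```python
# A throwaway random-table sketch for ship names, organized by the vessel roles
# discussed above. All of the names in these tables are invented placeholders;
# replace them with entries drawn from your own setting.
import random

SHIP_NAME_TABLES = {
    "military":    ["Admiral Hawke", "Port Reliance", "Indomitable", "Duchess of Kells"],
    "merchant":    ["Silk Road", "Marianne", "Golden Wager", "Pride of Carthport"],
    "exploration": ["Magellan's Heir", "Endurance", "Lady Abernathy", "Far Horizon"],
    "pirate":      ["Red Jenny", "Devil's Due", "Honest Trader", "Figurehead's Grin"],
}

def roll_ship_name(role):
    """Pick a name at random from the table for the given vessel role."""
    return random.choice(SHIP_NAME_TABLES[role])

print(roll_ship_name("pirate"))
```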


From the other direction
I want to close this subsection by mentioning an idea presented in “My Enemy, My Ally” by Diane Duane (now getting hard-to-find, I’m afraid). The Romulans in this Classic Star Trek novel take the concept of sympathetic magic and apply it to the names of Starships, believing that the ship’s subsequent history will reflect and be shaped by the name and the qualities it symbolizes – for example, the crew of a ship named the “Intrepid” will forget to feel fear, and so on. For this reason, Romulan vessel names are derived from a specific animal or weapon or equivalent rather than for a general quality, which can overpower all common sense.

Next Time

Whew, out of space and time already! In the next part of this series, I’ll be talking about

  • Naming Places (including Inns and Castles)
  • Alien & Non-human names
  • How to create an Alien Language
  • How to appear to create an Alien Language
  • The Emphasis Of Inheritance
  • Fashions In Naming, and
  • The Importance & Usage Of Titles

Look for Part 5 of “A Good Name Is Hard To Find” in two weeks’ time…


The Imperial History of Earth-Regency, Part 7: Disintegration And Repair – 1973-75


This entry is part 7 of 12 in the series The Imperial History of Earth-Regency


Pieces Of Creation is an occasional recurring column at Campaign Mastery in which Mike offers game reference and other materials that he has created for his own campaigns.

All images used to illustrate this article are public-domain works hosted by Wikipedia, Wikipedia Commons, or derivations of such works, except for the “Civil Service Uniform” illustration which is by Me!
 

Continuing the alternate history right from where we left off in the early 70s, these three years were a watershed period in Imperial History. Problems were solved (at least in theory), new problems arose, consequences of old solutions became new nightmares, and some solved problems became unsolved. Some situations improved, but others became worse than was thought possible. An era of instability and bloodshed, it began a transfiguration of the administrative and social concepts at the very heart of the Empire.

General Augusto Pinochet, photo courtesy Archivo Clarin Argentina, taken 19 Jan 1978

1973

’73 was a case of same old same old. Israeli fighter planes shot down a Libyan 747; Palestinian terrorists murdered the Imperial representative to Sudan; Israeli Commandos invaded Beirut and killed three Palestinian guerilla leaders; a violent military Junta in Chile seized power in the name of General Pinochet; Egypt and Syria attacked Israel in retaliation for the Israeli aggression earlier in the year; Israel declared war over Imperial protests, leading eleven Arab nations to cut oil production and raise prices in protest over the perceived Imperial support for Israel; a military coup in Greece ousted the established Government; and the Invasion of Afghanistan continued to flounder.

Profile photo of Richard M Nixon for official portrait, taken 2 sept 1970

Watergate

Irregularities in the conduct of the recent USK† elections led to the impeachment of Prime Minister Nixon. The “Watergate” affair had first come to light in late 1972 thanks to the investigative work of two reporters, Woodward & Bernstein. It was widely believed that they had an inside source, who came to be nicknamed “Deep Throat”.

Some speculate that “Deep Throat” was a member of Imperial Intelligence who, under the direct orders of the Empress, leaked enough details of the bungled attempt to bug the opposition’s political offices to bring the Prime Minister down – a covert action to bring to heel another US head-of-state before he became an embarrassment to the Monarch. It was not the first time that untoward political events in the USK would be attributed to the Monarchy (refer “John F. Kennedy” in the previous chapter), and it would not be the last event interpreted in this way by conspiracy theorists.

†USK = “Kingdom Of The United States Of America”. Refer previous chapters of the series.

Northern Ireland

Attempts were made to reach a peaceful settlement of the unrest in Ireland, producing a government that would be representative of all the interested parties. This settlement had been approved by a general referendum in Ireland but only 59% of the populace had bothered to vote, despite it being compulsory under the law. At the first attempted sitting of the new Government, proceedings were disrupted by militant Protestants.

Map of Chile, Image by Scansculotte

The Chilean Revolution

Pinochet’s coup was accompanied by a new approach to the problem of relations with the Empire: the Chileans simply ignored it and waited for it to go away. There was no dramatic ultimatum, no attempted secession, no request for recognition; in fact there was no recognition of Imperial authority at all. If this tactic had been attempted a year or two earlier, it might have posed a significant challenge; but the Pakistan resolution had provided a precedent, which the Imperial authorities proceeded to implement – give them what they want to such an extent that they don’t want it any more.

The instructions given to the various civil servants involved were concise and specific – all Chilean citizens had to obtain citizenship of another country before they would be acknowledged. No trade licenses or agreements with Chile would be honored. No payments promised to the Government of Chile would be honored. No defense of Chile would be mounted in the event of aggression – which had some of Pinochet’s neighbors salivating. No Chilean border controls would be recognized. All expatriate Chileans not seeking citizenship in another country would be deported – with a full and specific explanation of the reasons. The entire country was up for grabs if anyone wanted it; and if Chile fought back, the object of their military action would receive full Imperial Military support.

Within a week, troops were approaching the (former) Chilean border on three sides, and a Chilean Ambassador was desperately trying to arrange diplomatic talks about the Chilean membership of the Empire with diplomats who refused to acknowledge their existence. When the invading ground forces were only minutes from entering Chile, the diplomats got their meeting… at which point, their options had narrowed to two: either petition for recognition, on any terms the Empire chose, or cease to exist as a nation.

Pinochet chose the former, and almost immediately reports of human rights violations began to surface; Chile was consequently downgraded in Imperial Membership repeatedly, and more direct controls were put in place by the Empire. On readmission to the Empire they had been assessed as a Class VII nation; by the end of the year, they were a Class X, and Imperial Administrators had taken control of virtually all public functions.

Pinochet repeatedly failed to learn from his mistakes, however, and continued to take his displeasure out on the public; Chile remains a backwater in the Empire to this day (2055).

Dirty Politics in the Aegean

The Greek uprising presented a more difficult problem.

This was not exactly a fringe nation; while not in the Heart of the Empire, they were not far removed from it. In addition, Imperial Intelligence had uncovered the fact that the coup was secretly sponsored by the opposition to the established government with the deliberate intention of weakening their control over the country, and thereby improving the opposition’s chances of gaining office legitimately, by forcing a downgrading of Imperial Citizenship – ostensibly as a consequence of the current government’s policies. It would be months before the Empire decided how to react to this attempt to take political advantage of its policies by undermining a member in good standing within its borders.

The Three-Day Week Experiment

The year ended as industrial troubles deepened throughout the Empire. While the lessons of the Coal Strike of ’72 (refer ‘1972’ in the preceding chapter) had not been lost on the leaders of the Unions concerned, they could and did authorize go-slows, ‘blue flus’, and other more passive disruptions of their industries. If the Government were to accede to the union demands, it would produce an inflationary spike that would almost certainly plunge the Empire into a recession; if they did not, the cumulative impact of lost production would produce a deeper, longer-lasting recession after a lengthier waiting period.

Suddenly, all the new policies which had seemed to meet the challenges being posed to Imperial Government were being shown to be lacking.

In desperation, the British Prime Minister imposed a three-day working week in a bid to cushion the impact of granting the pay rises demanded; but it was widely recognized for what it was, and the Stock Market continued to decline. After only three months, the shorter working week would be rescinded.

Ein Fiq, Golan Heights - photograph by Roybb95 taken 23 Feb 2004. Click the thumbnail for a larger image.

1974

Imperial Diplomats made further headway in the Middle East, persuading Israel & Egypt to accept a peace plan separating their forces on either side of the Suez Canal. The Israeli Prime Minister would later resign when his constituency became bitterly divided over both the War and the acceptance of the peace plan. The Empire monitored the dissension between hardliners and moderates closely, and drew up tentative plans to further reduce the Israeli citizenship status if it escalated, which would have the effect of placing the military and police forces under direct Imperial control.

This was acknowledged as a move to be resisted if possible, as it would undoubtedly spark terrorist reprisals by hard-liners; but, argued the supporters of the plan, it could also have the effect of taking much of the wind out of the Arab nations’ sails, and might ultimately reduce the level of bloodshed. They were all guessing, really – tapdancing through a policy minefield at the edge of a cliff in total blackness.

The caretaker Government continued to pursue diplomatic resolutions in a bid to stave off this new indignity, and in May reached a settlement with Syria over their territorial claims in the Golan Heights. When the new government was elected in mid-year, they found their choices to be an acceptance of this fait accompli or of being perceived as responsible for their nation becoming a third-world country in the eyes of the rest of the world.

Hatfield Main Colliery, one of many mines whose workers were involved in the 1970s industrial activity. Photo © Paul Glazzard, licenced under the creative commons attribution-sharealike licence version 2.0

A softening of Industrial Relations

In late February, the coal mining industry began a full-scale strike, in a bid for better working conditions and in protest at the heavy-handed actions used in breaking the strike of two years earlier.

Support for the strike was virtually unanimous, and that made the consequences of enforcing the “essential services” policy the equivalent of extending the strike in perpetuity.

One key difference was that this time they waited until the worst of Winter was past, and until sufficient stockpiles had been accumulated to make the action little more than an inconvenience – in the short term. This concession to social responsibility helped garner them public support, or at least reduced the levels of public alienation, and had a marked impact on the Official response; instead of wading in with all guns blazing, the civic-mindedness of the industry was taken into account. The “essential services” decree was softened in consequence, and a moderate pay rise was granted, despite the knowledge that doing so would spark a new wave of wage claims in other industries, and an inevitable rise in inflation.

This was a compromise, there is no question of that; that it was viewed as a just and necessary compromise did not change the impact that it had on the various levels of governance throughout the Empire. Certainty that policy, once established, would be applied consistently was replaced with a seed of doubt. Whatever short-term benefits accrued, the long-term repercussions would outweigh them – but it takes a heart of stone to ignore those in difficulty, and (contrary to public opinion) bureaucrats are not insensitive to the impact of their decisions. Sometimes, it was felt, the long-term pain was a justifiable price for a little short-term relief.

Marcello Caetano of Portugal, photo by UNEO, licenced under creative commons 2.5 general licence

The Portuguese Catch-22

Unfortunately, few people can predict the future with any accuracy, and the true cost of this decision was one that none of the bureaucrats who had made the decision could see coming.

An almost bloodless coup ousted Prime Minister Marcello Caetano of Portugal. This was a direct result of dissatisfaction with the public service; it reflected the growing impotence of elected officials to make substantive changes in domestic policy, and signaled the public’s mounting dissatisfaction with their leaders as a result. But because the coup did nothing to replace the bureaucrats who were now the true masters of the Empire, little changed in consequence.

It should have been viewed as a warning signal of troubles to come, but the reports to the Empress by the Civil Service were so sanitized that the correct interpretation of events was almost impossible. Instead, the report ascribed the coup to dissatisfaction with the politicians’ ability to deliver on their promises.

Prime Ministers Willy Brandt and Richard Nixon, photo by de:Benutzer:Wolpertinger

The Coup is Complete

In Germany, Prime Minister Willy Brandt was forced to resign following a spy scandal, while the military coup in Greece ended with a raid by the Australian-based Imperial Special Forces which captured or killed the leaders of the revolutionary uprising.

The former leader of the Government was then reinstated, and informed that if two conditions were met, Greek Citizenship in the Empire would not be affected. The first was that he regain control of his country – without violating Imperial edicts on human rights or civil liberties – and the second, that he act to address the causes of the internal dissent that had led to the coup in the first place.

This was a new step for the Empire, and completed the cycle of finger-pointing. For any given domestic problem, there was now always someone else whom any involved body could accuse as the source of the trouble. It therefore became impossible to act to correct the real problems, and only superficial symptoms remained addressable. Power had been distributed so evenly through the Empire that no-one had both the authority and the knowledge to change anything. The civil servants were in command.

Chairman Yasser Araffat of the PLO, photo taken in 1974, image provided by www.comunismulinromania.ro/

The Howling Storm

This was most clearly demonstrated a few months later (in retrospect) when the Empire recognized the PLO as representatives of the Palestinian people, and began the process of making Palestine a sovereign nation.

This angered most of the Middle East, to be frank; the rival Arab tribes protested loudly, while the Israelis felt that the entire peace process had been sabotaged and demanded guarantees for their citizens living there.

The Empress was furious; the decree had been made without consultation, and all the Civil Service protests that the recognition was simply to accredit the PLO as local spokesmen, and that the sovereignty of Palestine was merely under investigation as a possible solution, could not disguise the fact that they had shown their hand and usurped the Empress’ authority.

Wheels within wheels

This development placed the Empress in a quandary; the Empire was perceived as too vast, too complex, for it to be administered without a Public Service of some kind, and she had based her entire rule on the premise that while politicians represented the people, and the peers represented industry, they were all essentially amateurs, and the most valuable advice would come from experts recruited into the Service.

Now the Experts and Bureaucrats were beginning to dictate policy, even if only by painting an unflattering picture of the alternatives, while quietly setting the wheels in motion for whatever solution they favored this week. Furthermore, the first suggestions of corruption within the service were beginning to become apparent. She could not abolish the Service; any reform would have to be carried out by the very people it was aimed at; and yet she could not simply do nothing.

At the same time, she knew that there could be no half-measures; this was a battle for control over the Empire itself, and she would get only one opportunity. Until it came, she would have to bide her time and endure the situation – and take advantage of any lesser chances to keep them from further consolidating their power in the meantime.

1975

If there was no solution at hand for these new problems, at least there was progress being made in solving the old ones.

The equal pay and sex discrimination acts came into force at the beginning of the year, removing the last legal race and gender-based inequities. Tough new laws also came into effect that forced industry to bear the responsibilities of the environmental impact of their operations, which would hopefully begin the long process of correcting the Empire’s pollution problems.

Hopes and Hopes Dashed

The stalemated invasion of Afghanistan continued despite tremendous losses. King Faisal of Saudi Arabia was assassinated by his nephew, permitting his son to gain the throne. This was seen as the first good news to come out of the Middle East in some time, tragic though the circumstances were; Prince Khalid ibn Abdul Aziz was widely known as an Imperialist, while his father was suspected of subtly derailing many of the attempts at a negotiated peace in the region. But hopes of a further improvement in the level of turmoil were short-lived; only three weeks later, civil war again tore Lebanon, and especially Beirut, apart. This was another conflict of ideologies, the division between Christian Evangelists and Muslims being at the heart of the conflict.

Nigerian Soldier, photo by SSGT. Paul R. Caron, USAF.

Even Civil Servants Make Mistakes

Nor was the rest of Africa all that peaceful. In July, a bloodless military coup deposed former General Yakubu Gowon as Prime Minister of Nigeria. The Civil Service had completely misread the signs of the impending revolution, and several of the more influential members had publicly backed the stability of the existing Government.

This gave the Empress a small opportunity which she was quick to exploit; she sacked the offending members for incompetence. There was, of course, an immediate response by the Civil Service, who threatened strike action; but the Empress responded by defining the Civil Service as an “essential industry”, and threatened to invoke the Coal Act against any striking worker.

Nor was the Service entirely united over the issue; in particular, those who stood to gain promotions through the dismissal of those sacked were not wedded to the notion of industrial action. It was entirely likely that Elizabeth would be able to maintain basic services with a much slimmer public service, though the influence they had slowly accumulated would certainly be lost.

The legendary 'uniform' of the British Civil Servant - Bowler, Briefcase, Suit, Tie, and Umbrella (Brolly), Cup of tea optional.


A Life In The Service
By now, a career path through the civil service was reasonably well established. After a classical education through one of a small handful of Universities, one would be recruited into a junior position (for which the individual was grossly overqualified), and would then be indoctrinated into the culture of the Service.

As one gained experience and a satisfactory record of achievement (within the Service definition of the term), one would rise through the ranks. As the individual rose in authority, he would be cultivated as a ‘friend’ by various influential bodies. If he chose the right mix of patrons, he would be helped in his career – introductions made, consultancies arranged, and so on. These would, in select cases, lead to a senior official taking the prospect under his wing, and placing him on the “fast track”.

That official would train the individual in the various aspects of the senior bureaucracy, and groom him to eventually become his replacement. Eventually, the up-and-comer would become a senior official himself, or even a department head. He would then serve for a number of years, repaying those who had sponsored him during his rise; he would argue the position of the moneyed interests who had backed him, he would arrange special favors and memberships of various committees for his former superior, and he would, eventually, select and train his own successor.

After a few years in various civic roles, arranged to his benefit by his hand-picked replacement, he would accumulate enough Brownie Points to be nominated for a minor peerage, which would enable him to take his place as a captain of industry, earning exorbitant remuneration. He would then be steered into the House Of Lords, where he would again work through the ranks, gaining seniority as he represented his backers on various committees and review boards, before retiring, an extremely wealthy man with a long list of honors and privileges.

The benefits of a career in the Civil Service were entirely too much to lose in a single stroke, which would be the result of the Empress following through on her threats. Her willingness to do so was uncertain, but the risks were just too great for the matter to be left to chance.

Ultimately, it was better to accept the principle that Civil Servants could be sacked for incompetence – and then to ensure that the review process was entirely within their control. This was not a decisive victory in the undeclared conflict between the Civil Service and the Empress, but it at least threw them into some disarray and pegged back their confidence.

King Carlos I of Spain (image cropped and edited). Photograph taken 1977.

The Rules Change

In October, Crown Prince Juan Carlos of Spain made history by running for the position of Prime Minister – and winning.

In theory, this was forbidden under the rules of peerage within the Empire, but because he was not yet King, he was able to renounce the privileges that came with membership of the peerage, except for the one privilege that he was not permitted to rescind – that he would one day inherit the throne. That day was not long in coming, as his father succumbed after a lengthy illness less than two months later. This blurred the lines between commons and peers, lines which past rulers within the Empire had been careful to delineate very clearly.

It is likely that had the gap between election and ascension to the throne been lengthier, the Empress would have forced him to either renounce the throne or step down as Prime Minister; but the timing was such that by the time a decision had been made, it was too late to do anything about it.

That said, the new King was undoubtedly popular, and hence there were clear advantages in uniting the two roles. He appeared to be that rarest of animals, a popular and enlightened Dictator. Before she passed judgment over the situation, Elizabeth decided to interview the young King. What was said at that meeting is not a matter of public record, but afterwards, the Palace announced that in this particular case the Empress had seen fit to make an exception to the usual rules – this being one of the few areas in which she still had unconditional authority.

King Carlos announced in his New Year’s speech of 1976 that he did not view the two roles as incompatible; he viewed his position as Prime Minister as representing a perpetual vote of confidence in his rule by his subjects. He also explained that there were conditions attached to the exception granted by Her Wisdom, specifically that his position as Prime Minister was not to be construed as a hereditary office, and that should he ever be defeated electorally, he would not stand for elected office subsequently, and that he would accede to the will of the people if his policies were rejected.

The beautiful mountains of Argentina, photograph by Alps. Click on the thumbnail for a larger image.

1976

The events of 1976 had a familiar ring to them.

A bloodless coup deposed Prime Minister Isabel Peron, widow of former PM Juan Peron, over electoral fraud charges, whose investigation she had blocked using the power of her office. Race relations in South Africa continued to deteriorate. The ceasefire in Lebanon collapsed, only to be restored after 6 days of bloodshed that accomplished nothing for anyone. Idi Amin began to show his true colors, restricting the right to vote in national elections in such a way that only his supporters and aides were eligible; in effect, he rewrote the law to make himself Prime Minister for life.
 

Entebbe Airport, Uganda, scene of the hostage drama that unravelled a Dictator. Photo by SSGT. Chris U. Putman. Click on the thumbnail for a larger image.

The Devil’s Choice at Entebbe

Palestinian terrorists hijacked an Air France jet and forced it to fly to Entebbe, Uganda – the first hints of an accommodation between the Amin regime and the Arab rebels. This put Prime Minister Amin in a difficult position; Amin did not want outsiders in Uganda, fearing that they would cause trouble for his regime; but if he did not permit the Imperial Military to deal with this new problem, he would confirm the allegations, and bring down on his head the very trouble that he hoped to avoid.

He took the only way out that he could see, deliberately misunderstanding the instructions he received. He was supposed to prepare accommodations and briefings for specialist Israeli anti-terrorist troops who were to be on-hand to storm the aircraft if the terrorists made unreasonable demands; instead, he “misheard” this to say that he was to use the Imperial Troops stationed in Uganda (and nominally under his command) to storm the aircraft because the terrorists’ demands were unreasonable.

In theory, most of the hostages would have been rescued had the task been left to the Israeli experts; instead, the B-grade Ugandan military botched the operation appallingly, and all but 3 hostages were killed outright. Prime Minister Amin was suitably apologetic afterwards, blaming the poor state of communications equipment throughout Africa and requesting £735 million to upgrade telecommunications throughout the continent in a programme to be administered by Uganda on behalf of the Empire.

The audacity of this request in a fiscally-restrained climate was breathtaking, and initially it served its purpose of distracting the Imperial Bureaucracy from any investigation of the Entebbe massacre; the proposal won considerable support in a number of quarters, especially France and Germany, whose industries would almost certainly be subcontracted for the job. It fell to “The Whisperer” to raise the red flags and begin the downfall of Amin.

The Whisperer

“The Whisperer” was an opinion/gossip column in the London Times which every now and then was the medium of choice for “official” leaks.

The column of July 7th is widely believed to be an example of this usage; it stated that the author had heard a recording of the conversation with Prime Minister Amin prior to the Entebbe disaster which showed no signs of communications problems. It then argued that this suggested that the competence – or lack thereof – of the Prime Minister was more likely to have created any misunderstandings; and then asked the very pointed question, “These people couldn’t organize barracks and lunch for the specialist soldiers being dispatched to deal with the crisis; now they want to organize a 735 million Pound pan-African telecommunications programme?”

With the distraction eliminated from the agenda, and the implication that Amin was lying about the cause of the misunderstanding, the very issue that he had hoped to deflect again took centre stage. Journalists began to investigate his regime with a thoroughness and tenacity that were in the highest ideals of journalism.

Ian Douglas Smith, Prime Minister of Rhodesia. Photograph taken in 1990 by Cliftonian and made available under the creative commons licence 3.0.

A Rhodesian Diversion

This investigation was interrupted briefly by a dramatic development in nearby Rhodesia. For six years, Rhodesia had tried to go it alone as an independent nation in the centre of Imperial-controlled Africa.

On September 24th, President Ian Smith stunned the populace by announcing a two-year plan for transition to Black Rule, and readmission into the Empire as the “Kingdom of Tribal Africa”.

This proposal was almost farcical; the native tribes of Africa were no more capable of union than the Arabs, and for essentially the same reasons. Perhaps in 30 or 50 or 100 years, this might be a realistic goal; but the concept at the current time was sheerest fantasy. Nevertheless, Smith was completely serious.

His concept was to take a leaf out of the Israeli political handbook: Rhodesia would form the central administration for an empire-within-the-Empire; any native African would be eligible for citizenship in the new political unit, in addition to any national citizenship they might retain locally.

In particular, Smith wanted to try and avoid Central and Southern Africa degenerating into the kind of senseless hatred that consumed the Middle East. They were laudable and lofty ambitions, and they generated many columns of interesting newspaper ink, but no-one expected the plan to work.

Bosco delle Querce ('Wood Of Oaks'), the park built to commemorate the Seveso Disaster. Two tanks of contaminated waste are buried here. Photo taken 20 March 2011 by Massimiliano Mariani and licenced under the creative commons 3.0 licence.

The Incident At Seveso

The newspapers always find something to print, but 1976 was an excellent news year by any standards. The ongoing Middle East crisis, the Entebbe crisis and its aftermath, and the Spanish succession were all good for headlines for weeks. As was the first of a new type of story that would become all too common in future years – the ecological disaster.

This story broke less than a week after the Entebbe disaster, and centered on the town of Seveso (near Milan) in Northern Italy, which was devastated by the accidental release of poisonous dioxin gas from a nearby pesticide plant.

Subsequent investigation showed that the pesticide concern had been losing money for some time, and that as a result of cost-cutting, safety measures had been neglected, which permitted an accidental fire to develop into a major catastrophe. This was the first serious test of the Imperial legislation on pollution and if found guilty of placing the environment at risk, several Italian nobles faced potential jail terms.

The prosecution case fell apart when private settlements conditional on a denial of liability were reached with the survivors; to protect their compensation, they refused to testify. In theory, their testimony could have been compelled, but in practice they would have been unconvincing. Also in theory, the nobles involved had committed the even graver offences of conspiracy to pervert the course of justice, bribery, and interfering with witnesses; but unless the survivors testified, those charges would not succeed either, and even then it would be touch and go.

Thus the management received a reprimand, a rap over the knuckles, and a financial loss that was certainly much greater once the legal fees for defending against the charges were included. The newspapers had a field day on several occasions as this succession of scandals emerged.

The Seveso Legacy

There were immediate repercussions to these controversial events. In particular, there were three consequences:

  • the links between the Civil Service and the Peerage, who in turn were directly connected to big business, came in for considerable (and unwelcome) scrutiny;
  • manufacturers throughout the Empire came under harsh scrutiny by both environmentalists and investigative journalists whenever there was a slow news day;
  • and the public discontent grew even stronger.

The events confirmed in the Empress’ mind that the Civil Service now posed a grave threat to the Empire. It had become a recruiting ground for the Peerage, and in the process, had become partisan. She was still unable to force a confrontation, however; the biggest weapons at hand against the Peerage were the labor unions, but using them would only replace one problem with a bigger one, and nothing less would be enough. The wedge she needed was still missing.

1977

1977 was always going to be a significant year in Imperial History. Not only were there still ramifications of ’76 to fill the headlines, but this was the year of the Empress’ silver anniversary. While the week-long official celebrations were not scheduled to begin until June 1, the year would be full of lesser events – dedications, public appearances, and so on. In many ways, this was time the Empress would have preferred to spend addressing the problems still facing her rule, but the calendar could not be denied, and there were undeniable advantages in the longer term to reinforcing public support for her rule.

Photograph of Haile Selassie taken October 1, 1963, during a visit to Washington, by Cecil Stoughton.

The African Dilemma

Those problems continued to be hot news items. Africa continued to disintegrate politically, for example.

In February, Prime Minister Haile Selassie of Ethiopia was murdered. There were three political theories of the crime, none of which were ever disproven; the simplest was that this was a coup attempt that went horribly wrong; the second, that it was a terrorist act; but the third, and the favorite amongst the ever-present conspiracy theorists, was that Ethiopia had been opposed to the Rhodesian “Grand Plan” and had been removed to make way for someone more inclined to the will of Ian Smith.

Only two weeks later, Archbishop of Uganda Janani Luwum, a civil rights activist, was murdered by security forces before he could keep an appointment with a British Journalist; and with each such act, the world edged closer to discovering the truth behind the Amin Regime.

In July, Somali forces invaded Ethiopia in a dispute over the Ogaden area, while in September, the militant Black South African leader Steve Biko was killed while in police custody; but when everyone involved was acquitted by an obviously partisan investigation, it became one violation too many for the Empire.

The ongoing violations of human rights in the Kingdom’s war of oppression against the Native Tribes had been public policy for some time, and tolerance of extremism anywhere on the continent was wearing extremely thin. In November, the Empire prohibited weapons sales to or within South Africa.

Zulfikar Ali Bhutto, photograph taken during a visit to the White House by the White House Press Office.

From Bad to Worse

The Middle East was no less disastrous. The Afghan meat-grinder continued to chew up men and material.

In March, Zulfikar Ali Bhutto claimed a massive victory in the Pakistan General Elections, maintaining the existing government despite the ill-fated rebellion against Imperial Rule. Within days, the kingdom had erupted in widespread violent protests and uprisings over allegations of vote-rigging.

After almost two weeks of riots, and after the Empire had promised an impartial investigation of the allegations, the violence quietened down; after receiving the reports of the investigators, Imperial General Zia ul-Haq arrested Prime Minister Bhutto, seized power, and declared Martial Law.

He publicly thanked the investigators and ensured that they were not molested by the Press on their way to the airport; in fact, they were ushered out of the country with unseemly haste. Only when their jet landed at Heathrow were they able to reveal that their investigation had uncovered vote-rigging and massive corruption by both major parties.

The General had, in the meantime, passed this news to Her Majesty and advised that as neither party could be trusted to form a legal government, rule of Pakistan had to be placed in the hands of a third party, of necessity; his military dictatorship would control all government functions until such time as the parties held a proper election, under the gaze of international inspectors, which showed no fraud or malfeasance.

He then established rules of governance that made such elections virtually impossible, in particular placing restrictions on the right of citizens to gather in numbers – not unreasonable, in terms of maintaining law and order, and given the violent unrest of recent times, but of a certainty they made organising a general election all-but-impossible.

This was a fundamental breakdown in the Imperial Model, which assumed that Governments would always be democratically elected. Nevertheless, under the conditions described, there was little choice; at least ul-Haq was considered to be loyal to the Empire. The Imperial concern was that this set a precedent for others to follow; by interfering on behalf of both sides in an election, in a way that was manifestly apparent to the citizens, it was possible for a military to seize power on a more-or-less permanent basis. It was the ultimate distillation of the Military Coup.

Anwar Sadat (L) and Menachem Begin (R), Photograph by White House Staff Photographers.

The Birth Of Hope

By mid-year, there were finally promising signs from the region that there might be a return of peace in the Middle East, with the election of a known Moderate, Menachem Begin, to office in Israel.

After months of effort by Imperial diplomats and go-betweens, Prime Minister Sadat of Egypt signaled his willingness to negotiate peace terms with the Israelis. Once this willingness had been stated, the Empire broke diplomatic speed records in arranging a face-to-face meeting between the two; less than two weeks after his initial offer of negotiations, Sadat had addressed the Knesset (the Israeli Parliament) and was engaged in top-level discussions, and meaningful progress was underway.

Sadat immediately came under political fire from his neighbors for this stance; but he refused to have the policies of his nation dictated by others, and responded in December by breaking off diplomatic ties with Syria, Iraq, Libya, Algeria, and South Yemen. Another major power in the region had returned to the Imperial fold; but would the consequences increase or decrease the resilience of the fragile stability of the region? Only time would tell…


The House Always Wins: Examining the Concept of House Rules


Sometimes we old hands, tired of a subject that’s been talked to death, or thinking that everything there is to be said on the subject has been said, forget that a lot of players and GMs have come into the hobby more recently than we have, and hence weren’t privy to those discussions.

This produces, or at least contributes to, the ‘Generation Gap’ that differentiates modern gaming from ‘old school’ gaming. To overcome this, every topic needs to be revisited from time to time.

One of the most controversial posts I’ve ever written here at Campaign Mastery – much to my surprise – has been Draco Inadequatus: Beefing Up 3.x Dragons, published in August 2011, largely because it touches on one of those subjects that is all-too-familiar to old hands of RPGs – House Rules.

In a good-spirited but lively debate in late January, in the comments of the aforementioned post, Scott (one of our readers) touched on the subject of House Rules and why they should – or shouldn’t – be present in a game.

Credit Where It’s Due
I want to start by acknowledging up-front two references that were great reads in researching this article. I’m going to try and avoid repeating what they say, so you might want to click on the links below and have a read. They’ll open in a new tab/window, so don’t worry about losing your place.

The first is a section of “Introduction To Role-Playing” called ‘Kim D&D’, especially the section ‘Setting Philosophy’ and in particular its first two subsections, concerning Realism vs. Playability and Continuity and Longevity. I don’t know when Kim E Lunbard wrote his content, but he couldn’t have summed up my attitude to the subject any better if he had tried, hitting several of the points I wanted to make squarely on the head and reminding me of what’s important in this conversation.

The second is a forum topic post at Warpshadow.com in which Eidre lists her house rules criteria. This dates all the way back to 2005 but is still directly relevant today.

Why Have House Rules?

Let’s start by looking at some of the reasons why a GM might introduce House Rules:

  • Regulation of Social Behavior
  • Correction of inadequate rules
  • Compatibility with the Campaign Setting
  • Improving game performance
  • Correcting a perceived Game Balance problem
  • Inspiration and Experimentation
  • Adding to the fun or game flavor
  • Increasing the flexibility of the Game

Regulation of Social Behavior

House rules are often used to regulate social behavior at the game table, but while house rules such as “who brings the drinks” may be important, most of them are beyond the purview of this dissertation. But there is some overlap with the type of rules that we’re concerned with – questions like cocked dice, dice that roll off the table, consequences for interrupting the GM or another player, and so on.

This sort of thing is what most people would recognize by the term “House Rules” and they are not that dissimilar to what many other sorts of “House” might have – from a restaurant to a footie clubhouse. I doubt there is any serious contention from anyone that this type of House Rule, defining etiquette and standards of personal behavior, is perfectly acceptable. But don’t get too comfortable with this sense of agreement, it won’t last.

Correction of inadequate rules

In fact, we’ve gone directly into a source of ongoing warfare between different gaming philosophies that has been bubbling away within the hobby for as long as I’ve been part of it – 1981, if anyone’s counting.

Are there rules that just don’t work? Most game systems have one or two mechanics that either simply don’t work, or that take too much game time to administrate, or whatever. When this sort of situation comes to the GM’s attention, he has only three choices:

  • Live with the rule as written regardless of how broken it might or might not be;
  • Eliminate the rule and any attendant subsystem or sub-subsystem completely with a house rule; or,
  • Replace the rule with a house rule.

I can remember when no RPG review was considered either complete or adequate without the identification of at least two holes in the game system to watch out for!

To be honest, there is a lot less justification for this sort of change than there used to be in the bad old days. That doesn’t mean that this is completely dead and buried; I remain firmly convinced that the DC-setting mechanism for skill checks in D&D 3.x is broken, making success too easily achieved. But, by and large, the game designers have learned over the years, and game systems are now (generally) a lot more robust. This sort of problem is now the exception, not the routine reality that it used to be.
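
To see what I mean with some actual numbers (mine, not drawn from any official table): a skill check in 3.x is d20 plus modifier against a fixed DC, class-skill ranks can climb to character level + 3, and a natural 1 is not an automatic failure on skill checks, so bonuses keep growing while most published DCs stand still. The little Python sketch below just turns that into a success probability; the 10th-level character and the DC of 20 are hypothetical examples, not anything out of the books.

    # A rough, illustrative calculation of 3.x skill-check odds (my numbers).
    # Skill check: d20 + modifier vs a fixed DC; natural 1 does not auto-fail.

    def success_chance(bonus, dc):
        """Probability that d20 + bonus meets or beats the DC."""
        needed_roll = dc - bonus          # minimum die result required
        winning_faces = 21 - needed_roll  # how many of the 20 faces succeed
        return max(0.0, min(1.0, winning_faces / 20.0))

    # Hypothetical 10th-level character: max ranks (level + 3 = 13) plus a
    # +4 ability modifier, against a "tough" static DC of 20.
    print(success_chance(13 + 4, 20))     # 0.9 -- fails only on a 1 or 2

Run the same sum a few levels later, with a skill-boosting item added in, and the failure chance disappears altogether – which is exactly the pattern I’m objecting to.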

Compatibility with the Campaign Setting

This is a reason that far too few GMs consider, but it’s one that I consider not only perfectly justified, but absolutely mandatory.

Game settings which are not supported by House Rules are settings without depth. They are making no difference to the day-to-day operations of characters within the game, and hence are pretty superficial.

It’s only when a game setting begins to make itself felt within the game mechanics that it really comes alive. Different, campaign-specific skills, magic, materials, powers, restrictions, character classes, different or variant races, unique monsters, strange settings with unexpected environmental effects… perhaps just one or two of these, or all of them.

But this opens a Pandora’s box – whatever is put in place will never be as rigorously play-tested and very rarely as well-designed as the original rules. And if you change the rules too much, the game will become unrecognizable. The Game Setting can justify any rules change – but with that power comes great responsibility. I’ll talk about the right way to discharge that responsibility a little later.

Improving game performance

Maybe there are rules that actually do what they are supposed to, but have a side-effect: slowing combat or other frequently-occurring gameplay to a languid molasses-like crawl. The weapon speed adjustment rules for AD&D used to be like that, for example. Elemental Controls and Multipowers in the Hero System can sometimes be another example – the combat sequencing system certainly is.

There is always going to be room for improvement in any set of rules, and anyone who thinks differently needs to change the colour of their lenses – lose the rose-tinteds, folks!

At the same time, there is a school of thought that runs, “if it ain’t broke, don’t fix it”. And “don’t reinvent the wheel.”

And both sides have valid points in this particular debate. Quite often – as Johnn discovered, and reported, in his post My Group’s Time Thief Revealed – what seems to be the roadblock often isn’t.

There are also times when changing something to make it more playable does extreme damage to the look-and-feel of the game system – at which point you should think twice before keeping the House Rule.

And, finally, quite often you will take a mechanic that is clunky-but-works and replace it with one that is faster-but-broken – or worse yet, with something that is clunky-and-broken.

The upshot is that this motivation for house rules is a quagmire filled with unsuspected dangers and unexpected twists and turns. That makes this motivation a marginal one at best.

Correcting a perceived Game Balance problem

And here’s another hot-button topic setting up an army marquee on the field of battle: just what is game balance, anyways?

The problem is that the one term is used generically to sum up a whole host of different aspects of the rules system. It can refer to imbalance between races, genders, classes, spell effectiveness vs. spell level, combat effects in general or in specific, risk-vs.-reward, the power of magic items vs. the ease of construction and/or cost of said items, and on and on and on.

It’s usually at this point that the “Realism First” proponents start speaking up, if they’ve even managed to keep quiet for this long.

The problem is that changing any one of these rule-effectiveness ratios usually causes another to go out of whack, which also needs adjustment, and so on. I’ve seen it happen on many occasions.

The difficulties are made worse by the fact that these are rarely as simple as a constant ratio. Take D&D 3.x Mages, for example: at low levels, they are as fragile as spun glass and almost as useful in a combat situation – until at some point they suddenly go to the opposite extreme and then power their way into the stratosphere. There is no one simple fix for this, because what we have here are two or more separate game imbalances with opposing effects.

Boost the usefulness of low-level mages and you will usually find that high-level mages, already demigods in comparison to their peer rankings in other classes, become Cosmic Powers. I’ve seen all manner of attempts to fix this, and most have failed miserably. The only one I saw for AD&D that came close to success was so radical that the game no longer felt like D&D; it was closer to RuneQuest, which begged the question of why they didn’t play RuneQuest to begin with.

This isn’t just a quagmire, it’s full of landmines – a dangerous reason for a House Rule. Sometimes necessary, but more often than not, it’s a poor justification.

Inspiration and Experimentation

Cooks play around with recipes, artists play around with paints, and roleplayers play around with rules systems. It’s a fact of life, a measure of our level of interest in the games that we run that, like a car-collector, we want to get our hands dirty and tinker around under the hood in search of an extra 3/8ths of a horsepower.

I’ve been astonished at the degree of (polite, slightly removed) crossover interest between the Formula 1 fans that I chat with on twitter and the RPG GMs and authors. Each has promoted a link or article about the other at some point, sometimes more than once. It was only when I wrote the paragraph above that the similarity of interests made itself clear to me.

Of all the reasons for creating House Rules that have been discussed so far, this is probably the least defensible, but also – cloaked behind claims of more benevolent motivations – probably more common than we like to admit.

At the same time, every improvement to the robustness of game systems that has occurred in the history of RPGs started life as an experiment and was filtered through the rule author’s accumulated databank of things that had worked and not worked in the past. You can’t learn how to write good rules until you’ve written your share of bad ones.

So there is something positive to be said for experimentation with the rules, so long as the author (usually but not always the GM) doesn’t hide the real reason for the rule behind some other purpose.

Adding to the fun or game flavor

This is a somewhat touchy one. We all get different things out of an RPG, so one person’s “fun” probably won’t match that of the person seated next to them.

All the changes that I’ve seen that use “fun” as a justification rank amongst the most egregious cases of Monty Haul-ism and munchkin-ism that I’ve encountered:

  • “These Vampires shoot death beams from their eyes – cool, hey?” (well, No.)
  • “The Dragon leaps out from the old man’s backpack and breathes fire at you. It has a wingspan of 400 feet and does 30d6, no saving throw.” (Really? Uh-uh.)
  • “The wand explodes when the Druid casts Warp Wood on it, doing 400d6. Save for half damage.”
  • “Okay, everyone’s weapons are now +1 enchanted.”

These are all real examples that I’ve encountered at the game table. In each case, the GM had changed the rules (and, in one case, what was physically possible) to make the game more “fun”.

Other examples of this type of rules change include incorporating critical hit tables from another game system because they sound like fun, etc.

As for game flavor, if it doesn’t match the campaign setting you want to use, how can you justify changing it? Either the game’s existing ‘flavor’ already matches the setting and the campaign, in which case the change contradicts what you want to achieve, or it doesn’t, and the change is better justified as a matter of Campaign Setting compatibility.

So far as I am concerned, neither of these justifications hold water. Either a House Rule can be successfully defended on one of the other grounds listed, or it shouldn’t be there.

Increasing the flexibility of the Game

Few GMs stop to realize it, but every game supplement outside the core rules that they choose to incorporate into their games comes with a House Rule stating the acceptance of the material in that product.

Sometimes, more than one will be needed, because even when sourced from the same publisher, even the publisher of the core rules, these are not always compatible with each other. Sometimes they can be downright contradictory, at other times one simply ignores the existence of the other.

The only mention of the 3.0 Epic Levels Sourcebook in Deities and Demigods is the suggestion that in an epic campaign, it might be necessary to increase the character level of the Deities.

Every time a player takes a prestige class that isn’t in the PHB or DMG, a House Rule comes into existence. The only exception to this statement comes if there is one House Rule that gives blanket approval to ANY source that the player wants to use.

Heck, even the core rules of a game usually contain a few optional rules!

The Compact With Players

There is an unwritten agreement between the GM and players. The GM promises to interpret the rules fairly, and to create adventures for the PCs that are fun for the players – or, at the very least, interesting/diverting. The players agree to accept any measures that the GM finds necessary in order to achieve this result, and to tolerate the GM’s style. The GM promises not to interfere with the players’ ability to dictate the attempted actions of their characters, and the players agree to accept the GM’s arbitration of the results and consequences of those actions.

If a game is no fun for the players, they won’t stay. Every House Rule has the potential to erode that fun, or to enhance it.

If a game is no fun for the GM, he won’t run it. Every House Rule has the potential to destroy that fun, or to enhance it. Every argument about a House Rule will certainly reduce it.

Gaming is a shared experience, and both sides need to be mindful of that fact, and allow the other their share of the joy.

Taming The Rules Jungle

So there are lots of reasons for introducing House Rules. Some of them are dubious, some of them hard to argue with. (That is NOT a challenge to our readers to do so!) The only way to sensibly integrate House Rules is to have some House Rules about House Rules!

In General

There are a few things that should be done, regardless of the justification for the rule.

  1. Make sure everyone knows what the House Rules are.
  2. Keep them in writing.
  3. Review them from time to time, and make sure they are up to date.
  4. Each time a House Rule is introduced, give it a defined trial period.
  5. Establish very clearly the reasons for the House Rule and, from those justifications, what the purpose of the rule is.
  6. Discuss the need with your players BEFORE introducing the house rule.
  7. Look for implications, especially hidden ones. Have your players do likewise.
  8. Establish clear criteria for failure, and a clear fallback position. The rule might need to be tweaked, replaced with a new House Rule, or abandoned completely, reverting to the base rules.
  9. When the review period is up, demand your players’ input on the rule – is it a success or a failure? Has it had unexpected consequences or repercussions, and are those desirable or undesirable? Are any gains worth the costs?

There are probably more, but that list – even though most of it is blindingly obvious – is enough to get started on.

Some specific notes, by justification:

Social Behavior:

These are the least likely to cause you trouble if prepared in advance of someone getting upset. Don’t get carried away with penalties and the like, if you don’t have to – remember the Compact! That said, if it comes down to a choice between shutting down a game that is even only sometimes fun or getting strict about social behavior, you may need to get tough. It’s better to throw one player out than spoil everyone’s fun.

Each GM has some soft spot that players can savage – I’m a fairly mild-mannered guy, but constantly being interrupted and not being permitted to finish what I was saying can make me explode. I’ve never been driven to the point of just getting up and walking out, though I have seen others do so. I did once, in early 2005, reach the point of contemplating getting out of RPGs altogether because they just weren’t fun any more. And I did once reach the point of ripping an adventure in two and shutting down the campaign – before being persuaded by the rest of the players to resume it.

The worst time to craft such rules is when problems have driven someone to the point where they are about to snap.

It’s also worth remembering that there are three sides to any dispute – your side, their side, and the people caught in the crossfire.

Correction of inadequate rules

This justification requires careful thought. Why are the existing rules inadequate? Is there any chance of misinterpretation? What would constitute an improvement? What else will get lost in the change, if anything?

Campaign Setting

Because this can be such a blanket justification, it comes with some specific hazards. It may be necessary to be prepared to junk part of the campaign setting if the rules to implement it are excessively troublesome. I’ve written in the past (My Biggest Mistakes: The Woes of Piety & Magic) about one occasion when that was the case. This is worth reading because it also has advice on removing a flawed house rule.

Game Performance

Although it might seem that direct comparisons of performance are possible, this is never quite true; there are too many variables. Even repeating exactly an encounter or circumstance won’t work because the players will have seen it before, and will have learned from the experience what works, and the die rolls will be different. It follows that these criteria will either be statistical in nature, or subjective. The problem with statistics is that they can easily be misinterpreted; the problem with subjective differences is that they have to be fairly substantial to be noticeable and memorable. And the problem with using both is that they can disagree.

And, of course, as noted earlier, the whole House Rule can be mis-targeted.

So assessments of House Rules justified in this way are often the fuzziest – just when a clear yes-or-no is most desirable.

The best answer is not to get too fancy – track the ratio of combat time vs non-combat time, divide by the number of encounters of each type, and compare the gross effects. Then take with a grain of salt.
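
If it helps to see that bookkeeping written down, here is a minimal Python sketch of the sort of tally I have in mind; the encounter log and every number in it are made up for illustration, and a notebook and a wristwatch will do the same job.

    # A minimal sketch of the suggested bookkeeping: note roughly how long each
    # encounter took, then compare per-encounter averages from before and after
    # the house rule's trial period. All figures below are hypothetical.

    from collections import defaultdict

    session_log = [
        ("combat", 55),       # (encounter type, minutes taken)
        ("non-combat", 20),
        ("combat", 70),
        ("non-combat", 35),
    ]

    def average_minutes(log):
        """Average minutes per encounter, split by encounter type."""
        totals, counts = defaultdict(float), defaultdict(int)
        for encounter_type, minutes in log:
            totals[encounter_type] += minutes
            counts[encounter_type] += 1
        return {t: totals[t] / counts[t] for t in totals}

    print(average_minutes(session_log))
    # e.g. {'combat': 62.5, 'non-combat': 27.5} -- compare the before-and-after
    # figures, and take both with a grain of salt.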

Things to watch out for are anything which gives the players or GM more work to do – refer to the “Piety” article listed above. A small delay which recurs frequently can add up to a mountain of lost and wasted time.

Game Balance

Here we’re talking about assessments made in a dynamic situation. As a result, these are also pretty subjective or fairly extreme.

Inspiration and Experimentation

Ohhkay. More than any other, this category needs clear criteria for failure that can be objectively measured, because you are replacing something that works with something that might work better.

Fun/Game Flavor

If this is the sole justification for a rule, it had better be a spectacular success or else. I have never seen a house rule that existed for no other reason and was any good, but that doesn’t mean they don’t exist. Even when I advocated House Rules in the context of genre enhancement as part of the series on Reinventing Pulp, the changes could be justified in terms of Campaign Setting.

Again, be very clear as to what will constitute an improvement.

Flexibility

Since this is the one area of House Rules that is virtually guaranteed to be present in a campaign, it is the most important of all to have clear criteria. I use a fixed procedure to assess proposed character classes, feats, spells, magic items, even new skill uses. The following is an extract from the house rules for my Fumanor campaigns:

  1. Editable Copy: A physical copy of the proposed content must be provided to the referee for conversion to an editable computer-based document. Where this is provided by the loan of the sourcebook to the referee, the “computer version” must be generated by the referee when time permits before this step is considered complete. Players wishing to accelerate the process may choose to submit an electronic copy ready to be edited and then copy-and-pasted onto the approved list. PDFs of the source which do not permit copy-and-paste are considered the equivalent of loaning a sourcebook, because the work required is still the same. Note that it is not enough to simply list the game mechanics; the ‘fluff text’ is considered to be rules needing review as well.
  2. Background Justification: The referee will review the content from a standpoint of campaign background fit, and make any adjustment deemed necessary, or refuse to approve the submission. If the submission is not rejected as unsuited to the background, it then proceeds to step 3.
  3. Comparative Justification: The referee will then review the content from a standpoint of game balance, and make any adjustment deemed necessary, or refuse to approve the submission. If the submission is not rejected as unsuited, it then proceeds to step 4.
  4. Rules Justification: The referee will then review the content from a standpoint of uniqueness, logic, and necessity, and make any adjustment deemed necessary, or refuse to approve the submission. If the submission is not rejected, it then proceeds to step 5.
  5. Requirements Review: The referee will then review the requirements to ensure that they reflect the considerations of steps 2-4 above, and make any adjustment deemed necessary, or refuse to approve the submission. If the submission is not rejected, it then proceeds to step 6.
  6. If approved, the submission will be noted for inclusion on the official Approved lists.
  7. When time permits, the referee will act on that note and add the approved version to the official Approved list. If time is short, he may include the submission as an addendum to the official list; this qualifies as approval.
  8. Until placed on the register of approved submissions, characters may not choose the option, though they may reserve a “slot” (feat slot, character level, skill points) pending availability.
  9. Any character who has an un-Approved option listed on their character sheet will lose both the option and the slot it occupies, and the option in question will be banned from the game from that time forward regardless of whether or not it would have been approved had it been submitted properly. It is therefore in the player’s best interests to submit any desired material for approval in advance of choosing the option for their character.

Looking Beyond The Rules

I want to close this article by pointing readers to an Ask-The-GMs post from September 2009. While the secondary focus of the post is answering a specific rules question – an answer that not everyone agreed with – the primary focus is on the process that I use in determining such answers. Every ruling is a House Rule that should be applied consistently from that time forwards. Click on this link to read the post. And remember to have fun in your games!


The Imperial History of Earth-Regency, Part 6: Coming Apart At The Seams – 1960-1972


This entry is part 6 of 12 in the series The Imperial History of Earth-Regency


Pieces Of Creation is an occasional recurring column at Campaign Mastery in which Mike offers game reference and other materials that he has created for his own campaigns.

All images used to illustrate this article are public-domain works hosted by Wikipedia, Wikipedia Commons, or derivations of such works, with the exception of the “fistful of dollars” which is public-domain clip art.
 

Beatles original drum logo by Erwin Ross, Hamburg. Logo may be a registered trademark.

The 1960s

History continued its implacable march. The 1960s saw the rise of “pop” music and the overwhelming success of the Beatles.

U2 spyplane photo by jamesdale10

1960-1962: Unhappy Developments

A U2 spy plane crashed in China, and the pilot was arrested. A planned summit meeting with the Mao had to be abandoned when they subsequently refused to attend unless the Empress delivered a personal apology; she refused. The cause of the crash was never ascertained; was it mechanical failure, despite the aircraft being inspected and judged airworthy before the mission? Or had the Mao learned to counter the primary Intelligence source deployed against them during the 1950s? Even the possibility of the latter cause was enough to accelerate demands for the conquest of “higher ground”, i.e., Space Travel.

In the Congo, local battalions of the Imperial Army mutinied against the government after refusing orders to conduct a wave of terror against the populace, which had rioted against Prime Minister Youlou only a week after his election following allegations of electoral fraud. Imperial peace-keeping troops had to be sent in to restore order, the beginning of an unhappy pattern in Africa. In response, the all-white government of South Africa banned all Black political groups and arrested the most vocal Black Leaders, including Nelson Mandela, on trumped-up charges. African politics would tear the continent apart for the next 20 years.

Fidel Castro, 1978, photo by Marcelo Montecino

Castro

In October 1960, another case of internal disunity within the empire rose in significance as Cuba nationalized all USK-owned† industries in response to “Economic Aggression”; within 6 weeks the USK would establish an embargo on all freight to or from Cuba originating from their Kingdom and warn that any ships passing through their waters with a Cuban destination would be subject to seizure without reparation. A naval blockade by the US Coast Guard was immediately put in place to enforce these edicts and a military base established at Guantanamo Bay. On 3rd January of 1961 they would sever all diplomatic links to the Island nation, and in April an attempted invasion by US-based Cuban expatriates, sponsored by the USK branch of Imperial Intelligence, was foiled. In response, the Cubans led a coalition of Central American Nations to secede from the Empire and form an independent Kingdom. King Castro welcomed alliances with other nation-states within the Empire; while most were cautious, a few opened negotiations without waiting to see the official Imperial Response.

†USK – Kingdom Of The United States Of America. Refer to previous chapters of this history for any further explanation.

The USK had made it difficult for the Empress to intervene directly against the new state, having compromised the reliability of the Intelligence Service on which she relied for accurate assessments, necessary for the formulation of effective and practical policies. The unilateral action taken in the Imperial name as good as announced that the USK was ready to go to war with the rest of the Empire over the issue; a circumstance that would undoubtedly trigger the attempted secession of a number of other disaffected nations.

Politically, the choice was between condoning the actions of a rogue state or of overseeing the disintegration of the entire Empire. She was reluctantly forced to choose the former option, but tasked IMAGE and the Intelligence services with the priority mission of bringing the USK to heel. This cost her much of the political credibility she had built up over the years; and she foresaw the Americans becoming the centre of political power within the Empire if they were not stopped.

Tristan da Cunha, from the NASA ASTER volcano archive. Click on the thumbnail for a larger image.

Tristan da Cunha

Later that year, the island of Tristan da Cunha was destroyed by a volcanic eruption; the population had been successfully evacuated after a warning to the Empire was issued through diplomatic channels by the Mao. This was seen as a mixed blessing by the Empire; it was of great concern that the Mao knew of the impending eruption, but the issuance of the warning gave hopes that common ground for peaceful relations might be found in humanitarian considerations.

NASA Landsat image of the Shetland Islands

1963

In October of 1963, a similar warning was sounded concerning the Shetland Islands; although not known to be volcanically active, the Empire evacuated them rather than taking a chance. The morning of the supposed “eruption” brought clear skies and a Chinese invasion force which appeared literally from thin air in front of the various instruments that had been emplaced by Imperial Scientists to monitor the situation. The Mao forces seized control of the island unopposed, in the process demonstrating that they had achieved a technological defense to the Empire’s nuclear capability.

After substantial saber-rattling by both sides, and having proven their capabilities, the Mao withdrew; before the island could be reoccupied by the Empire, it was destroyed by a volcanic eruption, revealing that the Mao, too, now had city-destroying capabilities.

Scientists studying the remains soon established that the volcano had not been generated through nuclear power; none of the signature characteristics were present and there was no radioactive residue. Mao science – like the Mao themselves – remained as inscrutable as ever. And, as usual when the Mao were involved, there was anguish amongst the Imperial Intelligence community.

Prime Minister Kennedy of the USK - official white house photograph

John F. Kennedy & The Hippie Movement

In late 1963, the American President who had overseen the entire Cuban Crisis, John F. Kennedy, was assassinated while pursuing reelection. Conspiracy theories immediately began to surface, though none were ever proven – but not all could be disproven either.

Was it a Cuban-backed assassin? Or was Lee Harvey Oswald a patsy for an internal US revolt against the Kennedy Dynasty? Or had the Empress removed a leader who had become an official Embarrassment? This was also the year that had seen race relations within the USK come to a violent head; was there a connection? When the founder of the pro-violence Black Nationalist Group was assassinated over a year later, was it payback?

There were those who would never be convinced, and the truth (whatever it was) would never be known to anyone’s complete satisfaction.

The assassination of Kennedy was seen as a watershed event in Imperial History. Had he not embarrassed the Empress over the Cuban situation, instead of a second term as USK president, he was considered a sure bet for being the first colonial to become Prime Minister of The Empire. His natural charisma and the aura of hope and destiny that he manifested at will were tremendous assets, as was the undoubted ability to extract that little bit more enthusiasm and optimism in and for the Government. (Which gave the conspiracy theorists yet another possible solution to the mystery – had IMAGE “brought the USK to heel”?)

Although Kennedy had never announced such plans, it would later become public record that such a campaign had been in preparation before Cuba scuttled his chances. But more significant than might-have-beens was the impact of his passing; Kennedy had achieved substantial popular support throughout the Empire with his enthusiasm for the future, and when he was killed in Dallas, much of the public faith and trust in the Government seemed to die with him.

From that point on, a growing element within the Imperial population would consider the Government their enemy, not their leaders. Over the next four years a counterculture amongst the youth of the Empire would begin, fueled by psychedelic drugs, alcohol, and disillusionment with “The Establishment”. The heirs of the Beatniks of the 60s, the ‘Flower Power’ movement would succeed where its predecessor had failed, becoming a universal element within most Imperial Nations rather than a cultural fringe.

1967

1967 also saw a war in the Middle East narrowly averted, while in 1968, problems arose in Northern Ireland, where extremists demanded an Irish Kingdom within the Empire.

John Lennon rehearses 'Give Peace A Chance', Photo by Roy Kerwood

Peace, Love, and Anarchy

The Hippie Subculture rebellion was in full swing, and a growing faction centered in Germany but spreading throughout Europe advocated a united political front – “Youth Of The World, Unite” was the slogan – with the avowed intention of electing John Lennon to the position of Imperial Prime Minister in 1969 – for the purposes of disbanding the government once and for all.

Author’s side-comment: John F Kennedy vs. John Lennon – that would have been an election worth watching!

Despite the loss of support from areas of the US following the “bigger than Jesus” controversy, the prospects were enough to concern established politicians throughout the Empire. But when the leader of the movement, Rudi Dutschke, narrowly survived an assassination attempt, the “peace-loving” movement exploded into violence across Europe.

In many Kingdoms, especially France, the rioters were joined by workers and union officials who staged strikes in sympathy and in support of the cause. The Empire was teetering on the edge of total collapse; in its defense, the student uprising brought previously warring political factions together in a united front against the rebellion.

(It is worth noting that there is no evidence that Lennon was ever associated with the “Youth Of The World” movement; he was an anointed saint to the movement’s leaders, and horrified by many of the deeds committed in his name.)

A measure of influence?

As happens periodically, the rival political parties had grown so alike in policies and manner over the years that they were almost indistinguishable. They now formed an alliance with the Peerage to defend the privileges and powers that they had accumulated.

Radical changes in civil policies took place throughout the Empire in response to the agitation of the rioters. France banned all open-air demonstrations; Germany cut funding to the Universities, forcing many students out of their places; and gung-ho Americans lamented the lack of a good war to which they could send the protestors.

Author’s Note: the absence of the horrifying pictures which were televised into living rooms during the Korean & Vietnamese conflicts is a subtle influence on Imperial Society, but one of incalculable power and significance, as will become evident later in this History. It continues to shape the Empire of 2055 and its perceptions, and can be overlooked only at your peril.

Buzz Aldrin removing passive seismometer from compartment on the Lunar Lander, photograph by Neil Armstrong.

1968-69

Eventually, passions calmed; but the tension remained in the air. The specter of Youth Armies, denied any other recourse, staging revolutionary uprisings throughout the Empire, was one of the key background features of the next two years.

In January of 1969, Czech student Jan Palach burned himself to death in public to protest the “oppression of his age group” – ironically on the same day that the voting age throughout the Empire was reduced to 16 and voting made compulsory in all Kingdoms; and many of the political parties began strenuous efforts to recruit a younger constituency.

It was felt that by providing an outlet for the political passions, the threat of youth uprisings could be averted. But the change seemed to make little difference to the outcome of the US elections, the first to be held under the new voting rules, as Richard M Nixon won power by an extremely narrow margin.

It wasn’t all grim tidings in 1969. The USK space programme made its spectacular landing on the Moon with Neil Armstrong and Buzz Aldrin, getting the programme back on track and giving the whole Empire a morale boost. But this was followed by unhappier news, with the discovery that the Chinese were supplying Mao Technology to the Irish, and were probably behind the rebellion against the Empire, and by a military coup in Libya which installed Muammar Qadaffi as the new head of state.

Investigating The Mao

American spy planes launched from Korea had achieved only limited success in scouting out the Mao, and had revealed little more than was already known about their capabilities. It was known that China showed few of the characteristics of an industrialized nation; there were no signs of factories, the cities were sprawling, decentralized affairs, and the roads appeared to be relatively primitive. There was no sign of railroad tracks. To the outside observer, i.e. the Empire, China was at best a medieval culture; and yet, they had shown themselves to be a superpower equal, even superior in some respects, to the Empire.

Civil works had been achieved which clearly matched the best that the Empire could manage; whole mountains had been leveled, or raised, and farming was as intensive and productive as the best the Empire could achieve. Scattered throughout the Chinese mainland were spires of mountainous rock hundreds of feet high, on which stood vast fortresses that watched over the lands below.

Imperial strategists were eagerly awaiting real-time imaging from orbiting satellites, in the hope of gaining some greater knowledge of how the elementary problems of production and distribution were handled without the necessary industrial infrastructure. How could they allocate acceptable targets in any future confrontation if they didn’t know what was vital and what was not? This would no doubt prove to be vital intelligence in understanding the Mao and their technology, and might even yield ideas that could be applied within the Empire. But as yet, the state of the art was inadequate to build such devices.

The Irish Rebellion

In the meantime, the Irish situation presented a heaven-sent opportunity, which several Imperial advisors demanded be exploited to the full. If the Mao were giving the rebels technology, they had to be showing them how to use the weapons; and that gave Imperial Intelligence something to get their teeth into. A movement within the Empire even arose dedicated to stirring the Irish up further to prolong the opportunities presented, and to sabotaging attempts to arrive at a peaceful resolution. They risked both careers and lives; treason was still a capital offence, after all.

Unfortunately, the first thing that the Imperial agents doing the investigating learned was that the Mao had also exported their “techno-elitist” political structure to Sinn Féin; not everyone was being shown how to work the technology, only specific individuals – perhaps one in 100. Most of the acts of violence were being perpetrated using conventional weapons.

What astonished the analysts most were the collateral effects on the Irish cultures, where conservatism, superstitions and “the old ways” began making inroads even amongst the general populace. A survey conducted in 1970 found that more Irish people believed in leprechauns than believed in the Moon landings.

Agent 00C

One agent in particular managed to get further than any before in assessing the Mao technology; this was 00C‡ of MI6, who would go on to become a legend within the Intelligence community.

He was so convincing an infiltrator that he was sponsored for testing by the Mao trainers to see if he had “The Knack”, as the Irish had come to call whatever the difference was that the Mao were looking for. This test was conducted as a mixture of an ancient religious ceremony (which predated Christianity by several centuries), Tantric Meditation, a Buddhist ceremony, a few neo-military exercises, and an interrogation under the influence of some form of truth serum.

00C had mastered tantric meditation some years earlier, and was able to use the abilities so learned to delay the influence of the truth serum long enough to pass what he viewed as the only significant test; he was, of course, in superb physical shape, and so had no problem proving his fitness. But for some reason he could not understand, he was turned down by the Mao for training. He didn’t have “The Knack”. He could only conclude that there had been some other subtle test carried out under the cover of the religious smokescreen and that he had failed that test.

‡ In the late 1990s my superhero campaign developed a spinoff based around an elite team of UNTIL agents, UNTIL being the United Nations police force in the Champions milieu. They had a number of spectacular successes against a variety of opposition – Viper, Genocide, Demon, and other Intelligence agencies. One of their key members was Agent Falcon, formerly Agent 00C of the British Secret Service – a ‘James Bond’-meets-Batman character. After the spin-off campaign wound up, the character (now an NPC) was seconded to the primary superhero team of the Campaign World – he was that good – to hunt down a suspected traitor within the team.

Since this is an Alternate History of that campaign world, significant individuals from that suite of Superhero campaigns will get name-checked or even play significant roles from time to time. When it happens, I’ll pop in one of these sidebars to offer a spot of context.

The Green Earth Treaty

But even while the Mao were sponsoring terrorist uprisings in Ulster, they were negotiating with the Empire on a number of different issues.

Two topics of particular attention were trade, and Major Weapons. Although unable to make much headway in the former area, the summit talks did at least have a beneficial result as both parties agreed that germ warfare, and biological weapons in general, were barbaric and should be outlawed by both.

This agreement cost the Empire more than it cost the Mao; China’s technology was not the type to develop a germ warfare weapon, so far as could be determined; but the Empire was happy to dispense with them nevertheless. They were always a risky business, quite capable of striking the wielder as severely as the target, and the ever-present risk of accidental release was unconscionable. The “Green Earth Treaty” was couched in generalities, but with some specific issues which benefitted both sides, essentially forbidding any form of “scorched earth” defense.

Whatever happened, any territories captured from the other side in future conflict were to be left in a condition to support the local populace and economy. The treaty, in effect, traded the volcanic weapon of the Mao for the Germ Warfare factories of the Empire, and both sides felt the more secure for the agreement. The Mao negotiators hinted that this was not their only weapon capable of widespread destruction, but the Empire still had its nuclear arsenals, so parity was maintained.

It was also tentatively agreed that on another occasion this arsenal might also be placed on the negotiating table; “When you know what other weapons we can employ,” replied the Mao representative, insightfully.

The 1970s Begin

Through the 1970s, “Pop” would become subdivided into “Rock”, “Heavy Metal”, “Disco”, and a swarm of other musical sub-styles.

Author’s note: You may be wondering why I keep heading each decade with a spot-check on popular culture, especially musically. All will be revealed, eventually!

1970

The decade began with further declines in the stability of the Middle East and Africa; the trouble started in March of 1970 as Rhodesia declared itself a republic and severed all ties with the Empire.

Even as a military force was being prepared to reclaim it, and show the other member nations that “this is not a gentleman’s club from which one can come and go at one’s whim”, Israeli and Syrian forces met in violent conflict, which in turn edged Jordan and Egypt closer to hostilities with Israel. To make matters worse, the Empire was considered by the Arab nations to be biased and partisan in the matter, as shown by its creation of Israel in the first place, and none of the participants was particularly willing to accept Imperial mediation as a result.

This gave the Imperial strategists a new conundrum to unravel – if they used military force to keep the Arab States in line, it suggested the Arabs were correct both in their claims of partisanship and in believing that they needed to be independent in order to have any say in their own futures; while if the Empire did nothing, they would be shown as unwilling to aid a member state under attack by self-proclaimed independent forces. It was Central America all over again.

Ultimately, they found they had no choice at all; the troops that had been intended for Rhodesia were instead dispatched to defend Israel, and the Empire gave up its territorial claims in the Middle East in return for a cessation of hostilities. This was felt to be the only viable solution in the short term; and it meant that the Empress had to decide, once and for all – was the Empire to be a military state, in which freedoms were begrudgingly granted in exchange for loyalty to the throne, or was it to be a Commonwealth of independent states with a central Government, as it had slowly been becoming?

It was clear that it had been the audacious move to independence by Central America that had inspired the Arab nations; if American bungling had not forced the Empire’s hand, politically, the Cuban revolt would have been suppressed and the issue would not have arisen.

Again, the Empire had no real choice; this new pattern had to be nipped in the bud or there would BE no empire a century hence. But the answer was not as clear-cut as it first appeared; there was little support amongst the populace for a war of aggression, and acting to preserve the Empire might well condemn it to a more protracted and violent death. So inflamed were passions that in May, four students at Kent State University in Ohio were shot dead by troops while taking part in an Anti-War demonstration.

Now, more than ever, the experts of IMAGE needed to find a third answer, a way out of the corner into which the Empire had painted itself. They could scarcely believe it when the rebelling nations furnished the solution of their own volition – but that’s getting ahead of the story.

A costly victory

It would be repeating a common error in perspective to consider the Arab nations of the Middle East as a single unified voice, or even as capable of presenting such. They may stem from a common racial stock – the former Persian Empire – but they consist of numerous tribes who hate each other with a passion, and each tribal leader generally considers it his Allah-given right to speak, and act, for all. This disunity was a direct outgrowth of the essentially medieval cultural level of the societies, which perpetuated the old ways, the old religions, and the old hatreds. Now it played into Imperial Hands.

Libya began construction of a nuclear reactor, with the avowed intention of making itself a Nuclear Power, able to dictate terms to the Empire as an equal; while Palestinian guerrillas hijacked four passenger aircraft to obtain the release of Palestinian “freedom fighters” (terrorists) being held in Israel, Switzerland, Germany, and Britain.

When negotiations reached an impasse, the Imperial Army moved in to recapture the aircraft. The stumbling block might have been overcome with persistence and patience, but the Empire was on a short fuse and had little tolerance for delays and games. Rather than be captured, the Palestinians blew up the aircraft and all aboard.

By deliberately painting all Arabs with the same collective brush, the Empire was able to rapidly generate a pro-War sentiment where none had existed only months before; and in September of 1970 the Empire began what would be a decade-long struggle to recapture the Middle East – and, not coincidentally, to recapture the oilfields that were critical to its long-term future.

It was a victory of sorts in the short-term, breaking the deadlock blocking military mobilization into the region; but ultimately it would backfire, turning allies into enemies and moderates into zealots.

In the meantime, Rhodesia was in an uproar. The controlling white minority refused to concede control to a Black Prime Minister under any circumstances; and furthermore, the entire peerage was exclusively white – and determined to keep it that way. These two issues prompted the formation of two rival Guerilla movements, which opposed each other as much as they did the white rulers of the Kingdom.

Caricature of Ugandan dictator Idi Amin by Edmund S. Valtman.

Idi Amin

Once the decision was made to let Rhodesia get away with secession, at least for a time, it was obvious that there would be imitators. The first was a blending of Qadaffi’s rise to power and the Rhodesian coup, as Major-General Idi Amin ousted the existing government and established himself as dictator and absolute ruler of Uganda. Amin was more cunning than his revolutionary predecessors; after ensuring that the available public records supported his claim of needing to take office in such a forceful manner to restore order to a nation that had grown corrupt and greedy, he petitioned for Imperial Recognition of his government.

He knew that while independence would in theory give him more freedom, it was an open invitation to the Empire to treat his power-base as they saw fit; whereas, by appearing to be a member in good standing, he would have almost as much freedom, guaranteed trade, and the protection of the Imperial Military in the event that his neighbors got out of hand – and would sap much of the political will to oppose him.

All these things were important, because Amin was a sociopath who had navigated his way to supreme power, where he could give vent to the desires and urges he had spent decades hiding, suppressing, and cultivating; a monster who deliberately invoked the protection of society for his own benefit.

Middle Eastern Strategy

But these facts would not emerge for some time. At that time, Imperial focus lay well to the north of Uganda. The Empire’s conduct of the war puzzled many; the best public explanation was that too many cooks were trying to run the show, producing a wildly haphazard approach.

In reality, IMAGE was dictating the military objectives, and had customized the approach to each of the targets. In particular, they were using diplomatic carrots to appeal to the more moderate countries while throwing military sticks at the ones they wanted to “chastise”.

This multipronged policy began to bear fruit after only a few months, as Egypt and Jordan renewed their vows of fealty to the Empire, and opened negotiations with Israel concerning the disposition of the Palestinian West Bank. The military aspects of the campaign were in more serious trouble; the invasion of Afghanistan had met with something less than success, the country being rocky and full of passes in which “a grandmother with a broom could hold off an army”, to quote one Russian general.

The Arab reaction was not all that swift, but was brutal when it eventually came, as Palestinian terrorists assassinated the Prime Minister of Jordan, prompting King Hussein to rule out further talks; Jordan would henceforth try to steer a passage between the two factions that would neither offend nor satisfy either.

Location of Bangladesh, map by Rei-aitur based on a World Map by User:Varchion

The Pakistan Solution

At about the same time, other minor conflicts were threatening to escalate; in particular, India and Pakistan had become embroiled in a conflict over the autonomy of Bangladesh, where religious separatists had attempted to form a separate nation within the Empire.

The Imperial government was inclined to recognize Bangladesh; it followed the precedents established in the Middle East, and it was hoped that tensions could be prevented from escalating into a new conflict with the Mao, who had so far resisted being drawn into what was essentially a local matter on the fringes of both Empires. Pakistan, however, threatened to secede from the Empire if its borders were not guaranteed – in other words, if they didn’t get to keep Bangladesh.

The Empress was not impressed. She doubted that Pakistan truly wished to be an independent state trapped between two Empires; and she doubted even more strongly that, even if the Government wanted to try, the populace at large would want to go along.

She ordered all Pakistani citizens at large within the Empire to be rounded up, detained for 24 hours, and then deported – the last 747s permitted to enter or leave Pakistan airspace. At the same time, all non-Pakistani Imperial Citizens within the Kingdom – including the diplomatic teams – were evacuated, severing all diplomatic ties. It’s important in understanding her response to note that no notice was given to the Pakistani government of the return of their citizens, and that as members of the Empire, immigration checks were desultory at best.

The returned citizens were thus released into the general population to explain the reasons they had been sent home. Even as they were doing so, all trade with Pakistan was stopped and all shipping blockaded, effective immediately the last diplomat was evacuated. Bangladesh was recognized and the food shipments on which Pakistan proper relied were diverted there, as were a substantial contingent of Australian Special Forces, ready to react strongly to any attempted incursion. In effect, the Empress called the Pakistani bluff and put in place – forcibly – the very situation that Pakistan had threatened.

Central America was a long way from the Chinese borders; that, and its general lack of importance in the greater scheme of things, was sufficient to enable it to get away with secession. Pakistan was right next door to the captive state of India (which had been a Mao possession since its capture during the Third Global War), and had received enough people fleeing across the borders to understand the limitations on personal freedom and opportunity that conquest by the Mao would entail. The result was a mass uprising against the Pakistan government, which fell within a week. If rogue states could use revolutions to get their way, so could the Empire!

Topographic Map of Pakistan by BishkekRocks.

Reforming An Empire

But when Pakistan petitioned for forgiveness and readmission into the Empire a week later, it was not restored to full citizenship.

IMAGE had been studying the larger problem handed them by the Empress, and had arrived at a modern redefinition of the concept of Empire. During the interim of the Pakistan Revolution, a progressive ten-tier scale of Imperial Citizenship had been defined; at each stage, greater local autonomy was granted and less Imperial control mandated. Each stage was calculated to take progressively longer; Class 10 members could be recognized as Class 9 in only 1½ months, Class 8 could be attained in an additional 3 months, and so on. In effect, Class X members were on probation for almost 64 years.

Pakistan was readmitted as a Class IV member – capable of limited self-government, but not fully trusted, and with direct Imperial controls over their military and economy, with some travel restrictions, and with some trade penalties. It would require satisfactory membership within the Empire for 56 years before they were restored to their old status, though that could be reduced in certain circumstances.
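For readers who like to check the arithmetic, here is a minimal sketch (Python, purely illustrative and not part of the original History) of the tier timescales, assuming each stage takes twice as long as the one before it, starting from the 1½ months quoted for the Class 10-to-9 step. That doubling assumption reproduces both the “almost 64 years” probation for Class X members and the 56 years quoted for Pakistan’s climb back from Class IV.

```python
# Illustrative sketch only: tier-progression arithmetic for the ten-class
# citizenship model, assuming each upgrade takes twice as long as the last.

def months_for_step(step_index: int) -> float:
    """Months needed for the Nth upgrade (step 0 = Class 10 -> Class 9)."""
    return 1.5 * (2 ** step_index)

def years_to_full_membership(starting_class: int) -> float:
    """Total years from the given class up to Class 1 (full membership)."""
    first_step = 10 - starting_class   # Class 10 starts at step 0, Class 4 at step 6
    months = sum(months_for_step(i) for i in range(first_step, 9))
    return months / 12

print(years_to_full_membership(10))  # ~63.9 years -- "almost 64 years"
print(years_to_full_membership(4))   # 56.0 years -- Pakistan's Class IV term
```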

This was the real Pakistani Solution – the development of an entirely new Imperial Model which defined an avenue for nations to transition from Imperial Subject Nation to an independent member of a great community of nations. The reformation of Empire which had begun with the American Revolution was now complete – at least in theory.

Wheels Within Wheels

To ensure that Pakistan did not feel that it was being singled out, a full-scale review of all other Imperial nations was carried out. The Empire had been relatively stable for well over 100 years, so the results made zero difference to the typical Imperial Citizen.

The unusual timescales involved had been carefully chosen; Germany, the last great source of political instability within the Empire, had been forcibly brought to heel some 26 years earlier, and hence would have received a Citizenship upgrade to what was now designated Level VIII status some years earlier; the immediate effect was a reduction in the restrictions on their citizens that had been imposed following the Third Global War and an increase in their level of freedom. In 8 years, they could be expected to qualify for level 9 status, and in 2010, all going well, they would be returned to full independent membership within the Empire.

Importantly, those countries whose membership had been forcibly interrupted, like France, but who had refused to cooperate with the invaders, would be unaffected. It went completely unnoticed that this placed yet another aspect of Imperial Policy within the purview of the Civil Service, who now held considerably greater powers than they had possessed at the founding of the Empire, when their bungling had brought about the American War Of Independence.

Decimal Currency & The Metric System

In 1971 coinage throughout the Empire was decimalized, an action which the French and Americans had long advocated, and the metric system was imposed throughout the Empire, which the Americans had long opposed.

The archaic Imperial weights-and-measures system of pounds, feet, and miles, had long been a stumbling block in the growth of the Empire, but the fact remained that a lot of people found it difficult to understand the new scales. Weights and lengths were not a huge problem, volumes were reasonably easy to cope with after a period of adjustment; the problems stemmed from the larger units of area. The “Hectare” was a unit that people had trouble getting a grip on, and the older “Acre” remained dominant in popular usage.

NB: This is a reflection of the author’s personal biases and limits!

1972

1972 brought no relief to the Empire from the problems besetting it. The year began with a strike by coal miners which led to large-scale power cuts.

The union movement, which had been skating on thin ice for many years, had finally gone too far. There was already an Oil Crisis, due to the ongoing problems in the Middle East; as a consequence Coal Mining was currently defined as an Essential Industry. The striking workers were arrested and charged with treason and sabotage, crimes of such magnitude that many of their normal civil rights were reduced or suspended entirely.

Interrogations established three scales of penalty:

  • Those who merely obeyed the vote were acquitted;
  • those who actively voted in favor of the strike action were found guilty and given suspended sentences – which banned them from working in any sensitive or essential industry, prohibited them from running for political office, and placed a permanent black mark on their records which would aggravate any subsequent appearances before the Magistrates;
  • and those who actually planned and led the strike action, who had stated publicly that the current need for their industry’s services meant that they had the Government “over a barrel”, and hence had known the severity of their actions, were imprisoned for 12 years, and permanently reduced in Citizenship classification. They were banned from ever holding positions of leadership again, prohibited from certain careers, and placed in punitive taxation brackets.
    A special grant of child and spousal support was made to the families of those affected – to be paid on the date that their divorces from those affected were decreed final.

These decrees, promulgated by the Empress at the instigation of the Civil Service with the support of the House Of Lords, were enforced by the Military, and marked a turning point in Imperial Society, introducing some aspects of Martial Law into the day-to-day lives of the citizens.

The Middle East Nightmare

The bloody stalemate in the Middle East continued to devour the Imperial Military. It was recognized that before a final Victory could be achieved, someone would have to develop an entirely new military doctrine, the equivalent of the leap forward taken by the Australians in the Jungles of GW3.

Terrorism continued to grow in intensity and frequency, culminating in the murder of eleven Israeli athletes in the Games Village near Munich during the course of the 4-yearly Empire Games, and the hijacking of a Lufthansa airliner in a bid to secure the release of three terrorists being held over the Empire Games assassinations. This resulted in the implementation of strict anti-hijacking measures internationally, especially at airports.

A lesson in Economics

This was also the year in which Inflation became the economic buzzword. Economics is never easily understood and frequently seems to adopt positions contrary to common sense – a favorite example being that a strong dollar is bad for an economy. Inflation is all about the rate of growth of an economy, and in particular, the rate of growth in comparison to the manufacturing costs of the items that are on offer within the economy. If inflation goes up, it means that the manufacturing costs of the average commodity have increased by more than the real-world value of the pound, the dollar, the yen, or whatever. This of course means that it costs more of those units of currency to pay for the item in question.

When costs are dominated by factors other than the labor required, inflation is stable and largely irrelevant; but when the labor cost is the dominant factor it can create a feedback loop, a runaway chain reaction. The cost of living goes up, which creates pressure to put wages up so that people can continue to live the lifestyle to which they have become accustomed, but that puts costs up by almost as big a percentage as wages went up, so prices go up – raising the cost of living and starting the whole cycle over again.
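A toy model makes that feedback loop concrete. The sketch below (Python, illustrative only; the 0.9 pass-through figures are invented for the example and are not taken from the History) shows how a one-off jump in the cost of living keeps echoing through wages and prices once labor is the dominant cost.

```python
# Toy wage-price spiral: a one-off cost shock keeps feeding back into prices
# whenever wages chase the cost of living and labor dominates costs.
# The 0.9 pass-through figures are invented purely for illustration.

def spiral(initial_shock=0.05, wage_passthrough=0.9, labour_share=0.9, rounds=6):
    price_rise = initial_shock
    total = 0.0
    for year in range(1, rounds + 1):
        total += price_rise
        wage_rise = wage_passthrough * price_rise   # wages chase prices
        price_rise = labour_share * wage_rise       # prices chase wages
        print(f"year {year}: prices up {total:.1%} so far")
    return total

spiral()
```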

The other half of the inflation story comes from the difference between economic growth and fiscal growth. Economic wealth comes from increasing the total value of possessions under control in the economy. Discovering a new oil well or mineral deposit adds real value to the economy. Simply printing more money produces the superficial appearance of economic growth, without actually increasing the wealth in the economy – which means that the wealth represented by a unit of currency shrinks. In terms of goods, the value of the currency shrinks, which means that it costs more to buy something.

In the early part of the 20th century, currency values were fixed – a dollar represented so much gold, or whatever standard the currency was pegged to. But after World War II it was decided that this was unrealistic and inflexible, and that the currency should instead have an unspecified, floating worth, with the amount of a given commodity that you could buy with a dollar changing over time.

In theory, this meant that the economy was not forced to grow only by the amount of resources discovered during the year; “soft growth” like increased employment, new materials, new manufacturing techniques, new products, or whatever, could be taken into account. On paper, this was all well and good; it was only when married to another economic reality that the new evil became inevitable. That economic reality was the growth in employment vs. the growth in resources, and allied with a floating currency, this spelt trouble.

If the population grows 5%, and the level of employment stays at the same level, it means that there are more jobs than there were this time last year. That demands that the amount of cash circulating through the economy is a certain amount per head of population – regardless of the increase in wealth. This produces inevitable fiscal growth, regardless of the economic growth, which shrinks the resources represented by a single unit of currency – and sparks the endless cycle of rising prices and shrinking value of the Pound – in other words, inflation.

Prior to the 1920s, economic growth from new resources vastly exceeded the fiscal growth. Even afterwards, up until the 1950s and 60s, the economic growth that resulted from adding more workers equaled or outstripped the fiscal growth of the economy, so inflation still wasn’t a problem. But as it became harder to discover and exploit new resources, that began to change. If the population growth is 5%, the value of resources discovered and mined has to grow by at least 5%, or the economy begins to inflate. If 1,000,000 tons of wheat were grown this year, 1,050,000 tons have to be grown next year, and 1,276,281½ five years from now – and so on. The laws of compound interest demand that all economies begin to suffer from inflation as the worlds on which they live are fully exploited and their populations rise. It was an inevitability from the moment of the invention of currency.
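To make the compound-growth point explicit, here is a minimal sketch (Python, illustrative only) of the wheat example: at 5% annual population growth, this year’s 1,000,000 tons has to become roughly 1,276,281½ tons five years out just to keep the currency stable.

```python
# Compound-growth check for the wheat example: resources must grow at least
# as fast as the population, or the economy begins to inflate.
population_growth = 0.05
tons_now = 1_000_000

for year in range(1, 6):
    required = tons_now * (1 + population_growth) ** year
    print(f"year {year}: {required:,.1f} tons required")
# year 1: 1,050,000.0 tons ... year 5: 1,276,281.6 tons
```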

Economic high fever

But in the 1970s, much of this simple progression of cause-and-effect was poorly understood, if at all; Inflation was seen as inherently bad, to be stopped at any cost. The economy was perceived as out of control, and like anything in that condition, was a train wreck looking to happen. The result was a growing lack of public confidence in the economy, and the 1930s had already shown the consequences of that – the economic disaster predicted by the doomsayers.

To keep the economy running, it was necessary for governments to print more and more money, representing wealth that the economy just didn’t have. The alternative was a 1930s-style crash, followed by another great depression – something that was in no-one’s best interests. To provide the illusion of health, and shore up that public confidence, national governments began borrowing vast sums of fictitious money from each other, and running up huge deficits. This cushioned and slowed the crash – at the cost of rampaging inflation.

Meanwhile, bank interest rates, which in theory were pegged to the growth in wealth, but in reality reflected fiscal growth, had gone through the roof. It was not impossible to reap an interest rate of 25% per annum.

Larry Hagman, 1973 publicity photo by ABC Television, USA. Copyright on this photograph may persist in some locations.

The New Entrepreneurs

Added to this was the growth in new industries driven by and servicing new technologies, which made it possible to acquire a fortune seemingly overnight. Just as in the 1920s, there was a get-rich-quick mentality in the air, and a new breed of entrepreneurs emerged from it. It was these entrepreneurs who were the ultimate accumulators of the fictitious “wealth” that the governments were pumping into their economies.

These colorful figures quickly found their existence reflected in the entertainment media. The programme that most reflected their influence – and the levels of underhandedness to which they had to stoop to keep their business empires intact against the ravenous appetites of others of their ilk – was “Dallas”. The soap-opera lives, trials, and tribulations of the Ewing family more closely reflected the reality of the world around them than even those living in that era appreciated.

It was in November of 1972 that the Empire, whose Civil Servants were the only ones in a position to be able to perceive the looming disaster of the overall international economy, imposed a 90-day freeze on wages, rents, and prices. This was the first of a number of attempts aimed at reining in the impossible levels of inflation. If they had been able to maintain it for years – and reduce the birth rate at the same time – it would probably have succeeded. But because it could not be held long enough, and did nothing to address the population growth vs. wealth growth issue in the long term, it was doomed to failure. The great unknown was when the Crash would occur – and how bad it would be.

Revelations in the 70s

The weather had generally been unreasonably cold through the latter half of the 20th century so far, a period of twenty-odd years. In 1962-3 there had been no frost-free nights in the northern hemisphere from Dec 22 through to March 5th; in January, cold weather forced the cancellation of virtually all sports.

It was through the use of weather satellites in attempting to understand the global weather patterns that the first significant breakthrough came in understanding Mao technology, when in 1972 an enterprising meteorologist attempted to chart the entire “life-cycle” of a single weather front.

Of course, this meant using satellites to track the front beyond the Imperial Borders, into Mao-controlled territory, and furthermore, involved the combining of data from multiple different satellites.

His report created a huge storm within the defense and intelligence communities, as it was shown that the Mao had clearly mastered some form of weather control. Severe weather fronts seemed to break apart as soon as they crossed into Mao territories, becoming mild and regulated. Simultaneous with this effect, a new weather-front formed off the coast of Japan, without perceptible cause, whose growth precisely correlated in timing and intensity with the diminution of the original front. This new front would then make its way through the normal climatic channels, picking up moisture from the Pacific and chilling in the arctic air, before sweeping south to engulf the rest of the northern hemisphere. In effect, the Mao were shipping their bad weather to the rest of the world.

Environmental War?

Was this an act of War? It clearly benefitted the Chinese at the expense of the Empire, so at the very least it was an unfriendly act; but since there were demonstrable benefits for the Chinese, and there were no regulations within the law that were directly comparable, it was not an easy debate to resolve.

Ultimately, it fell to the Empress Elizabeth to make the decision and set the precedent. She considered, and then determined that “…insofar as The Empire at no stage takes into account the consequences on, and well-being of, the Chinese Peoples in determining its policies (other than those which directly relate to relations between our Empire and theirs), the Empire has no right to demand this consideration of the Chinese and their rulers. This is, of course, an issue of some sensitivity, and we should all strive in future negotiations with the representatives of China to ensure that the welfare of all becomes such a consideration.”

It’s important to bear in mind the context of the times when considering this decree. It was the early 1970s, and the unwanted byproducts of the Empire’s industrial culture were only beginning to show as problems on the Imperial Agenda. Concerns about the environment and pollution were beginning to rise for the first time, although these were still seen as regional issues; this waterway being affected by that factory, and so on. By generalizing and aggregating these into a global perspective, Elizabeth showed publicly for the first time just how sharp a mind resided on the throne at Buckingham Palace; it would be decades before Imperial politicians and policymakers would reach the same perspective.

Risk Assessments

With the determination that this was not to be considered an act of aggression, the focus shifted to tracking and analyzing Mao climate-control and its impact on their society and capabilities. These studies fell into two areas; one group studying the effects of an absence of severe climate, and the other analyzing the way the Mao utilized less severe weather.

The first group came in with their results quickly; they already had considerable raw data (thanks to the enterprising Meteorologist, who was given a Nobel Physics Prize for his discovery). The first question was how long this had been going on. It was decided that because there was no evidence to the contrary, it would have to be assumed that this had been the cause of the unusually chill winters that had been experienced in the Empire periodically over the last several centuries. Any impacts on Chinese society would be firmly entrenched by the modern era.

The basic principle of insurance is to prepare for the statistically improbable by collectively paying a fraction of the costs of repairing the damages that result from an incident.

An insurance insight

If, in a year, 1 home out of 1000 insured homes will burn down, for example, then the fundamental cost of a year’s insurance against fire is 1/1000th the cost of a new home. If the home costs 100,000 Pounds, the policy would cost 100 pounds a year, plus a fair share of the administrative overheads, and any profit margin. This imparts a degree of stability to life that is otherwise impossible to achieve. Insurance companies generally only run into trouble when they lose sight of the need to build up adequate capital reserves to meet future needs and begin disbursing their reserves as increased payouts to stockholders, increasing the paper value of their worth without increasing the funds available for their use. The fact that the people who stand to gain the most from this shortsightedness are also the people who select the corporate leaders of an insurance firm would lead to problems in the 90s.

The biggest problems in assessing the risk in the 70s came from those statistically remote threats that generate multiple claims – floods, bushfires, earthquakes, and the like – and the assumption that the countdown to such events is set at zero at the time insurance first comes into effect. If there is a 1-in-100-million chance that an entire suburb will be wiped out by a flood in any given year, then simply dividing the rebuilding cost up over 100 million years produces a very low component – less than a penny per year. But if such an event happens after only 10 years of the insurance being in operation, then only ten one-hundred-millionths of the cost have been saved towards such an unlikely event. Living in an uncertain world demands an awareness of risk, and forward planning that arises directly from the need to be ready for the improbable.
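A quick sketch (Python, illustrative only; the 100,000-pound home and the 1-in-1000 and 1-in-100-million odds are the figures used above) shows both halves of the argument: the expected-loss premium for the house fire, and how small a fraction of a once-in-100-million-years flood has actually been reserved after only ten years of collecting for it.

```python
# Expected-loss pricing for the house-fire example, and the reserve shortfall
# when a one-in-100-million-year event arrives early.

home_value = 100_000          # pounds
fire_probability = 1 / 1000   # one insured home in a thousand burns per year
print(home_value * fire_probability)   # 100.0 pounds/year, before overheads and profit

flood_probability = 1 / 100_000_000   # suburb-destroying flood, per year
years_of_cover = 10
fraction_reserved = years_of_cover * flood_probability
print(fraction_reserved)      # 1e-07: ten one-hundred-millionths of the rebuilding cost
```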

Insurance companies soon learned to absolve themselves of these risks, passing the responsibility for dealing with “Acts Of God” to the Government. Slowly, over time, the reduction in capital reserves would introduce more and more restrictions into the coverage provided. No conspiracy was required; one company would make a small restriction, others would imitate it, and then another would inch the line of coverage lower still. Corporate greed can overturn the principle of competition for a customer’s trade with no need for secret negotiations.

A Glimpse Into Mao Society

But the Mao were not faced with those risks. In fact, they were not faced with any environmental dangers to property. Without those risks, spontaneity, flexibility, and a willingness to take risks would become far more prominent. The Chinese would be far more inclined to react to the here-and-now and not to the possible problems of tomorrow. To Imperial eyes, they would be tragically short-sighted; this not only explained much of their military conduct, but their reasons for war, and their conduct at the negotiating table. There would be a trend towards fatalism and a belief in destiny – that events would have a life of their own which could not be set aside – and a trend to react, not act. Awareness of these psychological elements would potentially be of huge benefit in future relations with the Mao.

The other primary effect of preventing serious climatic effects over a long period, it was determined, would be another side of the same coin: the Chinese would have no need to hold resources back in preparation for such unwanted occurrences, permitting a greater utilization of what they already had. Since the need to concentrate resources and then direct that concentration to the areas in which it is currently needed is one of the triggering factors for the growth of centralized communities, this began to explain some of the geopolitical landscaping within the Mao regime.

These results were also indicative of the social consequences of Mao technology. Unlike that of the West, Mao science was clearly an elite study, with few individuals able to understand and utilize the technology effectively – facts that had been known since the Second Global War. This implied a feudal structure to their society, with a poorly-educated underclass dominated by a specialist human middle-class and a well-educated alien upper class. This lower class would have little civil liberty; their lives would be tightly regulated. A Caste system was also distinctly possible.

There was a clear resonance in these findings with Eastern Philosophy and religion, with its concepts of reincarnation in the material world, acceptance of the status quo, and pre-destiny.

Further revelations in that area followed the analyses of the other group. They found that even minor weather systems were regulated and controlled by the Mao, travelling down restricted passages which fed into a series of dams and waterways. The entire Mao Empire received its water supply through irrigation. Again, the most significant aspect was the lack of wastage; in most countries, a significant percentage of rainfall takes place in areas outside the catchment zones for the domestic water supplies. By regulating where it rained, and how much, the Chinese were able to utilize virtually 100% of the available supply.

This in turn permitted extremely intensive agriculture – just add fertilizer. It went a long way to explaining some of the patterns that had been observed but not understood in Chinese agriculture; crop-intensive areas alternated in vertical strips down the map with livestock-intensive areas. The Chinese clearly practiced empire-wide crop rotation with natural fertilizers being provided in off-seasons by cattle, buffalo, and other domesticated animals. This was indicative of a tightly-regulated society, one which changed only slightly from one generation to the next; but (as was revealed by the earlier analysis), one capable of rapid and total response to changing circumstance. Once again, the pattern of being reactive, as opposed to being active, was revealed.

At last, thought the Imperial analysts, they were beginning to understand their enemy. They were learning the right questions to ask – and why some of their earlier questions had failed to yield answers of value.

Screenshot of NASA’s World Wind globe software showing China.

Reality Check: You didn’t think it would be that easy, did you?

In the early 1970s, satellite technology at last achieved the breakthrough that the strategists had been waiting for.

Not only would real-time imaging – or even rapid-time-slice photography – give valuable information about the technology and society of the Mao, it would also provide the intelligence community with a vital reality check of the deductions made from the weather control analyses.

The scientists and spooks were intently interested in the first images, of a farm being prepared for spring planting. They expected to see simple ploughs being drawn by cattle or buffalo.

Instead, they saw furrows being ploughed with no visible agent other than a single individual standing at one end of the field. This person pointed, and a furrow began at his feet and extended the length of the field; he would then take a step to one side and repeat the process. In the meantime, other people – who appeared to be dressed as simple farmers – walked along the newly-ploughed furrow and dropped seeds. When the entire field was ploughed, a gesture from the specialist brought rain clouds from the nearest weather-channels to provide a good covering of rain for the field, and he moved on to the next field in sequence. This evidence confirmed the broader speculations and deductions of the analysts while at the same time showing the potential for error in detailed forecasting of the impact on Chinese society. No oxen, no ploughs; but an essentially medieval/feudal social structure, with a technological elite.

As I said last time, if it’s not clear to any Gamer reading this that the Mao are using magic, you aren’t trying hard enough. Unfortunately, science doesn’t admit that magic exists…

A rude shock for Physics

The physicists were more outraged than anyone else.

One of the sacred cows of modern physics is that action at a distance, without some transmitting medium through which an effect can travel, is impossible. One of the hallmarks of Einsteinian Relativity is that it describes space itself as the medium which carries gravitational effects, eliminating one of the major examples that appear to contradict this belief. Yet the Mao ploughman violated this principle with a casual gesture.

Their fundamental assumptions had to be completely wrong to permit this to occur. They immediately linked this phenomenon with the other impossible things the Mao were clearly achieving – weather control, volcanic eruptions, some manner of instantaneous travel – and decided that the only way to avoid abandoning every principle on which all western technology was based was to assume that there was some previously-unknown medium which Mao technology manipulated.

In other words, just because they couldn’t see it, didn’t mean that there was nothing there to see – they just had to develop the right sensors, was all.

Vitally, this gave them a means of identifying – eventually – the nature of Mao science. All that was needed to identify a field of research as the one which the Mao had mastered was to look at a spring planting, (or probably a harvest, for that matter). If a sensor saw something, it was the right field of study; if not, it was something the Mao weren’t using – and hence, something that might give an advantage against them if applied to military technology.

Either way, the late 1970s set the stage for what was hoped to be a new boom in military technology. That meant funding, and the only thing that matters more to a physical scientist than his or her scientific principles is the funding which gives the opportunity to explore those principles. (Some would argue that funding is less important, overlooking that observations are expected to defy incomplete scientific theories, necessitating the revision of scientific principle – but without funding there are neither observations nor ability to revise. A problem with the science can always be corrected, if the funding is there.)

Personal Side-comment: This is the great danger posed by the ideological determination of what scientific research should be funded and what should not. Modern religious, social, and political zealots are great at setting up “scientific” research that can only find the answers that the research was designed to find in the first place, confirming predetermined ideas and carefully eliminating results that do not conform. True science studies the actual data and uses it to test the best understanding of actual reality in hopes of discovering a new avenue of insight. “Research” which is not conducted in this fashion is not science. It was true of Lysenkoism, it’s true of most “Christian Science”, and it’s true of Global Warming.

There’s a parallel to the early reception of the theory of Continental Drift, which is now widely accepted, but was initially rejected for reasons both good and bad. Because the advocate of the theory, Wegener, was not a Geologist, many dismissed his findings. The fact that Wegener had no explanation for the motive force and didn’t try to invent one should have made his observation of the fact that continents DO drift more compelling, not less. But it contradicted the accepted dogma of the time. It was only in 1958, forty-six years after Wegener’s proposal, when the theory of plate tectonics was introduced, that the ‘scientists’ involved began to accept the theory. You can read more on the story at this Wikipedia page.

Right now, we are in the situation in which Global Warming is accepted dogma and attempts to disprove it are stifled. The theory that it is NOT happening is the equivalent of the theory of Continental Drift. It might be happening, it might not – but the evidence is so biased by the dogma that it’s hard to accept it at face value.

I’m a skeptic on the subject because political ideology is being used to cut funding to attempts to disprove the theory – I’d be far more willing to accept the possibility if those in power were not so intent on shutting down any opposing view. It’s not science, it’s dogma. Feel free to disagree.
