Currently, as hinted at in the previous post, creatures are naturally aware of every object in their local area. They are, for all functional purposes, locally omniscient.

This isn’t too bad in terms of how the game plays: we are privy to every option they have, so having them choose from all those options makes them seem more intelligent and relatable from our perspective. If anything, it’s a good thing.

It is bad, however, in terms of simulating evolution: it completely negates the selection pressure to develop sensory organs. If nothing else, they need some sort of perception system for that reason alone. I think the system I’ve designed and am beginning to implement gives me the best of both worlds, but… we’ll have to see.

So… the system focuses on 3 senses: smell, sight and hearing. Touch is assumed, given that creatures react when they bump into things.


“Good news everyone! I’ve invented the smell-o-scope!”

This is the equivalent of the aforementioned local omniscience. All creatures will be able to smell and approach every object in their vicinity. At this stage we won’t have smell trails or tracking, nor will we have dedicated smell apparatus: it’s simply assumed that this is how blind creatures find their way to the food in their area.

Navigating by smell does come with a few inconveniences to offset its omnipresence, however:

* a creature navigating by smell cannot run,
* they must also pause every few seconds to sniff the air, and
* a sine curve will be added to their direction of movement, its amplitude slowly reduced as they close in on the target.
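In code, the sine-wander might look something like this (the constants and method names are placeholders, not what will necessarily ship):

```csharp
using System;

public static class SmellNavigation
{
    // Maximum wander amplitude in radians, and how quickly the
    // creature weaves. Both are placeholder values.
    const float MaxWander = 0.8f;
    const float WanderFrequency = 2.0f;

    // Heading for this frame: the direct bearing to the target, plus a
    // sine offset whose amplitude shrinks as the creature closes in.
    public static float WanderingHeading(float bearingToTarget,
                                         float distanceToTarget,
                                         float startDistance,
                                         float time)
    {
        float falloff = Math.Clamp(distanceToTarget / startDistance, 0f, 1f);
        float offset = MaxWander * falloff * (float)Math.Sin(WanderFrequency * time);
        return bearingToTarget + offset;
    }
}
```

The falloff term means the weaving dies away on approach, so the creature still homes in cleanly at the end.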

Minor inconveniences at best, but given their short lifespans, taking longer to do things will have a disproportionate impact on their survival. This should incentivise the creatures to develop sight or hearing, to avoid having to deal with these behaviors.


Just because you’re paranoid doesn’t mean the voices in your head aren’t people 3 kilometers away talking about you.

This is the creatures’ ‘reactive’ sense. When they are being hunted, approached by a potential mate, or targeted for any other reason, they will receive a Ping() from the other creature. Their sense of hearing will determine whether they respond to this ping or disregard it.

In the long term, these pings (and by extension, the sense of hearing) will be the centerpiece of communication and various social behaviors: for now, they merely facilitate the fight-or-flight response.

Hearing can also be used to detect prey and mates in the first place. Of course, the sense comes with a pretty obvious downside: it can only be used to detect living creatures. Trees and corpses don’t make noise, and will have to be seen or smelt out.

Despite not having ears, P. Specium will actually have a pretty good sense of hearing: they have a large ground contact area, so it makes sense they would be able to pick up vibrations through the ground. If/as they develop legs, they will have to find alternatives or lose some hearing ability. So I guess I’m going to be modelling a bunch of disembodied ears. Good times.

If the things the facial feature placement system frequently does with eyeballs and horns is anything to go by, the resulting faces will be exactly as horrifying as many of you are currently imagining.


I see dead people.

The “best” sense, in terms of having the fewest restrictions: creatures navigating by sight are not inconvenienced, and sight can detect any object in its range. Sight is less reactive, though: it doesn’t have the omnidirectional capabilities of hearing.

In the current version of the game (0.7.0), the sight system is quite poor: a creature will only detect something if that thing is in its FOV at the moment it updates its perception. Given that the FOV can be quite narrow, and the perception updates can be many seconds apart, creatures regularly miss obvious things happening nearby.

Ironically, the new system will make the perception updates even more irregular: they will only happen when a creature needs to seek something (food, a mate, etc.) or receives a ping. This is necessary to keep the frame rate at a reasonable level, but by performing these updates reactively, via pings, sighted creatures should hopefully be a lot less oblivious to the world.

Additionally, a new behavior will have to be implemented: “scan”. A creature seeking something, or responding to a ping outside its vision range, will deliberately look around itself rather than just reading in the objects in its FOV at the time of update, getting a full 360-degree picture of the environment.

Since scanning negates the restrictions of a limited FOV, higher FOVs will instead allow creatures to perform these scans more rapidly, shaving seconds off their reaction and search times. It will be up to natural selection to find a balance between FOV, range, and the energy it costs to grow hundreds of massive eyeballs from every part of your head.
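As a sketch, scan time could just be the number of FOV-sized ‘slices’ needed to cover a full circle, times a fixed per-slice cost (the per-slice time here is a placeholder value):

```csharp
using System;

public static class ScanBehavior
{
    // Time to read in one FOV-sized slice of the surroundings
    // (placeholder value).
    const float SecondsPerSlice = 0.5f;

    // A 60-degree FOV needs 6 slices to cover the full circle; a
    // 180-degree FOV needs only 2, so wider eyes mean faster scans.
    public static float ScanDuration(float fovDegrees)
    {
        int slices = (int)Math.Ceiling(360f / fovDegrees);
        return slices * SecondsPerSlice;
    }
}
```

So narrow, long-range eyes pay for it in scan time, and wide-angle eyes pay in range: exactly the kind of trade-off natural selection can chew on.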

… or just one big one. (Src)

I think that image is a good place to end this post, don’t you? He looks so dapper. So very dapper.



Behavior Tree – Glory Shot

In all its magnificence:

Click to ultrahugenate


So, where to from now?

Well, there are still a few communicative issues I need to address. The Behavioral Sprites are half done: fading them over time was a good idea, since you can now tell what they’re up to without having a page overwhelmed with opaque thought bubbles. They still need a few more additions: most notably for migration and seeking a mate.

The creatures really need an animation to signify that they’re sleeping, but this is a tricky one: some creatures sleep standing up, some lie on their side or their back or their belly… what works well for an upright biped may not translate well to a splayed out quadruped.

But those are relatively trivial things. There’s still a much bigger item to address: perception. Currently, the creatures have a sort of localised omniscience: 0.8.0 will include a much more complex system than that, and I will go into my evil plans for it next post.

I know what you’re thinking, but the better question is: why wouldn’t they be evil?



AI Rework – Stupid is Smart

Moving on!

One of the more interesting aspects of revising the AI has been learning that my prior tweaks, meant to keep Primum specium from going extinct despite its stupidity, had formed a strange mutual reliance between stupidity and survival.

That’s right. Making P. specium smarter is actually making them worse at survival, at least in some very specific areas. So I guess Dumb Will Triumph Because Smart Is

Um… wait, can I start over?

This is most noticeable in how the new, supposedly intelligent P. specium treats dense ‘fields’ of food, like grass tufts.

Previously, a single blind herbivore would move from tuft to tuft. This was a very simple mechanic: they’d start moving, bump into a tuft, and start eating it. This made them fairly efficient grazers: though they’d occasionally miss thanks to blindness, in a dense field of grass they would usually find a fairly close tuft to eat.

Now they’re smarter. They calculate how much energy they’ll get from all the food sources in their vicinity, subtract the cost of walking to each, and select the best option. This usually means a corpse or a plant with high energy. Grass tufts are inefficient and don’t have much energy, so even if they’re close, they’re rarely targeted.

The end result? Creatures deciding that the dozen low-energy grasses next to them are worthless, and walking miles for a single high-energy corpse. This is a very poor survival decision, especially when 30 other creatures are also going for the same high-energy source, and are already closer to it.

Though it does make for some pretty patterns in the swarms of creatures…

They're eating that plant! And then they're going to eat me! Oh nooooooo!


I think I’ve invented a whole new level of stupid. Where before they made stupid decisions simply because they didn’t think, now they make stupid decisions because they think too much.

So the question is… do we make the creatures even *more* intelligent, having them think ahead to the next food source, and the food source after that (a recursive, branching algorithm that sounds like a CPU nightmare), or do we simply tell them to overcompensate for distance, forcing them to target closer objects?

Both solutions sound like they would work, but neither is particularly attractive: the first for CPU reasons, the second because this update is supposed to be about *intelligence*, not cheap tricks to make them act stupider.

So, can we take a third option here? A food source’s attractiveness doesn’t have to be a simple measure of the energy it contains: it could also factor in the number of similar objects in its area, and even the number of creatures targeting it.

These are imperfect proxies, but that’s actually a really good way to think of them. Proxies, imperfect representations of reality, are a far better analog for a biological thought process than rationally pre-calculating the consequences of an action. The creatures aren’t robots (which is a really weird thing to say in this context), and they aren’t perfect. We’re not trying to stop them from making stupid decisions, just make the stupid decisions less blatant.

Of course, “blatancy” is open to debate. For example, for the child individually, killing and eating its parent is actually a very logical decision: the parent is likely to have quite a bit of energy and be weakened from pregnancy, and eating it vastly improves the child’s chance of reproduction. The obvious problem with this Randian survival strategy is that while it might be logical for the individual, it is terrible behavior for the species as a whole.

The initial and obvious proxies were energy content, walk cost, battle cost, efficiency and diet. I’ve since implemented a ‘food density’ proxy that allows them to prioritise denser areas of grass, and a ‘competition’ proxy that reduces the attractiveness of food that is already being approached or eaten by other creatures. I’m also looking into an ‘empathy’ proxy, based on genetic distance, to reduce the amount of familial murder and cannibalism. Plus I’m screwing about with the way the thought bubbles work. I just can’t leave well enough alone, can I?
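Roughly, the proxies might combine like so (the weights here are placeholder values, not the ones the game will actually use):

```csharp
using System.Collections.Generic;
using System.Linq;

// Placeholder representation of a potential meal; the real game pulls
// these numbers from creature and tree stats.
public record FoodSource(float Energy, float WalkCost,
                         int NearbySimilar, int Competitors);

public static class FoodProxies
{
    public static float Attractiveness(FoodSource f)
    {
        float baseValue = f.Energy - f.WalkCost;
        // Food density proxy: a tuft surrounded by similar tufts is
        // worth more than its own energy suggests.
        float density = 1f + 0.25f * f.NearbySimilar;
        // Competition proxy: every rival already heading there makes
        // the target less attractive.
        float competition = 1f / (1f + f.Competitors);
        return baseValue * density * competition;
    }

    public static FoodSource Best(IEnumerable<FoodSource> options) =>
        options.OrderByDescending(Attractiveness).First();
}
```

With these in place, a dozen clustered tufts can out-compete a distant corpse that thirty rivals are already racing toward.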

Oh well, it’s all good. I may not succeed in making them smarter, but if nothing else, their stupidity should be more complex and entertaining from here on out.




AI Rework – Behavior Tree Level 4

Note: the following is written in-character by a different character than my usual in-character character, because my usual in-character character was floating in space for a number of weeks. This also explains my recent absence both here and on the forums. I’m going to say the reason I was floating in space was because I required perfect isolation to hear the grim horrorwhispers of my eldritch masters, and not because I accidentally pressed the wrong button and vented the contents of my orbital lair into space. That would just be silly. Apropos of nothing, however, WHY DO WE EVEN HAVE THAT BUTTON?!

(alternatively, see the forums if you want the boring “real life” reasons. Be warned, they’re boring. Like, really really boring. Actually, don’t even bother. Go read Homestuck or something instead)

Moving on, this particular fictional in-character is May, my protégé, research assistant, unpaid minion and occasional test subject. My most sincere apologies for the lack of Cthulhu Mythos references and trendy internet colloquialisms. May considers memes to be frivolous and Lovecraft overrated. She is of course entitled to her opinions, just as I am entitled to extensively test the psychological effects of being buried in live octopuses while the top 100 most commonly used phrases on 4chan are shouted through a megaphone.

Hello all. We’ve had a minor… ‘incident’ involving your usual blogger, so I will be filling in on his behalf for a brief duration. I apologise for any differences in tone or content that may result from this, and thank you for your patience.

The original design for the SeekFood routine used a conventional Selector. Thus, a creature that prioritised Hunt would only fall back to an alternative, like Scavenge, if Hunt failed.

Due to a near-pathological failure on the part of the developer to think through the consequences of their actions, this design leads to a logical problem, briefly discussed in the previous post: there are cases where Hunt won’t actually fail, but is still an ineffective or illogical decision: for example, when a creature has a scavengable corpse in close proximity, but chooses instead to hunt a creature some distance away.

Solving this problem required a significant rewrite, and the complete removal of quite a large branch of the behavior tree: specifically, the FindClosestMatch routine mentioned in AI Rework – Behavior Tree Level 3.

As this removal effectively invalidates the last few blog posts of this series, I find it necessary to apologise to any reader who was expecting a coherent narrative to this series.

In the place of the FindClosestMatch routine, we were obliged to implement a different system, as shown below.


This system involves some notable, complex changes, which I shall attempt to elucidate for your benefit. If you would prefer to avoid a somewhat technical discussion, feel free to skip to the conclusion.


IEdible is a C# Interface, a code construct which provides a ‘description’ of certain functionality within different classes, and can be referenced in place of the objects themselves. Since both Creatures and TreeObjects are edible, they can both implement this interface, and can thus be stored in the same List object.

This allows us to sort all edible objects, rather than “all trees” or “all creatures”, and thus replace the type-specific FindClosestTree and FindClosestCreature with a more generic FindClosestEdibleObject.
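A minimal sketch of the idea (the member names are illustrative, not the actual class definitions):

```csharp
using System.Collections.Generic;
using System.Linq;

// Anything edible exposes a position and an energy content. The member
// names here are illustrative, not the game's actual API.
public interface IEdible
{
    float X { get; }
    float Y { get; }
    float Energy { get; }
}

public class TreeObject : IEdible
{
    public float X { get; init; }
    public float Y { get; init; }
    public float Energy { get; init; }
}

public class Creature : IEdible
{
    public float X { get; init; }
    public float Y { get; init; }
    public float Energy { get; init; }
}

public static class Perception
{
    // One generic routine replaces FindClosestTree and
    // FindClosestCreature: both types live in the same List<IEdible>.
    public static IEdible FindClosestEdibleObject(
        List<IEdible> edibles, float x, float y) =>
        edibles.OrderBy(e => (e.X - x) * (e.X - x) +
                             (e.Y - y) * (e.Y - y)).First();
}
```

Sorting on squared distance skips the square root; trees and creatures now flow through the exact same routine.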

While an intriguing change, this isn’t exactly where we are headed with this rewrite. It’s merely an important middle-step before we implement the next major change.


This is the list of edible trees, corpses and creatures mentioned before. To avoid performing calculations on the entire set of them, it is populated when the creature requests it, filtering out the easily-eliminated options like objects that are too far away, creatures that are inactive and trees that are too large to reach.


This is the foundation of the new food seeking routine. Rather than simply looking at distance, creatures will attempt to logically determine what the most rational food source to approach is.

To sort the potential food sources from most-attractive to least-attractive, a creature generates an ‘optimality’ value for each. This value is measured in joules: it is literally a measure of how much energy a creature would gain from eating a particular food source, minus the energy cost of walking to the target and the damage cost of fighting it (if applicable).

These values are used relatively and are far from perfect, as the calculation doesn’t factor in the metabolism and climate costs associated with the time it takes to reach targets, that they may move, that the prey may flee rather than fight, that someone else might reach it first, or that it may be too much to eat in one sitting.

While it’s tempting to continue developing the process until it acknowledges all of the above factors, it’s easier on both the developer and the CPU to simply fudge the figures a little. This means the creatures aren’t perfectly rational: much like biological organisms, they favor conclusions that can be reached rapidly: intuition over logic.
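In sketch form, with placeholder names and costs, the optimality calculation looks like this:

```csharp
using System.Collections.Generic;
using System.Linq;

// A potential target, with placeholder fields: a tree or corpse has a
// FightCost of zero, live prey does not.
public record Target(float EnergyContent, float Distance, float FightCost);

public static class SeekFood
{
    const float WalkCostPerMetre = 2f; // placeholder value

    // Optimality in joules: energy gained, minus the energy spent
    // walking there, minus the damage cost of fighting (if applicable).
    public static float Optimality(Target t) =>
        t.EnergyContent - t.Distance * WalkCostPerMetre - t.FightCost;

    // Most-attractive first. Getting this order backwards is exactly
    // the "eat the worst food in the area" mishap described below.
    public static List<Target> Ranked(IEnumerable<Target> targets) =>
        targets.OrderByDescending(Optimality).ToList();
}
```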


The end result of this process is that creatures will attempt to sate their hunger logically, by analysing their environment and selecting the food source that provides the best energy-to-cost ratio. This change should make the creatures far less prone to poor decisions than they were previously.

There were a few haphazard starts: the developer set the sort order backwards (so creatures attempted to make a meal out of the *worst* food source in their area, rather than the best), and the Movement Energy Cost was subtracted on a per-time basis rather than a per-distance one (a curious error that may well be affecting evolution in 0.7.0, since it means an increase in speed actually *reduces* the cost of travelling a set distance). These errors were merely due to the mechanical ineptitude of the developer, rather than any foundational design problem, and once they were fixed, the solution worked.

It has a curious effect on the simulation, significantly reducing the perceived randomness of the creatures’ behavior and introducing noticeable patterns of movement. Highly attractive food sources will draw in creatures from some distance away, like moths to a flame. In addition, creatures will opportunistically hunt weakened members of the population. While children killing and eating their parents remains a problem with this new AI, the presence of sound logic behind the behavior (children are born hungry, and parents have a lot of biomass and are often weakened by pregnancy) makes it notably more interesting.

Thank you for continuing to read this blog despite, or perhaps because of, the demonstrable insanity of its author.

Holy crap, those italics. Learn to close your tags, May.

To make up for my absence, here is a highly-contagious mind-virus disguised as an extended Lovecraft reference.

That’s right. I know what my readers want.

“Uh, sir? I’m quite certain I don’t actually wish to know the answer, but… what exactly are you doing?”

“Wait, you’re not in the… aw crap. Have I been shouting 4chan memes at a tank full of octopi for 3 hours for nothing?”

“… yes. Yes, I believe you have.”


AI Rework – Hunting

So, here’s where we left our “Seek Food” behavior tree (simplified somewhat):


new Sequence(
    new FindClosestTree(),
    new WalkTo(ClosestTree),
    new Eat(ClosestTree)
);

… which was then upgraded to include…


new Selector(
    new Sequence(
        new FindClosestCreatureMatch(
            new Inverter(new IsAlive())),
        new WalkTo(ClosestMatch),
        new Eat(ClosestMatch)
    ),
    new Sequence(
        new FindClosestTree(),
        new WalkTo(ClosestTree),
        new Eat(ClosestTree)
    )
);

There is a problem here I haven’t quite worked out the best way to solve (the predilection of creatures to scavenge a distant corpse when surrounded by perfectly browsable trees, because of the way the Selector at the top works), but right now I’m working on hunting.

Hunting itself is (at least at the moment) a relatively simple behavior: find something alive, walk over to it, make it not be alive, and eat. The complexities arise not from the plan itself, but from the interaction of the other emotions. Fear and anger are the two arbiters of the fight-or-flight response, and hunting needs to influence these emotions in both the hunter and huntee.

This is where a brand new system comes in:


Ping is the way information is transferred from one creature to another. Eventually, everything from “I’m hunting you” to “I need a mate” to “I require aid” will go through the ping system. It works in a relatively simple manner: creatures Ping others when they have information that might be of relevance to the other creature.

new Sequence(
    new Ping(HuntingYou),
    new RunTo(ClosestMatch),
    new Ping(KillingYou),
    new Kill(ClosestMatch)
);

Now, just because a creature receives a ping doesn’t mean they will react to it. Pings are analogous to noise: a creature might not hear it, they might hear it and not be able to locate the source, or they might locate the source but deem it inconsequential. That will all depend on the (yet to be implemented) perception system.

Assuming the huntee does receive the ping, spots the source of it, and responds, its fear and anger will go up. Chances are good one of these will go above their current emotion, and with an urgency threshold of 0, they will react immediately.

This reaction will in turn be accompanied by a ping back to the hunter, letting it know its prey’s intentions. Now it’s the hunter’s turn to have an emotional reaction. Seeing a large, violent creature turn to fight might well be enough to amp up its fear and send it running for the hills.

The best part of the ping system is that it’s very CPU-friendly. Rather than continually scanning the area every frame for potential threats, prey just needs to wait for and respond to a single ping. This reduces the number of checks that need to be done and ensures we only need to track relevant information.
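Sketched from the receiving end (PingType and the member names are placeholders for whatever the real system ends up using):

```csharp
using System.Collections.Generic;

// All the information a creature might push to another.
public enum PingType { HuntingYou, KillingYou, NeedMate, NeedAid }

public class Listener
{
    // Effective hearing range in metres (a creature stat, not a constant).
    public float Hearing { get; init; }

    public List<PingType> Heard { get; } = new();

    // Called by the instigating creature. Nobody polls anything: if no
    // pings arrive, this creature does zero perception work that frame.
    public void Ping(PingType type, float distanceToSource)
    {
        if (distanceToSource <= Hearing)
            Heard.Add(type);
    }
}
```

The hunter’s side is just a `prey.Ping(PingType.HuntingYou, distance)` at the start of its hunt sequence.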


So, what’s the end result? I outfitted P. specium with the new Hunt/Flee behaviors and put 250 identical wannabe predators on the map.

The results were interesting. About half chose to flee while the others chased them, based I think on the order in which the pings were sent. I was actually expecting them to get caught in a both-creatures-hunt-both-creatures-flee loop, but it didn’t turn out that way at all.

They didn’t last: I’m not sure they even managed to reproduce. This was at least somewhat thanks to the problem I mentioned earlier: if there were still any creatures in the area, they would hunt them down rather than browsing the vegetation. It was like a zombie apocalypse if the zombies ate other zombies.

All that said, even “zombie” intelligence is a significant step up from their previous AI. Even with their animations not working (I broke them back when I was dismantling the old AI, and haven’t gotten around to fixing them), they now have a sense of purpose and agency that was notably absent before, and I can’t overstate how much of a difference this makes.


More Changes

Since I wrote this, I’ve made yet another change. Creatures will now have different run/walk speeds: running for maximum meters-per-second, walking for maximum meters-per-calorie. I’ll need to change a variety of formulas and perhaps add a few new stats to support this.
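In sketch form, with placeholder numbers (only the 3x multiplier comes from the text below; the energy figures are invented):

```csharp
// The two gaits: running maximises metres-per-second, walking
// maximises metres-per-calorie.
public record Gait(float MetresPerSecond, float CaloriesPerMetre);

public static class Locomotion
{
    const float RunMultiplier = 3f; // the placeholder run speed

    public static Gait Walk(float baseSpeed) => new(baseSpeed, 1f);

    // Faster, but burns more energy per metre travelled, so it only
    // pays off when hunting or fleeing.
    public static Gait Run(float baseSpeed) =>
        new(baseSpeed * RunMultiplier, 2.5f);
}
```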

In addition to the implications for the simulation, running makes creatures more communicative to the player, since it is only used when hunting or fleeing. To help emphasise this, I’ll probably also adjust their leg animation using the existing Step Size system, to give them a longer gait when running.

For now I’ve just put in a placeholder run speed of 3x walk speed. This results in the admittedly freakish sight of predatory Primum specium pointing themselves at an unfortunate victim and sliding across the map at a speed that is just a bit faster than what you’d think they were capable of.

They couldn’t run down a human, but if there were enough of them they might be able to swarm you, especially if you weren’t expecting it. And nobody expects the Primum inquisition.

Still a lot of work to do, but things are starting to take shape.

Next time on The Wriggling Dead: the hordes. Oh god the hordes.

They're eating that plant! And then they're going to eat me! Oh nooooooo!




AI Rework – Behavior Tree Level 3


Something a little different this time: rather than extending the tree, I’m taking measures to make it more versatile.

Last time we were here, we had this:


That’s all well and good for finding the closest tree, but what if we want to find the closest creature? We end up writing 5 more routines simply to do something we’ve already done.

That’s a waste of code, and as everyone knows, code is extremely expensive. It’s like printer ink. That’s why programmers get paid so much, cause of all the code they go through.

3 Kilobytes to the Euro


Seriously though, “find the closest X” is a routine I expect to be using a lot. “Find the closest genetically-compatible creature”, “find the closest corpse”, “find the closest prey”, “find the closest rocket-propelled chainsaw”, “find the closest chainsaw-propelled rocket” and so on. You get the idea.

So with that in mind, it’d be cool if I didn’t have to rewrite 5 routines for every variation on a simple premise. Good code is reusable code, and right now the Behavior Tree system is making it a lot harder to write good code.

So, the obvious thing to do would be change this:

    new CheckClosestEdibleTree()

… to this

    new CheckVariable(ClosestTree)

Rather than hard-coding ClosestTree, I’m feeding it in as a parameter. Each of the dependent methods stores it and acts on it when its Act() method is called.

This is a fairly elegant solution, with only one minor drawback: the fact that it will not work.

The reason it won’t work is a bit complex, but I’ll try to summarise it with an example. Suppose we’re trying to define a routine that will get the creature to walk towards the “ClosestCorpse“. When the creature is born, its AI is initialised and Context.ClosestCorpse is fed into the MoveTo routine:


The creature then Moves during its life. As it moves, the game updates the Context.ClosestCorpse object.


But the MoveTo routine was set back when the AI was initialised: it still references the original Corpse A! If the creature is told to approach the closest corpse now, it’ll wander off in the wrong direction…


This is where Lazy Evaluation comes in. We don’t want a reference to whatever corpse was fed in when the AI was initialised, we want a reference to whatever corpse is currently stored in Context.ClosestCorpse. Luckily, C# provides a fairly neat way of doing this: rather than providing a Corpse to the MoveTo routine, we provide a Function That Returns A Corpse (Func<Corpse>).

Then there’s a neat syntax to generate exactly that:
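In sketch form (the classes here are simplified stand-ins for the real ones):

```csharp
using System;

// Simplified stand-ins for the real classes.
public class Corpse { }

public class Context
{
    public Corpse ClosestCorpse;
}

public class MoveTo
{
    readonly Func<Corpse> getTarget;

    // We store a function that returns a corpse, not the corpse itself.
    public MoveTo(Func<Corpse> getTarget) => this.getTarget = getTarget;

    // The delegate is evaluated when Act() runs, not when the AI was
    // initialised, so it always sees the current ClosestCorpse.
    public Corpse Act() => getTarget();
}
```

Wiring it up is just `new MoveTo(() => context.ClosestCorpse)`: if the context later swaps Corpse A out for Corpse B, Act() acts on B.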


And we’re done! MoveToClosestCorpse() can now be changed to the much more generic MoveTo() command, and can use any variable we want to feed into it.

Except, we’re not quite done. This system works exceptionally for *getting* a variable, but what if we want to set one?

Back to our original example. Some fiddling with the above syntax means we can now generate a FindClosestMatch() method, which identifies the closest Creature or Tree that meets a given criteria. For example, this:

new FindClosestCreatureMatch(parent,
    new Inverter(new IsAlive(() => parent.Context.NextCreature)),
    new Inverter(new IsEntityBehindFence(() => parent.Context.NextCreature))
);

… identifies a creature that is a) dead, and b) inside the map boundaries, without me having to create a whole new FindClosestCorpse() routine.

But this result doesn’t go into the “ClosestCorpse” variable, because FindClosestMatch() can’t just set any variable it wants. It’s a predefined routine: it stores its result in a generic “ClosestMatch” variable.

This is still somewhat workable, so long as we only need the ClosestMatch variable for one thing, within the routine in which it is defined. It just means that the value can’t be retained for later use without custom routines.

But can we do better? Can we use a syntactic trick similar to lazy evaluation, but for a setter? As it turns out, yes! But… it looks like this…

No seriously, what?


Can you read that? I can’t read that. I don’t even know how it’s supposed to be read. Great Cthulhu that’s a hideous pile of mess.

Sadly, this is a case where reusability and readability are diametrically opposed requirements. If I were to refactor for reusability using this trick, the code would be rendered unreadable.

So, I think in this case I shall err on the side of readability. While I do have a SetVariable routine in case I’m ever that desperate to reuse code, I think I’m better off creating custom “SetClosestCorpse”-style routines for storing variables, and using the temporary “ClosestMatch” variable wherever possible.
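For reference, the readable version of that SetVariable routine looks something like this (names illustrative):

```csharp
using System;

// Simplified stand-in for the real context object.
public class Context
{
    public string ClosestCorpse;
}

// The setter counterpart to the lazy getter: an Action<T> that writes
// into the context, mirroring the Func<T> used for reads.
public class SetVariable<T>
{
    readonly Action<T> setter;

    public SetVariable(Action<T> setter) => this.setter = setter;

    public void Act(T value) => setter(value);
}
```

Used as `new SetVariable<string>(v => context.ClosestCorpse = v)`. It works, but every variable still needs its own lambda at the call site, which is why the custom routines win on readability.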

Next Time On Somebody Help He’s Forcing Me To Write His Blog Posts: Hunting and Fleeing! Plus: we find out what happens when 250 identical Primum specium become predators who prefer live prey. (hint: they die. A lot)

