Currently, as hinted at in the previous post, creatures are naturally aware of every object in their local area. They are, for all functional purposes, locally omniscient.
This isn’t too bad in terms of how the game plays: we are privy to every option they have, so having them choose from all those options simply makes them seem more intelligent and relatable from our perspective. That’s actually a good thing.
It is bad, however, in terms of simulating evolution: it completely negates the selection pressure to develop sensory organs. For that reason alone, they need some sort of perception system. I think the system I’ve designed and am beginning to implement gives me the best of both worlds, but… we’ll have to see.
So… the system focuses on three senses: smell, sight and hearing. Touch is assumed, given that creatures react when they bump into things.
Smell is the equivalent of the aforementioned local omniscience. All creatures will be able to smell and approach every object in their vicinity. At this stage we won’t have smell trails or tracking, nor will we have dedicated smell apparatus: it’s simply assumed that this is how blind creatures find their way to the food in their area.
Navigating by smell does come with a few inconveniences to offset its omnipresence, however:
* a creature navigating by smell cannot run,
* they must also pause every few seconds to sniff the air, and
* their movement direction will have a sine-curve wobble added to it, slowly reduced as they close in on the target.
Minor inconveniences at best, but given their short lifespans, taking longer to do things will have a disproportionate impact on their survival. This should incentivise the creatures to develop sight or hearing, to avoid having to deal with these behaviors.
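The sine-curve wobble above can be sketched as a heading calculation: aim at the target, then layer a sine offset on top, with an amplitude that dies off as the creature closes in. Everything here (the parameter names, the linear distance falloff, the 20-unit falloff range) is an illustrative guess, not the game's actual code.

```python
import math

def smell_heading(creature_pos, target_pos, time_s,
                  wobble_amplitude=1.0, wobble_period=4.0):
    """Heading (radians) for a creature navigating by smell.

    A sine offset is layered onto the true bearing toward the target.
    Its amplitude shrinks as the creature gets closer, so the path
    straightens out near the food. All constants are made up for
    illustration.
    """
    dx = target_pos[0] - creature_pos[0]
    dy = target_pos[1] - creature_pos[1]
    bearing = math.atan2(dy, dx)
    distance = math.hypot(dx, dy)
    # Scale the wobble down as we approach (clamped to [0, 1]).
    falloff = min(distance / 20.0, 1.0)
    offset = wobble_amplitude * falloff * math.sin(2 * math.pi * time_s / wobble_period)
    return bearing + offset
```

Far from the target the creature weaves a wide sinusoidal path; as `distance` shrinks, so does the weave, until it walks straight onto the food.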
Hearing is the creature’s ‘reactive’ sense. When they are being hunted, approached by a potential mate, or targeted for any other reason, they will receive a Ping() from the other creature. Their sense of hearing will be what determines whether they respond to this ping or disregard it.
In the long term, these pings (and by extension, the sense of hearing) will be the centerpiece of communication and various social behaviors: for now, they merely facilitate the fight-or-flight response.
Hearing can also be used to detect prey and mates in the first place. Of course, the sense comes with a pretty obvious downside: it can only be used to detect living creatures. Trees and corpses don’t make noise, and will have to be seen or smelt out.
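A minimal sketch of the ping mechanic described above: the targeting creature calls Ping() on its target, and the target's hearing decides whether it registers. The `hearing`-as-range stat, the class layout, and the reaction strings are all stand-ins I've invented; the post only specifies that hearing gates the response.

```python
import math
from dataclasses import dataclass

@dataclass
class Creature:
    x: float
    y: float
    hearing: float  # effective hearing range; illustrative stat

    def ping(self, source, reason):
        """Receive a Ping() from another creature.

        Whether the ping registers is decided by comparing distance
        to the source against this creature's hearing range. Returns
        a reaction, or None if the ping goes unheard.
        """
        distance = math.hypot(source.x - self.x, source.y - self.y)
        if distance > self.hearing:
            return None  # out of earshot: the ping is disregarded
        # For now, pings merely facilitate fight-or-flight.
        return "flee" if reason == "hunted" else "investigate"
```

A near-deaf creature simply never notices it is being stalked, while one with good hearing gets its fight-or-flight response triggered the moment a predator targets it.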
Despite not having ears, P. Specium will actually have a pretty good sense of hearing: they have a large ground contact area, so it makes sense they would be able to pick up vibrations through the ground. If/as they develop legs, they will have to find alternatives or lose some hearing ability. So I guess I’m going to be modelling a bunch of disembodied ears. Good times.
If the things the facial feature placement system frequently does with eyeballs and horns are anything to go by, the resulting faces will be exactly as horrifying as many of you are currently imagining.
Sight is the “best” sense in terms of having the fewest restrictions: creatures navigating by sight are not inconvenienced, and sight can detect any object in its range. It is less reactive, though: it doesn’t have the omnidirectional capabilities of hearing.
In the current version of the game (0.7.0), the sight system is quite poor: a creature will only detect something if that thing is in its FOV at the moment it updates its perception. Given that the FOV can be quite narrow, and the perception updates can be many seconds apart, creatures regularly miss obvious things happening nearby.
Ironically, the new system will make the perception updates even more irregular, only happening when a creature needs to seek something (food, a mate, etc) or receives a ping. This is necessary to keep the frame rate at a reasonable level, but hopefully, by performing these updates reactively via pings, sighted creatures should be a lot less oblivious to the world.
Additionally, a new behavior will have to be implemented: “scan”. When seeking something, or responding to a ping outside of their vision range, rather than just reading in the objects in their FOV at the time of the update, creatures will deliberately look around themselves, getting a full 360-degree picture of the environment.
Since this negates the advantage of having a FOV, higher FOVs will allow creatures to perform these scans more rapidly, shaving seconds off of their reaction and search times. It will be up to natural selection to find a balance between FOV, range, and the energy it costs to grow hundreds of massive eyeballs from every part of your head.
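The FOV-versus-scan-speed trade-off could work something like this: a creature only has to sweep its head through the part of the circle its FOV doesn't already cover, so wider FOVs finish sooner and a full 360° FOV scans instantly. The `sweep_speed` constant and the linear relationship are assumptions of mine, not the game's actual tuning.

```python
def scan_time(fov_degrees, sweep_speed=90.0):
    """Seconds needed to perform a 'scan' (a full 360-degree look).

    The creature must rotate through whatever arc its FOV doesn't
    already cover; sweep_speed (degrees per second) is a made-up
    tuning constant.
    """
    uncovered = max(360.0 - fov_degrees, 0.0)
    return uncovered / sweep_speed
```

Under these numbers, a narrow 90° FOV costs three seconds per scan while a 270° FOV costs one: exactly the kind of gradient natural selection can climb, balanced against whatever those extra eyeballs cost to grow.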
I think that image is a good place to end this post, don’t you? He looks so dapper. So very dapper.