
Multi-texturing and Biomes

One of the priorities for 0.5.0 (the environmental update) has been to overhaul the multi-texturing system.

In 0.4.1 the best the engine could do was 4 textures (desert, grass, forests and cliffs), and those textures were placed at world-start, never changing. The end result was a very static map and a very boring environment: basic tree loss/growth was the entire extent of environmental dynamics.

All this changes in 0.5.0. The design plan calls for a dynamic, responsive environment, and by crikey that’s what we’re going to have. (Crikey is Australian for… something. I dunno. Honestly I don’t even know what the word means, but it’s a stipulation of citizenship that we all say it at least once a week)

The first thing to add is more biomes. Desert/grass/forest works well as a basic, semi-tropical biome set, but what about colder biomes? Or warmer ones? Or wetter ones or drier ones? My current biome map concept has more than 20 different biomes drawn on a temperature/fertility map, including some hilariously extreme ones.

If you can’t survive in a lava lake you’re simply not trying hard enough.

But that many biomes means a whole different approach to multi-texturing… or at least, I thought it did. This was when development started to go bad…

In 0.4.1 I used a multitexturing method called ‘texture splatting’. Imagine you’re painting the ground texture on a blank canvas. What texture splatting does is give you a ‘stencil’ for each texture: so you can paint forest through the red stencil, grass through the green stencil and desert through the blue one, and by the time you’ve used all the stencils you’ve painted the entire terrain. All these stencils are nicely wrapped up into a single texture, called the blend texture.
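
In code terms, the whole trick boils down to a weighted sum per pixel. Here’s a rough sketch (in Python for readability, not the game’s actual shader code — all the names are made up, and I’ve arbitrarily assumed cliffs live in the alpha channel):

```python
def splat(blend, desert, grass, forest, cliff):
    """Weight four texture colours by the blend texture's RGBA channels.

    Per the stencil analogy: red paints forest, green paints grass,
    blue paints desert, and (assumed here) alpha paints cliff.
    """
    r, g, b, a = blend
    return tuple(
        r * fo + g * gr + b * de + a * cl
        for fo, gr, de, cl in zip(forest, grass, desert, cliff)
    )

# A blend pixel of (0.5, 0.5, 0.0, 0.0) gives an even mix of the
# forest and grass colours.
pixel = splat((0.5, 0.5, 0.0, 0.0),
              desert=(0.9, 0.8, 0.5),
              grass=(0.2, 0.7, 0.2),
              forest=(0.1, 0.4, 0.1),
              cliff=(0.5, 0.5, 0.5))
```

On the GPU this happens per-pixel in the shader, but the arithmetic is the same.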

Unfortunately, this only works for a limited number of textures per draw call: you can’t load more than a set limit of textures onto the GPU without it having a fit. So in order to render an unlimited number of textures, I’d need to change how I was approaching it.

My first attempt at a replacement system was sort of like an extremely complicated colouring book, where each grid square is numbered 1, 2, 3, 4, etc. and our hypothetical artist fills in the grid with grass/desert/forest from a colour key. Since we can include as many numbers as we like in the key, we can have as many biomes as we want.
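
As a sketch (Python, purely illustrative — the real thing lives in textures and shader lookups, not dictionaries):

```python
# The "colouring book": a low-res grid stores a biome ID per square,
# and a colour key maps IDs to biomes. Add more entries to the key
# and you get more biomes, no extra texture channels needed.
BIOME_KEY = {1: "grass", 2: "desert", 3: "forest", 4: "tundra"}

biome_grid = [
    [1, 1, 2],
    [3, 1, 2],
]

def biome_at(x, y):
    """Look up the biome name for a given grid square."""
    return BIOME_KEY[biome_grid[y][x]]
```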

Quick! Chop down a tree with your fists!

Once that’s done, our artist is faced with a problem: (s)he has to blur these colours together smoothly without pixelation or sharp edges. And since it’s going to be animated, a simple linear interpolation just isn’t going to cut it.

As it turns out, this is HARD. My own approach was to sample the biome-legend multiple times to give each pixel *four* biomes, each with a weighting value, which I then folded into the same channel as the key (effectively, I designated the first digit of the channel as the key, and the remaining digits as the weight).
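
A toy version of that packing trick might look like this (Python, illustrative only — this is my reading of the digit layout, not the engine’s actual code):

```python
import math

def encode(key, weight):
    """Pack a biome key (0-9) and a weight (0 <= w < 1) into one value:
    the integer digit holds the key, the fractional digits the weight."""
    assert 0 <= key <= 9 and 0.0 <= weight < 1.0
    return key + weight

def decode(value):
    """Split a packed value back into (key, weight)."""
    key = math.floor(value)
    return key, value - key

k, w = decode(encode(3, 0.25))
# k == 3, w == 0.25
```

The catch, as the rest of the post explains, is what happens when the GPU interpolates these packed values between pixels: blending 3.9 and 4.1 gives you 4.0, which is a completely different key/weight pair than either neighbour.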

Honestly, the most surprising thing about this ridiculously complicated strategy is that it worked… mostly.

The grid is meant to be there.

But “works in general” doesn’t mean “works well enough to use”. On closer inspection, the system produced artefacts all over the terrain. (Not a typo: graphical glitches are “artefacts”, ancient relics are “artifacts”. I read that somewhere and internalised it, so it must be true.) I actually got it working perfectly wherever 2 biomes met, but that was the limit: at any intersection with more than 2 it produced hard edges and odd colours, and the moment it was animated these artefacts started jumping about and generally making themselves easy to spot.

How many artefacts can you count?

So I ended up scrapping the idea entirely, and starting over with a new strategy, one requiring less math and more art.

This new strategy was much simpler: we go back to using the ‘stencil painting’ system, but this time once we’re done painting with the first 4 stencils (biomes), we put a new, transparent canvas over the existing canvas and keep on painting on that with a new set of stencils. Rinse and repeat.

This method turned out to have its own set of pitfalls, chief among them alpha-blending and redrawing the entire terrain multiple times, with different textures each time. For an item which takes up as much of the screen as the terrain, this is a large graphics cost, and in a GPU-bound game it probably would have spelled the end of this strategy. But Species is CPU-bound: it has GPU cycles to spare. So full steam ahead.
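
The pass-splitting itself is the trivial part — something like this (Python sketch, names invented):

```python
# Group the biome list into draw passes of at most four biomes each
# (one biome per blend-texture channel). Each pass is rendered as its
# own alpha-blended layer over the previous ones.
def biome_passes(biomes, per_pass=4):
    return [biomes[i:i + per_pass] for i in range(0, len(biomes), per_pass)]

passes = biome_passes(["desert", "grass", "forest", "cliff",
                       "tundra", "swamp", "lava", "salt-plain", "reef"])
# -> 3 passes: two full sets of four, plus one biome in the last pass
```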

“Everlasting pure darkness” will not be a biome in the full game. (Unless someone mods it in).

The artefacts of this method also turned out to be quite different to the ones I faced with the other method. The other method loved to produce singular, localised artefacts: hard edges and biome colours where they shouldn’t be. This method’s artefacts usually affected the entire terrain. I’d say two in particular are worth noting here, mainly because I haven’t actually managed to fix them: double-rendered polygons and biome-set edges.

Biome-set edges were where one transparent layer tried to fade out into another. I never had any trouble with the intra-set blending, but proper alpha blending is a temperamental thing. In this case, because the biome colour fades out at the same time as the opacity does, the end result was a faded-but-noticeable black ‘border’ between different blend-sets.
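
A quick numeric illustration of why that happens, assuming standard (non-premultiplied) ‘over’ blending:

```python
def blend(src, src_alpha, dst):
    """Standard 'over' blending: result = src*a + dst*(1-a)."""
    return tuple(s * src_alpha + d * (1 - src_alpha)
                 for s, d in zip(src, dst))

# Mid-fade at a biome-set edge: the upper layer's colour has already
# faded towards black while its alpha is still 0.5, so black bleeds
# into the result and shows up as a dark border over the grass below.
seam = blend((0.0, 0.0, 0.0), 0.5, (0.2, 0.7, 0.2))
# -> (0.1, 0.35, 0.1): noticeably darker than the underlying grass
```

Extending the colour to the very edge (so only the alpha fades, not the colour) is exactly the “fill-in” fix described below.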

Oh so that’s what wars are fought over.

I managed to ‘fill-in’ the majority of these borders by extending the colour to the very edge, but there is still a faint one around the first draw pass. Thankfully it’s subtle, and dealing with it had some odd side effects, so I’m going to leave that one alone. It’s not hugely noticeable.

Double-rendered polygons, on the other hand, are a problem.

It’s zombie-triangle apocalypse! Only trigonometry geeks will survive.

This isn’t actually a problem with the rendering method: it’s a problem with the QuadTerrain itself, which I didn’t know about until the rendering method made it visible. See, when the terrain is completely opaque, rendering a polygon twice has no effect. The colour from both renderings is the same, so it’s an invisible artefact. But when you render a *transparent* object, like, say… one of the 4-biome passes of the terrain… *then* it becomes quite visible, as you can see above.
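
Here’s the arithmetic of why, roughly (Python sketch):

```python
def over_alpha(alpha, times):
    """Effective opacity after drawing the same transparent surface
    `times` times with 'over' blending."""
    covered = 0.0
    for _ in range(times):
        covered = covered + alpha * (1 - covered)
    return covered

once = over_alpha(0.5, 1)   # 0.5
twice = over_alpha(0.5, 2)  # 0.75: the duplicated polygon is denser
```

At full opacity the numbers are identical no matter how many times you draw (1.0 stays 1.0), which is why the QuadTerrain bug stayed invisible until the transparent passes arrived.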

But fixing it means re-familiarising myself with the QuadTerrain class, which I haven’t touched in quite some time. I’ve already made a little progress, eliminating about half of the artefacts above with one fix. Hopefully the next fix will get the rest, but I doubt it: bugs like this are often inversely exponential. You might be able to fix the majority of them easily, but there are always one or two subtle, extremely well hidden ones that you have next-to-no chance of ever finding.

Oh well, best I can do is to make sure I get most of them.

Currently, I have 23 biomes defined as rough shapes on a temperature/fertility axis. This includes ‘extreme’ biomes, like lava and salt-plains, and a number of underwater biomes that will only be found… err, underwater. (Fertility will at least partly be determined by height: everything below the water plane will be water.)
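
The lookup itself could be as simple as this (Python sketch with completely made-up thresholds and biome choices — the real map has 23 hand-drawn regions):

```python
# Pick a biome from temperature/fertility coordinates, with height
# overriding fertility below the water plane. Every threshold and
# biome name here is invented for illustration.
def pick_biome(temperature, fertility, height, water_level=0.0):
    if height < water_level:
        return "underwater"          # everything below the water plane
    if temperature > 0.9 and fertility < 0.1:
        return "lava"
    if fertility < 0.2:
        return "salt-plain" if temperature > 0.5 else "tundra"
    return "forest" if fertility > 0.6 else "grassland"

biome = pick_biome(temperature=0.95, fertility=0.05, height=1.0)
# -> "lava"
```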

So far everything is coming together as planned. Biomes are done, 3d trees are done (I’ll do a proper post on them later), I’m midway through tying the two together, and soon the nanozombies will be unleashed on the unsuspecting… oh wait wrong blog. You didn’t hear that.

Qu

“Other stipulations of Australian citizenship include:

– Must defend vegemite no matter personal opinion of taste. (sticky salty gunk)
– Must be willing to throw foreign tourists in front of a croc to save yourself. (Or was that the other way around?)
– Must know how to defend against drop-bears (trick question: there is no way to defend against drop bears)”




T-minus 2 days…

It menaces with spikes of death and candy. It’s a dandy!



T-minus 4 days…

FROGGLOR WILL EAT YOUR SOUL



Behavioral Evolution

“FYI, the first third of this post consists of Qu just typing randomly with no coherent subject as he tries to come up with a subject for today’s post. I have no idea why he decided to include it. Padding, probably. Feel free to skip to the stuff actually worth reading further down (assuming any of this dreck is “actually worth reading”).” -i

Welp. I have no idea what to write. I generally just write these posts by typing out whatever bullpoppy pops into my head, and then going back and inserting some ridiculous analogies, a bit of hyperbole, replacing the swear words with crap like “bullpoppy”, then clicking submit. And then re-opening the post and editing out all the spelling mistakes and adding in the links and images I meant to include in the original. Proofreading and preview buttons are for wimps!

But at this very moment my brain is giving me no prompts whatsoever, hence the above pointless and time-wasting sentence. What on earth am I going to write about?

Well, at this point in the saga, I have a working (albeit basic and rather buggy) simulation. Ummm…

Dammit, writer’s block.

Dammit I just called myself a writer because I blog. Next thing you know I’ll be printing out business cards with this URL on them and adding “blogger” to my resume.

Okay, okay, let’s get serious. The first thing to do from this point in the game’s development was to expand and balance the selection pressures. But since this process is extensive (so extensive it’s still ongoing, and shows no sign of letting up), and since I did a fairly pretentious post on it last time, let’s move on. I’ll be coming back to that topic sooner or later, there’s pretty much no way to avoid it, so no point in rushing it. Let’s talk about something else.

FFFFFFFFFFFFFFFFFFF-

Hmm… I could do a post on performance and optimisation, but it’d come out all technical and I doubt too many people are interested. We’ll save it for later too.

Redundancy? I’ve touched upon the things that changed from the original design in other posts… but that’s a pretty broad subject, maybe something more specific…

Oh, I know! Behavioral evolution.

HAHA TAKE THAT WRITERS BLOCK I AM AWESOME.

“Stuff possibly worth reading starts here.” -i

The original design called for an unbelievably complex and mutable behavioural system, one that could theoretically evolve to handle any situation. I’ll try to give a summary of it, but when I say “complex” I don’t mean “Pythagoras’ Theorem” complex, I mean “7th dimensional trigonomcalculus in a rotating reference-frame” complex, and I’m pretty sure that doesn’t even make sense.

The basic idea was that every creature could have a list of Perception Categories, based on visible data like “size” and “number of teeth”, so they could distinguish between predators and prey. Each perception category would also include an action (flee, approach, attack, mate), which, based on the theory, would over time evolve to become appropriate to its category.

Using this system, any action could be combined with any object. So Attack!Creature and Eat!Vegetation were possible, but so were Flee!Vegetation and Eat!Creature. (Note: creatures can only be eaten after they’re killed, so this did nothing except maybe annoy the creature being chewed on).
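
To make the idea concrete, a toy version might look like this (Python, illustrative — the real categories keyed off more visible data than just size, and the actions mutated over generations):

```python
# Each perception category matches creatures by a visible trait range
# and binds an action; in the original design the actions started
# effectively random and were supposed to evolve into sensible ones.
ACTIONS = ["flee", "approach", "attack", "mate", "eat"]

class PerceptionCategory:
    def __init__(self, min_size, max_size, action):
        self.min_size, self.max_size = min_size, max_size
        self.action = action

    def matches(self, size):
        return self.min_size <= size < self.max_size

def react(categories, target_size):
    """Return the action bound to the first category the target fits."""
    for cat in categories:
        if cat.matches(target_size):
            return cat.action
    return "ignore"

cats = [PerceptionCategory(0.0, 1.0, "eat"),    # small things: food
        PerceptionCategory(1.0, 5.0, "flee")]   # big things: run!
```

Nothing stops a mutation swapping those two actions — which is precisely how you end up fleeing your food and snuggling the predator.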

I actually implemented a placeholder version of this system, and discovered its flaw: it made creatures stupid. The earliest creatures picked actions at random, and would spend a good 90% of their time doing nonsensical things. The ones that survived were the lucky but random few who managed to realise they were supposed to eat their food, not copulate with it (not that there’s anything wrong with that (sorry, couldn’t help myself)).

At first I thought this was okay: creatures would naturally evolve to be smarter. But it quickly became apparent that the randomness caused by their own stupidity was the only selection pressure worth talking about in the game: it was overwhelming all the visible effects.

In addition, it was impossible to monitor. Getting statistics out of this system was like milking a tiger snake: not as hard as it sounds, but all you’ll get is poison. And there’s also the possibility of being bitten by a pissed-off, venomous reptile, though I’m not certain the analogy stretches that far. But it was very, very hard to tell the difference between “intelligent” and “brain-damaged algae” simply by looking at the perception categories. It was definitely impossible to glean any useful information at a glance, which was the way I wanted to set the statistical elements of the game up.

And if that wasn’t enough, it evolved slowly. Increasing the behavioural mutation rate didn’t make it evolve faster, it just meant that the immediate descendants of an intelligent creature might well be back to running away from the scary food and trying to mate with the giant toothy predator twice its size. And decreasing it meant you could wait for hours before seeing anything.

And yet… if the only thing I was going for was accuracy, I might well have hung onto this system. It’s not too bad at replicating the way behavioural evolution works in reality: we recognise patterns and have instinctual reactions to those patterns. But evolution in real life doesn’t happen over the course of minutes, like in Species. It takes eras. So this system, which is actually quite fast compared to real-world physiological evolution, is extremely inefficient compared with the simplified version found in Species.

What I had unwittingly done was put a real-world system into a simplified simulation.

But I’m not “only” going for accuracy: the dual design goals for the simulation in Species are accuracy and intuitive… err… ness? Okay that word sounds ridiculous and probably isn’t an actual word, but you know what I mean. I want to be able to see evolution happening in Species, which is difficult to do with complex behaviors.

So that led to the current, much-simplified system. It’s not quite as versatile as the ridiculously complex system detailed above, but it’s versatile enough to allow mutable behaviour while still allowing the user to see at a glance how the creature’s instincts govern its decisions. It makes the creatures understandable and (relatively) predictable, which is a good thing…

The system consists of four main genetic variables:

Curiosity/Cowardice: Determines a creature’s reaction to other creatures in its line of sight. A high curiosity will cause this creature to approach, while a low one will cause this creature to flee.

Aggression/Sociability: Determines a creature’s reaction to reaching another creature. An Aggressive creature will attack: a Social creature will either “Play” (an action which still has no real effect, but I’m planning on adding one. Sooner or later. When I can think of an effect it should have) or Mate. Which of those it does is determined by…

Amorosity/Asexuality: Amorous creatures will attempt to *ahem* “share genetic material” when approached by other creatures of the same species, while asexual creatures don’t. Since creatures are perfectly capable of reproducing by Parthenogenesis (cloning), it will be interesting to see whether sexual or asexual behaviour is more viable within Species, or even if it changes depending on environment.

Interest In Trees/Lack Thereof: a behavioural variable that I may end up simply combining with the “diet” variable, because a) it’s redundant (Carnivores have no reason to approach trees), and b) it sounds silly. Interest In Trees causes creatures to prioritise vegetation over creatures (or vice versa for low values).

A few other variables, like “Scariness” and “Decoration”, provide modifiers to these (scariness makes other creatures less curious, decoration makes them more amorous, etc.).
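
Squashed into Python, the decision logic might look something like this (a sketch only — thresholds and structure are invented, not the game’s actual code):

```python
# How the four genetic variables might drive a single decision.
def choose_action(curiosity, aggression, amorosity, tree_interest,
                  at_target, target_is_tree):
    if target_is_tree:
        if at_target:
            return "eat"
        return "approach" if tree_interest > 0.5 else "ignore"
    if not at_target:
        # Curiosity/Cowardice: approach or flee creatures on sight.
        return "approach" if curiosity > 0.5 else "flee"
    # Aggression/Sociability: what to do on arrival.
    if aggression > 0.5:
        return "attack"
    # Amorosity/Asexuality: mate or just "play".
    return "mate" if amorosity > 0.5 else "play"

# A predator needs BOTH high curiosity (to close in) and high
# aggression (to attack on arrival):
action = choose_action(curiosity=0.9, aggression=0.9, amorosity=0.2,
                       tree_interest=0.1, at_target=True,
                       target_is_tree=False)
# -> "attack"
```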

And that’s it. This behavioural system theoretically provides support for predator and prey relationships, herding behaviour, and sexual selection, while being simple enough to understand at a glance and not overwhelm the more interesting (debatably) visible selection pressures.

It’s not perfect though, and I’m always open to suggestions. E-mail me, or drop a line in the comments section below, if you’ve got an idea for this or any other system in the game. 🙂

Cheers,
Qu

Yeah, I’ve totally given up on making the random screenshots at all relevant.

CTHULHU WILL EAT YOUR SOUL



Heads and Hands and Feets and Eyes Oh My!

With the basic body-plans implemented, the next job was detailed bits: things like hands and feet. In truth, this is probably one of very few aspects of the game that were changed very little from the original design: I take this as proof that the many long hours I spent labouring tirelessly over design documentation prior to beginning the game were well spent, because my fragile psyche is more comfortable with that idea than the alternative.

Individual body features like this are simple, pre-built parts, imported and attached to the end of the much more complex limbs. In general, though not always, their only mutable statistics are scale and ‘type’, type being a discrete integer: the ultra stat that determines all the other stats. One stat to rule them all, and other such unoriginal movie references.

The complexity of these parts will come in the form of versatility via quantity. I intend to add a ridiculously large number of different types of feet and hands, all with different stats and connected through a web of allowable mutations.

Interestingly (well, to me: my sense of interest runs on Insane Troll Logic), this sort of feature requires a very different approach when it comes to generating content. With a feature that relies on procedurally manipulating geometry, like the torso or to a lesser extent the limbs, content generation is exponential: I do a bit of modelling work and then spend a lot of time programming, but I don’t actually see any content until the very end, at which point there are suddenly ridiculous amounts of variety spewing messily all over the place: thin torsos, thick torsos, wide torsos, fin-shaped torsos and symbolically phallic torsos.

WHAT HAS BEEN SEEN CANNOT BE UNSEEN

In general, I like this approach. Sure it takes a load of work before you see any result, but the ultimate result is awesome. Oh god, did I really just say that right after that screenshot?

With features like limb-tips, on the other hand, content is very much linear. I do a little programming, and from that point onwards the amount of content in the game is directly proportional to the amount of modelling work I put in. And quite frankly, I’d rather be improving the game mechanics than building crappy placeholder models for it until I can get an artist to look at it, so this sort of content is sorely lacking.

I do have a rather interesting idea to combat this lack of content, but I’m not certain how easily it can be done… stay tuned on that front. 🙂

Moving on, with hands and feet modelled and in place at the ends of limbs, it was time to move onto heads. Heads are more complicated than hands and feet, because they’re not just an attachment: they also do something. I’m not talking about influencing diet, or bite damage, I mean something technical… okay okay, feet raise the body by a certain amount which is “doing something technical”, but if you’ll just disregard my inability to be internally consistent for a moment and let me make my bloody point… What Heads Do is serve as an attachment point for Eyes.

Eyes are the most complicated type of feature, because in addition to having an independent texture and model (both of which influence their stats), there can also be any number of them, and they can be situated anywhere on the head (spoiler: later they’ll be able to be situated anywhere period. For the moment, just the head).

My initial solution to this conundrum left a lot to be desired. I instituted the EyeList, a system for tracking and mutating a variable number of eyes, and when I was modelling the head-shapes I added several rows of dummy objects. Each dummy object was a potential eye position, and by using them the eyes knew where to appear on each of the differently shaped heads.

I’ll go over the many horrobad problems inherent in this system later, when I reach the point in the saga where I fixed them. Suffice it to say, my fixing involved a method roughly analogous to the method veterinarians use to fix pets.

In the meantime, Eyes! And Heads! And Feetses! That is totally a real word!




/tangent I’d like to quickly expound on one other difference between procedural content and handmade content. Handmade content is specific, which means it can be tailored. So if I want a head shaped like a donut, or a skull, or a paperclip, I can model and import a head shaped like a donut, or a skull, or paperclip. But with procedural content, I can never reach that level of freedom: I might be able to get a spectrum of infinite variations on torso width and height, but I’ll never have a skeletal torso, or a torso with a great big hole through it, unless I add a discrete aspect to the Torso genetics (hmmm… idea…). This sort of melding of discrete and floating point variables can produce an astounding amount of potential variety amongst the creatures: certainly more than I expected when I started designing the game.

Our universe is purely procedural, but then again it can afford to be. It doesn’t require considerations like CPU usage and memory footprint: where I’m forced to simulate a creature using maybe a hundred distinct numbers, our universe simulates a creature using trillions of physical molecules all working at cross-purposes. The limitations I run into are ultimately the result of discrete variables representing macroscopic structures: the various 3d meshes that make up body parts. The universe has no such discrete variables: everything in it is floating point, to a Planck level of accuracy.

Ultimately, it comes down to the fact that Species is a simulation of a universe, not a universe in and of itself. But I’m certain the creatures in Species don’t care. They struggle to survive and reproduce in their own universe just as much as real-life creatures do, so maybe… just maybe… that qualifies them as alive…

Cheers!
Qu

* * * * * *

PS: I’m going on holiday for a bit (New Zealand! I’m gonna join the sheep races and be the fastest sheep jockey that ever lived, and then I’ll feed my hire car to the birds that eat cars, and then I’ll hang around Mount Doom heckling the midgets with rings!), so it’s improbable there’ll be anything to see here for the next two weeks. But I’ll definitely be back come Christmas.

* * * * * *

“I haven’t the heart to tell him there’s no such thing as sheep races…”



Finite State Monsters

I fail at blogging. It’s Tuesday night and I still haven’t put anything up here. Who wants to administer the mandatory kick in the pants?

So, moving on again…

With the leg-and-torso system in place I was up and running, adding more possibilities and developing the creature AI and visualisation. Colours! Necks and tails! FSMs!

… yes, FSMs. No, not Flying Spaghetti Monsters: that’s what I thought when I first came across the acronym too, but here I’m talking about “Finite State Machines.” If that sounds like a doomsday device to you, awesome. Keep believing that, because a doomsday device would be so much cooler than an actual Finite State Machine.

Long story short, a finite state machine is a term in AI programming that refers to a list of simple, mutually exclusive states that an entity (an enemy, an NPC, a creature, etc) can be in, along with a set of rules as to which states can go to which other states. A typical example for an FPS enemy would be as follows:

As you can see in the above example, an enemy has to be startled before they’ll start attacking. This might give you a chance to play a ‘jump’ animation (for example, the grunts in Halo), or to get them to shout for help, before they target the player.

By separating the code into mutually exclusive segments like this, it becomes easier to debug, easier to change, and actually performs better: I don’t have to check for cover, see whether the enemy can shoot, or run any collision or movement AI for an enemy marked as “Idle”, since he would just stand around. It also makes the enemies easier to read and predict, which is an essential element for many games.
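
For the curious, a minimal FSM with an explicit transition table looks something like this (Python sketch; the states and rules are invented, loosely following the FPS-enemy example):

```python
# Which states are legally reachable from which: the whole "machine"
# is just this table plus a guard that enforces it.
TRANSITIONS = {
    "idle":      {"startled"},
    "startled":  {"attacking", "fleeing"},
    "attacking": {"idle", "fleeing"},
    "fleeing":   {"idle"},
}

class Enemy:
    def __init__(self):
        self.state = "idle"

    def go(self, new_state):
        if new_state not in TRANSITIONS[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state

e = Enemy()
e.go("startled")    # the enemy must be startled before it can attack
e.go("attacking")
```

Trying `go("attacking")` straight from "idle" raises an error — which is the whole point: the table makes illegal behaviour impossible rather than merely unlikely.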

Back to Species: the FSM for Species is a lot less regimented, and a lot more complex, because creatures can go from one state to almost any other at pretty much any point, with “moving” used as a hub. By using “moving” in this fashion, a creature has to turn and walk towards its target before it starts eating/attacking/mating with it, meaning that under most circumstances creatures will be facing the object they are interacting with as they interact with it.

Dead is the exception: if I tell a creature to die, it dies immediately, no matter what state it’s in.

*footnote: This is actually an idea I had while drawing this diagram. Currently, “playing” is an utterly useless behavior, and creatures jump between Moving and Mating freely, but if it serves as an entry point to mating then sexual selection becomes much easier to implement.

Now, at this stage in the chronological saga, I’ve implemented Idle and Moving, and am now adding Eating (which, because I considered this need when making my Vegetation System, isn’t too much of a problem), but I want to discuss the FSM as a whole and nobody can stop me, so I’m going to, temporal paradoxes be damned!

Something you’ll notice about this FSM is that a creature can’t actually decide what they’re going to do with an object until they reach it. If they see another creature but aren’t close enough to perform an action, they have to go to “moving” until they reach it. Even if they are close enough, they still have to pass through moving to get to the correct state. They can’t just go to “attack”, or to “mate”, because there’s no “moving” code in those states: they’d just stand there staring longingly at the creature they want to attack or mate with (or worse, they’d try to damage or impregnate the other creature from a distance, with no physical contact. Hmm… note to self: develop a device to do that in real life).

This directly affected the eventual structure of the behavioral system, which has undergone more iterations and revisions than Doctor Who canon: I couldn’t just give a creature a high aggression and expect it to attack. Predators need to combine a high curiosity level with a high aggression, so that they seek out other creatures and then choose to attack them.

Okay, now returning to chronology.

With the structure of an FSM set up, my randomly generated protoCreature wanders the world, going to each plant, eating it, and moving onto the next. Given a couple years, she might be able to eat the entire map (or those parts of it she can get to, anyway).

Oh heck… NON-SEQUITUR SCREENSHOT LIST!

Decapitated Dumbo says "Hi kids!"


SPIDERS! GIANT FREAKING SPIDERS- oh no, wait. It's only 6 legs. That's ok then!



And I shall call him "Beaky"

Because why not? (because I couldn’t come up with a friggin segue, that’s why not. Bah! Humbug)

Picking up where I left off… But for the other states in the finite state machine, I would need to implement a lot more than I had done. Attacking requires other creatures and a health system. Mating requires other creatures and a reproduction system, which in turn required a mutation system. Even eating was far from complete, being purely herbivorous and lacking an energy system entirely. And we’d need a system for dying when creatures ran out of health, and a system for tying physiological differences to survival traits like speed and stamina, and a per-creature-perception system, and a behavioral modification system, and a…

Wow. When you actually list them out like that, it sounds like a much larger endeavor than I thought it was at the time.

A common thread through all of these systems is that they are purely environmental: they control the universe in which the creatures live, not the survival of the creatures themselves. The creatures’ ability to reproduce, and thus any evolution stemming from natural selection, are emergent, not inbuilt.

Anyway, the next system on the agenda is preparing the ubervirus for release on an unsuspecting populace. If we can get a large enough infection rate in the first few days we can turn 90% of them into ravenous undead monsters by the end of- wait. Wrong agenda. Which agenda was I talking about? Oh, right, the one involving interior decoration and making everything look faaaabulous- no, not that one either?

Dammit, someone give me the secret agenda I’m supposed to be reading from! Geez!

Alright, it says here that the next system on the agenda is making large populations of randomly generated creatures, to pave the way for creature interactions.

for(int i = 0; i < 500; i++)
creatureList.Add(new Creature());

Well that was easy. No point doing a whole post about that. Tell you what, I’ll go move ahead and schedule something more interesting for next time.

Man I’m all over the place today! Is it possible to internally generate alcohol? Cause I think I’m doing it. God I suck so much at blogging.
Qu
Qu

Administering mandatory kick in the pants in 3… 2… 1…

