In the process of making a project to test a pathfinding idea I had, I set out to build a simple terrain generation system. I wanted to split the process into two separate systems: texture generation, to create textures that define an environment, and the Terrain Spawner, to interpret a texture and convert that data into an environment.
Recently I have started to make more of an effort to break things down into smaller, separate systems. Small and to the point.
The reason I wanted to use texture generation to create the environment data is twofold. First, textures are easy to look at, which makes it easier to see how each step of the generation affects the end result. Second, a lot of the environment generation I have looked at in the past has used noise maps or textures, so it seems like a good choice. Additionally, learning how to work with pixel data in code sounds like fun.
Pixel colours are selected at random from a set of ranges. Each colour has a set range that works as a percentage chance.
I set up a class to create the texture data and render it to a texture used by a material.
SetupTexture() gets run in MonoBehaviour Start().
PixelRange is a public, inspector-editable array. Each entry stores a colour and a length. By summing the lengths of every PixelRange and building a parallel array of start and end values, I can easily find the correct colour from a random value.
Overall I don't like this, as it requires looping over the colourRanges array every time I want to find a colour, but it was a quick test that has the flexibility I need.
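For reference, the two types involved might look something like this. This is a sketch: the field names are my assumptions based on how they are used in the code below.

```csharp
// Assumed shape of the inspector-editable entries: a colour plus a
// weight ("range") relative to the sum of all weights.
[System.Serializable]
public class PixelRange
{
    public Color colour;
    public float range;
}

// Assumed shape of the parallel lookup entries: the start and end of
// each colour's slice of the total range.
public struct RangeData
{
    public float start;
    public float end;

    public RangeData(float start, float end)
    {
        this.start = start;
        this.end = end;
    }
}
```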
void SetupTexture()
{
    // Get & set references
    renderer = GetComponent<Renderer>();
    texture = new Texture2D(textureScale, textureScale);
    renderer.material.mainTexture = texture;
    storedPixels = texture.GetPixels();
    lastGetPixels = storedPixels;

    // Collect colour range data
    float rangeEnd = 0;
    colourRanges = new RangeData[pixelRange.Length];
    for (int i = 0; i < pixelRange.Length; i++)
    {
        colourRanges[i] = new RangeData(rangeEnd, rangeEnd + pixelRange[i].range);
        rangeEnd += pixelRange[i].range;
    }
    maxColourRange = rangeEnd;
}
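GetColourFromColourRanges(), used in RandomisePixels() below, is the linear scan I was complaining about. It might look roughly like this; a sketch, not the exact implementation:

```csharp
// Pick a random value in [0, maxColourRange) and scan for the
// range entry that contains it. This linear search is the part
// I would like to replace later.
Color GetColourFromColourRanges()
{
    float value = Random.Range(0f, maxColourRange);
    for (int i = 0; i < colourRanges.Length; i++)
    {
        if (value >= colourRanges[i].start && value < colourRanges[i].end)
        {
            return pixelRange[i].colour;
        }
    }
    // Fallback for the edge case where value lands on maxColourRange.
    return pixelRange[pixelRange.Length - 1].colour;
}
```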
RandomisePixels() is currently the main function setting the colours on the texture.
Quite a simple setup, but I wanted to get visible colours on screen as quickly as I could.
I have a static class called GlobalGameData. It only stores a Color array for the time being. This is a cheat's way to make the “Texture Data” available to other systems without having to cross-reference system classes. Once the textures are a little more sophisticated, it will store a reference to the generated texture instead.
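Since it only holds that one array right now, the whole class is essentially this (a minimal sketch):

```csharp
// Shared scratchpad between the texture generator and the spawner.
public static class GlobalGameData
{
    public static Color[] PositionColours;
}
```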
[ContextMenu("Randomise Pixels")]
void RandomisePixels()
{
    Color[] colours = lastGetPixels;
    for (int i = 0; i < colours.Length; i++)
    {
        colours[i] = GetColourFromColourRanges();
    }
    lastGetPixels = colours;
    GlobalGameData.PositionColours = colours;
    texture.SetPixels(lastGetPixels);
    texture.Apply(false);
}
I “love” instantiating objects!
I actually despise spawning objects. I am always in my own head about the performance cost of spawning objects at runtime. This has led me on many tangents to set up the “Perfect Object Pool System”, and I always end up at the same two points.
A) It’s not as bad as I think.
B) Just set up a quick pooling class that calls instantiate internally. That way if object spawning starts to become a performance issue the infrastructure is already there.
public static class ObjectPool
{
    public static GameObject SpawnGameObject(GameObject prefab)
    {
        return GameObject.Instantiate(prefab);
    }

    public static void Remove(Object target)
    {
        GameObject.Destroy(target);
    }
}
Cubes with cute colours.
Spawning cubes is easy; I have a handy ObjectPool just for this occasion. But where to put them is a slightly different issue. This is by far the worst part of this whole situation. Because of some weirdness, textures on a Unity plane are flipped both horizontally and vertically.
This resulted in multiple attempts to spawn them in different orders. At one point I even had a nested for loop working backward through the positions to spawn them in a particular order to fix the issue.
In the end, I cheated. I flipped the plane on its x and z axes.
Then I spawned the cubes like I would any other object, in a fixed grid, offset for neatness. Easily done, I have cubes! Now time to colour them.
void GenerateGroundObjects()
{
    storedGameObjects = new GameObject[resolution, resolution];
    storedRenderers = new Renderer[resolution, resolution];
    for (int y = 0; y < resolution; y++)
    {
        for (int x = 0; x < resolution; x++)
        {
            GameObject go = ObjectPool.SpawnGameObject(groundPrefab);
            storedGameObjects[x, y] = go;
            storedRenderers[x, y] = go.GetComponent<Renderer>();
            go.transform.position = new Vector3(x + HalfNegResolution, 0, y + HalfNegResolution);
        }
    }
}
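HalfNegResolution isn't shown here; it offsets the grid so the cubes end up centred on the spawner rather than growing out of one corner. Something like this would do it, though its exact definition is an assumption on my part:

```csharp
// Shifts the grid so it is centred around the origin.
// e.g. for resolution = 64 this is -31.5, so cube positions
// run from -31.5 to +31.5 instead of 0 to 63.
float HalfNegResolution => -(resolution - 1) / 2f;
```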
Another simple loop takes the colours out of GlobalGameData and applies them to the stored renderer references.
void UpdateRenderMats()
{
    for (int y = 0; y < storedRenderers.GetLength(1); y++)
    {
        for (int x = 0; x < storedRenderers.GetLength(0); x++)
        {
            storedRenderers[x, y].material.color = GlobalGameData.PositionColours[x + (y * resolution)];
        }
    }
}
From here I am going to extend the functionality of the texture generation. My plan is to make a couple of textures and sum them together to get the result I am looking for. What is that result? Something interesting, with variability that looks less like white noise.
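As a first stab at that summing step, blending two colour arrays could be as simple as the sketch below. An even blend is the naive starting point; per-layer weighting is where it would get interesting.

```csharp
// One possible way to "sum" two colour layers of equal length:
// blend each pixel evenly, ready to be pushed back into
// GlobalGameData or straight onto a texture.
Color[] SumColourLayers(Color[] layerA, Color[] layerB)
{
    Color[] result = new Color[layerA.Length];
    for (int i = 0; i < result.Length; i++)
    {
        // Color.Lerp at 0.5 gives an even mix of the two layers.
        result[i] = Color.Lerp(layerA[i], layerB[i], 0.5f);
    }
    return result;
}
```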