## The game engine

The last few months I have been dabbling with writing a game engine from scratch, or rather as close to scratch as is viable. After having tried some of the ready-made solutions (Unity and Unreal, to name a few), I realized that I needed a deeper understanding of the mathematics behind 3D graphics, and so the game engine project was born. A secondary goal is to create most, if not all, of the content procedurally. The procedural approach is good if you, like me, aren't very good at modeling. And even if you are a master 3D artist, creating 3D assets takes time. A procedural approach lets me create potentially infinite assets from a few basic blueprints.

In this post I will show one of the problems I have run into, along with a fix. UV mapping a triangle mesh is not the first problem I've encountered, but it is the one that turned up very few search hits when I looked for a solution, so this post is also an attempt to remedy that.

First though, a few definitions. The principal entities we’ll be working with in this post are the *Vertex*, *Triangle* and *Mesh*. Below you’ll see simplified definitions of all three (C#).

```csharp
struct Vertex
{
    public Vector3 Position;
    public Vector3 Normal;
    public Vector2 TexCoord;
}

struct Triangle
{
    public uint A;
    public uint B;
    public uint C;
}

class Mesh
{
    public Vertex[] Vertices;
    public Triangle[] Triangles;
}
```

Very standard stuff. The *Vertex* defines a point in 3D space, that point's normal and a 2D texture coordinate for the point. The *Triangle*, unsurprisingly, defines a triangle via vertex indices. The *Mesh* is defined by the vertices and triangles it contains. These definitions are greatly simplified; my *Mesh* class really isn't that simple, but in order to keep the amount of irrelevant code to a minimum I have cut some corners. Hopefully there is nothing surprising in the above.

It is customary to refer to the texture coordinates as *U* and *V*, hence the term UV mapping. The implementation of 2D and 3D vectors I am using uses *X*, *Y* and *Z* for vector coordinates, so any code samples will be using *X* to refer to the *U* coordinate component, and *Y* for the *V* component.

## The sphere

The first thing we need is a sphere. There are at least two ways to procedurally create a sphere:

- Create a unit sphere using sine and cosine functions (the classic latitude/longitude approach)
- Create a unit icosahedron and smoothly subdivide it

The problem with generating a sphere with sine and cosine is that the triangles at the poles will get squeezed, since there are the same number of triangles at the poles as there are at the equator. The icosahedron solves this problem, but is harder to map onto a 2D texture.
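The squeeze is easy to see in code. Below is a minimal sketch of the latitude/longitude approach (not the code from my engine; the `rings` and `segments` parameters are just illustrative): every ring gets the same number of vertices, but the ring radius shrinks with the cosine of the latitude, so near the poles the same number of triangles is packed into a vanishing circumference.

```csharp
using System;
using System.Collections.Generic;

static class UvSphere
{
    // Vertex positions on a unit latitude/longitude sphere. Every ring has
    // `segments` vertices, regardless of the ring's actual circumference.
    public static List<(double X, double Y, double Z)> Positions(int rings, int segments)
    {
        var points = new List<(double, double, double)>();
        for (int r = 0; r <= rings; ++r)
        {
            double lat = Math.PI * r / rings - Math.PI / 2; // -90°..90°
            double ringRadius = Math.Cos(lat);              // shrinks to 0 at the poles
            for (int s = 0; s < segments; ++s)
            {
                double lon = 2 * Math.PI * s / segments;
                points.Add((ringRadius * Math.Cos(lon), Math.Sin(lat), ringRadius * Math.Sin(lon)));
            }
        }
        return points;
    }
}
```

At the poles `ringRadius` is (numerically) zero, so all `segments` vertices of the first and last rings coincide and the triangles touching them degenerate into slivers.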

For my engine I have chosen to go with an icosahedron as the basis for any spheres I need. This means solving a few issues with UV mapping (the process of mapping all the triangles of a 3D mesh to a texture in 2D).

A couple of pictures to hopefully illustrate what I am talking about. There is plenty of code floating around the web for creating an icosahedron and subdividing it, so I won’t get into details of it here. Except to say that I am using the *split edges* method to subdivide, since that yields the smoothest result.

## The texture

The second issue to tackle is actually having a texture. Since my engine is going to be working mostly with procedural assets, I have opted to create a texture for my sphere with a Perlin noise function, which is a topic for another post. Anyway, the texture I created for this test can be seen on the right.

```csharp
public Bitmap CreateTexture(int width, int height)
{
    Bitmap image = new Bitmap(width, height);
    Color baseColor = Color.FromArgb(255, 127, 35, 7);
    SimplexNoise generator = new SimplexNoise(seed: -5000);

    for (int x = 0; x < width; ++x)
    {
        for (int y = 0; y < height; ++y)
        {
            double dX = x / (double)width;
            double dY = y / (double)height;

            // Noise range is clamped to [-1, 1].
            double noise = generator.PerlinNoise(dX, dY, octaves: 3, persistence: 3.0f / 4.0f, lacunarity: 3f);

            int redHue = (int)(baseColor.R * noise) + baseColor.R;
            int greenHue = (int)(baseColor.G * noise) + baseColor.G;
            int blueHue = (int)(baseColor.B * noise) + baseColor.B;

            image.SetPixel(x, y, Color.FromArgb(baseColor.A, redHue, greenHue, blueHue));
        }
    }
    return image;
}
```

The quality is horrible. The original texture is quite heavy and I have had to convert it to JPEG for bandwidth reasons. The texture deliberately does not tile, because I wanted a really obvious clue as to where the texture wraps around the sphere. It really is a waste to create a texture from noise and then use it the old-fashioned way: one of the advantages of procedural textures is (almost) infinite detail at any scale, and saving the output to a bitmap ruins that. However, I was in need of a texture, and so here we are.

## UV mapping, the naive approach

Great, now there is a triangle mesh to work with and a texture to map it to, but how? It turns out that the solution to that problem is easy to find: Wikipedia to the rescue.

Straightforward: loop through all the vertices of the mesh, find the unit vector pointing from the vertex position towards the center of the sphere, and assign the calculated *(u,v)* coordinates to the vertex's TexCoord component. This is made even easier if you use unit-scale meshes, since the input to the mapping function is then just the vertex position negated.
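That loop boils down to a few lines. This is a sketch using one common form of the spherical mapping formulas (u from the azimuth via atan2, v from the inclination via asin); the `System.Numerics` types stand in for my engine's own vector types:

```csharp
using System;
using System.Numerics;

static class NaiveUvMapper
{
    // Spherical UV for a point on a unit sphere centered at the origin.
    // d is the unit vector from the vertex toward the sphere's center,
    // i.e. the negated (normalized) position.
    public static Vector2 SphereUv(Vector3 position)
    {
        Vector3 d = Vector3.Normalize(-position);
        float u = 0.5f + MathF.Atan2(d.Z, d.X) / (2 * MathF.PI);
        float v = 0.5f + MathF.Asin(d.Y) / MathF.PI;
        return new Vector2(u, v);
    }
}
```

Both u and v land in [0, 1], so the result can be assigned directly to the vertex's TexCoord.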

As you can see, it turned out not at all pretty: it looks like the sphere has a zipper running from the north pole to the south pole. Why is that?

## Diagnosing the zipper

When I first encountered this zipper artifact it took me a while to figure out what was going on. Pulling out the pen and paper and sketching what was going on in the UV mapping process helped immensely.

The UV mapping algorithm given on Wikipedia works for a sphere. But we don't have a sphere, we have a triangle mesh *resembling* a sphere. More importantly, the range of the UV mapping function coincides exactly with the circumference of our sphere. This means that a vertex lying on the boundary can belong to two or more triangles, some of which should map to the beginning of the texture and some to the end. Some triangles of this kind of sphere mesh will even straddle the boundary between the start and end of the texture. The zipper strip running up the sphere actually consists of triangles that have been UV mapped backwards, because they share vertices with triangles near the start of the texture. This causes almost the entire length of the texture to get mapped onto a single triangle in reverse, creating the weird zig-zag zipper pattern.

Hopefully the picture on the left can illustrate what I mean. The white line is the boundary where the texture wraps around, and the green dots are the vertices that are shared by multiple triangles on either side of the boundary. This sharing of vertices is the problem! Since we can’t assign multiple UV coordinates to a vertex (we can, but that would be a waste), we need to duplicate the problematic ones and fix up the triangles they belong to.

In the picture to the right I have drawn some of the problem triangles onto the texture (not to scale, mind). The two blue triangles represent triangles as they are supposed to be mapped. The yellow triangles are the ones we actually send to the pixel shader for texture sampling. As you can see on the wireframe there are many more cases, but the two blue triangles on the sketch represent the two general ones: the top blue triangle has two problem vertices and the bottom one has a single problem vertex. There can't be three problem vertices, since that would mean the triangle should have been mapped entirely outside the texture, which can't happen with the algorithm we used for UV mapping.

There are a couple of other things to note about the sketch. Notice how the blue triangles are wound clockwise, but the yellow triangles are wound counter-clockwise? Also, notice how it does not matter whether one or two vertices map incorrectly: in both cases the resulting triangle is wound in reverse.

## Fixing the zipper

It turns out that this winding reversal is key to detecting this particular problem. If we can detect when a triangle gets flipped after UV mapping to the texture, we have a good starting point for a fix.

Determining if a triangle has been reversed is pretty easy: we can use the same technique as when determining whether a triangle is back- or front-facing, namely calculating the normal. UV mapping takes all the triangles of a mesh and places them face up on a 1×1 plane. This means that the normal should be pointing up and out of the texture; if a triangle has been flipped, its normal will be pointing down. Of course, finding the normal requires taking the cross product of two of the triangle's legs, and the cross product is only defined in three-dimensional space, but that's OK. If we set the W component of the UVW coordinates to 0 and take the cross product, we get normals of the form (0, 0, W). If the W component is negative, we have a flipped triangle.

As a reminder, the face normal of a triangle *ABC* is calculated as *n* = (*B* − *A*) × (*C* − *A*). Below is the quick and dirty detection method. It simply runs through all the triangles of the mesh and determines whether the corresponding triangle on the texture is flipped. The method returns a list of indices of flipped triangles in the mesh.

```csharp
private int[] DetectWrappedUVCoordinates(Mesh mesh)
{
    List<int> indices = new List<int>();
    for (int i = 0; i < mesh.Triangles.Length; ++i)
    {
        uint a = mesh.Triangles[i].A;
        uint b = mesh.Triangles[i].B;
        uint c = mesh.Triangles[i].C;

        // Treat the UV coordinates as 3D points with W = 0.
        Vector3 texA = new Vector3(mesh.Vertices[a].TexCoord, 0);
        Vector3 texB = new Vector3(mesh.Vertices[b].TexCoord, 0);
        Vector3 texC = new Vector3(mesh.Vertices[c].TexCoord, 0);

        // A negative Z component means the triangle is wound in reverse.
        Vector3 texNormal = Vector3.Cross(texB - texA, texC - texA);
        if (texNormal.Z < 0)
            indices.Add(i);
    }
    return indices.ToArray();
}
```

Once we have found all the flipped triangles, we need to figure out which of their vertices are the problematic ones. We know that all the problem vertices have texture coordinates near the start of the texture, so checking where on the U axis a given coordinate lies tells us whether it needs to be fixed (the code below uses U < 0.25 as the threshold). And since we have already determined which triangles need fixing, we do not need to worry about messing up perfectly good triangles.

Moving on. With the problem vertices in hand (and the triangles they belong to), we can now go about fixing the zipper artifact. For each problem vertex we create a copy of it and add 1 to its U component. We then add the copy to the mesh's vertex array and note the index it was assigned. This index replaces the corresponding index of the old vertex in the triangle.

The code below takes the previously generated list of indices of flipped triangles and checks each of the vertices of those triangles. If a wrap is detected, the vertex is copied, fixed and added to the vertex list as a new vertex. The code also keeps a map of already fixed vertices, since we want to avoid duplicating a vertex more than once.

```csharp
private void FixWrappedUV(int[] wrapped, Mesh sphereMesh)
{
    List<Vertex> vertices = new List<Vertex>(sphereMesh.Vertices);
    uint verticeIndex = (uint)vertices.Count - 1;

    // Maps the index of an already duplicated vertex to the index of its
    // copy, so no vertex gets duplicated more than once.
    Dictionary<uint, uint> visited = new Dictionary<uint, uint>();

    foreach (int i in wrapped)
    {
        uint a = sphereMesh.Triangles[i].A;
        uint b = sphereMesh.Triangles[i].B;
        uint c = sphereMesh.Triangles[i].C;

        Vertex A = sphereMesh.Vertices[a];
        Vertex B = sphereMesh.Vertices[b];
        Vertex C = sphereMesh.Vertices[c];

        if (A.TexCoord.X < 0.25f)
        {
            uint tempA;
            if (!visited.TryGetValue(a, out tempA))
            {
                A.TexCoord.X += 1;
                vertices.Add(A);
                verticeIndex++;
                visited[a] = verticeIndex;
                tempA = verticeIndex;
            }
            a = tempA;
        }

        if (B.TexCoord.X < 0.25f)
        {
            uint tempB;
            if (!visited.TryGetValue(b, out tempB))
            {
                B.TexCoord.X += 1;
                vertices.Add(B);
                verticeIndex++;
                visited[b] = verticeIndex;
                tempB = verticeIndex;
            }
            b = tempB;
        }

        if (C.TexCoord.X < 0.25f)
        {
            uint tempC;
            if (!visited.TryGetValue(c, out tempC))
            {
                C.TexCoord.X += 1;
                vertices.Add(C);
                verticeIndex++;
                visited[c] = verticeIndex;
                tempC = verticeIndex;
            }
            c = tempC;
        }

        sphereMesh.Triangles[i].A = a;
        sphereMesh.Triangles[i].B = b;
        sphereMesh.Triangles[i].C = c;
    }
    sphereMesh.Vertices = vertices.ToArray();
}
```

## Swirling at the poles

With the fixed vertices we can now boot up the engine again and see that the zipper is gone, replaced by the expected seam because the texture does not tile. Great. But there is another problem at the north pole, and at the south pole too. There is a kind of swirling pattern going on.

As you might have guessed by now, these swirls are also due to vertex sharing. But why wasn’t that fixed along with the zipper? Because the triangles aren’t flipped, they are just being stretched. My sphere mesh has only one vertex at the poles, but it has six triangles sharing that vertex.

As the left picture shows, triangle 1 is ok because all its vertices have the correct UV coordinates. Triangle 6 is also ok because that was flipped and the previous algorithm fixed that issue. Triangles 2 through 5 are the problem. The good news is that we now know what the problem is, the bad news is that there is no one-size-fits-all solution.

To the right I have cut out the top of the texture and mapped the triangles surrounding the pole onto it. There is no way to fill the area between triangles 1 and 6 with any set of triangles if we are only allowed to place one vertex of each triangle at the top edge. Why this restriction, why not just place a few of them bottom up? Because the entire top edge of the texture (and the bottom edge too) maps to one single vertex on the model. If we placed a triangle bottom up in the area between 1 and 6, that triangle would get squeezed down to a single line, thereby re-introducing the swirling pattern. In the end, no matter how we turn or stretch triangles 2 through 5, we will get seams.

One solution I have been able to find is to take the average of the U components of the other two vertices that make up any given triangle at the poles. The average U component approach minimizes the internal stretching of the triangles, by spacing the triangle tips evenly along the top edge of the texture. In the picture below you can still see the seams, but the swirling is gone.

The following code takes advantage of the fact that, in my sphere mesh, the vertex at a pole will always be vertex *A* of a triangle. The code first finds the north and south pole vertices and makes a note of their respective indices, then finds each triangle that uses either pole. Each time a triangle is found to have a vertex on a pole, the polar vertex is duplicated and its *U* coordinate is updated to the average of the two other vertices' *U* coordinates. Lastly, the mesh is updated with the new list of vertices.

```csharp
private void FixSharedPoleVertices(Mesh sphereMesh)
{
    Vertex north = sphereMesh.Vertices.First(v => v.Position.Y == 1);
    Vertex south = sphereMesh.Vertices.First(v => v.Position.Y == -1);
    uint northIndex = (uint)Array.IndexOf(sphereMesh.Vertices, north);
    uint southIndex = (uint)Array.IndexOf(sphereMesh.Vertices, south);

    List<Vertex> vertices = new List<Vertex>(sphereMesh.Vertices);
    uint verticeIndex = (uint)sphereMesh.Vertices.Length - 1;

    for (int i = 0; i < sphereMesh.Triangles.Length; ++i)
    {
        if (sphereMesh.Triangles[i].A == northIndex)
        {
            Vertex B = sphereMesh.Vertices[sphereMesh.Triangles[i].B];
            Vertex C = sphereMesh.Vertices[sphereMesh.Triangles[i].C];

            // Duplicate the pole vertex and average the other two U coordinates.
            Vertex newNorth = north;
            newNorth.TexCoord.X = (B.TexCoord.X + C.TexCoord.X) / 2;
            verticeIndex++;
            vertices.Add(newNorth);
            sphereMesh.Triangles[i].A = verticeIndex;
        }
        else if (sphereMesh.Triangles[i].A == southIndex)
        {
            Vertex B = sphereMesh.Vertices[sphereMesh.Triangles[i].B];
            Vertex C = sphereMesh.Vertices[sphereMesh.Triangles[i].C];

            Vertex newSouth = south;
            newSouth.TexCoord.X = (B.TexCoord.X + C.TexCoord.X) / 2;
            verticeIndex++;
            vertices.Add(newSouth);
            sphereMesh.Triangles[i].A = verticeIndex;
        }
    }
    sphereMesh.Vertices = vertices.ToArray();
}
```

This is about as far as we can get working with the UV mapping in isolation, to get any better results than the one in the picture we need to put some further work into the texture.

The easy solution here is to smear the texture around the top and bottom: if there are no drastic color differences between the pixels at the edges, there won't be any on the 3D model either. In the final picture I have added a SmootherStep term to the Perlin noise function. The term adds an amount of pure white to the pixel color based on that pixel's distance to the polar cap, and around the caps it completely drowns out the original color. I have set the threshold of the added term to about 75 pixels from the top and bottom of the texture. This makes the sphere look like it has polar ice caps, and if I may say so myself, for ten minutes of work it turned out not too bad.

Updated *CreateTexture* method.

```csharp
public Bitmap CreateTexture(int width, int height)
{
    Bitmap image = new Bitmap(width, height);
    Color baseColor = Color.FromArgb(255, 127, 35, 7);
    SimplexNoise generator = new SimplexNoise(seed: -5000);
    int polarCapStart = 75;

    for (int x = 0; x < width; ++x)
    {
        for (int y = 0; y < height; ++y)
        {
            double dX = x / (double)width;
            double dY = y / (double)height;
            double noise = generator.PerlinNoise(dX, dY, octaves: 3, persistence: 3.0f / 4.0f, lacunarity: 3f);

            // Pure white terms that ramp up towards the top and bottom edges.
            float polarCapNorth = 255 * MathF.SmootherStep(height - polarCapStart, height - 1, y);
            float polarCapSouth = 255 * (1 - MathF.SmootherStep(0, polarCapStart, y));

            int redHue = (int)(baseColor.R * noise) + baseColor.R + (int)polarCapNorth + (int)polarCapSouth;
            int greenHue = (int)(baseColor.G * noise) + baseColor.G + (int)polarCapNorth + (int)polarCapSouth;
            int blueHue = (int)(baseColor.B * noise) + baseColor.B + (int)polarCapNorth + (int)polarCapSouth;

            Color col = Color.FromArgb(
                baseColor.A,
                MathI.Clamp(0, 255, redHue),
                MathI.Clamp(0, 255, greenHue),
                MathI.Clamp(0, 255, blueHue));

            image.SetPixel(x, y, col);
        }
    }
    return image;
}
```
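The `MathF.SmootherStep` and `MathI.Clamp` calls above are helpers from my own math library, not standard .NET. For completeness, a minimal sketch of equivalents (the `MathHelpers` name is just for illustration); SmootherStep is Ken Perlin's quintic remap 6t⁵ − 15t⁴ + 10t³, which eases in and out at both edges:

```csharp
using System;

static class MathHelpers
{
    // Perlin's "smootherstep": clamps x into [edge0, edge1] and remaps the
    // normalized value t with 6t^5 - 15t^4 + 10t^3, which has zero first
    // and second derivatives at both ends.
    public static float SmootherStep(float edge0, float edge1, float x)
    {
        float t = Math.Clamp((x - edge0) / (edge1 - edge0), 0f, 1f);
        return t * t * t * (t * (t * 6f - 15f) + 10f);
    }

    // Clamp an int into [min, max].
    public static int Clamp(int min, int max, int value)
    {
        return Math.Max(min, Math.Min(max, value));
    }
}
```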

This is about all the time I am willing to invest in this UV mapping scheme. For texturing a sphere the old-fashioned way, the natural next step would be to look at some of the other projection types that exist, such as cube mapping or HEALPix. However, since my engine is going to be working with procedural assets and I already have a perfectly good Perlin noise implementation, my next step is going to be implementing that function on the GPU. This will let me sample the noise per pixel (inside the pixel shader) without having to worry about classic artifacts like the zipper seam.

So that is going to be the theme of the next post I think: how to texture a sphere without having a texture.

Ultimately, UV mapping an icosphere has been a pain and, so far, without a usable result. I have decided to go with a subdivided cube inflated into a sphere instead. While the vertices of that sphere are not equally distributed, the deviation is minor and can be worked around. The icosphere would have been a great basis for mesh creation, since all its vertices are evenly distributed, but without a workable UV mapping it is a dead end.