Chapter 5: Tile Geometry

Generating spherical meshes from tile coordinates

Introduction

GIS raster tiles are "images," but to map them onto a sphere, you need a "surface" to receive them. In Three.js, that surface is built with BufferGeometry, where you define the vertices, normals, UVs, and indices yourself to construct a polygon mesh.

A key GIS-specific consideration is Mercator Y interpolation for UV coordinates. Since tile images created with Mercator projection have a non-linear pixel distribution in the latitude direction, correction is required to map them correctly onto a spherical mesh. Try switching to wireframe mode to see the actual mesh structure.

What You Will Learn

  • How to generate a spherical mesh from tile coordinates (z/x/y) — programmatically creating the surface on which tile images are mapped
  • The roles of vertices, normals, UV coordinates, and indices — the four data arrays that make up a 3D mesh
  • Texture coordinate alignment via Mercator Y interpolation — a GIS-specific process to correct Mercator projection distortion
  • Transformation chain: (u,v) → (lon, mercY) → lat → ECEF → scene coordinates
  • Index winding order (CCW) and back-face culling
  • Colorful mesh display at z=2 (4x4 = 16 tiles) — visualizing tile boundaries and subdivision patterns

Display Toggle

Turning wireframe ON reveals how each tile is subdivided into a 16x16 grid. This is the actual polygon mesh that approximates the spherical surface.

What Does It Mean to "Map" a Tile onto the Sphere?

A tile image is a flat square image (256x256px), while the Earth is a sphere (ellipsoidal surface). To map a tile onto the sphere, you need to generate a mesh (a collection of triangles) on the spherical surface covering the tile's extent, then apply the tile image as a texture.

Tile image (256x256px)       Spherical mesh

+------------------+         /----------\
|                  |        / / / / / /  \
|     OSM Tile     |   →   |  / / / / /  |
|                  |        \ / / / / /  /
+------------------+         \----------/

   Mapped via UVs → Converted to spherical coords

Why Generate a Mesh for Each Tile?

If you create the entire Earth as a single giant mesh, the resolution will be insufficient when zooming close to the surface, causing the curved surface to appear blocky. Conversely, making the entire mesh high-resolution would result in an enormous vertex count and poor performance.

By generating meshes per tile, you can achieve LOD (Level of Detail) control — fine detail near the camera, coarse detail far away. This is the foundational design for the SSE implementation in the next chapter.

The computeTileGeometry Function

Spherical mesh generation is handled by the computeTileGeometry function in core/tile-geometry.ts. This function has no dependency on Three.js and returns raw data as Float32Array and Uint32Array.

Parameter | Default | Description
x, y, z   | -       | Tile coordinates
segments  | 16      | Subdivision count (16x16 = 256 cells)
heightFn  | None    | Function that returns elevation for each grid point

segments = 16 is a value referenced from the PLATEAU prototype implementation. With 16 subdivisions, you get 289 vertices and 512 triangles, which sufficiently represents spherical curvature while maintaining reasonable GPU load.

The Four Components of a Mesh

A Three.js BufferGeometry consists of four data arrays:

  • vertices (position coordinates) — The x, y, z coordinates for each vertex. The tile's lat/lon extent is divided into a segments x segments grid, and each point is converted to 3D coordinates on the WGS84 ellipsoid surface
  • normals — The "outward" direction vector at each vertex. Used in lighting calculations to determine surface shading. Approximated by normalizing the direction vector from the origin (sufficient because the flattening is only about 0.3%)
  • uvs (texture coordinates) — Which position in the tile image corresponds to this vertex. Specified in the 0-1 range. Requires Mercator Y interpolation (discussed below)
  • indices — The order in which vertices are grouped into sets of three to form triangles. Each grid cell is split into two triangles

Using computeTileGeometry
const data = computeTileGeometry(
  x, y, z,
  16  // segments: 16x16 grid
);

const geometry = new THREE.BufferGeometry();
geometry.setAttribute(
  'position',
  new THREE.BufferAttribute(data.vertices, 3)
);
geometry.setAttribute(
  'normal',
  new THREE.BufferAttribute(data.normals, 3)
);
geometry.setAttribute(
  'uv',
  new THREE.BufferAttribute(data.uvs, 2)
);
geometry.setIndex(
  new THREE.BufferAttribute(data.indices, 1)
);

Vertex Generation: Interpolation in Mercator Y Space

The key to vertex generation is interpolating latitude in Mercator Y space.

Why not linearly interpolate latitude directly? Tile images are created using the Mercator projection. In Mercator projection, the Y-axis stretching increases at higher latitudes. Tile image pixel coordinates are linear with respect to Mercator Y, not with respect to latitude.

If you linearly interpolate latitude directly, the texture will be misaligned, causing roads and coastlines to appear distorted. Interpolating in Mercator Y space ensures that the tile image's UV coordinates correctly correspond to the spherical mesh vertices.
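The chapter relies on latToMercatorY and mercatorYToLat without showing them; assuming they implement the standard (spherical) Web Mercator formulas, a minimal sketch looks like this, together with a demonstration of why the interpolation space matters:

```typescript
// Sketch of the Web Mercator helpers assumed by computeTileGeometry.
// Latitude is in degrees; Mercator Y is the standard unbounded value
// (0 at the equator, growing toward the poles).

function latToMercatorY(latDeg: number): number {
  const lat = (latDeg * Math.PI) / 180;
  return Math.log(Math.tan(Math.PI / 4 + lat / 2));
}

function mercatorYToLat(mercY: number): number {
  const lat = 2 * Math.atan(Math.exp(mercY)) - Math.PI / 2;
  return (lat * 180) / Math.PI;
}

// Halfway in Mercator Y is NOT halfway in latitude (example values):
const north = 66.51326;
const south = 0;
const midMercLat = mercatorYToLat(
  (latToMercatorY(north) + latToMercatorY(south)) / 2
);
const midLinearLat = (north + south) / 2;
// midMercLat lies noticeably north of the linear midpoint midLinearLat,
// because Mercator Y stretches faster at higher latitudes.
console.log(midMercLat > midLinearLat);
```

This gap between the two midpoints is exactly the texture misalignment you would see if latitude were interpolated linearly.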

Core of vertex generation
const mercYNorth =
  latToMercatorY(bounds.north);
const mercYSouth =
  latToMercatorY(bounds.south);

for (let j = 0; j <= segments; j++) {
  for (let i = 0; i <= segments; i++) {
    const u = i / segments;
    const v = j / segments;
    const lon = bounds.west
      + (bounds.east - bounds.west) * u;
    // Interpolate in Mercator Y space!
    const mercY = mercYNorth
      + (mercYSouth - mercYNorth) * v;
    const lat = mercatorYToLat(mercY);

    const ecef =
      geodeticToEcef(lat, lon, alt);
    const scene =
      ecefToScenePosition(ecef.x, ecef.y, ecef.z);
    // Store scene coordinates in the vertices array
  }
}

Transformation Chain

Each vertex is computed through the following transformation chain:

(u, v) → (lon, mercY) → lat → (lat, lon, alt) → ECEF → scene coordinates
  • Convert grid point (i, j) to normalized coordinates (u, v)
  • Linearly interpolate longitude from u, and Mercator Y from v
  • Convert Mercator Y back to latitude
  • Convert latitude, longitude, and altitude to ECEF coordinates
  • Transform ECEF coordinates to scene coordinates via axis conversion
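The bounds values (west/east/north/south) used in the vertex loop come from the slippy-map tiling scheme. The chapter does not show that conversion, but under the standard scheme it can be sketched as follows (tileToBounds is a hypothetical helper name, not from the chapter's code):

```typescript
// Hypothetical helper: lat/lon extent of slippy-map tile (x, y, z).
// Standard Web Mercator tiling: 2^z columns and rows, longitude linear
// in x, latitude recovered via the Gudermannian function.
function tileToBounds(x: number, y: number, z: number) {
  const n = 2 ** z;
  const lonAt = (col: number) => (col / n) * 360 - 180;
  const latAt = (row: number) =>
    (Math.atan(Math.sinh(Math.PI * (1 - (2 * row) / n))) * 180) / Math.PI;
  return {
    west: lonAt(x),
    east: lonAt(x + 1),
    north: latAt(y),     // smaller y is further north
    south: latAt(y + 1),
  };
}

// Tile 0/0/0 covers the whole Web Mercator world:
console.log(tileToBounds(0, 0, 0));
// → west -180, east 180, north ≈ 85.0511, south ≈ -85.0511
```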

Normal Calculation

The surface normal vector on the Earth can be approximated by the direction vector from the origin (Earth's center) to the vertex. Strictly speaking, the ellipsoid normal does not align exactly with the direction from the origin, but since the flattening is small (about 0.3%), this approximation is sufficient.

Normal calculation
// Spherical normal = normalized vertex position
const len = Math.sqrt(
  scene.x*scene.x
  + scene.y*scene.y
  + scene.z*scene.z
);
normals[idx*3]   = scene.x / len;
normals[idx*3+1] = scene.y / len;
normals[idx*3+2] = scene.z / len;
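To quantify why the approximation is acceptable: the angle between the true ellipsoid normal and the geocentric direction equals the difference between geodetic and geocentric latitude, which peaks near 45°. A quick check using the standard WGS84 eccentricity (the constant and formula below are textbook values, not part of the chapter's code):

```typescript
// WGS84 first eccentricity squared.
const E2 = 0.00669437999014;

// Geocentric latitude of a surface point at geodetic latitude phi:
// tan(phi_geocentric) = (1 - e^2) * tan(phi_geodetic).
// The returned value is the angular error of the spherical-normal
// approximation at that latitude, in degrees.
function normalErrorDeg(geodeticDeg: number): number {
  const phi = (geodeticDeg * Math.PI) / 180;
  const phiC = Math.atan((1 - E2) * Math.tan(phi));
  return ((phi - phiC) * 180) / Math.PI;
}

// Scan for the worst case; it occurs near 45° latitude.
let worst = 0;
for (let d = 0; d <= 90; d += 0.5) worst = Math.max(worst, normalErrorDeg(d));
console.log(worst.toFixed(3)); // ≈ 0.192 degrees
```

A maximum error of roughly 0.19° is far below anything visible in diffuse shading, which is why normalizing the position vector is sufficient here.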

UV Coordinates

The UV coordinate setup appears simple, but note the 1 - v inversion.

UV coordinates
uvs[idx*2]   = u;
uvs[idx*2+1] = 1 - v;
  • j = 0 (north edge) → v = 0 → UV v = 1 (top of texture)
  • j = segments (south edge) → v = 1 → UV v = 0 (bottom of texture)

Three.js textures default to flipY = true, which flips the image vertically when it is uploaded to the GPU. Combined with OpenGL's UV convention (v = 0 at the bottom, v = 1 at the top), this means the top pixel row of the image ends up at UV v = 1. Since the north side is at the top of tile images (the y = 0 pixel row), 1 - v ensures the north edge correctly samples the top of the texture.
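As a quick sanity check, the two bullet points above can be expressed as a tiny function (uvV is a hypothetical name, not part of the chapter's code):

```typescript
// Stored UV v for grid row j (0 = north edge, segments = south edge).
const segments = 16;
function uvV(j: number): number {
  return 1 - j / segments;
}

console.log(uvV(0));        // north edge → 1 (top of texture)
console.log(uvV(segments)); // south edge → 0 (bottom of texture)
```

If a loader leaves the image unflipped (flipY = false), the inversion would have to be dropped and v used directly instead.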

Index Generation

Each cell (grid square) is split into two triangles along the b–c diagonal.

a --- b
|   / |
|  /  |
c --- d

Triangle 1: a → c → b

Triangle 2: b → c → d

Index generation
for (let j = 0; j < segments; j++) {
  for (let i = 0; i < segments; i++) {
    const a = j*(segments+1) + i;
    const b = a + 1;
    const c = (j+1)*(segments+1) + i;
    const d = c + 1;

    indices[ptr++] = a; // Triangle 1
    indices[ptr++] = c;
    indices[ptr++] = b;

    indices[ptr++] = b; // Triangle 2
    indices[ptr++] = c;
    indices[ptr++] = d;
  }
}

The vertex winding order is counter-clockwise (CCW), matching Three.js's default front-face determination. This ordering ensures the cross product (normal direction) points outward from the sphere, allowing correct rendering with THREE.FrontSide.

Back faces (faces seen from inside the Earth) are not drawn. This optimization is called back-face culling, and for a closed surface like the globe it roughly halves the number of triangles the GPU must rasterize.
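You can verify the CCW claim on a single flat cell: with a at top-left, b at top-right, and c at bottom-left as in the diagram, the cross product for triangle a → c → b should point toward the viewer (+z here stands in for "outside the sphere"). A minimal sketch:

```typescript
// Normal of triangle (p, q, r) via the cross product (q - p) × (r - q).
type V3 = [number, number, number];

function triNormal(p: V3, q: V3, r: V3): V3 {
  const u: V3 = [q[0] - p[0], q[1] - p[1], q[2] - p[2]];
  const v: V3 = [r[0] - q[0], r[1] - q[1], r[2] - q[2]];
  return [
    u[1] * v[2] - u[2] * v[1],
    u[2] * v[0] - u[0] * v[2],
    u[0] * v[1] - u[1] * v[0],
  ];
}

const a: V3 = [0, 1, 0]; // top-left
const b: V3 = [1, 1, 0]; // top-right
const c: V3 = [0, 0, 0]; // bottom-left

const n = triNormal(a, c, b); // triangle 1: a → c → b
console.log(n[2] > 0); // true → CCW as seen from +z, i.e. a front face
```

If the winding were reversed (a → b → c), the z component would be negative and Three.js would cull the face under THREE.FrontSide.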

Numerical Example (segments = 16)

Item           | Value
Vertex count   | (16+1)² = 289
Cell count     | 16² = 256
Triangle count | 256 x 2 = 512
Index count    | 512 x 3 = 1,536
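These figures follow directly from segments; a small helper (hypothetical, not part of core/tile-geometry.ts) reproduces the table for any subdivision count:

```typescript
// Mesh sizes for a (segments x segments)-cell tile grid.
function meshCounts(segments: number) {
  const vertices = (segments + 1) ** 2; // grid points
  const cells = segments ** 2;          // quads
  const triangles = cells * 2;          // two triangles per quad
  const indices = triangles * 3;        // three indices per triangle
  return { vertices, cells, triangles, indices };
}

console.log(meshCounts(16));
// → { vertices: 289, cells: 256, triangles: 512, indices: 1536 }
```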

The heightFn Parameter

The heightFn parameter is a function that returns the elevation (in meters) for each grid point (i, j). When omitted, elevation defaults to 0 (on the ellipsoid surface). This feature serves as the foundation for the "Terrain" functionality in later chapters.

heightFn usage example
// Place all vertices 100km above the surface
const elevated = computeTileGeometry(
  0, 0, 2, 16, () => 100_000
);

// Variable elevation per grid point (terrain mesh)
const terrain = computeTileGeometry(
  0, 0, 10, 16,
  (i, j) => heightData[j * 17 + i]
);

Three.js Mesh Generation

The raw data returned by core/tile-geometry.ts is converted into a Three.js Mesh by renderer/tile-mesh.ts. Thanks to the separation between the core and renderer layers, this conversion code is very thin.

createTileMesh
export function createTileMesh(
  x: number, y: number, z: number,
  texture: THREE.Texture,
  segments: number = 16
): THREE.Mesh {
  const data =
    computeTileGeometry(x, y, z, segments);
  const geometry = new THREE.BufferGeometry();
  // ... set attributes ...
  const material =
    new THREE.MeshStandardMaterial({
      map: texture,
      side: THREE.FrontSide
    });
  return new THREE.Mesh(geometry, material);
}

MeshStandardMaterial is chosen because it is a PBR (Physically Based Rendering) material that responds to lighting, producing natural shading from the DirectionalLight.

Adjacent Tile Seams

A crucial quality requirement in tile-based rendering is normal consistency at adjacent tile boundaries. Because boundary vertices on both sides are generated from identical longitude and latitude values, the meshes meet seamlessly.

Adjacent tile normal consistency test
it('normals match at adjacent tile boundary vertices',
() => {
  const left =
    computeTileGeometry(0,0,1, segments);
  const right =
    computeTileGeometry(1,0,1, segments);
  // Right edge column of left tile matches left edge column of right tile
  for (let j = 0; j <= segments; j++) {
    const leftIdx =
      j*(segments+1) + segments;
    const rightIdx =
      j*(segments+1) + 0;
    for (let c=0; c<3; c++)
      expect(left.normals[leftIdx*3+c])
        .toBeCloseTo(
          right.normals[rightIdx*3+c], 5
        );
  }
});

If this test fails (normals become discontinuous), a lighting "seam" will be visible at the boundary between adjacent tiles.

Color Coding

Each tile's HSL hue is calculated based on its tile position. hue = ((x + y * n) / n²) * 360 produces a rainbow gradient across the 16 tiles, making tile boundaries and subdivision patterns intuitively understandable.
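A sketch of that hue assignment at z=2, where n = 2^z = 4 (tileHue and the HSL string below are illustrative, not the demo's actual code):

```typescript
// Hue for tile (x, y) at zoom z, spreading all n*n tiles over 0-360°.
function tileHue(x: number, y: number, z: number): number {
  const n = 2 ** z;
  return ((x + y * n) / (n * n)) * 360;
}

// A CSS-style HSL color per tile (saturation/lightness chosen arbitrarily):
const tileColor = (x: number, y: number) =>
  `hsl(${tileHue(x, y, 2)}, 70%, 50%)`;

console.log(tileHue(0, 0, 2)); // → 0     (first tile)
console.log(tileHue(3, 3, 2)); // → 337.5 (last tile)
```

Feeding tileColor into the material's color property (instead of a texture map) is enough to produce the rainbow grid seen in the demo.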

Summary

In this chapter, we implemented the process of generating spherical meshes for tiles.

  • Mercator Y interpolation: the key to correctly aligning textures with meshes
  • Transformation chain: UV → lon/Mercator Y → lat → ECEF → scene coordinates
  • Normals: approximated by normalizing the direction vector from the origin
  • UV inversion: 1 - v to match Three.js's texture coordinate system
  • Indices: each cell split into 2 triangles, CCW winding order
  • heightFn: foundation for terrain mesh generation

In the next chapter, we will implement SSE-based LOD control and frustum culling to determine "which tiles should be displayed."