Generating spherical meshes from tile coordinates
GIS raster tiles are "images," but to map them onto a sphere, you need a "surface" to receive them. In Three.js, you construct that polygon mesh yourself with BufferGeometry, defining the vertices, normals, UVs, and indices directly.
A key GIS-specific consideration is Mercator Y interpolation for UV coordinates. Since tile images created with Mercator projection have a non-linear pixel distribution in the latitude direction, correction is required to map them correctly onto a spherical mesh. Try switching to wireframe mode to see the actual mesh structure.
Turning wireframe ON reveals how each tile is subdivided into a 16x16 grid. This is the actual polygon mesh that approximates the spherical surface.
A tile image is a flat square image (256x256px), while the Earth is a sphere (ellipsoidal surface). To map a tile onto the sphere, you need to generate a mesh (a collection of triangles) on the spherical surface covering the tile's extent, then apply the tile image as a texture.
```
Tile image (256x256px)         Spherical mesh
+------------------+            /----------\
|                  |           / / / / / /  \
|     OSM Tile     |    →     |  / / / / /  |
|                  |           \ / / / / /  /
+------------------+            \----------/
   Mapped via UVs        Converted to spherical coords
```
If you create the entire Earth as a single giant mesh, the resolution will be insufficient when zooming close to the surface, causing the curved surface to appear blocky. Conversely, making the entire mesh high-resolution would result in an enormous vertex count and poor performance.
By generating meshes per tile, you can achieve LOD (Level of Detail) control — fine detail near the camera, coarse detail far away. This is the foundational design for the SSE implementation in the next chapter.
Spherical mesh generation is handled by the computeTileGeometry function in core/tile-geometry.ts.
This function has no dependency on Three.js and returns raw data as Float32Array and Uint32Array.
| Parameter | Default | Description |
|---|---|---|
| x, y, z | - | Tile coordinates |
| segments | 16 | Subdivision count (16x16 = 256 cells) |
| heightFn | None | Function that returns elevation for each grid point |
segments = 16 is a value referenced from the PLATEAU prototype implementation. With a 16x16 subdivision you get 289 vertices and 512 triangles, which represents the spherical curvature well while keeping the GPU load reasonable.
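The `bounds` used later in vertex generation are derived from the tile coordinates. The chapter does not show this helper, so here is a minimal sketch using the standard slippy-map formulas (the actual function in core/ may have a different name and signature):

```typescript
// Hypothetical helper: geographic bounds of tile (x, y, z).
// Longitude is linear in tile X; latitude comes from the
// inverse Web Mercator projection.
interface TileBounds {
  west: number; east: number;   // longitude, degrees
  north: number; south: number; // latitude, degrees
}

function tileToBounds(x: number, y: number, z: number): TileBounds {
  const n = 2 ** z; // tiles per axis at this zoom level
  const west = (x / n) * 360 - 180;
  const east = ((x + 1) / n) * 360 - 180;
  const latFromTileY = (ty: number) =>
    (Math.atan(Math.sinh(Math.PI * (1 - (2 * ty) / n))) * 180) / Math.PI;
  return { west, east, north: latFromTileY(y), south: latFromTileY(y + 1) };
}
```

Tile (0, 0, 0) covers the full Web Mercator extent: longitudes ±180° and latitudes ±85.0511°.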
A Three.js BufferGeometry consists of four data arrays:
```typescript
const data = computeTileGeometry(
  x, y, z,
  16 // segments: 16x16 grid
);

const geometry = new THREE.BufferGeometry();
geometry.setAttribute(
  'position',
  new THREE.BufferAttribute(data.vertices, 3)
);
geometry.setAttribute(
  'normal',
  new THREE.BufferAttribute(data.normals, 3)
);
geometry.setAttribute(
  'uv',
  new THREE.BufferAttribute(data.uvs, 2)
);
geometry.setIndex(
  new THREE.BufferAttribute(data.indices, 1)
);
```

The key to vertex generation is interpolating latitude in Mercator Y space.
Why not linearly interpolate latitude directly? Tile images are created using the Mercator projection. In Mercator projection, the Y-axis stretching increases at higher latitudes. Tile image pixel coordinates are linear with respect to Mercator Y, not with respect to latitude.
If you linearly interpolate latitude directly, the texture will be misaligned, causing roads and coastlines to appear distorted. Interpolating in Mercator Y space ensures that the tile image's UV coordinates correctly correspond to the spherical mesh vertices.
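The two conversion helpers used in the vertex loop can be sketched as follows. This is a minimal version assuming the unnormalized (radian-scale) Mercator Y; the actual implementations in core/ may differ:

```typescript
// Forward Mercator: latitude (degrees) → Mercator Y.
// Stretches toward the poles, matching tile-image pixel distribution.
function latToMercatorY(latDeg: number): number {
  const lat = (latDeg * Math.PI) / 180;
  return Math.log(Math.tan(Math.PI / 4 + lat / 2));
}

// Inverse Mercator: Mercator Y → latitude (degrees).
function mercatorYToLat(mercY: number): number {
  return ((2 * Math.atan(Math.exp(mercY)) - Math.PI / 2) * 180) / Math.PI;
}
```

The two functions are exact inverses, and the equator maps to Mercator Y = 0.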
```typescript
const mercYNorth = latToMercatorY(bounds.north);
const mercYSouth = latToMercatorY(bounds.south);

for (let j = 0; j <= segments; j++) {
  for (let i = 0; i <= segments; i++) {
    const u = i / segments;
    const v = j / segments;
    const lon = bounds.west + (bounds.east - bounds.west) * u;
    // Interpolate in Mercator Y space!
    const mercY = mercYNorth + (mercYSouth - mercYNorth) * v;
    const lat = mercatorYToLat(mercY);
    const ecef = geodeticToEcef(lat, lon, alt);
    const scene = ecefToScenePosition(ecef.x, ecef.y, ecef.z);
    // Store scene coordinates in the vertices array
  }
}
```

Each vertex is computed through the following transformation chain:

grid (u, v) → (lon, Mercator Y) → geodetic (lat, lon, alt) → ECEF → scene coordinates
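The geodetic → ECEF step of that chain can be sketched as the standard WGS84 conversion. This is an illustrative version; the actual geodeticToEcef in core/ may differ in units or axis conventions:

```typescript
// Geodetic (lat, lon, alt) → ECEF (Earth-Centered, Earth-Fixed)
// on the WGS84 ellipsoid.
const WGS84_A = 6378137.0;                 // semi-major axis (m)
const WGS84_F = 1 / 298.257223563;         // flattening (~0.3%)
const WGS84_E2 = WGS84_F * (2 - WGS84_F);  // first eccentricity squared

function geodeticToEcef(latDeg: number, lonDeg: number, alt: number) {
  const lat = (latDeg * Math.PI) / 180;
  const lon = (lonDeg * Math.PI) / 180;
  // Prime vertical radius of curvature at this latitude
  const N = WGS84_A / Math.sqrt(1 - WGS84_E2 * Math.sin(lat) ** 2);
  return {
    x: (N + alt) * Math.cos(lat) * Math.cos(lon),
    y: (N + alt) * Math.cos(lat) * Math.sin(lon),
    z: (N * (1 - WGS84_E2) + alt) * Math.sin(lat),
  };
}
```

At (0°, 0°, 0 m) this yields x = 6378137 (the semi-major axis); at the pole, z approaches the semi-minor axis, about 6356752.3 m.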
The surface normal vector on the Earth can be approximated by the direction vector from the origin (Earth's center) to the vertex. Strictly speaking, the ellipsoid normal does not align exactly with the direction from the origin, but since the flattening is small (about 0.3%), this approximation is sufficient.
```typescript
// Spherical normal = normalized vertex position
const len = Math.sqrt(
  scene.x * scene.x + scene.y * scene.y + scene.z * scene.z
);
normals[idx * 3]     = scene.x / len;
normals[idx * 3 + 1] = scene.y / len;
normals[idx * 3 + 2] = scene.z / len;
```

The UV coordinate setup appears simple, but note the 1 - v inversion.
```typescript
uvs[idx * 2]     = u;
uvs[idx * 2 + 1] = 1 - v;
```

Three.js textures default to flipY = true, which flips the image vertically on load. As a result, the mapping between OpenGL's UV coordinate system (v=0 at the bottom, v=1 at the top) and image pixels is swapped.
Since the north side is at the top of tile images (y=0 side), 1 - v ensures the north edge correctly corresponds to the top of the UV space.
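The inversion can be isolated into a tiny helper, sketched here for illustration (not a function from the chapter's code):

```typescript
// UV for grid point (i, j): v is flipped so the north row (j = 0)
// lands at the top of the texture under Three.js's flipY = true default.
function gridUv(i: number, j: number, segments: number) {
  return { u: i / segments, v: 1 - j / segments };
}
```

The north-west grid corner (i = 0, j = 0) thus maps to UV (0, 1), the top-left of the tile image.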
Each cell (grid square) is split into two triangles.
```
a --- b
|   / |
| /   |
c --- d

Triangle 1: a → c → b
Triangle 2: b → c → d
```
```typescript
for (let j = 0; j < segments; j++) {
  for (let i = 0; i < segments; i++) {
    const a = j * (segments + 1) + i;
    const b = a + 1;
    const c = (j + 1) * (segments + 1) + i;
    const d = c + 1;
    indices[ptr++] = a; // Triangle 1
    indices[ptr++] = c;
    indices[ptr++] = b;
    indices[ptr++] = b; // Triangle 2
    indices[ptr++] = c;
    indices[ptr++] = d;
  }
}
```

The vertex winding order is counter-clockwise (CCW), matching Three.js's default front-face determination. This ordering ensures the cross product (normal direction) points outward from the sphere, allowing correct rendering with THREE.FrontSide.
Back faces (faces seen from inside the Earth) are not drawn. This optimization is called back-face culling and reduces the number of rendered triangles by half.
| Item | Value |
|---|---|
| Vertex count | (16+1)² = 289 |
| Cell count | 16² = 256 |
| Triangle count | 256 x 2 = 512 |
| Index count | 512 x 3 = 1,536 |
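The figures in the table follow directly from the subdivision count. As a quick sanity check, here is a small sketch (a hypothetical helper, not part of the chapter's code) that reproduces them for any segments value:

```typescript
// Per-tile mesh statistics for a given subdivision count.
function tileMeshStats(segments: number) {
  const vertices = (segments + 1) ** 2;  // grid points per axis, squared
  const cells = segments ** 2;           // quads in the grid
  const triangles = cells * 2;           // two triangles per quad
  const indices = triangles * 3;         // three indices per triangle
  return { vertices, cells, triangles, indices };
}
```

For segments = 16 this gives 289 vertices, 256 cells, 512 triangles, and 1,536 indices, matching the table.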
The heightFn parameter is a function that returns the elevation (in meters) for each grid point (i, j).
When omitted, elevation defaults to 0 (on the ellipsoid surface). This feature serves as the foundation for the "Terrain" functionality in later chapters.
```typescript
// Place all vertices 100km above the surface
const elevated = computeTileGeometry(
  0, 0, 2, 16, () => 100_000
);

// Variable elevation per grid point (terrain mesh)
const terrain = computeTileGeometry(
  0, 0, 10, 16,
  (i, j) => heightData[j * 17 + i]
);
```

The raw data returned by core/tile-geometry.ts is converted into a Three.js Mesh by renderer/tile-mesh.ts. Thanks to the separation between the core and renderer layers, this conversion code is very thin.
```typescript
export function createTileMesh(
  x: number, y: number, z: number,
  texture: THREE.Texture,
  segments: number = 16
): THREE.Mesh {
  const data = computeTileGeometry(x, y, z, segments);
  const geometry = new THREE.BufferGeometry();
  // ... set attributes ...
  const material = new THREE.MeshStandardMaterial({
    map: texture,
    side: THREE.FrontSide
  });
  return new THREE.Mesh(geometry, material);
}
```

MeshStandardMaterial is chosen because it is a PBR (Physically Based Rendering) material that responds to lighting, producing natural shading from the DirectionalLight.
The most important quality metric in tile-based rendering is normal consistency at adjacent tile boundaries. Seamless rendering is guaranteed because boundary vertices are generated from the same longitude and latitude values.
```typescript
it('normals match at adjacent tile boundary vertices', () => {
  const segments = 16;
  const left = computeTileGeometry(0, 0, 1, segments);
  const right = computeTileGeometry(1, 0, 1, segments);
  // Right edge column of left tile matches left edge column of right tile
  for (let j = 0; j <= segments; j++) {
    const leftIdx = j * (segments + 1) + segments;
    const rightIdx = j * (segments + 1) + 0;
    for (let c = 0; c < 3; c++) {
      expect(left.normals[leftIdx * 3 + c])
        .toBeCloseTo(right.normals[rightIdx * 3 + c], 5);
    }
  }
});
```

If this test fails (normals become discontinuous), a lighting "seam" will be visible at the boundary between adjacent tiles.
Each tile's HSL hue is calculated from its tile position: hue = ((x + y * n) / n²) * 360 produces a rainbow gradient across the 16 tiles, making tile boundaries and subdivision patterns intuitively understandable.
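The hue formula can be sketched as a one-line function, shown here for illustration (with 16 tiles, n = 4, i.e. zoom level 2):

```typescript
// Debug hue for tile (x, y) at zoom z, where n = 2^z tiles per axis.
// Sweeps 0..360 degrees in row-major tile order.
function tileHue(x: number, y: number, z: number): number {
  const n = 2 ** z;
  return ((x + y * n) / (n * n)) * 360;
}
```

At zoom 2, tile (0, 0) gets hue 0 (red) and tile (3, 3), the last of the 16 tiles, gets hue 337.5, completing the rainbow sweep.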
In this chapter, we implemented the process of generating spherical meshes for tiles.
In the next chapter, we will implement SSE-based LOD control and frustum culling to determine "which tiles should be displayed."