Ray Tracing in One Weekend: Part 5 Textures
Introduction
I have been reading through this book series on ray tracing, which walks the reader through building a ray tracer in C++. In this series of articles, I will work through the book, implement the lessons in Rust instead, and dive deep into the pieces.
In this article, we will be using textures to color our spheres. We will start with simple solid textures, then move on to textures which use images, and finally to noise-based textures.
The following link is the commit in my GitHub repo that matches the code we will go over.
Texture
In this section, we’ll add textures to our objects to enhance visual detail by applying colors dynamically based on their surface coordinates. Previously, each object had a single, static color value. Now, by implementing a texture trait, we can generate complex and varied color patterns that respond to each point on an object’s surface. This will allow us to create more realistic and detailed objects in our scenes.
To start, let’s define our Texture trait. This trait will include a method for retrieving a color based on a given UV coordinate and a point in 3D space:
pub trait Texture: Send + Sync {
    fn get_color(&self, u: f64, v: f64, point: &Point3) -> Color;
}
Here’s what each parameter represents:
- u and v: the UV coordinates, representing the horizontal and vertical axes in texture space. The u and v values range from 0 to 1, and map to specific locations on the texture.
- point: the 3D point on the object where the texture is applied. This allows us to create textures that vary based on the object’s geometry.
The Texture trait is designed with Send and Sync to make it thread-safe, so it can be shared safely across threads in a multithreaded renderer.
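To get a feel for how the trait composes, here is a minimal sketch of a hypothetical GradientTexture (not part of the commit above) that blends two colors along the v axis; it assumes the Color arithmetic operators we already have from our Vec3 type.
// A hypothetical texture that linearly blends between two colors along the
// v axis of the UV coordinates, ignoring u and the 3D point entirely.
pub struct GradientTexture {
    bottom: Color,
    top: Color
}

impl Texture for GradientTexture {
    fn get_color(&self, _u: f64, v: f64, _point: &Point3) -> Color {
        // v = 0.0 returns `bottom`, v = 1.0 returns `top`, values in between blend.
        self.bottom * (1.0 - v) + self.top * v
    }
}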
Basic Textures: Solid Color and Checkerboard
In our renderer, textures allow us to dynamically determine the color of each point on an object’s surface. Let’s start by implementing two foundational textures: a solid color texture and a checkerboard texture.
The SolidColor texture is the simplest type of texture. It always returns the same color, making it ideal for surfaces with a uniform color. We define it with the following structure and methods:
pub struct SolidColor {
    albedo: Color
}

impl SolidColor {
    pub fn new(albedo: Color) -> SolidColor {
        SolidColor {
            albedo
        }
    }

    pub fn from_rgb(red: f64, green: f64, blue: f64) -> SolidColor {
        let albedo = Color::new(red, green, blue);
        SolidColor {
            albedo
        }
    }
}

impl Texture for SolidColor {
    fn get_color(&self, _u: f64, _v: f64, _point: &Point3) -> Color {
        self.albedo
    }
}
The get_color method in SolidColor disregards the UV coordinates and the point, as it always returns the albedo color.
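As a quick sanity check, the same color comes back no matter what coordinates we pass in (a small sketch, assuming the Point3 constructor from earlier parts of the series):
// Both calls return the same red, since SolidColor ignores its arguments.
let red = SolidColor::from_rgb(1.0, 0.0, 0.0);
let a = red.get_color(0.0, 0.0, &Point3::new(0.0, 0.0, 0.0));
let b = red.get_color(0.7, 0.3, &Point3::new(5.0, -2.0, 1.0));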
The CheckerTexture is more complex and visually interesting, as it alternates between two colors or textures in a checkerboard pattern. This pattern is applied over a 3D grid, creating a distinct look where each small 3D "cube" alternates between two colors or textures. The texture’s color depends on the point’s position in 3D space, making it suitable for adding spatial variation.
pub struct CheckerTexture {
    inv_scale: f64,
    even: Box<dyn Texture>,
    odd: Box<dyn Texture>
}

impl CheckerTexture {
    pub fn new(scale: f64, even: Box<dyn Texture>, odd: Box<dyn Texture>) -> CheckerTexture {
        CheckerTexture {
            inv_scale: 1.0 / scale,
            even,
            odd
        }
    }

    pub fn from_colors(scale: f64, color1: Color, color2: Color) -> CheckerTexture {
        let texture1 = SolidColor::new(color1);
        let texture2 = SolidColor::new(color2);
        CheckerTexture {
            inv_scale: 1.0 / scale,
            even: Box::new(texture1),
            odd: Box::new(texture2)
        }
    }
}

impl Texture for CheckerTexture {
    fn get_color(&self, u: f64, v: f64, point: &Point3) -> Color {
        let x = f64::floor(self.inv_scale * point.x()) as i32;
        let y = f64::floor(self.inv_scale * point.y()) as i32;
        let z = f64::floor(self.inv_scale * point.z()) as i32;
        let is_even = (x + y + z) % 2 == 0;
        if is_even {
            self.even.get_color(u, v, point)
        } else {
            self.odd.get_color(u, v, point)
        }
    }
}
The get_color method calculates whether a point falls on an "even" or "odd" checker cell by using its 3D position. The method (a short worked example follows this list):
- scales the point coordinates by inv_scale to fit within each cell,
- floors each coordinate to get the grid position and sums them to determine if the cell is even or odd,
- returns the color from the even or odd texture based on this calculation.
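To make the cell calculation concrete, here is a small sketch using the same scale of 0.32 that the render below uses; the numbers in the comments are just the arithmetic spelled out.
// With scale = 0.32, inv_scale = 1.0 / 0.32 = 3.125.
//   point (0.5, 0.0, 0.0): floors = (1, 0, 0), sum = 1 (odd)  -> odd texture
//   point (0.1, 0.0, 0.0): floors = (0, 0, 0), sum = 0 (even) -> even texture
let checker = CheckerTexture::from_colors(0.32, Color::new(0.2, 0.3, 0.1), Color::new(0.9, 0.9, 0.9));
let odd_cell = checker.get_color(0.0, 0.0, &Point3::new(0.5, 0.0, 0.0));  // (0.9, 0.9, 0.9)
let even_cell = checker.get_color(0.0, 0.0, &Point3::new(0.1, 0.0, 0.0)); // (0.2, 0.3, 0.1)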
Vec2
To support more flexible and efficient texture handling, we’ll create a simple 2D vector structure called Vec2. This struct will represent 2D points, making it ideal for UV coordinates and similar 2D data. Vec2 is designed to be lightweight and similar in structure to Vec3, our 3D vector type.
Additionally, we define an alias, UV, for Vec2, which will be used when referring specifically to texture coordinates.
#[derive(Copy, Clone, Default)]
pub struct Vec2 {
    e: [f64; 2]
}

impl Vec2 {
    pub fn new(x: f64, y: f64) -> Vec2 {
        Vec2 {
            e: [x, y]
        }
    }

    pub fn x(&self) -> f64 {
        self.e[0]
    }

    pub fn y(&self) -> f64 {
        self.e[1]
    }
}

pub type UV = Vec2;
Updating Hit Records with UV Coordinates
To support textures more effectively, we need to include UV coordinates in our HitRecord struct. This will allow us to determine the exact point of texture mapping when a ray intersects an object.
We start by updating the HitRecord struct to store u and v values, representing the UV coordinates at the hit point:
pub struct HitRecord {
    pub p: Point3,
    pub normal: Vec3,
    pub mat: Arc<dyn Material>,
    pub t: f64,
    pub u: f64,
    pub v: f64,
    pub front_face: bool
}
Since we’re working with spherical objects, we need a function to calculate the UV coordinates at any given point on the sphere’s surface. We can derive these coordinates using spherical coordinates, with theta and phi representing the angular positions in 3D space:
fn get_sphere_uv(p: Point3) -> UV {
    let theta = f64::acos(-p.y());
    let phi = f64::atan2(-p.z(), p.x()) + f64::consts::PI;
    let u = phi / (2.0 * f64::consts::PI);
    let v = theta / f64::consts::PI;
    UV::new(u, v)
}
In this function:
- theta represents the polar angle, calculated by acos(-p.y()), where p.y() is the y-coordinate of the point on the sphere.
- phi is the azimuthal angle, calculated by atan2(-p.z(), p.x()) + PI. This adjustment ensures that phi falls within the range [0, 2π].
The resulting u and v values are then normalized to fall between 0 and 1.
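A few reference points make the mapping concrete; this sketch assumes it lives in the same module as get_sphere_uv so the private function is visible.
// Reference points on the unit sphere and their UV coordinates:
//   ( 1, 0, 0) -> (0.50, 0.50)    (-1, 0, 0) -> (0.00, 0.50)
//   ( 0, 1, 0) -> (0.50, 1.00)    ( 0, 0, 1) -> (0.25, 0.50)
let uv = get_sphere_uv(Point3::new(0.0, 0.0, 1.0));
assert!((uv.x() - 0.25).abs() < 1e-9);
assert!((uv.y() - 0.50).abs() < 1e-9);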
With the get_sphere_uv function defined, we can now calculate the UV coordinates whenever a ray intersects a sphere. We update the hit function for the Sphere struct to include these UV values in the HitRecord:
impl Hittable for Sphere {
    fn hit(&self, ray: &crate::ray::Ray, t_min: f64, t_max: f64) -> Option<HitRecord> {
        ...
        let uv: UV = Sphere::get_sphere_uv(outward_norm);
        let mut rec = HitRecord {
            t: root,
            p: ray.at(root),
            mat: self.mat.clone(),
            normal: Default::default(),
            front_face: Default::default(),
            u: uv.x(),
            v: uv.y(),
        };
        ...
        Some(rec)
    }
}
Integrating Textures into Materials
With textures now available, we can update our materials to utilize them instead of relying on simple colors. Textures will allow us to represent more complex and detailed surfaces, such as patterned or textured materials.
Here’s the updated Lambertian struct:
pub struct Lambertian {
    albedo: Box<dyn Texture>
}

impl Lambertian {
    pub fn new(albedo: Box<dyn Texture>) -> Lambertian {
        Lambertian {
            albedo
        }
    }

    pub fn from_color(albedo_color: Color) -> Lambertian {
        let albedo = SolidColor::new(albedo_color);
        Lambertian {
            albedo: Box::new(albedo)
        }
    }
}
The albedo field now holds a Box<dyn Texture>, which can point to any texture type that implements the Texture trait. This allows us to use either solid colors or complex patterns as the surface texture.
In the scatter method of the Material trait, we’ll use the texture’s get_color method to obtain the color at the UV coordinates from the hit record. This color, in turn, serves as the attenuation factor, or reflectance, of the scattered ray.
impl Material for Lambertian {
    fn scatter(&self, r_in: &Ray, rec: &HitRecord) -> Option<ScatterRecord> {
        let mut scatter_direction = rec.normal + vec3::random_unit_vector();
        if scatter_direction.near_zero() {
            scatter_direction = rec.normal;
        }
        Some(ScatterRecord {
            attenuation: self.albedo.get_color(rec.u, rec.v, &rec.p),
            scattered: Ray::new(rec.p, scatter_direction, r_in.time())
        })
    }
}
Rendering
We can then build a scene that uses these textures for the color values in our materials.
fn checkered_spheres() {
    let mut world = HittableList::new();

    let checker_texture_one = CheckerTexture::from_colors(0.32, Color::new(0.2, 0.3, 0.1), Color::new(0.9, 0.9, 0.9));
    let checker_texture_two = CheckerTexture::from_colors(0.32, Color::new(0.2, 0.3, 0.1), Color::new(0.9, 0.9, 0.9));

    world.add(Box::new(Sphere::new(Ray::new(Point3::new(0.0, -10.0, 0.0), Vec3::new(0.0, 0.0, 0.0), 0.0), Arc::new(Metal::new(Box::new(checker_texture_one), 0.1)), 10.0)));
    world.add(Box::new(Sphere::new(Ray::new(Point3::new(0.0, 10.0, 0.0), Vec3::new(0.0, 0.0, 0.0), 0.0), Arc::new(Lambertian::new(Box::new(checker_texture_two))), 10.0)));

    let eye = Point3::new(13.0, 2.0, 3.0);
    let lookat = Point3::new(0.0, 0.0, 0.0);
    let up = Point3::new(0.0, 1.0, 0.0);
    let dist_to_focus = (eye - lookat).length();
    let aperture = 0.0;

    let camera = Camera::new(IMAGE_WIDTH, IMAGE_HEIGHT, SAMPLES_PER_PIXEL, MAX_DEPTH, eye, lookat, up, 20.0, ASPECT_RATIO, aperture, dist_to_focus);
    camera.render(&world);
}
Which should give us a render like this.
Image Texture
In graphics programming, adding image textures allows us to map detailed visuals directly onto 3D objects, enhancing realism. In this section, we’ll create a Rust structure called TextureImage to load an image file, read pixel data, and use it as a texture on our objects.
The TextureImage struct is built around the DynamicImage type from the image crate, which provides a flexible way to handle image data.
pub struct TextureImage {
    image: DynamicImage
}

impl TextureImage {
    pub fn new(image_file: &str) -> TextureImage {
        let image = ImageReader::open(image_file)
            .expect("should be a valid image path")
            .decode()
            .expect("should decode");
        TextureImage {
            image
        }
    }

    pub fn width(&self) -> u32 {
        self.image.width()
    }

    pub fn height(&self) -> u32 {
        self.image.height()
    }

    pub fn pixel_data(&self, x: u32, y: u32) -> Color {
        // Clamp to the last valid pixel index so we never read out of bounds.
        let clamped_x = u32::clamp(x, 0, self.width() - 1);
        let clamped_y = u32::clamp(y, 0, self.height() - 1);
        let pixel = self.image.get_pixel(clamped_x, clamped_y).to_rgb();
        let r = (pixel.0[0] as f64) / 255.0;
        let g = (pixel.0[1] as f64) / 255.0;
        let b = (pixel.0[2] as f64) / 255.0;
        Color::new(r, g, b)
    }
}
With the image loaded, we create helper functions to retrieve the image dimensions (width and height) and pixel color data. The pixel_data function clamps coordinates within the image boundaries to avoid out-of-bounds errors and retrieves RGB color values normalized between 0.0 and 1.0.
impl Texture for TextureImage {
    fn get_color(&self, u: f64, v: f64, _point: &crate::vec3::Point3) -> Color {
        let u_clamped = f64::clamp(u, 0.0, 1.0);
        let v_clamped = 1.0 - f64::clamp(v, 0.0, 1.0);
        let x = (u_clamped * self.width() as f64) as u32;
        let y = (v_clamped * self.height() as f64) as u32;
        self.pixel_data(x, y)
    }
}
We integrate TextureImage into the rendering system by implementing the Texture trait. The get_color function takes UV coordinates, clamps them within the [0.0, 1.0] range, and maps them to the image’s pixel grid. The v coordinate is flipped (1.0 - v) because image rows are counted from the top, while v = 0 corresponds to the bottom of the texture. This allows us to map any part of the image onto our object.
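For example, on a hypothetical 1024 x 512 image (the actual earthmap.jpg dimensions may differ), the mapping works out like this:
// (u, v) = (0.25, 0.75):
//   u_clamped = 0.25, v_clamped = 1.0 - 0.75 = 0.25 (image rows count down from the top)
//   x = (0.25 * 1024) as u32 = 256, y = (0.25 * 512) as u32 = 128
let earth = TextureImage::new("assets/earthmap.jpg");
let color = earth.get_color(0.25, 0.75, &Point3::new(0.0, 0.0, 0.0));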
We can then build a scene that uses an earth-map image as a texture like this:
fn earth() {
    let mut world = HittableList::new();

    let earth_texture = TextureImage::new("assets/earthmap.jpg");
    let earth_mat = Lambertian::new(Box::new(earth_texture));
    world.add(Box::new(Sphere::new(Ray::new(Point3::new(0.0, 0.0, 0.0), Vec3::new(0.0, 0.0, 0.0), 0.0), Arc::new(earth_mat), 2.0)));

    let eye = Point3::new(0.0, 0.0, 12.0);
    let lookat = Point3::new(0.0, 0.0, 0.0);
    let up = Point3::new(0.0, 1.0, 0.0);
    let dist_to_focus = (eye - lookat).length();
    let aperture = 0.0;

    let camera = Camera::new(IMAGE_WIDTH, IMAGE_HEIGHT, SAMPLES_PER_PIXEL, MAX_DEPTH, eye, lookat, up, 20.0, ASPECT_RATIO, aperture, dist_to_focus);
    camera.render(&world);
}
Giving us this render.
Noise Textures
The final type of texture we will create is the procedural noise texture, which takes in parameters and randomly generates a pattern using noise functions.
Perlin Noise
Perlin noise is a gradient noise function commonly used in computer graphics for generating natural-looking textures and terrains. In our implementation, we define a Perlin struct that encapsulates the necessary data for generating Perlin noise. This struct holds an array of random vectors and permutations for the x, y, and z dimensions, enabling smooth transitions in the generated noise.
pub struct Perlin {
    rand_vec: [Vec3; 256],
    perm_x: [i32; 256],
    perm_y: [i32; 256],
    perm_z: [i32; 256]
}
Inside the impl block we have a new function that calls some helper functions to build out these fields:
pub fn new() -> Perlin {
    let mut rand_vec = [Vec3::random(); 256];
    for i in 0..256 {
        rand_vec[i] = Vec3::random_range(-1.0, 1.0);
    }

    let perm_x = Perlin::perlin_generate_perm();
    let perm_y = Perlin::perlin_generate_perm();
    let perm_z = Perlin::perlin_generate_perm();

    Perlin {
        rand_vec,
        perm_x,
        perm_y,
        perm_z
    }
}

fn perlin_generate_perm() -> [i32; 256] {
    let mut p = [0; 256];
    for i in 0..256 {
        p[i] = i as i32;
    }
    Perlin::permute(256, &mut p);
    p
}

fn permute(n: usize, p: &mut [i32; 256]) {
    // Fisher-Yates shuffle: walk from the last index down to 1, swapping each
    // element with a randomly chosen earlier (or equal) index.
    for i in (1..n).rev() {
        let target: usize = random_int_range(0, i as i32) as usize;
        p.swap(i, target);
    }
}
These functions create a list of random Vec3s, then create three random permutations of the numbers 0–255, which are stored in the x, y, and z permutation arrays.
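Since the noise lookups rely on these arrays being true permutations, a quick test can verify that each one contains every value from 0 to 255 exactly once (a sketch, assuming the test module sits in the same file so the private fields are visible):
#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn perm_arrays_are_permutations() {
        let perlin = Perlin::new();
        for perm in [&perlin.perm_x, &perlin.perm_y, &perlin.perm_z] {
            // Sorting a true permutation of 0..=255 yields exactly 0, 1, ..., 255.
            let mut sorted = perm.to_vec();
            sorted.sort();
            assert_eq!(sorted, (0..256).collect::<Vec<i32>>());
        }
    }
}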
We then expose some functions to actually generate noise using our struct fields.
The noise function computes the Perlin noise value at a given point in 3D space. The function first scales the input coordinates and calculates the fractional part for each dimension, which is then smoothed using a fade function. It then uses the permutation arrays to hash the integer lattice coordinates, looks up the gradient vectors at the eight surrounding corners, and interpolates between them.
pub fn noise(&self, scale: f64, p: &Point3) -> f64 {
    let mut u = f64::abs(p.x() * scale) - f64::floor(f64::abs(p.x() * scale));
    let mut v = f64::abs(p.y() * scale) - f64::floor(f64::abs(p.y() * scale));
    let mut w = f64::abs(p.z() * scale) - f64::floor(f64::abs(p.z() * scale));
    u = u * u * (3.0 - 2.0 * u);
    v = v * v * (3.0 - 2.0 * v);
    w = w * w * (3.0 - 2.0 * w);

    let i = f64::floor(f64::abs(p.x() * scale)) as usize;
    let j = f64::floor(f64::abs(p.y() * scale)) as usize;
    let k = f64::floor(f64::abs(p.z() * scale)) as usize;

    let mut c = [[[Vec3::random(); 2]; 2]; 2];
    for di in 0..2 {
        for dj in 0..2 {
            for dk in 0..2 {
                c[di][dj][dk] = self.rand_vec[(self.perm_x[(i + di) & 255] ^ self.perm_y[(j + dj) & 255] ^ self.perm_z[(k + dk) & 255]) as usize];
            }
        }
    }

    Perlin::perlin_interp(c, u, v, w)
}
The turbulence function adds a layer of complexity to the noise, simulating the effect of variations in height or surface texture. This function accumulates noise values at progressively smaller scales, weighted to create a more intricate pattern:
pub fn turbulence(&self, p: &Point3, depth: i32) -> f64 {
    let mut accum = 0.0;
    let mut temp_p = p.clone();
    let mut weight = 1.0;
    for _ in 0..depth {
        accum += weight * self.noise(1.0, &temp_p);
        weight *= 0.5;
        temp_p = 2.0 * temp_p;
    }
    f64::abs(accum)
}
Finally, the perlin_interp function performs trilinear interpolation, blending the noise values based on the calculated weights for each of the surrounding points:
fn perlin_interp(c: [[[Vec3; 2]; 2]; 2], u: f64, v: f64, w: f64) -> f64 {
    let uu = u * u * (3.0 - 2.0 * u);
    let vv = v * v * (3.0 - 2.0 * v);
    let ww = w * w * (3.0 - 2.0 * w);
    let mut accum = 0.0;
    for i in 0..2 {
        for j in 0..2 {
            for k in 0..2 {
                let weight = Vec3::new(u - i as f64, v - j as f64, w - k as f64);
                let term_one = uu * (i as f64) + ((1 - (i as i32)) as f64) * (1.0 - uu);
                let term_two = vv * (j as f64) + ((1 - (j as i32)) as f64) * (1.0 - vv);
                let term_three = ww * (k as f64) + ((1 - (k as i32)) as f64) * (1.0 - ww);
                accum += term_one * term_two * term_three * dot(c[i][j][k], weight);
            }
        }
    }
    accum
}
Noise Texture
We encapsulate the functionality of our Perlin noise in a NoiseTexture struct, which uses the noise to influence the color of our objects. Below is the definition of the NoiseTexture struct:
pub struct NoiseTexture {
    noise: Perlin,
    scale: f64
}

impl NoiseTexture {
    pub fn new(scale: f64) -> NoiseTexture {
        NoiseTexture {
            noise: Perlin::new(),
            scale
        }
    }
}

impl Texture for NoiseTexture {
    fn get_color(&self, _u: f64, _v: f64, point: &Point3) -> Color {
        Color::new(0.5, 0.5, 0.5) * (1.0 + f64::sin(self.scale * point.z() + 10.0 * self.noise.turbulence(point, 7)))
    }
}
In this implementation of the Texture trait (a small variant is sketched after the list):
- We start with a base color of gray (Color::new(0.5, 0.5, 0.5)).
- The sine function introduces oscillations based on the z coordinate of the point, scaled by the scale factor.
- The turbulence function from the Perlin noise implementation adds additional variation, which simulates the irregularities often found in natural textures.
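The sine term is what produces the banded, marble-like look in the render below. As a point of comparison, a hypothetical variant that feeds the turbulence value straight into the color gives a softer, cloud-like surface instead (a sketch, not part of the commit above):
// A hypothetical variant: skip the sine modulation and use turbulence directly.
pub struct TurbulenceTexture {
    noise: Perlin
}

impl Texture for TurbulenceTexture {
    fn get_color(&self, _u: f64, _v: f64, point: &Point3) -> Color {
        // turbulence returns a non-negative value, so this stays a valid color.
        Color::new(1.0, 1.0, 1.0) * self.noise.turbulence(point, 7)
    }
}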
Rendering
We can use this noise texture to create a world like this:
fn perlin_spheres() {
    let mut world = HittableList::new();

    let perlin_texture_one = NoiseTexture::new(4.0);
    let perlin_texture_two = NoiseTexture::new(4.0);
    world.add(Box::new(Sphere::new(Ray::new(Point3::new(0.0, -1000.0, 0.0), Vec3::new(0.0, 0.0, 0.0), 0.0), Arc::new(Lambertian::new(Box::new(perlin_texture_one))), 1000.0)));
    world.add(Box::new(Sphere::new(Ray::new(Point3::new(0.0, 2.0, 0.0), Vec3::new(0.0, 0.0, 0.0), 0.0), Arc::new(Lambertian::new(Box::new(perlin_texture_two))), 2.0)));

    let eye = Point3::new(13.0, 2.0, 3.0);
    let lookat = Point3::new(0.0, 0.0, 0.0);
    let up = Point3::new(0.0, 1.0, 0.0);
    let dist_to_focus = (eye - lookat).length();
    let aperture = 0.0;

    let camera = Camera::new(IMAGE_WIDTH, IMAGE_HEIGHT, SAMPLES_PER_PIXEL, MAX_DEPTH, eye, lookat, up, 20.0, ASPECT_RATIO, aperture, dist_to_focus);
    camera.render(&world);
}
Which will give us a render like the one below.
Conclusion
In this article, we saw how to create textures and use them as the albedo color in our materials. We started with simple solid and checkered textures, then moved on to image-based textures, and finally tackled procedural noise-based textures.