Hello all,
I have recently been trying to optimise my ray-tracer, and some recent changes made a big (negative) difference, which I was not expecting at all.
Let me show some code snippets. I made similar changes in other functions as well, so I am only showing one of them (please don't be too harsh... I was about to clean up the code when I found this, and decided to post earlier so I could ask the question).
Original case --> check it out HERE
So, I originally had something like this (I cleaned it up a bit):
fn get_global_illumination(
    &self,
    scene: &Scene,
    n_ambient_samples: usize,
    n_shadow_samples: usize,
    current_depth: usize,
    material: &Box<dyn Material + Sync>,
    normal: Vector3D,         // I am planning to get rid of this
    e1: Vector3D,             // I am planning to get rid of this
    e2: Vector3D,             // I am planning to get rid of this
    ray: &mut Ray,
    intersection_pt: Point3D,
    wt: Float,                // This also lives in `Ray`, so I don't want to use it as an input
    rng: &mut RandGen,
    aux: &mut RayTracerHelper,
) -> Spectrum {
    let mut global = Spectrum::black();
    let depth = current_depth;
    aux.rays[depth] = *ray; // `aux.rays` is a stack, used for avoiding allocations
    for _ in 0..n_ambient_samples {
        // Choose a direction.
        let (bsdf_value, _is_specular) =
            material.sample_bsdf(normal, e1, e2, intersection_pt, ray, rng);
        let new_ray_dir = ray.geometry.direction;
        // Increase depth.
        let new_depth = current_depth + 1;
        let cos_theta = (normal * new_ray_dir).abs();
        let new_value = wt * bsdf_value * cos_theta;
        let color = material.colour();
        let (li, light_pdf) = self.trace_ray(rng, scene, ray, new_depth, new_value, aux);
        let fx = (li * cos_theta) * (color * bsdf_value);
        let denominator =
            bsdf_value * n_ambient_samples as Float + n_shadow_samples as Float * light_pdf;
        global += fx / denominator;
        // Restore the ray, because it was modified by the trace_ray call.
        *ray = aux.rays[depth];
    }
    global
}
What I thought would be an improvement --> check it out HERE
So, I decided to carry some of the information (e.g., depth and value) in the ray itself, like so:
fn get_global_illumination(
    &self,
    scene: &Scene,
    n_ambient_samples: usize,
    n_shadow_samples: usize,
    material: &Box<dyn Material + Sync>,
    normal: Vector3D,  // I am planning to get rid of this
    e1: Vector3D,      // I am planning to get rid of this
    e2: Vector3D,      // I am planning to get rid of this
    ray: &mut Ray,
    rng: &mut RandGen,
    aux: &mut RayTracerHelper,
) -> Spectrum {
    let mut global = Spectrum::black();
    let depth = ray.depth; // This was previously an input
    aux.rays[depth] = *ray;
    let intersection_pt = ray.interaction.point; // This was previously an input
    for _ in 0..n_ambient_samples {
        // Choose a direction.
        let (bsdf_value, _is_specular) =
            material.sample_bsdf(normal, e1, e2, intersection_pt, ray, rng);
        let new_ray_dir = ray.geometry.direction;
        ray.depth += 1; // instead of `let new_depth = current_depth + 1;`
        let cos_theta = (normal * new_ray_dir).abs();
        ray.value *= bsdf_value * cos_theta; // instead of `let new_value = wt * bsdf_value * cos_theta;`
        let color = material.colour();
        let (li, light_pdf) = self.trace_ray(rng, scene, ray, aux);
        let fx = (li * cos_theta) * (color * bsdf_value);
        let denominator =
            bsdf_value * n_ambient_samples as Float + n_shadow_samples as Float * light_pdf;
        global += fx / denominator;
        // Restore the ray, which was modified above.
        *ray = aux.rays[depth];
    }
    global
}
Result:
Both functions produce the same image, but the first version takes 60 seconds while the second takes 90 seconds.
Any clues as to why this is happening? The only difference is that some values went from being function inputs to being fields on the Ray.
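In case it helps to see the pattern in isolation: here is a minimal, self-contained sketch of the two styles, using a hypothetical tiny `Ray` struct (nothing like my real one) just to show "extra state as arguments" versus "extra state mutated on the ray, then restored from a saved copy". Both compute the same number; the only difference is where the state lives.

```rust
// Hypothetical minimal Ray; only illustrates the save/mutate/restore pattern.
#[derive(Clone, Copy)]
struct Ray {
    depth: usize,
    value: f64,
}

// Style 1: the updated depth/value travel as arguments; `ray` is read-only.
fn trace_args(ray: &Ray, new_depth: usize, new_value: f64) -> f64 {
    new_value * (new_depth as f64 + ray.value)
}

// Style 2: the updates are written into the ray, which must later be
// restored from a saved copy (mirroring `*ray = aux.rays[depth];`).
fn trace_fields(ray: &mut Ray, saved: Ray) -> f64 {
    ray.depth += 1;
    ray.value *= 2.0;
    let out = ray.value * (ray.depth as f64 + saved.value);
    *ray = saved; // restore the ray after the recursive work
    out
}

fn main() {
    let mut ray = Ray { depth: 3, value: 1.5 };
    let saved = ray;
    let a = trace_args(&ray, ray.depth + 1, ray.value * 2.0);
    let b = trace_fields(&mut ray, saved);
    assert_eq!(a, b);         // same numeric result either way
    assert_eq!(ray.depth, 3); // the ray was restored after mutation
    println!("a = {a}, b = {b}");
}
```

Both styles end up restoring the ray the same way, so I would have expected the only difference to be *where* `depth` and `value` are read from and written to.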