In 3D computer graphics, ray tracing is a technique for modeling light transport for use in a wide variety of rendering algorithms for generating digital images.
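At its core, the technique follows rays from a virtual camera through each pixel of the image plane and tests them for intersection against the scene geometry. The sketch below illustrates this simplest form (a ray caster tracing only primary rays) in Python; the single-sphere scene, resolution, light direction, and shading model are illustrative assumptions rather than any particular renderer's implementation.

```python
# Toy ray caster: one primary ray per pixel, one sphere, Lambertian shading.
import math

WIDTH, HEIGHT = 160, 120
SPHERE_CENTER, SPHERE_RADIUS = (0.0, 0.0, -3.0), 1.0
LIGHT_DIR = (-0.577, 0.577, 0.577)            # roughly unit-length direction toward the light

def intersect_sphere(origin, direction):
    """Return the nearest positive hit distance along a normalized ray, or None."""
    oc = tuple(o - c for o, c in zip(origin, SPHERE_CENTER))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - SPHERE_RADIUS ** 2
    disc = b * b - 4.0 * c                    # quadratic discriminant with a == 1
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

rows = []
for y in range(HEIGHT):
    row = []
    for x in range(WIDTH):
        # Map the pixel to a point on an image plane one unit in front of the
        # eye, then build a normalized ray direction through that point.
        u = (2.0 * (x + 0.5) / WIDTH - 1.0) * WIDTH / HEIGHT
        v = 1.0 - 2.0 * (y + 0.5) / HEIGHT
        norm = math.sqrt(u * u + v * v + 1.0)
        direction = (u / norm, v / norm, -1.0 / norm)
        t = intersect_sphere((0.0, 0.0, 0.0), direction)
        if t is None:
            row.append(30)                    # background gray
        else:
            hit = tuple(t * d for d in direction)
            normal = tuple((h - c) / SPHERE_RADIUS for h, c in zip(hit, SPHERE_CENTER))
            diffuse = max(0.0, sum(n * l for n, l in zip(normal, LIGHT_DIR)))
            row.append(int(255 * min(1.0, diffuse)))
    rows.append(row)

with open("sphere.pgm", "w") as f:            # write a plain-text grayscale image
    f.write(f"P2\n{WIDTH} {HEIGHT}\n255\n")
    for row in rows:
        f.write(" ".join(str(value) for value in row) + "\n")
```

Running the script writes a small grayscale image of a lit sphere; richer ray tracing-based algorithms differ mainly in how many additional rays they spawn at each hit point and how they weight them.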
On a spectrum of computational cost and visual fidelity, ray tracing-based rendering techniques, such as ray casting, recursive ray tracing, distribution ray tracing, photon mapping and path tracing, are generally slower and higher fidelity than scanline rendering methods.[1] Thus, ray tracing was first deployed in applications where taking a relatively long time to render could be tolerated, such as still computer-generated images and film and television visual effects (VFX), but was less suited to real-time applications such as video games, where speed is critical in rendering each frame.[2]
Since 2018, however, hardware acceleration for real-time ray tracing has become standard on new commercial graphics cards, and graphics APIs have followed suit, allowing developers to use hybrid ray tracing and rasterization-based rendering in games and other real-time applications with a smaller impact on frame render times.
Ray tracing is capable of simulating a variety of optical effects,[3] such as reflection, refraction, soft shadows, scattering, depth of field, motion blur, caustics, ambient occlusion and dispersion phenomena (such as chromatic aberration). It can also be used to trace the path of sound waves in a similar fashion to light waves, making it a viable option for more immersive sound design in video games by rendering realistic reverberation and echoes.[4] In fact, any physical wave or particle phenomenon with approximately linear motion can be simulated with ray tracing.
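As a concrete illustration of one such effect, the following sketch shows how a mirror reflection arises in Whitted-style recursive ray tracing: when a ray hits a reflective surface, a secondary ray is spawned in the mirror direction r = d - 2(d·n)n and traced recursively. The two-sphere scene, reflectivity values, and depth limit are assumptions made up for the example.

```python
# Toy recursive ray tracer: direct diffuse shading plus one mirror-reflection term.
import math

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def add(a, b): return tuple(x + y for x, y in zip(a, b))
def scale(a, s): return tuple(x * s for x in a)
def normalize(a):
    n = math.sqrt(dot(a, a))
    return tuple(x / n for x in a)

# Scene: each sphere is (center, radius, base_gray, reflectivity).
SPHERES = [((-1.0, 0.0, -4.0), 1.0, 0.8, 0.0),    # diffuse sphere
           (( 1.2, 0.0, -4.0), 1.0, 0.2, 0.8)]    # mirror-like sphere
LIGHT_DIR = normalize((-1.0, 1.0, 1.0))

def hit_sphere(origin, direction, center, radius):
    """Nearest positive hit distance along a normalized ray, or None."""
    oc = sub(origin, center)
    b = 2.0 * dot(direction, oc)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None                # small epsilon avoids self-hits

def trace(origin, direction, depth):
    """Return a gray value in [0, 1] for this ray, recursing on reflections."""
    nearest = None
    for center, radius, gray, refl in SPHERES:
        t = hit_sphere(origin, direction, center, radius)
        if t is not None and (nearest is None or t < nearest[0]):
            nearest = (t, center, radius, gray, refl)
    if nearest is None:
        return 0.1                                # background
    t, center, radius, gray, refl = nearest
    point = add(origin, scale(direction, t))
    normal = normalize(sub(point, center))
    color = gray * max(0.0, dot(normal, LIGHT_DIR))            # direct (Lambert) term
    if refl > 0.0 and depth > 0:
        # Mirror direction r = d - 2(d.n)n, then recurse with one less bounce.
        reflected = sub(direction, scale(normal, 2.0 * dot(direction, normal)))
        color += refl * trace(point, reflected, depth - 1)
    return min(color, 1.0)

# Example: one ray aimed at the mirror-like sphere, allowing up to 3 bounces.
print(trace((0.0, 0.0, 0.0), normalize((0.1, 0.0, -1.0)), 3))
```

Other effects listed above follow the same pattern of spawning additional rays at each hit: refraction rays bent by Snell's law, shadow rays toward lights for soft shadows, or rays distributed over a lens aperture for depth of field.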
Ray tracing-based rendering techniques that involve sampling light over a domain generate image noise artifacts that can be addressed by tracing a very large number of rays or using denoising techniques.
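For example, a soft shadow from an area light is typically estimated by averaging many shadow rays aimed at random points on the light: with only a few rays per pixel the estimate fluctuates strongly (visible as grain), while more rays reduce the variance. The toy sketch below, whose shading point, spherical occluder, and square light are assumptions chosen for illustration, shows how the spread of the estimate shrinks as the ray count grows.

```python
# Monte Carlo soft-shadow estimate: visibility of a square area light from a point.
import math
import random
import statistics

LIGHT_Y, LIGHT_HALF = 5.0, 1.0                    # square light at y = 5, side length 2
OCCLUDER_CENTER, OCCLUDER_RADIUS = (0.0, 2.5, 0.0), 0.4
SHADING_POINT = (0.0, 0.0, 0.0)

def shadow_ray_blocked(target):
    """True if the segment from the shading point to `target` hits the occluder."""
    d = tuple(t - p for t, p in zip(target, SHADING_POINT))
    oc = tuple(p - c for p, c in zip(SHADING_POINT, OCCLUDER_CENTER))
    a = sum(x * x for x in d)
    b = 2.0 * sum(x * y for x, y in zip(d, oc))
    c = sum(x * x for x in oc) - OCCLUDER_RADIUS ** 2
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return False
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return 0.0 < t < 1.0                          # hit lies between point and light

def visibility_estimate(num_rays):
    """Monte Carlo estimate of the unoccluded fraction of the light."""
    visible = 0
    for _ in range(num_rays):
        target = (random.uniform(-LIGHT_HALF, LIGHT_HALF), LIGHT_Y,
                  random.uniform(-LIGHT_HALF, LIGHT_HALF))
        if not shadow_ray_blocked(target):
            visible += 1
    return visible / num_rays

random.seed(1)
for n in (1, 16, 256):
    estimates = [visibility_estimate(n) for _ in range(50)]
    print(f"{n:4d} rays: mean={statistics.mean(estimates):.3f} "
          f"stdev={statistics.pstdev(estimates):.3f}")
```

The printed standard deviation drops as the number of shadow rays grows, which is the same trade-off a renderer faces: spend more rays per pixel, or accept a noisier image and clean it up afterwards with a denoiser.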