Currently, appleseed renders only the surfaces of objects and treats the space between them as a void. Consequently, one of the most requested features in appleseed is volume rendering: having the rendering engine account for how light interacts with the media between objects, computing how it is absorbed and scattered by thin media such as air, smoke, or fog, or by denser media such as milk or marble.
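To make the absorption part concrete, here is a minimal sketch of the Beer-Lambert law, which governs how much light survives a straight-line path through a homogeneous medium. The function name and parameters are my own for illustration; they are not part of appleseed's API.

```python
import math

def transmittance(sigma_t, distance):
    """Fraction of light surviving after traveling `distance` through a
    homogeneous medium with extinction coefficient `sigma_t`, which sums
    the absorption and scattering coefficients (Beer-Lambert law)."""
    return math.exp(-sigma_t * distance)

# Thin fog (low extinction): most light passes through unaffected.
print(transmittance(0.1, 1.0))  # ~0.905
# Dense smoke (high extinction): most light is absorbed or scattered away.
print(transmittance(2.0, 1.0))  # ~0.135
```

A volume integrator uses this transmittance to attenuate radiance along each ray segment; the scattered portion of the extinguished light is what produces effects such as visible light shafts.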
My goal is to integrate rendering of homogeneous volumes into the appleseed engine, making it capable of handling simple volumetric effects such as light shafts in a foggy environment. During my work I will investigate different approaches to rendering volumes, select techniques that are modern, efficient, and fit best with appleseed's existing path tracing code, and then implement the chosen methods. Additionally, I will show how users will interact with the newly added features by extending the user interface of appleseed.studio.