Ray prioritization using stylization and visual saliency

Markus Steinberger, Bernhard Kainz, Stefan Hauswiesner, Rostislav Khlebnikov, Denis Kalkofen, Dieter Schmalstieg

Research output: Contribution to journal › Article › peer-review

6 Citations (Scopus)

Abstract

This paper presents a new method to control scene sampling in complex ray-based rendering environments. It proposes to constrain image sampling density with a combination of object features, which are known to be well perceived by the human visual system, and image-space saliency, which captures effects that are not based on the object's geometry. The presented method uses Non-Photorealistic Rendering techniques for the object-space feature evaluation and combines the image-space saliency calculations with image warping to infer quality hints from previously generated frames. In order to map different feature types to sampling densities, we also present an evaluation of the impact of object-space and image-space features on the resulting image quality. In addition, we present an efficient, adaptively aligned fractal pattern that is used to reconstruct the image from sparse sampling data. Furthermore, this paper presents an algorithm which uses our method to guarantee a desired minimal frame rate. Our scheduling algorithm maximizes the utilization of each given time slice by rendering features in the order of their visual importance values until the time constraint is reached. We demonstrate how our method can be used to boost or stabilize the rendering time in complex ray-based image generation involving geometric as well as volumetric data.
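The scheduling idea described above, rendering features in decreasing order of visual importance until the frame's time budget is spent and then reconstructing the remaining pixels from the sparse samples, can be illustrated with a minimal sketch. The code below is an assumption-laden illustration and not the paper's implementation; `Feature`, `renderRegion`, and `reconstructRemaining` are hypothetical placeholders for the renderer's actual data structures and routines.

```cpp
// Minimal sketch of time-budgeted, importance-ordered rendering.
// Not the authors' code: all names and the importance score are assumed.
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <vector>

struct Feature {
    float importance;   // combined object-space / image-space saliency score (assumed)
    int   region_id;    // image region covered by this feature (assumed)
};

// Placeholder for dense ray sampling of one image region.
static void renderRegion(int region_id) { std::printf("render region %d\n", region_id); }

// Placeholder for reconstructing unsampled pixels from the sparse samples.
static void reconstructRemaining() { std::printf("reconstruct remaining pixels\n"); }

static void renderFrame(std::vector<Feature> features, std::chrono::milliseconds budget) {
    using clock = std::chrono::steady_clock;
    const auto deadline = clock::now() + budget;

    // Most important features first, so the time slice is spent where it matters.
    std::sort(features.begin(), features.end(),
              [](const Feature& a, const Feature& b) { return a.importance > b.importance; });

    for (const Feature& f : features) {
        if (clock::now() >= deadline) break;   // time slice exhausted: stop sampling
        renderRegion(f.region_id);
    }
    reconstructRemaining();                    // fill in whatever was not sampled
}

int main() {
    std::vector<Feature> features = {{0.9f, 0}, {0.4f, 1}, {0.7f, 2}};
    renderFrame(features, std::chrono::milliseconds(16));   // roughly a 60 fps budget
}
```

In this reading, guaranteeing a minimal frame rate follows directly from the loop structure: the deadline check bounds the work per frame, and lower-importance regions are the first to fall back to reconstruction when the budget is tight.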

Original language: English
Pages (from-to): 673-684
Number of pages: 12
Journal: Computers and Graphics (Pergamon)
Volume: 36
Issue number: 6
Publication status: Published - Oct 2012
Externally published: Yes

Keywords

  • Photorealistic rendering
  • Ray-casting
  • Ray-tracing
  • Visual saliency
  • Volume rendering
