SIGGRAPH paper summary: Art-Based Rendering of Fur, Grass, and Trees by Michael Kowalski et al.

A (sort of) particle-based system for an illustrative rendering style that balances screen-space stroke density against world-space placement to get some interframe coherence. It starts with a conventionally rendered image as a reference and places graftals* according to a “desire” map. As each stroke is placed, a blurred version of it is subtracted from the desire map, and so on until the desire is used up, which prevents overly dense stroke placement. The paper gives techniques for controlling how a stroke’s screen-space size changes as its underlying geometry moves in depth, for deciding how much of a stroke to draw based on factors such as the surface’s facing angle, and for shrinking strokes as they are no longer needed so they don’t pop out. Each stroke orients itself relative to the camera based on user rules such as “always point down” for fur or “always orient clockwise” for truffula tufts. The technique does not fully solve interframe coherence, but it gives a starting point. The placement loop is sketched below.

*Alvy Ray Smith’s concept of “graftals” (a portmanteau of fractals and graphics): image information that is generated algorithmically/implicitly and only when requested.
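
The subtract-and-repeat placement loop is simple enough to sketch. Here is a minimal NumPy version, assuming a grayscale “desire” image derived from the reference render, a precomputed blurred stamp for one stroke, and that placement simply picks the current maximum of the desire map; the function and parameter names are mine, not the paper’s.

```python
import numpy as np

def place_graftals(desire, stroke_stamp, threshold=0.1, max_strokes=5000):
    """Greedy screen-space placement driven by a 'desire' image.

    desire:       2D float array from the reference render (hypothetical
                  input, e.g. darkness times a per-surface density factor).
    stroke_stamp: small 2D float array, a blurred image of one stroke.
    Returns a list of (row, col) seed positions for graftals.
    """
    desire = desire.astype(float).copy()
    sh, sw = stroke_stamp.shape
    placed = []
    for _ in range(max_strokes):
        # Put the next graftal where desire is currently greatest.
        r, c = np.unravel_index(np.argmax(desire), desire.shape)
        if desire[r, c] < threshold:
            break  # remaining desire is low everywhere: density is satisfied
        placed.append((int(r), int(c)))
        # Subtract the blurred stroke from the desire map around (r, c),
        # discouraging nearby placements and preventing over-dense strokes.
        r0, c0 = r - sh // 2, c - sw // 2
        rs, cs = max(-r0, 0), max(-c0, 0)                  # clip at borders
        re = min(sh, desire.shape[0] - r0)
        ce = min(sw, desire.shape[1] - c0)
        desire[r0 + rs:r0 + re, c0 + cs:c0 + ce] -= stroke_stamp[rs:re, cs:ce]
    return placed
```

Raising the threshold or widening the blurred stamp thins the strokes out; the desire map itself would be rebuilt from the reference render each frame.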

Overall this is an idea with potential. The paper points to a lot of interesting prior work, too. Another Brown University project!