The realistic rendering of woven and knitted fabrics has been a long-standing challenge. Fiber-based micro-appearance models have achieved considerable success in attaining high levels of realism, but rendering them remains expensive: the intricate internal scattering among the hundreds of fibers within a yarn demands large amounts of memory and time.
In this paper, we introduce a new framework that captures the aggregated appearance of a yarn by tracing many light paths through the underlying fiber geometry. We then employ lightweight neural networks to accurately model the aggregated BSDF, which allows precise modeling of a diverse array of materials while offering substantial speedups and memory savings. Furthermore, we introduce a novel importance sampling scheme to further accelerate convergence. We validate the efficacy and versatility of our framework through comparisons with preceding fiber-based shading models as well as the most recent yarn-based model.
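As a rough illustration of the idea (not the paper's actual architecture), the sketch below shows how a lightweight network could map incoming/outgoing directions plus a small vector of yarn parameters to an aggregated RGB BSDF value, trained by regressing against path-traced samples of the fiber geometry. Everything here is hypothetical: the class name AggregatedYarnBSDF, the layer sizes, the yarn-parameter vector, and the training loop are illustrative placeholders only.

import torch
import torch.nn as nn

class AggregatedYarnBSDF(nn.Module):
    # Hypothetical lightweight MLP: (wi, wo, yarn params) -> RGB BSDF value.
    # Layer widths and the Softplus output (to keep values non-negative)
    # are illustrative choices, not taken from the paper.
    def __init__(self, n_yarn_params: int = 4, hidden: int = 64):
        super().__init__()
        # Input: wi (3) + wo (3) + yarn parameters (e.g. twist, fiber count)
        self.net = nn.Sequential(
            nn.Linear(6 + n_yarn_params, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 3),
            nn.Softplus(),
        )

    def forward(self, wi, wo, params):
        return self.net(torch.cat([wi, wo, params], dim=-1))

# Training sketch: regress against aggregated values obtained by tracing
# many light paths through the fiber geometry (stand-in random data here).
model = AggregatedYarnBSDF()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
wi = torch.randn(1024, 3); wi = wi / wi.norm(dim=-1, keepdim=True)
wo = torch.randn(1024, 3); wo = wo / wo.norm(dim=-1, keepdim=True)
params = torch.rand(1024, 4)   # stand-in yarn parameters
target = torch.rand(1024, 3)   # stand-in for traced ground truth
loss = nn.functional.mse_loss(model(wi, wo, params), target)
opt.zero_grad(); loss.backward(); opt.step()

Once fitted, such a network replaces the costly simulation of hundreds of fibers with a single evaluation per shading point, which is where the speed and memory savings come from.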
We demonstrate that our materials work for many types of cloth, with different base geometries and different reflectance properties. Our method outperforms the original fiber model in speed while maintaining physical accuracy.
Although our method uses a neural aggregated appearance, it produces results that closely match the original fiber-based reference. Furthermore, we demonstrate that our model can handle ply-ply interactions, rendering a 3-ply yarn as shown below, as well as different geometry parameters of the same fiber material.
@article{soh2024neural,
  author    = {Soh, Guan Yu and Montazeri, Zahra},
  title     = {Neural Appearance Model for Cloth Rendering},
  journal   = {Computer Graphics Forum},
  volume    = {43},
  number    = {4},
  pages     = {e15156},
  year      = {2024},
  publisher = {Wiley Online Library}
}