Neural Compression of Material Properties using a Geometry-Associated Feature Hierarchy
Abstract
Texture maps are ubiquitous in 3D rendering for encoding material properties such as albedo color, surface normal vectors, and roughness coefficients. Applying them to the surface of a 3D object requires a mapping from the object’s surface to the flat texture plane (uv-mapping), which can introduce artifacts through unavoidable distortion and seams.
In this work, we propose a novel neural approach to encoding surface material parameters without the need for uv-mapping. We build on recent research on neural function approximation in computer graphics [1, 2], which achieves efficient compression of texture data by training a machine learning model. By parameterizing surface positions relative to their containing mesh triangle, we adapt previous approaches to circumvent the uv-mapping step.
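For illustration only, and not the thesis's actual implementation: a minimal sketch of how a surface sample could be encoded by its triangle and barycentric coordinates instead of a uv-coordinate, with learned per-vertex features decoded to material channels. The use of barycentric coordinates, per-vertex feature vectors, a single (non-hierarchical) feature level, the channel count, and all names (TriangleFeatureField, feature_dim, etc.) are assumptions made for this sketch; PyTorch is chosen only as an example framework.

```python
# Hypothetical sketch: query material parameters for surface points given
# the vertex indices of their containing triangle and their barycentric
# coordinates, bypassing any uv-mapping.
import torch
import torch.nn as nn


class TriangleFeatureField(nn.Module):
    def __init__(self, num_vertices: int, feature_dim: int = 8, out_channels: int = 5):
        super().__init__()
        # One learnable feature vector per mesh vertex (geometry-associated features).
        self.vertex_features = nn.Embedding(num_vertices, feature_dim)
        # Small decoder mapping interpolated features to material channels
        # (e.g. albedo + roughness; the channel layout here is an assumption).
        self.decoder = nn.Sequential(
            nn.Linear(feature_dim, 64),
            nn.ReLU(),
            nn.Linear(64, out_channels),
        )

    def forward(self, tri_vertex_ids: torch.Tensor, bary: torch.Tensor) -> torch.Tensor:
        # tri_vertex_ids: (N, 3) vertex indices of the triangle containing each point
        # bary:           (N, 3) barycentric coordinates of each point in its triangle
        feats = self.vertex_features(tri_vertex_ids)       # (N, 3, feature_dim)
        interp = (bary.unsqueeze(-1) * feats).sum(dim=1)   # barycentric interpolation
        return self.decoder(interp)                        # (N, out_channels)


# Usage: material parameters for two surface samples.
model = TriangleFeatureField(num_vertices=1000)
tri_ids = torch.tensor([[0, 1, 2], [3, 4, 5]])
bary = torch.tensor([[0.2, 0.3, 0.5], [1.0, 0.0, 0.0]])
print(model(tri_ids, bary).shape)  # torch.Size([2, 5])
```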
The evaluation of our prototype shows that our method is capable of encoding detailed, high-resolution textures at satisfactory quality while encoding multiple material channels in a single representation. We evaluate our method on a selection of datasets with a broad range of geometry and texture characteristics and observe that certain characteristics challenge our method more than others. Compression rates range from 66.6% to 8.3% across the examined datasets.
Our outlook discusses, among other points, how the limitation of subpar performance on meshes with low vertex density could be overcome in future work. Furthermore, we lay out how our method’s hierarchical structure could be leveraged to realize low-pass texture filtering.
Degree
Student essay
Date
2025-02-06
Author
Pascal, Walloner
Keywords
texture compression
machine learning
neural networks
input encoding
GPU
uv-mapping
Language
eng