Orientation-aware texture synthesis using rotated texture samples with CNNs
Jong-Hyun Kim*
(* : Inha University)
IEEE Access 2025
Abstract: Texture synthesis is a core technique for generating realistic images and enhancing visual immersion, and it has been widely applied in fields such as computer graphics, gaming, and virtual reality. However, traditional patch-based approaches often suffer from repetitive patterns and boundary artifacts, while CNN-based methods are limited in representing orientation diversity. To address these issues, this study proposes an orientation-aware texture synthesis framework that incorporates rotated texture samples into the CNN synthesis process. The proposed method uses exemplars generated by rotating the input texture at multiple angles, and explicitly handles the null regions created during rotation with a mask-based strategy so that only valid areas contribute to CNN training. In addition, a multi-layer Gram matrix loss over VGG-19 features is employed to preserve both local patterns and global structures simultaneously. Experimental results across diverse datasets demonstrate that the proposed approach synthesizes richer directional patterns more reliably than conventional patch-based or CNN-only methods, and produces natural, continuous results even for anisotropic textures and complex curved patterns. Furthermore, we analyze the trade-off between synthesis quality and computational cost with respect to the number of rotated samples, and identify an optimal configuration. We also confirm that residual noise artifacts inherent in CNN-based synthesis can be effectively alleviated through post-processing with SUNet. By simultaneously achieving orientation diversity, visual fidelity, and robustness, the proposed framework provides a practical solution for real-world applications such as gaming, VR, and digital content production.
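The two mechanisms the abstract names — rotated exemplars with validity masks for the null regions, and a Gram-matrix texture statistic restricted to valid pixels — can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the nearest-neighbour rotation, the fixed output size, and the `masked_gram` normalization are assumptions made for illustration; the paper computes Gram losses over multiple VGG-19 layers rather than a raw feature map.

```python
import numpy as np

def rotate_with_mask(img, angle_deg):
    """Rotate an HxW image by angle_deg (nearest neighbour, same output size),
    returning the rotated image and a validity mask: 1 where the pixel maps
    back inside the original frame, 0 in the null regions created by rotation.
    (Illustrative stand-in for the paper's rotated-exemplar generation.)"""
    h, w = img.shape[:2]
    theta = np.deg2rad(angle_deg)
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w]
    # Inverse-map each output pixel back into the source image.
    yr = (ys - cy) * np.cos(theta) - (xs - cx) * np.sin(theta) + cy
    xr = (ys - cy) * np.sin(theta) + (xs - cx) * np.cos(theta) + cx
    yi, xi = np.round(yr).astype(int), np.round(xr).astype(int)
    valid = (yi >= 0) & (yi < h) & (xi >= 0) & (xi < w)
    out = np.zeros_like(img)
    out[valid] = img[yi[valid], xi[valid]]
    return out, valid.astype(np.float32)

def masked_gram(features, mask):
    """Gram matrix of a (C, H, W) feature map computed over valid pixels only.
    Zeroing the null regions before the outer product keeps rotation padding
    from distorting the texture statistics, as in the mask-based strategy."""
    c = features.shape[0]
    f = (features * mask).reshape(c, -1)   # null pixels contribute nothing
    n = max(mask.sum(), 1.0)               # normalize by valid-pixel count
    return f @ f.T / n
```

In a full pipeline, `rotate_with_mask` would be applied at each chosen angle to build the exemplar set, and the masked Gram statistics of the synthesized image would be matched against those of each rotated exemplar at several VGG-19 layers.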