Real-time Hair Segmentation and Recoloring on Mobile GPUs
Andrei Tkachenka, Gregory Karpiak, Yury Kartynnik, Artsiom Ablavatski, Valentin Bazarevsky and Siargey Pisarchyk.
We present a novel approach for neural network-based hair segmentation from a single camera input, designed specifically for real-time mobile applications. Our relatively small neural network produces a high-quality hair segmentation mask that is well suited for AR effects, e.g., virtual hair recoloring. The proposed model achieves real-time inference speed on mobile GPUs (30 to 100+ FPS, depending on the device) with high accuracy. We also propose a hair recoloring scheme that produces highly realistic results. Our method has been deployed in a major AR application and is used by millions of users.
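To illustrate how a soft segmentation mask can drive recoloring, the sketch below blends a target color into the predicted hair region while preserving per-pixel luminance so the hair's shading and texture survive the color change. This is a minimal illustrative example, not the paper's actual recoloring scheme; the function name, luminance weights, and `strength` parameter are assumptions.

```python
import numpy as np

def recolor_hair(image, mask, target_rgb, strength=0.8):
    """Blend a target color into the hair region given a soft mask.

    image: float32 array (H, W, 3), RGB in [0, 1]
    mask: float32 array (H, W) in [0, 1], 1.0 where hair is predicted
    target_rgb: three floats in [0, 1]
    strength: overall recoloring intensity (hypothetical parameter)
    """
    target = np.asarray(target_rgb, dtype=np.float32)
    # Keep per-pixel luminance so strands and shading remain visible
    # after recoloring (Rec. 601 luma weights).
    luma = image @ np.float32([0.299, 0.587, 0.114])
    colored = luma[..., None] * target
    # Soft mask gives smooth transitions at the hair boundary.
    alpha = (strength * mask)[..., None]
    return (1.0 - alpha) * image + alpha * colored

# Usage: a tiny gray "image" whose top row is masked as hair.
img = np.full((2, 2, 3), 0.5, dtype=np.float32)
msk = np.array([[1.0, 1.0], [0.0, 0.0]], dtype=np.float32)
out = recolor_hair(img, msk, (1.0, 0.0, 0.0))  # recolor toward red
```

Because the mask is continuous rather than binary, the blend degrades gracefully near the hairline instead of producing a hard edge.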
Model summary following M. Mitchell et al., "Model Cards for Model Reporting", FAT* '19: Conference on Fairness, Accountability, and Transparency, January 29–31, 2019, Atlanta, GA, USA.