Author(s): Ning, X (Ning, Xin); He, F (He, Feng); Dong, XL (Dong, Xiaoli); Li, WJ (Li, Weijun); Alenezi, F (Alenezi, Fayadh); Tiwari, P (Tiwari, Prayag)
Source: INFORMATION SCIENCES Volume: 660 Article Number: 120130 DOI: 10.1016/j.ins.2024.120130 Early Access Date: JAN 2024 Published: MAR 2024
Abstract: Face attribute synthesis is a typical application of neural network technology. However, most current methods suffer from uncontrollable attribute intensity. In this study, we propose a novel intensity-controllable generation network (ICGNet) based on covering learning for face attribute synthesis. Specifically, it includes an encoder module, based on the principle of homology continuity between homologous samples, that maps facial images from different condition spaces onto the face feature space, constructing sufficient and effective representation vectors from the input information. It then models the relationships between attribute instances and representation vectors in that space to ensure accurate synthesis of the target attribute and complete preservation of irrelevant regions. Finally, it produces progressive changes in facial attributes by applying different intensity constraints to the representation vectors. Compared with other methods, ICGNet achieves intensity-controllable face editing by extracting sufficient and effective representation features, exploring and transferring attribute relationships, and maintaining identity information. The source code is available at https://github.com/kllaodong/-ICGNet.
Highlights:
• We designed a new encoder module to map face images from different condition spaces into the face feature space and obtain sufficient and effective face feature representations.
• Building on this feature extraction, we proposed a novel intensity-controllable generation network (ICGNet) that realizes face attribute synthesis with continuous intensity control while maintaining identity and semantic information.
• Quantitative and qualitative results show that ICGNet outperforms current advanced models.
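Note: the abstract describes an encode-edit-decode workflow in which a scalar intensity scales the attribute change applied to a representation vector. The sketch below is a minimal, hypothetical illustration of that idea only; it is not the authors' ICGNet implementation, and the module layouts, dimensions, the `attribute_direction` vector, and the linear-shift formulation are all assumptions introduced for illustration.

```python
# Hypothetical sketch of intensity-controllable attribute editing in the spirit
# of the abstract (NOT the authors' ICGNet; all names and sizes are assumptions).
import torch
import torch.nn as nn


class Encoder(nn.Module):
    """Maps a face image to a representation vector in a shared feature space."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(128, latent_dim),
        )

    def forward(self, x):
        return self.net(x)


class Decoder(nn.Module):
    """Reconstructs a face image from an (edited) representation vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, z):
        h = self.fc(z).view(-1, 128, 8, 8)
        return self.net(h)


def edit_attribute(encoder, decoder, image, attribute_direction, intensity):
    """Apply an attribute edit whose strength is set by the scalar `intensity`.

    `attribute_direction` stands in for a learned vector relating an attribute
    instance to the representation space; scaling it controls how strongly the
    attribute appears while the rest of the vector is left untouched.
    """
    z = encoder(image)                          # representation vector
    z_edit = z + intensity * attribute_direction
    return decoder(z_edit)


if __name__ == "__main__":
    # Usage: sweep the intensity to obtain progressive attribute changes.
    enc, dec = Encoder(), Decoder()
    img = torch.randn(1, 3, 32, 32)             # toy input image
    direction = torch.randn(256)                # placeholder attribute direction
    for alpha in (0.0, 0.5, 1.0):
        out = edit_attribute(enc, dec, img, direction, alpha)
        print(alpha, out.shape)                 # (1, 3, 32, 32) at each intensity
```

Sweeping the intensity value from 0 to 1, as in the loop above, is one simple way to realize the progressive, continuous attribute changes the abstract refers to.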
Accession Number: WOS:001168971300001
ISSN: 0020-0255
eISSN: 1872-6291