HairStyle Editing via Parametric Controllable Strokes

Bibliographic details
Published in: IEEE transactions on visualization and computer graphics. - 1996. - 30(2024), 7, 10 June, pages 3857-3870
First author: Song, Xinhui (Author)
Other authors: Liu, Chen, Zheng, Youyi, Feng, Zunlei, Li, Lincheng, Zhou, Kun, Yu, Xin
Format: Online article
Language: English
Published: 2024
Parent work: IEEE transactions on visualization and computer graphics
Keywords: Journal Article
LEADER 01000caa a22002652 4500
001 NLM355271516
003 DE-627
005 20240628231859.0
007 cr uuu---uuuuu
008 231226s2024 xx |||||o 00| ||eng c
024 7 |a 10.1109/TVCG.2023.3241894  |2 doi 
028 5 2 |a pubmed24n1454.xml 
035 |a (DE-627)NLM355271516 
035 |a (NLM)37022457 
040 |a DE-627  |b ger  |c DE-627  |e rakwb 
041 |a eng 
100 1 |a Song, Xinhui  |e verfasserin  |4 aut 
245 1 0 |a HairStyle Editing via Parametric Controllable Strokes 
264 1 |c 2024 
336 |a Text  |b txt  |2 rdacontent 
337 |a Computermedien  |b c  |2 rdamedia 
338 |a Online-Ressource  |b cr  |2 rdacarrier 
500 |a Date Revised 28.06.2024 
500 |a published: Print-Electronic 
500 |a Citation Status PubMed-not-MEDLINE 
520 |a In this work, we propose a stroke-based hairstyle editing network, dubbed HairstyleNet, allowing users to conveniently change the hairstyles of an image in an interactive fashion. Different from previous works, we simplify the hairstyle editing process where users can manipulate local or entire hairstyles by adjusting the parameterized hair regions. Our HairstyleNet consists of two stages: a stroke parameterization stage and a stroke-to-hair generation stage. In the stroke parameterization stage, we first introduce parametric strokes to approximate the hair wisps, where the stroke shape is controlled by a quadratic Bézier curve and a thickness parameter. Since rendering strokes with thickness to an image is not differentiable, we opt to leverage a neural renderer to construct the mapping from stroke parameters to a stroke image. Thus, the stroke parameters can be directly estimated from hair regions in a differentiable way, enabling us to flexibly edit the hairstyles of input images. In the stroke-to-hair generation stage, we design a hairstyle refinement network that first encodes coarsely composed images of hair strokes, face, and background into latent representations and then generates high-fidelity face images with desirable new hairstyles from the latent codes. Extensive experiments demonstrate that our HairstyleNet achieves state-of-the-art performance and allows flexible hairstyle manipulation. 
650 4 |a Journal Article 
700 1 |a Liu, Chen  |e verfasserin  |4 aut 
700 1 |a Zheng, Youyi  |e verfasserin  |4 aut 
700 1 |a Feng, Zunlei  |e verfasserin  |4 aut 
700 1 |a Li, Lincheng  |e verfasserin  |4 aut 
700 1 |a Zhou, Kun  |e verfasserin  |4 aut 
700 1 |a Yu, Xin  |e verfasserin  |4 aut 
773 0 8 |i Enthalten in  |t IEEE transactions on visualization and computer graphics  |d 1996  |g 30(2024), 7 vom: 10. Juni, Seite 3857-3870  |w (DE-627)NLM098269445  |x 1941-0506  |7 nnns 
773 1 8 |g volume:30  |g year:2024  |g number:7  |g day:10  |g month:06  |g pages:3857-3870 
856 4 0 |u http://dx.doi.org/10.1109/TVCG.2023.3241894  |3 Volltext 
912 |a GBV_USEFLAG_A 
912 |a SYSFLAG_A 
912 |a GBV_NLM 
912 |a GBV_ILN_350 
951 |a AR 
952 |d 30  |j 2024  |e 7  |b 10  |c 06  |h 3857-3870
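Note on the abstract (field 520): each hair stroke is described as a quadratic Bézier centerline plus a thickness parameter, and rasterizing such a thick stroke is noted to be non-differentiable, which is why the paper maps stroke parameters to a stroke image with a learned neural renderer. Purely to make that parameter-to-image mapping concrete, below is a minimal NumPy sketch; the function names, canvas size, and sigmoid soft-coverage falloff are assumptions of this sketch, not the paper's renderer or notation.

    import numpy as np

    def quad_bezier(p0, p1, p2, t):
        # Centerline B(t) = (1-t)^2 P0 + 2(1-t)t P1 + t^2 P2, evaluated at all t (shape [n]).
        t = t[:, None]
        return (1 - t) ** 2 * p0 + 2 * (1 - t) * t * p1 + t ** 2 * p2

    def rasterize_stroke(p0, p1, p2, thickness, size=64, sharpness=50.0):
        # Illustrative soft rasterizer (NOT the paper's neural renderer):
        # sample the Bézier centerline, then give each pixel a coverage value
        # that decays smoothly with its distance to the nearest sample.
        pts = quad_bezier(np.asarray(p0, float), np.asarray(p1, float),
                          np.asarray(p2, float), np.linspace(0.0, 1.0, 64))
        ys, xs = np.mgrid[0:size, 0:size]
        grid = np.stack([xs, ys], axis=-1) / (size - 1.0)            # pixel centers in [0,1]^2
        d = np.linalg.norm(grid[:, :, None, :] - pts[None, None], axis=-1).min(axis=-1)
        return 1.0 / (1.0 + np.exp(sharpness * (d - thickness / 2.0)))  # ~1 inside the stroke band

    # Example: one gently curved stroke on a 64x64 canvas (all values are illustrative).
    img = rasterize_stroke(p0=(0.1, 0.8), p1=(0.5, 0.2), p2=(0.9, 0.7), thickness=0.08)

The sigmoid soft-coverage here only stands in for the parameter-to-image mapping; in the paper that mapping is realized by a neural renderer so that stroke parameters can be estimated from hair regions in a differentiable way.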