Theoretical and Natural Science
- The Open Access Proceedings Series for Conferences
Vol. 18, 08 December 2023
* Author to whom correspondence should be addressed.
Existing image translation methods already enable style transfer on unpaired data. Although these methods yield satisfactory results, they tend to alter the background while translating the object. One reason is that in convolutional neural networks, global information is lost as the number of layers increases, and the lack of an effective receptive field leads to failure to generate high-quality results. This paper proposes a Non-Local-Attention Cycle-Consistent Adversarial Network for unpaired image style transfer. Non-local attention quickly captures long-range dependencies, extracts global information more effectively, keeps the focus on the foreground while preserving the background, and can easily be embedded into existing network architectures. Experiments on a neural style transfer task with a public dataset show that this model obtains better results than CycleGAN: it attends to structural features rather than only textural features, and it can reconstruct some of the content lost by CycleGAN. Recent research has also demonstrated that the choice of optimizer affects network performance. This paper applies the Nadam optimizer and finds that it improves the training process.
Non-local Attention, CycleGAN, Style Transfer
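The abstract does not give the paper's exact architecture, so the following is a minimal NumPy sketch of an embedded-Gaussian non-local block in the style of Wang et al. [10], operating on a feature map flattened over spatial positions. All names, shapes, and weight layouts here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def nonlocal_block(x, w_theta, w_phi, w_g, w_out):
    """Embedded-Gaussian non-local block (sketch after Wang et al. 2017).

    x: (N, C) feature map flattened over spatial positions (N = H * W).
    w_theta, w_phi, w_g: (C, C_inner) projection weights.
    w_out: (C_inner, C) output projection.
    Returns x plus the attention output (residual), shape (N, C).
    """
    theta = x @ w_theta          # queries (N, C_inner)
    phi = x @ w_phi              # keys    (N, C_inner)
    g = x @ w_g                  # values  (N, C_inner)
    # Pairwise affinity between every position and every other position:
    # this single step is what captures long-range dependencies, unlike
    # stacked convolutions whose receptive field grows only layer by layer.
    logits = theta @ phi.T       # (N, N)
    attn = np.exp(logits - logits.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)   # row-wise softmax
    y = attn @ g                 # aggregate global context (N, C_inner)
    return x + y @ w_out         # residual makes the block drop-in embeddable
```

The residual connection is what makes the block easy to embed in an existing generator: with `w_out` initialized to zero the block is an identity, so it can be inserted without disturbing pretrained behavior.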
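The Nadam update rule itself is compact enough to state directly. Below is a sketch of a single Nadam step after Dozat [11] (Adam with Nesterov momentum); hyperparameter values follow common defaults and are not necessarily the settings used in the paper.

```python
import numpy as np

def nadam_step(theta, grad, state, lr=2e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Nadam update (Adam with Nesterov momentum, after Dozat 2015).

    state = (m, v, t): first moment, second moment, and step count.
    """
    m, v, t = state
    t += 1
    m = beta1 * m + (1 - beta1) * grad            # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2       # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                  # bias correction
    v_hat = v / (1 - beta2 ** t)
    # Nesterov look-ahead: mix the bias-corrected momentum with the current
    # gradient instead of using m_hat alone, as plain Adam would.
    update = beta1 * m_hat + (1 - beta1) * grad / (1 - beta1 ** t)
    theta = theta - lr * update / (np.sqrt(v_hat) + eps)
    return theta, (m, v, t)
```

The only difference from Adam is the `update` line: applying the momentum coefficient to the look-ahead term incorporates Nesterov's anticipatory gradient, which is the property credited with smoothing the training process.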
1. Hertzmann A et al. 2001 Image analogies In: Proceedings of the 28th Annual Conference on Computer Graphics and Interactive Techniques (SIGGRAPH '01) New York, New York
2. Goodfellow I J et al. 2014 Generative adversarial networks arXiv [stat.ML] doi:10.48550/arXiv.1406.2661 http://arxiv.org/abs/1406.2661
3. Isola P et al. 2016 Image-to-image translation with conditional adversarial networks arXiv [cs.CV] doi:10.48550/arXiv.1611.07004 http://arxiv.org/abs/1611.07004
4. Mirza M et al. 2014 Conditional generative adversarial nets arXiv [cs.LG] doi:10.48550/arXiv.1411.1784 http://arxiv.org/abs/1411.1784
5. Rosales R et al. 2003 Unsupervised image translation In: Proceedings Ninth IEEE International Conference on Computer Vision IEEE p 472–478 vol 1
6. Zhu J-Y et al. 2017 Unpaired image-to-image translation using cycle-consistent adversarial networks arXiv [cs.CV] doi:10.48550/arXiv.1703.10593 http://arxiv.org/abs/1703.10593
7. Kim T et al. 2017 Learning to discover cross-domain relations with generative adversarial networks arXiv [cs.CV] doi:10.48550/arXiv.1703.05192 http://arxiv.org/abs/1703.05192
8. Yi Z et al. 2017 DualGAN: Unsupervised dual learning for image-to-image translation arXiv [cs.CV] doi:10.48550/arXiv.1704.02510 [accessed 2022 Sep 29] http://arxiv.org/abs/1704.02510
9. Luo W et al. 2017 Understanding the effective receptive field in deep convolutional neural networks arXiv [cs.CV] doi:10.48550/arXiv.1701.04128 http://arxiv.org/abs/1701.04128
10. Wang X et al. 2017 Non-local neural networks arXiv [cs.CV] doi:10.48550/arXiv.1711.07971 http://arxiv.org/abs/1711.07971
11. Dozat T 2015 Incorporating Nesterov momentum into Adam Stanford.edu https://cs229.stanford.edu/proj2015/054_report.pdf
12. Index of /~taesung_park/CycleGAN/datasets Berkeley.edu https://people.eecs.berkeley.edu/~taesung_park/CycleGAN/datasets/
13. Christiansen L C 2014 Find your inspiration: Finding your balance of health and fitness Lawton, OK: Penguin International Publishing
14. WikiArt.org 2022 Visual art encyclopedia https://www.wikiart.org/
The datasets used and/or analyzed during the current study will be available from the authors upon reasonable request.
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License. Authors who publish with this series agree to the following terms:
1. Authors retain copyright and grant the series right of first publication with the work simultaneously licensed under a Creative Commons Attribution License that allows others to share the work with an acknowledgment of the work's authorship and initial publication in this series.
2. Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the series's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this series.
3. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work (See Open Access Instruction).