Abstract
Brush stroke models play an important role in robotic Chinese calligraphy: they form the basis of calligraphy stroke generation and support the training of robotic writing models. In contrast to most current stroke models, which consider only graphic-generation features, we propose a novel stroke model based on a composite curve and morphological dilation, derived from the physical characteristics and writing posture of the brush. In the proposed composite-curve-dilation brush stroke model (CCD-BSM), an oblique section of a cone and two tangent parabolas form a basic graphic, which is then dilated with a fixed coefficient according to the extrusion and diffusion characteristics of the brush hairs. The CCD-BSM can simulate the graphics formed by brushes of various specifications touching the paper in various postures. Moreover, its parameters are measurable and controllable, requiring neither a parameter-estimation procedure nor a large number of training samples. Experiments comparing the generated graphics with real strokes written by a robot show that the CCD-BSM reproduces brush-stroke graphics with high similarity and outperforms state-of-the-art stroke models, demonstrating the robustness and efficacy of a model whose parameters are measurable and controllable.
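To make the geometric idea concrete, the following is a minimal, illustrative sketch of a CCD-BSM-style footprint: an elliptical region stands in for the oblique cone section, a parabolic tip stands in for the two tangent parabolas, and a morphological dilation with a fixed coefficient stands in for the spreading of the brush hairs. The shapes, parameter names (a, b, tip_len, dilation_coeff), and the dilation-radius rule are assumptions made for illustration, not the paper's exact formulation.

```python
# Illustrative sketch only: approximates the CCD-BSM idea of a composite base
# footprint (elliptical cone section + parabolic tip) followed by morphological
# dilation. All shapes and parameters here are assumptions, not the paper's model.
import numpy as np
import cv2

def stroke_footprint(canvas=256, a=40, b=18, tip_len=50, dilation_coeff=0.15):
    """Binary footprint of a single brush press: ellipse plus parabolic tip,
    then dilated to mimic the extrusion/diffusion of brush hairs."""
    img = np.zeros((canvas, canvas), np.uint8)
    cx, cy = canvas // 2, canvas // 2

    # Oblique cone section approximated by an ellipse with semi-axes (a, b).
    cv2.ellipse(img, (cx, cy), (a, b), 0, 0, 360, 255, -1)

    # Parabolic tip: the upper and lower boundaries y = +/- b*sqrt(1 - x/L)
    # are two parabola arcs meeting at the stroke tip.
    for dx in range(tip_len):
        half_width = b * np.sqrt(1.0 - dx / tip_len)
        y0, y1 = int(cy - half_width), int(cy + half_width)
        img[y0:y1 + 1, cx + a + dx] = 255

    # Dilation with a radius set by a fixed coefficient of the ellipse width,
    # standing in for hair spreading under pressure.
    r = max(1, int(dilation_coeff * b))
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (2 * r + 1, 2 * r + 1))
    return cv2.dilate(img, kernel)

mask = stroke_footprint()  # 256x256 uint8 mask of the simulated brush press
```

In the actual model the dilation coefficient would be tied to measurable brush properties such as press depth and hair stiffness; here it is simply a fixed fraction of the ellipse width.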
Acknowledgements
This work is jointly supported by the National Natural Science Foundation of China (Grant No. 62073249) and the China Postdoctoral Science Foundation (Grant No. 2020M672426).
Ethics declarations
Statements and Declarations
The authors have no competing interests relevant to the content of this article. All authors certify that they have no affiliations with or involvement in any organization or entity with any financial or non-financial interest in the subject matter or materials discussed in this manuscript. The data are deposited at https://doi.org/10.5281/zenodo.6470051 and are available upon request.
Compliance with Ethical Standards
The authors have no potential conflicts of interest. The research in this manuscript does not involve human participants or animals.
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Guo, D., Ye, L., Yan, G. et al. CCD-BSM: composite-curve-dilation brush stroke model for robotic Chinese calligraphy. Appl Intell 53, 14269–14283 (2023). https://doi.org/10.1007/s10489-022-04210-y