Article abstract

Journal of Educational Research and Reviews
Research Article | Published July 2019 | Volume 7, Issue 7, pp. 155-168.
doi: https://doi.org/10.33495/jerr_v7i7.19.128

Predicting developmental degrees of music expression in early childhood by machine learning classifiers with 3D motion captured body movement data

Mina Sano

Osaka Shoin Women’s University, Higashi-Osaka City, Osaka 577-8550, Japan.

Citation: Sano M (2019). Predicting developmental degrees of music expression in early childhood by machine learning classifiers with 3D motion captured body movement data. J. Edu. Res. Rev. 7(7): 155-168.

Abstract

The interaction between children’s degree of musical development and their musical expression has long intrigued researchers. A noteworthy current direction is to analyze this interaction quantitatively and to develop a predictive methodology that replicates it statistically. In this study, the author extracted developmental characteristics of musical expression in early childhood from the viewpoint of body-movement elements, and applied machine-learning classification to the feature quantities acquired from the participating children. Classification models were applied to the feature quantities of 3-, 4-, and 5-year-old children, recorded with 3D motion capture at two nursery schools in 2016, two kindergartens in 2017, and a certified childcare facility in 2018. To highlight developmental degree and extract feature quantities, a three-way non-repeated ANOVA was applied; statistically significant differences were observed in the moving average of distance traveled (e.g., pelvis and right hand), the moving average of acceleration (e.g., right hand), and the movement smoothness of the right foot. The author then classified the developmental degree of children’s musical expression with machine-learning classifiers trained on these motion-capture feature quantities, using as target labels the categorical developmental degrees that the author evaluated from simultaneously recorded video. The best-performing classifier was a multilayer perceptron neural network, and the second best was boosted trees. Sensitivity analysis showed that movement of the pelvis was strongly related to the degree of musical development.
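To illustrate the kind of pipeline the abstract describes, here is a minimal Python sketch, not the author’s actual implementation: it computes moving-average distance, moving-average acceleration, and a jerk-based smoothness score from a 3D marker trajectory, then compares a multilayer perceptron against boosted trees with scikit-learn. The synthetic trajectories, feature definitions, window size, and classifier settings are all illustrative assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def movement_features(positions, window=30):
    """Summarize one marker's 3D trajectory (n_frames x 3 array).

    Returns the mean moving-average per-frame distance, the mean
    moving-average acceleration magnitude, and a jerk-based
    smoothness score (higher = smoother). These are illustrative
    stand-ins for the feature quantities named in the abstract.
    """
    vel = np.diff(positions, axis=0)                 # per-frame displacement vectors
    disp = np.linalg.norm(vel, axis=1)               # per-frame distance traveled
    acc = np.linalg.norm(np.diff(vel, axis=0), axis=1)
    jerk = np.diff(acc)
    kernel = np.ones(window) / window
    ma_dist = np.convolve(disp, kernel, mode="valid").mean()
    ma_acc = np.convolve(acc, kernel, mode="valid").mean()
    smoothness = -np.log1p(np.mean(jerk ** 2))
    return np.array([ma_dist, ma_acc, smoothness])

# Synthetic stand-in for motion-capture data: 120 children, each with a
# 300-frame trajectory whose movement scale loosely tracks a
# developmental-degree label in {0, 1, 2}.
X, y = [], []
for _ in range(120):
    label = int(rng.integers(0, 3))
    scale = 0.5 + 0.5 * label
    traj = np.cumsum(rng.normal(0, scale, size=(300, 3)), axis=0)
    X.append(movement_features(traj))
    y.append(label)
X, y = np.array(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

for name, clf in [
    ("MLP", MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)),
    ("Boosted Trees", GradientBoostingClassifier(random_state=0)),
]:
    clf.fit(X_tr, y_tr)
    print(f"{name}: test accuracy = {clf.score(X_te, y_te):.2f}")
```

In a real study, the rows of `X` would come from per-child motion-capture recordings (one feature vector per body part of interest, e.g., pelvis, right hand, right foot) and `y` from the evaluator-assigned developmental degrees.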

 

Keywords: musical expression in early childhood; 3D motion capture; feature quantity; machine learning; multilayer perceptron neural network

Copyright © 2019. Author(s) retain the copyright of this article.

This article is published under the terms of the Creative Commons Attribution License 4.0


