DIGITAL TECHNOLOGIES IN TEACHING CONDUCTING

Received: 24th November 2022; Revised: 18th January 2023, 20th January 2023; Accepted: 24th January 2023

Authors

  • Adam Rosiński, Researcher and Lecturer, Institute of Music, Faculty of Arts, University of Warmia and Mazury in Olsztyn, Poland

DOI:

https://doi.org/10.20319/pijtel.2023.63.5767

Keywords:

Conducting, Teaching, Music, Education, Student

Abstract

The objective of this paper was to develop a digital technology that can be used in a simple way in online conducting classes as part of specialised university education. The transition to remote teaching caused by the COVID-19 pandemic highlighted a gap in this field. The article presents a concept for using specialised software that maps the human body in 3D space and analyses its movement in real time. The software combines two images provided by the cameras of a smartphone and a computer. Because laptops and smartphones are now commonplace, the cost of purchasing specialised equipment can be avoided. The technology was tested with professors and students in order to verify: the stability of the software, the level of detail in the rendered movements of the virtual conductor after mapping the professors, the comprehensibility of the commands given to the students, the students' performance in the tasks, and the threshold above which the software recognised an individual's movements as correct. The software is continuously being developed and adapted to the needs of new users, which may open up future opportunities to use it for other educational and non-educational purposes.
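The abstract describes the software only at the concept level, so the following Python sketch is merely an illustration of the threshold-based movement check mentioned above: a student's tracked 3D joint positions are compared against the conductor's reference gesture and the movement is accepted when the average deviation stays below a tunable threshold. All names (Pose, gesture_matches, DEFAULT_THRESHOLD) and the simple Euclidean metric are assumptions made for illustration, not the article's actual implementation.

```python
import math
from dataclasses import dataclass
from typing import Dict, Tuple

# Hypothetical 3D joint position (x, y, z) in normalised coordinates.
Joint = Tuple[float, float, float]

@dataclass
class Pose:
    # Joints keyed by name, e.g. "right_wrist"; placeholder for whatever
    # skeleton the pose-tracking software actually produces.
    joints: Dict[str, Joint]

# Assumed acceptance threshold; the paper only states that such a threshold
# exists and that its level was examined during testing with students.
DEFAULT_THRESHOLD = 0.15

def joint_distance(a: Joint, b: Joint) -> float:
    """Euclidean distance between two 3D joint positions."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def gesture_matches(student: Pose, reference: Pose,
                    threshold: float = DEFAULT_THRESHOLD) -> bool:
    """Accept the student's pose if its mean deviation from the
    reference conductor's pose is below the threshold."""
    common = student.joints.keys() & reference.joints.keys()
    if not common:
        return False
    mean_error = sum(
        joint_distance(student.joints[j], reference.joints[j]) for j in common
    ) / len(common)
    return mean_error < threshold

if __name__ == "__main__":
    # Made-up wrist positions for a single frame.
    reference = Pose({"right_wrist": (0.50, 1.20, 0.30)})
    attempt = Pose({"right_wrist": (0.55, 1.18, 0.33)})
    print(gesture_matches(attempt, reference))  # True: within the assumed threshold
```

A real check of a conducting gesture would of course evaluate trajectories over time (beat patterns, tempo) rather than a single frame; the sketch shows only the thresholding idea described in the abstract.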



Published

2023-01-03

How to Cite

Rosiński, A. (2023). DIGITAL TECHNOLOGIES IN TEACHING CONDUCTING. PUPIL: International Journal of Teaching, Education and Learning, 6(3), 57–67. https://doi.org/10.20319/pijtel.2023.63.5767