Music Segmentation and Similarity Estimation Applied to a Gaze-Controlled Musical Interface

Authors

DOI:

https://doi.org/10.33871/23179937.2023.11.1.7068

Keywords:

Music Information Retrieval, Eye tracking, Optimization, Musical Interface

Abstract

Assistive technology, especially gaze-controlled technology, can promote accessibility, health care, well-being, and inclusion for people with disabilities, including musical activities supported by interfaces controlled through eye tracking. In addition, the growth of the Internet has opened access to huge digital music databases, which can contribute to new forms of music creation. In this paper, we propose the application of Music Information Retrieval techniques for music segmentation and similarity identification, aiming at a new form of musical creation in which segments are combined by an automatic process driven by the Harmony Search optimization algorithm. These segmentation and similarity techniques were implemented in an assistive musical interface controlled by eye movement to support musical creation and well-being. The experimental results are available at https://bit.ly/2Zl7KSC.
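As context for the abstract, the Python sketch below illustrates one plausible shape of the pipeline it describes: beat-aligned segmentation, similarity estimation from constant-Q chroma features (librosa, cited in the references, provides both), and a toy Harmony Search loop that combines segments. This is a minimal sketch, not the authors' implementation; the file name track.wav, all parameter values, and the similarity-based fitness function are illustrative assumptions.

# Illustrative sketch only -- not the authors' implementation.
import numpy as np
import librosa

def segment_and_compare(path):
    """Split a track at beat boundaries; return segments and a similarity matrix."""
    y, sr = librosa.load(path)                        # mono audio
    _, beats = librosa.beat.beat_track(y=y, sr=sr)    # beat positions, in frames
    chroma = librosa.feature.chroma_cqt(y=y, sr=sr)   # constant-Q chroma per frame
    # One averaged chroma vector per beat-to-beat segment
    feats = np.array([chroma[:, s:e].mean(axis=1)
                      for s, e in zip(beats[:-1], beats[1:])])
    feats /= np.linalg.norm(feats, axis=1, keepdims=True) + 1e-9
    edges = librosa.frames_to_samples(beats)          # segment edges, in samples
    segments = [y[s:e] for s, e in zip(edges[:-1], edges[1:])]
    return segments, feats @ feats.T                  # cosine similarity

def harmony_search(n_segments, length, fitness, iters=500, hms=20, hmcr=0.9, par=0.3):
    """Toy Harmony Search over sequences of segment indices, maximizing fitness."""
    rng = np.random.default_rng(42)
    memory = [rng.integers(n_segments, size=length) for _ in range(hms)]
    scores = [fitness(h) for h in memory]
    for _ in range(iters):
        new = np.empty(length, dtype=int)
        for i in range(length):
            if rng.random() < hmcr:                   # memory consideration
                new[i] = memory[rng.integers(hms)][i]
                if rng.random() < par:                # pitch adjustment
                    new[i] = (new[i] + rng.choice([-1, 1])) % n_segments
            else:                                     # random improvisation
                new[i] = rng.integers(n_segments)
        score, worst = fitness(new), int(np.argmin(scores))
        if score > scores[worst]:                     # replace the worst harmony
            memory[worst], scores[worst] = new, score
    return memory[int(np.argmax(scores))]

# Hypothetical usage: favor sequences whose consecutive segments sound alike.
segments, sim = segment_and_compare("track.wav")
best = harmony_search(len(segments), length=8,
                      fitness=lambda h: sum(sim[a, b] for a, b in zip(h[:-1], h[1:])))

The memory consideration, pitch adjustment, and random improvisation steps follow the standard Harmony Search formulation (Geem, 2010, cited below); the fitness here simply rewards sequences whose consecutive segments have similar chroma.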

Downloads

No statistical data available.

Author Biographies

Higor Camporez, Federal University of Espírito Santo

Higor A. F. Camporez was born in Espírito Santo, Brazil. He received his B.S. degree in computer engineering in 2018 and his M.S. degree in electrical engineering in 2020, both from the Federal University of Espírito Santo (UFES), Espírito Santo, Brazil, and has been a PhD student in electrical engineering at UFES since 2020. He is a member of the NESCoM research group, which carries out Computer Music research, especially on Ubiquitous Music. His research interests include Ubimus, robotic musicians, optimization, artificial intelligence, and telecommunication systems.

Yasmin M. de Freitas, Federal University of Espírito Santo

Yasmin M. de Freitas is a master's student in the Postgraduate Program in Art and New Media (PPGA - UFES). She holds a degree in Music from the Federal University of Espírito Santo (UFES) and is a member of the Sound Experimentation Group (GEXS), which works in partnership with the NESCoM research group (Núcleo Espírito-Santense de Computação Musical).

Jair A. L. Silva, Federal University of Espírito Santo

Jair A. L. Silva received his B.S., M.S., and PhD degrees in electrical engineering from the Federal University of Espírito Santo (UFES), Vitória, Brazil, in 2003, 2006, and 2011, respectively. In 2012, he joined the Department of Electrical Engineering at UFES. His research interests include optical fiber communication, orthogonal frequency-division multiplexing (OFDM), and passive optical networks (PON).

Leandro L. Costalonga, Federal University of Espírito Santo

Leandro Costalonga holds a degree in Computer Science, with a master's degree (UFRGS, Brazil) and a PhD (University of Plymouth, UK), both in Computer Music. He is an associate professor at the Federal University of Espírito Santo (UFES, Brazil), where he teaches in the undergraduate programs in Computer Science and Computer Engineering and in the Graduate Program in Arts. He heads the NESCoM research group, which carries out Computer Music research, especially on Ubiquitous Music. Besides Computer Music, his research interests include Human-Computer Interaction, Programming Languages, and Artificial Intelligence.

Helder R. O. Rocha, Federal University of Espírito Santo

Helder R. O. Rocha was born in Santo Antão, Cabo Verde. He received his B.S. degree in electrical engineering in 2002 and his M.S. and D.Sc. degrees in computing science in 2005 and 2010 from the Fluminense Federal University (UFF), Rio de Janeiro, Brazil. In 2013, he joined the Department of Computing and Electronics at UFES, and in 2018 the Department of Electrical Engineering at UFES. His research interests include optimization, artificial intelligence in smart grids, and telecommunication systems.

References

BENWARD, Bruce; SAKER, Marilyn. Music in Theory and Practice, volume 1. New York, NY, USA: McGraw-Hill, 2008.

BROWN, Judith C. Calculation of a constant Q spectral transform. The Journal of the Acoustical Society of America, v. 89, n. 1, p. 425–434, 1991. DOI: https://doi.org/10.1121/1.400476

CAMPOREZ, Higor et al. Features extraction and segmentation for an assistive musical interface. In: THE 10TH WORKSHOP ON UBIQUITOUS MUSIC, 10, 2020. Ubiquitous Music and Everyday Creativity. g-ubimus, 2020. Available at: <https://doi.org/10.5281/zenodo.4248230>.

CAMPOREZ, Higor A. F. et al. Interface Computacional para Controle Musical Utilizando os Movimentos dos Olhos. Revista Vórtex, v. 6, n. 2, p. 1–17, 2018a. Available at: <http://vortex.unespar.edu.br/camporez_et_al_v6_n2.pdf>. DOI: https://doi.org/10.33871/23179937.2018.6.2.2616

CAMPOREZ, Higor A. F. et al. Olhar Musical: Uma Proposta de Interface para Expressividade Musical Voltada a Indivíduos com Deficiência Motora. In: THE 8TH WORKSHOP ON UBIQUITOUS MUSIC, 8, 2018b, São João del Rei - MG: UFSJ, 2018. p. 76–85.

CANNAM, Chris et al. MIREX 2014: Vamp Plugins from the Centre for Digital Music. 2014.

CHOI, Young Mi; SPRIGLE, Stephen H. Approaches for Evaluating the Usability of Assistive Technology Product Prototypes. Assistive Technology, v. 23, n. 1, p. 36–41, 2011. DOI: https://doi.org/10.1080/10400435.2010.541407

CORREA, A. G. D. et al. GenVirtual: An Augmented Reality Musical Game for Cognitive and Motor Rehabilitation. In: VIRTUAL REHABILITATION, 2007. IEEE: 2007. p. 1–6. DOI: https://doi.org/10.1109/ICVR.2007.4362120

DAVANZO, Nicola et al. Playing Music with the Eyes through an Isomorphic Interface. In: COGAIN ’18, 2018, New York, NY, USA: Association for Computing Machinery, 2018. DOI: https://doi.org/10.1145/3206343.3206350

DAVIES, M. E. P.; PLUMBLEY, M. D. A spectral difference approach to downbeat extraction in musical audio. In: 14TH EUROPEAN SIGNAL PROCESSING CONFERENCE, 14, 2006. IEEE: 2006. p. 1–4.

DORIGO, M.; DI CARO, G. Ant colony optimization: a new meta-heuristic. In: THE 1999 CONGRESS ON EVOLUTIONARY COMPUTATION, 1999. IEEE: 1999. p. 1470–1477, vol. 2.

FRID, Emma. Accessible Digital Musical Instruments—A Review of Musical Interfaces in Inclusive Music Practice. Multimodal Technologies and Interaction, v. 3, n. 3, 2019. Available at: <https://www.mdpi.com/2414-4088/3/3/57>. DOI: https://doi.org/10.3390/mti3030057

FUTRELLE, Joe; DOWNIE, J. Stephen. Interdisciplinary Research Issues in Music Information Retrieval: ISMIR 2000-2002. Journal of New Music Research, v. 32, n. 2, p. 121–131, 2003. DOI: https://doi.org/10.1076/jnmr.32.2.121.16740

GEEM, Zong Woo. State-of-the-Art in the Structure of Harmony Search Algorithm. In: GEEM, Zong Woo (Org.). Recent Advances in Harmony Search Algorithm. Berlin, Heidelberg: Springer Berlin Heidelberg, 2010. p. 1–10. DOI: https://doi.org/10.1007/978-3-642-04317-8_1

GÓMEZ, Emilia. Tonal description of polyphonic audio for music content processing. INFORMS Journal on Computing, v. 18, n. 3, p. 294–304, 2006. DOI: https://doi.org/10.1287/ijoc.1040.0126

HORNOF, Anthony. The Prospects For Eye-Controlled Musical Performance. In: NIME, 2014, London, United Kingdom: Goldsmiths, University of London, 2014. p. 461–466. Available at: <http://www.nime.org/proceedings/2014/nime2014_562.pdf>.

HORNOF, Anthony J; SATO, Linda. EyeMusic: Making Music with the Eyes. In: NIME 2004, Singapore. Singapore: National University of Singapore, 2004. p. 185–188.

HUCKAUF, Anke; URBINA, Mario H. Gazing with pEYEs: Towards a Universal Input for Various Applications. In: ETRA ’08, 2008, New York, NY, USA: ACM, 2008. p. 51–54. DOI: https://doi.org/10.1145/1344471.1344483

KANDPAL, Devansh; KANTAN, Prithvi Ravi; SERAFIN, Stefania. A Gaze-Driven Digital Interface for Musical Expression Based on Real-time Physical Modelling Synthesis. In: 19TH SOUND AND MUSIC COMPUTING CONFERENCE, 2022, France: 2022. p. 456–463.

KELLER, D.; LAZZARINI, Victor; PIMENTA, Marcelo S. (Org.). Ubiquitous Music. Springer International Publishing, 2014. DOI: https://doi.org/10.1007/978-3-319-11152-0

KELLER, Damián. Challenges for a second decade of ubimus research: knowledge transfer in ubimus activities. Revista Música Hodie, v. 18, n. 1, p. 148–165, 2018. Available at: <https://revistas.ufg.br/musica/article/view/53578>. DOI: https://doi.org/10.5216/mh.v18i1.53578

KIM, Juno; SCHIEMER, Greg; NARUSHIMA, Terumi. Oculog: playing with eye movements. In: NIME, 2007, New York, NY, USA: ACM, 2007. p. 50–55. DOI: https://doi.org/10.1145/1279740.1279747

LARSEN, Jeppe Veirum; OVERHOLT, Dan; MOESLUND, Thomas B. The Prospects of Musical Instruments For People with Physical Disabilities. In: NIME, 2016, Brisbane, Australia. Australia: Queensland Conservatorium Griffith University, 2016. p. 327–331. Available at: <http://www.nime.org/proceedings/2016/nime2016_paper0064.pdf>.

LINDENBAUM, O. et al. Musical features extraction for audio-based search. In: IEEE 26TH CONVENTION OF ELECTRICAL AND ELECTRONICS ENGINEERS IN ISRAEL, 2010, Israel: IEEE, 2010. p. 87–91. DOI: https://doi.org/10.1109/EEEI.2010.5661916

LOURO, V. S.; IKUTA, C. Y.; NASCIMENTO, M. Música e deficiência: levantamento de adaptações para o fazer musical de pessoas com deficiência. Arquivos Brasileiros de Paralisia Cerebral, v. 1, n. 2, p. 11–17, 2005.

MAJARANTA, Päivi; BULLING, Andreas. Eye Tracking and Eye-Based Human–Computer Interaction. In: FAIRCLOUGH, Stephen H.; GILLEADE, Kiel (Org.). Advances in Physiological Computing. London: Springer London, 2014. p. 39–65. DOI: https://doi.org/10.1007/978-1-4471-6392-3_3

MCFEE, Brian et al. librosa: Audio and music signal analysis in python. In: THE 14TH PYTHON IN SCIENCE CONFERENCE, 2015. p. 18–25. DOI: https://doi.org/10.25080/Majora-7b98e3ed-003

MIRJALILI, Seyedali; MIRJALILI, Seyed Mohammad; LEWIS, Andrew. Grey Wolf Optimizer. Advances in Engineering Software, v. 69, p. 46–61, 2014. DOI: https://doi.org/10.1016/j.advengsoft.2013.12.007

PAYNE, William; PARADISO, Ann; KANE, Shaun K. Cyclops: Designing an Eye-Controlled Instrument for Accessibility and Flexible Use. In: NIME, 2020, Birmingham, United Kingdom: Birmingham City University, 2020. p. 576–580.

PISTON, Walter; DEVOTO, Mark. Harmony. 5th ed. New York: W. W. Norton & Company, 1987.

THOMPSON, William Forde. Music, Thought, and Feeling: Understanding the Psychology of Music. 2nd ed. New York, NY, US: Oxford University Press, 2015.

VALENCIA, Stephanie et al. Dueto: Accessible, Gaze-Operated Musical Expression. In: ASSETS, 2019, New York, NY, USA: Association for Computing Machinery, 2019. p. 513–515. DOI: https://doi.org/10.1145/3308561.3354603

VAMVAKOUSIS, Zacharias; RAMIREZ, Rafael. Temporal control in the EyeHarp gaze-controlled musical interface. In: NIME, 2012, Ann Arbor: University of Michigan, 2012. p. 11–16.

VAMVAKOUSIS, Zacharias; RAMIREZ, Rafael. The EyeHarp: A Gaze-Controlled Digital Musical Instrument. Frontiers in Psychology, v. 7, 2016. Available at: <https://www.frontiersin.org/articles/10.3389/fpsyg.2016.00906>. DOI: https://doi.org/10.3389/fpsyg.2016.00906

VENTER, Gerhard. Review of Optimization Techniques. In: Encyclopedia of Aerospace Engineering. John Wiley & Sons, Ltd, 2010. Available at: <https://onlinelibrary.wiley.com/doi/abs/10.1002/9780470686652.eae495>. DOI: https://doi.org/10.1002/9780470686652.eae495

VICKERS, Stephen; ISTANCE, Howell; SMALLEY, Matthew. EyeGuitar: Making Rhythm Based Music Video Games Accessible Using Only Eye Movements. In: THE 7TH INTERNATIONAL CONFERENCE ON ADVANCES IN COMPUTER ENTERTAINMENT TECHNOLOGY, 7, 2010. ACM: 2010. p. 36–39. DOI: https://doi.org/10.1145/1971630.1971641

WHO (WORLD HEALTH ORGANIZATION). Relatório mundial sobre a deficiência, 2012. Available at: <http://apps.who.int/iris/bitstream/handle/10665/70670/WHO_NMH_VIP_11.01_por.pdf?sequence=9>. Accessed: 1 June 2018.

YOO, Jae-Chern; HAN, Tae Hee. Fast normalized cross-correlation. Circuits, Systems and Signal Processing, v. 28, n. 6, p. 819–843, 2009. DOI: https://doi.org/10.1007/s00034-009-9130-7

Published

02.05.2023

How to Cite

Camporez, H., de Freitas, Y. M., Silva, J. A. L., Costalonga, L. L., & Rocha, H. R. O. (2023). Music Segmentation and Similarity Estimation Applied to a Gaze-Controlled Musical Interface. Revista Vórtex, 11(1), 1–25. https://doi.org/10.33871/23179937.2023.11.1.7068

Issue

Section

Dossier “Ubimus, Gastrossônica e Bem-estar”