
ImprovNet: Generating Controllable Musical Improvisations with Iterative Corruption Refinement

Authors:
Bhandari, Keshav
Chang, Sungkyun
Lu, Tongyu
Enus, Fareza R.
Bradshaw, Louis B.
Herremans, Dorien
Colton, Simon
Publication Year:
2025

Abstract

Deep learning has enabled remarkable advances in style transfer across various domains, offering new possibilities for creative content generation. However, in the realm of symbolic music, generating controllable and expressive performance-level style transfers for complete musical works remains challenging due to limited datasets, especially for genres such as jazz, and the lack of unified models that can handle multiple music generation tasks. This paper presents ImprovNet, a transformer-based architecture that generates expressive and controllable musical improvisations through a self-supervised corruption-refinement training strategy. ImprovNet unifies multiple capabilities within a single model: it can perform cross-genre and intra-genre improvisations, harmonize melodies with genre-specific styles, and execute short prompt continuation and infilling tasks. The model's iterative generation framework allows users to control the degree of style transfer and structural similarity to the original composition. Objective and subjective evaluations demonstrate ImprovNet's effectiveness in generating musically coherent improvisations while maintaining structural relationships with the original pieces. The model outperforms Anticipatory Music Transformer in short continuation and infilling tasks and successfully achieves recognizable genre conversion, with 79% of participants correctly identifying jazz-style improvisations. Our code and demo page can be found at https://github.com/keshavbhandari/improvnet.

Comment: 10 pages, 6 figures
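The iterative corruption-refinement loop the abstract describes can be sketched at a high level. This is a minimal illustration, not ImprovNet's actual implementation: the function names (`corrupt`, `refine`, `iterative_improvise`), the masking-style corruption, and the stand-in "model" that fills masks from a target-style vocabulary are all assumptions made for clarity; the paper's corruption functions and trained refinement model are far richer.

```python
import random

MASK = "<MASK>"

def corrupt(tokens, rate, rng):
    # Assumption: masking-style corruption; the paper may use
    # other corruption functions (pitch shifts, deletions, etc.).
    return [MASK if rng.random() < rate else t for t in tokens]

def refine(tokens, style_vocab, rng):
    # Stand-in for the trained refinement model: fill each masked
    # position with a token drawn from the target style's vocabulary.
    return [rng.choice(style_vocab) if t == MASK else t for t in tokens]

def iterative_improvise(tokens, style_vocab, iterations, corruption_rate, seed=0):
    # More iterations (or a higher corruption rate) push the output
    # further from the original, trading structural similarity for
    # a stronger degree of style transfer, as the abstract describes.
    rng = random.Random(seed)
    out = list(tokens)
    for _ in range(iterations):
        out = refine(corrupt(out, corruption_rate, rng), style_vocab, rng)
    return out
```

With `iterations=0` the original piece is returned unchanged; each additional round replaces another corrupted fraction of the sequence, giving the user a simple dial between fidelity and stylistic transformation.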

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2502.04522
Document Type:
Working Paper