Hamilton et al., 2024 - Google Patents
The Billboard Melodic Music Dataset (BiMMuDa)

- Document ID: 38431155088492580
- Authors: Hamilton M; Clemente A; Hall E; Pearce M
- Publication year: 2024
- Publication venue: Transactions of the International Society for Music Information Retrieval

Snippet
We introduce the Billboard Melodic Music Dataset (BiMMuDa), which contains the lead vocal melodies of the top five songs of each year from 1950 to 2022 according to the Billboard year-end singles charts. In this article, the dataset's compilation process and …
Classifications
- G—PHYSICS
  - G10—MUSICAL INSTRUMENTS; ACOUSTICS
    - G10H—ELECTROPHONIC MUSICAL INSTRUMENTS
      - G10H1/00—Details of electrophonic musical instruments
        - G10H1/0008—Associated control or indicating means
        - G10H1/0033—Recording/reproducing or transmission of music for electrophonic musical instruments
          - G10H1/0041—Recording/reproducing or transmission of music for electrophonic musical instruments in coded form
          - G10H1/0058—Transmission between separate instruments or between individual components of a musical system
        - G10H1/36—Accompaniment arrangements
          - G10H1/38—Chord
          - G10H1/40—Rhythm
      - G10H7/00—Instruments in which the tones are synthesised from a data store, e.g. computer organs
        - G10H7/08—Instruments in which the tones are synthesised by calculating functions or polynomial approximations to evaluate amplitudes at successive sample points of a tone waveform
          - G10H7/10—Instruments in which the tones are synthesised using coefficients or parameters stored in a memory, e.g. Fourier coefficients
      - G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
        - G10H2210/031—Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal
        - G10H2210/101—Music composition or musical creation; tools or processes therefor
          - G10H2210/151—Music composition using templates, i.e. incomplete musical sections, as a basis for composing
        - G10H2210/395—Special musical scales, i.e. other than the 12-interval equally tempered scale; special input devices therefor
          - G10H2210/525—Diatonic scales, e.g. aeolian, ionian or major, dorian, locrian, lydian, mixolydian, phrygian, i.e. seven-note, octave-repeating musical scales comprising five whole steps and two half steps per octave, in which the two half steps are separated from each other by either two or three whole steps
      - G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
        - G10H2240/075—Musical metadata derived from musical analysis or for use in electrophonic musical instruments
        - G10H2240/121—Musical libraries, i.e. musical databases indexed by musical parameters, wavetables, indexing schemes using musical parameters, musical rule bases or knowledge bases, e.g. for automatic composing methods
          - G10H2240/131—Library retrieval, i.e. searching a database or selecting a specific musical piece, segment, pattern, rule or parameter set
            - G10H2240/141—Library retrieval matching, i.e. any of the steps of matching an inputted segment or phrase with musical database contents, e.g. query by humming, singing or playing; the steps may include, e.g., musical analysis of the input, musical feature extraction, query formulation, or details of the retrieval process
        - G10H2240/171—Transmission of musical instrument data, control or status information; transmission, remote access or control of music data for electrophonic musical instruments
          - G10H2240/281—Protocol or standard connector for transmission of analog or digital data to or from an electrophonic musical instrument
            - G10H2240/295—Packet switched network, e.g. token ring
              - G10H2240/305—Internet or TCP/IP protocol use for any electrophonic musical instrument data or musical parameter transmission purposes
      - G10H2250/00—Aspects of algorithms or signal processing methods without intrinsic musical character, yet specifically adapted for or used in electrophonic musical processing
        - G10H2250/131—Mathematical functions for musical analysis, processing, synthesis or composition
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
Similar Documents

Publication | Title
---|---
Ji et al. | A comprehensive survey on deep music generation: Multi-level representations, algorithms, evaluations, and future directions
Cancino-Chacón et al. | Computational models of expressive music performance: A comprehensive and critical review
Sturm et al. | Music transcription modelling and composition using deep learning
Juslin | Cue utilization in communication of emotion in music performance: Relating performance to perception
Pfleiderer | Inside the Jazzomat: New perspectives for jazz research
Condit-Schultz | MCFlow: A digital corpus of rap transcriptions
Lerch et al. | An interdisciplinary review of music performance analysis
Keller et al. | Cognitive and affective judgements of syncopated musical themes
Hung et al. | Musical composition style transfer via disentangled timbre representations
Klapuri | Introduction to music transcription
De Haas et al. | A geometrical distance measure for determining the similarity of musical harmony
Chai | Automated analysis of musical structure
Dai et al. | Personalised popular music generation using imitation and structure
Dai et al. | What is missing in deep music generation? A study of repetition and structure in popular music
Cope | Recombinant music: Using the computer to explore musical style
Chew | Notating disfluencies and temporal deviations in music and arrhythmia
Heydarian | Automatic recognition of Persian musical modes in audio musical signals
Nikzat et al. | KDC: An open corpus for computational research of dastgāhi music
Smith et al. | Listening as a creative act: Meaningful differences in structural annotations of improvised performances
Kroher et al. | Computational ethnomusicology: A study of flamenco and Arab-Andalusian vocal music
Juchniewicz et al. | The influences of progression type and distortion on the perception of terminal power chords
Hamilton et al. | The Billboard Melodic Music Dataset (BiMMuDa)
Arthur et al. | The Coordinated Corpus of Popular Musics (CoCoPops): A meta-corpus of melodic and harmonic transcriptions
Olthof et al. | The role of absolute pitch memory in the oral transmission of folksongs
Köküer et al. | Curating and annotating a collection of traditional Irish flute recordings to facilitate stylistic analysis