WO2007066818A1 - Music editing device and music editing method - Google Patents
Music editing device and music editing method
- Publication number
- WO2007066818A1 (PCT/JP2006/324889, JP2006324889W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- music
- beat
- section
- song
- remix
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims description 40
- 230000001360 synchronised effect Effects 0.000 claims abstract description 74
- 238000002156 mixing Methods 0.000 claims abstract description 12
- 238000012937 correction Methods 0.000 claims abstract description 11
- 238000012545 processing Methods 0.000 claims description 77
- 230000005236 sound signal Effects 0.000 claims description 19
- 230000008569 process Effects 0.000 claims description 13
- 238000004891 communication Methods 0.000 claims description 9
- 230000010354 integration Effects 0.000 claims description 5
- 239000000203 mixture Substances 0.000 abstract description 19
- 238000010586 diagram Methods 0.000 description 39
- 230000033764 rhythmic process Effects 0.000 description 30
- 230000000694 effects Effects 0.000 description 24
- 239000000463 material Substances 0.000 description 21
- 230000006870 function Effects 0.000 description 11
- 239000011295 pitch Substances 0.000 description 10
- 230000008859 change Effects 0.000 description 7
- 238000005516 engineering process Methods 0.000 description 7
- 230000007274 generation of a signal involved in cell-cell signaling Effects 0.000 description 7
- 230000007704 transition Effects 0.000 description 5
- 238000003825 pressing Methods 0.000 description 4
- 230000004044 response Effects 0.000 description 3
- 230000001133 acceleration Effects 0.000 description 2
- 230000007246 mechanism Effects 0.000 description 2
- 230000010355 oscillation Effects 0.000 description 2
- 238000006243 chemical reaction Methods 0.000 description 1
- 230000001186 cumulative effect Effects 0.000 description 1
- 230000003247 decreasing effect Effects 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 230000018109 developmental process Effects 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 238000002360 preparation method Methods 0.000 description 1
- 238000011069 regeneration method Methods 0.000 description 1
- 230000001020 rhythmical effect Effects 0.000 description 1
- 230000035807 sensation Effects 0.000 description 1
- 230000001953 sensory effect Effects 0.000 description 1
- 238000010183 spectrum analysis Methods 0.000 description 1
- 230000001052 transient effect Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H1/00—Details of electrophonic musical instruments
- G10H1/36—Accompaniment arrangements
- G10H1/40—Rhythm
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/02—Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H1/00—Details of electrophonic musical instruments
- G10H1/0008—Associated control or indicating means
- G10H1/0025—Automatic or semi-automatic music composition, e.g. producing random music, applying rules from music theory or modifying a musical piece
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L19/00—Speech or audio signals analysis-synthesis techniques for redundancy reduction, e.g. in vocoders; Coding or decoding of speech or audio signals, using source filter models or psychoacoustic analysis
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B20/00—Signal processing not specific to the method of recording or reproducing; Circuits therefor
- G11B20/10—Digital recording or reproducing
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
- G10H2210/031—Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal
- G10H2210/076—Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal for extraction of timing, tempo; Beat detection
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
- G10H2210/101—Music Composition or musical creation; Tools or processes therefor
- G10H2210/105—Composing aid, e.g. for supporting creation, edition or modification of a piece of music
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
- G10H2210/101—Music Composition or musical creation; Tools or processes therefor
- G10H2210/125—Medley, i.e. linking parts of different musical pieces in one single piece, e.g. sound collage, DJ mix
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
- G10H2210/101—Music Composition or musical creation; Tools or processes therefor
- G10H2210/131—Morphing, i.e. transformation of a musical piece into a new different one, e.g. remix
- G10H2210/136—Morphing interpolation, i.e. interpolating in pitch, harmony or time, tempo or rhythm, between two different musical pieces, e.g. to produce a new musical work
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
- G10H2210/375—Tempo or beat alterations; Music timing control
- G10H2210/391—Automatic tempo adjustment, correction or control
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/011—Files or data streams containing coded musical information, e.g. for transmission
- G10H2240/016—File editing, i.e. modifying musical data files or streams as such
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/011—Files or data streams containing coded musical information, e.g. for transmission
- G10H2240/046—File format, i.e. specific or non-standard musical file format used in or adapted for electrophonic musical instruments, e.g. in wavetables
- G10H2240/061—MP3, i.e. MPEG-1 or MPEG-2 Audio Layer III, lossy audio compression
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/011—Files or data streams containing coded musical information, e.g. for transmission
- G10H2240/046—File format, i.e. specific or non-standard musical file format used in or adapted for electrophonic musical instruments, e.g. in wavetables
- G10H2240/066—MPEG audio-visual compression file formats, e.g. MPEG-4 for coding of audio-visual objects
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/091—Info, i.e. juxtaposition of unrelated auxiliary information or commercial messages with or between music files
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/121—Musical libraries, i.e. musical databases indexed by musical parameters, wavetables, indexing schemes using musical parameters, musical rule bases or knowledge bases, e.g. for automatic composing methods
- G10H2240/131—Library retrieval, i.e. searching a database or selecting a specific musical piece, segment, pattern, rule or parameter set
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/171—Transmission of musical instrument data, control or status information; Transmission, remote access or control of music data for electrophonic musical instruments
- G10H2240/281—Protocol or standard connector for transmission of analog or digital data to or from an electrophonic musical instrument
- G10H2240/295—Packet switched network, e.g. token ring
- G10H2240/305—Internet or TCP/IP protocol use for any electrophonic musical instrument data or musical parameter transmission purposes
Definitions
- The present invention relates, for example, to a music editing apparatus and music editing method that generate new music (content) by combining song materials obtained by subdividing songs into song parts, bars and the like.
- With memory-type audio players it has become routine for users to carry large numbers of songs. With an audio player that can hold and play a large number of songs continuously, the user is no longer limited to listening in original CD-album order: the style of listening across albums, reconstructing only favorite songs into playlists, has become widespread. It can be said that users have gained the freedom to listen to the songs they want, in the order they want, without following the track order of an album.
- Fig. 32 shows conventional playback using song A and song B. Each song is played back at its own original tempo, and of course there is a silent gap between the two songs.
- Japanese Patent Application Laid-Open No. 2003-44046 discloses a technique for interactively editing and enjoying music by combining multiple sound materials, for example via keyboard and mouse input. Chips such as sound patterns and one-shot phrases, and effects applicable to them, are assigned to the keys. When the user presses a key, the chip corresponding to it is played. Pressing several keys at the same time plays the corresponding chips together, and pressing keys in sequence combines chips to compose a piece of music.
- As in the job category of a DJ using turntables and records, a technique exists in which the timing and tempo of songs are controlled so that multiple songs are played back in sequence as if they formed one continuous piece.
- This method is generally called remixing.
- By connecting songs in this way, the listener is given a new sense and excitement that goes beyond simply listening to music song by song. In recent years, this DJ remix technique has therefore become established as a new way of enjoying music.
- A DJ can start playing back the next song with its beats and bars synchronized, matching the beats and bars of the song currently being played.
- A machine cannot recognize beats and bars, so it cannot do this automatically.
- A DJ can start playback seamlessly while adjusting the tempo of the next song to the tempo of the currently playing song. A machine cannot recognize tempo, so it cannot do this automatically.
- A DJ can start playback seamlessly while aligning the bars of the next song with those of the song currently being played. A machine cannot do this automatically, because it cannot recognize bars.
- A DJ can start playback seamlessly while matching the melody of the next song to the melody of the song currently being played. A machine cannot do this automatically, because it cannot recognize melodies.
- It is therefore an object of the present invention to provide a music editing apparatus that realizes such automatic DJ mixing by loading the necessary information as metadata and using it automatically.
- To solve the above problems, the music editing apparatus according to the present invention comprises: a remix processing unit that performs remix processing based on metadata, generated in advance for the music data and containing at least beat positions on the time axis, and on a mix pattern file; a synchronous playback control unit that generates a master beat, sends this master beat to the remix processing unit, and plays back multiple songs in accordance with the remix processing instructed by the remix processing unit; and a mixing unit that mixes the songs played back by the synchronous playback control unit. It further comprises a phase comparator that compares the phase of a beat sync signal generated from the metadata with the phase of the master beat, an integrating circuit that integrates the output of the phase comparator, and a correction unit that corrects the tempo based on the output of the integrating circuit.
- The music editing method according to the present invention comprises: a remix processing step of performing remix processing based on metadata, generated in advance for the music data and containing at least beat positions on the time axis, and on a mix pattern file; a synchronous playback control step of generating a master beat, sending this master beat to the remix processing step, and playing back multiple songs in accordance with the remix processing; and a mixing step of mixing the songs played back in the synchronous playback control step.
- It further comprises a step of comparing the phase of a beat sync signal generated from the supplied metadata with the phase of the master beat, a step of integrating the output of the comparing step, and a step of correcting the tempo based on the output of the integrating step.
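The phase comparator, integrating circuit and tempo correction in the claims above together behave like a phase-locked loop that keeps a track locked to the master beat. The Python sketch below illustrates that control idea only; the class name, gain constants and phase convention are hypothetical, not taken from the patent.

```python
class TempoCorrector:
    """Keeps a track's playback tempo locked to the master beat (PLL-style sketch)."""

    def __init__(self, master_tempo_bpm, kp=0.5, ki=0.1):
        self.tempo = master_tempo_bpm   # current corrected tempo (BPM)
        self.integral = 0.0             # state of the integrating circuit
        self.kp = kp                    # proportional gain (hypothetical)
        self.ki = ki                    # integral gain (hypothetical)

    def update(self, beat_phase, master_phase):
        # Phase comparator: error between the beat sync signal derived from
        # the metadata and the master beat, wrapped to [-0.5, 0.5) in units
        # of one beat period.
        error = (master_phase - beat_phase + 0.5) % 1.0 - 0.5
        # Integrating circuit: accumulate the error over time.
        self.integral += error
        # Correction: nudge the tempo so the two phases converge.
        self.tempo += self.kp * error + self.ki * self.integral
        return self.tempo
```

If the track lags behind the master beat, the corrected tempo rises until the phases realign; when the phases match, the tempo is left unchanged.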
- According to the present invention, DJ-style playback of music can be performed in real time and dynamically, so the music editing apparatus can offer a new way of enjoying music.
- It also becomes easier to synchronize music beats with other media, which makes new forms of entertainment possible.
- Fig. 1 is a block diagram of a music editing apparatus to which the apparatus and method according to the invention are applied.
- Fig. 2 is a block diagram showing the detailed structure of the synchronous playback unit and the audio mixing unit.
- Fig. 3 is a functional block diagram of the music editing apparatus.
- Fig. 4 is a flowchart showing the order of the music editing program executed by the music editing apparatus.
- Fig. 5 shows metadata on the time axis.
- Fig. 6 shows a specific example of time-axis metadata.
- Fig. 7 shows another specific example of time-axis metadata.
- Figs. 8A, 8B and 8C show methods of storing the metadata.
- Fig. 9 shows a specific example of a mix pattern file.
- Fig. 10 is for explaining beat-synchronized music playback according to the invention.
- Fig. 11 is for explaining continuous connection with the tempos of all songs aligned.
- Fig. 12 is for explaining connection by crossfade.
- Fig. 13 is for explaining connection by cut-in.
- Fig. 14 is for explaining connection using sound effects.
- Fig. 15 is for explaining simultaneous playback.
- Fig. 16 is for explaining the application of effects.
- Fig. 17 is for explaining partial playback.
- Fig. 18 shows the structure of a synchronous playback unit that enables editing and playback in which an SE is sandwiched between song A and song B, as in the connection shown in Fig. 10.
- Fig. 19 is for explaining the functions of the synchronous playback control unit and the remix processing unit, which are the main parts of the present invention.
- Figs. 20 and 21 are for explaining beat-synchronized playback.
- Fig. 22 is a detailed timing chart of beat-synchronized playback, for a piece of music whose tempo varies.
- Figs. 23A, 23B and 23C show the problems caused by tempo fluctuation.
- Fig. 24 shows the time-axis metadata used in Fig. 25, and Fig. 25 shows a system that plays back in synchronization with the constantly fluctuating beats of music.
- Fig. 27 shows network communication.
- Figs. 28 and 29 are block diagrams of music editing apparatuses having a network communication unit.
- Fig. 30 is a block diagram of a music editing apparatus having a sensor.
- Figs. 31A and 31B are flowcharts showing the processing order of a music editing apparatus having a sensor.
- Fig. 32 is a diagram for explaining conventional playback.
- FIG. 1 is a block diagram of a music editing apparatus 1 to which the apparatus and method according to the present invention are applied.
- The apparatus handles song data (audio signals) provided through media such as a hard disk (HD), compact disc (CD), mini disc (MD) or digital versatile disc (DVD), or through a network such as the Internet.
- Using the metadata described later, the apparatus performs automatic DJ-style playback that, for example, overlays the beats of song A and song B, which originally have different tempos, and plays them back seamlessly.
- The music editing apparatus 1 has a central processing unit (CPU) 2, to which a storage unit 4 and a synchronous playback unit 8 are connected via a bus 3. A ROM 13, a RAM 14, a user operation I/F unit 15 and a user interface (U/I) display unit 16 are also connected to the CPU 2 through the bus 3.
- The CPU 2 decides in real time how to connect songs, and gives the synchronous playback unit 8 the required song materials at the required timing. It also indicates the tempo and beat synchronization to the synchronous playback unit 8 according to the user's operation.
- The storage unit 4 comprises a music storage unit 5, a metadata storage unit 6 and a mix pattern storage unit 7. The music storage unit 5 is storage that holds multiple items of music data; it may be flash memory or a hard disk, and an external storage device such as a portable device may also be connected. The music data stored in the music storage unit 5 may be compressed or uncompressed.
- The metadata storage unit 6 is storage such as flash memory or a hard disk, and stores the time-axis metadata added to the songs. The metadata is auxiliary data added to a song along its time axis: it describes not only the tempo but also beat information, bar information (simply, the position of the head of each measure), and melody information such as the intro and the chorus.
- Like the music storage unit 5, the mix pattern storage unit 7 need not be any particular kind of storage. It holds mix pattern files that give instructions on how a mix pattern is determined. As described in detail later, a mix pattern file describes not only the order of songs but also how they are combined, and which parts of song A and song B are to be used where.
- The synchronous playback unit 8 is a block for playing music automatically; it plays back the song materials indicated by the remix control function of the CPU 2 in synchronization with a reference beat.
- It comprises a synchronous playback control unit 9, an audio mixing unit 10, a digital-to-analog converter (D/A) 11 and an audio output unit 12.
- The synchronous playback control unit 9 has multiple audio signal generators and plays back multiple audio signals in synchronization with a clock signal generated inside itself.
- The audio mixing unit 10 combines the audio signals reproduced by those generators and outputs the result. The D/A 11 converts the digital signal reproduced by the audio mixing unit 10 into an analog signal, and the audio output unit 12 amplifies the analog signal from the D/A 11 and outputs it to a speaker or headphones.
- The ROM 13 stores a music editing program following the order of the music editing method according to the present invention, together with default data. The RAM 14 serves as the work area when the CPU 2 executes the program, and also stores data used while the program runs.
- The user operation I/F unit 15 is, for example, a keyboard, mouse or touchpad that accepts user operations. The U/I display unit 16 is a display section, for example a liquid-crystal display or touch panel, that shows the current operating state and the music editing state and accepts user operations.
- Fig. 2 is a block diagram showing the detailed structure of the synchronous playback control unit 9 and the audio mixing unit 10.
- The synchronous playback control unit 9 consists of a master beat generator 90 and three audio signal generation tracks.
- The master beat generator 90 generates a clock equivalent to a beat. Specifically, it outputs the tempo at the time of the remix and a beat signal synchronized with that tempo.
- In accordance with the specified time signature (4/4, 3/4 and so on), the master beat generator 90 generates and outputs a bar-head signal and the other usual beat signals.
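As an illustration of what such a generator outputs, the sketch below enumerates beat events for a given tempo and time signature, flagging the beat at the head of each bar. It is a hypothetical model for this description, not the patent's implementation.

```python
def master_beat(tempo_bpm, beats_per_bar=4, n_beats=8):
    """Yield (time_in_seconds, bar, beat_in_bar, is_bar_head) for each beat."""
    beat_period = 60.0 / tempo_bpm        # seconds per beat at this tempo
    for i in range(n_beats):
        bar = i // beats_per_bar + 1      # bars counted from 1
        beat = i % beats_per_bar + 1      # beats counted from 1
        # The bar-head signal corresponds to beat == 1; the rest are
        # the usual beat signals.
        yield (i * beat_period, bar, beat, beat == 1)
```

At 120 BPM in 4/4, beats arrive every 0.5 s and every fourth beat starts a new bar.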
- Each track plays back its song in accordance with the sync signal (clock, or beat) generated by the master beat generator 90, aligning the beat positions of the song with the positions of the master beat.
- Each track has a decoder 91a, 91b, 91c and a time stretch unit 92a, 92b, 92c. The decoders decode compressed audio such as MP3 and output it as PCM data.
- The time stretch units 92a, 92b, 92c convert the playback rate while keeping the pitch constant. Based on the metadata from the metadata storage unit 6, they match materials with different tempos to the tempo of the reference beat: from the ratio between a song's original tempo and the master beat tempo, they change the playback rate in real time. This allows the song's original tempo to follow the tempo of the master beat; of course, as described, the pitch is not changed.
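The rate conversion just described reduces to a simple ratio. In this sketch (an illustration with a hypothetical function name), a song whose metadata gives an original tempo of 100 BPM must be played 1.2 times faster to follow a 120 BPM master beat, while the pitch-preserving time-stretch algorithm itself (not shown) keeps the key unchanged.

```python
def playback_rate(original_tempo_bpm, master_tempo_bpm):
    """Factor by which playback is sped up (>1) or slowed down (<1)
    so that the song's beats land on the master beat."""
    return master_tempo_bpm / original_tempo_bpm
```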
- A pitch shift function may also be included in these audio signal generators. A pitch shifter changes the pitch while keeping the playback rate constant. It is used to blend, musically and comfortably, materials that have different keys or pitches; it is not a required function but an additional one.
- The audio mixing unit 10 has three effect processing units and three volume adjustment units corresponding to the three tracks above. After their outputs are combined by a mixer 102, the signal is amplified by an audio output unit 103 and output to an external speaker or headphones. For the signal from each track, both effect processing and volume adjustment can be applied independently.
- Fig. 3 is a functional block diagram of the music editing apparatus.
- The function of the CPU 2, shown as hardware in Fig. 1, appears in the figure as a remix processing unit 20.
- The remix processing unit 20 is further divided into a metadata processing unit 21 and a mix pattern reading unit 22.
- In the remix processing unit 20, the metadata processing unit 21 processes the metadata stored in the metadata storage unit 6. As described above, time-axis metadata is added to each song; it holds not only the tempo but also beat information, bar information and melody information such as intro and chorus positions.
- The metadata processing unit 21 reads the time-axis metadata corresponding to a song and examines the song's position information according to the instructions of the mix pattern information read by the mix pattern reading unit 22. For example, by knowing where the current beat is and where the beats of the songs to be combined are, it decides at which points and in what way the multiple songs and sound effects are to be played back.
- In the remix processing unit 20, the mix pattern reading unit 22 reads the mix pattern file 7a stored in the mix pattern storage unit 7.
- The mix pattern file 7a is a file that specifies, for example, whether to fade or to cut in, and what SE to apply.
- The mix pattern may be a data sequence directed by the hand of the user or a third party (instructing that the songs be connected this way), or it may be created by an automatic generation algorithm; anything, such as a machine-determined remix, can be used.
- In the synchronous playback control unit 9, the master beat generator 90 generates a master beat and sends it to the remix processing unit 20. The remix processing unit 20 issues playback instructions based on the mix pattern and the metadata, and the synchronous playback control unit 9 plays back multiple songs accordingly, using the metadata from the metadata storage unit 6.
- Fig. 4 shows the order of the music editing program that the music editing apparatus executes on the CPU 2. This program is an embodiment of the music editing method according to the present invention.
- First, the remix processing unit 20 of the CPU 2 reads and acquires the mix pattern file 7a from the mix pattern storage unit 7 with the mix pattern reading unit 22 (step S1). Next, it acquires a song, for example the first song (step S2). If there is a next song (step S3), the tempo of the master beat generator 90 of the synchronous playback control unit 9 is determined (step S4); this may be fixed at a certain value or specified by the user. Next, the connection pattern (also written in the pattern file) is acquired (step S5).
- Next, the metadata of the song is acquired (step S6); for song A, this is the metadata file of song A. Then it is judged from the mix pattern whether effect processing is necessary (step S7); if necessary (YES), the effect processing unit is enabled so that the appropriate effect is applied (step S8).
- Next, it is judged from the mix pattern whether volume fade processing is necessary (step S9). For example, this selects whether to raise or lower the volume when song A and song B are edited to overlap. If necessary, the fade parameters are set (step S10); fading is written on the premise that the volume can be raised or lowered dynamically, and the parameters are set accordingly.
- Next, the original tempo of the song is set in the master beat generator 90 of the synchronous playback control unit 9 (step S11).
- The original tempo of the song is attached to its metadata.
- Then a free audio signal generator in the synchronous playback control unit 9 is acquired (step S12).
- In the specific example above there are three channels; a free one is acquired and set to play the song (step S13).
- After the current playback position of the song is acquired (step S14), it is determined whether the point for preparing the next song has been reached (step S15). For example, if the connection is a crossfade with an SE, the crossfade ends just before the target bar, and if it is a cut-in, playback starts immediately, so preparation can be made some bars in advance; of course, all of this happens simultaneously with playback. If the preparation point has not been reached, the process returns to step S14 and waits; if the point for the next song has been reached, the process returns to step S2.
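The loop of Fig. 4 can be condensed into software form as follows: read the mix pattern, then for each entry fetch the song's metadata and derive the playback rate against the master tempo. All helper names and the data layout here are hypothetical stand-ins for steps S1 to S11, not the patent's actual program.

```python
def run_remix(mix_pattern, metadata, master_tempo=120.0):
    """Return a playback schedule of (song, connection, rate) tuples.

    mix_pattern: list of {'song': name, 'connection': kind} entries (S1, S5).
    metadata:    dict of song name -> {'tempo': original BPM} (S6).
    """
    schedule = []
    for entry in mix_pattern:                    # S2/S3: is there a next song?
        song = entry["song"]
        original = metadata[song]["tempo"]       # S6: song's metadata
        rate = master_tempo / original           # S11: align to master beat
        schedule.append((song, entry["connection"], round(rate, 3)))
    return schedule
```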
- Fig. 5 is a diagram showing time-axis metadata 30.
- The metadata 30 is auxiliary data added to a song along the time axis; it describes not only the tempo but also beat information, bar information and melody information such as the intro. A bar/beat field 31 holds the bar count and the beat count within the bar, for example the first beat of the first bar, the fourth beat of that bar, or the first beat of the second bar; the beat at the head of a bar is distinguished from the other beats. An attribute field 32 shows what lies at each position, such as the head of a bar, a normal beat, a melody or a theme (chorus). Positions are described as sample positions: if a song is sampled at, for example, 44.1 kHz, there are 44,100 samples per second, and each metadata entry records the sample position it refers to.
- Metadata 30 such as that shown in Fig. 5 is described in a text or binary representation.
- Fig. 6 shows a specific example of time-axis metadata: beat positions 41 and 42 are displayed together with a music waveform 40.
- Beat 41 is the beat at the head of a bar, and beats 42 are the ordinary beats; with a 4/4 time signature, the bar-head beat 41 is followed by three other beats 42 in each bar, and the timing of each beat is held as the sample position corresponding to it in the music data.
- FIG. 7 is likewise a diagram showing a specific example of time-axis metadata. In addition to the beat positions 55, melody information can be attached to the audio signal 50, indicating the song's intro 51, melody sections such as A melody 52 and 53, B melody 54, and the theme (chorus). This information can be used to find the bar positions of a target song and the start positions of specific melodies.
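One plausible in-memory rendering of the time-axis metadata of Figs. 5 to 7 attaches a sample position, a bar/beat index and an optional melody label to every beat. The patent does not prescribe a format; field names and layout here are illustrative only.

```python
SAMPLE_RATE = 44100  # samples per second at 44.1 kHz

metadata = {
    "tempo": 120.0,  # original tempo taken from the song's metadata
    "beats": [
        # (sample_position, bar, beat_in_bar, melody_label)
        (0,     1, 1, "intro"),     # beat at the head of bar 1
        (22050, 1, 2, None),        # ordinary beat (0.5 s at 120 BPM)
        (44100, 1, 3, None),
        (66150, 1, 4, None),
        (88200, 2, 1, "A-melody"),  # head of bar 2
    ],
}

def beat_time_seconds(sample_position):
    """Convert a metadata sample position into seconds."""
    return sample_position / SAMPLE_RATE
```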
- Fig. 8A is an example in which metadata 71 and music data 72 are logically separate but physically present in the same media, as in MP3.
- Fig. 8B is an example in which metadata 73 coexists with music data 74, as in MPEG-4.
- Fig. 8C is an example in which the metadata 75 corresponding to music data 76 is retrieved via a network: music data and metadata are both logically and physically separate. This form applies when the music editing apparatus has a network communication unit and is connected to a network such as the Internet, like the music editing apparatuses described later.
- Fig. 9 is a diagram showing a specific example of a mix pattern file.
- While metadata is attached one-to-one to a song, a mix pattern can be made freely by the user and has no dependency on any particular song: it is a file that describes how songs are to be connected.
- In the pattern, the song ID (file) 61 may be the same as the song name; it may also be an absolute file path or file name. A field 62 indicates where to play within each song: the chorus for song A, the intro for song B, the 8th to 20th bars for song C, and so on.
- A connection pattern field 64 indicates how a song connects to the next one: song A is connected to song B with a crossfade.
- B to C is a cut-in.
- C to D is a crossfade.
- D and E are connected by simultaneous, synchronized playback.
- A connection effect field 65 specifies effects, such as reverb, low-cut or distortion, to be applied at the moment of connection.
- A connection SE field 66 specifies a sound effect. When the pattern shown in Fig. 9 is specified, the actual playback proceeds roughly as follows: the chorus of song A is played; near the end of the chorus, the intro of song B comes in with a crossfade; at the end of B's intro, playback of song C begins from its 8th bar; around the 20th bar, song D is cut in together with an SE; and when song D comes in, song E is also played back at the same time.
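Rendered as a data structure, the mix pattern of Fig. 9 could look like the sketch below. The field names follow the description above (song ID, part to play, connection pattern, connection effect, connection SE), but the concrete format, and the parts chosen for songs D and E, are assumptions for illustration.

```python
mix_pattern = [
    # 'connection' describes how this song is joined to the NEXT entry.
    {"id": "A", "part": "chorus",    "connection": "crossfade",
     "effect": "reverb",  "se": None},
    {"id": "B", "part": "intro",     "connection": "cut-in",
     "effect": None,      "se": None},
    {"id": "C", "part": "bars 8-20", "connection": "crossfade",
     "effect": "low-cut", "se": "SE1"},
    {"id": "D", "part": "chorus",    "connection": "simultaneous",
     "effect": None,      "se": None},
    {"id": "E", "part": "chorus",    "connection": None,  # last entry
     "effect": None,      "se": None},
]
```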
- As shown in FIG. 10, both song A and song B are played back in synchronization with the reference beat.
- Both songs are therefore played at the same tempo,
- and that tempo (the BPM of the reference beat) can be determined by the user.
- Song A and song B each consist of bars,
- the bars are equal in length,
- and the heads of the bars are aligned with each other.
- The sound effect (SE) likewise consists of bars, so its bar heads align as well. Because the tempos match, the bars line up: both frequency and phase agree. Songs connected in this way sound consistent and musically natural.
- FIG. 11 shows an example of playing song A, song B, and song C continuously. Regardless of the original tempo of each song, the tempos of all songs are adjusted to the master beat tempo determined by the user or by the system, and the songs are played with their beat positions (downbeats) aligned as well. This makes a seamless non-stop connection possible.
- To achieve this, the time-axis metadata of each song, described above, is used.
- At playback time, the playback rate of each song is changed according to the ratio between its original tempo and the current master beat tempo, and playback is started with the bar heads of the songs aligned.
- In this way the beat positions of multiple songs are handled accurately, and their playback positions are controlled in real time for synchronous playback. The seamless song-connection methods described here all presuppose the use of this synchronization method.
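Matching each song's tempo to the master beat amounts to scaling its playback rate by the ratio of the two tempos, while (as the text describes) a pitch-preserving time stretch keeps the key unchanged. A minimal sketch of that arithmetic:

```python
def stretch_ratio(original_bpm: float, master_bpm: float) -> float:
    """Playback-rate ratio that maps a song's own tempo onto the
    master beat tempo. A ratio above 1.0 means the song must be
    sped up; below 1.0, slowed down. (A pitch-preserving time
    stretch is assumed to keep the key unchanged.)"""
    if original_bpm <= 0 or master_bpm <= 0:
        raise ValueError("tempos must be positive")
    return master_bpm / original_bpm

# A 100 BPM song played against a 120 BPM master beat must run
# 1.2x faster; a 150 BPM song must be slowed to 0.8x.
```

Starting each song with this ratio applied, and with its bar head aligned to the reference bar head, gives the seamless connection the paragraph describes.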
- FIG. 12 shows an example in which song A and song B are overlapped with their beats aligned and connected by crossfade.
- FIG. 13 shows an example of a cut-in connection.
- FIG. 14 shows an example in which a sound effect (SE) is inserted between, or overlapped with, song A, song B, and song C. In this case too, the SE can of course be synchronized with the master beat tempo. Inserting or overlapping SEs makes it possible to connect songs whose melodic character differs greatly in a more natural way.
- FIG. 15 shows an example of simultaneous playback. Rather than connecting songs one after another, different songs are played back simultaneously and continue in parallel with their beats synchronized. Because the tempos and beats are synchronized, it can even sound as if the songs had been a single piece from the beginning. In addition, as shown in FIG. 16, an effect may be applied within a single song.
- FIG. 16 shows an example of applying an effect: a sound effect (effector) is applied to part or all of song A to change its sound quality. The effects include not only those that emphasize the bass or treble, but also those that change the timbre itself, such as reverb, delay, and distortion.
- While such an effect is applied to song A, song B can be played back simultaneously with, for example, a low-cut filter applied; by also applying the various connection methods described above at the same time, a more natural and attractive way of connecting can be realized.
- The connection methods described above need not be applied to whole songs; as shown in FIG. 17, they may be applied to a section of a song, or bar by bar. Connecting sections of songs makes a medley-like connection possible.
- As described above, according to the present invention, a remix pattern can be described, and a remix of songs can be performed automatically based on that remix pattern.
- The user may set the master beat tempo to a tempo of his or her choice, or may match the whole remix to the tempo of a main song.
- Alternatively, an acceleration sensor or the like may be used to measure the user's walking or jogging tempo, and the master beat tempo may be determined to match it.
- In this way the user can enjoy the following new experiences, which were not available before.
- These experiences can never be obtained by passively listening to music as in the past; they come from being actively involved in the music, creating a way of listening that suits oneself, and satisfying one's own desire for expression. In other words, the user can connect favorite parts of songs in a stylish way and listen to them as non-stop remix music, and can satisfy the creative urge to experiment with ways of connecting songs and to produce a polished remix of his or her own.
- FIG. 18 shows the detailed configuration of the synchronous playback unit 8, which enables song A and song B shown in FIG. 10 to be edited and played back with an SE inserted between the tracks.
- As shown in FIG. 18, the synchronous playback unit 8 comprises a synchronous playback control unit 9, which has three audio signal generation tracks, and an audio mixing unit 10.
- The three tracks are in charge of song A, song B, and the SE, respectively.
- The number of tracks varies with the number of songs to be mixed.
- Each track is equipped with a decoder, a time stretch unit 92, and a pitch shift unit 94.
- The decoder decodes compressed audio such as MP3 and outputs it as PCM data.
- Since an SE is short and small in data size, it does not necessarily need to be compressed; the decoder on the SE track is therefore omitted and PCM data is used as-is.
- The time stretch unit converts the playback speed while keeping the pitch constant. It is used to match materials having different tempos to the tempo of the reference beat.
- The pitch shift unit changes the pitch while keeping the playback speed constant. It is used to musically harmonize materials in different keys and pitches, but it is an additional function rather than an essential one.
- FIG. 19 is a diagram for explaining the functions of the synchronous playback control unit 9 and the remix processing unit 20, which are the main parts of the present invention.
- The remix processing unit 20 selects the songs to be played and instructs the synchronous playback control unit 9 to play them. In some cases it instructs playback of a whole song; in others, only part of a song. In the example of FIG. 10, it instructs the selection of song A, song B, and the SE, and their playback timings.
- The synchronous playback control unit 9 plays the music materials indicated by the remix processing unit 20 in synchronization with the reference beat, using a clock signal that it generates itself.
- This clock corresponds to the beat, and its rising edge is supplied to the remix processing unit 20 as an interrupt. By counting these interrupts, the remix processing unit 20 can know which bar and which beat it is at, counting from the head of the song. Since the count is updated at every interrupt, the remix processing unit 20 knows the timing at which song A or song B should be fed in, and issues instructions to the synchronous playback control unit 9 at that timing.
- The synchronous playback control unit 9 thus does not merely generate the audio signal; it simultaneously generates a clock equivalent to the beat,
- and the characteristic feature is that this clock is supplied to the remix processing unit 20 as an interrupt.
- The reference beat consists of a signal indicating the head of a bar and a beat signal indicating the beats within the bar. The number of beat signals per bar depends on the time signature (four for 4/4 time).
- The reference beat is supplied to the remix processing unit 20 as an interrupt signal.
- By counting this clock, the remix processing unit 20 can know the current bar and beat position, and can use it as a timing signal for feeding in music materials.
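The bar/beat bookkeeping described above is a simple counter over the beat interrupts. The following sketch assumes 4/4 time and illustrative method names; it is not the patent's implementation:

```python
class BeatCounter:
    """Counts reference-beat interrupts to derive the current bar and
    beat position. beats_per_bar follows the time signature (4 for 4/4)."""
    def __init__(self, beats_per_bar: int = 4):
        self.beats_per_bar = beats_per_bar
        self.count = 0                       # total beats seen so far

    def on_beat_interrupt(self):
        """Called on each rising edge of the beat clock."""
        self.count += 1

    @property
    def bar(self) -> int:                    # 1-based bar number
        return (self.count - 1) // self.beats_per_bar + 1

    @property
    def beat_in_bar(self) -> int:            # 1..beats_per_bar
        return (self.count - 1) % self.beats_per_bar + 1

    @property
    def at_bar_head(self) -> bool:
        """True exactly on the first beat of a bar — the moment at
        which the next music material would be fed in."""
        return self.beat_in_bar == 1
```

Checking `at_bar_head` on each interrupt is what lets the remix processing unit issue its "start this material" instruction just before the next bar begins.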
- FIG. 20 shows the behavior of synchronous playback, taking as an example synchronization to the beat at the head of a bar. Synchronization is not limited to bar heads; of course, synchronized playback at finer beats, such as ordinary quarter-note beats or sixteenth-note beats, is also possible.
- The remix processing unit 20 obtains the timing of the bar-head beat as a reference-beat interrupt signal.
- It selects the music material to be played next and, before the beat of the next bar arrives, instructs the synchronous playback control unit 9 to load that material in synchronization.
- The synchronous playback control unit 9 then starts playing the material exactly at the timing of the head of the next bar.
- Synchronous playback thus takes the form of the synchronous playback control unit 9 executing the given commands with very fine timing precision.
- In other words, clock generation and the sample-accurate start of music materials in synchronization with that clock are delegated to the synchronous playback control unit 9, while the remix processing unit 20 controls the whole at a coarser granularity: it gives instructions to the synchronous playback control unit 9 with rough timing, and the synchronous playback control unit 9 then operates at exactly the right moment.
- FIG. 21 shows a more detailed playback timing chart, indicating which materials the remix processing unit 20 feeds into the three tracks. Since the remix processing unit 20 receives the reference beat as an interrupt, it can know the timing for inserting material into each track.
- The songs SC1 and SC2 to be fed in are first held in the waiting area ((a) waiting) of the synchronous playback control unit 9. Then, at the timing of the beat of the next bar, each is loaded into the current slot ((b) current) and playback is started.
- (c) shows the bar-head clock at which each loaded item begins to play.
- The synchronous playback control unit 9 thus executes everything instructed by the remix processing unit 20 in synchronization with the reference beat.
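The waiting/current loading scheme of FIG. 21 resembles double buffering: material is staged into a waiting slot at any time, then promoted to the current slot on the bar-head interrupt. A hypothetical sketch (class and method names are illustrative):

```python
class Track:
    """One synchronized-playback track with a staged ('waiting') slot
    and an active ('current') slot, promoted at each bar head."""
    def __init__(self):
        self.waiting = None
        self.current = None
        self.log = []          # record of what started, in order

    def stage(self, material):
        """Stage the next material; may be called at any time
        before the bar head (coarse timing, per the text)."""
        self.waiting = material

    def on_bar_head(self):
        """Bar-head interrupt: promote waiting -> current and start it
        (fine, sample-accurate timing in the real device)."""
        if self.waiting is not None:
            self.current, self.waiting = self.waiting, None
            self.log.append(f"start {self.current}")
```

The design point mirrors the text: the remix processing unit only needs to call `stage()` sometime before the bar boundary; the precise start happens inside the playback side on the clock edge.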
- In the above example the bar-head beat is taken as the synchronization unit, but it is also possible to synchronize at a finer granularity, such as a single beat or a sixteenth-note beat.
- Holding the corresponding synchronization mode, the remix processing unit 20 instructs the synchronous playback control unit 9, which starts playback of the music material at the indicated timing.
- FIG. 22 is a diagram showing the tempo variation of an actual song: the horizontal axis shows beats and the vertical axis shows instantaneous BPM. This variation is part of the problem the present invention solves.
- The tempo of live music varies considerably depending on the performer's skill and the circumstances of the performance.
- In this example the average BPM is 96, but in reality it fluctuates over a range of roughly 10 BPM through the course of one song.
- The amount and pattern of this fluctuation differ from song to song and cannot simply be quantified.
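Fluctuation of the kind FIG. 22 plots can be measured directly from beat-to-beat intervals. A sketch with made-up beat times (the data values are illustrative, not from the patent):

```python
def interval_bpms(beat_times):
    """Instantaneous BPM for each consecutive pair of beat times (seconds)."""
    return [60.0 / (b - a) for a, b in zip(beat_times, beat_times[1:])]

# Made-up beat times from a slightly unsteady performance:
times = [0.0, 0.625, 1.25, 1.9, 2.5]
bpms = interval_bpms(times)          # ~96, 96, 92.3, 100 BPM
spread = max(bpms) - min(bpms)       # how far the live tempo wanders
```

Even this tiny example shows a several-BPM spread around a nominal 96 BPM, which is why a single average tempo value is not enough and per-beat time stamps are needed.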
- FIG. 23A shows a music waveform; from its spectrum (FIG. 23B), beats are extracted (FIG. 23C). The extracted beats here correspond to quarter notes.
- The recorded position of each quarter note is called a beat time stamp.
- Timeline metadata (or simply metadata) is created by recording these beat positions. That is, data in which the time stamps of the beats extracted from the raw audio signal are recorded is what is here called timeline metadata.
- FIG. 24 shows a specific example of timeline metadata.
- The timeline metadata describes the beat positions corresponding to the audio signal of a song.
- The time stamps corresponding to the bar-head and beat positions are recorded in units such as the sample count from the head of the song. Because it consists only of time stamps, its size is on the order of one several-thousandth to one several-ten-thousandth of the original PCM data, which is very small. As a concrete example, at a sampling frequency of 44.1 kHz, the 39th beat of a song carries a time stamp such as sample 33238 from the head. The synchronous playback control unit 9 generates its clock by referring to these time stamps against the audio signal.
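Since each beat is stored as a sample count, converting between time and stamp is a single multiplication at the sampling rate (44.1 kHz in the example above). A minimal sketch:

```python
SAMPLE_RATE = 44100  # Hz, as in the 44.1 kHz example above

def beat_sample(beat_time_sec: float) -> int:
    """Sample-count time stamp for a beat at the given time."""
    return round(beat_time_sec * SAMPLE_RATE)

def beat_time(sample: int) -> float:
    """Inverse: seconds from the head of the song for a given stamp."""
    return sample / SAMPLE_RATE

# A beat 2.0 s into the song is stamped as sample 88200.
```

A list of such integers, one per beat, is the whole timeline metadata — which is why it is thousands of times smaller than the PCM audio it annotates.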
- FIG. 25 shows the configuration of a synchronization signal generator 210 for solving this problem.
- The synchronization signal generator 210 (with a master beat generation unit 90, a biorhythm acquisition unit 200, and a music rhythm acquisition unit 206) generates the bar sync signal and beat sync signal that constitute the reference beat, and supplies them to the synchronous playback control unit 9.
- The music rhythm acquisition unit 206 generates, from the timeline metadata of the currently playing song and in synchronization with its playback, a beat sync signal corresponding to the beats of that song.
- The reference beat and the beat signal generated from this timeline metadata are compared in a phase comparator, and the difference is output as a phase error signal via an integration circuit 204.
- FIGS. 26A, 26B, 26C, and 26D show the actual PLL operation in order to explain the synchronization method using timeline metadata.
- The phase of the reference beat (FIG. 26A) is compared with the phase of the beat obtained from the timeline metadata (FIG. 26B). As shown in FIG. 26C, a positive output pulse train is obtained when the playback phase lags, and a negative pulse train when it leads.
- This pulse train is integrated by the integration circuit into a DC value, which is input to the time stretch unit as a tempo correction amount (FIG. 26D).
- The tempo correction value input to the time stretch unit converts the tempo: positive values raise it and negative values lower it.
- In this way the playback beat is controlled so as to stay in phase with the reference beat.
- Even songs whose original beat intervals fluctuate can thus be played back dynamically in real time so that their rhythm keeps a constant tempo.
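The loop of FIGS. 25 and 26 behaves like a software phase-locked loop: the phase error between the reference beat and the metadata-derived beat is integrated and fed back as a tempo correction. The toy model below captures only the convergence behavior; the gain value and the one-dimensional "offset" state are illustrative assumptions, not the patent's circuit:

```python
def phase_lock(initial_offset, gain=0.3, steps=20):
    """Toy phase-locking loop. The playback already runs at the master
    tempo; each beat, a fraction of the remaining phase error
    (playback minus reference, in seconds) is removed by slightly
    time-stretching the next beat. gain=0.3 is an illustrative choice;
    any gain in (0, 1) makes the error shrink geometrically."""
    offset = initial_offset
    history = [offset]
    for _ in range(steps):
        correction = gain * offset   # integrator output fed to time stretch
        offset -= correction         # next beat lands closer to the reference
        history.append(offset)
    return history
```

With `gain=0.3`, an initial 100 ms error decays by a factor of 0.7 per beat, so the playback beat converges onto the reference beat within a couple of bars, which is the "locked" behavior the figures describe.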
- FIG. 27 shows the configuration of another music editing device 80 to which an embodiment of the present invention is applied, and FIG. 28 is its functional block diagram. The music editing device 80 has a network communication unit 81 and can be connected to the Internet 82.
- Having the network communication unit 81 allows users to exchange and share remix pattern files over a network such as the Internet 82, so that an individual can download and enjoy a remix created by someone else.
- A remix pattern provided by a content service side, rather than by a private user, may also be used.
- By publishing ways of connecting songs on the Internet and the like, sharing them with other people, and having several people jointly create and evaluate remixes, a new form of communication centered on music becomes possible.
- FIGS. 29 and 30 further show the hardware configuration and functional block diagram of yet another music editing device.
- This device has a configuration in which a sensor value is acquired via an A/D converter, and is functionally equipped with a sensor value acquisition unit.
- In the sensor value acquisition unit, for example,
- the technique of detecting the walking tempo using an acceleration sensor and changing the tempo of the music in accordance with that tempo is applied.
- As described in Japanese Patent Application No. 2005-363094, the invention of selecting music in accordance with the tempo of walking or jogging may also be applied.
- To apply these techniques a sensor is indispensable; by applying the sensors and algorithms of those inventions to this system, songs can be selected according to the user's state and then remixed and played non-stop.
- FIGS. 31A and 31B are flowcharts of a music editing device having a sensor priority mode.
- In addition to the remix pattern priority mode, this processing has a sensor priority mode, and the processing changes depending on which the user selects.
- In the sensor priority mode, the sensor detects a pattern such as walking or jogging, and the beat (tempo) is changed in accordance with it.
- First, it is determined in step S31 whether the sensor priority mode or the pattern priority mode is selected.
- The sensor priority mode is the technique, described above, of selecting songs according to the user's jogging or walking; the order of the songs and the song selection itself are determined by the sensor, so there is no guarantee that they follow a fixed pattern. It can be said that the songs are suggested dynamically.
- That is, in the sensor priority mode the pattern is created dynamically from the sensor values rather than read from a prepared pattern file. When the pattern priority mode is selected instead, the processing is the same as in FIG. 4 described above.
- The case where the sensor priority mode is selected in steps S31 and S32 will now be described. This is the case where songs are selected automatically in accordance with jogging and are automatically connected accordingly.
- In this case, the songs and their tempo are determined from the output of the sensor (step S33).
- In step S34 it is determined whether there is a next song, and in step S35 the master beat tempo is set.
- Here the tempo is set according to the detected walking tempo. Since the way the songs are to be connected has not necessarily been predetermined, it is determined automatically (step S36). For example, in a jogging mode everything may simply be connected by crossfade, or, when the metadata of the next song indicates that it fades in, the songs may simply be overlapped.
- The processing from step S37 onward is the same as the corresponding steps of FIG. 4, so its description is omitted here.
- In this way, songs are selected according to the tempo of jogging and connected non-stop at that tempo, so the user can enjoy jogging comfortably without the tension or tempo being broken. Depending on the sensor, the remix can be matched to other rhythmic exercise (dance, etc.) besides jogging.
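In sensor priority mode, the master beat tempo follows the measured cadence, and a song close to that cadence keeps the time-stretch ratio near 1.0. A sketch of that selection logic (the function names and the song library contents are illustrative assumptions):

```python
def steps_per_minute(step_times):
    """Average cadence from accelerometer step timestamps (seconds)."""
    if len(step_times) < 2:
        raise ValueError("need at least two steps")
    span = step_times[-1] - step_times[0]
    return 60.0 * (len(step_times) - 1) / span

def pick_song(cadence_bpm, library):
    """Choose the song whose own tempo is closest to the cadence, so
    the required time-stretch ratio stays near 1.0. 'library' maps
    song id -> original BPM (contents are made up for illustration)."""
    return min(library, key=lambda s: abs(library[s] - cadence_bpm))

library = {"A": 120, "B": 96, "C": 150}
```

For example, steps detected every 0.5 s give a cadence of 120 steps per minute, so song A (120 BPM) would be selected, and the master beat tempo set to 120.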
- Using timing information, a remix can, for example, be assembled to fit exactly the 30 minutes of a commute and finish just as the destination is reached. Near the arrival time, the tempo can also be raised gradually, and a particular SE can be overlaid to create the atmosphere of arriving soon.
- In this way, ways of connecting music can be published on the Internet and the like and shared with other people, and several people can create remixes together, giving rise to a new form of communication centered on music. A user can also use the sensor to listen to music that suits his or her situation and physical state.
- The present invention is characterized by having means for tracking the beat of the rhythm of music in real time.
- By holding the beat time stamps of the timeline metadata, it has means for tracking the beat of the rhythm of the music accurately and in real time.
- It is characterized by having means for generating a bar sync signal and a beat sync signal from the beat time stamps of the timeline metadata of the music being played. It is also characterized by having means for tracking the beat of the rhythm of the music in real time from the timeline metadata, even for songs whose tempo changes or whose rhythm fluctuates partway through. It is further characterized by having a plurality of tracks for playing songs of different rhythms continuously or simultaneously.
- It is characterized in that, even when playing a plurality of songs of different tempos and rhythms continuously or simultaneously, each track has means for playing back so that the beats do not shift from one another; and in that, as the means for such beat-aligned synchronous playback, each track has a PLL circuit.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- Acoustics & Sound (AREA)
- Signal Processing (AREA)
- Theoretical Computer Science (AREA)
- Computational Linguistics (AREA)
- Health & Medical Sciences (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Human Computer Interaction (AREA)
- Electrophonic Musical Instruments (AREA)
- Management Or Editing Of Information On Record Carriers (AREA)
- Signal Processing For Digital Recording And Reproducing (AREA)
Abstract
Description
Claims
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP06834644A EP1959428A4 (en) | 2005-12-09 | 2006-12-07 | MUSICAL EDITING DEVICE AND METHOD |
JP2007549214A JP5243042B2 (ja) | 2005-12-09 | 2006-12-07 | 音楽編集装置及び音楽編集方法 |
US12/095,745 US7855334B2 (en) | 2005-12-09 | 2006-12-07 | Music edit device and music edit method |
KR1020087013805A KR101287984B1 (ko) | 2005-12-09 | 2006-12-07 | 음악 편집 장치 및 음악 편집 방법 |
CN2006800461745A CN101326569B (zh) | 2005-12-09 | 2006-12-07 | 音乐编辑设备和音乐编辑方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2005356830 | 2005-12-09 | ||
JP2005-356830 | 2005-12-09 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2007066818A1 true WO2007066818A1 (ja) | 2007-06-14 |
Family
ID=38122952
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2006/324889 WO2007066818A1 (ja) | 2005-12-09 | 2006-12-07 | 音楽編集装置及び音楽編集方法 |
Country Status (6)
Country | Link |
---|---|
US (1) | US7855334B2 (ja) |
EP (1) | EP1959428A4 (ja) |
JP (1) | JP5243042B2 (ja) |
KR (1) | KR101287984B1 (ja) |
CN (1) | CN101326569B (ja) |
WO (1) | WO2007066818A1 (ja) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010008821A (ja) * | 2008-06-27 | 2010-01-14 | Yamaha Corp | 曲編集支援装置およびプログラム |
JP2010008750A (ja) * | 2008-06-27 | 2010-01-14 | Yamaha Corp | 曲編集支援装置およびプログラム |
JP2011008136A (ja) * | 2009-06-29 | 2011-01-13 | Kddi Corp | 同期再生装置、同期再生方法および同期再生プログラム |
JP2011102818A (ja) * | 2009-11-10 | 2011-05-26 | Yamaha Corp | 曲編集支援装置およびプログラム |
WO2011151972A1 (ja) * | 2010-06-04 | 2011-12-08 | パナソニック株式会社 | 楽音再生装置及び楽音再生方法 |
WO2012007990A1 (ja) * | 2010-07-14 | 2012-01-19 | パイオニア株式会社 | 再生システム、再生方法およびそのプログラム |
JP2012022322A (ja) * | 2011-07-20 | 2012-02-02 | Pioneer Electronic Corp | 再生システム、操作部、再生方法およびそのプログラム |
WO2012077555A1 (ja) | 2010-12-07 | 2012-06-14 | 株式会社Jvcケンウッド | 曲順決定装置、曲順決定方法、および曲順決定プログラム |
US9245508B2 (en) | 2012-05-30 | 2016-01-26 | JVC Kenwood Corporation | Music piece order determination device, music piece order determination method, and music piece order determination program |
JP2020058653A (ja) * | 2018-10-11 | 2020-04-16 | 株式会社コナミアミューズメント | ゲームシステム及びゲームプログラム |
WO2020075533A1 (ja) * | 2018-10-11 | 2020-04-16 | 株式会社コナミアミューズメント | ゲームシステム、ゲームプログラム及びゲームシステムの制御方法 |
JP2021101366A (ja) * | 2017-03-30 | 2021-07-08 | グレースノート インコーポレイテッド | 音声を伴うビデオ提示の生成 |
WO2021179206A1 (zh) * | 2020-03-11 | 2021-09-16 | 努音有限公司 | 自动混音装置 |
Families Citing this family (47)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE10164686B4 (de) * | 2001-01-13 | 2007-05-31 | Native Instruments Software Synthesis Gmbh | Automatische Erkennung und Anpassung von Tempo und Phase von Musikstücken und darauf aufbauender interaktiver Musik-Abspieler |
EP1959428A4 (en) * | 2005-12-09 | 2011-08-31 | Sony Corp | MUSICAL EDITING DEVICE AND METHOD |
WO2008107994A1 (ja) * | 2007-03-08 | 2008-09-12 | Pioneer Corporation | 情報再生装置及び方法、並びにコンピュータプログラム |
JP4311466B2 (ja) * | 2007-03-28 | 2009-08-12 | ヤマハ株式会社 | 演奏装置およびその制御方法を実現するプログラム |
US7956274B2 (en) * | 2007-03-28 | 2011-06-07 | Yamaha Corporation | Performance apparatus and storage medium therefor |
US8269093B2 (en) * | 2007-08-21 | 2012-09-18 | Apple Inc. | Method for creating a beat-synchronized media mix |
JP5336522B2 (ja) * | 2008-03-10 | 2013-11-06 | フラウンホッファー−ゲゼルシャフト ツァ フェルダールング デァ アンゲヴァンテン フォアシュンク エー.ファオ | 瞬間的事象を有する音声信号の操作装置および操作方法 |
US9014831B2 (en) * | 2008-04-15 | 2015-04-21 | Cassanova Group, Llc | Server side audio file beat mixing |
US7915512B2 (en) * | 2008-10-15 | 2011-03-29 | Agere Systems, Inc. | Method and apparatus for adjusting the cadence of music on a personal audio device |
US9024166B2 (en) * | 2010-09-09 | 2015-05-05 | Harmonix Music Systems, Inc. | Preventing subtractive track separation |
US9070352B1 (en) * | 2011-10-25 | 2015-06-30 | Mixwolf LLC | System and method for mixing song data using measure groupings |
US9111519B1 (en) * | 2011-10-26 | 2015-08-18 | Mixwolf LLC | System and method for generating cuepoints for mixing song data |
KR20130115653A (ko) * | 2012-04-12 | 2013-10-22 | 주식회사 제이디사운드 | 곡에 적합한 자동 디제잉 방법 및 장치 |
US9696884B2 (en) * | 2012-04-25 | 2017-07-04 | Nokia Technologies Oy | Method and apparatus for generating personalized media streams |
WO2014003072A1 (ja) * | 2012-06-26 | 2014-01-03 | ヤマハ株式会社 | オーディオ波形データを使用する自動演奏技術 |
JP2014052469A (ja) * | 2012-09-06 | 2014-03-20 | Sony Corp | 音声処理装置、音声処理方法、及び、プログラム |
US9319445B2 (en) | 2012-10-22 | 2016-04-19 | Spotify Ab | Systems and methods for pre-fetching media content |
CN102968995B (zh) * | 2012-11-16 | 2018-10-02 | 新奥特(北京)视频技术有限公司 | 一种音频信号的混音方法及装置 |
DK2808870T3 (en) | 2013-05-30 | 2016-04-18 | Spotify Ab | Crowdsourcing remix rules for streamed music |
US9411882B2 (en) | 2013-07-22 | 2016-08-09 | Dolby Laboratories Licensing Corporation | Interactive audio content generation, delivery, playback and sharing |
US9798974B2 (en) * | 2013-09-19 | 2017-10-24 | Microsoft Technology Licensing, Llc | Recommending audio sample combinations |
US9372925B2 (en) | 2013-09-19 | 2016-06-21 | Microsoft Technology Licensing, Llc | Combining audio samples by automatically adjusting sample characteristics |
US20160071524A1 (en) * | 2014-09-09 | 2016-03-10 | Nokia Corporation | Audio Modification for Multimedia Reversal |
WO2016112519A1 (zh) * | 2015-01-15 | 2016-07-21 | 华为技术有限公司 | 一种分割音频内容的方法及装置 |
CN104778957B (zh) * | 2015-03-20 | 2018-03-02 | 广东欧珀移动通信有限公司 | 一种歌曲音频处理的方法及装置 |
CN104778219B (zh) * | 2015-03-20 | 2018-05-29 | 广东欧珀移动通信有限公司 | 一种预设效果歌曲拼接的方法及装置 |
CN104780438A (zh) * | 2015-03-20 | 2015-07-15 | 广东欧珀移动通信有限公司 | 一种视频与歌曲音频拼接的方法及装置 |
CN104778220B (zh) * | 2015-03-20 | 2019-04-05 | Oppo广东移动通信有限公司 | 一种清唱歌曲拼接的方法及装置 |
CN104778221A (zh) * | 2015-03-20 | 2015-07-15 | 广东欧珀移动通信有限公司 | 一种歌曲串烧拼接的方法及装置 |
CN104778216B (zh) * | 2015-03-20 | 2017-05-17 | 广东欧珀移动通信有限公司 | 一种预设风格歌曲处理的方法及装置 |
CN104766601A (zh) * | 2015-03-28 | 2015-07-08 | 王评 | 啦啦操音乐自动混合器 |
US9606766B2 (en) | 2015-04-28 | 2017-03-28 | International Business Machines Corporation | Creating an audio file sample based upon user preferences |
US9842577B2 (en) | 2015-05-19 | 2017-12-12 | Harmonix Music Systems, Inc. | Improvised guitar simulation |
GB2539875B (en) | 2015-06-22 | 2017-09-20 | Time Machine Capital Ltd | Music Context System, Audio Track Structure and method of Real-Time Synchronization of Musical Content |
ITUB20153140A1 (it) * | 2015-08-17 | 2017-02-17 | Marco Franciosa | Metodo e dispositivo per la gestione delle transizioni tra brani musicali |
US20170060520A1 (en) * | 2015-09-01 | 2017-03-02 | AudioCommon, Inc. | Systems and methods for dynamically editable social media |
US9773486B2 (en) | 2015-09-28 | 2017-09-26 | Harmonix Music Systems, Inc. | Vocal improvisation |
US9799314B2 (en) | 2015-09-28 | 2017-10-24 | Harmonix Music Systems, Inc. | Dynamic improvisational fill feature |
US9502017B1 (en) * | 2016-04-14 | 2016-11-22 | Adobe Systems Incorporated | Automatic audio remixing with repetition avoidance |
JP6414164B2 (ja) * | 2016-09-05 | 2018-10-31 | カシオ計算機株式会社 | 自動演奏装置、自動演奏方法、プログラムおよび電子楽器 |
GB2557970B (en) | 2016-12-20 | 2020-12-09 | Mashtraxx Ltd | Content tracking system and method |
US9880805B1 (en) | 2016-12-22 | 2018-01-30 | Brian Howard Guralnick | Workout music playback machine |
CN108806655B (zh) * | 2017-04-26 | 2022-01-07 | 微软技术许可有限责任公司 | 歌曲的自动生成 |
US10453434B1 (en) * | 2017-05-16 | 2019-10-22 | John William Byrd | System for synthesizing sounds from prototypes |
JP6683322B2 (ja) * | 2018-10-11 | 2020-04-15 | 株式会社コナミアミューズメント | ゲームシステム、ゲームプログラム、及び合成楽曲の作成方法 |
CN110517657B (zh) * | 2019-08-15 | 2024-01-16 | 上海若安文化传播有限公司 | 音乐文件的节拍配置/播放方法、系统、介质及设备 |
WO2022049732A1 (ja) * | 2020-09-04 | 2022-03-10 | ローランド株式会社 | 情報処理装置及び情報処理方法 |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS58163998A (ja) * | 1982-03-25 | 1983-09-28 | ヤマハ株式会社 | 自動演奏装置 |
JPH06342282A (ja) * | 1993-04-08 | 1994-12-13 | Sony Corp | 音楽出力装置 |
JPH07295589A (ja) * | 1994-04-22 | 1995-11-10 | Yamaha Corp | 波形処理装置 |
JP2000056780A (ja) * | 1998-08-05 | 2000-02-25 | Yamaha Corp | カラオケ装置 |
JP2001109470A (ja) * | 1999-10-13 | 2001-04-20 | Yamaha Corp | 自動演奏装置及び方法 |
JP2003044046A (ja) | 2001-07-30 | 2003-02-14 | Sony Corp | 情報処理装置及び情報処理方法、並びに記憶媒体 |
JP2003050588A (ja) * | 2001-08-06 | 2003-02-21 | Pioneer Electronic Corp | コンテンツ提供システムの管理サーバ装置、および端末装置 |
JP2003108132A (ja) * | 2001-09-28 | 2003-04-11 | Pioneer Electronic Corp | オーディオ情報再生装置及びオーディオ情報再生システム |
JP2004198759A (ja) * | 2002-12-19 | 2004-07-15 | Sony Computer Entertainment Inc | 楽音再生装置及び楽音再生プログラム |
JP2005156641A (ja) | 2003-11-20 | 2005-06-16 | Sony Corp | 再生態様制御装置及び再生態様制御方法 |
JP2007164939A (ja) | 2005-12-16 | 2007-06-28 | Sony Corp | オーディオ信号の再生機および再生方法 |
Family Cites Families (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4669024A (en) * | 1985-10-23 | 1987-05-26 | Westinghouse Electric Corp. | Multiphase frequency selective phase locked loop with multiphase sinusoidal and digital outputs |
JPS63103490A (ja) * | 1986-10-20 | 1988-05-09 | Matsushita Electric Ind Co Ltd | 音楽信号再生装置 |
US4862485A (en) * | 1987-10-14 | 1989-08-29 | National Semiconductor Corporation | Quotient phase-shift processor for digital phase-locked-loops |
US5015939A (en) * | 1990-08-10 | 1991-05-14 | Synektron Corporation | Control circuit for switched reluctance motor |
US5148113A (en) * | 1990-11-29 | 1992-09-15 | Northern Telecom Ltd. | Clock phase alignment |
FR2734967B1 (fr) * | 1995-05-31 | 1997-07-04 | Cit Alcatel | Procede de verrouillage de phase et boucle appliquant ce procede |
JP2956569B2 (ja) * | 1996-02-26 | 1999-10-04 | ヤマハ株式会社 | カラオケ装置 |
JP3861381B2 (ja) * | 1997-06-13 | 2006-12-20 | ヤマハ株式会社 | カラオケ装置 |
JP4186298B2 (ja) | 1999-03-17 | 2008-11-26 | ソニー株式会社 | リズムの同期方法及び音響装置 |
JP3066528U (ja) | 1999-08-11 | 2000-02-25 | 橘医療器株式会社 | 口腔内視鏡 |
JP4293712B2 (ja) * | 1999-10-18 | 2009-07-08 | ローランド株式会社 | オーディオ波形再生装置 |
JP3789326B2 (ja) | 2000-07-31 | 2006-06-21 | 松下電器産業株式会社 | テンポ抽出装置、テンポ抽出方法、テンポ抽出プログラム及び記録媒体 |
DE10123366C1 (de) | 2001-05-14 | 2002-08-08 | Fraunhofer Ges Forschung | Vorrichtung zum Analysieren eines Audiosignals hinsichtlich von Rhythmusinformationen |
JP3674950B2 (ja) | 2002-03-07 | 2005-07-27 | ヤマハ株式会社 | 音楽データのテンポ推定方法および装置 |
US20030205124A1 (en) * | 2002-05-01 | 2003-11-06 | Foote Jonathan T. | Method and system for retrieving and sequencing music by rhythmic similarity |
JP4243682B2 (ja) | 2002-10-24 | 2009-03-25 | 独立行政法人産業技術総合研究所 | 音楽音響データ中のサビ区間を検出する方法及び装置並びに該方法を実行するためのプログラム |
US7208672B2 (en) * | 2003-02-19 | 2007-04-24 | Noam Camiel | System and method for structuring and mixing audio tracks |
US7521623B2 (en) * | 2004-11-24 | 2009-04-21 | Apple Inc. | Music synchronization arrangement |
US7189913B2 (en) * | 2003-04-04 | 2007-03-13 | Apple Computer, Inc. | Method and apparatus for time compression and expansion of audio data with dynamic tempo change during playback |
US20040254660A1 (en) * | 2003-05-28 | 2004-12-16 | Alan Seefeldt | Method and device to process digital media streams |
JP3888353B2 (ja) * | 2004-01-07 | 2007-02-28 | ソニー株式会社 | データ編集装置及びデータ編集方法 |
US7026536B2 (en) * | 2004-03-25 | 2006-04-11 | Microsoft Corporation | Beat analysis of musical signals |
CA2560736A1 (en) * | 2004-04-09 | 2005-10-27 | Micronas Semiconductors, Inc. | Apparatus for and method of controlling a digital demodulator coupled to an equalizer |
US7592534B2 (en) * | 2004-04-19 | 2009-09-22 | Sony Computer Entertainment Inc. | Music composition reproduction device and composite device including the same |
US7068110B2 (en) * | 2004-06-28 | 2006-06-27 | Silicon Laboratories Inc. | Phase error cancellation |
US7518053B1 (en) * | 2005-09-01 | 2009-04-14 | Texas Instruments Incorporated | Beat matching for portable audio |
US20070074618A1 (en) * | 2005-10-04 | 2007-04-05 | Linda Vergo | System and method for selecting music to guide a user through an activity |
EP1959428A4 (en) * | 2005-12-09 | 2011-08-31 | Sony Corp | MUSICAL EDITING DEVICE AND METHOD |
US20080097633A1 (en) * | 2006-09-29 | 2008-04-24 | Texas Instruments Incorporated | Beat matching systems |
-
2006
- 2006-12-07 EP EP06834644A patent/EP1959428A4/en not_active Withdrawn
- 2006-12-07 WO PCT/JP2006/324889 patent/WO2007066818A1/ja active Application Filing
- 2006-12-07 KR KR1020087013805A patent/KR101287984B1/ko not_active IP Right Cessation
- 2006-12-07 JP JP2007549214A patent/JP5243042B2/ja not_active Expired - Fee Related
- 2006-12-07 CN CN2006800461745A patent/CN101326569B/zh not_active Expired - Fee Related
- 2006-12-07 US US12/095,745 patent/US7855334B2/en not_active Expired - Fee Related
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS58163998A (ja) * | 1982-03-25 | 1983-09-28 | ヤマハ株式会社 | 自動演奏装置 |
JPH06342282A (ja) * | 1993-04-08 | 1994-12-13 | Sony Corp | 音楽出力装置 |
JPH07295589A (ja) * | 1994-04-22 | 1995-11-10 | Yamaha Corp | 波形処理装置 |
JP2000056780A (ja) * | 1998-08-05 | 2000-02-25 | Yamaha Corp | カラオケ装置 |
JP2001109470A (ja) * | 1999-10-13 | 2001-04-20 | Yamaha Corp | 自動演奏装置及び方法 |
JP2003044046A (ja) | 2001-07-30 | 2003-02-14 | Sony Corp | 情報処理装置及び情報処理方法、並びに記憶媒体 |
JP2003050588A (ja) * | 2001-08-06 | 2003-02-21 | Pioneer Electronic Corp | コンテンツ提供システムの管理サーバ装置、および端末装置 |
JP2003108132A (ja) * | 2001-09-28 | 2003-04-11 | Pioneer Electronic Corp | オーディオ情報再生装置及びオーディオ情報再生システム |
JP2004198759A (ja) * | 2002-12-19 | 2004-07-15 | Sony Computer Entertainment Inc | 楽音再生装置及び楽音再生プログラム |
JP2005156641A (ja) | 2003-11-20 | 2005-06-16 | Sony Corp | 再生態様制御装置及び再生態様制御方法 |
JP2007164939A (ja) | 2005-12-16 | 2007-06-28 | Sony Corp | オーディオ信号の再生機および再生方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP1959428A4 |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010008821A (ja) * | 2008-06-27 | 2010-01-14 | Yamaha Corp | Song editing support device and program |
JP2010008750A (ja) * | 2008-06-27 | 2010-01-14 | Yamaha Corp | Song editing support device and program |
JP2011008136A (ja) * | 2009-06-29 | 2011-01-13 | Kddi Corp | Synchronized playback device, synchronized playback method, and synchronized playback program |
JP2011102818A (ja) * | 2009-11-10 | 2011-05-26 | Yamaha Corp | Song editing support device and program |
WO2011151972A1 (ja) * | 2010-06-04 | 2011-12-08 | Panasonic Corp | Musical sound playback device and musical sound playback method |
WO2012007990A1 (ja) * | 2010-07-14 | 2012-01-19 | Pioneer Corp | Playback system, playback method, and program therefor |
US9640216B2 (en) | 2010-07-14 | 2017-05-02 | Pioneer Dj Corporation | Reproduction system for maintaining synchronization between a first audio content and a plurality of audio contents during special reproduction of the first audio content, and method and program thereof |
JP4927232B2 (ja) * | 2010-07-14 | 2012-05-09 | Pioneer Corp | Playback system, operation unit, playback method, and program therefor |
US8766078B2 (en) | 2010-12-07 | 2014-07-01 | JVC Kenwood Corporation | Music piece order determination device, music piece order determination method, and music piece order determination program |
EP2650875A1 (en) * | 2010-12-07 | 2013-10-16 | JVC Kenwood Corporation | Track order determination device, track order determination method, and track order determination program |
WO2012077555A1 (ja) | 2010-12-07 | 2012-06-14 | JVC Kenwood Corp | Music piece order determination device, music piece order determination method, and music piece order determination program |
EP2650875A4 (en) * | 2010-12-07 | 2014-07-02 | Jvc Kenwood Corp | Track order determination device, track order determination method, and track order determination program |
JP2012022322A (ja) * | 2011-07-20 | 2012-02-02 | Pioneer Electronic Corp | Playback system, operation unit, playback method, and program therefor |
US9245508B2 (en) | 2012-05-30 | 2016-01-26 | JVC Kenwood Corporation | Music piece order determination device, music piece order determination method, and music piece order determination program |
JP2021101366A (ja) * | 2017-03-30 | 2021-07-08 | Gracenote Inc | Generating a video presentation to accompany audio |
JP7271590B2 (ja) | 2017-03-30 | 2023-05-11 | Gracenote Inc | Generating a video presentation to accompany audio |
US11915722B2 (en) | 2017-03-30 | 2024-02-27 | Gracenote, Inc. | Generating a video presentation to accompany audio |
JP2020058653A (ja) * | 2018-10-11 | 2020-04-16 | Konami Amusement Co Ltd | Game system and game program |
WO2020075533A1 (ja) * | 2018-10-11 | 2020-04-16 | Konami Amusement Co Ltd | Game system, game program, and game system control method |
WO2021179206A1 (zh) * | 2020-03-11 | 2021-09-16 | 努音有限公司 | Automatic mixing device |
Also Published As
Publication number | Publication date |
---|---|
EP1959428A4 (en) | 2011-08-31 |
KR20080074976A (ko) | 2008-08-13 |
JP5243042B2 (ja) | 2013-07-24 |
KR101287984B1 (ko) | 2013-07-19 |
US20090272253A1 (en) | 2009-11-05 |
CN101326569B (zh) | 2012-07-18 |
CN101326569A (zh) | 2008-12-17 |
US7855334B2 (en) | 2010-12-21 |
JPWO2007066818A1 (ja) | 2009-05-21 |
EP1959428A1 (en) | 2008-08-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5243042B2 (ja) | Music editing apparatus and music editing method | |
WO2007066813A1 (ja) | Music editing device, method of creating music editing information, and recording medium on which music editing information is recorded | |
US7626112B2 (en) | Music editing apparatus and method and program | |
WO2007066819A1 (ja) | Music editing apparatus and music editing method | |
US11178457B2 (en) | Interactive music creation and playback method and system | |
US20040177746A1 (en) | Automatic generation of musical scratching effects | |
JP2004212473A (ja) | Karaoke device and karaoke playback method | |
JP6926354B1 (ja) | AI-based DJ system and method for decomposing, mixing, and playing back audio data | |
JP2001215979A (ja) | Karaoke device | |
KR101136974B1 (ko) | Playback device and playback method | |
Huber | The Midi manual: A practical guide to Midi within Modern Music production | |
KR101029483B1 (ko) | Method and device for producing music UCC using multichannel audio files | |
JP4489650B2 (ja) | Karaoke recording and editing device that performs cut-and-paste editing based on lyric characters | |
JP2008216681A (ja) | Karaoke device that allows strict comparison of one's recorded singing voice with a model vocal | |
JP2021157007A (ja) | Photo movie generation system, photo movie generation device, user terminal, photo movie generation method, and program | |
NZ791398B2 (en) | Method and device for decomposing, recombining and playing audio data | |
JP2008276101A (ja) | Music playback system and music playback device | |
JP2005300739A (ja) | Performance data editing device | |
TW200426779A (en) | MIDI playing system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200680046174.5 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
ENP | Entry into the national phase |
Ref document number: 2007549214 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 4598/DELNP/2008 Country of ref document: IN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 12095745 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2006834644 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020087013805 Country of ref document: KR |
|
NENP | Non-entry into the national phase |
Ref country code: DE |