WO2006011342A1 - Musical sound generation device and musical sound generation system - Google Patents
Musical sound generation device and musical sound generation system
- Publication number
- WO2006011342A1 (PCT/JP2005/012539; JP2005012539W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- coordinate
- musical
- sound
- tone
- Prior art date
Links
- 239000013598 vector Substances 0.000 claims abstract description 44
- 238000004364 calculation method Methods 0.000 claims abstract description 17
- 238000004891 communication Methods 0.000 claims description 6
- 230000033001 locomotion Effects 0.000 abstract description 11
- 230000005540 biological transmission Effects 0.000 abstract description 2
- 238000003860 storage Methods 0.000 abstract description 2
- 239000000872 buffer Substances 0.000 description 18
- 238000000034 method Methods 0.000 description 17
- 239000011295 pitch Substances 0.000 description 15
- 230000005236 sound signal Effects 0.000 description 10
- 230000033764 rhythmic process Effects 0.000 description 8
- 238000012546 transfer Methods 0.000 description 7
- 238000010586 diagram Methods 0.000 description 6
- 238000012545 processing Methods 0.000 description 5
- 239000003086 colorant Substances 0.000 description 4
- 238000012790 confirmation Methods 0.000 description 4
- 238000013500 data storage Methods 0.000 description 4
- 230000000694 effects Effects 0.000 description 2
- 230000006870 function Effects 0.000 description 2
- 239000007787 solid Substances 0.000 description 2
- 230000001360 synchronised effect Effects 0.000 description 2
- 230000000007 visual effect Effects 0.000 description 2
- 230000003247 decreasing effect Effects 0.000 description 1
- 230000002996 emotional effect Effects 0.000 description 1
- 230000001771 impaired effect Effects 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 239000000463 material Substances 0.000 description 1
- 239000000203 mixture Substances 0.000 description 1
- 230000002265 prevention Effects 0.000 description 1
- 238000005070 sampling Methods 0.000 description 1
- 238000003892 spreading Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H1/00—Details of electrophonic musical instruments
- G10H1/18—Selecting circuits
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2220/00—Input/output interfacing specifically adapted for electrophonic musical tools or instruments
- G10H2220/155—User input interfaces for electrophonic musical instruments
- G10H2220/161—User input interfaces for electrophonic musical instruments with 2D or x/y surface coordinates sensing
Definitions
- the present invention relates to a musical sound generation device and a musical sound generation system that generate musical sounds based on input coordinate data.
- a pen-type input device that inputs coordinate information of a drawn picture
- a display device that displays coordinate information input from the pen-type input device
- A music performance system has been proposed that includes a pen-type input device, a display device, a sound source device that outputs an audio signal corresponding to the input coordinate information, and a main control device that controls the display device and the sound source device based on the coordinate information input from the pen-type input device.
- In this system, the sound of each instrument is associated with a color on the input screen; the user can freely select from the color variations and place them on the display screen, and is said to enjoy both the visual arrangement and listening to the resulting sound (see Patent Document 1).
- In that system, the audio signal corresponding to the position where the pen is placed for drawing is an audio signal assigned to that coordinate position. An audio signal corresponding to each coordinate position is generated and recorded in advance when a picture is drawn, and the recorded audio signal is reproduced by tracing the drawn picture again.
- Because the audio signal generated by the drawn picture is determined by where the pen is placed on the screen when the picture is traced, and only the audio signal assigned to that coordinate position is reproduced, free audio cannot in practice be generated from a freely drawn picture: the operator must perform pen operations at positions defined on the screen, and to reproduce the music the pen must be moved over exactly the same screen positions.
- A musical sound generating method has also been proposed that has an image display step for displaying input images in the order of input in a drawing area in which a coordinate system is set, and a tone generation step for generating a tone corresponding to the coordinates, within that coordinate system, of the image portion being displayed.
- the coordinate system is set to a first coordinate axis that determines the pitch of the musical sound and a second coordinate axis that determines the left-right volume balance of the musical sound.
- With this musical sound generation method, it is said that the reproduced drawing and musical sound can be made to match the drawing and musical sound at the time of input (see Patent Document 2).
- a phrase is generated by adding a tempo element by clicking the mouse.
- the generated sound is a single sound having a pitch and volume assigned to coordinate positions (coordinate points).
- The generated phrase is likewise determined by mouse operations at specific coordinate points on the coordinate plane.
- The musical sound generation method of Patent Document 2 is thus similar to the music performance system of Patent Document 1: the degree of freedom of the sound generated from a freely drawn drawing is small.
- Patent Document 2: Japanese Patent Laid-Open No. 2003-271164
- Patent Document 3: Japanese Patent Laid-Open No. 6-175652
Disclosure of the invention
- In that device, what is controlled based on the vector is the increase or decrease of the value of a timbre parameter and the like; changing the setting of the timbre parameter itself is performed by a parameter input device, such as a mode setting switch, which is an input means separate from the tablet.
- The present invention has been made in view of the above problems, and its object is to provide a musical sound generation device and a musical sound generation system with a high degree of freedom in the musical sounds generated based on a drawing freely drawn by coordinate input means.
- To achieve this object, the musical sound generating device of the present invention includes coordinate input means for inputting coordinate data,
- vector calculation means for calculating, as a vector, two coordinate data input before and after a predetermined time interval,
- musical sound data generation means for generating musical sound data based on the calculated vector,
- musical instrument data generation means for generating musical instrument data based on the coordinate data,
- and musical sound output means for controlling a sound source based on the generated musical sound data and musical instrument data to output the musical sound of the instrument.
- The musical sound generating device is further characterized in that the musical sound data generation means generates the musical sound data with reference to a music theory database.
- The musical sound data is any one, or two or more, selected from the pitch of the sound, the intensity of the sound, the length of the sound, the left-right balance of the sound, and the modulation of the sound.
- The musical sound generating device further comprises image display means for displaying an image corresponding to the coordinate data input by the coordinate input means.
- The musical sound generating device further comprises display color data generation means for generating display color data based on the coordinate data.
- The musical sound generation device further includes musical instrument data generation means configured to generate musical instrument data based on the coordinate data in association with the display color data, the sound source being controlled based on the generated musical instrument data to output the musical sound of that instrument.
- The musical sound generating device further comprises recording/reproducing means for recording a data group consisting of the separately input coordinate data groups and the generated musical sound data groups, display color data groups, and instrument data groups, and for simultaneously reproducing one or both of the musical sound and the image based on the data group.
- In the musical sound generation system of the present invention, a plurality of the tone generation devices are connected by a communication network, and one or both of a tone and an image are generated simultaneously on each of the plurality of tone generation devices.
- The musical sound generating apparatus calculates the change in the coordinate data as a vector, generates musical sound data corresponding to the calculated vector, and generates musical instrument data based on the coordinate data. Since the musical sound of the instrument is output by controlling the sound source based on the generated musical sound data and instrument data, musical sounds can be obtained freely, without being limited by the size or position of the input coordinate plane.
- FIG. 1 is a diagram showing a schematic configuration of a musical sound generating device according to the present invention.
- FIG. 2 is a diagram for explaining the relationship between coordinate data and vectors in the musical sound generating device of the present invention.
- FIG. 3 is a diagram showing a color circle used for explaining a display color determination method in the musical sound generating apparatus of the present invention.
- FIG. 4 is a diagram showing a hue circle used for explaining a musical instrument determining method in the musical sound generating device of the present invention.
- FIG. 5 is a diagram showing a main flow of a musical sound generation processing procedure in the musical sound generation device of the present invention.
- FIG. 6 shows a flow of color selection processing in the musical sound generation process in the musical sound generation device of the present invention.
- FIG. 7 is a diagram showing a system configuration of an example of a tone generation system of the present invention.
- the musical sound generating apparatus 10 of the present invention includes a coordinate input device (coordinate input means) 12, a main control device 14, an acoustic device (musical sound output means) 16, and a display device (image display means) 18.
- the coordinate input device 12 is for inputting coordinate data of lines or pictures drawn continuously or discontinuously, and an appropriate method such as a touch panel display or a mouse can be used.
- the main control device 14 is, for example, a personal computer, which processes the coordinate data signal from the coordinate input device 12, sends a musical sound signal to the acoustic device 16, and sends an image signal to the display device 18.
- the detailed configuration of the main controller 14 will be described later.
- the acoustic device (musical sound output means) 16 is, for example, a speaker system, and generates a musical sound by a musical sound signal.
- the display device 18 is a liquid crystal display, for example, and displays an image using an image signal.
- the acoustic device 16 and the display device 18 may be integrated with the main control device 14. Further, the display device 18 may be omitted as necessary.
- the main controller 14 will be further described.
- The main control unit 14 includes a motion calculation unit (vector calculation means) 20, a musical sound data generation unit (musical sound data generation means) 22, an instrument data generation unit/display color data generation unit (instrument data generation means/display color data generation means) 24, a data transfer/save unit 26, a sound source, for example a MIDI sound source 28, and a timer 30.
- The motion calculation unit 20 processes the coordinate data input by the coordinate input device 12 into a vector having a magnitude and a direction by connecting two coordinate positions input before and after a predetermined time interval.
- the motion calculation unit 20 includes a coordinate buffer unit 32 and a vector calculation unit 34.
- the coordinate buffer unit 32 temporarily stores the input coordinate data.
- Specifically, the coordinate buffer unit 32 includes a first coordinate buffer unit that captures the input coordinate data as it is, and second and third coordinate buffer units that sequentially receive the coordinate data handed on from the first coordinate buffer unit at predetermined time intervals.
- The vector calculation unit 34 calculates vectors from the coordinate data held in the first to third coordinate buffer units, and includes a scalar quantity calculation unit and an angle change amount calculation unit.
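- As a concrete illustration of this three-buffer arrangement, the following minimal Python sketch (the patent prescribes no implementation language, and names such as `MotionCalculator` are hypothetical) combines the buffer shifting with the two calculation units:

```python
import math

class MotionCalculator:
    """Sketch of the motion calculation unit 20: the newest coordinate enters
    the first buffer while older coordinates shift toward the third."""

    def __init__(self):
        self.buf = [None, None, None]  # [Pbuf1 (newest), Pbuf2, Pbuf3]

    def push(self, x, y):
        # Shift older samples back, then store the newest in the first buffer.
        self.buf[2] = self.buf[1]
        self.buf[1] = self.buf[0]
        self.buf[0] = (x, y)

    def vectors(self):
        """Return (scalar quantity of b, angle change in degrees), or None
        while fewer than three coordinates have been captured."""
        if any(p is None for p in self.buf):
            return None
        p1, p2, p3 = self.buf[2], self.buf[1], self.buf[0]
        a = (p2[0] - p1[0], p2[1] - p1[1])  # vector a: coordinate 1 -> 2
        b = (p3[0] - p2[0], p3[1] - p2[1])  # vector b: coordinate 2 -> 3
        scalar = math.hypot(b[0], b[1])     # scalar quantity L of vector b
        theta = math.degrees(math.atan2(b[1], b[0]) - math.atan2(a[1], a[0]))
        theta = (theta + 180.0) % 360.0 - 180.0  # wrap into [-180, 180)
        return scalar, theta
```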
- the musical sound data generation unit 22 generates musical sound data based on the vector calculated by the vector calculation unit 34. In this case, the musical sound data generation unit 22 generates MIDI data.
- the musical sound data generation unit 22 includes a musical sound data determination unit 36 that generates MIDI data, and further includes a music theory database 38 in this case. Details of the music theory database 38 will be described later.
- The musical sound data determination unit 36 includes a sound intensity parameter determination unit that determines a sound intensity parameter based on the scalar quantity, and a sound pitch parameter determination unit that determines a sound pitch parameter based on the angle change amount. Conversely, the sound pitch parameter may be determined based on the scalar quantity and the sound intensity parameter based on the angle change amount.
- The musical sound data determination unit 36 is also configured so that, for example, when the amount of change in the vector obtained at the predetermined time interval is equal to or smaller than a threshold value, the musical sound data of the previous time continues to be generated as it is; in this way the length of the sound (tempo) is obtained.
- The musical sound data is not limited to the pitch, intensity, and length of the sound; it may also include the left-right balance of the sound and the modulation of the sound, and can be any one, or two or more, selected from these five.
- the instrument data generation unit / display color data generation unit 24 has a function of generating instrument data and a function of generating display color data corresponding to the coordinate data.
- The color/instrument correspondence database 42 is used to generate display color data and instrument data based on the coordinate data.
- In it, the display color data displayed on the display device 18 and the instrument data that serves as the material of the musical sound generated by the acoustic device 16 are allocated according to coordinate position, for example in the form of a hue circle and instrument sectors corresponding to the hue circle.
- Display color data and instrument data are obtained by displaying the hue circle on the input screen and then changing the coordinate position.
- The color/instrument correspondence collation/determination unit 40 collates the input coordinate data against the color/instrument correspondence database 42 and determines the display color data and the instrument data simultaneously.
- The data transfer/save unit 26 includes a data transfer unit 44, which temporarily stores each data item, including the coordinate data, sent from the musical sound data generation unit 22 and the instrument data generation unit/display color data generation unit 24, and a data storage unit 46, which stores the data as necessary.
- The MIDI sound source 28 contains musical tones for a plurality of types of instruments, and is controlled by the musical sound data and instrument data signals from the data transfer unit 44 to generate the musical tone signal of the selected instrument.
- A musical sound is generated by the acoustic device 16 from the musical sound signal.
- The image drawn on the coordinate input device 12 is displayed on the display device 18 by the coordinate data signal, including the display color data, from the data transfer unit 44.
- the acoustic device 16 and the display device 18 can be operated at the same time, or only one of them can be operated.
- Coordinate data that changes continuously or discontinuously is taken into the coordinate buffer unit 32 of the motion calculation unit 20 at predetermined time intervals.
- For example, when the pen is moved from left to right in FIG. 2, coordinate data 1 (x1, y1, t1) is acquired at a certain time, coordinate data 2 (x2, y2, t2) is acquired when a predetermined time has elapsed, and coordinate data 3 (x3, y3, t3) is acquired when a further predetermined time has elapsed, and so on in sequence.
- Here, (xi, yj) indicates the coordinate values,
- and tk indicates the time.
- The times t1, t2, and t3 are separated by predetermined equal time intervals.
- The latest coordinate data 3 is taken into the first buffer unit; before that, the coordinate data 2 is shifted from the first buffer unit to the second buffer unit, and the coordinate data 1 from the second buffer unit to the third buffer unit.
- The angle change amount calculation unit of the vector calculation unit 34 obtains a vector a from the coordinate data 1 and the coordinate data 2, that is, by connecting the two coordinate positions of coordinate data 1 and coordinate data 2, and similarly
- obtains a vector b from the coordinate data 2 and the coordinate data 3.
- The position of the coordinate data (xi, yj) changes freely with the movement of the pen, so the vector b can differ from the vector a. For example, as shown in FIG. 2,
- if the pen is moved slowly in one direction between time t1 and time t2, and then moved quickly in a different direction from time t2 to time t3, the vector b will have a larger scalar value and a different direction than the vector a. The amount of change in direction between the two vectors obtained before and after the predetermined time interval is shown as the angle change amount θ in FIG. 2.
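- Written out as formulas (the patent states these relationships only in prose; the notation below is one consistent reading):

```latex
% Vectors from the three buffered samples P1=(x1,y1), P2=(x2,y2), P3=(x3,y3):
\mathbf{a} = (x_2 - x_1,\; y_2 - y_1), \qquad \mathbf{b} = (x_3 - x_2,\; y_3 - y_2)
% Scalar quantity of vector b:
L = \lVert \mathbf{b} \rVert = \sqrt{(x_3 - x_2)^2 + (y_3 - y_2)^2}
% Angle change amount, wrapped into (-180 deg, 180 deg]:
\theta = \operatorname{atan2}(b_y,\, b_x) - \operatorname{atan2}(a_y,\, a_x)
```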
- The pitch of the sound is determined according to the angle change amount θ.
- The angle change amount θ can take a value within ±180 degrees according to the movement of the pen.
- the note pitch (hereinafter referred to as “note”) of MIDI data is used as the pitch of the sound.
- The note is, for example, a number from 0 to 127 in which whole tones (piano white keys) and semitones (piano black keys) are arranged.
- The music theory database 38 includes, in addition to data that can specify the pitch of any sound according to the angle change amount θ (Table 1), data for scales on a chord according to the angle change amount θ (Table 2; here, a C chord) and for ethnic scales (Table 3; here, an Okinawan scale).
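- A sketch of such a lookup in the same Python vein: the angle change θ is quantized onto a stored scale. The scale contents below (C-chord tones over two octaves) are illustrative stand-ins, since Tables 1 to 3 are not reproduced in this text:

```python
# Illustrative stand-in for a music theory database entry: MIDI note numbers
# of a C chord over two octaves (C3 E3 G3 C4 E4 G4 C5).
C_CHORD_SCALE = [48, 52, 55, 60, 64, 67, 72]

def note_from_angle(theta: float, scale=C_CHORD_SCALE) -> int:
    """Map an angle change amount in (-180, 180] degrees to a scale note by
    splitting the full circle into equal sectors, one per scale tone."""
    sector = int(((theta + 180.0) % 360.0) / 360.0 * len(scale))
    return scale[min(sector, len(scale) - 1)]
```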
- The scalar quantity calculation unit of the vector calculation unit 34 calculates the scalar quantity of each of the vectors a and b. The musical sound data determination unit 36 of the musical sound data generation unit 22 then generates the sound intensity (sound intensity data, sound intensity parameter) according to the scalar quantities of the vectors a and b. In other words, the intensity of the sound is changed by changing the vector scalar quantity.
- Here, the scalar quantity L takes a value within the range 0 to 1.
- The sound intensity corresponds to the volume of the MIDI data (hereinafter referred to as volume), and the volume is a number from 0 to 127.
- The sound intensity is thus generated according to the scalar quantity.
- In addition, the length of the sound (tempo) can be produced by continuing the sound intensity that was generated according to the scalar quantity at the previous time.
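- A minimal sketch of these two rules, with the 0-to-1 scalar quantity mapped onto the 0-to-127 MIDI volume range and the previous tone data held when the vector change stays at or below a threshold (the threshold value and function names are assumptions):

```python
def volume_from_scalar(L: float) -> int:
    """Map the normalized scalar quantity L in [0, 1] to a MIDI volume 0..127."""
    return max(0, min(127, round(L * 127)))

def held_or_new(prev_data, L: float, theta: float, threshold: float = 0.05):
    """Length-of-sound (tempo) rule: when the vector change is at or below
    the threshold, the previous tone data keeps sounding; otherwise new tone
    data (volume, angle change) is generated."""
    if prev_data is not None and L <= threshold:
        return prev_data
    return (volume_from_scalar(L), theta)
```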
- For the display color, a hue circle is set in which the hue h is assigned over the 360-degree angular range around the center point of the coordinate plane.
- On the hue circle, the saturation s is assigned so that the color becomes paler the closer the position is to the center point of the coordinate plane and more vivid the farther it is from the center point.
- When color setting means such as a color selection button is operated, the hue circle is displayed on the coordinate plane.
- The hue of the display color can then be changed by moving from the position of the current coordinate P(x, y) on the plane coordinates to the position of another coordinate, thereby changing the angle on the hue circle,
- and the saturation of the display color can be changed by changing the distance from the center of the hue circle.
- The display color can also be changed by dragging with the right button.
- A desired brightness can be obtained by setting the brightness to change according to the length of time the pen is held at the same coordinates without being moved.
- the hue circle in FIG. 3 is divided into, for example, 12 parts, and musical instruments are assigned to the respective colors A to L.
- The instrument numbers (Program Number) of the tone map of the MIDI sound source 28 shown in Table 4 may be assigned to the sectors mechanically, as shown in Table 5, or desired instrument numbers may be assigned, as shown in Table 6.
- In this way, the kind of musical instrument can be determined together with the determination of the display color. Note that even when no image is displayed, the instrument can still be determined simply by operating on the coordinate plane.
- The data for selecting the display color and the data for selecting the musical instrument described above are contained in the color/instrument correspondence database 42, and the color/instrument correspondence collation/determination unit 40 determines the display color and the instrument by collating the input coordinate data against this database, as sketched below.
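- A minimal sketch of the hue-circle mapping, assuming a 12-sector circle and General MIDI program numbers as illustrative stand-ins for Tables 4 to 6:

```python
import colorsys
import math

# Illustrative program numbers for the 12 instrument sectors A..L; the actual
# assignments of Tables 5 and 6 are not reproduced in this text.
SECTOR_PROGRAMS = [0, 8, 16, 24, 32, 40, 48, 56, 64, 72, 80, 88]

def color_and_instrument(x, y, cx, cy, radius):
    """Map a coordinate to (display color, instrument program number): the
    angle around the center picks the hue and the instrument sector, and the
    distance from the center picks the saturation (paler near the center)."""
    dx, dy = x - cx, y - cy
    angle = math.degrees(math.atan2(dy, dx)) % 360.0  # position on the hue circle
    hue = angle / 360.0
    sat = min(1.0, math.hypot(dx, dy) / radius)
    r, g, b = colorsys.hsv_to_rgb(hue, sat, 1.0)      # brightness fixed here
    program = SECTOR_PROGRAMS[int(angle / 30.0) % 12] # 12 sectors of 30 degrees
    return (r, g, b), program
```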
- First, mode confirmation is performed (S3 in FIG. 5), and the operator performs color selection as desired (S22 in FIG. 5).
- The color selection process will be described later.
- When no color is selected, drawing is performed under the default color conditions.
- Next, a drawing start (drag start) determination is made (S4 in FIG. 5). When drawing starts, initialization is performed and the first two consecutive coordinates are shifted into the third and second buffers (Pbuf3, Pbuf2) (S5 in FIG. 5); if drawing has not started, the process returns to the mode confirmation step S3. It is then determined whether the drawing timing matches the rhythm (sound length, tempo) (S6 in FIG. 5).
- Next, the current coordinate P being drawn (hereinafter sometimes simply called the current coordinate), that is, the current coordinate data, is acquired (S7 in FIG. 5), and the current coordinate P is compared with the previous coordinate (Pbuf2) (S8 in FIG. 5).
- If the difference between the value of the current coordinate P and the previous coordinate (Pbuf2), which lies a predetermined time in the past, is less than a threshold value, the process returns to step S6, where it is again determined whether the drawing timing matches the rhythm.
- If the difference between the value of the current coordinate P and the previous coordinate (Pbuf2) is greater than or equal to the threshold value, the current coordinate P is substituted into the first buffer (Pbuf1) (S9 in FIG. 5).
- Next, a note-off is transmitted to the MIDI sound source 28. That is, when an instrument whose sound continues without dying away naturally, such as a wind instrument, is selected, the currently sounding note is stopped so that the next note can be played (S10 in FIG. 5).
- Next, the angle change amount θ of the two vectors and the scalar quantity L of each vector are calculated from the coordinate data (Pbuf1 to Pbuf3) of the first to third buffers (S11 in FIG. 5).
- Here, the musical sound generation device 10 assumes a use in which a plurality of operators draw in turn to create musical sounds, and the multiple drawings and musical sounds are then reproduced simultaneously.
- Each piece of generated data is therefore saved in a list, with the time at which the sound was produced added to the data (S13 in FIG. 5).
- Next, each buffer is shifted backward (S14 in FIG. 5), and it is judged whether the generated data exceeds a specified amount (S15 in FIG. 5). When it does not, it is determined whether the operator has finished drawing (S16 in FIG. 5); when the operator finishes drawing, that is, when the pen is lifted and the drag ends, the process returns to the mode confirmation step S3. When the operator continues drawing, the process returns to the timing confirmation step S6 and a new coordinate is acquired. If there is only one operator, step S15 is omitted and the process proceeds to step S16.
- When step S15 determines that the generated data exceeds the specified amount, it is further judged whether the number of operators has reached the specified number (S17 in FIG. 5); processing is terminated when the specified number has been reached (S18 in FIG. 5), whereas operation passes to another operator when it has not (omitted in FIG. 5).
- When the MIDI data and the screen display data are generated (S12 in FIG. 5),
- the screen display is performed in real time based on these data (S19 in FIG. 5), and the MIDI data is sent to the MIDI sound source (S20 in FIG. 5) and sounded (S21 in FIG. 5).
- Alternatively, the screen display and the sound generation can be performed based on the stored data. (A condensed sketch of this main flow follows.)
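- A condensed Python sketch of the FIG. 5 main flow (steps S6 to S21), reusing the `MotionCalculator` sketch above; `get_pen`, `rhythm_tick`, `generate`, `draw`, `send_midi`, and `note_off` are hypothetical callbacks standing in for the device-specific parts:

```python
import math
import time

def drawing_loop(motion, get_pen, rhythm_tick, generate, draw, send_midi,
                 note_off, threshold=2.0):
    data_log = []                            # generated data, saved with its time (S13)
    while True:
        rhythm_tick()                        # wait for a timing matching the rhythm (S6)
        pen = get_pen()                      # current coordinate P, or None when pen is up
        if pen is None:
            break                            # drag finished: back to mode confirmation
        prev = motion.buf[0]                 # previous coordinate (Pbuf2 in FIG. 5's
                                             # numbering; push() folds in the S14 shift)
        if prev is not None and math.hypot(pen[0] - prev[0], pen[1] - prev[1]) < threshold:
            continue                         # change below threshold: recheck timing (S8)
        motion.push(*pen)                    # substitute P into the first buffer (S9)
        note_off()                           # stop the currently sounding note (S10)
        features = motion.vectors()          # scalar L and angle change theta (S11)
        if features is None:
            continue                         # buffers not yet full
        midi, display = generate(*features)  # MIDI data and screen display data (S12)
        data_log.append((time.time(), midi, display))  # save with sounding time (S13)
        draw(display)                        # real-time screen display (S19)
        send_midi(midi)                      # send to the MIDI sound source (S20, S21)
    return data_log
```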
- When there are a plurality of operators, a plurality of drawings are made on the same screen and a simultaneous performance (session) is carried out.
- When there are a plurality of operators, only one of the two, the plural drawing or the simultaneous performance, may be carried out instead.
- the color selection process will be described.
- First, the current coordinate P is acquired (S24 in FIG. 6). Then, the positional relationship between the center point O of the effective range of the hue circle and the current coordinate P is calculated (S25 in FIG. 6), and it is determined whether the pen has been raised (S26 in FIG. 6).
- When the pen has been raised, the hue h is determined based on the central angle on the hue circle, and the saturation s is determined based on the distance from the center point of the hue circle (S27 in FIG. 6); the color selection then ends (S28 in FIG. 6) and the process returns to the main routine for drawing.
- When the pen has not been raised, a new coordinate is acquired as the current coordinate (S24 in FIG. 6).
- When the new coordinate is the same as the previous coordinate P,
- it is judged whether the brightness is at its maximum value; when it is not, the brightness is increased (S31 in FIG. 6), and when it is, the brightness is returned to the minimum value (S32 in FIG. 6). In either case, the process then returns to step S26, where it is determined whether the pen has been raised.
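- A sketch of this FIG. 6 routine, again with hypothetical names; the ramp step `v_step` is an assumption:

```python
import math

def select_color(pen_samples, cx, cy, radius, v_step=0.02):
    """pen_samples: iterable of (x, y) positions, ending when the pen is
    raised. While the pen rests on the same coordinate the brightness value
    ramps up and wraps back to the minimum (S31/S32); hue and saturation are
    fixed from the final position on the hue circle (S27)."""
    v, last = 0.0, None
    for p in pen_samples:                     # S24: acquire the current coordinate
        if p == last:                         # pen held still on the same coordinate
            v = v + v_step if v + v_step <= 1.0 else 0.0  # S31 increase / S32 wrap
        last = p
    if last is None:
        return None                           # no input before the pen was raised
    dx, dy = last[0] - cx, last[1] - cy       # S25: relation to center point O
    h = (math.degrees(math.atan2(dy, dx)) % 360.0) / 360.0  # hue from central angle
    s = min(1.0, math.hypot(dx, dy) / radius)                # saturation from distance
    return h, s, v
```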
- A device capable of coordinate data input by a plurality of persons may also be used as the coordinate input device 12, with the plural coordinate data processed simultaneously by the main control device 14.
- Musical sound data can also be generated based on a three-dimensional vector by using a three-dimensional input device, such as a three-dimensional mouse, as the coordinate input device 12.
- the coordinate input device 12 may input the position of an object captured by a camera as coordinate data.
- By providing a selection switch or the like, it is also possible to change the thickness of the drawn line, or to express blurring of the line, according to the magnitude of the vector scalar quantity, in other words, according to the speed of the pen movement.
- Furthermore, a plurality of the musical tone generators 10 may be connected by a communication network, the musical tone generators 10 each generating musical sounds and images simultaneously, or recording and reproducing them as necessary.
- The data may be exchanged in real time; alternatively, for example, one musical sound generator 10 may receive and record the data of one or more other musical sound generators 10 and reproduce it with a time lag, superimposing its own data on the recorded data. Also, instead of generating both the musical sounds and the images at the same time, only one of the two may be generated simultaneously.
- One example of such a musical sound generation system is one in which, for example, two musical sound generation devices 10 are directly connected via a communication network (not shown; see FIG. 8).
- In the figure, reference numeral 30a indicates a rhythm control/synchronization unit including the timer 30.
- A data group consisting of the coordinate data group input on each musical tone generator 10 and the musical sound data group, display color data group, and instrument data group generated from that coordinate data is stored in the data storage unit of each musical tone generator 10.
- These data groups are exchanged in real time, for example, and musical sounds and images are generated simultaneously based on the data groups controlled and synchronized by the rhythm control/synchronization unit 30a. In this case as well, only one of the musical sound and the image may be generated instead of both.
- Another example of the musical sound generation system is one in which, for example, three musical sound generation devices 10 are connected via a communication network 50 through a server unit 48, as shown in the figure.
- In this system, the data storage unit 46 and the rhythm control/synchronization unit 30b are provided in the server unit 48, and the data groups of the three musical sound generation devices 10 are exchanged in real time, for example, in the same manner as in the musical sound generation system described above.
- Musical sounds and images are generated simultaneously based on the data groups controlled and synchronized by the rhythm control/synchronization unit 30b. In this case too, only one of the musical sound and the image may be generated instead of both.
- a session can be performed by a plurality of persons in remote locations.
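- The recorded data group lends itself to a simple representation; the sketch below (the field names are illustrative) merges per-device logs by their sounding times, as the rhythm control/synchronization unit would for simultaneous reproduction:

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Event:
    """One logged item of a device's data group; events order by time."""
    time: float                            # when the tone sounded
    midi: tuple = field(compare=False)     # tone data and instrument data
    display: tuple = field(compare=False)  # coordinate and display color data

def merge_sessions(*device_logs):
    """Merge per-device event lists (each already sorted by time) into one
    time-ordered stream for simultaneous reproduction."""
    return list(heapq.merge(*device_logs))
```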
- With the musical sound generating apparatus 10 of the present invention, drawing and performing take place simultaneously, so it can offer not only personal enjoyment but also a new means of expression for artists. Nor is the musical sound generating device 10 limited to musical performance: for example, by turning the movement of a pen writing characters, such as a signature, into sound, it can serve as a new form of signature authentication and as a means of conveying visual information to visually impaired people. Because it produces sounds easily, it can also be applied as a tool for rehabilitation and for the prevention of cognitive decline in the elderly, and likewise to children's emotional education and to learning colors and sounds.
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Acoustics & Sound (AREA)
- Multimedia (AREA)
- Electrophonic Musical Instruments (AREA)
- Auxiliary Devices For Music (AREA)
Abstract
Description
Claims
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006528958A JP3978506B2 (ja) | 2004-07-29 | 2005-07-07 | Musical sound generation method |
US11/631,398 US7504572B2 (en) | 2004-07-29 | 2005-07-07 | Sound generating method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004-220998 | 2004-07-29 | ||
JP2004220998 | 2004-07-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006011342A1 true WO2006011342A1 (ja) | 2006-02-02 |
Family
ID=35786093
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/012539 WO2006011342A1 (ja) | 2004-07-29 | 2005-07-07 | Musical sound generation device and musical sound generation system |
Country Status (3)
Country | Link |
---|---|
US (1) | US7504572B2 (ja) |
JP (1) | JP3978506B2 (ja) |
WO (1) | WO2006011342A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010096885A (ja) * | 2008-10-15 | 2010-04-30 | Hideyuki Kotani | Music selection device |
CN109920397A (zh) * | 2019-01-31 | 2019-06-21 | 李奕君 | Audio function production system and production method in physics |
US11756516B2 (en) | 2020-12-09 | 2023-09-12 | Matthew DeWall | Anatomical random rhythm generator |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102006008298B4 (de) * | 2006-02-22 | 2010-01-14 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Device and method for generating a note signal |
EP2136356A1 (en) * | 2008-06-16 | 2009-12-23 | Yamaha Corporation | Electronic music apparatus and tone control method |
TWI467467B (zh) * | 2012-10-29 | 2015-01-01 | Pixart Imaging Inc | Method and device for controlling the movement of an on-screen object |
US11532293B2 (en) * | 2020-02-06 | 2022-12-20 | James K. Beasley | System and method for generating harmonious color sets from musical interval data |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH03206493A (ja) * | 1990-01-09 | 1991-09-09 | Yamaha Corp | Electronic musical instrument |
JPH06175652A (ja) * | 1992-12-10 | 1994-06-24 | Yamaha Corp | Parameter input device and performance operation device for electronic musical instrument |
WO1996022580A1 (fr) * | 1995-01-17 | 1996-07-25 | Sega Enterprises, Ltd. | Image processor and electronic device |
JP2002182647A (ja) * | 2001-12-07 | 2002-06-26 | Yamaha Corp | Electronic musical instrument |
JP2003271164A (ja) * | 2002-03-19 | 2003-09-25 | Yamaha Music Foundation | Musical sound generation method, musical sound generation program, storage medium, and musical sound generation device |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2890564B2 (ja) * | 1989-12-14 | 1999-05-17 | Yamaha Corp | Electronic musical instrument |
EP0434086B1 (en) * | 1989-12-22 | 1995-03-29 | Yamaha Corporation | Musical tone control apparatus |
JP3528284B2 (ja) * | 1994-11-18 | 2004-05-17 | Yamaha Corp | Three-dimensional sound system |
JP3224492B2 (ja) | 1995-06-08 | 2001-10-29 | Sharp Corp | Music performance system |
US5920024A (en) * | 1996-01-02 | 1999-07-06 | Moore; Steven Jerome | Apparatus and method for coupling sound to motion |
KR20010020900A (ko) * | 1999-08-18 | 2001-03-15 | 김길호 | Method and apparatus for harmonizing colors using harmony rules and color-tone interconversion |
-
2005
- 2005-07-07 JP JP2006528958A patent/JP3978506B2/ja active Active
- 2005-07-07 US US11/631,398 patent/US7504572B2/en not_active Expired - Fee Related
- 2005-07-07 WO PCT/JP2005/012539 patent/WO2006011342A1/ja active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH03206493A (ja) * | 1990-01-09 | 1991-09-09 | Yamaha Corp | Electronic musical instrument |
JPH06175652A (ja) * | 1992-12-10 | 1994-06-24 | Yamaha Corp | Parameter input device and performance operation device for electronic musical instrument |
WO1996022580A1 (fr) * | 1995-01-17 | 1996-07-25 | Sega Enterprises, Ltd. | Image processor and electronic device |
JP2002182647A (ja) * | 2001-12-07 | 2002-06-26 | Yamaha Corp | Electronic musical instrument |
JP2003271164A (ja) * | 2002-03-19 | 2003-09-25 | Yamaha Music Foundation | Musical sound generation method, musical sound generation program, storage medium, and musical sound generation device |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010096885A (ja) * | 2008-10-15 | 2010-04-30 | Hideyuki Kotani | Music selection device |
CN109920397A (zh) * | 2019-01-31 | 2019-06-21 | 李奕君 | Audio function production system and production method in physics |
CN109920397B (zh) * | 2019-01-31 | 2021-06-01 | 李奕君 | Audio function production system and production method in physics |
US11756516B2 (en) | 2020-12-09 | 2023-09-12 | Matthew DeWall | Anatomical random rhythm generator |
Also Published As
Publication number | Publication date |
---|---|
JPWO2006011342A1 (ja) | 2008-05-01 |
US7504572B2 (en) | 2009-03-17 |
JP3978506B2 (ja) | 2007-09-19 |
US20080168893A1 (en) | 2008-07-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Fels | Designing for intimacy: Creating new interfaces for musical expression | |
Hunt et al. | Mapping performer parameters to synthesis engines | |
Freeman | Extreme sight-reading, mediated expression, and audience participation: Real-time music notation in live performance | |
US8242344B2 (en) | Method and apparatus for composing and performing music | |
Hunt et al. | Multiple media interfaces for music therapy | |
Jensenius | An action–sound approach to teaching interactive music | |
Pressing | Some perspectives on performed sound and music in virtual environments | |
Ward et al. | Music technology and alternate controllers for clients with complex needs | |
JP3978506B2 (ja) | Musical sound generation method | |
Fels et al. | Evolving Tooka: from Experiment to Instrument. | |
Siegel | Dancing the music: Interactive dance and music | |
Zlatintsi et al. | A web-based real-time kinect application for gestural interaction with virtual musical instruments | |
JP2003521005A (ja) | Device for displaying music using a single workstation or several linked workstations | |
Ilsar | The AirSticks: a new instrument for live electronic percussion within an ensemble | |
Mainsbridge | Body as instrument: an exploration of gestural interface design | |
Overholt | Advancements in violin-related human-computer interaction | |
Bergstrom | Soma: live performance where congruent musical, visual, and proprioceptive stimuli fuse to form a combined aesthetic narrative | |
Armitage et al. | Augmented opera performance | |
Meneses | Iterative design in DMIs and AMIs: expanding and embedding a high-level gesture vocabulary for T-Stick and GuitarAMI | |
WO2024190759A1 (ja) | Information processing method, information processing system, and program | |
Ingebritsen | Auditory kinesthesia: A framework to facilitate the design and performance of interactive music systems | |
JPH08335076A (ja) | Music performance system | |
von Arnim et al. | The Feedback Mop Cello: An Instrument for Interacting with Acoustic Feedback Loops | |
Rubin | in tensions, for dancer, cello, and motion-sensitive live electronics | |
Ishov | PrismaSonus: Bridging Acoustic and Digital Worlds in Flute Practice and Performer-Composer Collaboration |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
DPEN | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101) | ||
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2006528958 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 11631398 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |