
CN102651212B - Playing device and electronic musical instrument - Google Patents

Playing device and electronic musical instrument

Info

Publication number
CN102651212B
CN102651212B
Authority
CN
China
Prior art keywords
mentioned
subregion
pitch
music performance
performance apparatus
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201210046642.6A
Other languages
Chinese (zh)
Other versions
CN102651212A (en)
Inventor
坂崎尚之
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Casio Computer Co Ltd
Original Assignee
Casio Computer Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2011226534A (JP6007476B2)
Application filed by Casio Computer Co Ltd filed Critical Casio Computer Co Ltd
Publication of CN102651212A
Application granted
Publication of CN102651212B
Legal status: Active

Landscapes

  • Electrophonic Musical Instruments (AREA)

Abstract

The invention relates to a playing device and an electronic musical instrument. When the playing device body (11) is positioned within a main area, a sound-generation timing is detected. A CPU (21) generates a note-on event using a timbre stored in a timbre table in correspondence with the main area and a pitch stored in a pitch table in correspondence with a sub-area. The note-on event is transmitted from the playing device body (11) to the electronic musical instrument (10), and a sound source section (31) of the electronic musical instrument generates and outputs a musical sound based on the timbre and pitch in the note-on event.

Description

Music performance apparatus and electronic musical instrument
This application is based on, and claims priority from, Japanese Patent Application No. 2011-41139 filed on February 28, 2011, the entire contents of which are incorporated herein by reference.
Technical field
The present invention relates to a music performance apparatus that a player holds in the hand and swings to produce musical sounds, and to an electronic musical instrument.
Background art
Electronic musical instruments have previously been proposed in which a sensor is provided in a rod-shaped member; when the player holds and swings the member by hand, the sensor detects the motion of the member and the electronic musical instrument generates a musical sound. In particular, in such electronic musical instruments, the rod-shaped member has the shape of a drumstick for a drum or a Japanese drum, and a percussion sound is generated in response to the player making a motion as if striking a drum or Japanese drum.
For example, Japanese Patent No. 2663503 proposes a music performance apparatus in which an acceleration sensor is provided in a rod-shaped member, and a musical sound is generated when a predetermined time has elapsed after the output of the acceleration sensor (the acceleration sensor value) reaches a predetermined threshold.
The music performance apparatus disclosed in Japanese Patent No. 2663503 has the following problem: because the generation of musical sounds is controlled only on the basis of the acceleration sensor value of the rod-shaped member, it is difficult to vary the musical sound in the way the player desires.
In addition, Japanese Unexamined Patent Application Publication No. 2007-256736 proposes a device that can generate a plurality of timbres and that uses a geomagnetic sensor to generate one of the plurality of timbres according to the direction in which the rod-shaped member is pointed. The device disclosed in Japanese Unexamined Patent Application Publication No. 2007-256736 has the following problem: because the timbre is changed according to the direction of the member, as the number of timbres to be generated increases, the direction (angular range) allotted to each timbre becomes smaller, making it difficult to produce a musical sound with the desired timbre.
Summary of the invention
The object of the present invention is to provide a music performance apparatus and an electronic musical instrument that allow the components of a musical sound, including its timbre and pitch, to be varied in the way the player desires.
The object of the present invention is achieved by the following music performance apparatus. The music performance apparatus comprises: a holding member that a player can hold in the hand; a timbre storage unit that stores information specifying a main area, which is delimited in space by faces at least the lateral ones of which are perpendicular to the ground surface, together with the timbre of a musical sound associated with that main area; a pitch storage unit that stores information specifying a sub-area located within the main area, together with the pitch of a musical sound associated with that sub-area; a positional information acquisition unit that acquires positional information of the holding member; a reading unit that, when the position of the holding member acquired by the positional information acquisition unit is within the main area and within the sub-area, reads from the timbre storage unit the timbre associated with the main area in which the holding member is located and reads from the pitch storage unit the pitch associated with the sub-area in which the holding member is located; and a sound-generation instructing unit that instructs a musical sound generating unit to generate a musical sound with the timbre and pitch read by the reading unit.
Brief description of the drawings
Fig. 1 is a block diagram showing the configuration of an electronic musical instrument according to a first embodiment of the present invention.
Fig. 2 is a block diagram showing the configuration of the music performance apparatus body according to the present embodiment.
Fig. 3 is a flowchart showing an example of the processing performed in the music performance apparatus body according to the present embodiment.
Fig. 4 is a flowchart showing an example of current-position acquisition processing according to the present embodiment.
Fig. 5 is a flowchart showing an example of main-area setting processing according to the present embodiment.
Fig. 6 is a flowchart showing an example of timbre setting processing according to the present embodiment.
Fig. 7 is a diagram schematically showing how a main area is determined according to the present embodiment.
Fig. 8 is a diagram showing an example of the main-area/timbre table in the RAM according to the present embodiment.
Fig. 9 is a flowchart showing an example of sub-area setting processing according to the present embodiment.
Fig. 10 is a flowchart showing an example of pitch setting processing according to the present embodiment.
Fig. 11 is a diagram schematically showing how a sub-area is determined according to the present embodiment.
Fig. 12 is a diagram showing an example of the sub-area/pitch table in the RAM according to the present embodiment.
Fig. 13 is a flowchart showing an example of sound-generation timing detection processing according to the present embodiment.
Fig. 14 is a flowchart showing an example of note-on event generation processing according to the present embodiment.
Fig. 15 is a diagram schematically showing an example of the sub-areas and corresponding pitches set in the sub-area setting processing and pitch setting processing of the music performance apparatus body according to the present embodiment.
Fig. 16 is a flowchart showing an example of the processing performed in the instrument section according to the present embodiment.
Fig. 17 is a diagram showing an example of sub-area setting processing according to a second embodiment.
Fig. 18 is a diagram schematically showing an example of the sub-areas and corresponding pitches set in the sub-area setting processing and pitch setting processing of the music performance apparatus body according to the present embodiment.
Fig. 19 is a flowchart showing an example of main-area setting processing according to a third embodiment.
Fig. 20 is a flowchart showing an example of sub-area and pitch setting processing according to a fourth embodiment.
Fig. 21 is a diagram schematically showing the sub-areas set in the sub-area and pitch setting processing of the music performance apparatus body according to the present embodiment.
Fig. 22 is a flowchart showing an example of second timbre setting processing according to a fifth embodiment.
Fig. 23 is a diagram schematically showing an example of main areas and sub-areas in a combined embodiment.
Fig. 24 is a flowchart showing an example of main-area setting processing according to another embodiment.
Fig. 25 is a flowchart showing an example of sub-area setting processing according to another embodiment.
Fig. 26 is a diagram showing an example of a main area and sub-areas according to another embodiment.
Fig. 27 is a flowchart showing sound-generation timing detection processing according to another embodiment.
Fig. 28 is a diagram showing the configuration of sub-areas according to another embodiment.
Fig. 29 is a flowchart showing note-on event generation processing according to another embodiment.
Fig. 30 shows a modification of the shape of the sub-areas according to the fourth embodiment.
Embodiments
Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings. Fig. 1 is a block diagram showing the configuration of the electronic musical instrument according to the first embodiment of the present invention. As shown in Fig. 1, the electronic musical instrument 10 according to the present embodiment has a rod-shaped music performance apparatus body 11 that extends in its longitudinal direction and that the player holds in the hand and swings. The electronic musical instrument 10 also includes an instrument section 19 for generating musical sounds; the instrument section 19 has a CPU 12, an interface (I/F) 13, a ROM 14, a RAM 15, a display section 16, an input section 17, and a sound system 18. As described later, the music performance apparatus body 11 has an acceleration sensor 23 and a geomagnetic sensor 22 near its tip, i.e., the side opposite to the root side held by the player.
The I/F 13 of the instrument section 19 receives data (for example, note-on events) from the music performance apparatus body 11, stores the data in the RAM 15, and notifies the CPU 12 of the reception. In the present embodiment, an infrared communication device 24 is provided at the root-side end of the music performance apparatus body 11, and the I/F 13 is also provided with an infrared communication device 33. The instrument section 19 can therefore receive data from the music performance apparatus body 11 when the infrared communication device 33 of the I/F 13 receives the infrared light emitted by the infrared communication device 24 of the music performance apparatus body 11.
The CPU 12 performs various kinds of processing such as control of the entire electronic musical instrument 10, in particular control of the instrument section 19 of the electronic musical instrument, detection of operation of key switches (not shown) constituting the input section 17, and generation of musical sounds based on note-on events received via the I/F 13.
The ROM 14 stores various processing programs for control of the entire electronic musical instrument 10, in particular control of the instrument section 19, detection of operation of the key switches (not shown) constituting the input section 17, generation of musical sounds based on note-on events received via the I/F 13, and so on. The ROM 14 also includes a waveform data area that stores waveform data for various timbres, in particular waveform data of percussion instruments such as a bass drum, hi-hat, snare drum, and cymbal. Of course, the waveform data are not limited to percussion instruments; the ROM 14 may also store waveform data of other timbres such as wind instruments (flute, saxophone, trumpet, and the like), keyboard instruments (piano and the like), stringed instruments (guitar and the like), and pitched percussion instruments (marimba, vibraphone, timpani, and the like).
The RAM 15 stores programs read from the ROM 14 and data and parameters generated in the course of processing. The data generated in the course of processing include the operation states of the switches of the input section 17, sensor values received via the I/F 13, and the sounding states (sounding flags) of musical sounds.
The display section 16 has, for example, a liquid crystal display device (not shown) and can display the selected timbre, the contents of a main-area/timbre table that associates main areas (described later) with timbres of musical sounds, the contents of a sub-area/pitch table that associates sub-areas with pitches of musical sounds, and the like. The input section 17 has switches (not shown) and allows, for example, a timbre to be designated.
The sound system 18 includes a sound source section 31, an audio circuit 32, and a speaker 35. In accordance with instructions from the CPU 12, the sound source section 31 reads waveform data from the waveform data area of the ROM 14, and generates and outputs musical sound data. The audio circuit 32 converts the musical sound data output from the sound source section 31 into an analog signal, amplifies the converted analog signal, and outputs it to the speaker 35. A musical sound is thereby output from the speaker 35.
Fig. 2 is a block diagram showing the configuration of the music performance apparatus body according to the present embodiment. As shown in Fig. 2, the music performance apparatus body 11 has the geomagnetic sensor 22 and the acceleration sensor 23 at its tip, i.e., the side opposite to the root side held by the player. The position of the geomagnetic sensor 22 is not limited to the tip; it may also be arranged on the root side. However, a player usually swings the music performance apparatus body 11 with the tip position as the reference (that is, while watching the tip). Therefore, in order to obtain positional information of the tip of the music performance apparatus body 11, the geomagnetic sensor 22 is preferably located at the tip. The acceleration sensor 23 is likewise preferably arranged at the tip of the music performance apparatus body 11, particularly so that large changes in acceleration appear clearly.
The geomagnetic sensor 22 is a 3-axis geomagnetic sensor having magnetoresistive elements or Hall elements, and can detect the magnetic field components in the x, y, and z directions. In the present embodiment, therefore, positional information (coordinate values) of the music performance apparatus body 11 can be obtained based on the sensor values of the 3-axis geomagnetic sensor. The acceleration sensor 23 is, for example, a capacitance-type or piezoresistive sensor, and can output data values representing the acceleration that occurs. The acceleration sensor 23 according to the present embodiment can obtain, for example, acceleration values (components) in three axial directions: the long axis of the music performance apparatus body 11 and two axes perpendicular to the long axis. The movement amount of the music performance apparatus body 11 can be calculated from the components in the three axial directions obtained from the acceleration sensor. Furthermore, the sound-generation timing of a musical sound can be determined from the component in the long-axis direction of the music performance apparatus body 11.
The music performance apparatus body 11 also has a CPU 21, the infrared communication device 24, a ROM 25, a RAM 26, an interface (I/F) 27, and an input section 28. The CPU 21 performs processing such as acquisition of the sensor values in the music performance apparatus body 11; acquisition of positional information based on the sensor values of the geomagnetic sensor 22 and the acceleration sensor 23; setting of main areas, which are areas in which musical sounds can be generated; setting of sub-areas, which are areas within a main area that designate the pitches of the musical sounds to be generated; detection of the sound-generation timing of musical sounds based on the sensor values of the acceleration sensor 23 (acceleration sensor values); generation of note-on events; and control of the transmission of note-on events via the I/F 27 and the infrared communication device 24.
The ROM 25 stores processing programs for acquisition of the sensor values in the music performance apparatus body 11; acquisition of positional information based on the sensor values of the geomagnetic sensor 22 and the acceleration sensor 23; setting of main areas in which musical sounds can be generated; setting of sub-areas, within a main area, that designate the pitches of the musical sounds to be generated; detection of the sound-generation timing of musical sounds based on the acceleration sensor values; generation of note-on events; and control of the transmission of note-on events via the I/F 27 and the infrared communication device 24. The RAM 26 stores values obtained or generated in the course of processing, such as sensor values. The I/F 27 outputs data to the infrared communication device 24 in accordance with instructions from the CPU 21. The input section 28 has switches (not shown).
Fig. 3 is a flowchart showing an example of the processing performed in the music performance apparatus body according to the present embodiment. As shown in Fig. 3, the CPU 21 of the music performance apparatus body 11 performs initialization processing including clearing the data and flags in the RAM 26 (step 301). In the initialization processing, the timer interrupt is released. When the timer interrupt is released, the CPU 21 reads the sensor values of the geomagnetic sensor 22 and the acceleration sensor 23 in the music performance apparatus body 11 at predetermined time intervals and stores them in the RAM 26. In the initialization processing, the initial position of the music performance apparatus body 11 is also obtained based on the initial values of the geomagnetic sensor 22 and the acceleration sensor 23, and is likewise stored in the RAM 26. The current position obtained in the current-position acquisition processing (step 304) described below is a position relative to this initial position. After the initialization processing, steps 302 to 310 are executed repeatedly.
The CPU 21 acquires the sensor value of the acceleration sensor 23 (the acceleration sensor value) obtained by the interrupt processing and stores it in the RAM 26 (step 302). The CPU 21 also acquires the sensor value of the geomagnetic sensor 22 (the geomagnetic sensor value) obtained by the interrupt processing (step 303).
Next, the CPU 21 performs current-position acquisition processing (step 304). Fig. 4 is a flowchart showing an example of the current-position acquisition processing according to the present embodiment. As shown in Fig. 4, the CPU 21 calculates the moving direction of the music performance apparatus body 11 based on the geomagnetic sensor value stored in the RAM 26 that was obtained in the previously executed step 303 and the geomagnetic sensor value obtained in the step 303 executed this time (step 401). As described above, the geomagnetic sensor 22 according to the present embodiment is a 3-axis geomagnetic sensor, so the direction can be obtained from the three-dimensional vector formed by the differences of the x, y, and z components.
The CPU 21 also calculates the movement amount of the music performance apparatus body 11 based on the acceleration sensor value stored in the RAM 26 that was obtained in the previously executed step 302 and the acceleration sensor value obtained in the step 302 executed this time (step 402). The movement amount can be obtained by integrating twice, using the difference (time interval) between the times at which the respective acceleration sensor values were obtained. Then, based on the previous positional information stored in the RAM 26 and the moving direction and movement amount obtained in steps 401 and 402, the CPU 21 calculates the coordinates of the current position (step 403). The CPU 21 determines whether the calculated coordinates have changed from the previous position coordinates (step 404). If the determination in step 404 is Yes, the CPU 21 stores the coordinates of the calculated current position in the RAM 26 as new positional information (step 405).
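As a rough illustration of this dead-reckoning step, the following sketch updates a position estimate from successive geomagnetic and acceleration readings. The function name, the simple double integration, and the fixed sampling interval are assumptions made for illustration; a real implementation would also remove gravity from the accelerometer output and filter the sensor noise.

```python
import numpy as np

DT = 0.01  # assumed sampling interval of the timer interrupt, in seconds

def update_position(prev_pos, prev_mag, cur_mag, prev_acc, cur_acc, prev_speed):
    """Sketch of steps 401-405: direction from the geomagnetic difference,
    movement amount from a crude double integration of acceleration."""
    # Step 401: moving direction as the unit vector of the field difference
    diff = np.asarray(cur_mag, float) - np.asarray(prev_mag, float)
    norm = np.linalg.norm(diff)
    direction = diff / norm if norm > 0 else np.zeros(3)

    # Step 402: movement amount by integrating acceleration twice over DT
    mean_acc = (np.linalg.norm(prev_acc) + np.linalg.norm(cur_acc)) / 2.0
    speed = prev_speed + mean_acc * DT
    distance = prev_speed * DT + 0.5 * mean_acc * DT ** 2

    # Step 403: new coordinates relative to the previous position
    new_pos = np.asarray(prev_pos, float) + direction * distance

    # Steps 404-405: keep the previous coordinates if nothing has changed
    if np.allclose(new_pos, prev_pos):
        return np.asarray(prev_pos, float), speed
    return new_pos, speed
```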
After the current-position acquisition processing (step 304), the CPU 21 performs main-area setting processing (step 305). In the present embodiment, the player uses the music performance apparatus body 11 to designate vertices, and the area delimited by the plane obtained by projecting the two-dimensional plane delimited by those vertices onto the ground surface and by the vertical lines extending from the vertices of that plane becomes a main area. In this way, a main area in which the music performance apparatus body 11 can be used to produce musical sounds is delimited. Furthermore, as described later, sub-areas associated with the pitches of the musical sounds to be actually generated are set within the main area, and actual musical sounds are produced there. The following describes the case where a main area is set using four vertices. Fig. 5 is a flowchart showing an example of the main-area setting processing according to the present embodiment.
As shown in Fig. 5, the CPU 21 determines whether the main-area setting switch in the input section 28 has been turned on (step 501). If the determination in step 501 is Yes, the CPU 21 acquires the positional information stored in the RAM 26 and stores it in the RAM 26 as the coordinates of a vertex (vertex coordinates) (step 502). The CPU 21 then increments a parameter N in the RAM 26 representing the number of vertices (step 503). In the present embodiment, the parameter N is initialized to 0 in the initialization processing (step 301 of Fig. 3). The CPU 21 then determines whether the parameter N is larger than 4 (step 504). If the determination in step 504 is No, the main-area setting processing ends.
A Yes determination in step 504 means that four vertex coordinates have been stored in the RAM 26. Therefore, if the determination in step 504 is Yes, the CPU 21 obtains information on the two-dimensional plane (quadrilateral) delimited by the four vertex coordinates (step 505). Then, based on the information representing the obtained quadrilateral, the CPU 21 obtains the vertex positions of the quadrilateral obtained by projecting this quadrilateral onto the ground surface, and stores them in the main-area/timbre table in the RAM 26 as the information of the main area (step 506). The CPU 21 then initializes the parameter N in the RAM 26 to 0 and sets a main-area setting flag to 1 (step 507).
In the present embodiment, the player designates vertices, so that a main area based on the plane delimited by those vertices can be set. In the embodiment described above, a plane with four vertices (a quadrilateral) is set as the main area, but by changing the number of vertices, a main area with an arbitrary polygonal shape such as a triangle can be set.
Fig. 7 is a diagram schematically showing how a main area is determined according to the present embodiment. Reference numerals 71 to 74 each indicate the music performance apparatus body at the moment the player turned on the main-area setting switch. The tip positions of the music performance apparatus body at reference numerals 71 to 74 are, respectively:
P1 (reference numeral 71): (x1, y1, z1)
P2 (reference numeral 72): (x2, y2, z2)
P3 (reference numeral 73): (x3, y3, z3)
P4 (reference numeral 74): (x4, y4, z4)
The plane obtained by connecting these four coordinates with straight lines is indicated by reference numeral 700.
In the plane 701 obtained by projecting the plane 700 onto the ground surface (z coordinate = z0), the vertex coordinates are:
(x1, y1, z0)
(x2, y2, z0)
(x3, y3, z0)
(x4, y4, z0)
In the present embodiment, the space 710 delimited by the plane defined by the four coordinates (x1, y1, z0), (x2, y2, z0), (x3, y3, z0), (x4, y4, z0) and by the vertical lines 75 to 78 extending from these four coordinates is set as the main area. As described later, when the music performance apparatus body 11 is located within this space 710, a musical sound can be produced by swinging the music performance apparatus body 11. Other ways of setting the shape and extent of the area are also possible; these are described later.
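Because the main area is a prism whose walls are vertical, checking whether the apparatus is inside it reduces to a two-dimensional point-in-polygon test on the ground-projected vertices. The following sketch uses a standard ray-casting test; the function and variable names are illustrative and not taken from the patent.

```python
def inside_main_area(pos, ground_vertices):
    """Return True if the (x, y, z) position 'pos' lies inside the vertical prism
    whose ground-surface cross-section is the polygon 'ground_vertices'
    (a list of (x, y) tuples).  The z coordinate is ignored, as in Fig. 8."""
    x, y = pos[0], pos[1]
    inside = False
    n = len(ground_vertices)
    for i in range(n):
        x1, y1 = ground_vertices[i]
        x2, y2 = ground_vertices[(i + 1) % n]
        # Ray casting: count how often a horizontal ray from (x, y) crosses an edge
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Example: a projected quadrilateral like plane 701, with invented coordinates
area_701 = [(0.0, 0.0), (2.0, 0.0), (2.0, 1.5), (0.0, 1.5)]
print(inside_main_area((1.0, 0.5, 1.2), area_701))  # True: inside the prism
```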
When the main-area setting processing (step 305) ends, the CPU 21 performs timbre setting processing (step 306). Fig. 6 is a flowchart showing an example of the timbre setting processing according to the present embodiment. As shown in Fig. 6, the CPU 21 determines whether the main-area setting flag is 1 (step 601). If the determination in step 601 is No, the timbre setting processing ends.
If the determination in step 601 is Yes, the CPU 21 determines whether the timbre confirmation switch has been turned on (step 602). If the determination in step 602 is Yes, the CPU 21 generates a note-on event containing timbre information based on a parameter TN representing a timbre (step 603). The parameter TN is, for example, a timbre number that uniquely identifies a timbre. In this note-on event, the information representing the volume level and the pitch may be predetermined. The CPU 21 then outputs the generated note-on event to the I/F 27 (step 604). The I/F 27 causes the infrared communication device 24 to transmit the note-on event as an infrared signal. The infrared signal from the infrared communication device 24 is received by the infrared communication device 33 of the instrument section 19. As a result, a musical sound of the predetermined pitch is generated in the instrument section 19. Sound generation in the instrument section 19 is described later.
After step 604, the CPU 21 determines whether the confirm switch has been turned on (step 605). If the determination in step 605 is No, the CPU 21 increments the parameter TN representing the timbre (step 606) and returns to step 602. If the determination in step 605 is Yes, the CPU 21 associates the timbre information indicated by the parameter TN with the information of the main area and stores it in the main-area/timbre table in the RAM 26 (step 607). The CPU 21 then resets the main-area setting flag to 0 (step 608).
Fig. 8 is a diagram showing an example of the main-area/timbre table in the RAM according to the present embodiment. As shown in Fig. 8, each record of the main-area/timbre table 800 according to the present embodiment (see, for example, reference numeral 801) has the items main area ID, vertex position coordinates (vertex 1 to vertex 4), and timbre. The main area ID uniquely identifies the record and is a number chosen by the CPU 21 when the record of the main-area/timbre table 800 is created. In the present embodiment, timbres of instruments whose pitch can be specified are set. In the example of Fig. 8, timbres of pitched percussion instruments such as the vibraphone, xylophone, and timpani are set. Of course, timbres of instruments other than percussion instruments (keyboard instruments, stringed instruments, wind instruments, and so on) may also be set.
The coordinates of the vertex positions are stored as two-dimensional coordinates (x, y) in the x and y directions. This is because, as described above, the main area according to the present embodiment is a three-dimensional space delimited by a plane on the ground surface based on, for example, four vertices and by the vertical lines 75 to 78 extending from those four vertices, so the z coordinate is arbitrary.
When the timbre setting processing (step 306) ends, the CPU 21 performs sub-area setting processing (step 307). Fig. 9 is a flowchart showing an example of the sub-area setting processing according to the present embodiment. As shown in Fig. 9, the CPU 21 acquires the positional information of the music performance apparatus body 11 stored in the RAM 26 (step 901) and determines whether the music performance apparatus body 11 is located within one of the main areas (step 902). If the determination in step 902 is No, the sub-area setting processing ends.
If the determination in step 902 is Yes, the CPU 21 determines whether the center setting switch in the input section 28 of the music performance apparatus body 11 has been turned on (step 903). If the determination in step 903 is No, the sub-area setting processing ends. If the determination in step 903 is Yes, the CPU 21 determines whether the center setting switch has been newly turned on (step 904). If the determination in step 904 is Yes, the CPU 21 stores the positional information of the music performance apparatus body 11 in the RAM 26 as the positional information of a center C (coordinates (xc, yc, zc)) (step 905). This position becomes the reference position of the sub-area to be set.
If the determination in step 904 is No, i.e., the switch has remained on, or after step 905 has been executed, the CPU 21 determines whether the center setting switch has been turned off (step 906). If the determination in step 906 is No, the sub-area setting processing ends. If the determination in step 906 is Yes, the CPU 21 stores in the RAM 26 the positional information of the music performance apparatus body 11 as the positional information of a position P (coordinates (xp, yp, zp)), i.e., the position of the music performance apparatus body 11 at the time the center setting switch was turned off (step 907). In step 907, the CPU 21 also calculates the distance d between the position C and the position P. The CPU 21 determines as a sub-area the area (a disk: a circular plane) centered at the position C with radius d passing through the position P (step 908), and stores the information for specifying the sub-area (the coordinates of the center C, the coordinates of the position P (also called the "pass-through position"), and the radius d) in the sub-area/pitch table in the RAM 26 (step 909). The information stored in the sub-area/pitch table in step 909 also includes the main area ID identifying the main area to which the sub-area belongs and a sub-area ID identifying the sub-area. The CPU 21 then sets a sub-area setting flag in the RAM 26 to 1 (step 910).
As described above, in the first embodiment the player turns on the center setting switch of the music performance apparatus body 11 at the position to be set as the center C, moves, while keeping the switch on, to a position corresponding to the desired radius, and turns off the center setting switch at that position. In this way, a circular plane with radius d (d: the distance between the center C and the position P), centered at the position C where the center setting switch was turned on and passing through the position P where the center setting switch was turned off, is set as a sub-area.
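A minimal sketch of this two-press procedure is shown below: the first call records the center C, the second records the pass-through position P and derives the radius. The class and field names are assumptions made for illustration.

```python
import math

class SubAreaSetter:
    """Sketch of steps 903-910: build a circular sub-area from the position
    where the center switch is pressed and the position where it is released."""
    def __init__(self):
        self.center = None

    def switch_pressed(self, position):
        # Step 905: the current tip position becomes the center C
        self.center = tuple(position)

    def switch_released(self, position, main_area_id, sub_area_id):
        # Steps 907-909: the release position P fixes the radius d = |C - P|
        p = tuple(position)
        d = math.dist(self.center, p)
        return {
            "main_area_id": main_area_id,
            "sub_area_id": sub_area_id,
            "center": self.center,
            "pass_through": p,
            "radius": d,
        }

setter = SubAreaSetter()
setter.switch_pressed((1.0, 0.5, 1.2))
record = setter.switch_released((1.3, 0.5, 1.2), main_area_id=0, sub_area_id=0)
print(record["radius"])  # 0.3: the distance between C and P
```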
Fig. 11 is a diagram schematically showing how a sub-area is determined according to the present embodiment. Reference numeral 100 indicates the music performance apparatus body at the moment the center setting switch is turned on, and reference numeral 101 indicates the music performance apparatus body at the moment the center setting switch is turned off. For convenience, Fig. 11 assumes that the player moves the music performance apparatus body horizontally and that the music performance apparatus body is viewed from above.
When the player turns on the center setting switch of the music performance apparatus body, the coordinates (xc, yc, zc) of the tip position of the music performance apparatus body 100 are stored in the RAM 26 as the center C. If the player then moves the music performance apparatus body 11 while keeping the center setting switch on and turns the switch off at a desired position, the coordinates (xp, yp, zp) of the position P are obtained as the tip position of the music performance apparatus body 101, and the distance d between the center C and the position P is calculated. As a result, the plane 1000 of a circle centered at the center C with radius d passing through the position P is set as a sub-area. As described later, a musical sound is produced when the tip (geomagnetic sensor 22) of the music performance apparatus body 11 is located within or passes through this sub-area.
In the example of Fig. 11, the player moves the music performance apparatus body 11 horizontally, so the circular plane is oriented parallel to the ground surface. This is not a limitation, however, and the circular plane set by the player may be oriented at an arbitrary angle with respect to the ground surface. Other methods of setting the area are also conceivable.
When the sub-area setting processing (step 307) ends, the CPU 21 performs pitch setting processing (step 308). Fig. 10 is a flowchart showing an example of the pitch setting processing according to the present embodiment. The input section 28 has a pitch confirmation switch and a confirm switch for designating a pitch; the other switches described above may also serve this purpose by switching their function. The parameter representing the pitch used in the pitch setting processing of Fig. 10 (for example, pitch information based on MIDI) is set to an initial value (for example, a very low pitch) in the initialization processing (step 301). As shown in Fig. 10, the CPU 21 determines whether the sub-area setting flag is 1 (step 1001). If the determination in step 1001 is No, the pitch setting processing ends.
If the determination in step 1001 is Yes, the CPU 21 determines whether the pitch confirmation switch has been turned on (step 1002). If the determination in step 1002 is Yes, the CPU 21 generates a note-on event containing pitch information based on a parameter NN representing a pitch (step 1003). In this note-on event, the information representing the volume level and the timbre may be predetermined. The CPU 21 then outputs the generated note-on event to the I/F 27 (step 1004). The I/F 27 causes the infrared communication device 24 to transmit the note-on event as an infrared signal. The infrared signal from the infrared communication device 24 is received by the infrared communication device 33 of the instrument section 19. As a result, a musical sound of the corresponding pitch is generated in the instrument section 19.
After step 1004, the CPU 21 determines whether the confirm switch has been turned on (step 1005). If the determination in step 1005 is No, the CPU 21 increments the parameter NN representing the pitch (step 1006) and returns to step 1002. If the determination in step 1005 is Yes, the CPU 21 associates the pitch information indicated by the parameter NN with the information of the sub-area and stores it in the sub-area/pitch table in the RAM 26 (step 1007). The CPU 21 then resets the sub-area setting flag to 0 (step 1008).
In the pitch setting processing shown in Fig. 10, each time the pitch confirmation switch is turned on, a musical sound one pitch higher than the previous one is generated. By turning on the confirm switch while the musical sound of the desired pitch is being generated, the player can associate the desired pitch with the sub-area. Fig. 12 is a diagram showing an example of the sub-area/pitch table in the RAM according to the present embodiment. As shown in Fig. 12, each record of the sub-area/pitch table 1200 according to the present embodiment (see, for example, reference numeral 1201) has the items main area ID, sub-area ID, coordinates of the center C, coordinates of the pass-through position P, radius d, and pitch.
The main area ID identifies the main area in which the sub-area is located. The sub-area ID is a number chosen by the CPU 21 when the record of the sub-area/pitch table 1200 is created.
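The two tables can be pictured as simple record lists, as in the sketch below. The field names mirror the items of Figs. 8 and 12, while the concrete values are invented for illustration.

```python
# Main-area/timbre table (Fig. 8): one record per main area.
main_area_table = [
    {
        "main_area_id": 0,
        "vertices_xy": [(0.0, 0.0), (2.0, 0.0), (2.0, 1.5), (0.0, 1.5)],
        "timbre": "vibraphone",
    },
]

# Sub-area/pitch table (Fig. 12): one record per sub-area, keyed to its main area.
sub_area_table = [
    {"main_area_id": 0, "sub_area_id": 0,
     "center": (0.5, 0.4, 1.2), "pass_through": (0.7, 0.4, 1.2),
     "radius": 0.2, "pitch": "C3"},
    {"main_area_id": 0, "sub_area_id": 1,
     "center": (1.0, 0.4, 1.2), "pass_through": (1.2, 0.4, 1.2),
     "radius": 0.2, "pitch": "D3"},
]

def sub_areas_of(main_area_id):
    """The group of records that the timing detection of Fig. 13 iterates over."""
    return [r for r in sub_area_table if r["main_area_id"] == main_area_id]
```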
When the pitch setting processing (step 308) ends, the CPU 21 performs sound-generation timing detection processing (step 309). Fig. 13 is a flowchart showing an example of the sound-generation timing detection processing according to the present embodiment. The CPU 21 acquires the positional information stored in the RAM 26 (step 1301) and, referring to the main-area/timbre table in the RAM 26, determines whether the music performance apparatus body 11 is located within a main area (step 1302). If the determination in step 1302 is No, the maximum acceleration sensor value stored in the RAM 26 is reset to 0 (step 1303).
If the determination in step 1302 is Yes, the CPU 21 determines whether the acceleration sensor value obtained in step 302 is larger than a predetermined value α (step 1304). The predetermined value α is an arbitrary value larger than 0 that allows the swinging of the music performance apparatus body 11 by the player to be detected. If the determination in step 1304 is No, processing proceeds to step 1307. If the determination in step 1304 is Yes, the CPU 21 determines whether the acceleration sensor value is larger than the maximum value stored in the RAM 26 (step 1305). If the determination in step 1305 is No, processing proceeds to step 1307.
If the determination in step 1305 is Yes, the CPU 21 stores the acquired acceleration sensor value in the RAM 26 as the new maximum value (step 1306). The CPU 21 then determines whether the position of the music performance apparatus body 11 touches or passes through a sub-area (step 1307). In step 1307, the CPU 21 identifies, in the sub-area/pitch table, the group of records having the main area ID of the main area determined in step 1302 and, referring to the coordinates of the center C, the coordinates of the pass-through position P, and the radius in each of those records, obtains the information specifying the circular plane that defines each sub-area. The CPU 21 then determines whether the position of the music performance apparatus body 11 stored in the RAM 26 and obtained from the geomagnetic sensor 22 and the like is in contact with the plane of a sub-area, or whether the trajectory of the music performance apparatus body 11 obtained from the coordinates of the previous cycle and the coordinates of the current cycle intersects the plane of a sub-area. If the determination in step 1307 is No, the sound-generation timing detection processing ends.
If the determination in step 1307 is Yes, the CPU 21 determines whether the sounding state stored in the RAM 26 in association with this sub-area is "muted" (step 1308). If the determination in step 1308 is Yes, the CPU 21 performs note-on event processing (step 1309). In the present embodiment, a sounding state is stored in the RAM 26 for each sub-area, indicating whether the timbre associated with that sub-area is currently sounding in the sound source section 31 of the instrument section 19 (sounding state = sounding) or has been muted (sounding state = muted).
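Putting steps 1301 to 1309 together, the detection amounts to: remember the largest acceleration while inside the main area, and fire when the tip touches or crosses a sub-area that is not already sounding. The sketch below assumes a helper such as the `inside_main_area` function sketched earlier and a hypothetical `crosses_sub_area` test for the circular plane; it is an illustration, not the device's code.

```python
class TimingDetector:
    """Sketch of the sound-generation timing detection of Fig. 13."""
    def __init__(self, alpha):
        self.alpha = alpha      # threshold for detecting a swing (step 1304)
        self.max_acc = 0.0      # maximum acceleration while inside the main area

    def step(self, pos, prev_pos, acc, main_area, sub_areas, sounding):
        if not inside_main_area(pos, main_area["vertices_xy"]):
            self.max_acc = 0.0                           # step 1303
            return None
        if acc > self.alpha and acc > self.max_acc:      # steps 1304-1306
            self.max_acc = acc
        for sub in sub_areas:                            # step 1307
            if crosses_sub_area(prev_pos, pos, sub):
                if not sounding.get(sub["sub_area_id"], False):  # step 1308
                    return sub                           # trigger note-on (step 1309)
        return None
```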
Fig. 14 is a flowchart showing an example of the note-on event generation processing according to the present embodiment. As shown in Fig. 14, the CPU 21 determines the volume level (velocity) based on the maximum acceleration sensor value stored in the RAM 26 (step 1401).
If the maximum acceleration sensor value is Amax and the maximum volume level (velocity) is Vmax, the volume level Vel can be obtained, for example, as follows:
Vel = a · Amax
(where Vel = Vmax if a · Amax > Vmax, and a is a predetermined positive coefficient)
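In code form, this clamped linear mapping from maximum acceleration to velocity is a one-liner. The coefficient 2.0 and the MIDI-style maximum of 127 are example values, not values from the patent.

```python
def velocity(a_max, a=2.0, v_max=127):
    """Vel = a * Amax, clamped to Vmax (MIDI velocities run 0-127)."""
    return min(a * a_max, v_max)

print(velocity(40.0))   # 80
print(velocity(100.0))  # 127: clamped to Vmax
```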
Next, the CPU 21 refers to the main-area/timbre table in the RAM 26, identifies the record of the main area in which the music performance apparatus body 11 is located, and determines the timbre in the identified record as the timbre of the musical sound to be generated (step 1402). The CPU 21 also refers to the sub-area/pitch table in the RAM 26, identifies, within the main area in which the music performance apparatus body 11 is located, the record of the sub-area that the music performance apparatus body touches or passes through, and determines the pitch in the identified record as the pitch of the musical sound to be generated (step 1403). The CPU 21 then generates a note-on event containing the determined volume level (velocity), timbre, and pitch (step 1404).
The CPU 21 outputs the generated note-on event to the I/F 27 (step 1405). The I/F 27 causes the infrared communication device 24 to transmit the note-on event as an infrared signal. The infrared signal from the infrared communication device 24 is received by the infrared communication device 33 of the instrument section 19. The CPU 21 then changes the sounding state in the RAM 26 to "sounding" (step 1406).
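The note-on event itself is just the triple of velocity, timbre, and pitch looked up from the two tables. A sketch under the data structures assumed above follows; `velocity` is the helper sketched earlier and `send_infrared` is a hypothetical stand-in for the transmission via the I/F 27.

```python
def generate_note_on(max_acc, main_area, sub_area, sounding):
    """Sketch of Fig. 14: build and send a note-on event, then mark the
    sub-area as sounding."""
    event = {
        "velocity": velocity(max_acc),        # step 1401
        "timbre": main_area["timbre"],        # step 1402
        "pitch": sub_area["pitch"],           # step 1403
    }
    send_infrared(event)                      # steps 1404-1405
    sounding[sub_area["sub_area_id"]] = True  # step 1406
    return event
```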
When the sound-generation timing detection processing (step 309) ends, the CPU 21 performs parameter communication processing (step 310). The parameter communication processing (step 310) is described together with the parameter communication processing of the instrument section 19 described later.
Fig. 15 is a diagram schematically showing an example of the sub-areas and corresponding pitches set in the sub-area setting processing and pitch setting processing of the music performance apparatus body according to the present embodiment. In this example, sub-areas (reference numerals 150 to 153) are set in the main area (reference numeral 1500) delimited by the projection plane 1510 and the vertical lines extending from its vertices. The sub-area IDs of the sub-areas 150 to 153 are 0 to 3, respectively, and the pitches C3 to F3 are assigned to the sub-areas 150 to 153, respectively. This information is stored in the sub-area/pitch table in the RAM 26.
For example, when the player swings the music performance apparatus body downward (reference numeral 1501) and the tip of the music performance apparatus body (reference numeral 1502) passes through the sub-area 150, a musical sound with the pitch C3 is generated. Similarly, when the player swings the music performance apparatus body back up (reference numeral 1503) and the tip of the music performance apparatus body (reference numeral 1504) passes through the sub-area 151, a musical sound with the pitch D3 is generated. The timbre is associated with the main area 1500.
Fig. 16 is a flowchart showing an example of the processing performed in the instrument section according to the present embodiment. The CPU 12 of the instrument section 19 performs initialization processing including clearing the data in the RAM 15, clearing the screen of the display section 16, and clearing the sound source section 31 (step 1601). The CPU 12 then performs switch processing (step 1602). In the switch processing, for example, the CPU 12 sets parameters of effect sounds for the musical sounds to be generated, in accordance with switch operations on the input section 17. The set effect-sound parameters (for example, the depth of reverb) are stored in the RAM 15. In the switch processing, the main-area/timbre table and the sub-area/pitch table that were transmitted from the music performance apparatus body 11 by the parameter communication processing described later and stored in the RAM 15 of the instrument section 19 may also be edited by switch operations. In this editing, the vertex positions specifying a main area may be corrected and the timbre changed, or the position and size of a sub-area may be corrected and the pitch changed.
Next, the CPU 12 determines whether the I/F 13 has newly received a note-on event (step 1603). If the determination in step 1603 is Yes, the CPU 12 performs sound-generation processing (step 1604). In the sound-generation processing, the CPU 12 outputs the received note-on event to the sound source section 31. The sound source section 31 reads the waveform data from the ROM in accordance with the timbre contained in the note-on event, at a read-out speed that depends on the pitch contained in the note-on event. The sound source section 31 multiplies the read waveform data by a coefficient based on the volume data (velocity) contained in the note-on event and generates musical sound data of the specified volume level. The generated musical sound data are output to the audio circuit 32, and finally a musical sound of the specified kind is produced from the speaker 35.
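Conceptually, the sound source section plays back the stored waveform at a rate set by the pitch and scales it by the velocity, roughly as in the following sketch. The resampling method, the pitch convention, and the velocity scaling factor are simplifications assumed for illustration, not the device's actual synthesis code.

```python
import numpy as np

def render_note(waveform, note_pitch, base_pitch, vel, v_max=127):
    """Resample 'waveform' (recorded at MIDI pitch 'base_pitch') to 'note_pitch'
    and scale it by the note-on velocity, as the sound source section does."""
    rate = 2.0 ** ((note_pitch - base_pitch) / 12.0)   # read-out speed from pitch
    idx = np.arange(0, len(waveform) - 1, rate)        # faster read = higher pitch
    resampled = np.interp(idx, np.arange(len(waveform)), waveform)
    return resampled * (vel / v_max)                   # volume from velocity

# Example: shift a stored C3 waveform up two semitones (to D3) at velocity 80
tone = np.sin(2 * np.pi * 130.81 * np.arange(0, 0.5, 1 / 44100.0))
out = render_note(tone, note_pitch=50, base_pitch=48, vel=80)
```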
The CPU 12 then performs parameter communication processing (step 1605). In the parameter communication processing (step 1605), in accordance with instructions from the CPU 12, data such as the main-area/timbre table and the sub-area/pitch table edited in the switch processing (step 1602) are transmitted to the music performance apparatus body 11. In the music performance apparatus body 11, when the infrared communication device 24 receives the data, the CPU 21 receives the data via the I/F 27 and stores them in the RAM 26 (step 310 of Fig. 3).
In step 310 of Fig. 3, the CPU 21 of the music performance apparatus body 11 also performs parameter communication processing. In the parameter communication processing of the music performance apparatus body 11, the data of the main-area/timbre table and the sub-area/pitch table, which were generated based on the main areas, timbres, sub-areas, and pitches set in steps 305 to 308 and stored in the RAM 26, are transmitted to the instrument section 19.
When the parameter communication processing (step 1605) of the instrument section 19 ends, the CPU 12 performs other processing, for example, updating the image displayed on the screen of the display section 16 (step 1606).
According to the first embodiment, when the position of the music performance apparatus body 11 is within a main area and within a sub-area, this is taken as the sound-generation timing, and the CPU 21 generates a note-on event using the timbre stored in the main-area/timbre table in association with the main area and the pitch stored in the sub-area/pitch table in association with the sub-area. Thus, by operating the music performance apparatus body 11 toward the specified sub-area within the main area set for the specified timbre, the player can generate a musical sound with the desired timbre and pitch.
Furthermore, in the present embodiment, based on the positional information of three or more vertices, the CPU 21 determines as the main area the space delimited by the plane obtained by projecting the plane connecting the vertices onto the ground surface, which serves as the bottom surface, and by the vertical lines extending from its vertices. In this way, by designating vertices, the player can set a main area based on the plane connecting those vertices. In the embodiment described above, a plane with four vertices (a quadrilateral) is set as the main area, but by changing the number of vertices, a main area with an arbitrary bottom shape such as a triangle can be set.
Furthermore, in the present embodiment, based on the positional information of a designated center and the positional information of another position different from the center, the CPU 21 determines as a sub-area the circular plane centered at the center and passing through the other position. Thus, by designating the center position and the other position (the pass-through position), a circular two-dimensional plane can be set as a sub-area.
Furthermore, in the present embodiment, the music performance apparatus body 11 has the geomagnetic sensor 22 and the acceleration sensor 23; the CPU 21 detects the moving direction of the music performance apparatus body 11 based on the sensor values of the geomagnetic sensor 22, and calculates the movement amount of the music performance apparatus body 11 based on the sensor values of the acceleration sensor 23. The current position of the music performance apparatus body 11 is obtained from the moving direction and the movement amount. The position of the music performance apparatus body 11 can therefore be obtained without using large-scale equipment and without performing complex calculations.
Next, a second embodiment of the present invention is described. In the first embodiment, the area determined by the plane obtained by projecting a polygon connecting a plurality of vertices onto the ground surface and by the vertical lines extending from the vertices is set as the main area, and the circular plane area determined by a center and a radius is set as a sub-area. However, the shapes of the areas are not limited to these.
For example, in the second embodiment, a polygonal plane area connecting a plurality of vertices is set as a sub-area. Fig. 17 is a diagram showing an example of the sub-area setting processing according to the second embodiment. In the second embodiment, the processing other than the sub-area setting processing (for example, the main-area setting processing, timbre setting processing, pitch setting processing, and sound-generation timing detection processing) is the same as in the first embodiment.
As shown in Fig. 17, the CPU 21 acquires the positional information of the music performance apparatus body 11 stored in the RAM 26 (step 1701) and determines whether the music performance apparatus body 11 is located within one of the main areas (step 1702). If the determination in step 1702 is No, the sub-area setting processing ends.
If the determination in step 1702 is Yes, the CPU 21 determines whether the sub-area setting switch in the input section 28 has been turned on (step 1703). If the determination in step 1703 is Yes, the CPU 21 acquires the positional information stored in the RAM 26 and stores it in the RAM 26 as the coordinates of a vertex (vertex coordinates) (step 1704). The CPU 21 then increments a parameter M in the RAM 26 representing the number of vertices (step 1705). As in the first embodiment, the parameter M is initialized to 0 in the initialization processing (step 301 of Fig. 3). The CPU 21 then determines whether the parameter M is larger than 4 (step 1706). If the determination in step 1706 is No, the sub-area setting processing ends.
A Yes determination in step 1706 means that four vertex coordinates have been stored in the RAM 26. Therefore, if the determination in step 1706 is Yes, the CPU 21 obtains the information of the two-dimensional plane (quadrilateral) delimited by the four vertex coordinates (step 1707). The CPU 21 then stores the information of the obtained two-dimensional plane in the sub-area/pitch table in the RAM 26 (step 1708). The CPU 21 then initializes the parameter M in the RAM 26 to 0 and sets the sub-area setting flag to 1 (step 1709).
In this way, according to the second embodiment, a sub-area can be set as a two-dimensional plane delimited by a plurality of vertices (four vertices in the present embodiment). For each sub-area, the desired pitch can be set by the pitch setting processing, as in the first embodiment. Fig. 18 is a diagram schematically showing an example of the sub-areas and corresponding pitches set in the sub-area setting processing and pitch setting processing of the music performance apparatus body according to the present embodiment. In this example, a quadrilateral delimited by four vertices is set as each sub-area in the sub-area setting processing. Fig. 18 shows six quadrilateral sub-areas 180 to 185 in the main area (reference numeral 1800) delimited by the projection plane 1810 and the vertical lines extending from its vertices. The area IDs of the sub-areas 180 to 185 are 0 to 5, respectively, and the pitches C3 to A3 are assigned to the sub-areas 180 to 185. This information is stored in the sub-area/pitch table in the RAM 26. For example, when the player swings the music performance apparatus body downward (reference numeral 1801) and the tip of the music performance apparatus body (reference numeral 1802) passes through the sub-area 182, a musical sound with the pitch E3 is generated.
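For a quadrilateral sub-area, the pass-through test of step 1307 becomes: does the segment between the previous and current tip positions cross the plane of the quadrilateral inside its boundary? A geometric sketch follows; the names are illustrative and the inside test assumes a convex quadrilateral.

```python
import numpy as np

def segment_crosses_quad(p0, p1, quad):
    """True if the segment p0->p1 crosses the (convex) quadrilateral 'quad',
    given as four 3-D vertices in order."""
    q = [np.asarray(v, float) for v in quad]
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    normal = np.cross(q[1] - q[0], q[2] - q[0])
    d0 = np.dot(normal, p0 - q[0])
    d1 = np.dot(normal, p1 - q[0])
    if d0 * d1 > 0 or d0 == d1:          # same side, or segment parallel to the plane
        return False
    t = d0 / (d0 - d1)                   # where the segment meets the plane
    hit = p0 + t * (p1 - p0)
    # Inside test: the hit point must lie on the same side of every edge
    signs = []
    for i in range(4):
        edge = q[(i + 1) % 4] - q[i]
        signs.append(np.dot(np.cross(edge, hit - q[i]), normal))
    return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)
```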
The main area may also be set, in the same manner as the sub-area in the first embodiment, by obtaining a circular two-dimensional plane based on a center C and a radius d and setting the area delimited by the projection plane obtained by projecting this plane onto the ground surface and by the vertical lines extending from this projection plane.
Furthermore, instead of obtaining the information of the plane from a center and a radius, the information of a circular or elliptical plane may be obtained by having the player move the music performance apparatus body 11 along a desired region in space. In the following, a case where the main area is set based on the trajectory of the music performance apparatus body 11 is described as a third embodiment. In the third embodiment as well, a columnar main area with a circular (or elliptical) bottom surface is set. That is, in the third embodiment, the player delimits a circular or elliptical plane by moving the music performance apparatus body 11 along a desired region in space, and the projection of the delimited plane onto the ground surface becomes the bottom surface of the cylinder (or elliptic cylinder) defining the main area. Fig. 19 is a flowchart showing an example of the main-area setting processing according to the third embodiment. In the third embodiment, in order to set the main area, the switch section 28 of the music performance apparatus body 11 has a setting start switch and a setting end switch. In the third embodiment, the processing other than the main-area setting processing (for example, the timbre setting processing, sub-area setting processing, pitch setting processing, and sound-generation timing detection processing) is the same as in the first embodiment.
As shown in Fig. 19, the CPU 21 determines whether the setting start switch has been turned on (step 1901). If the determination in step 1901 is Yes, the CPU 21 acquires the positional information stored in the RAM 26 and stores it in the RAM 26 as the coordinates of the start position (start point coordinates) (step 1902). The CPU 21 also sets a setting-in-progress flag to 1 (step 1903).
If the determination in step 1901 is No, the CPU 21 determines whether the setting-in-progress flag is 1 (step 1904). If the determination in step 1904 is Yes, the CPU 21 acquires the positional information stored in the RAM 26 and stores it in the RAM 26 as the coordinates of a pass-through position (pass-through position coordinates) (step 1905). Step 1905 is executed repeatedly until the player turns on the end switch of the music performance apparatus body 11. Therefore, in step 1905, the pass-through position coordinates are stored in the RAM 26 in association with the number of times step 1905 has been executed.
Next, the CPU 21 determines whether the end switch has been turned on (step 1906). If the determination in step 1906 is Yes, the CPU 21 acquires the positional information stored in the RAM 26 and stores it in the RAM 26 as the coordinates of the end position (end point coordinates) (step 1907). The CPU 21 then determines whether the end point coordinates are within a predetermined range of the start point coordinates (step 1908). If the determination in step 1908 is No, the area setting processing ends. The area setting processing likewise ends if the determination in step 1904 or step 1906 is No.
If the determination in step 1908 is Yes, the CPU 21 obtains, based on the start point coordinates, the pass-through position coordinates, and the end point coordinates, information specifying an ellipse or circle passing through these coordinates (step 1909). The CPU 21 may also form a closed curve connecting adjacent coordinates and obtain a circle or ellipse approximating this closed curve. For the approximation, a known method such as the least squares method can be applied. The CPU 21 then calculates the information of the ellipse or circle corresponding to the projection plane obtained by projecting the ellipse or circle determined in step 1909 onto the ground surface, and stores the information based on this projected ellipse or circle in the main-area/timbre table in the RAM 26 as the information of the main area (step 1910). The CPU 21 then resets the setting-in-progress flag in the RAM 26 to 0 and sets the main-area setting flag to 1 (step 1911).
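One concrete way to carry out the approximation mentioned in step 1909 is a linear least-squares circle fit to the recorded start, pass-through, and end coordinates, here applied to their ground-plane projections. The sketch below is only an illustration of the idea, not the device's actual method.

```python
import numpy as np

def fit_circle(points_xy):
    """Least-squares circle through 2-D points: solve for center (a, b) and
    radius r from x^2 + y^2 = 2ax + 2by + c, where c = r^2 - a^2 - b^2."""
    pts = np.asarray(points_xy, float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(pts))])
    rhs = x ** 2 + y ** 2
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    r = np.sqrt(c + a ** 2 + b ** 2)
    return (a, b), r

# Example: samples of a trajectory around a circle of radius 1.0 centered at (3, -1)
t = np.linspace(0, 2 * np.pi, 30)
samples = np.column_stack([np.cos(t) + 3.0, np.sin(t) - 1.0])
print(fit_circle(samples))  # approximately ((3.0, -1.0), 1.0)
```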
According to the third embodiment, a columnar main area can be set that is delimited by the projection, onto the ground surface, of a plane of any shape desired by the player and by the walls extending vertically from that projection. In particular, in the third embodiment, a main area that roughly traces the track along which the player moved the music performance apparatus main body 11 can be set. Subregions can of course also be set by the method described in the third embodiment.
The setting of the main area is not limited to setting based on the track of the music performance apparatus main body 11 as described above. For example, in the same way as the setting of the subregion in the first embodiment, a circular two-dimensional plane based on a center C and a radius d may be obtained, and the region delimited by the projection of this plane onto the ground surface and the vertical lines extending from that projection may be set as the main area. This is described later with reference to Figure 24.
Next, the fourth embodiment of the present invention is described. In the fourth embodiment, when the player sets a region within the main area, this region is divided into a prescribed number of parts, each of the divided regions is set as a subregion, and a pitch is automatically assigned to each of these subregions.
In the fourth embodiment, a subregion and pitch setting process is executed in place of the subregion setting process (step 307) and the pitch setting process (step 308) in Fig. 3. Figure 20 is a flowchart showing an example of the subregion and pitch setting process according to the fourth embodiment. As shown in Figure 20, CPU21 acquires the positional information of the music performance apparatus main body 11 held in RAM26 (step 2001) and determines whether the music performance apparatus main body 11 is located in one of the main areas (step 2002). When the determination in step 2002 is "No", the subregion and pitch setting process is terminated.
When the determination in step 2002 is "Yes", CPU21 determines whether the subregion setting switch in the input unit 28 has been turned on (step 2003). When the determination in step 2003 is "Yes", CPU21 acquires the positional information held in RAM26 and stores it in RAM26 as the coordinates of a vertex (vertex coordinates) (step 2004). CPU21 then increments a parameter M in RAM26, which represents the number of vertices, by 1 (step 2005). As in the first embodiment, the parameter M is initialized to "0" in the initialization process (step 301 of Fig. 3). CPU21 then determines whether the parameter M is larger than "4" (step 2006). When the determination in step 2006 is "No", the subregion setting process is terminated.
A determination of "Yes" in step 2006 means that four vertex coordinates have been stored in RAM26. Accordingly, when the determination in step 2006 is "Yes", CPU21 obtains information of the two-dimensional plane (quadrilateral) delimited by the four vertex coordinates (step 2007). CPU21 then divides the obtained two-dimensional plane into P equal parts and obtains the positional information (information of four vertex coordinates) of each of the P partial planes. CPU21 stores the positional information of each partial plane thus obtained, as the information of a subregion, in the subregion-and-pitch table in RAM26 (step 2008).
CPU21 then initializes a parameter p to "0" (step 2009), assigns a prescribed pitch Note(p) to the p-th subregion (subregion ID = p), and stores the pitch Note(p) in the subregion-and-pitch table in association with the subregion ID (step 2010). CPU21 increments the parameter p by 1 (step 2011) and determines whether the parameter p is equal to or greater than P (step 2012). When the determination in step 2012 is "No", the process returns to step 2010. When the determination in step 2012 is "Yes", CPU21 initializes the parameter M in RAM26 to "0" (step 2013).
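As an illustrative sketch of steps 2007-2010 (assuming, for concreteness, that the quadrilateral is divided into P equal strips along one pair of opposite edges and that ascending MIDI note numbers serve as the prescribed pitches Note(p), neither of which is specified by the embodiment), the subregion-and-pitch table could be built as follows:

```python
def split_quadrilateral_into_subregions(v0, v1, v2, v3, p_count, base_note=60):
    """Divide the quadrilateral v0-v1-v2-v3 into p_count partial planes and
    assign a pitch to each, mimicking the subregion-and-pitch table.

    v0..v3 are (x, y, z) vertices in order; the split is made along the
    v0->v1 and v3->v2 edges.  Returns a list of dicts, one per subregion ID.
    """
    def lerp(a, b, t):
        return tuple(a[i] + (b[i] - a[i]) * t for i in range(3))

    table = []
    for p in range(p_count):
        t0, t1 = p / p_count, (p + 1) / p_count
        corners = [lerp(v0, v1, t0), lerp(v0, v1, t1),
                   lerp(v3, v2, t1), lerp(v3, v2, t0)]
        table.append({
            "subregion_id": p,
            "vertices": corners,       # positional information of the partial plane
            "note": base_note + p,     # Note(p): e.g. ascending semitones from middle C
        })
    return table
```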
Figure 21 is a diagram schematically showing the subregions set by the subregion and pitch setting process of the music performance apparatus main body according to the present embodiment. As shown in Figure 21, a two-dimensional plane 2120 is set by the player inside the main area (reference numeral 2100) delimited by the projection plane 2110 and the vertical lines extending from its vertices. Through the subregion and pitch setting process, the two-dimensional plane 2120 is divided into P partial planes (for example, reference numerals 2130 and 2131), and a subregion ID and a corresponding pitch Note(p) are assigned to each of these P partial planes. The positional information (vertex coordinates) of each subregion and its pitch are stored in the subregion-and-pitch table in association with the subregion ID.
According to the fourth embodiment, when the player sets a two-dimensional plane within the main area by operating the music performance apparatus main body 11, this two-dimensional plane is divided to generate a prescribed number of subregions, the positional information of this prescribed number of subregions is stored in the subregion-and-pitch table, and the pitch to be assigned to each subregion is also stored in the subregion-and-pitch table. Accordingly, the setting of the desired subregions is completed by a simple operation performed by the player.
According to the fourth embodiment, an instrument in which struck bodies corresponding to different pitches are arranged, such as a xylophone or a glockenspiel (for example, Figure 30A), can be simulated in space.
The shape of the two-dimensional plane 2120 on which the subregions are arranged may also be changed so that the player can play more easily. For example, as shown in Figure 30B, the plane may be made polygonal so as to surround the player.
In the first through fourth embodiments described above, the tone color of the musical sound to be generated is set in association with the main area, and the pitch of the musical sound to be generated is set in association with each subregion. In the fifth embodiment, a first tone color corresponding to a tone color category is set in association with the main area, and a second tone color corresponding to a subcategory finer than the first tone color is set in association with each subregion.
In the fifth embodiment, processing almost identical to that of the first embodiment is performed, except that a second tone color setting process is executed in place of the pitch setting process of step 308 in Fig. 3. Specifically, in the fifth embodiment, the first tone color is percussion, and the second tone colors are those included in the percussion category, such as hi-hat, snare drum, bass drum, gong, and cymbals. In the tone color setting process of Fig. 6, in response to the tone color confirmation switch being turned on, a musical sound of one of the second tone colors included in the percussion category (for example, the snare drum) is generated.
Figure 22 is a flowchart showing an example of the second tone color setting process according to the fifth embodiment. As shown in Figure 22, CPU21 determines whether the subregion setting flag is "1" (step 2201). When the determination in step 2201 is "No", the second tone color setting process is terminated.
When the determination in step 2201 is "Yes", CPU21 determines whether the tone color confirmation switch has been turned on (step 2202). When the determination in step 2202 is "Yes", CPU21 generates a note-on event containing tone color information based on a parameter DTN representing a percussion tone color (step 2203). The parameter DTN is, for example, a tone color number for uniquely specifying a percussion tone color. Information indicating a sound volume level and a pitch is set in advance in this note-on event. CPU21 then outputs the generated note-on event to the I/F27 (step 2204). The I/F27 causes the infrared communication device 24 to transmit the note-on event as an infrared signal. The infrared signal from the infrared communication device 24 is received by the infrared communication device 33 of the instrument portion 19. As a result, the prescribed percussion musical sound is generated in the instrument portion 19.
After step 2204, CPU21 determines whether the determination switch has been turned on (step 2205). When the determination in step 2205 is "No", CPU21 increments the parameter DTN representing the tone color by 1 (step 2206) and returns to step 2202. When the determination in step 2205 is "Yes", CPU21 stores the tone color information indicated by the parameter DTN and the information of the subregion in the subregion-and-tone-color table in RAM26 in association with each other (step 2207). CPU21 then resets the subregion setting flag to "0" (step 2208).
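The following is a minimal, blocking sketch of the audition-and-confirm loop of Figure 22. The switch-reading and transmission callbacks, the candidate tone list and the preset volume and pitch values are assumptions for illustration; the embodiment only requires that each press of the tone color confirmation switch sounds the candidate tone DTN and that the determination switch stores it for the subregion.

```python
PERCUSSION_TONES = ["hi-hat", "snare drum", "bass drum", "gong", "cymbals"]  # illustrative list

def second_tone_setting(read_switches, send_note_on, store_subregion_tone, subregion_info):
    """Sketch of Figure 22: audition candidate percussion tones one by one,
    then store the chosen one for the subregion being set."""
    dtn = 0
    while True:
        switches = read_switches()                     # assumed interface returning a dict
        if switches.get("tone_confirm"):
            # steps 2203-2204: audition candidate tone DTN with preset volume and pitch
            send_note_on(timbre=PERCUSSION_TONES[dtn % len(PERCUSSION_TONES)],
                         velocity=100, pitch=60)
            if switches.get("determine"):
                # step 2207: associate the chosen second tone color with the subregion
                store_subregion_tone(subregion_info,
                                     PERCUSSION_TONES[dtn % len(PERCUSSION_TONES)])
                return
            dtn += 1                                   # step 2206: advance to the next candidate
```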
In the fifth embodiment, a subregion-and-tone-color table is provided in RAM26 in place of the subregion-and-pitch table. In the subregion-and-pitch table shown in Figure 12, the pitch item is replaced by an item for the percussion second tone color.
According to the fifth embodiment, a region for which second tone colors can be set is set as the main area, subregions are set within the main area, and various second tone colors can be assigned to these subregions.
The first through fifth embodiments of the present invention have been described, but the present invention is not limited to the above embodiments. For example, the first embodiment and the fifth embodiment can be combined. In an embodiment based on this combination, when the tone color of an instrument whose pitch is variable, such as a vibraphone or a xylophone, has been assigned to a main area by the tone color setting process shown in Fig. 6, the subregion setting process (see Fig. 9) and then the pitch setting process (see Figure 10) are executed. On the other hand, when a percussion tone color has been assigned to a main area by the tone color setting process, the subregion setting process (see Fig. 9) and then the second tone color setting process (see Figure 22) are executed. In this embodiment, after the subregion setting process, CPU21 determines whether the tone color set by the tone color setting process (Fig. 6) is a percussion tone color or some other tone color.
Figure 23 schematically shows an example of the main areas and subregions in the embodiment based on this combination. In Figure 23, the left side shows the main area and subregions set according to the fourth embodiment, and the right side shows the main area and subregions set according to the fifth embodiment. On the left side, the main area 2300 contains a region 2310 set by the player, which is further divided into a plurality of regions, and each divided partial plane becomes a subregion (reference numerals 2311 and 2312). A prescribed pitch is assigned to each of these subregions.
On the right side, on the other hand, a plurality of subregions (reference numerals 2321 and 2322) are set within the main area 2320, and various second tone colors, which are subcategories of the percussion tone color, are set for the respective subregions.
When the player is located in the main area 2300, by positioning the music performance apparatus main body 11 over a subregion, a musical sound of the pitch assigned to that subregion is generated with the tone color (for example, vibraphone) assigned to the main area 2300. On the other hand, when the player is located in the main area 2320, by positioning the music performance apparatus main body 11 over a subregion, a musical sound is generated with the second tone color assigned to that subregion. In this way, by moving between different main areas, the performance of different instruments can be simulated.
Other methods can also be applied to the setting of the main area and the subregions. Figure 24 is a flowchart showing an example of the main area setting process according to another embodiment. The example of Figure 24 is almost identical to the subregion setting shown in Fig. 9: by specifying a center C and a pass-through position P, a two-dimensional plane of the circle centered on the center C and passing through the position P is obtained, and the main area is set accordingly.
CPU21 determines whether the center setting switch in the input unit 28 of the music performance apparatus main body 11 has been turned on (step 2401). When the determination in step 2401 is "No", the main area setting process is terminated. When the determination in step 2401 is "Yes", CPU21 determines whether the center setting switch has been newly turned on (step 2402). When the determination in step 2402 is "Yes", CPU21 stores the positional information of the music performance apparatus main body 11 in RAM26 as the positional information of the center C (coordinates (x_c, y_c, z_c)) (step 2403). This position becomes the reference position of the main area and the subregions to be set subsequently.
When the determination in step 2402 is "No", that is, when the switch has remained on, or after step 2403 has been executed, CPU21 determines whether the center setting switch has been turned off (step 2404). When the determination in step 2404 is "No", the main area setting process is terminated. When the determination in step 2404 is "Yes", CPU21 stores the positional information of the music performance apparatus main body 11 in RAM26 as the positional information of the position P of the music performance apparatus main body 11 at the time the center setting switch was turned off (coordinates (x_p, y_p, z_p)) (step 2405). In step 2405, CPU21 also calculates the distance d_p between the position C and the position P. CPU21 obtains information of the circular plane centered on the position C and having a radius d_p so as to pass through the position P (step 2406). CPU21 then stores the information of the main area based on the acquired information in the main-area-and-tone-color table in RAM26 (step 2407). CPU21 then sets the main area setting flag in RAM26 to "1" (step 2408).
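A minimal sketch of steps 2403-2407, assuming the distance d_p is taken on the ground projection (the embodiment does not state whether the distance is measured in three dimensions or on the ground surface):

```python
import math

def main_area_from_center_and_pass(center, passed):
    """The main area's bottom surface is the circle centered on C with radius
    |C - P|, both points being projected onto the ground surface.

    center: (x_c, y_c, z_c) stored when the center setting switch is turned on.
    passed: (x_p, y_p, z_p) stored when the switch is turned off.
    """
    cx, cy, _ = center
    px, py, _ = passed
    radius = math.hypot(px - cx, py - cy)   # d_p measured on the ground surface
    return {"center": (cx, cy), "radius": radius}
```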
Next, the subregion setting process according to this other embodiment is described. Figure 25 is a flowchart showing an example of the subregion setting process according to this other embodiment. As shown in Figure 25, CPU21 acquires the positional information of the music performance apparatus main body 11 held in RAM26 (step 2501) and determines whether the music performance apparatus main body 11 is located in one of the main areas (step 2502). When the determination in step 2502 is "No", the subregion setting process is terminated.
When the determination in step 2502 is "Yes", CPU21 determines whether the setting switch in the input unit 28 of the music performance apparatus main body 11 has been turned on (step 2503). When the determination in step 2503 is "Yes", CPU21 obtains the coordinates of the intersection point between the outline of the main area and the line passing from the center C, stored in the main-area-and-tone-color table in RAM26, through the position of the music performance apparatus main body 11 acquired in step 2501 (step 2504). The coordinates of this intersection point are stored in RAM26. CPU21 then determines whether information of another intersection point located in a prescribed angular direction relative to the line passing through the center and the position of the music performance apparatus main body 11 is held in RAM26 (step 2505). When the determination in step 2505 is "Yes", the fan-shaped plane that includes the intersection point obtained in step 2504, the other intersection point held in RAM26, and two other points located at a predetermined radius from the center is determined as a subregion (step 2506), and coordinate information including these intersection points and the two points is stored in the subregion-and-pitch table (step 2507). As the two other points, it is preferable to use points that lie at the predetermined radius from the center on the lines connecting the center with the intersection point obtained in step 2504 and with the other intersection point held in RAM26, respectively. CPU21 then sets the subregion setting flag held in RAM26 to "1" (step 2508).
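As an illustrative sketch of steps 2504-2507 (assuming the circular outline of Figure 24 and simple polar-coordinate bookkeeping, neither of which is prescribed by the embodiment), the intersection points and the fan-shaped subregion could be computed as follows:

```python
import math

def outline_intersection(center, device_pos, outline_radius):
    """Intersection of the circular outline with the line from the center C
    through the current position of the music performance apparatus main body
    (the coordinate stored in step 2504)."""
    ang = math.atan2(device_pos[1] - center[1], device_pos[0] - center[0])
    return (center[0] + outline_radius * math.cos(ang),
            center[1] + outline_radius * math.sin(ang))

def sector_subregion(center, first_hit, second_hit, inner_radius):
    """Fan-shaped subregion delimited by two outline intersection points and
    two points at a predetermined (inner) radius from the center on the same
    lines (steps 2506-2507)."""
    cx, cy = center

    def toward(point, r):
        ang = math.atan2(point[1] - cy, point[0] - cx)
        return (cx + r * math.cos(ang), cy + r * math.sin(ang))

    return {
        "outer": [first_hit, second_hit],                 # intersections with the outline
        "inner": [toward(first_hit, inner_radius),        # the two other points
                  toward(second_hit, inner_radius)],
    }
```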
Figure 26 is a diagram showing an example of the main area and subregions according to this other embodiment. As shown in Figure 26, prescribed fan-shaped subregions (for example, reference numerals 2601 and 2602) are set in the main area 2600, and a pitch is assigned to each of these subregions. When a percussion tone color has been assigned to the main area, a second tone color belonging to a subcategory of the percussion tone color (a specific tone color such as snare drum, hi-hat, or gong) may instead be set for each subregion.
According to this other embodiment, CPU21 determines, as the main area, a cylinder whose bottom surface is the circle on the ground surface that is centered on the point obtained by projecting the specified center onto the ground surface and that passes through the point obtained by projecting the other specified position, different from the center, onto the ground surface. A main area based on a desired circular two-dimensional plane can thus be set by specifying a center and a prescribed (pass-through) position.
Furthermore, according to this other embodiment, within the main area determined based on the center on the ground surface obtained by projecting the specified center onto the ground surface and the other position on the ground surface obtained by projecting the other position different from the center onto the ground surface, CPU21 determines, as a subregion, the sector region that includes the two intersection points between the outline of the main area and the lines passing through the position of the holding member and the center, and the two other points located at the predetermined radius from the center. A fan-shaped subregion can thus be set within the main area.
In the first through fourth embodiments described above, the CPU21 of the music performance apparatus main body 11 detects the geomagnetic sensor value and the acceleration sensor value obtained when the player swings the music performance apparatus main body 11, obtains the positional information of the music performance apparatus main body 11 based on these sensor values, and determines whether the music performance apparatus main body 11 is located in a main area. When the music performance apparatus main body 11 is swung while located in a main area, and CPU21 determines that the music performance apparatus main body 11 is positioned over a subregion or has passed through a subregion, CPU21 generates a note-on event containing the tone color associated with the main area and the pitch associated with the subregion, and transmits this note-on event to the instrument portion 19 via the I/F27 and the infrared communication device 24. In the instrument portion 19, on receiving the note-on event, CPU12 outputs the received note-on event to the sound source part 31 so that a musical sound is produced. This configuration is suitable when the instrument portion 19 is not a dedicated machine for generating musical sounds, such as a personal computer or a game machine equipped with a MIDI board.
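A minimal sketch of this division of labour, assuming a dictionary-based event format (the embodiment only requires that the note-on event carry the tone color associated with the main area and the pitch associated with the subregion):

```python
def build_note_on_event(main_area_id, subregion_id, tone_table, pitch_table, velocity=100):
    """Look up the tone color keyed by the main area and the pitch keyed by
    the subregion, and pack them into a note-on event for transmission."""
    return {
        "type": "note_on",
        "timbre": tone_table[main_area_id],   # tone color associated with the main area
        "pitch": pitch_table[subregion_id],   # pitch associated with the subregion
        "velocity": velocity,
    }

# On the receiving side the event is simply forwarded to the sound source,
# e.g. sound_source.play(event["timbre"], event["pitch"], event["velocity"]).
```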
However, the division of processing between the music performance apparatus main body 11 and the instrument portion 19 is not limited to that of the above embodiments. For example, the music performance apparatus main body 11 may be configured to transmit the information of the main-area-and-tone-color table and of the subregion-and-pitch table to the instrument portion 19, and to obtain positional information based on the sensor values in the music performance apparatus main body and transmit it to the instrument portion 19. In this case, the sound generation timing detection process (Figure 13) and the note-on event generation process (Figure 14) are executed in the instrument portion 19. This configuration is suitable when the instrument portion 19 is a dedicated machine for generating musical sounds, that is, an electronic musical instrument.
In the present embodiment, data communication between the music performance apparatus main body 11 and the instrument portion 19 is performed by infrared signals using the infrared communication devices 24 and 33, but the present invention is not limited to this. For example, the music performance apparatus main body 11 and the instrument portion 19 may perform data communication by other wireless communication, or may perform data communication in a wired manner via a cable.
Furthermore, in the above embodiments, the moving direction of the music performance apparatus main body 11 is detected by the geomagnetic sensor 23 and the movement amount of the music performance apparatus main body 11 is detected by the acceleration sensor 22, and the position of the music performance apparatus main body 11 is obtained based on this moving direction and movement amount. However, the method is not limited to this, and the position of the music performance apparatus main body 11 may of course be obtained using other position detecting devices, such as the sensor values of a three-axis acceleration sensor or the sensor values of an angular velocity sensor.
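For illustration, a minimal dead-reckoning sketch of the position estimation described above is given below; treating the motion as horizontal, integrating a single scalar acceleration and ignoring sensor drift are simplifying assumptions, not details taken from the embodiment.

```python
import math

def update_position(position, heading_deg, accel, velocity, dt):
    """Advance the estimated position of the music performance apparatus main
    body: the geomagnetic sensor supplies the moving direction (heading) and
    the acceleration sensor value is integrated twice to obtain the movement
    amount along that direction."""
    velocity += accel * dt                     # first integration: speed along the heading
    distance = velocity * dt                   # second integration: movement amount
    heading = math.radians(heading_deg)
    x, y, z = position
    new_position = (x + distance * math.cos(heading),
                    y + distance * math.sin(heading),
                    z)
    return new_position, velocity
```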
In the above embodiments, the time at which the music performance apparatus main body 11 located in a main area is positioned over a subregion is taken as the sound generation timing of the musical sound, but this may be changed to another scheme.
For example, whether a swinging motion has been applied to the music performance apparatus main body 11 may be determined by the acceleration sensor 22, and the time at which it is determined that a swinging motion has been applied may be taken as the sound generation timing of the musical sound.
Figure 27 is a flowchart of the sound generation timing detection process to which this other scheme is applied. The processes of steps 2701 to 2709 in Figure 27 are respectively identical to those of steps 1301 to 1309 of the sound generation timing detection flowchart of Figure 13. The only difference is that, in Fig. 13, the process proceeds to step 1307 when the determination in step 1304 or 1305 is "No", whereas in Figure 27 the process of this flow is terminated when the determination in step 2704 or 2705 is "No".
Accordingly, in Figure 27, only when the acceleration detected by the acceleration sensor 22 is equal to or greater than α ("Yes" in step 2704) and the detected acceleration has exceeded the preset maximum value ("Yes" in step 2705), that is, only when it is determined that a swinging motion has been applied to the music performance apparatus main body 11, does the process proceed, via steps 2707 and 2708, to the note-on event process of step 2709, so that the musical sound is generated.
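A minimal sketch of this timing test, assuming the two-threshold reading of steps 2704-2705 described above (the threshold values themselves are not specified by the embodiment):

```python
def swing_timings(acceleration_samples, alpha, preset_max):
    """Yield the sample indices at which the note-on processing of step 2709
    would be reached: the sensed acceleration is at least alpha and exceeds
    the preset maximum value, which is taken here as meaning that a swinging
    motion has been applied to the music performance apparatus main body."""
    for i, accel in enumerate(acceleration_samples):
        if accel >= alpha and accel > preset_max:
            yield i
```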
With this scheme, the musical sound is generated only when a swinging motion close to the striking action of an actual percussion instrument is performed, so that the performance feels even more realistic.
In the above embodiments, the pitch of the generated musical sound is constant regardless of the position within a subregion at which the music performance apparatus main body 11 is located. However, a scheme is also conceivable in which the pitch of the generated musical sound is changed according to the position within a subregion at which the music performance apparatus main body 11 is located.
For example, as shown in Figure 28, the region enclosed by a circle centered on a position specified with the music performance apparatus main body 11 is set as a subregion 2801, and a prescribed pitch is associated with this subregion 2801. The subregion 2801 is further divided into concentric cells 2801A, 2801B, 2801C, and so on, and slightly different pitch values are assigned to these cells; for example, a higher pitch is assigned to a cell closer to the center.
Figure 29 is a flowchart of the note-on event generation process according to this scheme. In this figure, parts identical to those of Figure 14 are given the same numbers and their description is omitted.
In step 1403 of Figure 29, after the pitch has been determined by referring to the subregion-and-pitch table, the process proceeds to step 2901. In step 2901, the value of the determined pitch is changed according to which position within the subregion the music performance apparatus main body 11 has passed through.
The pitch whose value has thus been changed is then set as the determined pitch, and the process proceeds to step 1404.
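As an illustrative sketch of step 2901 (assuming three concentric cells and a one-semitone step per cell, neither of which is specified by the embodiment), the pitch adjustment could be computed as follows:

```python
import math

def pitch_for_position(base_pitch, device_pos, subregion_center, subregion_radius,
                       cell_count=3, step=1):
    """Nudge the pitch looked up for subregion 2801 according to which
    concentric cell (2801A, 2801B, 2801C, ...) the music performance apparatus
    main body passed through; cells nearer the center receive a higher pitch."""
    dx = device_pos[0] - subregion_center[0]
    dy = device_pos[1] - subregion_center[1]
    distance = math.hypot(dx, dy)
    cell = min(int(distance / (subregion_radius / cell_count)), cell_count - 1)
    return base_pitch + (cell_count - 1 - cell) * step   # innermost cell -> highest pitch
```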
With such a configuration, the pitch of the musical sound to be generated can be varied subtly according to the struck position, as in the performance of an actual percussion instrument.
In the above scheme, the pitch of the generated musical sound is changed according to the position within a subregion at which the music performance apparatus main body 11 is located, but it is of course also possible to change other musical sound parameters, such as the tone color.
Embodiments of the present invention have been described above in detail, but the scope of the present invention is not limited to the above embodiments; the invention described in the claims and its equivalents are included within the scope of the present invention.

Claims (15)

1. A music performance apparatus having:
a holding member that a player can hold by hand;
a positional information acquisition unit that acquires positional information of the holding member;
a main area setting unit that, while a main area setting switch is on, sets the positional information of the holding member acquired by the positional information acquisition unit as information for specifying a main area, the main area being delimited in space by surfaces of which at least the side surfaces are perpendicular to the ground surface;
a tone color storage unit that stores information for specifying the main area and a tone color of a musical sound associated with the main area;
a pitch storage unit that stores information for specifying a subregion located within the main area and a pitch of a musical sound associated with the subregion;
a reading unit that, when the position of the holding member acquired by the positional information acquisition unit is within the main area and within the subregion, reads from the tone color storage unit the tone color associated with the main area in which the position of the holding member lies, and reads from the pitch storage unit the pitch associated with the subregion in which the position of the holding member lies; and
a sound generation instructing unit that instructs a musical sound generation unit to generate a musical sound having the tone color and the pitch read by the reading unit.
2. The music performance apparatus according to claim 1, wherein
the music performance apparatus further has:
a tone color setting unit that, based on positional information of three or more specified vertices, determines as a main area the space delimited by the bottom plane obtained by projecting the plane connecting the vertices onto the ground surface and by the vertical lines extending from the vertices of that bottom plane, and stores the information for specifying the main area and a tone color in the tone color storage unit in association with each other.
3. The music performance apparatus according to claim 1, wherein
the music performance apparatus further has:
a tone color setting unit that, based on a center on the ground surface obtained by projecting a specified center onto the ground surface and another position on the ground surface obtained by projecting another position different from the center onto the ground surface, determines as a main area a cylinder whose bottom surface is the circle centered on the center on the ground surface and passing through the other position on the ground surface, and stores the information for specifying the main area and a tone color in the tone color storage unit in association with each other.
4. The music performance apparatus according to claim 1, wherein
the music performance apparatus further has:
a tone color setting unit that determines the track of the holding member by acquiring positional information at predetermined time intervals, determines as a main area the columnar region whose bottom surface is the closed curve on the ground surface obtained by projecting the track of the holding member onto the ground surface, and stores the information for specifying the main area and a tone color in the tone color storage unit in association with each other.
5. The music performance apparatus according to claim 1, wherein
the music performance apparatus further has:
a pitch setting unit that, based on positional information of three or more specified vertices, determines the plane connecting the vertices as a subregion, and stores the information for specifying the subregion and a pitch in the pitch storage unit in association with each other.
6. The music performance apparatus according to claim 1, wherein
the music performance apparatus further has:
a pitch setting unit that, based on positional information of a specified center and positional information of another position different from the center, determines as a subregion the circular plane centered on the center and passing through the other position, and stores the information for specifying the subregion and a pitch in the pitch storage unit in association with each other.
7. The music performance apparatus according to claim 1, wherein
the music performance apparatus further has:
a pitch setting unit that determines the track of the holding member by acquiring positional information at predetermined time intervals, determines the track of the holding member as a subregion, and stores the information for specifying the subregion and a pitch in the pitch storage unit in association with each other.
8. The music performance apparatus according to claim 5, wherein
the pitch setting unit divides a region set within the main area into a prescribed number of parts, determines each of the divided partial regions as a subregion, assigns a prescribed pitch to each of these subregions, and stores the information for specifying each subregion and the assigned pitch in the pitch storage unit in association with each other.
9. The music performance apparatus according to claim 1, wherein
the music performance apparatus further has:
a pitch setting unit that, within a main area determined based on a center on the ground surface obtained by projecting a specified center onto the ground surface and another position on the ground surface obtained by projecting another position different from the center onto the ground surface, determines as a subregion the sector region that includes two intersection points between the outline of the main area and lines passing through the position of the holding member and the center, and two other points located at a predetermined radius from the center, and stores the information for specifying the subregion and a pitch in the pitch storage unit in association with each other.
10. The music performance apparatus according to claim 1, wherein
the sound generation instructing unit instructs the musical sound generation unit to generate a musical sound, taking as the sound generation timing the time at which the position of the holding member acquired by the positional information acquisition unit is within the main area and within the subregion.
11. The music performance apparatus according to claim 1, wherein
the music performance apparatus further has:
a determination unit that determines whether a swinging motion has been applied to the holding member; and
the sound generation instructing unit instructs the musical sound generation unit to generate a musical sound, taking as the sound generation timing the time at which the determination unit determines that a swinging motion has been applied to the holding member.
12. The music performance apparatus according to claim 1, wherein
the sound generation instructing unit further has a pitch changing unit that changes the pitch read from the pitch storage unit according to the position within the subregion at which the holding member is located, and instructs the musical sound generation unit to generate a musical sound having the tone color read by the reading unit and the pitch changed by the pitch changing unit.
13. The music performance apparatus according to claim 1, wherein
the positional information acquisition unit has a geomagnetic sensor and an acceleration sensor, and
detects the moving direction of the holding member based on the sensor value of the geomagnetic sensor, and calculates the movement amount of the holding member based on the sensor value of the acceleration sensor.
14. A music performance apparatus having:
a holding member that a player can hold by hand;
a positional information acquisition unit that acquires positional information of the holding member;
a main area setting unit that, while a main area setting switch is on, sets the positional information of the holding member acquired by the positional information acquisition unit as information for specifying a main area, the main area being delimited in space by surfaces of which at least the side surfaces are perpendicular to the ground surface;
a category storage unit that stores information for specifying the main area and a category of tone colors of musical sounds associated with the main area;
a tone color storage unit that stores information for specifying a subregion located within the main area and a tone color that is associated with the subregion and belongs to a subcategory obtained by subdividing the category;
a reading unit that, when the position of the holding member acquired by the positional information acquisition unit is within the main area and within the subregion, reads from the tone color storage unit the tone color associated with the subregion in which the position of the holding member lies; and
a sound generation instructing unit that instructs a musical sound generation unit to generate a musical sound having the tone color read by the reading unit.
15. An electronic musical instrument comprising:
the music performance apparatus according to claim 1; and
an instrument portion provided with the musical sound generation unit;
wherein the music performance apparatus and the instrument portion each have a communication unit.
CN201210046642.6A 2011-02-28 2012-02-27 Playing device and electronic musical instrument Active CN102651212B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP041139/2011 2011-02-28
JP2011041139 2011-02-28
JP226534/2011 2011-10-14
JP2011226534A JP6007476B2 (en) 2011-02-28 2011-10-14 Performance device and electronic musical instrument

Publications (2)

Publication Number Publication Date
CN102651212A CN102651212A (en) 2012-08-29
CN102651212B true CN102651212B (en) 2015-05-27

Family

ID=46693208

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210046642.6A Active CN102651212B (en) 2011-02-28 2012-02-27 Playing device and electronic musical instrument

Country Status (1)

Country Link
CN (1) CN102651212B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104021780A (en) * 2014-06-16 2014-09-03 马人欢 Electronic percussion instrument
JP6070735B2 (en) * 2015-02-04 2017-02-01 ヤマハ株式会社 Keyboard instrument

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101390154A (en) * 2006-02-22 2009-03-18 弗劳恩霍夫应用研究促进协会 Device and method for producing a note signal, and device and method for emitting an output signal displaying a tone

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE60132963T2 (en) * 2000-09-05 2009-04-16 Yamaha Corporation, Hamamatsu Apparatus and method for generating sound in response to movement of a portable terminal
JP3867630B2 (en) * 2002-07-19 2007-01-10 ヤマハ株式会社 Music playback system, music editing system, music editing device, music editing terminal, music playback terminal, and music editing device control method
JP3933057B2 (en) * 2003-02-20 2007-06-20 ヤマハ株式会社 Virtual percussion instrument playing system
JP3804630B2 (en) * 2003-03-20 2006-08-02 ヤマハ株式会社 Music generator
KR101189214B1 (en) * 2006-02-14 2012-10-09 삼성전자주식회사 Apparatus and method for generating musical tone according to motion
JP2007256736A (en) * 2006-03-24 2007-10-04 Yamaha Corp Electric musical instrument

Also Published As

Publication number Publication date
CN102651212A (en) 2012-08-29

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant