WO2014007437A1 - Method for processing user gesture input in an online game - Google Patents
Method for processing user gesture input in an online game
- Publication number
- WO2014007437A1 WO2014007437A1 PCT/KR2012/009450 KR2012009450W WO2014007437A1 WO 2014007437 A1 WO2014007437 A1 WO 2014007437A1 KR 2012009450 W KR2012009450 W KR 2012009450W WO 2014007437 A1 WO2014007437 A1 WO 2014007437A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- gesture
- drag
- input
- user
- online game
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 52
- 230000006870 function Effects 0.000 claims description 86
- 238000003672 processing method Methods 0.000 claims description 13
- 230000008685 targeting Effects 0.000 claims description 6
- 230000035807 sensation Effects 0.000 abstract 1
- 238000010586 diagram Methods 0.000 description 16
- 230000008859 change Effects 0.000 description 8
- 238000004891 communication Methods 0.000 description 8
- 238000004590 computer program Methods 0.000 description 4
- 238000005516 engineering process Methods 0.000 description 2
- 238000010295 mobile communication Methods 0.000 description 2
- 230000003287 optical effect Effects 0.000 description 2
- 238000004458 analytical method Methods 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000008520 organization Effects 0.000 description 1
- 230000008569 process Effects 0.000 description 1
- 230000007480 spreading Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/214—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
- A63F13/2145—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/428—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Definitions
- the present invention relates to a method for processing user gesture input in an online game.
- In a mobile terminal without a separate keypad, a virtual keypad, virtual direction keys, and virtual operation keys are output on the touch screen. Because such a virtual user interface occupies part of the mobile terminal's display screen, it obstructs the user's view and hinders the play screen of the online game.
- As smartphones and tablet PCs have spread and evolved, gesture input has come to be used for many applications and functions, but its application to online games has been limited by technical constraints.
- In an MMORPG, for example, it is difficult to apply a gesture system because gesture input must be detected while the character's state or a monster's state is continuously checked.
- Furthermore, when movement of the character through 3D space is implemented as a touch-drag input, it is difficult to separate the touch-drag used for spatial movement from the touch-drag used to operate the character.
- An embodiment of the present invention provides a method of processing user gesture input in an online game which, when the online game is played using a mobile terminal, minimizes the touches required to perform efficient and quick operations, improves the feeling of operation, and provides a comfortable field of view.
- Another embodiment of the present invention provides a recording medium recording such a method of processing user gesture input.
- Another embodiment of the present invention provides a mobile terminal for processing user gesture input in an online game which, when the online game is played using the mobile terminal, minimizes the touches required to perform efficient and quick operations, improves the feeling of operation, and provides a comfortable field of view.
- According to an embodiment of the present invention, a method of processing user gesture input in an online game performed through a mobile terminal capable of touch-drag input includes: determining whether a target character of the online game is targeted; and, when the target character is targeted, performing a function corresponding to a gesture comprising at least one drag direction, and, when the target character is not targeted, moving the user character according to the touch-drag input.
- The method may further include: when the target character is targeted, determining a valid range for the gesture input; and, when the gesture input is within the valid range, performing the function corresponding to the input gesture, where the valid range is the play area of the target character and the user character within the display area of the mobile terminal.
- The gesture may be determined to be a gesture input within the valid range when the starting point of the drag constituting the gesture is within the valid range.
- Furthermore, when the starting point of the drag constituting the gesture is within the valid range, the gesture may be determined to be within the valid range regardless of the end point of the drag.
- The method may further include, when the target character is targeted, outputting a trail of the gesture at the position in the display area of the mobile terminal corresponding to the starting point of the drag forming the gesture.
- The gesture may consist of a unidirectional drag, which is determined to be a gesture input within the valid range when it falls within at least one of the valid ranges set on the X axis and the Y axis.
- The gesture may instead be composed of multidirectional drags; when each of the first direction drag to the Nth direction drag (N being an integer of 2 or more) is within at least one of the valid ranges set on the X axis and the Y axis, and the N-1 drags are valid, the gesture is determined to be a gesture input within the valid range.
- Whether a gesture composed of multidirectional drags, from the first direction drag to the Nth direction drag, is a gesture input within the valid range may be determined regardless of the order in which the directions start.
- the user gesture input processing method in the online game may perform a function corresponding to a gesture similar to the input gesture when it is not a gesture input within the valid range.
- the function may include at least one skill that can be used by the user character.
- The method may display, in the display area of the mobile terminal, a gesture guide including at least one function usable by the user character and the gestures, each comprising at least one drag direction, that are mapped to those functions.
- The online game may be an RPG including an MMORPG or MORPG, an AOS (Aeon of Strife) game, a real-time strategy (RTS) game, a first/third-person shooter (FPS/TPS) game, or a sports game.
- According to another embodiment of the present invention, a method of processing user gesture input in an online game performed through a mobile terminal capable of touch-drag input includes: determining whether a target character in the online game is targeted; when the target character is targeted, evaluating the gesture input based on at least one of the drag direction and the number of drags; and, when the evaluation determines that a valid gesture has been input, performing a function corresponding to the input gesture.
- the user gesture input processing method in the online game may further include moving a user character according to the touch-drag input when the target character is not targeted.
- According to another embodiment of the present invention, a mobile terminal for processing user gesture input in an online game is capable of touch-drag input and includes: a targeting determination unit that determines whether a target character of the online game is targeted; and a control unit that performs a function corresponding to a gesture comprising at least one drag direction when the target character is targeted, and controls the user character to move according to the touch-drag input when the target character is not targeted.
- A recording medium according to another embodiment of the present invention records a program for implementing, through a mobile terminal capable of touch-drag input, a method of processing user gesture input in an online game, the program comprising:
- first program code for determining whether a target character of the online game is targeted;
- second program code for performing a function corresponding to a gesture comprising at least one drag direction when the target character is targeted; and
- third program code for moving the user character according to the touch-drag input when the target character is not targeted.
- According to the method of processing user gesture input in an online game performed through a mobile terminal capable of touch-drag input, when the online game is played using the mobile terminal, the touches required to perform efficient and quick operations can be minimized, the feeling of operation can be improved, and a comfortable field of view can be provided.
- FIG. 1 is a schematic diagram illustrating a mobile terminal 110 and 120 for processing a user gesture input in an online game system 100 according to an embodiment of the present invention.
- FIG. 2 is a flowchart illustrating a method of processing a user gesture input in an online game according to another embodiment of the present invention.
- FIG. 3 is a flowchart illustrating a method of processing a user gesture input in an online game according to another embodiment of the present invention.
- FIG. 4 is an exemplary diagram for describing a user gesture input and expression according to an embodiment of the present invention.
- FIG. 5 is another exemplary diagram for explaining a user gesture input and expression according to an exemplary embodiment.
- FIG. 6 is another exemplary diagram for describing a user gesture input and expression according to an embodiment of the present invention.
- FIG. 7 is another exemplary view for explaining a user gesture input and expression according to an embodiment of the present invention.
- FIG. 8 is an exemplary diagram for describing a user gesture determination according to an exemplary embodiment.
- FIG. 9 is an example diagram for describing a gesture including a unidirectional drag as an example of a user gesture determination according to an exemplary embodiment.
- FIG. 10 is a diagram for explaining a gesture including multidirectional drag as another example of determining a user gesture according to an exemplary embodiment.
- FIGS. 11 to 17 are exemplary diagrams for describing various types of user gestures and functions according to an embodiment of the present invention.
- “communication”, “communication network” and “network” may be used as the same meaning.
- the three terms refer to wired and wireless local and wide area data transmission and reception networks capable of transmitting and receiving files between a user terminal, a terminal of other users, and a download server.
- a “game server” refers to a server computer to which users access to use game contents.
- a plurality of game programs may be operated in one game server.
- The game server may also be connected to middleware or to servers that perform database or payment processing, but descriptions thereof are omitted in the present invention.
- The online game means game content that users can use by accessing the above-mentioned game server. In particular, it refers to a game that a large number of users can access and enjoy simultaneously, raising their level through actions such as acquiring experience points while nurturing their characters. It also refers to a game in which a variety of items can be purchased to facilitate the progress of the game.
- In the online game, various community systems can be used. For example, a guild or clan of an online game may be formed, which means that users of the online game gather to form and organize a group.
- Each group's reputation may increase depending on the number of its users or the level of their characters, and the group may thereby enjoy various benefits in the game. For example, if the guild or clan's reputation increases, the way its characters are displayed on the game screen may change (e.g., the color of a displayed character's name), or benefits may be granted in the use of items and villages in the game.
- a community system that can be used in online games is party play.
- Party play is group game play formed through requests, invitations, and acceptances among users, and the formed party members may use their own chat system or a specific display for identifying party members on the game screen.
- users who perform party play may distribute items to each other or share the result content obtained as a result of game play.
- The sharing method may be set so that each member keeps their own result content, or so that at least part of the result content is distributed to other characters.
- the result content refers to all content that can be obtained by the characters of the users as a result of the play during the game play.
- The experience points and cyber money that can be obtained at the end of play may belong to the result content; in the case of a sports game, for example, these are the experience points and cyber money obtained at the end of a match.
- the result content may be the experience gained from completing a specific quest or killing a monster, and reward cyber money.
- The result content basically belongs to the user's character.
- at least a part of the obtained content may be distributed to characters of other users belonging to the party, the guild, the clan, and the like.
- the item means all data that can help the progress of the game and generally can be understood as an item in the game.
- For example, an item that increases the experience value obtained when a character acting on behalf of the user defeats a monster, an item that can change the appearance of the character, and the like correspond to items in the present invention.
- FIG. 1 is a schematic diagram illustrating a mobile terminal 110 and 120 for processing a user gesture input in an online game system 100 according to an embodiment of the present invention.
- In FIG. 1, a tablet PC or notebook 110 and a mobile phone or smartphone 120 are illustrated as the mobile terminals, which connect to a game server 140 through a network 130 to play the online game.
- the network 130 includes a wired or wireless communication network.
- the game server 140 provides an online game to the mobile terminal.
- the online game may be a role playing game including MMORPG or MORPG.
- MMORPGs and MORPGs can be distinguished as follows: as a general rule, if a "room" is created and battles and hunting take place only between the users connected to that room, the game is classified as a MORPG.
- the online game may be an AOS (Aeon of Strife) game, a Real Time Strategy (RTS) game, a First / Third Person Shooters (FPS / TPS) game, or a sports game.
- Programs required for game implementation may be installed and executed in the mobile terminal.
- the mobile terminal is implemented with a touch-drag function, and the game is possible through the touch-drag function when performing an online game.
- the mobile terminal processes a user gesture input in an online game.
- The mobile terminal determines whether the target character of the online game is targeted, and when the target character is targeted, performs a function corresponding to a gesture comprising at least one drag direction. On the other hand, when the target character is not targeted, the mobile terminal controls the user character to move according to the touch-drag input.
- The target character includes a monster or another user's character in an RPG game, an enemy in a first-person or third-person shooting game, and an opponent character in a sports game.
- the gesture may include various predefined drag directions, and may include up, down, left, and right directions, eight directions including diagonal directions, circular gestures, and repeated gestures.
- the function may be a skill that can be used by a user character.
- When a specific skill is mapped to a specific gesture, that skill may be used on the target character.
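To make this mapping concrete, here is a minimal Java sketch that models the eight drag patterns of FIG. 8 as an enum and associates gesture sequences with skills. It is only an illustration: the class names, skill names, and particular gesture-to-skill pairs are assumptions, not taken from the patent. The later sketches in this description reuse DragPattern, Skill, and GestureSkillMap.

```java
import java.util.List;
import java.util.Map;

// Eight drag directions, following the eight patterns of FIG. 8 (names are illustrative).
enum DragPattern { UP, UP_RIGHT, RIGHT, DOWN_RIGHT, DOWN, DOWN_LEFT, LEFT, UP_LEFT }

// A skill (function) usable by the user character; the name is a placeholder.
record Skill(String name) {
    void useOn(String targetId) {
        System.out.println("Using " + name + " on " + targetId);
    }
}

final class GestureSkillMap {
    // Each gesture is an ordered list of drag patterns; a single entry is a unidirectional drag.
    private final Map<List<DragPattern>, Skill> mapping = Map.of(
            List.of(DragPattern.LEFT), new Skill("EvasionThrow"),
            List.of(DragPattern.UP_RIGHT), new Skill("RisingDragonCharm"),
            List.of(DragPattern.UP_RIGHT, DragPattern.DOWN_RIGHT,
                    DragPattern.DOWN_LEFT, DragPattern.UP_LEFT), new Skill("BloodBlow"));

    Skill lookup(List<DragPattern> gesture) {
        return mapping.get(gesture);
    }

    // Iterating over the mapping is enough to build a gesture guide like the one in FIG. 6.
    void printGuide() {
        mapping.forEach((gesture, skill) -> System.out.println(skill.name() + " <- " + gesture));
    }
}
```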
- FIG. 2 is a flowchart illustrating a method of processing a user gesture input in an online game according to another embodiment of the present invention.
- the mobile terminal accesses a game server to start an online game.
- In step 204, it is determined whether the target character of the online game is targeted.
- The target character, as an opponent of the user character in the online game, may be, for example, a monster that is the target of combat and hunting in an RPG game.
- Targeting may be performed by touching the corresponding target character.
- When the target character is targeted, a gesture including a drag is input, and when the gesture is input, a function corresponding to the gesture is performed.
- The function may be, for example, a weapon, an attack method, or the like that the user character can use in battle and hunting. Such functions may vary depending on the level of the user character, and of course the same applies not only to RPG games but also to games of other genres.
- If in step 204 the target character is not targeted, the user character is moved according to the touch-drag input. For example, in the game space or 3D space, the user character is moved to the position corresponding to the touched coordinates, or is moved in the dragged direction.
- In this way, the method of processing user gesture input in an online game minimizes the touches required to perform a function of the user character, so that functions can be performed efficiently and quickly.
- Performing functions solely through gesture input, including touch-drag, also enhances the user's feeling of directly manipulating the user character, and because no separate virtual keypad or operation keys are displayed in the display area of the mobile terminal, a wide field of view can be provided to the user.
- FIG. 3 is a flowchart illustrating a method of processing a user gesture input in an online game according to another embodiment of the present invention.
- In step 300, the target character is targeted.
- In step 302, if there is a drag input from the user, then in step 304 it is determined whether the starting point of the drag is valid.
- The starting point of the drag refers to the point the user first touches when beginning the drag.
- Whether the starting point is valid is determined by whether the first touched point lies within a designated part of the display area of the mobile terminal, for example the play area of the user character and the target character. If the starting point of the drag is outside this valid range, the drag input is ignored.
- If the starting point of the drag is valid, then in step 306 it is determined whether the input drag is a valid gesture.
- the gesture includes a combination of unidirectional drag or multidirectional drag. The determination of the gesture will be described later with reference to FIGS. 8 to 10.
- If in step 306 the input drag is a valid gesture, then in step 308 it is determined whether there are a plurality of valid gestures. If there is a single valid gesture, the function corresponding to that gesture is performed in step 314. If there are multiple valid gestures, their number and order are determined in step 310, and in step 312 a function according to the number and order of the gestures is performed.
- If the input in step 302 is not a drag, then in step 316 it is checked whether it is a touch input, and if so, in step 318 the user character is moved to the position corresponding to the touched coordinates.
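The branching of FIGS. 2 and 3 can be summarized in a short handler. The sketch below reuses the GestureSkillMap sketch above; the play-area rectangle, the pre-classified drag patterns, and the method signature are assumptions made for illustration, not the patent's actual implementation.

```java
import java.awt.Point;
import java.awt.Rectangle;
import java.util.List;

// Rough outline of the decision flow of FIG. 3 (steps 300-318), under assumed types.
final class GestureInputHandler {
    private final GestureSkillMap skills = new GestureSkillMap();
    private final Rectangle playArea;   // valid range: play area of user and target characters

    GestureInputHandler(Rectangle playArea) {
        this.playArea = playArea;
    }

    /** Returns the skill to perform, or null when the drag is ignored or handled as movement. */
    Skill handleDrag(boolean targetSelected, Point dragStart, List<DragPattern> classifiedDrags) {
        if (!targetSelected) {
            // Not targeted: the touch-drag would instead move the user character (step 318).
            return null;
        }
        if (!playArea.contains(dragStart)) {
            return null;                // step 304: the drag must start inside the valid range
        }
        if (classifiedDrags.isEmpty()) {
            return null;                // step 306: no valid gesture recognized
        }
        // Steps 308-314: the number and order of drag patterns select the function.
        return skills.lookup(classifiedDrags);
    }
}
```

Character movement (steps 316 and 318) is left out of the sketch because it depends on the game's coordinate system rather than on gesture classification.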
- FIG. 4 is an exemplary diagram for describing a user gesture input and expression according to an embodiment of the present invention.
- an area 400 corresponding to an effective range of a gesture input is shown on a display screen of a mobile terminal.
- the effective range area 400 is within the play area of the user character 410 and the target character 420.
- a gesture including at least one drag direction is input.
- The starting point 441 of the downward gesture 440 is within the effective range area 400, but its end point is outside the area 400; conversely, the starting point 451 of the upward gesture 450 is outside the effective range area 400, while its end point 452 is within the area 400.
- To ensure the validity of the user's gesture input, a gesture is determined to be valid when its start position is within the effective range area, regardless of its end position. That is, in FIG. 4, the gesture 440 is determined to be a valid gesture, while the gesture 450 is determined to be an invalid gesture.
- In addition, a gesture input on the menu 460, where the user can invoke functions by touch, is determined to be an invalid gesture. Consequently, the downward gesture 440 performs its corresponding function, and the upward gesture 450 does not.
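The rule of FIG. 4, that only the drag's starting point decides validity and that drags starting on the touch menu are ignored, can be expressed as a small predicate. Treating the menu 460 as a rectangle is an assumption made for this sketch.

```java
import java.awt.Point;
import java.awt.Rectangle;

// Sketch of the FIG. 4 validity rule: a gesture is valid if its drag starts inside the
// play area (the end point is ignored) and does not start on a touch-operated menu.
final class GestureValidity {
    private final Rectangle playArea;   // effective range area 400
    private final Rectangle menuArea;   // menu 460, assumed rectangular here

    GestureValidity(Rectangle playArea, Rectangle menuArea) {
        this.playArea = playArea;
        this.menuArea = menuArea;
    }

    boolean isValid(Point dragStart, Point dragEnd) {
        // dragEnd is deliberately unused: validity depends only on the starting point.
        return playArea.contains(dragStart) && !menuArea.contains(dragStart);
    }
}
```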
- FIG. 5 is another exemplary diagram for explaining a user gesture input and expression according to an exemplary embodiment.
- a trail 510 of a gesture is output based on a touch point or a starting point 500 of the drag.
- When targeting is released, the trail following the touch is not output. The trail automatically disappears a certain period of time after being output.
- FIG. 6 is another exemplary diagram for describing a user gesture input and expression according to an embodiment of the present invention.
- the gesture guide 600 is output at the bottom right of the screen.
- the gesture guide includes respective gestures 620 corresponding to functions 610 that the user character can use.
- a gesture shape of the corresponding function may be output on the function icon.
- the gesture guide 600 may appear or disappear at a certain area of the screen, for example, at the bottom right, by the user's selection.
- FIG. 7 is another exemplary view for explaining a user gesture input and expression according to an embodiment of the present invention.
- In FIG. 7, the shape 710 of the gesture for the performed function is output at the bottom of the screen, and the icon 720 of the function is output at the top of the screen.
- the output gesture guide 700 disappears. For example, the gesture guide 700 may disappear until the next gesture is input.
- FIG. 8 is an exemplary diagram for describing a user gesture determination according to an exemplary embodiment.
- patterns of drag in eight directions are shown.
- These comprise the four up, down, left, and right directions (the first, third, fifth, and seventh patterns) and the four diagonal directions (the second, fourth, sixth, and eighth patterns).
- The drag pattern is analyzed by accumulating the amount of change in the coordinates moved during the drag.
- The amount of change in the left-right (X-axis) and up-down (Y-axis) directions is stored, and the stored change amounts are analyzed to determine one of the patterns shown in FIG. 8.
- Functions are classified and performed according to the number of patterns, and when the accumulated change amount exceeds a set minimum value, the up, down, left, and right components are each stored as one bit, so that, for example, a total of eight directional pattern values can be used. A more specific pattern determination method is described with reference to FIGS. 9 and 10.
- FIG. 9 is an example diagram for describing a gesture including a unidirectional drag as an example of a user gesture determination according to an exemplary embodiment.
- In FIG. 9A, an invalid range 900 and an effective range 910 are shown. It is determined whether the accumulated change in the coordinates moved by the drag along the X and Y axes falls in the invalid range 900 or the effective range 910.
- Because the drag 901 illustrated in FIG. 9A is in the invalid range 900 in both the X-axis and Y-axis directions, it is determined not to be a valid gesture, and no corresponding function is performed.
- Because the drag 902 illustrated in FIG. 9B is in the effective range 910 in both the X-axis and Y-axis directions, it is determined to be a valid gesture, and the corresponding function is performed.
- The drag 903 shown in FIG. 9C is in the effective range 910 in the X-axis direction but in the invalid range 900 in the Y-axis direction. In this case, the drag is judged to be a valid gesture because it is in the effective range along at least one axis.
- Accordingly, the drag 903 illustrated in FIG. 9C is determined to correspond to pattern 3 shown in FIG. 8, and the function corresponding to that gesture is performed.
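A minimal sketch of the accumulation-and-threshold scheme of FIGS. 8 and 9: the drag's X and Y displacement is accumulated, each axis is compared against a minimum value, and the surviving axis signs select one of the eight patterns (reusing the DragPattern enum from the earlier sketch). The concrete threshold and the use of signed accumulation are assumptions.

```java
import java.awt.Point;
import java.util.List;

// Accumulate the drag's X/Y change, discard axes whose accumulated change stays below a
// minimum value (invalid range 900), and map the remaining signs to one of eight patterns.
final class GestureClassifier {
    private static final int MIN_CHANGE = 60;   // assumed per-axis minimum, in pixels

    static DragPattern classify(List<Point> dragPoints) {
        int dx = 0, dy = 0;
        for (int i = 1; i < dragPoints.size(); i++) {
            dx += dragPoints.get(i).x - dragPoints.get(i - 1).x;
            dy += dragPoints.get(i).y - dragPoints.get(i - 1).y;
        }
        boolean xValid = Math.abs(dx) >= MIN_CHANGE;   // effective range 910 on the X axis
        boolean yValid = Math.abs(dy) >= MIN_CHANGE;   // effective range 910 on the Y axis
        if (!xValid && !yValid) {
            return null;                               // FIG. 9A: not a valid gesture
        }
        // Screen coordinates: y grows downward, so dy < 0 means an upward drag.
        if (xValid && yValid) {
            if (dx > 0) return dy < 0 ? DragPattern.UP_RIGHT : DragPattern.DOWN_RIGHT;
            return dy < 0 ? DragPattern.UP_LEFT : DragPattern.DOWN_LEFT;
        }
        if (xValid) {
            return dx > 0 ? DragPattern.RIGHT : DragPattern.LEFT;   // FIG. 9C: X axis only
        }
        return dy < 0 ? DragPattern.UP : DragPattern.DOWN;
    }
}
```

Under this scheme the drag 903 of FIG. 9C, whose Y displacement stays below the threshold, resolves to a purely horizontal pattern, matching the behavior described above.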
- FIG. 10 is a diagram for explaining a gesture including multidirectional drag as another example of determining a user gesture according to an exemplary embodiment.
- Referring to FIG. 10, a method of determining a gesture composed of multidirectional drags, as opposed to the unidirectional drag illustrated in FIG. 9, is described.
- a circular gesture is shown in FIG. 10A and a repeat gesture is shown in FIG. 10B.
- In FIG. 10A, the A-direction to D-direction drags are all determined to be valid gestures on both the X and Y axes: the A-direction drag corresponds to pattern 2, the B-direction drag to pattern 4, the C-direction drag to pattern 6, and the D-direction drag to pattern 8. From this sequence of patterns, the final gesture is determined.
- In FIG. 10B, the A-direction drag is a valid gesture on the X axis and an invalid gesture on the Y axis.
- The B-direction drag is determined to be a valid gesture on both the X axis and the Y axis and corresponds to pattern 6.
- The C-direction drag is a valid gesture on the X axis and an invalid gesture on the Y axis, and corresponds to pattern 3. Therefore, the gesture shown in FIG. 10B is determined to be a repeated gesture of patterns 3, 6, and 3.
- the pattern determination method has been described with a specific pattern shape, but the present disclosure is not limited thereto, and the same determination method may be applied to various patterns.
- Also, when the input does not exactly match a defined pattern, the function corresponding to the closest pattern may be performed, so that the user's intention, that is, the intention to use the function, is reflected as much as possible.
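Building on the classifier above, a multidirectional gesture is simply the sequence of patterns obtained by classifying each drag segment in turn, for example the circular gesture of FIG. 10A or the repeated gesture of FIG. 10B. The fallback to the closest gesture is sketched with a deliberately simple positional-similarity score; the patent does not specify the similarity measure, so that part is an assumption.

```java
import java.awt.Point;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Classify each drag segment separately and treat the resulting pattern sequence as the
// gesture; fall back to the most similar mapped gesture when there is no exact match.
final class MultiDragMatcher {
    static List<DragPattern> classifySegments(List<List<Point>> segments) {
        List<DragPattern> gesture = new ArrayList<>();
        for (List<Point> segment : segments) {
            DragPattern p = GestureClassifier.classify(segment);
            if (p != null) {
                gesture.add(p);        // segments that are invalid on both axes are skipped
            }
        }
        return gesture;
    }

    /** Exact match first; otherwise the mapped gesture sharing the most positions wins. */
    static Skill resolve(List<DragPattern> gesture, Map<List<DragPattern>, Skill> mapping) {
        Skill exact = mapping.get(gesture);
        if (exact != null) {
            return exact;
        }
        Skill best = null;
        int bestScore = 0;
        for (Map.Entry<List<DragPattern>, Skill> entry : mapping.entrySet()) {
            List<DragPattern> candidate = entry.getKey();
            int score = 0;
            for (int i = 0; i < Math.min(candidate.size(), gesture.size()); i++) {
                if (candidate.get(i) == gesture.get(i)) {
                    score++;
                }
            }
            if (score > bestScore) {
                bestScore = score;
                best = entry.getValue();
            }
        }
        return best;    // may be null when nothing is even partially similar
    }
}
```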
- FIGS. 11 to 17 are exemplary diagrams for describing various types of user gestures and functions according to an embodiment of the present invention.
- When the target character 1210 is targeted 1220 and the user inputs the leftward drag 1250, it is determined that the gesture input is within the valid range, and a function corresponding to the input gesture is performed.
- For example, if the user character 1200 is an assassin, a function corresponding to an evasion-throw skill is performed, and if a warrior, a function corresponding to a breaking-axe skill is performed.
- In addition, the shape of the gesture 1230 and the corresponding function icon 1240 are output.
- When the target character 1410 is targeted 1420 and the user inputs the upward-right drag 1450, it is determined that the gesture input is within the valid range, and a function corresponding to the input gesture is performed.
- For example, if the user character 1400 is an assassin, a function corresponding to the Seung Ryong Charm skill is performed.
- In addition, the gesture 1430 and the corresponding function icon 1440 are output.
- Likewise, when the gesture input is within the valid range, a function corresponding to the input gesture is performed; for example, if the user character 1500 is an assassin, a function corresponding to a blood-blast charm is performed, and if a warrior, a function corresponding to an armor skill is performed.
- The shape 1530 of the gesture and the corresponding function icon 1540 are output.
- When the target character 1610 is targeted 1620 and the user inputs the circular drag 1650, a function corresponding to the input gesture is performed; for example, if the user character 1600 is an assassin, a function corresponding to a blood blow and a jin injection is performed.
- The gesture 1630 and the corresponding function icon 1640 are output.
- The method of processing user gesture input in an online game performed through a mobile terminal capable of touch-drag input may be executed by an application basically installed in the mobile terminal (which may include the platform, operating system, and the like mounted in the terminal by default), or by an application (i.e., a program) that the user installs directly on the terminal through an application providing server such as an application store server or a web server associated with the application or the corresponding service.
- In this sense, the method of processing user gesture input according to an embodiment of the present invention described above may be implemented as an application (i.e., a program) that is basically installed in a terminal or directly installed by the user, and may be recorded on a recording medium that can be read by a computer such as the terminal.
- Such a program is recorded on a recording medium readable by a computer and executed by a computer so that the above functions can be executed.
- The above-described program may include code encoded in a computer language such as C, C++, or Java, or in machine language, that can be read by a computer processor (CPU).
- Such code may include a function code associated with a function or the like that defines the above-described functions, and may include execution procedure-related control code necessary for a processor of the computer to execute the above-described functions according to a predetermined procedure.
- The code may further include memory-reference-related code indicating at which location (address) of the computer's internal or external memory the additional information or media required for the computer's processor to execute the above-described functions should be referenced.
- In addition, when the computer's processor needs to communicate with another remote computer or server through a communication module of the computer (e.g., a wired and/or wireless communication module) in order to execute the above-described functions, the code may further include communication-related code specifying how the communication should be performed and what information or media should be transmitted and received during the communication.
- Such codes and the code segments associated with them may be easily inferred or modified by programmers in the art, in consideration of the system environment of the computer that reads the recording medium and executes the program.
- Examples of recording media that can be read by a computer recording a program as described above include, for example, a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, an optical media storage device, and the like.
- a computer-readable recording medium having recorded a program as described above may be distributed to computer systems connected through a network so that computer-readable codes may be stored and executed in a distributed manner.
- In this case, at least one of the plurality of distributed computers may execute some of the functions presented above and transmit the result of that execution to at least one of the other distributed computers.
- the receiving computer may also execute some of the functions presented above, and provide the results to other distributed computers as well.
- In particular, the computer-readable recording medium on which the application, that is, the program for executing the method of processing user gesture input, is recorded may be a storage medium (e.g., a hard disk) included in an application provider server such as an application store server or a web server associated with the application or the corresponding service, or it may be the application providing server itself.
- The computer capable of reading the recording medium on which the application, that is, the program for executing the method of processing user gesture input according to each embodiment of the present invention, is recorded includes not only general PCs such as desktop and notebook computers, but also mobile terminals such as smartphones, tablet PCs, personal digital assistants (PDAs), and mobile communication terminals, and should be interpreted as any computing device.
- When the computer capable of reading the recording medium on which the application is recorded is a smartphone, a tablet PC, a personal digital assistant (PDA), or a mobile communication terminal, the application may be downloaded from the application providing server to a general PC and installed on the mobile terminal through a synchronization program.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Business, Economics & Management (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Tourism & Hospitality (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Human Resources & Organizations (AREA)
- Marketing (AREA)
- Primary Health Care (AREA)
- Strategic Management (AREA)
- Economics (AREA)
- General Business, Economics & Management (AREA)
- User Interface Of Digital Computer (AREA)
Claims (17)
- 1. A method of processing user gesture input in an online game performed through a mobile terminal capable of touch-drag input, the method comprising: determining whether a target character of the online game is targeted; and, when the target character is targeted, performing a function corresponding to a gesture comprising at least one drag direction, and, when the target character is not targeted, moving a user character according to the touch-drag input.
- 2. The method of claim 1, further comprising: when the target character is targeted, determining a valid range for the gesture input; and, when the gesture input is within the valid range, performing the function corresponding to the input gesture, wherein the valid range is within the play area of the target character and the user character in the display area of the mobile terminal.
- 3. The method of claim 2, wherein the gesture is determined to be a gesture input within the valid range when the starting point of the drag constituting the gesture is within the valid range.
- 4. The method of claim 2, wherein, when the starting point of the drag constituting the gesture is within the valid range, the gesture is determined to be a gesture input within the valid range regardless of the end point of the drag.
- 5. The method of claim 1, further comprising, when the target character is targeted, outputting a trail of the gesture at the position of the display area of the mobile terminal corresponding to the starting point of the drag constituting the gesture.
- 6. The method of claim 1, wherein, when the function corresponding to the gesture has been performed, an icon corresponding to the performed function and a shape representing the gesture are displayed in the display area of the mobile terminal.
- 7. The method of claim 1, wherein the gesture consists of a unidirectional drag, and the gesture is determined to be a gesture input within the valid range when the drag is within at least one of the valid ranges set on the X axis and the Y axis.
- 8. The method of claim 1, wherein the gesture consists of multidirectional drags, and the gesture is determined to be a gesture input within the valid range when each of the first direction drag to the Nth direction drag (N being an integer of 2 or more) is within at least one of the valid ranges set on the X axis and the Y axis and the N-1 drags are valid.
- 9. The method of claim 8, wherein whether the gesture composed of multidirectional drags including the first direction drag to the Nth direction drag is a gesture input within the valid range is determined regardless of the order in which the directions start.
- 10. The method of any one of claims 7 to 9, wherein, when the input is not a gesture input within the valid range, a function corresponding to a gesture similar to the input gesture is performed.
- 11. The method of claim 1, wherein the function includes at least one skill usable by the user character.
- 12. The method of claim 1, wherein a gesture guide including at least one function usable by the user character and the gestures, each comprising the at least one drag direction, mapped to the functions is displayed in the display area of the mobile terminal.
- 13. The method of claim 1, wherein the online game is an RPG including an MMORPG or MORPG, an AOS (Aeon of Strife) game, an RTS (Real Time Strategy) game, an FPS/TPS (First/Third Person Shooter) game, or a sports game.
- 14. A method of processing user gesture input in an online game performed through a mobile terminal capable of touch-drag input, the method comprising: determining whether a target character of the online game is targeted; when the target character is targeted, determining a gesture input based on at least one of a drag direction and a number of drags; and, when a valid gesture is input as a result of the determination of the gesture input, performing a function corresponding to the input gesture.
- 15. The method of claim 14, further comprising moving a user character according to the touch-drag input when the target character is not targeted.
- 16. A mobile terminal for processing user gesture input in an online game, the mobile terminal being capable of touch-drag input and comprising: a targeting determination unit that determines whether a target character in the online game is targeted; and a control unit that performs a function corresponding to a gesture comprising at least one drag direction when the target character is targeted, and controls a user character to move according to the touch-drag input when the target character is not targeted.
- 17. A recording medium recording a program for implementing, through a mobile terminal capable of touch-drag input, a method of processing user gesture input in an online game, the program comprising: first program code for determining whether a target character of the online game is targeted; second program code for performing a function corresponding to a gesture comprising at least one drag direction when the target character is targeted; and third program code for moving a user character according to the touch-drag input when the target character is not targeted.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/412,805 US20150157932A1 (en) | 2012-07-06 | 2012-11-09 | Method of processing user gesture inputs in online game |
CN201280075683.6A CN104603823A (zh) | 2012-07-06 | 2012-11-09 | 处理在线游戏中的用户手势输入的方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR10-2012-0074111 | 2012-07-06 | ||
KR1020120074111A KR101398086B1 (ko) | 2012-07-06 | 2012-07-06 | 온라인 게임에서의 유저 제스처 입력 처리 방법 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014007437A1 true WO2014007437A1 (ko) | 2014-01-09 |
Family
ID=49882171
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2012/009450 WO2014007437A1 (ko) | 2012-07-06 | 2012-11-09 | 온라인 게임에서의 유저 제스처 입력 처리 방법 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20150157932A1 (ko) |
KR (1) | KR101398086B1 (ko) |
CN (1) | CN104603823A (ko) |
WO (1) | WO2014007437A1 (ko) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018066862A1 (ko) * | 2016-10-06 | 2018-04-12 | 주식회사 핀콘 | 좌표 공략 게임 시스템 및 방법 |
Families Citing this family (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104346032B (zh) * | 2013-08-09 | 2019-07-26 | 联想(北京)有限公司 | 一种信息处理方法及电子设备 |
KR101628780B1 (ko) | 2014-07-01 | 2016-06-09 | (주)위메이드엔터테인먼트 | 터치스크린 기반 사용자 조작 입력 처리 장치, 방법 및 애플리케이션 |
US9904463B2 (en) * | 2014-09-23 | 2018-02-27 | Sulake Corporation Oy | Method and apparatus for controlling user character for playing game within virtual environment |
KR101639037B1 (ko) * | 2015-01-28 | 2016-07-12 | 주식회사 두바퀴소프트 | 실시간 선택 스킬을 적용한 게임화면 표시방법 및 어플리케이션 |
US10286314B2 (en) * | 2015-05-14 | 2019-05-14 | Activision Publishing, Inc. | System and method for providing continuous gameplay in a multiplayer video game through an unbounded gameplay session |
KR101739840B1 (ko) * | 2015-06-10 | 2017-05-25 | (주)엔도어즈 | 게임 서비스 제공 장치 및 그 제어 방법 |
CN105148517B (zh) * | 2015-09-29 | 2017-08-15 | 腾讯科技(深圳)有限公司 | 一种信息处理方法、终端及计算机存储介质 |
CN105498202A (zh) * | 2015-11-26 | 2016-04-20 | 珠海网易达电子科技发展有限公司 | 全触屏划屏射击操作方式 |
CN105549884A (zh) * | 2015-12-11 | 2016-05-04 | 杭州勺子网络科技有限公司 | 一种触屏的手势输入识别方法 |
KR20170104819A (ko) * | 2016-03-08 | 2017-09-18 | 삼성전자주식회사 | 제스처를 가이드하는 전자 장치 및 그의 제스처 가이드 방법 |
CN106237615A (zh) * | 2016-07-22 | 2016-12-21 | 广州云火信息科技有限公司 | 多单位元素技能操作方式 |
WO2018084169A1 (ja) * | 2016-11-01 | 2018-05-11 | 株式会社コロプラ | ゲーム方法およびゲームプログラム |
WO2018216079A1 (ja) | 2017-05-22 | 2018-11-29 | 任天堂株式会社 | ゲームプログラム、情報処理装置、情報処理システム、および、ゲーム処理方法 |
JP6921193B2 (ja) * | 2017-05-22 | 2021-08-18 | 任天堂株式会社 | ゲームプログラム、情報処理装置、情報処理システム、および、ゲーム処理方法 |
JP6921192B2 (ja) | 2017-05-22 | 2021-08-18 | 任天堂株式会社 | ゲームプログラム、情報処理装置、情報処理システム、および、ゲーム処理方法 |
US10413814B2 (en) * | 2017-06-09 | 2019-09-17 | Supercell Oy | Apparatus and method for controlling user interface of computing apparatus |
CN107450812A (zh) * | 2017-06-26 | 2017-12-08 | 网易(杭州)网络有限公司 | 虚拟对象控制方法及装置、存储介质、电子设备 |
CN107469344A (zh) * | 2017-08-04 | 2017-12-15 | 上海风格信息技术股份有限公司 | 一种实现多人在线游戏移动终端点触操控的方法 |
CN109491579B (zh) * | 2017-09-12 | 2021-08-17 | 腾讯科技(深圳)有限公司 | 对虚拟对象进行操控的方法和装置 |
KR102072093B1 (ko) * | 2018-08-31 | 2020-03-02 | 주식회사 게임빈 | 웹 기반 서버-클라이언트 시스템에서 이미지 기반 게임 클라이언트의 보상 요청의 유효성을 검증하는 방법 |
KR102072092B1 (ko) * | 2017-12-29 | 2020-01-31 | 주식회사 게임빈 | 웹 기반 서버-클라이언트 시스템에서 코드 기반 게임 클라이언트의 보상 요청 신호에 대한 유효성 검증 방법 |
CN109126129B (zh) * | 2018-08-31 | 2022-03-08 | 腾讯科技(深圳)有限公司 | 在虚拟环境中对虚拟物品进行拾取的方法、装置及终端 |
US11045719B2 (en) * | 2018-09-12 | 2021-06-29 | King.Com Ltd. | Method and computer device for controlling a touch screen |
CN109513209B (zh) | 2018-11-22 | 2020-04-17 | 网易(杭州)网络有限公司 | 虚拟对象处理方法及装置、电子设备以及存储介质 |
CN110393916B (zh) | 2019-07-26 | 2023-03-14 | 腾讯科技(深圳)有限公司 | 视角转动的方法、装置、设备及存储介质 |
CN111282266B (zh) * | 2020-02-14 | 2021-08-03 | 腾讯科技(深圳)有限公司 | 三维虚拟环境中的技能瞄准方法、装置、终端及存储介质 |
CN111481932B (zh) * | 2020-04-15 | 2022-05-17 | 腾讯科技(深圳)有限公司 | 虚拟对象的控制方法、装置、设备和存储介质 |
CN111530075B (zh) * | 2020-04-20 | 2022-04-05 | 腾讯科技(深圳)有限公司 | 虚拟环境的画面显示方法、装置、设备及介质 |
JP7270008B2 (ja) * | 2020-09-08 | 2023-05-09 | カムツス コーポレーション | ゲーム提供方法、コンピュータプログラム、コンピュータ読取可能な記録媒体、およびコンピュータ装置 |
CA3160509A1 (en) * | 2021-05-14 | 2022-11-14 | Tencent Technology (Shenzhen) Company Limited | Virtual object control method, apparatus, device, and computer-readable storage medium |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006175059A (ja) * | 2004-12-22 | 2006-07-06 | Konami Co Ltd | ゲーム装置及びゲームプログラム |
JP2007130367A (ja) * | 2005-11-14 | 2007-05-31 | Nintendo Co Ltd | ゲーム装置およびゲームプログラム |
KR20100098972A (ko) * | 2009-03-02 | 2010-09-10 | 주식회사 엔씨소프트 | 온라인게임에서의 캐릭터 액션 입력장치 및 그 방법 |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4260770B2 (ja) * | 2005-05-09 | 2009-04-30 | 任天堂株式会社 | ゲームプログラムおよびゲーム装置 |
US20060277466A1 (en) * | 2005-05-13 | 2006-12-07 | Anderson Thomas G | Bimodal user interaction with a simulated object |
US20110273380A1 (en) * | 2010-05-07 | 2011-11-10 | Research In Motion Limited | Portable electronic device and method of controlling same |
-
2012
- 2012-07-06 KR KR1020120074111A patent/KR101398086B1/ko active IP Right Grant
- 2012-11-09 US US14/412,805 patent/US20150157932A1/en not_active Abandoned
- 2012-11-09 CN CN201280075683.6A patent/CN104603823A/zh active Pending
- 2012-11-09 WO PCT/KR2012/009450 patent/WO2014007437A1/ko active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006175059A (ja) * | 2004-12-22 | 2006-07-06 | Konami Co Ltd | ゲーム装置及びゲームプログラム |
JP2007130367A (ja) * | 2005-11-14 | 2007-05-31 | Nintendo Co Ltd | ゲーム装置およびゲームプログラム |
KR20100098972A (ko) * | 2009-03-02 | 2010-09-10 | 주식회사 엔씨소프트 | 온라인게임에서의 캐릭터 액션 입력장치 및 그 방법 |
Non-Patent Citations (2)
Title |
---|
"''[Galaxy S2 Start] Gamevil, 2012 Professional baseball/2012'' My pitcher//Android Professional baseball''", NAVER BLOG, 27 October 2011 (2011-10-27), Retrieved from the Internet <URL:http://blog.naver.com/thescales?Redirect=Log&logNo=50124716308> * |
"Applications Game Review] Battle Heart(Battle heart)", TISTORY BLOG, 26 August 2011 (2011-08-26), Retrieved from the Internet <URL:http://mysticvision.tistory.com/33.> * |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018066862A1 (ko) * | 2016-10-06 | 2018-04-12 | 주식회사 핀콘 | 좌표 공략 게임 시스템 및 방법 |
Also Published As
Publication number | Publication date |
---|---|
KR20140006642A (ko) | 2014-01-16 |
CN104603823A (zh) | 2015-05-06 |
US20150157932A1 (en) | 2015-06-11 |
KR101398086B1 (ko) | 2014-05-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2014007437A1 (ko) | 온라인 게임에서의 유저 제스처 입력 처리 방법 | |
US9492753B2 (en) | Game control device, program, recording medium, game control method, game control system | |
CN113101652A (zh) | 信息展示方法、装置、计算机设备及存储介质 | |
CN113786620B (zh) | 游戏信息推荐方法、装置、计算机设备及存储介质 | |
WO2022037529A1 (zh) | 虚拟对象的控制方法、装置、终端及存储介质 | |
KR20140135276A (ko) | 게임 스크린에 대한 사용자의 제스쳐 입력을 처리하는 장치 및 방법 | |
CN114159789A (zh) | 游戏交互方法、装置、计算机设备及存储介质 | |
KR101407483B1 (ko) | 터치 스크린을 구비한 모바일 단말기를 이용하여 온라인 게임을 수행하는 방법 및 시스템. | |
KR101404635B1 (ko) | 온라인 게임에서의 드래그 입력 처리 방법 | |
KR102609293B1 (ko) | 게임 동작 결정 장치 및 방법 | |
JP5409861B1 (ja) | ゲームシステム及びゲーム制御方法 | |
KR102260409B1 (ko) | 게임 인터페이스 방법 및 장치 | |
KR101417947B1 (ko) | 온라인 게임에서의 유저 제스처 입력 처리 방법 | |
CN104684622A (zh) | 可显示评论的游戏系统以及评论显示控制方法 | |
KR102584901B1 (ko) | 이벤트 정보 송신 장치 및 방법, 이벤트 정보 출력 장치 및 방법 | |
WO2013085196A1 (ko) | 온라인 게임에서의 소셜 네트워크 서비스 제공 방법 및 이를 수행하는 서버 | |
CN115040867A (zh) | 一种游戏卡牌控制方法、装置、计算机设备及存储介质 | |
CN114225412A (zh) | 信息处理方法、装置、计算机设备及存储介质 | |
CN115999153A (zh) | 虚拟角色的控制方法、装置、存储介质及终端设备 | |
CN114053714A (zh) | 虚拟对象的控制方法、装置、计算机设备及存储介质 | |
WO2013085189A1 (ko) | 아이템 사용 서비스 제공 방법 및 서버 | |
WO2013085105A1 (ko) | 온라인 게임에서의 친구간의 전적 제공 방법 및 서버 | |
KR102712772B1 (ko) | 콘텐츠 내 오브젝트 선택 장치 및 선택 방법 | |
WO2023193605A1 (zh) | 虚拟道具的处理方法、装置、终端、介质及程序产品 | |
KR102551096B1 (ko) | 클라우드 게임 서비스 제공 장치 및 클라우드 게임 서비스 제공 방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12880626 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14412805 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 29/05/2015) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12880626 Country of ref document: EP Kind code of ref document: A1 |