WO2021017783A1 - Method, apparatus, device, and storage medium for viewing angle rotation - Google Patents
Method, apparatus, device, and storage medium for viewing angle rotation
- Publication number
- WO2021017783A1 (PCT/CN2020/100873)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- viewing angle
- function
- angle rotation
- control
- function control
- Prior art date
Classifications
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/2145—Input arrangements characterised by their sensors for locating contacts on a surface, the surface being also a display device, e.g. touch screens
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- A63F13/525—Changing parameters of virtual cameras
- A63F13/5252—Changing parameters of virtual cameras using two or more virtual cameras concurrently or sequentially, e.g. automatically switching between fixed virtual cameras when a character changes room or displaying a rear-mirror view in a car-driving game
- A63F13/837—Shooting of targets
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
- A63F2300/6669—Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera using a plurality of virtual cameras concurrently or sequentially
- A63F2300/6676—Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera by dedicated player input
- A63F2300/8076—Shooting
- A63F2300/8082—Virtual reality
Definitions
- The embodiments of the present application relate to the field of human-computer interaction, and in particular to viewing angle rotation technology.
- In shooting applications, the user aims at targets and observes the environment by controlling the rotation of the virtual character's viewing angle.
- The user interface of such an application provides a viewing angle rotation control, and the user controls the rotation of the virtual character's viewing angle through movement operations (up, down, left, or right) triggered on that control; during this process, the screen can respond to the viewing angle rotation operation of only one contact at a time.
- Because the screen can respond to only one contact at a time, once that sole contact disappears, rotating the viewing angle again requires triggering a new contact, which reduces interaction efficiency.
- To address this, the embodiments of the present application provide a viewing angle rotation method, apparatus, device, and storage medium, which can improve interaction efficiency during viewing angle rotation operations.
- The technical solution is as follows:
- A viewing angle rotation method, executed by a terminal, the method including:
- displaying a first-perspective picture of an application program, the first-perspective picture being the picture shown when the virtual environment is observed from the first perspective direction of the virtual character, with a first function control and a second function control superimposed on it;
- the first function control being used to support a first function and a viewing angle rotation function;
- the second function control being used to support a second function and the viewing angle rotation function;
- receiving a first viewing angle rotation operation triggered on the first function control, turning on the first function and the viewing angle rotation function of the first function control accordingly, and switching the first-perspective picture to a second-perspective picture;
- the second-perspective picture being the picture shown when the virtual environment is observed from the second perspective direction of the virtual character;
- when the first function control is in the on state, receiving a second viewing angle rotation operation triggered on the second function control, turning off the viewing angle rotation function of the first function control, turning on the second function and the viewing angle rotation function of the second function control, and switching the second-perspective picture to a third-perspective picture; the third-perspective picture being the picture shown when the virtual environment is observed from the third perspective direction of the virtual character.
- A viewing angle rotation apparatus, including:
- a display module, configured to display a first-perspective picture of an application program;
- the first-perspective picture being the picture shown when the virtual environment is observed from the first perspective direction of the virtual character, with a first function control and a second function control superimposed on it;
- the first function control being used to support a first function and a viewing angle rotation function;
- the second function control being used to support a second function and the viewing angle rotation function;
- a receiving module, configured to receive a first viewing angle rotation operation triggered on the first function control;
- a processing module, configured to turn on the first function and the viewing angle rotation function of the first function control according to the first viewing angle rotation operation, and to switch the first-perspective picture to a second-perspective picture;
- the second-perspective picture being the picture shown when the virtual environment is observed from the second perspective direction of the virtual character;
- the receiving module being further configured to receive a second viewing angle rotation operation triggered on the second function control when the first function control is in the on state;
- the processing module being further configured to turn off the viewing angle rotation function of the first function control according to the second viewing angle rotation operation, turn on the second function and the viewing angle rotation function of the second function control, and switch the second-perspective picture to a third-perspective picture;
- the third-perspective picture being the picture shown when the virtual environment is observed from the third perspective direction of the virtual character.
- A terminal, including:
- a memory, and a processor connected to the memory;
- the processor being configured to load and execute executable instructions to implement the viewing angle rotation method described in the previous aspect and any of its optional embodiments.
- A computer-readable storage medium storing at least one instruction, at least one program, a code set, or an instruction set;
- the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by a processor to implement the viewing angle rotation method described in the previous aspect and any of its optional embodiments.
- A computer program product including instructions which, when run on a computer, cause the computer to execute the viewing angle rotation method described in the previous aspect and any of its optional embodiments.
- The beneficial effects are as follows: the terminal displays the first-perspective picture of the application program.
- The first-perspective picture is superimposed with the first function control and the second function control;
- the first function control is used to support the first function and the viewing angle rotation function;
- the second function control is used to support the second function and the viewing angle rotation function.
- The terminal receives the first viewing angle rotation operation triggered on the first function control; according to this operation, it turns on the first function and the viewing angle rotation function of the first function control and switches the first-perspective picture to the second-perspective picture. When the first function control is in the on state, the terminal receives the second viewing angle rotation operation triggered on the second function control; according to this second operation, it turns off the viewing angle rotation function of the first function control, turns on the second function and the viewing angle rotation function of the second function control, and switches the second-perspective picture to the third-perspective picture.
- Thus, while the viewing angle rotation function of the first function control is triggered, the method can still respond to a viewing angle rotation operation triggered on the second function control; that is, the screen can respond to the viewing angle rotation operations of at least two contacts at the same time, which improves interaction efficiency during operation.
- In addition, when the first function control is in the on state and a viewing angle rotation operation is triggered on the second function control, the terminal responds first to the operation triggered on the second function control, which guarantees the order and accuracy of the terminal's responses when multiple function controls providing the viewing angle rotation function are all turned on.
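The response logic above can be sketched as a small state machine. The sketch below is purely illustrative (the class and method names are assumptions, not the patent's implementation); it shows how a rotation operation on a second control turns off only the rotation function of the first control, so both controls' own functions stay active while the newest contact owns view rotation.

```python
class FunctionControl:
    """A UI control supporting its own function plus view rotation."""
    def __init__(self, name, function):
        self.name = name
        self.function = function      # e.g. "open scope", "probe"
        self.function_on = False
        self.rotation_on = False

class ViewRotationManager:
    """Tracks which control currently owns the view-rotation function.

    Mirrors the behaviour described above: triggering rotation on a
    second control while the first is on transfers rotation ownership,
    so the screen responds to at least two contacts without requiring
    the user to re-trigger a contact.
    """
    def __init__(self):
        self.rotation_owner = None

    def on_rotation_operation(self, control):
        control.function_on = True    # turn on the control's own function
        control.rotation_on = True    # and its view-rotation function
        if self.rotation_owner is not None and self.rotation_owner is not control:
            # The previous owner keeps its function but loses rotation.
            self.rotation_owner.rotation_on = False
        self.rotation_owner = control

scope = FunctionControl("first", "open scope")
probe = FunctionControl("second", "probe")
mgr = ViewRotationManager()

mgr.on_rotation_operation(scope)   # first view-rotation operation
mgr.on_rotation_operation(probe)   # second, while the first is still on
print(scope.function_on, scope.rotation_on)  # True False
print(probe.function_on, probe.rotation_on)  # True True
```

The key design point is that only the rotation function is transferred; the first control's own function (e.g. keeping the scope open) is untouched.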
- Fig. 1 is a schematic diagram of a camera model provided by an exemplary embodiment of the present application.
- Fig. 2 is a structural block diagram of a terminal provided by an exemplary embodiment of the present application.
- Fig. 3 is a structural block diagram of a terminal provided by another exemplary embodiment of the present application.
- Fig. 4 is a flowchart of a viewing angle rotation method provided by an exemplary embodiment of the present application.
- Fig. 5 is a schematic diagram of an interface for viewing angle rotation provided by an exemplary embodiment of the present application.
- Fig. 6 is a flowchart of a viewing angle rotation method provided by another exemplary embodiment of the present application.
- Fig. 7 is a flowchart of a viewing angle rotation method provided by another exemplary embodiment of the present application.
- Fig. 8 is a flowchart of a viewing angle rotation method provided by another exemplary embodiment of the present application.
- Fig. 9 is a flowchart of a method for setting a viewing angle rotation operation provided by an exemplary embodiment of the present application.
- Fig. 10 is a schematic diagram of an interface of a method for setting a viewing angle rotation operation provided by an exemplary embodiment of the present application.
- Fig. 11 is a schematic diagram of an interface of a method for setting a viewing angle rotation operation provided by another exemplary embodiment of the present application.
- Fig. 12 is a flowchart of a viewing angle rotation method provided by another exemplary embodiment of the present application.
- Fig. 13 is a flowchart of a viewing angle rotation method provided by another exemplary embodiment of the present application.
- Fig. 14 is a flowchart of a viewing angle rotation method provided by another exemplary embodiment of the present application.
- Fig. 15 is a block diagram of a viewing angle rotation apparatus provided by an exemplary embodiment of the present application.
- Fig. 16 is a structural block diagram of a terminal provided by an exemplary embodiment of the present application.
- Virtual environment: the virtual environment displayed (or provided) when an application runs on the terminal.
- The virtual environment may be a simulation of the real world, a semi-simulated, semi-fictional environment, or a purely fictional environment.
- The virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment.
- The following embodiments take a three-dimensional virtual environment as an example, but are not limited thereto.
- Virtual character: a movable object in the virtual environment.
- The movable object may be at least one of a virtual person, a virtual animal, and an animation character.
- When the virtual environment is a three-dimensional virtual environment,
- the virtual character is a three-dimensional model created based on skeletal animation technology.
- Each virtual character has its own shape and volume in the three-dimensional virtual environment and occupies part of its space.
- Viewing angle direction: the observation direction when the virtual environment is observed from the first-person perspective, the third-person perspective, or another perspective of the virtual character.
- Another perspective may be an overhead perspective or any other possible perspective.
- The first-person perspective is the observation perspective of the virtual character itself in the virtual environment, and the observed virtual picture does not include the virtual character;
- the third-person perspective is an observation perspective located outside the virtual character in the virtual environment,
- and the observed virtual picture includes the virtual character itself.
- The viewing angle direction refers to the direction observed through the camera model when the virtual character observes the virtual environment.
- Optionally, the camera model automatically follows the virtual character in the virtual environment; that is, when the position of the virtual character changes, the camera model changes position simultaneously, and the camera model always stays within a preset distance range of the virtual character.
- During the automatic following, the relative position of the camera model and the virtual character does not change.
- Camera model: a three-dimensional model located around the virtual character in the three-dimensional virtual environment.
- When the first-person perspective is adopted, the camera model is located at or near the head of the virtual character;
- when the third-person perspective is adopted, the camera model can be located behind the virtual character and bound to it, or located at any position at a preset distance from the virtual character.
- Through the camera model, the virtual character in the three-dimensional virtual environment can be observed from different angles.
- Optionally, when the third-person perspective is an over-the-shoulder perspective,
- the camera model is located behind the virtual character (for example, at the head and shoulders of the virtual character).
- Optionally, the perspective also includes other perspectives, such as an overhead perspective; when an overhead perspective is adopted, the camera model can be located above the virtual character's head, observing the virtual environment from the air.
- Optionally, the camera model is not actually displayed in the three-dimensional virtual environment; that is, it is not shown in the three-dimensional virtual environment presented on the user interface.
- Generally, one virtual character corresponds to one camera model.
- The camera model can be rotated with the virtual character as the rotation center, for example, with any point of the virtual character as the rotation center.
- During rotation, the camera model not only rotates in angle but also shifts in displacement.
- The distance between the camera model and the rotation center remains unchanged; that is, the camera model rotates on the surface of a sphere whose center is the rotation center, where "any point of the virtual character" may be the head, the torso, or any point around the virtual character.
- When the camera model rotates, the center of its viewing angle points in the direction from the point on the spherical surface where the camera model is located toward the center of the sphere.
- Optionally, the camera model can also observe the virtual character at a preset angle from different directions.
- Referring to Fig. 1, a point in the virtual character 11 is determined as the rotation center 12, and the camera model rotates around the rotation center 12.
- Optionally, the camera model is configured with an initial position, which is a position above and behind the virtual character (for example, behind the head).
- Illustratively, the initial position is position 13; when the camera model rotates to position 14 or position 15, the viewing angle direction of the camera model changes with the rotation.
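The spherical rotation described above can be written out concretely. The following sketch is illustrative only (the function names, yaw/pitch convention, and coordinate axes are assumptions): it places the camera on a sphere around the rotation center and derives the viewing direction as the vector from the camera toward the center, so the camera's distance to the center stays fixed as it rotates.

```python
import math

def camera_position(center, radius, yaw_deg, pitch_deg):
    """Place the camera on a sphere of the given radius around the
    rotation center. Yaw rotates around the vertical axis, pitch lifts
    the camera above or below the horizontal plane through the center.
    """
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    cx, cy, cz = center
    x = cx + radius * math.cos(pitch) * math.sin(yaw)
    y = cy + radius * math.sin(pitch)
    z = cz + radius * math.cos(pitch) * math.cos(yaw)
    return (x, y, z)

def view_direction(camera_pos, center):
    """Unit vector from the camera toward the rotation center, i.e. the
    direction in which the camera's viewing angle points."""
    d = [c - p for p, c in zip(camera_pos, center)]
    n = math.sqrt(sum(v * v for v in d))
    return tuple(v / n for v in d)

# Rotation center at the character's head; camera behind and slightly above.
center = (0.0, 1.6, 0.0)
pos = camera_position(center, 3.0, yaw_deg=180, pitch_deg=10)
print(pos)
print(view_direction(pos, center))
```

Because only the two angles change between positions 13, 14, and 15, the camera always remains on the same sphere around the rotation center, which matches the constant-distance property stated above.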
- The terminal in this application may be a laptop computer, a mobile phone, a tablet computer, an e-book reader, an electronic game console, a Moving Picture Experts Group Audio Layer IV (MP4) player, and the like.
- Referring to the structural block diagram of the terminal shown in Fig. 2, the terminal includes a touch screen 120, a memory 140, and a processor 160.
- the touch screen 120 may be a capacitive screen or a resistive screen.
- the touch screen 120 is used to implement interaction between the terminal and the user.
- the terminal obtains the viewing angle rotation operation triggered by the user through the touch screen 120.
- the memory 140 may include one or more computer-readable storage media.
- the foregoing computer storage medium includes at least one of random access memory (Random Access Memory, RAM), read only memory (Read Only Memory, ROM), and flash memory (Flash).
- An operating system 142 and application programs 144 are installed in the memory 140.
- the operating system 142 is basic software that provides the application program 144 with secure access to computer hardware.
- The operating system 142 may be an Android system or an Apple iOS system.
- the application program 144 is an application program supporting a virtual environment, and the virtual environment includes a virtual character.
- the application program 144 is an application program supporting a three-dimensional virtual environment.
- the application program 144 may be any one of a virtual reality application program, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, a MOBA game, and a multiplayer gun battle survival game.
- the application 144 may be a stand-alone version of the application, such as a stand-alone version of a 3D game program; it may also be an online version of the application.
- the processor 160 may include one or more processing cores, such as a 4-core processor or an 8-core processor.
- the processor 160 is configured to execute a viewing angle rotation command according to the viewing angle rotation operation of the virtual character received on the touch screen 120.
- the foregoing terminal may further include a gyroscope 180.
- the above-mentioned gyroscope 180 is used to obtain the viewing angle rotation operation of the virtual character triggered by the user.
- Fig. 4 is a flowchart of a viewing angle rotation method provided by an exemplary embodiment of the present application. The method is illustrated as applied to the terminal shown in Fig. 2 or Fig. 3, and includes the following steps:
- Step 201: Display a first-perspective picture of the application program.
- The terminal displays the first-perspective picture of the application program.
- The application program may be at least one of a virtual reality application, a three-dimensional map application, a military simulation program, a TPS game, an FPS game, and a MOBA game.
- The first-perspective picture is the picture shown when the virtual environment is observed from the first perspective direction of the virtual character.
- The first perspective direction may be a direction in which the virtual environment is observed using at least one of the first-person perspective, the third-person perspective, or another perspective.
- The other perspective may be an overhead perspective or any other possible perspective.
- The virtual environment picture corresponding to the first-person perspective does not include the virtual character itself; the virtual environment pictures corresponding to the third-person perspective and the overhead perspective include the virtual character itself. For example, when the virtual environment is observed through the camera model, the three-dimensional model of the virtual character and the virtual firearm held by the virtual character can be seen.
- a first function control and a second function control are superimposed on the first view angle screen, the first function control is used to support the first function and the viewing angle rotation function, and the second function control is used to support the second function and the viewing angle rotation function.
- the first function refers to other functions except the viewing angle rotation function; the second function refers to other functions except the viewing angle rotation function.
- the other functions may be a lens opening function, a probe function, or a shooting function, etc.
- the first function control includes at least one of a lens opening control, a probe control and a shooting control.
- the second function control includes at least one of a lens opening control, a probe control and a shooting control.
- the first functional control is different from the second functional control.
- for example, the first functional control is a mirror opening control, and the second functional control is a probe control.
- the open-scope control is used to turn on or off the collimator, and the collimator is used to assist in aiming the target during shooting.
- the collimator may include a magnifying scope, a red dot sight, and a holographic sight.
- the probe control is used to control the virtual character to lean its head out from behind an obstruction, thereby reducing the character's exposed area.
- the shooting control is used to control fire, for example, to control a virtual rifle to fire at the target.
- Step 202 Receive a first viewing angle rotation operation triggered based on the first function control.
- the terminal receives a first view angle rotation operation triggered based on the first function control; optionally, the first view angle rotation operation includes any one of a click operation and a long press operation.
- Step 203 Turn on the first function and the viewing angle rotation function of the first functional control according to the first viewing angle rotation operation, and switch the first viewing angle picture to the second viewing angle picture.
- the terminal rotates a corresponding angle based on the first viewing angle direction according to the first viewing angle rotation operation to rotate the first viewing angle picture to the second viewing angle picture.
- the second-perspective picture is a picture when the virtual character's second-perspective direction is adopted in the virtual environment to observe the virtual environment.
- the terminal generates a first sequence label of the first function control according to the first viewing angle rotation operation; the first sequence label is used to determine whether to turn on or off the viewing angle rotation function of the first function control.
- the terminal when the first functional control is a mirror opening control, the terminal turns on the collimator according to the first viewing angle rotation operation, and turns on the viewing angle rotation operation to switch the first viewing angle picture to the second viewing angle picture.
- the terminal turns on the probe function according to the first viewing angle rotation operation, and turns on the viewing angle rotation operation to switch the first viewing angle picture to the second viewing angle picture.
- the terminal fires according to the first angle of view rotation operation, and starts the angle of view rotation operation to switch the first angle of view screen to the second angle of view screen.
- the shooting control can control fire in two modes: first, fire on press; second, fire on release. Therefore, when the terminal fires according to the first viewing angle rotation operation, it may fire when the shooting control is pressed, or it may fire when the shooting control is pressed and then released.
- Step 204 Receive a second viewing angle rotation operation triggered based on the second functional control when the first functional control is in the on state.
- the terminal receives a second viewing angle rotation operation triggered based on the second function control.
- the second viewing angle rotation operation includes any one of a click operation and a long press operation.
- Step 205 Turn off the viewing angle rotation function of the first functional control according to the second viewing angle rotation operation, turn on the second function and viewing angle rotation function of the second functional control, and switch the second viewing angle image to the third viewing angle image.
- the terminal rotates the corresponding angle based on the second viewing angle direction according to the second viewing angle rotation operation to rotate the second viewing angle picture to the third viewing angle picture.
- the third-perspective picture is a picture when the virtual character's third-perspective direction is adopted in the virtual environment to observe the virtual environment.
- the terminal when the second functional control is a mirror opening control, the terminal turns on the collimator according to the second viewing angle rotation operation, and turns on the viewing angle rotation operation to switch the second viewing angle picture to the third viewing angle picture.
- the terminal turns on the probe function according to the second viewing angle rotation operation, and turns on the viewing angle rotation operation to switch the second viewing angle image to the third viewing angle image.
- the terminal fires according to the second angle of view rotation operation, and starts the angle of view rotation operation to switch the second angle of view screen to the third angle of view screen.
- the first function of the first function control is in the on state, for example, the collimator is in the on state, or the probe function is in the on state. It should be noted that if the first function control is a shooting control in a non-continuous (single-shot) shooting state, the shooting control is still in the on state after one shot, but no bullets will be fired.
- the aforementioned second sequence label is used to determine whether to enable or disable the viewing angle rotation function of the second function control.
- the first functional control includes the first functional control in an on state, and the first functional control in the on state corresponds to a first sequence label; the schematic steps for generating the second sequence label are as follows:
- the terminal obtains the largest sequence label x from the first sequence labels, and the second sequence label is x+1; for example, if the first sequence labels include 1 and 2, the largest sequence label is 2, and the second sequence label is determined to be 3.
- the terminal determines whether the second sequence label is greater than the first sequence label; when the second sequence label is greater than the first sequence label, the terminal turns off the viewing angle rotation function of the first functional control, and turns on the second function and viewing angle rotation function of the second functional control .
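The sequence-label scheme described above can be sketched as follows. This is a minimal illustration only; the function names are assumptions, not identifiers from the disclosed embodiment. A newly triggered control receives the largest existing label plus one, so its label is always greater than the labels of controls already in the on state, and the comparison hands the viewing angle rotation function to the most recently triggered control.

```python
def next_sequence_label(existing_labels):
    """New label = largest existing sequence label + 1 (e.g. 3 when the
    existing labels are 1 and 2, as in the example above)."""
    return max(existing_labels, default=0) + 1

def rotation_switches(first_labels, second_label):
    """The rotation function moves to the second control only if its
    sequence label is greater than every first control's label."""
    return all(second_label > label for label in first_labels)

lbl = next_sequence_label([1, 2])   # -> 3
assert rotation_switches([1, 2], lbl)
```

Because labels only grow, the freshly generated second sequence label is always the largest, so in practice the comparison always favors the newest contact.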
- the terminal displays the first-view screen of the application program.
- the first-view screen is superimposed with the first function control and the second function control; the first function control is used to support the first function and the viewing angle rotation function, and the second function control is used to support the second function and the viewing angle rotation function;
- the terminal receives the first viewing angle rotation operation triggered based on the first function control; according to the first viewing angle rotation operation, the first function and the viewing angle rotation function of the first function control are turned on, and the first viewing angle picture is switched to the second viewing angle picture; when the first function control is in the on state, the second viewing angle rotation operation triggered based on the second function control is received; according to the second viewing angle rotation operation, the viewing angle rotation function of the first function control is turned off, the second function and the viewing angle rotation function of the second function control are turned on, and the second viewing angle picture is switched to the third viewing angle picture.
- the above method can also respond to a viewing angle rotation operation triggered based on the second function control while the viewing angle rotation function of the first function control is triggered; that is, the screen can respond to the viewing angle rotation operations of at least two contacts at the same time, which improves the efficiency of the interaction during operation.
- when the first function control is in the on state and the viewing angle rotation function is triggered based on the second function control, the terminal responds first to the viewing angle rotation operation triggered based on the second function control, which ensures the order and accuracy of the terminal's response to viewing angle rotation operations when multiple function controls with the viewing angle rotation function are all turned on.
- the setting of multiple functional controls with viewing angle rotation function ensures that players can freely complete the viewing angle rotation operation in different states, providing more flexibility for combat Sex and operating space.
- the switching of the viewing angle rotation function of the first functional control and the second functional control is shown.
- the user interface 21 under the first viewing angle includes the probe control 22, the mirror opening control 23 and the shooting control 24; the terminal receives the first viewing angle rotation operation based on the probe control 22 and rotates the first viewing angle picture to the second viewing angle picture, such as the user interface 25 under the second viewing angle; relative to the first viewing angle picture, the second viewing angle picture has moved the distance L1 to the right.
- while the probe control 22 is still being triggered, the terminal receives the second viewing angle rotation operation based on the shooting control 24 and rotates the second viewing angle picture to the third viewing angle picture, such as the user interface 26 under the third viewing angle; relative to the second viewing angle picture, the third viewing angle picture has moved the distance L2 to the right.
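One way a horizontal contact displacement such as L1 or L2 could be mapped onto a rotation of the viewing angle is sketched below. The sensitivity constant and function name are illustrative assumptions, not values from the embodiment; the point is only that a rightward drag distance translates into a proportional rightward yaw of the camera direction.

```python
# Assumed degrees of yaw per pixel of horizontal drag (illustrative value).
SENSITIVITY_DEG_PER_PX = 0.15

def yaw_after_drag(current_yaw_deg, drag_dx_px):
    """Rotate right for a rightward drag and keep the angle in [0, 360)."""
    return (current_yaw_deg + drag_dx_px * SENSITIVITY_DEG_PER_PX) % 360.0

yaw = yaw_after_drag(0.0, 200)   # first rotation: picture moves right by L1
yaw = yaw_after_drag(yaw, 100)   # second rotation: picture moves right by L2
```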
- the terminal may determine one function control from the first function controls to perform the viewing angle rotation operation;
- step 206 to step 208 are added after step 205, as shown in FIG. 6, and the steps are as follows:
- Step 206 Determine whether to end the second viewing angle rotation operation on the second function control.
- the terminal determines whether to end the second viewing angle rotation operation on the second function control.
- step 207 is executed; when the terminal does not end the second view angle rotation operation on the second function control, step 208 is executed.
- the terminal uses a drag operation to rotate the view angle of the virtual character, that is, the second view angle rotation operation also includes a drag operation.
- the second viewing angle rotation operation includes a click operation and a drag operation.
- step 207 is executed; otherwise, step 208 is executed.
- the second viewing angle rotation operation includes a long-press operation and a drag operation.
- step 207 is executed; otherwise, step 208 is executed.
- the second viewing angle rotation operation on the second function control ends, that is, the viewing angle rotation function on the second function control is turned off.
- Step 207 Determine the i-th first function control in the open state from the first function controls in the open state, and turn on the viewing angle rotation function of the i-th first function control in the open state.
- the terminal determines the i-th first function control in the on state from the first function controls in the on state, and turns on the viewing angle rotation function of the i-th first function control in the on state, where i is a positive integer.
- the n first function controls in the on state respectively correspond to n first sequence labels; the largest sequence label is determined from the n first sequence labels, the i-th first function control in the on state corresponding to the largest sequence label is filtered out, and the viewing angle rotation function of that first function control is turned on, where n is a positive integer.
- Step 208 still perform the second viewing angle rotation operation on the second function control.
- the first function control corresponding to the largest sequence label automatically takes over the viewing angle rotation, which ensures the order and accuracy of the terminal's response to the viewing angle rotation operation when there are multiple function controls with the viewing angle rotation function.
- when the rotation contact of one viewing angle disappears, the rotation contact of another viewing angle can take over the response to the viewing angle rotation, which also avoids the picture freezing when a viewing angle rotation contact is triggered again.
- the embodiment shown in FIG. 6 described above can be divided into two parts: the pressing process and the releasing process according to the operating state of the second function control.
- a schematic description of the pressing process of the second function control is as follows:
- Step 31 start.
- Step 32 The terminal receives a pressing operation on the second functional control with the viewing angle rotation function (triggering of the second viewing angle rotation operation).
- Step 33 The terminal activates the viewing angle rotation function of the second functional control according to the pressing operation, and marks the second sequence number for the second functional control.
- the terminal marks the second functional control with a second sequence number.
- the last triggered function control with the viewing angle rotation function is marked with the sequence label x, and the second sequence label is x+1; that is, before the second function control is triggered, x is the largest sequence label, where x is a positive integer.
- Step 34 The terminal judges whether a first functional control with a viewing angle rotation function is in a pressed state.
- when the terminal determines that a first function control with the viewing angle rotation function is in the pressed state, step 35 is executed; otherwise, step 36 is executed.
- Step 35 The terminal turns off the viewing angle rotation function of the first function control.
- the end here refers to the end of searching for the first functional control that is in the pressed state, but the viewing angle rotation function of the second functional control is still in the on state.
- Step 41 start.
- Step 42 The terminal receives the release operation on the second function control with the viewing angle rotation function (cancellation of the second viewing angle rotation operation).
- Step 43 The terminal judges whether the viewing angle rotation function of the second function control is in an on state.
- step 47 is executed; otherwise, step 44 is executed.
- Step 44 The terminal judges whether a first function control with a viewing angle rotation function is in a pressed state.
- when the terminal determines that a first function control with the viewing angle rotation function is in the pressed state, step 45 is executed; otherwise, step 47 is executed.
- Step 45 The terminal searches for the first functional control corresponding to the largest sequence label.
- the terminal determines the first function control corresponding to the largest sequence label from the n first function controls in the pressed state.
- Step 46 The terminal enables the viewing angle rotation function of the first function control corresponding to the largest sequence label.
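Steps 44 to 46 above can be sketched as a single lookup; the names here are illustrative assumptions. Among the first function controls still in the pressed state, the one carrying the largest sequence label regains the viewing angle rotation function when the second function control is released.

```python
def takeover_on_release(pressed_first_labels):
    """Steps 44-46: if any first function control is still pressed, the one
    with the largest sequence label regains the viewing-angle-rotation
    function. `pressed_first_labels` maps control id -> sequence label;
    returns None when no first control is pressed (step 47, end)."""
    if not pressed_first_labels:
        return None
    return max(pressed_first_labels, key=pressed_first_labels.get)

# After the second control is released, the probe control (label 2) takes
# over rather than the scope control (label 1):
assert takeover_on_release({"scope": 1, "probe": 2}) == "probe"
```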
- the user can customize the trigger mode of the viewing angle rotation operation.
- the trigger mode of the viewing angle rotation operation can be defined as a click operation, or a long press operation, or a touch operation.
- the customization of the trigger mode of the viewing angle rotation operation is described, as shown in Figure 9. The steps are as follows:
- Step 301 Display the setting interface of the application.
- the setting interface of the application program is displayed on the terminal, and the setting interface includes at least two mode setting controls, and the mode setting controls are used to set the trigger mode of the viewing angle rotation operation.
- the mode setting control includes at least two of a click mode setting control, a long-press mode setting control, and a mixed mode setting control.
- the viewing angle rotation operation corresponding to the click mode setting control is a click operation
- the viewing angle rotation operation corresponding to the long press mode setting control is a long press operation
- the viewing angle rotation operation corresponding to the mixed mode setting control is a touch operation, and the duration of the touch operation is used to determine whether the second function of the second function control is turned on or off.
- the viewing angle rotation operation includes any one of a first viewing angle rotation operation and a second viewing angle rotation operation.
- Step 302 Receive a selection operation triggered on the setting interface.
- the selection operation is used to select at least two mode setting controls corresponding to the target trigger mode.
- the selection operation may include at least one of a single-click operation, a double-click operation, a long press operation, and a sliding operation.
- Step 303 According to the selection operation, the trigger mode of the viewing angle rotation operation is determined as the target trigger mode.
- the terminal determines the trigger mode of the viewing angle rotation operation as the target trigger mode; optionally, the target trigger mode includes at least two of a click operation, a long press operation, and a touch operation.
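The mode-setting choice can be sketched as a small mapping from each function control to its user-selected trigger mode. The enum values and function name below are illustrative assumptions only; they merely show that the selection operation binds the target trigger mode to the viewing angle rotation operation.

```python
from enum import Enum

class TriggerMode(Enum):
    CLICK = "click"            # click mode setting control
    LONG_PRESS = "long_press"  # long-press mode setting control
    MIXED = "mixed"            # mixed mode: duration decides click vs long press

def select_mode(settings, control, chosen):
    """Apply the user's selection operation for one function control."""
    settings[control] = chosen
    return settings[control]

settings = {"scope": TriggerMode.MIXED, "probe": TriggerMode.CLICK}
select_mode(settings, "scope", TriggerMode.LONG_PRESS)  # user picks long press
```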
- the setting interface 51 of the application includes three setting buttons for the mirror opening mode: click mode setting control 52, long press mode setting control 53 and mixed mode setting control 54.
- the user can choose any one of the three mirror opening modes.
- the setting interface 55 of the application program includes setting buttons for three probe modes: click mode setting control 56, long-press mode setting control 57, and mixed mode setting control 58. The user can select any of the three probe modes.
- the user can customize the trigger mode of the viewing angle rotation operation to suit his or her own shooting habits and operating characteristics, which meets the independent operation needs of users at different levels, enriches the users' choices, and provides a more personalized combat experience.
- the first function control also includes a first function, and the second function control also includes a second function; the first viewing angle rotation operation also controls the opening and closing of the first function, and the second viewing angle rotation operation also controls the opening and closing of the second function.
- the target trigger mode includes a click operation; the terminal starts the second function of the second function control according to the click operation; when the click operation ends, keeps the second function of the second function control in an on state.
- the terminal closes the second function of the second function control according to a click operation on the second function control again.
- the mirror opening control controls the mirror opening and closing process as follows:
- Step 61 start.
- Step 62 The terminal receives the click operation on the mirror opening control.
- Step 63 The terminal judges whether the mirror opening function is in an on state.
- when the terminal determines that the mirror opening function is in the on state, step 64 is executed; otherwise, step 65 is executed.
- Step 64 the terminal closes the mirror.
- Step 65 the terminal opens the mirror.
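Steps 61 to 65 amount to a toggle: each click inverts the scope state, and the state persists after the click ends until the control is clicked again. A minimal sketch (function name is an illustrative assumption):

```python
def on_click(scope_open):
    """Return the new scope state after one click on the mirror opening
    control: open it if closed (step 65), close it if open (step 64)."""
    return not scope_open

state = False
state = on_click(state)   # first click opens the scope
state = on_click(state)   # clicking again closes it
```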
- the target trigger mode includes a long-press operation; the terminal activates the second function of the second function control according to the long-press operation; when the long-press operation ends, keep the second function of the second function control in an on state.
- the terminal closes the second function of the second function control according to a click operation on the second function control.
- the mirror opening control controls the mirror opening and closing process as follows:
- Step 71 start.
- Step 72 The terminal receives the long press operation on the mirror opening control.
- Step 73 the terminal opens the mirror.
- Step 74 The terminal judges whether to end the long press operation on the mirror opening control.
- step 75 is executed; otherwise, step 76 is executed.
- Step 75 the terminal turns off the mirror.
- Step 76 The terminal keeps the mirror on state.
- the target trigger mode includes a touch operation; the terminal activates the second function of the second function control according to the touch operation; when the touch operation ends, the duration of the touch operation is acquired; when the duration is greater than the time threshold, the second function of the second function control is kept in the on state.
- the time threshold is used to determine whether to keep the second function of the second function control in the on state after the touch operation ends; when the duration is less than or equal to the time threshold, the second function of the second function control is turned off.
- the terminal closes the second function of the second function control according to a click operation on the second function control.
- the mirror opening control controls the mirror opening and closing process as follows:
- Step 81 start.
- Step 82 The terminal receives the touch operation on the mirror opening control.
- Step 83 The terminal judges whether the mirror opening function is in an on state.
- step 84 is executed; otherwise, step 85 is executed.
- Step 84 the terminal turns off the mirror.
- Step 85 the terminal opens the mirror.
- Step 86 The terminal judges whether the operation duration at the end of the touch operation is greater than the time threshold.
- when the terminal determines that the operation duration at the end of the touch operation is greater than the time threshold, step 87 is executed; otherwise, step 88 is executed.
- the time threshold may be 0.2 seconds (s); when the terminal determines that the operation duration at the end of the touch operation is greater than 0.2s, step 87 is executed; otherwise, step 88 is executed.
- Step 87 The terminal determines that the touch operation is a long-press operation and keeps the mirror on state.
- Step 88 The terminal determines that the touch operation is a click operation and turns off the mirror.
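The duration test of steps 86 to 88 can be sketched as follows, using the 0.2 s example threshold given above; the function names are illustrative assumptions. A touch longer than the threshold is classified as a long press and the scope stays open; a shorter touch is classified as a click and the scope closes when the touch ends.

```python
TIME_THRESHOLD_S = 0.2  # example threshold from the embodiment above

def classify_touch(duration_s):
    """Step 86: a touch longer than the threshold is a long press (step 87);
    otherwise it is a click (step 88)."""
    return "long_press" if duration_s > TIME_THRESHOLD_S else "click"

def scope_stays_open(duration_s):
    """The scope was opened on touch-down; keep it open only for a long press."""
    return classify_touch(duration_s) == "long_press"
```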
- the response logic of the viewing angle rotation when the first function control and the second function control are triggered may also be customized by the user; the following description is illustrative.
- the terminal controls the viewing angle rotation of the virtual character through the first function control according to the custom logic, or controls the viewing angle rotation of the virtual character through the second function control according to the custom logic; where the custom logic is the user-defined response logic to the viewing angle rotation operation when the first function control and the second function control are triggered.
- for example, the custom logic is that when the first function control and the second function control are triggered at the same time, the terminal enables the viewing angle rotation function of the first function control and turns off the viewing angle rotation function of the second function control; therefore, when the two controls are triggered at the same time, the terminal controls the viewing angle rotation of the virtual character through the first function control.
- the terminal also controls the viewing angle rotation of the virtual character through a gyroscope.
- the terminal receives its own rotation operation, and controls the rotation of the virtual character's viewing angle through the gyroscope.
- the method of viewing angle rotation provided in this embodiment provides users with the function of customizing the response logic of viewing angle rotation, enabling users to define control operation logic for the viewing angle rotation function that better matches their own operating habits, which improves the user's operating experience in battle.
- the viewing angle rotation of the virtual character can also be controlled by the gyroscope, so that while rotating the viewing angle of the virtual character, the user can also perform other operations on the virtual character, which improves the interaction efficiency in battle.
- FIG. 15 is a viewing angle rotation device provided by an exemplary embodiment of the present application.
- the device may form part or all of the terminal through software, hardware, or a combination of the two, and the device includes:
- the display module 401 is used to display a first-view picture of the application program.
- the first-view picture is a picture when the virtual environment is observed from the first perspective direction of the virtual character in the virtual environment, and a first function control and a second function control are superimposed on the first-view picture; the first function control is used to support the first function and the viewing angle rotation function, and the second function control is used to support the second function and the viewing angle rotation function;
- the receiving module 402 is configured to receive the first viewing angle rotation operation triggered based on the first function control
- the processing module 403 is configured to turn on the first function and the viewing angle rotation function of the first function control according to the first viewing angle rotation operation, and switch the first viewing angle picture to the second viewing angle picture; the second viewing angle picture is the picture when the virtual environment is observed from the second perspective direction of the virtual character in the virtual environment;
- the receiving module 402 is configured to receive a second viewing angle rotation operation triggered based on the second functional control when the first functional control is in the on state;
- the processing module 403 is configured to turn off the viewing angle rotation function of the first functional control according to the second viewing angle rotation operation, turn on the second function and viewing angle rotation function of the second functional control, and switch the second viewing angle picture to the third viewing angle picture;
- the third-perspective picture is a picture when the virtual character's third-perspective direction is adopted in the virtual environment to observe the virtual environment.
- the processing module 403 includes:
- the generating sub-module 4032 is configured to generate the second sequence label of the second function control according to the second viewing angle rotation operation;
- the processing sub-module 4034 is used for turning off the viewing angle rotation function of the first functional control and turning on the second function and viewing angle rotation function of the second functional control when the second sequence label is greater than the first sequence label; where the first sequence label is The order label of the first function control.
- the first function control includes a first function control in the on state, and the first function control in the on state corresponds to a first sequence label;
- the generating sub-module 4032 is configured to obtain the largest sequence label from the first sequence labels according to the second viewing angle rotation operation, and determine the sequence label obtained by adding one to the largest sequence label as the second sequence label.
- the first functional control includes the first functional control in an on state
- the processing sub-module 4034 is configured to, when the second viewing angle rotation operation based on the second function control ends, determine the i-th first function control in the on state from the first function controls in the on state, and turn on the viewing angle rotation function of the i-th first function control in the on state, where i is a positive integer.
- the first function control in the on state corresponds to a first sequence label;
- the processing sub-module 4034 is configured to enable the viewing angle rotation function of the i-th first functional control in the open state when the first sequence label of the i-th first functional control in the open state is the maximum sequence label.
- the display module 401 is configured to display the setting interface of the application program, and the setting interface includes at least two mode setting controls, and the mode setting controls are used to set the trigger mode of the viewing angle rotation operation;
- the receiving module 402 is configured to receive a selection operation triggered on the setting interface, and the selection operation is used to select a mode setting control corresponding to a target trigger mode among at least two mode setting controls;
- the determining module 404 is configured to determine the trigger mode of the viewing angle rotation operation as the target trigger mode according to the selection operation;
- the viewing angle rotation operation includes any one of a first viewing angle rotation operation and a second viewing angle rotation operation.
- the viewing angle rotation operation includes a second viewing angle rotation operation
- the target trigger mode includes a click operation
- the processing module 403 is configured to enable the second function of the second function control according to the click operation; when the click operation is ended, keep the second function of the second function control in the on state.
- the viewing angle rotation operation includes a second viewing angle rotation operation
- the target trigger mode includes a long press operation
- the processing module 403 is configured to enable the second function of the second function control according to the long-press operation; when the long-press operation is ended, keep the second function of the second function control in the on state.
- the viewing angle rotation operation includes a second viewing angle rotation operation
- the target trigger mode includes a touch operation
- the processing module 403 is configured to enable the second function of the second functional control according to the touch operation; when the touch operation is ended, obtain the duration of the touch operation; when the duration is greater than the time threshold, keep the second function of the second functional control at Open state.
- the processing module 403 is configured to turn off the second function of the second function control when the duration is less than or equal to the time threshold.
- the processing module 403 is configured to control the viewing angle rotation of the virtual character through the first function control according to the custom logic; or, control the viewing angle rotation of the virtual character through the second function control according to the custom logic;
- the custom logic is user-defined response logic to the viewing angle rotation operation when the first function control and the second function control are triggered.
- the first viewing angle screen of the application is displayed on the terminal.
- the first viewing angle screen is superimposed with the first functional control and the second functional control.
- the first function control is used to support the first function and the viewing angle rotation function
- the second function control is used to support the second function and the viewing angle rotation function
- the terminal receives the first viewing angle rotation operation triggered based on the first function control; according to the first viewing angle rotation operation, turns on the first function and the viewing angle rotation function of the first function control, and switches the first viewing angle picture to the second viewing angle picture; while the first function control is in the on state, receives the second viewing angle rotation operation triggered based on the second function control; and according to the second viewing angle rotation operation, turns off the viewing angle rotation function of the first function control, turns on the second function and the viewing angle rotation function of the second function control, and switches the second viewing angle picture to the third viewing angle picture.
- the above device can also respond to the viewing angle rotation operation triggered by the second function control while the viewing angle rotation function of the first function control is triggered; that is, the screen can respond to viewing angle rotation operations from at least two contacts at the same time, which improves the efficiency of human-computer interaction during operation.
- in addition, when the first function control is in the on state and the viewing angle rotation function is triggered based on the second function control, the terminal responds first to the viewing angle rotation operation triggered by the second function control, which ensures the order and accuracy of the terminal's responses to viewing angle rotation operations when multiple controls with the viewing angle rotation function are turned on.
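The ordering described above (and detailed in claims 2 to 5) assigns each triggered control a sequence label one greater than the current maximum, and the control with the largest label owns view rotation; when it is released, the remaining open control with the largest label takes over. A minimal sketch of that dispatch, with assumed names:

```python
# Sketch of the sequence-label priority described above: the most recently
# triggered control (largest label) responds to viewing angle rotation; when
# it ends, the open control with the next-largest label takes over.
# The class and method names are assumptions for illustration.

class RotationDispatcher:
    def __init__(self):
        self._labels = {}  # control id -> sequence label
        self._next = 1     # next sequence label to hand out

    def trigger(self, control_id):
        # Assign a label one greater than any label handed out so far.
        self._labels[control_id] = self._next
        self._next += 1

    def release(self, control_id):
        # The control's rotation operation ended; drop its label.
        self._labels.pop(control_id, None)

    def owner(self):
        # The open control with the largest sequence label owns rotation.
        if not self._labels:
            return None
        return max(self._labels, key=self._labels.get)
```

Triggering a second control while the first is open hands rotation to the second; releasing it returns rotation to the first.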
- FIG. 16 shows a structural block diagram of a terminal 500 provided by an exemplary embodiment of the present invention.
- the terminal 500 may be: a smartphone, a tablet computer, a Moving Picture Experts Group Audio Layer III (MP3) player, a Moving Picture Experts Group Audio Layer IV (MP4) player, a laptop, or a desktop computer.
- the terminal 500 may also be called user equipment, portable terminal, laptop terminal, desktop terminal and other names.
- the terminal 500 includes a processor 501 and a memory 502.
- the processor 501 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on.
- the processor 501 may be implemented in at least one hardware form of digital signal processing (DSP), field-programmable gate array (FPGA), and programmable logic array (PLA).
- the processor 501 may also include a main processor and a coprocessor.
- the main processor is a processor used to process data in the awake state, and is also called a central processing unit (CPU);
- the coprocessor is a low-power processor used to process data in the standby state.
- the processor 501 may be integrated with a graphics processor (Graphics Processing Unit, GPU), and the GPU is used to render and draw content that needs to be displayed on the display screen.
- the processor 501 may also include an artificial intelligence (Artificial Intelligence, AI) processor, and the AI processor is used to process calculation operations related to machine learning.
- the memory 502 may include one or more computer-readable storage media, which may be non-transitory.
- the memory 502 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices.
- the non-transitory computer-readable storage medium in the memory 502 is used to store at least one instruction, and the at least one instruction is to be executed by the processor 501 to implement the viewing angle rotation method provided by the method embodiments of the present application.
- the terminal 500 may optionally further include: a peripheral device interface 503 and at least one peripheral device.
- the processor 501, the memory 502, and the peripheral device interface 503 may be connected by a bus or a signal line.
- Each peripheral device can be connected to the peripheral device interface 503 through a bus, a signal line or a circuit board.
- the peripheral device includes: at least one of a radio frequency circuit 504, a touch screen 505, a camera 506, an audio circuit 507, a positioning component 508, and a power supply 509.
- the peripheral device interface 503 can be used to connect at least one peripheral device related to Input/Output (I/O) to the processor 501 and the memory 502.
- in some embodiments, the processor 501, the memory 502, and the peripheral device interface 503 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 501, the memory 502, and the peripheral device interface 503 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
- the radio frequency circuit 504 is used to receive and transmit radio frequency (RF) signals, also called electromagnetic signals.
- the radio frequency circuit 504 communicates with a communication network and other communication devices through electromagnetic signals.
- the radio frequency circuit 504 converts electrical signals into electromagnetic signals for transmission, or converts received electromagnetic signals into electrical signals.
- the radio frequency circuit 504 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so on.
- the radio frequency circuit 504 can communicate with other terminals through at least one wireless communication protocol.
- the wireless communication protocol includes but is not limited to: World Wide Web, Metropolitan Area Network, Intranet, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area network and/or Wireless Fidelity (WiFi) networks.
- the radio frequency circuit 504 may also include a circuit related to Near Field Communication (NFC), which is not limited in this application.
- the display screen 505 is used to display a user interface (UI).
- the UI can include graphics, text, icons, videos, and any combination thereof.
- the display screen 505 also has the ability to collect touch signals on or above the surface of the display screen 505.
- the touch signal can be input to the processor 501 as a control signal for processing.
- the display screen 505 may also be used to provide virtual buttons and/or virtual keyboards, also called soft buttons and/or soft keyboards.
- in some embodiments, there may be one display screen 505, provided on the front panel of the terminal 500; in other embodiments, there may be at least two display screens 505, respectively arranged on different surfaces of the terminal 500 or in a folding design; in still other embodiments, the display screen 505 may be a flexible display screen arranged on a curved or folding surface of the terminal 500. Furthermore, the display screen 505 may also be set in a non-rectangular irregular pattern, that is, a special-shaped screen.
- the display screen 505 may be made of materials such as a liquid crystal display (Liquid Crystal Display, LCD), an organic light emitting diode (Organic Light-Emitting Diode, OLED).
- the camera assembly 506 is used to capture images or videos.
- the camera assembly 506 includes a front camera and a rear camera.
- the front camera is set on the front panel of the terminal, and the rear camera is set on the back of the terminal.
- the camera assembly 506 may also include a flash.
- the flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash refers to a combination of a warm-light flash and a cold-light flash, and can be used for light compensation under different color temperatures.
- the audio circuit 507 may include a microphone and a speaker.
- the microphone is used to collect sound waves of the user and the environment, convert the sound waves into electrical signals, and input them to the processor 501 for processing, or input them to the radio frequency circuit 504 to implement voice communication. For stereo collection or noise reduction, there may be multiple microphones, respectively set at different parts of the terminal 500.
- the microphone can also be an array microphone or an omnidirectional acquisition microphone.
- the speaker is used to convert the electrical signal from the processor 501 or the radio frequency circuit 504 into sound waves.
- the speaker can be a traditional membrane speaker or a piezoelectric ceramic speaker.
- when the speaker is a piezoelectric ceramic speaker, it can not only convert electrical signals into sound waves audible to humans, but also convert electrical signals into sound waves inaudible to humans for purposes such as distance measurement.
- the audio circuit 507 may also include a headphone jack.
- the positioning component 508 is used to locate the current geographic location of the terminal 500 to implement navigation or location-based service (LBS).
- the positioning component 508 may be a positioning component based on the Global Positioning System (GPS) of the United States, the BeiDou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
- the power supply 509 is used to supply power to various components in the terminal 500.
- the power source 509 may be alternating current, direct current, disposable batteries, or rechargeable batteries.
- the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery.
- a wired rechargeable battery is a battery charged through a wired line
- a wireless rechargeable battery is a battery charged through a wireless coil.
- the rechargeable battery can also be used to support fast charging technology.
- the terminal 500 further includes one or more sensors 510.
- the one or more sensors 510 include, but are not limited to: an acceleration sensor 511, a gyroscope sensor 512, a pressure sensor 513, a fingerprint sensor 514, an optical sensor 515, and a proximity sensor 516.
- the acceleration sensor 511 can detect the magnitude of acceleration on the three coordinate axes of the coordinate system established by the terminal 500.
- the acceleration sensor 511 can be used to detect the components of gravitational acceleration on three coordinate axes.
- the processor 501 may control the touch screen 505 to display the user interface in a horizontal view or a vertical view according to the gravity acceleration signal collected by the acceleration sensor 511.
- the acceleration sensor 511 can also be used for game or user motion data collection.
- the gyroscope sensor 512 can detect the body direction and rotation angle of the terminal 500, and the gyroscope sensor 512 can cooperate with the acceleration sensor 511 to collect the user's 3D actions on the terminal 500.
- the processor 501 can implement the following functions according to the data collected by the gyroscope sensor 512: motion sensing (for example, changing the UI according to the user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
- the pressure sensor 513 may be arranged on the side frame of the terminal 500 and/or the lower layer of the touch screen 505.
- the processor 501 performs left and right hand recognition or quick operation according to the holding signal collected by the pressure sensor 513.
- the processor 501 controls the operability controls on the UI interface according to the user's pressure operation on the touch display screen 505.
- the operability control includes at least one of a button control, a scroll bar control, an icon control, and a menu control.
- the fingerprint sensor 514 is used to collect the user's fingerprint.
- the processor 501 can identify the user's identity according to the fingerprint collected by the fingerprint sensor 514, or the fingerprint sensor 514 can identify the user's identity according to the collected fingerprint. When the user's identity is recognized as a trusted identity, the processor 501 authorizes the user to perform related sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, and changing settings.
- the fingerprint sensor 514 may be provided on the front, back or side of the terminal 500. When a physical button or a manufacturer logo is provided on the terminal 500, the fingerprint sensor 514 can be integrated with the physical button or the manufacturer logo.
- the optical sensor 515 is used to collect the ambient light intensity.
- the processor 501 may control the display brightness of the touch screen 505 according to the ambient light intensity collected by the optical sensor 515. Specifically, when the ambient light intensity is high, the display brightness of the touch screen 505 is increased; when the ambient light intensity is low, the display brightness of the touch screen 505 is decreased.
- the processor 501 may also dynamically adjust the shooting parameters of the camera assembly 506 according to the ambient light intensity collected by the optical sensor 515.
- the proximity sensor 516, also called a distance sensor, is usually provided on the front panel of the terminal 500.
- the proximity sensor 516 is used to collect the distance between the user and the front of the terminal 500.
- when the proximity sensor 516 detects that the distance between the user and the front of the terminal 500 gradually decreases, the processor 501 controls the touch screen 505 to switch from the bright-screen state to the off-screen state; when the proximity sensor 516 detects that the distance between the user and the front of the terminal 500 gradually increases, the processor 501 controls the touch screen 505 to switch from the off-screen state to the bright-screen state.
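The proximity behavior above is a small two-way rule: distance shrinking turns the screen off, distance growing turns it back on. A minimal sketch, with an assumed function name:

```python
# Sketch of the proximity-sensor behavior described above. The function name
# and return convention are assumptions for illustration, not the patent's API.

def update_screen_state(prev_distance, distance, screen_on):
    """Return the new screen state (True = bright, False = off)."""
    if distance < prev_distance:
        return False  # user approaching the front panel: switch screen off
    if distance > prev_distance:
        return True   # user moving away: switch screen back on
    return screen_on  # distance unchanged: keep the current state
```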
- FIG. 16 does not constitute a limitation on the terminal 500, and may include more or fewer components than shown, or combine some components, or adopt different component arrangements.
- the program can be stored in a computer-readable storage medium.
- the medium may be a computer-readable storage medium included in the memory in the foregoing embodiment; or may be a computer-readable storage medium that exists alone and is not assembled into the terminal.
- the computer-readable storage medium stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the viewing angle rotation method described in any one of FIGS.
- the computer-readable storage medium may include: read only memory (Read Only Memory, ROM), random access memory (Random Access Memory, RAM), solid state drive (Solid State Drives, SSD), or optical disk.
- the random access memory may include resistance random access memory (Resistance Random Access Memory, ReRAM) and dynamic random access memory (Dynamic Random Access Memory, DRAM).
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Processing Or Creating Images (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
Description
Claims (15)
- A viewing angle rotation method, performed by a terminal, the method comprising: displaying a first viewing angle picture of an application, the first viewing angle picture being a picture of a virtual environment observed from a first viewing angle direction of a virtual character in the virtual environment, a first function control and a second function control being superimposed on the first viewing angle picture, the first function control being used to support a first function and a viewing angle rotation function, and the second function control being used to support a second function and the viewing angle rotation function; receiving a first viewing angle rotation operation triggered based on the first function control; enabling, according to the first viewing angle rotation operation, the first function and the viewing angle rotation function of the first function control, and switching the first viewing angle picture to a second viewing angle picture, the second viewing angle picture being a picture of the virtual environment observed from a second viewing angle direction of the virtual character in the virtual environment; receiving, while the first function control is in an on state, a second viewing angle rotation operation triggered based on the second function control; and disabling, according to the second viewing angle rotation operation, the viewing angle rotation function of the first function control, enabling the second function and the viewing angle rotation function of the second function control, and switching the second viewing angle picture to a third viewing angle picture, the third viewing angle picture being a picture of the virtual environment observed from a third viewing angle direction of the virtual character in the virtual environment.
- The method according to claim 1, wherein the disabling, according to the second viewing angle rotation operation, the viewing angle rotation function of the first function control and enabling the second function and the viewing angle rotation function of the second function control comprises: generating a second sequence label of the second function control according to the second viewing angle rotation operation; and when the second sequence label is greater than a first sequence label, disabling the viewing angle rotation function of the first function control and enabling the second function and the viewing angle rotation function of the second function control, the first sequence label being a sequence label of the first function control.
- The method according to claim 2, wherein the first function control comprises a first function control in the on state, the first function control in the on state corresponding to the first sequence label; and the generating a second sequence label of the second function control according to the second viewing angle rotation operation comprises: obtaining a maximum sequence label from the first sequence labels according to the second viewing angle rotation operation; and determining a sequence label obtained by adding one to the maximum sequence label as the second sequence label.
- The method according to claim 2, wherein the first function control comprises a first function control in the on state; and after the disabling the viewing angle rotation function of the first function control and enabling the second function and the viewing angle rotation function of the second function control, the method further comprises: when the second viewing angle rotation operation based on the second function control ends, determining an i-th first function control in the on state from the first function controls in the on state, and enabling the viewing angle rotation function of the i-th first function control in the on state, i being a positive integer.
- The method according to claim 4, wherein the first function control in the on state corresponds to the first sequence label; and the determining an i-th first function control in the on state from the first function controls in the on state and enabling the viewing angle rotation function of the i-th first function control in the on state comprises: when the first sequence label of the i-th first function control in the on state is the maximum sequence label, enabling the viewing angle rotation function of the i-th first function control in the on state.
- The method according to any one of claims 1 to 5, further comprising: displaying a setting interface of the application, the setting interface comprising at least two mode setting controls, the mode setting controls being used to set a trigger mode of a viewing angle rotation operation; receiving a selection operation triggered on the setting interface, the selection operation being used to select a mode setting control corresponding to a target trigger mode among the at least two mode setting controls; and determining, according to the selection operation, the trigger mode of the viewing angle rotation operation as the target trigger mode, the viewing angle rotation operation comprising any one of the first viewing angle rotation operation and the second viewing angle rotation operation.
- The method according to claim 6, wherein the viewing angle rotation operation comprises the second viewing angle rotation operation, and the target trigger mode comprises a click operation; the enabling the second function of the second function control according to the second viewing angle rotation operation comprises: enabling the second function of the second function control according to the click operation; and the method further comprises: when the click operation ends, keeping the second function of the second function control in the on state.
- The method according to claim 6, wherein the viewing angle rotation operation comprises the second viewing angle rotation operation, and the target trigger mode comprises a long-press operation; the enabling the second function of the second function control according to the second viewing angle rotation operation comprises: enabling the second function of the second function control according to the long-press operation; and the method further comprises: when the long-press operation ends, keeping the second function of the second function control in the on state.
- The method according to claim 6, wherein the viewing angle rotation operation comprises the second viewing angle rotation operation, and the target trigger mode comprises a touch operation; the enabling the second function of the second function control according to the second viewing angle rotation operation comprises: enabling the second function of the second function control according to the touch operation; and the method further comprises: when the touch operation ends, obtaining a duration of the touch operation; and when the duration is greater than a time threshold, keeping the second function of the second function control in the on state.
- The method according to claim 9, further comprising: disabling the second function of the second function control when the duration is less than or equal to the time threshold.
- The method according to claim 1, further comprising: controlling viewing angle rotation of the virtual character through the first function control according to custom logic; or controlling viewing angle rotation of the virtual character through the second function control according to the custom logic, the custom logic being user-defined response logic to a viewing angle rotation operation when the first function control and the second function control are triggered.
- A viewing angle rotation apparatus, comprising: a display module, configured to display a first viewing angle picture of an application, the first viewing angle picture being a picture of a virtual environment observed from a first viewing angle direction of a virtual character in the virtual environment, a first function control and a second function control being superimposed on the first viewing angle picture, the first function control being used to support a first function and a viewing angle rotation function, and the second function control being used to support a second function and the viewing angle rotation function; a receiving module, configured to receive a first viewing angle rotation operation triggered based on the first function control; and a processing module, configured to enable, according to the first viewing angle rotation operation, the first function and the viewing angle rotation function of the first function control, and switch the first viewing angle picture to a second viewing angle picture, the second viewing angle picture being a picture of the virtual environment observed from a second viewing angle direction of the virtual character in the virtual environment; the receiving module being further configured to receive, while the first function control is in an on state, a second viewing angle rotation operation triggered based on the second function control; and the processing module being further configured to disable, according to the second viewing angle rotation operation, the viewing angle rotation function of the first function control, enable the second function and the viewing angle rotation function of the second function control, and switch the second viewing angle picture to a third viewing angle picture, the third viewing angle picture being a picture of the virtual environment observed from a third viewing angle direction of the virtual character in the virtual environment.
- A terminal, comprising: a memory; and a processor connected to the memory, the processor being configured to load and execute executable instructions to implement the viewing angle rotation method according to any one of claims 1 to 11.
- A computer-readable storage medium, storing at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by a processor to implement the viewing angle rotation method according to any one of claims 1 to 11.
- A computer program product, comprising instructions that, when run on a computer, cause the computer to perform the viewing angle rotation method according to any one of claims 1 to 11.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021565096A JP7309913B2 (ja) | 2019-07-26 | 2020-07-08 | 視点回転の方法、装置、端末およびコンピュータプラグラム |
SG11202110279UA SG11202110279UA (en) | 2019-07-26 | 2020-07-08 | Viewing angle rotation method, device, apparatus, and storage medium |
EP20846755.5A EP3925677A4 (en) | 2019-07-26 | 2020-07-08 | VIEWING ANGLE ROTATION METHOD, DEVICE, APPARATUS AND STORAGE MEDIA |
KR1020217034151A KR102663747B1 (ko) | 2019-07-26 | 2020-07-08 | 시야각 회전 방법, 디바이스, 장치, 및 저장 매체 |
US17/337,279 US11878240B2 (en) | 2019-07-26 | 2021-06-02 | Method, apparatus, device, and storage medium for perspective rotation |
JP2023110919A JP2023139033A (ja) | 2019-07-26 | 2023-07-05 | 視点回転の方法、装置、端末およびコンピュータプログラム |
US18/540,504 US20240123342A1 (en) | 2019-07-26 | 2023-12-14 | Method, apparatus, device, and storage medium for perspective rotation |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910683976.6A CN110393916B (zh) | 2019-07-26 | 2019-07-26 | 视角转动的方法、装置、设备及存储介质 |
CN201910683976.6 | 2019-07-26 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/337,279 Continuation US11878240B2 (en) | 2019-07-26 | 2021-06-02 | Method, apparatus, device, and storage medium for perspective rotation |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021017783A1 true WO2021017783A1 (zh) | 2021-02-04 |
Family
ID=68326262
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/100873 WO2021017783A1 (zh) | 2019-07-26 | 2020-07-08 | 视角转动的方法、装置、设备及存储介质 |
Country Status (7)
Country | Link |
---|---|
US (2) | US11878240B2 (zh) |
EP (1) | EP3925677A4 (zh) |
JP (2) | JP7309913B2 (zh) |
KR (1) | KR102663747B1 (zh) |
CN (1) | CN110393916B (zh) |
SG (1) | SG11202110279UA (zh) |
WO (1) | WO2021017783A1 (zh) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110393916B (zh) | 2019-07-26 | 2023-03-14 | 腾讯科技(深圳)有限公司 | 视角转动的方法、装置、设备及存储介质 |
CN111111168B (zh) * | 2019-12-16 | 2021-03-26 | 腾讯科技(深圳)有限公司 | 虚拟道具的控制方法和装置、存储介质及电子装置 |
US11562615B2 (en) * | 2020-04-10 | 2023-01-24 | Igt | Symbol substitution system |
CN111589132A (zh) * | 2020-04-26 | 2020-08-28 | 腾讯科技(深圳)有限公司 | 虚拟道具展示方法、计算机设备及存储介质 |
CN113589992B (zh) * | 2021-08-17 | 2023-09-12 | 网易(杭州)网络有限公司 | 游戏界面交互方法、游戏界面交互装置、介质及终端设备 |
USD1034646S1 (en) * | 2021-10-08 | 2024-07-09 | Samsung Electronics Co., Ltd. | Display screen or portion thereof with graphical user interface |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150157932A1 (en) * | 2012-07-06 | 2015-06-11 | WEMADE ENTERTAINMENT CO., LTD a corporation | Method of processing user gesture inputs in online game |
CN107694087A (zh) * | 2017-10-23 | 2018-02-16 | 网易(杭州)网络有限公司 | 信息处理方法及终端设备 |
CN108499105A (zh) * | 2018-04-16 | 2018-09-07 | 腾讯科技(深圳)有限公司 | 在虚拟环境中进行视角调整的方法、装置及存储介质 |
CN108815851A (zh) * | 2018-06-05 | 2018-11-16 | 腾讯科技(深圳)有限公司 | 在虚拟环境中射击时的界面显示方法、设备及存储介质 |
CN110393916A (zh) * | 2019-07-26 | 2019-11-01 | 腾讯科技(深圳)有限公司 | 视角转动的方法、装置、设备及存储介质 |
Family Cites Families (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7800592B2 (en) | 2005-03-04 | 2010-09-21 | Apple Inc. | Hand held electronic device with multiple touch sensing devices |
US9327191B2 (en) * | 2006-05-08 | 2016-05-03 | Nintendo Co., Ltd. | Method and apparatus for enhanced virtual camera control within 3D video games or other computer graphics presentations providing intelligent automatic 3D-assist for third person viewpoints |
US7768514B2 (en) * | 2006-12-19 | 2010-08-03 | International Business Machines Corporation | Simultaneous view and point navigation |
US9007379B1 (en) * | 2009-05-29 | 2015-04-14 | Two Pic Mc Llc | Methods and apparatus for interactive user control of virtual cameras |
JP5300777B2 (ja) * | 2010-03-31 | 2013-09-25 | 株式会社バンダイナムコゲームス | プログラム及び画像生成システム |
US20140002580A1 (en) | 2012-06-29 | 2014-01-02 | Monkeymedia, Inc. | Portable proprioceptive peripatetic polylinear video player |
US9227141B2 (en) * | 2013-12-31 | 2016-01-05 | Microsoft Technology Licensing, Llc | Touch screen game controller |
JP2016073663A (ja) | 2015-11-25 | 2016-05-12 | グリー株式会社 | プログラム、及び表示システム |
KR20160126848A (ko) * | 2015-12-22 | 2016-11-02 | 주식회사 인챈트인터렉티브 | 사용자의 제스처 입력을 처리하는 방법 |
CN105760076B (zh) * | 2016-02-03 | 2018-09-04 | 网易(杭州)网络有限公司 | 游戏控制方法及装置 |
US10354446B2 (en) * | 2016-04-13 | 2019-07-16 | Google Llc | Methods and apparatus to navigate within virtual-reality environments |
DE102016211453A1 (de) * | 2016-06-27 | 2017-12-28 | Conti Temic Microelectronic Gmbh | Verfahren und Fahrzeugsteuersystem zum Erzeugen von Abbildungen eines Umfeldmodells und entsprechendes Fahrzeug |
US10004991B2 (en) | 2016-06-28 | 2018-06-26 | Hothead Games Inc. | Systems and methods for customized camera views in virtualized environments |
JP6373920B2 (ja) | 2016-09-14 | 2018-08-15 | 株式会社バンダイナムコエンターテインメント | シミュレーションシステム及びプログラム |
JP6539253B2 (ja) * | 2016-12-06 | 2019-07-03 | キヤノン株式会社 | 情報処理装置、その制御方法、およびプログラム |
US10460492B2 (en) * | 2017-09-13 | 2019-10-29 | Canon Kabushiki Kaisha | Method, system and apparatus for navigating a virtual camera using a navigation device |
CN116450020B (zh) | 2017-09-26 | 2024-08-09 | 网易(杭州)网络有限公司 | 虚拟射击主体控制方法、装置、电子设备及存储介质 |
JP2018060539A (ja) * | 2017-10-02 | 2018-04-12 | 望月 玲於奈 | ユーザーインターフェースプログラム |
CN107890664A (zh) | 2017-10-23 | 2018-04-10 | 网易(杭州)网络有限公司 | 信息处理方法及装置、存储介质、电子设备 |
KR102309397B1 (ko) * | 2017-12-20 | 2021-10-06 | 레이아 인코포레이티드 | 크로스-렌더 멀티뷰 카메라, 시스템 및 방법 |
CN108376424A (zh) * | 2018-02-09 | 2018-08-07 | 腾讯科技(深圳)有限公司 | 用于对三维虚拟环境进行视角切换的方法、装置、设备及存储介质 |
CN108525294B (zh) | 2018-04-04 | 2021-07-27 | 网易(杭州)网络有限公司 | 射击游戏的控制方法和装置 |
CN108553891A (zh) | 2018-04-27 | 2018-09-21 | 腾讯科技(深圳)有限公司 | 对象瞄准方法和装置、存储介质及电子装置 |
CN108771863B (zh) | 2018-06-11 | 2022-04-15 | 网易(杭州)网络有限公司 | 射击游戏的控制方法和装置 |
WO2020133143A1 (en) * | 2018-12-28 | 2020-07-02 | Zhejiang Dahua Technology Co., Ltd. | Systems and methods for image display |
CN109821237B (zh) * | 2019-01-24 | 2022-04-22 | 腾讯科技(深圳)有限公司 | 视角转动的方法、装置、设备及存储介质 |
JP2022051972A (ja) * | 2019-02-06 | 2022-04-04 | ソニーグループ株式会社 | 情報処理装置および方法、並びにプログラム |
US11216149B2 (en) * | 2019-03-15 | 2022-01-04 | Samsung Electronics Co., Ltd. | 360° video viewer control using smart device |
CN110038297A (zh) | 2019-04-12 | 2019-07-23 | 网易(杭州)网络有限公司 | 移动终端的游戏操作方法及装置、存储介质及电子设备 |
JP6829298B1 (ja) * | 2019-10-18 | 2021-02-10 | 株式会社スクウェア・エニックス | プログラム、コンピュータ装置、及び制御方法 |
- 2019-07-26 CN CN201910683976.6A patent/CN110393916B/zh active Active
- 2020-07-08 JP JP2021565096A patent/JP7309913B2/ja active Active
- 2020-07-08 EP EP20846755.5A patent/EP3925677A4/en active Pending
- 2020-07-08 KR KR1020217034151A patent/KR102663747B1/ko active IP Right Grant
- 2020-07-08 SG SG11202110279UA patent/SG11202110279UA/en unknown
- 2020-07-08 WO PCT/CN2020/100873 patent/WO2021017783A1/zh active Application Filing
- 2021-06-02 US US17/337,279 patent/US11878240B2/en active Active
- 2023-07-05 JP JP2023110919A patent/JP2023139033A/ja active Pending
- 2023-12-14 US US18/540,504 patent/US20240123342A1/en active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20150157932A1 (en) * | 2012-07-06 | 2015-06-11 | WEMADE ENTERTAINMENT CO., LTD a corporation | Method of processing user gesture inputs in online game |
CN107694087A (zh) * | 2017-10-23 | 2018-02-16 | 网易(杭州)网络有限公司 | 信息处理方法及终端设备 |
CN108499105A (zh) * | 2018-04-16 | 2018-09-07 | 腾讯科技(深圳)有限公司 | 在虚拟环境中进行视角调整的方法、装置及存储介质 |
CN108815851A (zh) * | 2018-06-05 | 2018-11-16 | 腾讯科技(深圳)有限公司 | 在虚拟环境中射击时的界面显示方法、设备及存储介质 |
CN110393916A (zh) * | 2019-07-26 | 2019-11-01 | 腾讯科技(深圳)有限公司 | 视角转动的方法、装置、设备及存储介质 |
Also Published As
Publication number | Publication date |
---|---|
EP3925677A4 (en) | 2022-05-04 |
KR102663747B1 (ko) | 2024-05-09 |
US11878240B2 (en) | 2024-01-23 |
JP7309913B2 (ja) | 2023-07-18 |
EP3925677A1 (en) | 2021-12-22 |
CN110393916A (zh) | 2019-11-01 |
CN110393916B (zh) | 2023-03-14 |
JP2023139033A (ja) | 2023-10-03 |
US20210291053A1 (en) | 2021-09-23 |
SG11202110279UA (en) | 2021-10-28 |
US20240123342A1 (en) | 2024-04-18 |
JP2022531599A (ja) | 2022-07-07 |
KR20210142705A (ko) | 2021-11-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7231737B2 (ja) | 動作制御方法、装置、電子機器およびプログラム | |
CN109350964B (zh) | 控制虚拟角色的方法、装置、设备及存储介质 | |
CN108619721B (zh) | 虚拟场景中的距离信息显示方法、装置及计算机设备 | |
CN109529319B (zh) | 界面控件的显示方法、设备及存储介质 | |
WO2021017783A1 (zh) | 视角转动的方法、装置、设备及存储介质 | |
WO2019214402A1 (zh) | 虚拟环境中的配件切换方法、装置、设备及存储介质 | |
CN111921197B (zh) | 对局回放画面的显示方法、装置、终端及存储介质 | |
CN111202975B (zh) | 虚拟场景中的准星控制方法、装置、设备及存储介质 | |
WO2020151594A1 (zh) | 视角转动的方法、装置、设备及存储介质 | |
CN111589132A (zh) | 虚拟道具展示方法、计算机设备及存储介质 | |
US11675488B2 (en) | Method and apparatus for constructing building in virtual environment, device, and storage medium | |
TWI802978B (zh) | 應用程式內的控制項位置調整方法及裝置、設備及存儲介質 | |
WO2021031765A1 (zh) | 虚拟环境中瞄准镜的应用方法和相关装置 | |
JP7413563B2 (ja) | 仮想オブジェクトの制御方法、装置、機器及びコンピュータプログラム | |
CN110743168A (zh) | 虚拟场景中的虚拟对象控制方法、计算机设备及存储介质 | |
CN108744510A (zh) | 虚拟物品显示方法、装置及存储介质 | |
CN111013137A (zh) | 虚拟场景中的移动控制方法、装置、设备及存储介质 | |
US12061773B2 (en) | Method and apparatus for determining selected target, device, and storage medium | |
WO2022237076A1 (zh) | 虚拟对象的控制方法、装置、设备及计算机可读存储介质 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20846755 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 20846755.5 Country of ref document: EP |
ENP | Entry into the national phase |
Ref document number: 2020846755 Country of ref document: EP Effective date: 20210917 |
ENP | Entry into the national phase |
Ref document number: 20217034151 Country of ref document: KR Kind code of ref document: A |
ENP | Entry into the national phase |
Ref document number: 2021565096 Country of ref document: JP Kind code of ref document: A |
NENP | Non-entry into the national phase |
Ref country code: DE |