
WO2021017783A1 - Method, apparatus, device, and storage medium for viewing angle rotation - Google Patents

Method, apparatus, device, and storage medium for viewing angle rotation

Info

Publication number
WO2021017783A1
WO2021017783A1 (application PCT/CN2020/100873, CN2020100873W)
Authority
WO
WIPO (PCT)
Prior art keywords
viewing angle
function
angle rotation
control
function control
Prior art date
Application number
PCT/CN2020/100873
Other languages
English (en)
French (fr)
Inventor
杨槿
潘佳绮
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司
Priority to JP2021565096A (patent JP7309913B2)
Priority to SG11202110279UA (patent SG11202110279UA)
Priority to EP20846755.5A (patent EP3925677A4)
Priority to KR1020217034151A (patent KR102663747B1)
Publication of WO2021017783A1
Priority to US17/337,279 (patent US11878240B2)
Priority to JP2023110919A (patent JP2023139033A)
Priority to US18/540,504 (patent US20240123342A1)

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/214 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F13/2145 Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525 Changing parameters of virtual cameras
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525 Changing parameters of virtual cameras
    • A63F13/5252 Changing parameters of virtual cameras using two or more virtual cameras concurrently or sequentially, e.g. automatically switching between fixed virtual cameras when a character changes room or displaying a rear-mirror view in a car-driving game
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847 Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/66 Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6661 Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera
    • A63F2300/6669 Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera using a plurality of virtual cameras concurrently or sequentially, e.g. automatically switching between fixed virtual cameras when a character changes room
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/66 Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6661 Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera
    • A63F2300/6676 Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera by dedicated player input
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8076 Shooting
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082 Virtual reality

Definitions

  • the embodiments of the present application relate to the field of human-computer interaction, and particularly relate to viewing angle rotation technology.
  • the user aims at the shooting target and observes the environment by controlling the rotation of the virtual character's perspective.
  • the user interface of such an application program provides a viewing angle rotation control, and the user controls the rotation of the virtual character's viewing angle through movement operations (up, down, left, and right) triggered on that control; during this process, the screen can respond to the viewing angle rotation operation of only one contact at a time.
  • because the screen responds to only one contact at a time, once that contact disappears, rotating the viewing angle again requires triggering a new contact, which reduces interaction efficiency during operation.
  • the embodiments of the present application provide a viewing angle rotation method, apparatus, device, and storage medium, which can improve interaction efficiency during the viewing angle rotation operation.
  • the technical solution is as follows:
  • a method of viewing angle rotation, executed by a terminal, the method including:
  • the first-perspective screen is the screen shown when the virtual environment is observed from the virtual character's first perspective direction; a first function control and a second function control are superimposed on the first-perspective screen,
  • the first function control is used to support the first function and the viewing angle rotation function
  • the second function control is used to support the second function and the viewing angle rotation function
  • according to the first viewing angle rotation operation, the first function and viewing angle rotation function of the first function control are turned on, and the first-perspective screen is switched to the second-perspective screen;
  • the second-perspective screen is the screen shown when the virtual environment is observed from the virtual character's second perspective direction;
  • the viewing angle rotation function of the first function control is turned off, the second function and viewing angle rotation function of the second function control are turned on, and the second-perspective screen is switched to the third-perspective screen; the third-perspective screen is the screen shown when the virtual environment is observed from the virtual character's third perspective direction.
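The hand-off between the two function controls described in this aspect can be sketched as a small state machine. This is an illustrative reconstruction, not code from the patent; the class and method names (`ViewControl`, `PerspectiveManager`) are assumptions.

```python
class ViewControl:
    """A function control that supports its own function plus the
    viewing angle rotation function (e.g. open-scope or probe)."""
    def __init__(self, name):
        self.name = name
        self.function_on = False
        self.rotation_on = False


class PerspectiveManager:
    def __init__(self):
        self.first = ViewControl("first")
        self.second = ViewControl("second")
        self.screen = "first-view"

    def on_first_rotation(self):
        # Turn on the first control's function and rotation function,
        # switching the first-perspective screen to the second.
        self.first.function_on = True
        self.first.rotation_on = True
        self.screen = "second-view"

    def on_second_rotation(self):
        # While the first control is on, a rotation triggered on the
        # second control takes over: the first control keeps its own
        # function but loses the rotation function.
        self.first.rotation_on = False
        self.second.function_on = True
        self.second.rotation_on = True
        self.screen = "third-view"
```

In this sketch the screen state simply records which perspective is shown; a real engine would instead drive the camera model.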
  • a viewing angle rotation device which includes:
  • the display module is used to display the first-view picture of the application program.
  • the first-perspective picture is the picture shown when the virtual environment is observed from the virtual character's first perspective direction, and a first function control and a second function control are superimposed on the first-perspective picture;
  • the first function control is used to support the first function and the viewing angle rotation function
  • the second function control is used to support the second function and the viewing angle rotation function
  • a receiving module configured to receive a first viewing angle rotation operation triggered based on the first function control
  • the processing module is used for turning on the first function and the viewing angle rotation function of the first functional control according to the first viewing angle rotation operation, and switching the first viewing angle picture to the second viewing angle picture;
  • the second-perspective picture is the picture shown when the virtual environment is observed from the virtual character's second perspective direction;
  • the receiving module is further configured to receive a second viewing angle rotation operation triggered on the second function control when the first function control is in the on state
  • the processing module is used for turning off the viewing angle rotation function of the first functional control according to the second viewing angle rotation operation, turning on the second function and viewing angle rotation function of the second functional control, and switching the second viewing angle picture to the third viewing angle picture;
  • the third-perspective picture is the picture shown when the virtual environment is observed from the virtual character's third perspective direction.
  • a terminal, which includes:
  • a memory, and a processor connected to the memory;
  • the processor is configured to load and execute executable instructions to implement the method of viewing angle rotation described in the previous aspect and any of its optional embodiments.
  • a computer-readable storage medium, which stores at least one instruction, at least one program, a code set, or an instruction set;
  • the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by a processor to implement the method of viewing angle rotation described in the previous aspect and any of its optional embodiments.
  • a computer program product including instructions, which when run on a computer, cause the computer to execute the method of viewing angle rotation described in the previous aspect and any of its optional embodiments.
  • the terminal displays the first-view screen of the application program.
  • the first-view screen is superimposed with the first functional control and the second functional control.
  • the first functional control is used to support the first function and the viewing angle rotation function
  • the second functional control is used to support the second function and the viewing angle rotation function;
  • the terminal receives the first viewing angle rotation operation triggered on the first function control; according to that operation, the first function and viewing angle rotation function of the first function control are turned on, and the first-perspective screen is switched to the second-perspective screen; when the first function control is in the on state, a second viewing angle rotation operation triggered on the second function control is received; according to that operation, the viewing angle rotation function of the first function control is turned off, the second function and viewing angle rotation function of the second function control are turned on, and the second-perspective screen is switched to the third-perspective screen.
  • with the above method, while the viewing angle rotation function of the first function control is active, the screen can still respond to a viewing angle rotation operation triggered on the second function control; that is, the screen can respond to the viewing angle rotation operations of at least two contacts at the same time, which improves interaction efficiency during operation.
  • when the first function control is in the on state and a viewing angle rotation operation is triggered on the second function control, the terminal responds first to the operation triggered on the second function control; this ensures the order and accuracy of the terminal's responses to viewing angle rotation operations when multiple function controls providing the viewing angle rotation function are all turned on.
  • Fig. 1 is a schematic diagram of a camera model provided by an exemplary embodiment of the present application.
  • Fig. 2 is a structural block diagram of a terminal provided by an exemplary embodiment of the present application.
  • Fig. 3 is a structural block diagram of a terminal provided by another exemplary embodiment of the present application.
  • Fig. 4 is a flowchart of a method of viewing angle rotation provided by an exemplary embodiment of the present application.
  • Fig. 5 is a schematic diagram of an interface for viewing angle rotation provided by an exemplary embodiment of the present application.
  • Fig. 6 is a flowchart of a method of viewing angle rotation provided by another exemplary embodiment of the present application.
  • Fig. 7 is a flowchart of a method of viewing angle rotation provided by another exemplary embodiment of the present application.
  • Fig. 8 is a flowchart of a method of viewing angle rotation provided by another exemplary embodiment of the present application.
  • Fig. 9 is a flowchart of a method for setting a viewing angle rotation operation provided by an exemplary embodiment of the present application.
  • Fig. 10 is a schematic interface diagram of a method for setting a viewing angle rotation operation provided by an exemplary embodiment of the present application.
  • Fig. 11 is a schematic interface diagram of a method for setting a viewing angle rotation operation provided by another exemplary embodiment of the present application.
  • Fig. 12 is a flowchart of a method of viewing angle rotation provided by another exemplary embodiment of the present application.
  • Fig. 13 is a flowchart of a method of viewing angle rotation provided by another exemplary embodiment of the present application.
  • Fig. 14 is a flowchart of a method of viewing angle rotation provided by another exemplary embodiment of the present application.
  • Fig. 15 is a block diagram of a viewing angle rotation device provided by an exemplary embodiment of the present application.
  • Fig. 16 is a structural block diagram of a terminal provided by an exemplary embodiment of the present application.
  • Virtual environment: the virtual environment displayed (or provided) when the application program runs on the terminal.
  • the virtual environment may be a simulation environment of the real world, a semi-simulation and semi-fictional environment, or a purely fictitious environment.
  • the virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment.
  • the following embodiments take the virtual environment as a three-dimensional virtual environment as an example, but are not limited thereto.
  • Virtual character: a movable object in the virtual environment.
  • the movable object may be at least one of a virtual character, a virtual animal, and an animation character.
  • when the virtual environment is a three-dimensional virtual environment, the virtual character is a three-dimensional model created based on skeletal animation technology.
  • Each virtual character has its own shape and volume in the three-dimensional virtual environment, and occupies a part of the space in the three-dimensional virtual environment.
  • Viewing angle direction: the direction in which the virtual environment is observed from the first-person perspective, the third-person perspective, or another perspective of the virtual character.
  • other perspectives can be overhead perspectives or any possible other perspectives
  • the first-person perspective is the perspective of observing the virtual environment as the virtual character itself, and the observed virtual picture does not include the virtual character itself;
  • the third-person perspective is a perspective located outside the virtual character in the virtual environment, and the observed virtual picture includes the virtual character itself.
  • the viewing angle direction refers to the direction observed through the camera model when the virtual character observes in the virtual environment.
  • the camera model automatically follows the virtual character in the virtual environment; that is, when the position of the virtual character in the virtual environment changes, the camera model changes position simultaneously with it, and the camera model always stays within a preset distance of the virtual character in the virtual environment.
  • the relative position of the camera model and the virtual character does not change.
  • Camera model: a three-dimensional model located around the virtual character in the three-dimensional virtual environment.
  • when the first-person perspective is adopted, the camera model is located near or at the head of the virtual character;
  • when the third-person perspective is adopted, the camera model can be located behind the virtual character and bound to it, or at any position a preset distance away from the virtual character.
  • the camera model can observe the virtual character in the three-dimensional virtual environment from different angles.
  • when the third-person perspective is a first-person over-the-shoulder perspective, the camera model is located behind the virtual character (for example, at the head and shoulders of the virtual character).
  • the perspective also includes other perspectives, such as a top-down perspective; when a top-down perspective is used, the camera model can be located above the virtual character's head, observing the virtual environment from an aerial angle of view.
  • the camera model is not actually displayed in the three-dimensional virtual environment, that is, the camera model is not displayed in the three-dimensional virtual environment displayed on the user interface.
  • a virtual character corresponds to a camera model.
  • the camera model can be rotated with the virtual character as the center of rotation; for example, any point of the virtual character can serve as the center of rotation around which the camera model rotates.
  • the camera model not only rotates in angle, but also shifts in displacement.
  • the distance between the camera model and the center of rotation remains unchanged. That is, the camera model is rotated on the surface of the sphere with the center of rotation as the center of the sphere, where any point of the virtual character can be the head, torso, or any point around the virtual character.
  • during rotation, the center of the camera model's angle of view points in the direction from the point on the spherical surface where the camera model is located toward the center of the sphere.
  • the camera model can also observe the virtual character at a preset angle in different directions.
  • a point is determined in the virtual character 11 as the rotation center 12, and the camera model rotates around the rotation center 12.
  • the camera model is configured with an initial position, which is a position above and behind the virtual character (for example, a position behind the head).
  • the initial position is position 13, and when the camera model rotates to position 14 or position 15, the viewing angle direction of the camera model changes with the rotation of the camera model.
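The orbiting behavior described above (the camera staying on a sphere around the rotation center, with its viewing direction pointing at the center) can be sketched as follows. This is an illustrative reconstruction, not code from the patent; the function names and the yaw/pitch parameterization are assumptions.

```python
import math

def camera_position(center, radius, yaw, pitch):
    """Place the camera on a sphere of the given radius around the
    rotation center; yaw and pitch are in radians."""
    x = center[0] + radius * math.cos(pitch) * math.sin(yaw)
    y = center[1] + radius * math.sin(pitch)
    z = center[2] + radius * math.cos(pitch) * math.cos(yaw)
    return (x, y, z)

def view_direction(camera, center):
    """The viewing direction points from the camera's point on the
    spherical surface toward the center of the sphere."""
    d = tuple(c - p for p, c in zip(camera, center))
    length = math.sqrt(sum(v * v for v in d))
    return tuple(v / length for v in d)
```

Because the position is always computed from the same radius, the distance between the camera model and the rotation center remains unchanged while yaw and pitch vary, matching the sphere-surface rotation described above.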
  • the terminal in this application can be a laptop computer, a mobile phone, a tablet computer, an e-book reader, an electronic game console, a Moving Picture Experts Group Audio Layer IV (MP4) player, or the like.
  • the aforementioned terminal includes a pressure touch screen 120, a memory 140, and a processor 160. Please refer to the structural block diagram of the terminal shown in FIG. 2.
  • the touch screen 120 may be a capacitive screen or a resistive screen.
  • the touch screen 120 is used to implement interaction between the terminal and the user.
  • the terminal obtains the viewing angle rotation operation triggered by the user through the touch screen 120.
  • the memory 140 may include one or more computer-readable storage media.
  • the foregoing computer storage medium includes at least one of random access memory (Random Access Memory, RAM), read only memory (Read Only Memory, ROM), and flash memory (Flash).
  • An operating system 142 and application programs 144 are installed in the memory 140.
  • the operating system 142 is basic software that provides the application program 144 with secure access to computer hardware.
  • the operating system 142 may be an Android system or an Apple iOS system.
  • the application program 144 is an application program supporting a virtual environment, and the virtual environment includes a virtual character.
  • the application program 144 is an application program supporting a three-dimensional virtual environment.
  • the application program 144 may be any one of a virtual reality application program, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, a MOBA game, and a multiplayer gun battle survival game.
  • the application 144 may be a stand-alone version of the application, such as a stand-alone version of a 3D game program; it may also be an online version of the application.
  • the processor 160 may include one or more processing cores, such as a 4-core processor or an 8-core processor.
  • the processor 160 is configured to execute a viewing angle rotation command according to the viewing angle rotation operation of the virtual character received on the touch screen 120.
  • the foregoing terminal may further include a gyroscope 180.
  • the above-mentioned gyroscope 180 is used to obtain the viewing angle rotation operation of the virtual character triggered by the user.
  • Fig. 4 is a flowchart of a method of viewing angle rotation provided by an exemplary embodiment of the present application. The method is applied to the terminal shown in Fig. 2 or Fig. 3 as an example for illustration. The method includes:
  • Step 201 Display a first-view picture of the application program.
  • the terminal displays the first-view picture of the application program.
  • the application program may be at least one of a virtual reality application, a three-dimensional map application, a military simulation program, a TPS game, an FPS game, and a MOBA game.
  • the first-perspective picture is a picture when the virtual character's first-perspective direction is adopted in the virtual environment to observe the virtual environment.
  • the first perspective direction may be a direction of observing the virtual environment using at least one of a first person perspective, a third person perspective, or other perspectives.
  • the other viewing angles can be a top view or any other possible viewing angles.
  • the virtual environment picture corresponding to the first-person perspective does not include the virtual character itself, while the pictures corresponding to the third-person perspective and the top-down perspective do; for example, when the virtual environment is observed through the camera model, the three-dimensional model of the virtual character and the virtual firearm held by the virtual character can be seen.
  • a first function control and a second function control are superimposed on the first view angle screen, the first function control is used to support the first function and the viewing angle rotation function, and the second function control is used to support the second function and the viewing angle rotation function.
  • the first function and the second function each refer to a function other than the viewing angle rotation function.
  • the other functions may be, for example, an open-scope (aiming) function, a probe function, or a shooting function.
  • the first function control includes at least one of an open-scope control, a probe control, and a shooting control.
  • the second function control includes at least one of an open-scope control, a probe control, and a shooting control.
  • the first functional control is different from the second functional control.
  • for example, the first function control is an open-scope control
  • and the second function control is a probe control.
  • the open-scope control is used to turn the sight on or off; the sight is used to assist in aiming at the target during shooting.
  • the sight may include a magnifying scope, a red-dot sight, and a holographic sight.
  • the probe control is used to make the virtual character lean its head out from behind an obstruction, thereby reducing the character's own exposed area.
  • the shooting control is used to control fire, for example, to control a virtual rifle to fire at the target.
  • Step 202 Receive a first viewing angle rotation operation triggered based on the first function control.
  • the terminal receives a first view angle rotation operation triggered based on the first function control; optionally, the first view angle rotation operation includes any one of a click operation and a long press operation.
  • Step 203 Turn on the first function and the viewing angle rotation function of the first functional control according to the first viewing angle rotation operation, and switch the first viewing angle picture to the second viewing angle picture.
  • the terminal rotates a corresponding angle based on the first viewing angle direction according to the first viewing angle rotation operation to rotate the first viewing angle picture to the second viewing angle picture.
  • The second viewing angle picture is the picture observed in the virtual environment from the virtual character's second viewing angle direction.
  • Optionally, the terminal generates a first sequence label of the first function control according to the first viewing angle rotation operation; the first sequence label is used to determine whether to turn the viewing angle rotation function of the first function control on or off.
  • Optionally, when the first function control is a scope control, the terminal opens the collimator according to the first viewing angle rotation operation, and turns on the viewing angle rotation function to switch the first viewing angle picture to the second viewing angle picture.
  • Optionally, when the first function control is a probe control, the terminal turns on the probe function according to the first viewing angle rotation operation, and turns on the viewing angle rotation function to switch the first viewing angle picture to the second viewing angle picture.
  • Optionally, when the first function control is a shooting control, the terminal fires according to the first viewing angle rotation operation, and turns on the viewing angle rotation function to switch the first viewing angle picture to the second viewing angle picture.
  • It should be noted that the shooting control supports two firing modes: first, fire on press; second, fire on release. Therefore, when the terminal fires according to the first viewing angle rotation operation, it may fire when the shooting control is pressed, or it may fire when the shooting control is pressed and then released.
  • Step 204 Receive a second viewing angle rotation operation triggered based on the second functional control when the first functional control is in the on state.
  • the terminal receives a second viewing angle rotation operation triggered based on the second function control.
  • the second viewing angle rotation operation includes any one of a click operation and a long press operation.
  • Step 205 Turn off the viewing angle rotation function of the first functional control according to the second viewing angle rotation operation, turn on the second function and viewing angle rotation function of the second functional control, and switch the second viewing angle image to the third viewing angle image.
  • the terminal rotates the corresponding angle based on the second viewing angle direction according to the second viewing angle rotation operation to rotate the second viewing angle picture to the third viewing angle picture.
  • The third viewing angle picture is the picture observed in the virtual environment from the virtual character's third viewing angle direction.
  • Optionally, when the second function control is a scope control, the terminal opens the collimator according to the second viewing angle rotation operation, and turns on the viewing angle rotation function to switch the second viewing angle picture to the third viewing angle picture.
  • Optionally, when the second function control is a probe control, the terminal turns on the probe function according to the second viewing angle rotation operation, and turns on the viewing angle rotation function to switch the second viewing angle picture to the third viewing angle picture.
  • Optionally, when the second function control is a shooting control, the terminal fires according to the second viewing angle rotation operation, and turns on the viewing angle rotation function to switch the second viewing angle picture to the third viewing angle picture.
  • Optionally, the first function of the first function control is in the on state, for example, the collimator is in the on state, or the probe function is in the on state. It should be noted that if the first function control is a shooting control in a non-continuous-fire state, the shooting control is still regarded as on after firing once, but no further bullets are fired.
  • Optionally, the terminal generates a second sequence label of the second function control according to the second viewing angle rotation operation; the second sequence label is used to determine whether to enable or disable the viewing angle rotation function of the second function control.
  • the first functional control includes the first functional control in an on state, and the first functional control in the on state corresponds to a first sequence label; the schematic steps for generating the second sequence label are as follows:
  • The terminal obtains the largest sequence label x from the first sequence labels according to the second viewing angle rotation operation, and determines the second sequence label as x+1; for example, if the first sequence labels include 1 and 2, the largest sequence label is 2, and the second sequence label is determined to be 3.
  • the terminal determines whether the second sequence label is greater than the first sequence label; when the second sequence label is greater than the first sequence label, the terminal turns off the viewing angle rotation function of the first functional control, and turns on the second function and viewing angle rotation function of the second functional control .
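The label-generation rule above (the new label is the largest existing label plus one, and the larger label wins) can be sketched as follows. This is only an illustrative sketch; the function and variable names are invented and do not come from the patent.

```python
# Sketch of the sequence-label rule: each newly triggered control with the
# viewing-angle-rotation function gets (largest existing label) + 1, so its
# label is always greater than every first sequence label and rotation
# control switches to it.

def next_label(existing_labels):
    """Second sequence label = largest first sequence label + 1."""
    return max(existing_labels, default=0) + 1

first_labels = [1, 2]                        # two first function controls held down
second_label = next_label(first_labels)      # 2 + 1 = 3, as in the example above
rotation_goes_to_second = second_label > max(first_labels)
```

Because labels grow monotonically, the comparison "second label greater than first label" always resolves in favor of the most recently triggered control.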
  • In summary, in the method provided in this embodiment, the terminal displays the first viewing angle picture of the application program.
  • A first function control and a second function control are superimposed on the first viewing angle picture; the first function control is used to support the first function and the viewing angle rotation function, and the second function control is used to support the second function and the viewing angle rotation function.
  • The terminal receives the first viewing angle rotation operation triggered based on the first function control; according to the first viewing angle rotation operation, the terminal turns on the first function and the viewing angle rotation function of the first function control and switches the first viewing angle picture to the second viewing angle picture; when the first function control is in the on state, the terminal receives the second viewing angle rotation operation triggered based on the second function control; according to the second viewing angle rotation operation, the terminal turns off the viewing angle rotation function of the first function control, turns on the second function and the viewing angle rotation function of the second function control, and switches the second viewing angle picture to the third viewing angle picture.
  • The above method can also respond to a viewing angle rotation operation triggered via the second function control while the viewing angle rotation function of the first function control is triggered; that is, the screen can respond to viewing angle rotation operations at two or more contacts at the same time, which improves interaction efficiency during operation.
  • When the first function control is in the on state and the viewing angle rotation function is triggered based on the second function control, the terminal responds first to the viewing angle rotation operation triggered based on the second function control, which ensures the orderliness and accuracy of the terminal's response to the viewing angle rotation operation when multiple function controls with the viewing angle rotation function are turned on.
  • In addition, the setting of multiple function controls with the viewing angle rotation function ensures that players can freely complete the viewing angle rotation operation in different states, providing more flexibility and operating space for combat.
  • Schematically, the switching of the viewing angle rotation function between the first function control and the second function control is shown.
  • The user interface 21 in the first viewing angle includes the probe control 22, the scope control 23, and the shooting control 24. The terminal receives the first viewing angle rotation operation based on the probe control 22 and rotates the first viewing angle picture to the second viewing angle picture, such as the user interface 25 in the second viewing angle; relative to the first viewing angle picture, the second viewing angle picture has moved the distance L1 to the right. While the probe control 22 is still being triggered, the terminal receives the second viewing angle rotation operation based on the shooting control 24 and rotates the second viewing angle picture to the third viewing angle picture, such as the user interface 26 in the third viewing angle; relative to the second viewing angle picture, the third viewing angle picture has moved the distance L2 to the right.
  • In another optional embodiment, when the second viewing angle rotation operation on the second function control ends, the terminal may determine a function control from among the first function controls to perform the viewing angle rotation operation; schematically, steps 206 to 208 are added after step 205, as shown in FIG. 6, and the steps are as follows:
  • Step 206 Determine whether to end the second viewing angle rotation operation on the second function control.
  • the terminal determines whether to end the second viewing angle rotation operation on the second function control.
  • step 207 is executed; when the terminal does not end the second view angle rotation operation on the second function control, step 208 is executed.
  • the terminal uses a drag operation to rotate the view angle of the virtual character, that is, the second view angle rotation operation also includes a drag operation.
  • Optionally, the second viewing angle rotation operation includes a click operation and a drag operation; when both the click operation and the drag operation have ended, step 207 is executed; otherwise, step 208 is executed.
  • Optionally, the second viewing angle rotation operation includes a long-press operation and a drag operation; when both the long-press operation and the drag operation have ended, step 207 is executed; otherwise, step 208 is executed.
  • the second viewing angle rotation operation on the second function control ends, that is, the viewing angle rotation function on the second function control is turned off.
  • Step 207 Determine the i-th first function control in the open state from the first function controls in the open state, and turn on the viewing angle rotation function of the i-th first function control in the open state.
  • Schematically, the terminal determines the i-th first function control in the on state from the first function controls in the on state, and turns on the viewing angle rotation function of the i-th first function control in the on state; i is a positive integer.
  • Schematically, the n first function controls in the on state respectively correspond to n first sequence labels; the largest sequence label is determined from the n first sequence labels, the i-th first function control in the on state corresponding to the largest sequence label is filtered out, and the viewing angle rotation function of that i-th first function control is turned on; n is a positive integer.
  • Step 208 Continue to perform the second viewing angle rotation operation on the second function control.
  • In summary, when the second viewing angle rotation operation ends, the first function control corresponding to the largest sequence label automatically takes over the viewing angle rotation, which greatly guarantees the orderliness and accuracy of the terminal's response to the viewing angle rotation operation when multiple function controls with the viewing angle rotation function exist. When the contact of one viewing angle rotation disappears, the viewing angle rotation can still respond to the contact of another viewing angle rotation, and this also avoids the picture freezing when a viewing angle rotation contact is triggered again.
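Step 207's selection of the takeover control can be sketched like this. The names are hypothetical, and the mapping from pressed controls to their sequence labels is assumed to be tracked elsewhere:

```python
# Sketch of step 207: among the n first function controls still in the
# pressed/on state, pick the one whose first sequence label is largest and
# hand the viewing-angle-rotation function back to it.

def takeover_control(pressed_labels):
    """pressed_labels maps each pressed control name to its sequence label.

    Returns the control that should take over viewing angle rotation, or
    None if no first function control is still pressed.
    """
    if not pressed_labels:
        return None
    return max(pressed_labels, key=pressed_labels.get)

owner = takeover_control({"probe": 1, "scope": 2})   # the larger label wins
```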
  • The embodiment shown in FIG. 6 described above can be divided into two parts according to the operating state of the second function control: the pressing process and the releasing process. Schematic description of the pressing process of the second function control:
  • Step 31 start.
  • Step 32 The terminal receives a pressing operation on the second functional control with the viewing angle rotation function (triggering of the second viewing angle rotation operation).
  • Step 33 The terminal activates the viewing angle rotation function of the second functional control according to the pressing operation, and marks the second sequence number for the second functional control.
  • Schematically, the terminal marks the second function control with the second sequence number; the most recently triggered function control with the viewing angle rotation function is marked with sequence number x, and the second sequence number is x+1; that is, before the second function control is triggered, x is the largest sequence label; x is a positive integer.
  • Step 34 The terminal judges whether a first functional control with a viewing angle rotation function is in a pressed state.
  • When the terminal determines that a first function control with the viewing angle rotation function is in the pressed state, step 35 is executed; otherwise, step 36 is executed.
  • Step 35 The terminal turns off the viewing angle rotation function of the first function control.
  • Step 36 End. It should be noted that the end here refers to the end of searching for a first function control in the pressed state; the viewing angle rotation function of the second function control remains in the on state.
  • Step 41 start.
  • Step 42 The terminal receives the release operation on the second function control with the viewing angle rotation function (cancellation of the second viewing angle rotation operation).
  • Step 43 The terminal judges whether the viewing angle rotation function of the second function control is in an on state.
  • When the terminal determines that the viewing angle rotation function of the second function control is not in the on state, step 47 is executed; otherwise, step 44 is executed.
  • Step 44 The terminal judges whether a first function control with a viewing angle rotation function is in a pressed state.
  • When the terminal determines that a first function control with the viewing angle rotation function is in the pressed state, step 45 is executed; otherwise, step 47 is executed.
  • Step 45 The terminal searches for the first functional control corresponding to the largest sequence label.
  • the terminal determines the first function control corresponding to the largest sequence label from the n first function controls in the pressed state.
  • Step 46 The terminal enables the viewing angle rotation function of the first function control corresponding to the largest sequence label.
  • Step 47 End.
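The pressing flow (steps 31 to 36) and the releasing flow (steps 41 to 47) together form a small arbiter for the rotation contact. The following is a minimal sketch with invented class and method names, assuming a single rotation owner at a time:

```python
# Compact state machine for the press/release flows described above: a press
# assigns the next sequence label and takes over rotation; a release hands
# rotation back to the pressed control with the largest remaining label.

class RotationArbiter:
    def __init__(self):
        self.pressed = {}          # control name -> sequence label
        self.next_label = 1
        self.rotation_owner = None

    def press(self, control):
        # Steps 32-33: mark the control and give it the next sequence label.
        self.pressed[control] = self.next_label
        self.next_label += 1
        # Steps 34-35: any previously pressed control loses rotation control.
        self.rotation_owner = control

    def release(self, control):
        # Step 42: the control is no longer pressed.
        self.pressed.pop(control, None)
        # Steps 43-46: if the released control owned the rotation, hand it to
        # the pressed control with the largest remaining sequence label.
        if self.rotation_owner == control:
            self.rotation_owner = (
                max(self.pressed, key=self.pressed.get) if self.pressed else None
            )

arb = RotationArbiter()
arb.press("probe")     # probe owns rotation (label 1)
arb.press("shoot")     # shoot takes over (label 2)
arb.release("shoot")   # probe, with the largest remaining label, takes back over
```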
  • the user can customize the trigger mode of the viewing angle rotation operation.
  • the trigger mode of the viewing angle rotation operation can be defined as a click operation, or a long press operation, or a touch operation.
  • In an optional embodiment, the customization of the trigger mode of the viewing angle rotation operation is described, as shown in FIG. 9; the steps are as follows:
  • Step 301 Display the setting interface of the application.
  • the setting interface of the application program is displayed on the terminal, and the setting interface includes at least two mode setting controls, and the mode setting controls are used to set the trigger mode of the viewing angle rotation operation.
  • the mode setting control includes at least two of a click mode setting control, a long-press mode setting control, and a mixed mode setting control.
  • The viewing angle rotation operation corresponding to the click mode setting control is a click operation; the viewing angle rotation operation corresponding to the long-press mode setting control is a long-press operation; the viewing angle rotation operation corresponding to the mixed mode setting control is a touch operation, and the duration of the touch operation is used to determine whether the second function of the second function control is turned on or off.
  • the viewing angle rotation operation includes any one of a first viewing angle rotation operation and a second viewing angle rotation operation.
  • Step 302 Receive a selection operation triggered on the setting interface.
  • The selection operation is used to select, from the at least two mode setting controls, the mode setting control corresponding to the target trigger mode.
  • the selection operation may include at least one of a single-click operation, a double-click operation, a long press operation, and a sliding operation.
  • Step 303 According to the selection operation, the trigger mode of the viewing angle rotation operation is determined as the target trigger mode.
  • According to the selection operation, the terminal determines the trigger mode of the viewing angle rotation operation as the target trigger mode; optionally, the target trigger mode includes any one of a click operation, a long-press operation, and a touch operation.
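Steps 301 to 303 amount to storing the selected trigger mode. A minimal sketch follows; the mode names mirror the text, while the `Settings` class and its default value are assumptions made for illustration:

```python
# Sketch of steps 301-303: the setting interface offers one setting control
# per trigger mode, and the user's selection becomes the target trigger mode.

TRIGGER_MODES = {"click", "long_press", "mixed"}   # mixed = touch operation

class Settings:
    def __init__(self):
        self.trigger_mode = "click"       # assumed default, not from the patent

    def select(self, mode):
        # Step 303: the selected mode becomes the target trigger mode.
        if mode not in TRIGGER_MODES:
            raise ValueError(f"unknown trigger mode: {mode}")
        self.trigger_mode = mode

settings = Settings()
settings.select("mixed")                  # user taps the mixed mode control
```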
  • Schematically, the setting interface 51 of the application includes setting buttons for three scope-opening modes: the click mode setting control 52, the long-press mode setting control 53, and the mixed mode setting control 54; the user can choose any one of the three scope-opening modes.
  • the setting interface 55 of the application program includes setting buttons for three probe modes: click mode setting control 56, long-press mode setting control 57, and mixed mode setting control 58. The user can select any of the three probe modes.
  • In summary, in the method provided in this embodiment, the user can customize the trigger mode of the viewing angle rotation operation, adapt it to the user's own shooting habits and operating characteristics, and meet the independent operation needs of users at different levels; this enriches the user's choices and provides a more personalized combat experience.
  • It should be noted that the first function control also includes a first function, and the second function control also includes a second function; the first viewing angle rotation operation also controls the opening and closing of the first function, and the second viewing angle rotation operation also controls the opening and closing of the second function.
  • the target trigger mode includes a click operation; the terminal starts the second function of the second function control according to the click operation; when the click operation ends, keeps the second function of the second function control in an on state.
  • the terminal closes the second function of the second function control according to a click operation on the second function control again.
  • Schematically, the process in which the scope control controls the opening and closing of the scope is as follows:
  • Step 61 start.
  • Step 62 The terminal receives the click operation on the scope control.
  • Step 63 The terminal judges whether the scope-opening function is in the on state.
  • When the terminal determines that the scope-opening function is in the on state, step 64 is executed; otherwise, step 65 is executed.
  • Step 64 The terminal closes the scope.
  • Step 65 The terminal opens the scope.
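Steps 61 to 65 describe a simple toggle: each click inverts the scope state, and the state persists after the click ends. A sketch with invented names:

```python
# Sketch of the click-mode flow (steps 61-65): a click closes the scope if
# it is open and opens it otherwise; the state is kept after the click ends.

class ClickModeScope:
    def __init__(self):
        self.open = False

    def click(self):
        # Steps 63-65 collapse to a state toggle.
        self.open = not self.open

scope = ClickModeScope()
scope.click()   # first click opens the scope
scope.click()   # second click closes it again
```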
  • the target trigger mode includes a long-press operation; the terminal activates the second function of the second function control according to the long-press operation; when the long-press operation ends, keep the second function of the second function control in an on state.
  • the terminal closes the second function of the second function control according to a click operation on the second function control.
  • Schematically, the process in which the scope control controls the opening and closing of the scope is as follows:
  • Step 71 start.
  • Step 72 The terminal receives the long-press operation on the scope control.
  • Step 73 The terminal opens the scope.
  • Step 74 The terminal judges whether the long-press operation on the scope control has ended.
  • When the terminal determines that the long-press operation on the scope control has ended, step 75 is executed; otherwise, step 76 is executed.
  • Step 75 The terminal closes the scope.
  • Step 76 The terminal keeps the scope in the on state.
  • In an optional embodiment, the target trigger mode includes a touch operation; the terminal activates the second function of the second function control according to the touch operation; when the touch operation ends, the duration of the touch operation is acquired; when the duration is greater than the time threshold, the second function of the second function control is kept in the on state.
  • The time threshold is used to determine whether to keep the second function of the second function control in the on state after the touch operation ends; when the duration is less than or equal to the time threshold, the second function of the second function control is turned off.
  • the terminal closes the second function of the second function control according to a click operation on the second function control.
  • Schematically, the process in which the scope control controls the opening and closing of the scope is as follows:
  • Step 81 start.
  • Step 82 The terminal receives the touch operation on the scope control.
  • Step 83 The terminal judges whether the scope-opening function is in the on state.
  • When the terminal determines that the scope-opening function is in the on state, step 84 is executed; otherwise, step 85 is executed.
  • Step 84 The terminal closes the scope.
  • Step 85 The terminal opens the scope.
  • Step 86 The terminal judges whether the operation duration at the end of the touch operation is greater than the time threshold.
  • When the terminal determines that the operation duration at the end of the touch operation is greater than the time threshold, step 87 is executed; otherwise, step 88 is executed.
  • Schematically, the time threshold may be 0.2 seconds (s); when the terminal determines that the operation duration at the end of the touch operation is greater than 0.2 s, step 87 is executed; otherwise, step 88 is executed.
  • Step 87 The terminal determines that the touch operation is a long-press operation and keeps the scope in the on state.
  • Step 88 The terminal determines that the touch operation is a click operation and closes the scope.
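Steps 81 to 88 can be sketched as follows. The 0.2 s threshold comes from the text; the class and method names are invented for the example:

```python
# Sketch of the mixed-mode flow (steps 81-88): the touch toggles the scope
# when it lands, and the duration measured at release decides whether the
# touch counts as a long press (keep the current state) or a click (close
# the scope).

TIME_THRESHOLD = 0.2   # seconds, per the schematic example above

class MixedModeScope:
    def __init__(self):
        self.open = False

    def touch_start(self):
        # Steps 83-85: close the scope if it is on, open it otherwise.
        self.open = not self.open

    def touch_end(self, duration):
        # Steps 86-88: duration > threshold -> long press, keep state;
        # otherwise -> click, close the scope.
        if duration <= TIME_THRESHOLD:
            self.open = False

scope = MixedModeScope()
scope.touch_start()        # touch lands, scope opens
scope.touch_end(0.5)       # 0.5 s > 0.2 s: treated as long press, stays open
```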
  • In an optional embodiment, the response logic of the viewing angle rotation when the first function control and the second function control are triggered is customized by the user. Schematically, the terminal controls the viewing angle rotation of the virtual character through the first function control according to the custom logic, or controls the viewing angle rotation of the virtual character through the second function control according to the custom logic; the custom logic is the user-defined response logic to the viewing angle rotation operation when the first function control and the second function control are triggered.
  • Schematically, the custom logic is that when the first function control and the second function control are triggered at the same time, the terminal enables the viewing angle rotation function of the first function control and disables the viewing angle rotation function of the second function control; therefore, when the first function control and the second function control are triggered at the same time, the terminal controls the viewing angle rotation of the virtual character through the first function control.
  • the terminal also controls the viewing angle rotation of the virtual character through a gyroscope.
  • the terminal receives its own rotation operation, and controls the rotation of the virtual character's viewing angle through the gyroscope.
  • In summary, the viewing angle rotation method provided in this embodiment offers users the ability to customize the response logic of viewing angle rotation, enabling users to define control logic for the viewing angle rotation function that better matches their own operating habits, and improves the operating experience during user engagement.
  • In addition, the viewing angle rotation of the virtual character can also be controlled by the gyroscope, so that while rotating the viewing angle of the virtual character, the user can also perform other operations on the virtual character, which improves the interaction efficiency in battle.
  • FIG. 15 shows a viewing angle rotation device provided by an exemplary embodiment of the present application.
  • the device may form part or all of the terminal through software, hardware, or a combination of the two, and the device includes:
  • The display module 401 is used to display the first viewing angle picture of the application program; the first viewing angle picture is the picture observed in the virtual environment from the virtual character's first viewing angle direction; a first function control and a second function control are superimposed on the first viewing angle picture; the first function control is used to support the first function and the viewing angle rotation function, and the second function control is used to support the second function and the viewing angle rotation function;
  • the receiving module 402 is configured to receive the first viewing angle rotation operation triggered based on the first function control
  • The processing module 403 is configured to turn on the first function and the viewing angle rotation function of the first function control according to the first viewing angle rotation operation, and switch the first viewing angle picture to the second viewing angle picture; the second viewing angle picture is the picture observed in the virtual environment from the virtual character's second viewing angle direction;
  • the receiving module 402 is configured to receive a second viewing angle rotation operation triggered based on the second functional control when the first functional control is in the on state;
  • The processing module 403 is configured to turn off the viewing angle rotation function of the first function control according to the second viewing angle rotation operation, turn on the second function and the viewing angle rotation function of the second function control, and switch the second viewing angle picture to the third viewing angle picture; the third viewing angle picture is the picture observed in the virtual environment from the virtual character's third viewing angle direction.
  • the processing module 403 includes:
  • a generating sub-module 4032, used to generate the second sequence label of the second function control according to the second viewing angle rotation operation;
  • a processing sub-module 4034, used to turn off the viewing angle rotation function of the first function control and turn on the second function and the viewing angle rotation function of the second function control when the second sequence label is greater than the first sequence label; the first sequence label is the sequence label of the first function control.
  • Optionally, the first function control includes a first function control in the on state, and the first function control in the on state corresponds to a first sequence label;
  • the generating sub-module 4032 is used to obtain the largest sequence label from the first sequence labels according to the second viewing angle rotation operation, and determine the sequence label obtained by adding one to the largest sequence label as the second sequence label.
  • the first functional control includes the first functional control in an on state
  • the processing sub-module 4034 is configured to, when the second viewing angle rotation operation based on the second function control ends, determine the i-th first function control in the on state from the first function controls in the on state, and turn on the viewing angle rotation function of the i-th first function control in the on state; i is a positive integer.
  • Optionally, the first function control in the on state corresponds to a first sequence label;
  • the processing sub-module 4034 is configured to enable the viewing angle rotation function of the i-th first functional control in the open state when the first sequence label of the i-th first functional control in the open state is the maximum sequence label.
  • the display module 401 is configured to display the setting interface of the application program, and the setting interface includes at least two mode setting controls, and the mode setting controls are used to set the trigger mode of the viewing angle rotation operation;
  • the receiving module 402 is configured to receive a selection operation triggered on the setting interface, and the selection operation is used to select a mode setting control corresponding to a target trigger mode among at least two mode setting controls;
  • the determining module 404 is configured to determine the trigger mode of the viewing angle rotation operation as the target trigger mode according to the selection operation;
  • the viewing angle rotation operation includes any one of a first viewing angle rotation operation and a second viewing angle rotation operation.
  • the viewing angle rotation operation includes a second viewing angle rotation operation
  • the target trigger mode includes a click operation
  • the processing module 403 is configured to enable the second function of the second function control according to the click operation; when the click operation is ended, keep the second function of the second function control in the on state.
  • the viewing angle rotation operation includes a second viewing angle rotation operation
  • the target trigger mode includes a long press operation
  • the processing module 403 is configured to enable the second function of the second function control according to the long-press operation; when the long-press operation is ended, keep the second function of the second function control in the on state.
  • the viewing angle rotation operation includes a second viewing angle rotation operation
  • the target trigger mode includes a touch operation
  • the processing module 403 is configured to enable the second function of the second functional control according to the touch operation; when the touch operation is ended, obtain the duration of the touch operation; when the duration is greater than the time threshold, keep the second function of the second functional control at Open state.
  • the processing module 403 is configured to turn off the second function of the second function control when the duration is less than or equal to the time threshold.
  • the processing module 403 is configured to control the viewing angle rotation of the virtual character through the first function control according to the custom logic; or, control the viewing angle rotation of the virtual character through the second function control according to the custom logic;
  • the custom logic is the user-defined response logic to the viewing angle rotation operation when the first function control and the second function control are triggered.
  • In summary, in the device provided in this embodiment, the first viewing angle picture of the application is displayed on the terminal.
  • A first function control and a second function control are superimposed on the first viewing angle picture; the first function control is used to support the first function and the viewing angle rotation function, and the second function control is used to support the second function and the viewing angle rotation function.
  • The terminal receives the first viewing angle rotation operation triggered based on the first function control; according to the first viewing angle rotation operation, the terminal turns on the first function and the viewing angle rotation function of the first function control and switches the first viewing angle picture to the second viewing angle picture; when the first function control is in the on state, the terminal receives the second viewing angle rotation operation triggered based on the second function control; according to the second viewing angle rotation operation, the terminal turns off the viewing angle rotation function of the first function control, turns on the second function and the viewing angle rotation function of the second function control, and switches the second viewing angle picture to the third viewing angle picture.
  • while the viewing angle rotation function of the first function control is triggered, the above device can also respond to a viewing angle rotation operation triggered on the second function control; that is, the screen can respond to the viewing angle rotation operations of at least two touch points at the same time, which improves interaction efficiency during operation.
  • when the first function control is enabled and the viewing angle rotation function is triggered on the second function control, the terminal responds first to the viewing angle rotation operation triggered on the second function control, which ensures the orderliness and accuracy of the terminal's response to viewing angle rotation operations when multiple function controls with the viewing angle rotation function are enabled.
  • FIG. 16 shows a structural block diagram of a terminal 500 provided by an exemplary embodiment of the present invention.
  • the terminal 500 may be: a smartphone, a tablet computer, a Moving Picture Experts Group Audio Layer III (MP3) player, a Moving Picture Experts Group Audio Layer IV (MP4) player, a laptop, or a desktop computer.
  • the terminal 500 may also be called user equipment, portable terminal, laptop terminal, desktop terminal and other names.
  • the terminal 500 includes a processor 501 and a memory 502.
  • the processor 501 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on.
  • the processor 501 may be implemented in at least one hardware form of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA).
  • the processor 501 may also include a main processor and a coprocessor.
  • the main processor is a processor used to process data in the awake state, and is also called a central processing unit (CPU);
  • the coprocessor is a low-power processor used to process data in the standby state.
  • the processor 501 may be integrated with a graphics processor (Graphics Processing Unit, GPU), and the GPU is used to render and draw content that needs to be displayed on the display screen.
  • the processor 501 may also include an artificial intelligence (Artificial Intelligence, AI) processor, and the AI processor is used to process calculation operations related to machine learning.
  • the memory 502 may include one or more computer-readable storage media, which may be non-transitory.
  • the memory 502 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices.
  • the non-transitory computer-readable storage medium in the memory 502 is used to store at least one instruction, and the at least one instruction is executed by the processor 501 to implement the viewing angle rotation method provided by the method embodiments of this application.
  • the terminal 500 may optionally further include: a peripheral device interface 503 and at least one peripheral device.
  • the processor 501, the memory 502, and the peripheral device interface 503 may be connected by a bus or a signal line.
  • Each peripheral device can be connected to the peripheral device interface 503 through a bus, a signal line or a circuit board.
  • the peripheral device includes: at least one of a radio frequency circuit 504, a touch screen 505, a camera 506, an audio circuit 507, a positioning component 508, and a power supply 509.
  • the peripheral device interface 503 can be used to connect at least one peripheral device related to Input/Output (I/O) to the processor 501 and the memory 502.
  • in some embodiments, the processor 501, the memory 502, and the peripheral device interface 503 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 501, the memory 502, and the peripheral device interface 503 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
  • the radio frequency circuit 504 is used to receive and transmit radio frequency (RF) signals, also called electromagnetic signals.
  • the radio frequency circuit 504 communicates with a communication network and other communication devices through electromagnetic signals.
  • the radio frequency circuit 504 converts electrical signals into electromagnetic signals for transmission, or converts received electromagnetic signals into electrical signals.
  • the radio frequency circuit 504 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a user identity module card, and so on.
  • the radio frequency circuit 504 can communicate with other terminals through at least one wireless communication protocol.
  • the wireless communication protocol includes but is not limited to: World Wide Web, Metropolitan Area Network, Intranet, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area network and/or Wireless Fidelity (WiFi) networks.
  • the radio frequency circuit 504 may also include a circuit related to Near Field Communication (NFC), which is not limited in this application.
  • the display screen 505 is used to display a user interface (UI).
  • the UI can include graphics, text, icons, videos, and any combination thereof.
  • the display screen 505 also has the ability to collect touch signals on or above the surface of the display screen 505.
  • the touch signal can be input to the processor 501 as a control signal for processing.
  • the display screen 505 may also be used to provide virtual buttons and/or virtual keyboards, also called soft buttons and/or soft keyboards.
  • in some embodiments, there may be one display screen 505, arranged on the front panel of the terminal 500; in other embodiments, there may be at least two display screens 505, arranged on different surfaces of the terminal 500 or in a folding design; in still other embodiments, the display screen 505 may be a flexible display screen, arranged on a curved surface or folding surface of the terminal 500. The display screen 505 can even be set to a non-rectangular irregular shape, that is, a special-shaped screen.
  • the display screen 505 may be made of materials such as a liquid crystal display (Liquid Crystal Display, LCD), an organic light emitting diode (Organic Light-Emitting Diode, OLED).
  • the camera assembly 506 is used to capture images or videos.
  • the camera assembly 506 includes a front camera and a rear camera.
  • the front camera is set on the front panel of the terminal, and the rear camera is set on the back of the terminal.
  • the camera assembly 506 may also include a flash.
  • the flash can be a single color temperature flash or a dual color temperature flash. A dual color temperature flash refers to a combination of a warm light flash and a cold light flash, which can be used for light compensation under different color temperatures.
  • the audio circuit 507 may include a microphone and a speaker.
  • the microphone is used to collect sound waves of the user and the environment, and convert the sound waves into electrical signals that are input to the processor 501 for processing, or input to the radio frequency circuit 504 to implement voice communication. For the purpose of stereo collection or noise reduction, there may be multiple microphones, arranged at different parts of the terminal 500.
  • the microphone can also be an array microphone or an omnidirectional acquisition microphone.
  • the speaker is used to convert the electrical signal from the processor 501 or the radio frequency circuit 504 into sound waves.
  • the speaker can be a traditional membrane speaker or a piezoelectric ceramic speaker.
  • when the speaker is a piezoelectric ceramic speaker, it can not only convert an electrical signal into sound waves audible to humans, but also convert an electrical signal into sound waves inaudible to humans for purposes such as distance measurement.
  • the audio circuit 507 may also include a headphone jack.
  • the positioning component 508 is used to locate the current geographic location of the terminal 500 to implement navigation or location-based service (LBS).
  • the positioning component 508 may be a positioning component based on the Global Positioning System (GPS) of the United States, the BeiDou system of China, or the Galileo system of the European Union.
  • the power supply 509 is used to supply power to various components in the terminal 500.
  • the power source 509 may be alternating current, direct current, disposable batteries, or rechargeable batteries.
  • the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery.
  • a wired rechargeable battery is a battery charged through a wired line
  • a wireless rechargeable battery is a battery charged through a wireless coil.
  • the rechargeable battery can also be used to support fast charging technology.
  • the terminal 500 further includes one or more sensors 510.
  • the one or more sensors 510 include, but are not limited to: an acceleration sensor 511, a gyroscope sensor 512, a pressure sensor 513, a fingerprint sensor 514, an optical sensor 515, and a proximity sensor 516.
  • the acceleration sensor 511 can detect the magnitude of acceleration on the three coordinate axes of the coordinate system established by the terminal 500.
  • the acceleration sensor 511 can be used to detect the components of gravitational acceleration on three coordinate axes.
  • the processor 501 may control the touch screen 505 to display the user interface in a horizontal view or a vertical view according to the gravity acceleration signal collected by the acceleration sensor 511.
  • the acceleration sensor 511 can also be used for game or user motion data collection.
  • the gyroscope sensor 512 can detect the body direction and rotation angle of the terminal 500, and the gyroscope sensor 512 can cooperate with the acceleration sensor 511 to collect the user's 3D actions on the terminal 500.
  • the processor 501 can implement the following functions according to the data collected by the gyroscope sensor 512: motion sensing (for example, changing the UI according to the user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
  • the pressure sensor 513 may be arranged on the side frame of the terminal 500 and/or the lower layer of the touch screen 505.
  • when the pressure sensor 513 is arranged on the side frame of the terminal 500, the user's holding signal of the terminal can be detected, and the processor 501 performs left/right hand recognition or a shortcut operation according to the holding signal collected by the pressure sensor 513.
  • the processor 501 controls the operability controls on the UI interface according to the user's pressure operation on the touch display screen 505.
  • the operability control includes at least one of a button control, a scroll bar control, an icon control, and a menu control.
  • the fingerprint sensor 514 is used to collect the user's fingerprint.
  • the processor 501 can identify the user's identity according to the fingerprint collected by the fingerprint sensor 514, or the fingerprint sensor 514 can identify the user's identity according to the collected fingerprint. When the user's identity is recognized as a trusted identity, the processor 501 authorizes the user to perform related sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, and changing settings.
  • the fingerprint sensor 514 may be provided on the front, back or side of the terminal 500. When a physical button or a manufacturer logo is provided on the terminal 500, the fingerprint sensor 514 can be integrated with the physical button or the manufacturer logo.
  • the optical sensor 515 is used to collect the ambient light intensity.
  • the processor 501 may control the display brightness of the touch screen 505 according to the ambient light intensity collected by the optical sensor 515. Specifically, when the ambient light intensity is high, the display brightness of the touch screen 505 is increased; when the ambient light intensity is low, the display brightness of the touch screen 505 is decreased.
  • the processor 501 may also dynamically adjust the shooting parameters of the camera assembly 506 according to the ambient light intensity collected by the optical sensor 515.
  • the proximity sensor 516, also called a distance sensor, is usually provided on the front panel of the terminal 500.
  • the proximity sensor 516 is used to collect the distance between the user and the front of the terminal 500.
  • when the proximity sensor 516 detects that the distance between the user and the front of the terminal 500 gradually decreases, the processor 501 controls the touch display screen 505 to switch from the bright-screen state to the off-screen state; when the proximity sensor 516 detects that the distance between the user and the front of the terminal 500 gradually increases, the processor 501 controls the touch display screen 505 to switch from the off-screen state to the bright-screen state.
  • those skilled in the art can understand that the structure shown in FIG. 16 does not constitute a limitation on the terminal 500, and the terminal may include more or fewer components than shown, or combine some components, or adopt a different component arrangement.
  • the program can be stored in a computer-readable storage medium.
  • the medium may be a computer-readable storage medium included in the memory in the foregoing embodiment; or may be a computer-readable storage medium that exists alone and is not assembled into the terminal.
  • the computer-readable storage medium stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the viewing angle rotation method described in any one of FIGS.
  • the computer-readable storage medium may include: read only memory (Read Only Memory, ROM), random access memory (Random Access Memory, RAM), solid state drive (Solid State Drives, SSD), or optical disk.
  • the random access memory may include resistance random access memory (Resistance Random Access Memory, ReRAM) and dynamic random access memory (Dynamic Random Access Memory, DRAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A viewing angle rotation method, apparatus, device, and storage medium. The method includes: displaying a first viewing angle picture of an application (201), where a first function control and a second function control are superimposed on the first viewing angle picture; receiving a first viewing angle rotation operation triggered on the first function control (202); according to the first viewing angle rotation operation, enabling the first function and the viewing angle rotation function of the first function control, and switching the first viewing angle picture to a second viewing angle picture (203); while the first function control is enabled, receiving a second viewing angle rotation operation triggered on the second function control (204); and according to the second viewing angle rotation operation, disabling the viewing angle rotation function of the first function control, enabling the second function and the viewing angle rotation function of the second function control, and switching the second viewing angle picture to a third viewing angle picture (205). The method can respond to the viewing angle rotation operations of at least two touch points at the same time, improving interaction efficiency during operation.

Description

Viewing angle rotation method, apparatus, device, and storage medium
This application claims priority to Chinese Patent Application No. 201910683976.6, entitled "Viewing angle rotation method, apparatus, device, and storage medium", filed with the China National Intellectual Property Administration on July 26, 2019, the entire contents of which are incorporated herein by reference.
Technical Field
Embodiments of this application relate to the field of human-computer interaction, and in particular to viewing angle rotation technology.
Background
On mobile terminals such as smartphones and tablet computers, there are applications based on virtual environments, for example, First Person Shooting (FPS) games, Third Person Shooting (TPS) games, Multiplayer Online Battle Arena (MOBA) games, and so on.
In the above applications, the user aims at shooting targets and observes the environment by controlling the viewing angle rotation of a virtual character. Typically, a viewing angle rotation control is provided on the user interface of such an application, and the user controls the viewing angle rotation of the virtual character through movement operations (up, down, left, right, and so on) triggered on the viewing angle rotation control; during this process, the screen can respond to the viewing angle rotation operation of only one touch point at a time.
When the screen can respond to the viewing angle rotation operation of only one touch point at a time, once that single touch point disappears, switching the viewing angle again requires triggering a new touch point, which reduces interaction efficiency during operation.
Summary
Embodiments of this application provide a viewing angle rotation method, apparatus, device, and storage medium, which can improve interaction efficiency during viewing angle rotation operations. The technical solutions are as follows:
According to one aspect of this application, a viewing angle rotation method is provided, executed by a terminal, the method including:
displaying a first viewing angle picture of an application, the first viewing angle picture being a picture of a virtual environment observed from a first viewing angle direction of a virtual character in the virtual environment, with a first function control and a second function control superimposed on the first viewing angle picture, the first function control being used to support a first function and a viewing angle rotation function, and the second function control being used to support a second function and the viewing angle rotation function;
receiving a first viewing angle rotation operation triggered on the first function control;
according to the first viewing angle rotation operation, enabling the first function and the viewing angle rotation function of the first function control, and switching the first viewing angle picture to a second viewing angle picture, the second viewing angle picture being a picture of the virtual environment observed from a second viewing angle direction of the virtual character in the virtual environment;
while the first function control is enabled, receiving a second viewing angle rotation operation triggered on the second function control;
according to the second viewing angle rotation operation, disabling the viewing angle rotation function of the first function control, enabling the second function and the viewing angle rotation function of the second function control, and switching the second viewing angle picture to a third viewing angle picture, the third viewing angle picture being a picture of the virtual environment observed from a third viewing angle direction of the virtual character in the virtual environment.
According to another aspect of this application, a viewing angle rotation apparatus is provided, the apparatus including:
a display module, configured to display a first viewing angle picture of an application, the first viewing angle picture being a picture of a virtual environment observed from a first viewing angle direction of a virtual character in the virtual environment, with a first function control and a second function control further superimposed on the first viewing angle picture, the first function control being used to support a first function and a viewing angle rotation function, and the second function control being used to support a second function and the viewing angle rotation function;
a receiving module, configured to receive a first viewing angle rotation operation triggered on the first function control;
a processing module, configured to, according to the first viewing angle rotation operation, enable the first function and the viewing angle rotation function of the first function control and switch the first viewing angle picture to a second viewing angle picture, the second viewing angle picture being a picture of the virtual environment observed from a second viewing angle direction of the virtual character in the virtual environment;
the receiving module being further configured to receive, while the first function control is enabled, a second viewing angle rotation operation triggered on the second function control;
the processing module being further configured to, according to the second viewing angle rotation operation, disable the viewing angle rotation function of the first function control, enable the second function and the viewing angle rotation function of the second function control, and switch the second viewing angle picture to a third viewing angle picture, the third viewing angle picture being a picture of the virtual environment observed from a third viewing angle direction of the virtual character in the virtual environment.
According to another aspect of this application, a terminal is provided, the terminal including:
a memory;
a processor connected to the memory;
where the processor is configured to load and execute executable instructions to implement the viewing angle rotation method described in the above aspect and any of its optional embodiments.
According to another aspect of this application, a computer-readable storage medium is provided, the computer-readable storage medium storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the viewing angle rotation method described in the above aspect and any of its optional embodiments.
According to another aspect of this application, a computer program product is provided, including instructions that, when run on a computer, cause the computer to execute the viewing angle rotation method described in the above aspect and any of its optional embodiments.
The beneficial effects brought by the technical solutions provided in the embodiments of this application include at least the following:
The terminal displays a first viewing angle picture of an application, on which a first function control and a second function control are superimposed; the first function control supports a first function and a viewing angle rotation function, and the second function control supports a second function and the viewing angle rotation function. The terminal receives a first viewing angle rotation operation triggered on the first function control; according to it, enables the first function and the viewing angle rotation function of the first function control and switches the first viewing angle picture to a second viewing angle picture; while the first function control is enabled, receives a second viewing angle rotation operation triggered on the second function control; and according to it, disables the viewing angle rotation function of the first function control, enables the second function and the viewing angle rotation function of the second function control, and switches the second viewing angle picture to a third viewing angle picture.
While the viewing angle rotation function of the first function control is triggered, the above method can also respond to a viewing angle rotation operation triggered on the second function control; that is, the screen can respond to the viewing angle rotation operations of at least two touch points at the same time, improving interaction efficiency during operation. Moreover, when the first function control is enabled and the viewing angle rotation function is triggered on the second function control, the terminal responds first to the viewing angle rotation operation triggered on the second function control, ensuring the orderliness and accuracy of the terminal's response to viewing angle rotation operations when multiple function controls with the viewing angle rotation function are enabled.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of this application more clearly, the accompanying drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of this application, and a person of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
FIG. 1 is a schematic diagram of a camera model provided by an exemplary embodiment of this application;
FIG. 2 is a structural block diagram of a terminal provided by an exemplary embodiment of this application;
FIG. 3 is a structural block diagram of a terminal provided by another exemplary embodiment of this application;
FIG. 4 is a flowchart of a viewing angle rotation method provided by an exemplary embodiment of this application;
FIG. 5 is an interface schematic diagram of viewing angle rotation provided by an exemplary embodiment of this application;
FIG. 6 is a flowchart of a viewing angle rotation method provided by another exemplary embodiment of this application;
FIG. 7 is a flowchart of a viewing angle rotation method provided by another exemplary embodiment of this application;
FIG. 8 is a flowchart of a viewing angle rotation method provided by another exemplary embodiment of this application;
FIG. 9 is a flowchart of a method for setting a viewing angle rotation operation provided by an exemplary embodiment of this application;
FIG. 10 is an interface schematic diagram of a method for setting a viewing angle rotation operation provided by an exemplary embodiment of this application;
FIG. 11 is an interface schematic diagram of a method for setting a viewing angle rotation operation provided by another exemplary embodiment of this application;
FIG. 12 is a flowchart of a viewing angle rotation method provided by another exemplary embodiment of this application;
FIG. 13 is a flowchart of a viewing angle rotation method provided by another exemplary embodiment of this application;
FIG. 14 is a flowchart of a viewing angle rotation method provided by another exemplary embodiment of this application;
FIG. 15 is a block diagram of a viewing angle rotation apparatus provided by an exemplary embodiment of this application;
FIG. 16 is a structural block diagram of a terminal provided by an exemplary embodiment of this application.
Detailed Description
To make the objectives, technical solutions, and advantages of this application clearer, the implementations of this application are described in further detail below with reference to the accompanying drawings.
First, several terms involved in the embodiments of this application are explained:
Virtual environment: the virtual environment displayed (or provided) when an application runs on a terminal. The virtual environment may be a simulation of the real world, a semi-simulated and semi-fictional environment, or a purely fictional environment. The virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment; the following embodiments use a three-dimensional virtual environment as an example, but this is not limiting.
Virtual character: a movable object in the virtual environment. The movable object may be at least one of a virtual person, a virtual animal, and an anime character. Optionally, when the virtual environment is a three-dimensional virtual environment, the virtual character is a three-dimensional model created based on animation skeleton technology. Each virtual character has its own shape and volume in the three-dimensional virtual environment and occupies part of the space in the three-dimensional virtual environment.
Viewing angle direction: the observation direction when observing the virtual environment from the first-person perspective, third-person perspective, or another perspective of the virtual character. The other perspective may be a top-down perspective or any other possible perspective; the first-person perspective is the observation perspective of the first-person virtual character in the virtual environment, and the observed virtual picture does not include the virtual character itself; the third-person perspective is the observation perspective of the third-person virtual character in the virtual environment, and the observed virtual picture includes the virtual character itself. Optionally, in the embodiments of this application, the viewing angle direction is the direction observed through a camera model when the virtual character observes in the virtual environment.
Optionally, the camera model automatically follows the virtual character in the virtual environment; that is, when the position of the virtual character in the virtual environment changes, the position of the camera model in the virtual environment changes with it, and the camera model always stays within a preset distance range of the virtual character. Optionally, during automatic following, the relative positions of the camera model and the virtual character do not change.
Camera model: a three-dimensional model located around the virtual character in the three-dimensional virtual environment. When the first-person perspective is used, the camera model is located near or at the head of the virtual character; when the third-person perspective is used, the camera model may be located behind the virtual character and bound to it, or located at any position a preset distance from the virtual character; through the camera model, the virtual character in the three-dimensional virtual environment can be observed from different angles. Optionally, when the third-person perspective is a first-person over-the-shoulder perspective, the camera model is located behind the virtual character (for example, behind the head and shoulders of the virtual person). Optionally, besides the first-person and third-person perspectives, other perspectives exist, such as a top-down perspective; when the top-down perspective is used, the camera model may be located above the head of the virtual character, and the top-down perspective observes the virtual environment from the air. Optionally, the camera model is not actually displayed in the three-dimensional virtual environment; that is, the camera model is not displayed in the three-dimensional virtual environment shown on the user interface.
Taking the case where the camera model is located at any position a preset distance from the virtual character as an example: optionally, one virtual character corresponds to one camera model, and the camera model can rotate with the virtual character as the rotation center, for example, rotating around any point of the virtual character as the rotation center. During rotation, the camera model not only turns in angle but also shifts in displacement, and the distance between the camera model and the rotation center remains unchanged; that is, the camera model rotates on the surface of a sphere whose center is the rotation center, where any point of the virtual character may be the head or torso of the virtual character or any point around the virtual character, which is not limited in the embodiments of this application. Optionally, when the camera model observes the virtual character, the center of its viewing angle points in the direction from the point on the sphere surface where the camera model is located toward the sphere center.
Optionally, the camera model may also observe the virtual character at preset angles from different directions of the virtual character.
Schematically, referring to FIG. 1, a point in the virtual character 11 is determined as the rotation center 12, and the camera model rotates around the rotation center 12. Optionally, the camera model is configured with an initial position, which is a position above and behind the virtual character (for example, a position behind the head). Schematically, as shown in FIG. 1, the initial position is position 13; when the camera model rotates to position 14 or position 15, the viewing angle direction of the camera model changes with the rotation of the camera model.
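The constraint described above — the camera staying at a fixed distance from the rotation center while rotating on a sphere, with the view direction pointing from the camera toward the sphere's center — can be sketched as follows. This is a minimal illustration, not code from the patent; the function name `camera_position` and the yaw/pitch parameterization are assumptions.

```python
import math

def camera_position(center, radius, yaw_deg, pitch_deg):
    """Place a camera on a sphere of fixed radius around `center` (x, y, z).

    The distance between the camera and the rotation center stays constant
    as yaw/pitch change, matching the description above; the view direction
    is the vector from the camera toward the sphere's center.
    """
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    cx, cy, cz = center
    x = cx + radius * math.cos(pitch) * math.cos(yaw)
    y = cy + radius * math.sin(pitch)
    z = cz + radius * math.cos(pitch) * math.sin(yaw)
    look_dir = (cx - x, cy - y, cz - z)  # points back at the rotation center
    return (x, y, z), look_dir
```

Whatever yaw and pitch are chosen, the camera-to-center distance equals `radius`, which is exactly the "rotation on a sphere surface" property in the text.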
The terminal in this application may be a laptop, a mobile phone, a tablet computer, an e-book reader, an electronic game console, a Moving Picture Experts Group Audio Layer IV (MP4) player, and so on.
Regarding hardware structure, the above terminal includes a pressure touch screen 120, a memory 140, and a processor 160; refer to the structural block diagram of the terminal shown in FIG. 2.
The touch screen 120 may be a capacitive screen or a resistive screen. The touch screen 120 is used for interaction between the terminal and the user. In the embodiments of this application, the terminal obtains the viewing angle rotation operation triggered by the user through the touch screen 120.
The memory 140 may include one or more computer-readable storage media. The above computer storage media include at least one of Random Access Memory (RAM), Read Only Memory (ROM), and flash memory (Flash). An operating system 142 and an application 144 are installed in the memory 140.
The operating system 142 is the basic software that provides the application 144 with secure access to computer hardware. The operating system 142 may be Android or iOS.
The application 144 is an application supporting a virtual environment, and the virtual environment includes a virtual character. Optionally, the application 144 is an application supporting a three-dimensional virtual environment. The application 144 may be any one of a virtual reality application, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, a MOBA game, and a multiplayer gunfight survival game. Optionally, the application 144 may be a standalone application, such as a standalone 3D game program, or a networked online application.
The processor 160 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 160 is used to execute viewing angle rotation commands according to the viewing angle rotation operation of the virtual character received on the touch screen 120.
As shown in FIG. 3, in the embodiments of this application, the above terminal may further include a gyroscope 180. The gyroscope 180 is used to obtain the viewing angle rotation operation of the virtual character triggered by the user.
FIG. 4 is a flowchart of a viewing angle rotation method provided by an exemplary embodiment of this application. The method is illustrated as applied to the terminal shown in FIG. 2 or FIG. 3, and includes:
Step 201: display a first viewing angle picture of an application.
A first viewing angle picture of an application is displayed on the terminal. Optionally, the application may be at least one of a virtual reality application, a three-dimensional map application, a military simulation program, a TPS game, an FPS game, and a MOBA game.
Optionally, the first viewing angle picture is a picture of the virtual environment observed from a first viewing angle direction of a virtual character in the virtual environment. The first viewing angle direction may be a direction of observing the virtual environment from at least one of the first-person perspective, the third-person perspective, or another perspective. The other perspective may be a top-down perspective or any other possible perspective. The virtual environment picture corresponding to the first-person perspective does not include the virtual character itself; the virtual environment pictures corresponding to the third-person perspective and the top-down perspective include the virtual character itself; for example, when observing the virtual environment through the camera model, the three-dimensional model of the virtual character and the virtual firearm held by the virtual character can be seen.
Optionally, a first function control and a second function control are further superimposed on the first viewing angle picture; the first function control is used to support a first function and a viewing angle rotation function, and the second function control is used to support a second function and the viewing angle rotation function. The first function refers to a function other than the viewing angle rotation function; the second function likewise refers to a function other than the viewing angle rotation function. For example, the other function may be a scope function, a lean (peek) function, a fire function, and so on.
Optionally, the first function control includes at least one of a scope control, a lean control, and a fire control.
Optionally, the second function control includes at least one of a scope control, a lean control, and a fire control.
Optionally, the first function control is different from the second function control. For example, the first function control is a scope control, and the second function control is a lean control.
The scope control is used to open or close a sight, which assists in aiming at a target when shooting; for example, the sight may include a magnifying scope, a red dot sight, a holographic sight, and so on. The lean control is used to control the virtual character to lean its head out to shoot when there is cover, thereby reducing its own exposed area. The fire control is used to control firing, for example, controlling a virtual rifle to fire at a target.
Step 202: receive a first viewing angle rotation operation triggered on the first function control.
The terminal receives a first viewing angle rotation operation triggered on the first function control; optionally, the first viewing angle rotation operation includes any one of a click operation and a long-press operation.
Step 203: according to the first viewing angle rotation operation, enable the first function and the viewing angle rotation function of the first function control, and switch the first viewing angle picture to a second viewing angle picture.
The terminal rotates by a corresponding angle relative to the first viewing angle direction according to the first viewing angle rotation operation, rotating the first viewing angle picture to the second viewing angle picture. Optionally, the second viewing angle picture is a picture of the virtual environment observed from a second viewing angle direction of the virtual character in the virtual environment.
Optionally, the terminal generates a first order number for the first function control according to the first viewing angle rotation operation; the first order number is used to determine enabling or disabling of the viewing angle rotation operation of the first function control.
Schematically, when the first function control is a scope control, the terminal opens the sight according to the first viewing angle rotation operation, enables the viewing angle rotation operation, and switches the first viewing angle picture to the second viewing angle picture.
When the first function control is a lean control, the terminal enables the lean function according to the first viewing angle rotation operation, enables the viewing angle rotation operation, and switches the first viewing angle picture to the second viewing angle picture.
When the first function control is a fire control, the terminal fires according to the first viewing angle rotation operation, enables the viewing angle rotation operation, and switches the first viewing angle picture to the second viewing angle picture. It should be noted that the fire control supports two firing modes: first, fire on press; second, fire on release. Therefore, the terminal firing according to the first viewing angle rotation operation may mean firing when the fire control is pressed, or firing when the fire control is pressed and then released.
Step 204: while the first function control is enabled, receive a second viewing angle rotation operation triggered on the second function control.
Optionally, when both the first function and the viewing angle rotation function of the first function control are enabled, the terminal receives a second viewing angle rotation operation triggered on the second function control.
Optionally, the second viewing angle rotation operation includes any one of a click operation and a long-press operation.
Step 205: according to the second viewing angle rotation operation, disable the viewing angle rotation function of the first function control, enable the second function and the viewing angle rotation function of the second function control, and switch the second viewing angle picture to a third viewing angle picture.
The terminal rotates by a corresponding angle relative to the second viewing angle direction according to the second viewing angle rotation operation, rotating the second viewing angle picture to the third viewing angle picture. Optionally, the third viewing angle picture is a picture of the virtual environment observed from a third viewing angle direction of the virtual character in the virtual environment.
Schematically, when the second function control is a scope control, the terminal opens the sight according to the second viewing angle rotation operation, enables the viewing angle rotation operation, and switches the second viewing angle picture to the third viewing angle picture.
When the second function control is a lean control, the terminal enables the lean function according to the second viewing angle rotation operation, enables the viewing angle rotation operation, and switches the second viewing angle picture to the third viewing angle picture.
When the second function control is a fire control, the terminal fires according to the second viewing angle rotation operation, enables the viewing angle rotation operation, and switches the second viewing angle picture to the third viewing angle picture.
At this time, the first function of the first function control remains enabled; for example, the sight remains open, or the lean function remains enabled. It should be noted that if the first function control is a fire control in a non-continuous firing state, the fire control remains enabled after one shot, but no bullet is fired.
Optionally, the schematic steps by which the terminal enables the viewing angle rotation function of the second function control are as follows:
1) According to the second viewing angle rotation operation, generate a second order number for the second function control.
The second order number is used to determine enabling or disabling of the viewing angle rotation function of the second function control.
Optionally, the first function control includes a first function control in the enabled state, and the enabled first function control corresponds to a first order number; the schematic steps for generating the second order number are as follows:
a) obtain the maximum order number from the first order numbers according to the second viewing angle rotation operation;
b) determine the order number obtained by adding one to the maximum order number as the second order number.
That is, if the maximum order number is x, the second order number is x+1; for example, if the first order numbers include 1 and 2, the maximum order number is 2, and the second order number is determined to be 3.
2) When the second order number is greater than the first order number, disable the viewing angle rotation function of the first function control, and enable the second function and the viewing angle rotation function of the second function control; where the first order number is the order number of the first function control.
The terminal determines whether the second order number is greater than the first order number; when the second order number is greater than the first order number, the terminal disables the viewing angle rotation function of the first function control and enables the second function and the viewing angle rotation function of the second function control.
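The order-number logic in steps 1) and 2) above can be sketched as follows. This is a minimal illustration under stated assumptions: the class name `ViewRotationManager`, its fields, and the string control names are all hypothetical, not from the patent.

```python
class ViewRotationManager:
    """Sketch of the order-number logic: each pressed control gets an
    order number one greater than the current maximum, and only the
    control holding the largest order number owns the viewing angle
    rotation function (the previous owner's rotation is closed)."""

    def __init__(self):
        self.order = {}            # control name -> order number
        self.rotation_owner = None # control currently owning rotation

    def press(self, control):
        max_order = max(self.order.values(), default=0)
        self.order[control] = max_order + 1  # second order number = max + 1
        # The new number is necessarily the largest, so this control
        # takes over rotation and the previous owner's rotation closes.
        self.rotation_owner = control
        return self.order[control]
```

For example, pressing a scope control, then a lean control, then a fire control yields order numbers 1, 2, 3, and rotation ownership moves to each newly pressed control in turn.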
In summary, in the viewing angle rotation method provided by this embodiment, the terminal displays a first viewing angle picture of an application, on which a first function control and a second function control are superimposed; the first function control supports a first function and a viewing angle rotation function, and the second function control supports a second function and the viewing angle rotation function. The terminal receives a first viewing angle rotation operation triggered on the first function control; according to it, enables the first function and the viewing angle rotation function of the first function control and switches the first viewing angle picture to a second viewing angle picture; while the first function control is enabled, receives a second viewing angle rotation operation triggered on the second function control; and according to it, disables the viewing angle rotation function of the first function control, enables the second function and the viewing angle rotation function of the second function control, and switches the second viewing angle picture to a third viewing angle picture.
While the viewing angle rotation function of the first function control is triggered, the above method can also respond to a viewing angle rotation operation triggered on the second function control; that is, the screen can respond to the viewing angle rotation operations of at least two touch points at the same time, improving interaction efficiency during operation. Moreover, when the first function control is enabled and the viewing angle rotation function is triggered on the second function control, the terminal responds first to the operation triggered on the second function control, ensuring the orderliness and accuracy of the terminal's response to viewing angle rotation operations when multiple function controls with the viewing angle rotation function are enabled.
In games such as FPS, TPS, and MOBA games, providing multiple function controls with the viewing angle rotation function ensures that players can freely complete viewing angle rotation operations in different states, offering more flexibility and room for maneuver in combat.
Schematically, FIG. 5 shows the switching of the viewing angle rotation function between the first function control and the second function control. Taking the first function controls as a lean control and a scope control, and the second function control as a fire control: the user interface 21 at the first viewing angle includes a lean control 22, a scope control 23, and a fire control 24. The terminal receives a first viewing angle rotation operation on the lean control 22 and rotates the first viewing angle picture to the second viewing angle picture, as in the user interface 25 at the second viewing angle, where the second viewing angle has moved a distance L1 to the right relative to the first viewing angle. While the lean control 22 is triggered, the terminal receives a second viewing angle rotation operation on the fire button 24 and rotates the second viewing angle picture to the third viewing angle picture, as in the user interface 26 at the third viewing angle, where the third viewing angle has moved a distance L2 to the right relative to the first viewing angle.
Based on FIG. 4, if the first function controls include a first function control in the enabled state, then when the viewing angle rotation function of the second function control is disabled, the terminal can determine one function control from the first function controls to take over the viewing angle rotation operation. Schematically, steps 206 to 208 are added after step 205, as shown in FIG. 6, with the steps as follows:
Step 206: determine whether the second viewing angle rotation operation on the second function control has ended.
The terminal determines whether the second viewing angle rotation operation on the second function control has ended. When the terminal ends the second viewing angle rotation operation on the second function control, step 207 is executed; when it has not ended, step 208 is executed.
Optionally, the terminal uses a drag operation for the viewing angle rotation of the virtual character; that is, the second viewing angle rotation operation further includes a drag operation.
Schematically, the second viewing angle rotation operation includes a click operation and a drag operation; when the terminal ends the drag operation and the click operation on the second function control, step 207 is executed; otherwise, step 208 is executed. Alternatively, the second viewing angle rotation operation includes a long-press operation and a drag operation; when the terminal ends the drag operation and the long-press operation on the second function control, step 207 is executed; otherwise, step 208 is executed.
The end of the second viewing angle rotation operation on the second function control means that the viewing angle rotation function of the second function control is disabled.
Step 207: determine the i-th enabled first function control from the first function controls in the enabled state, and enable the viewing angle rotation function of the i-th enabled first function control.
When the viewing angle rotation function of the second function control is disabled and a first function control is in the enabled state, the terminal determines the i-th enabled first function control from the first function controls in the enabled state and enables the viewing angle rotation function of the i-th enabled first function control, where i is a positive integer.
Optionally, when the first order number of the i-th enabled first function control is the maximum order number, the viewing angle rotation function of the i-th enabled first function control is enabled.
That is, the n enabled first function controls correspond to n first order numbers; the maximum order number is determined from the n first order numbers, the i-th enabled first function control corresponding to the maximum order number is selected, and its viewing angle rotation function is enabled, where n is a positive integer.
Step 208: continue executing the second viewing angle rotation operation on the second function control.
In summary, in the viewing angle rotation method provided by this embodiment, when the second viewing angle rotation operation on the second function control ends and multiple first function controls are in the enabled state, one first function control is determined to automatically take over the viewing angle rotation, which maximally ensures the orderliness and accuracy of the terminal's response to viewing angle rotation operations when there are multiple function controls with the viewing angle rotation function. Moreover, when one viewing angle rotation touch point disappears, another viewing angle rotation touch point can take over the viewing angle rotation response, which also avoids picture stutter when a viewing angle rotation touch point is triggered again.
Schematically, the embodiment shown in FIG. 6 above can be described in two parts according to the operation state on the second function control: the press process and the release process. FIG. 7 is a schematic illustration of the press process on the second function control:
Step 31: start.
Step 32: the terminal receives a press operation on the second function control having the viewing angle rotation function (the triggering of the second viewing angle rotation operation).
Step 33: the terminal enables the viewing angle rotation function of the second function control according to the press operation, and marks the second function control with a second order number.
While enabling the viewing angle rotation function of the second function control according to the press operation, the terminal marks the second function control with a second order number. Schematically, if the previously triggered function control with the viewing angle rotation function is marked with order number x, the second order number is x+1; that is, before the second function control is triggered, x is the maximum order number; x is a positive integer.
Step 34: the terminal determines whether any first function control with the viewing angle rotation function is in the pressed state.
If the terminal determines that a first function control with the viewing angle rotation function is in the pressed state, step 35 is executed; otherwise, step 36 is executed.
Step 35: the terminal disables the viewing angle rotation function of the first function control.
Step 36: end.
The end here refers to ending the search for a first function control in the pressed state, while the viewing angle rotation function of the second function control remains enabled.
As shown in FIG. 8, a schematic illustration of the release process on the second function control:
Step 41: start.
Step 42: the terminal receives a release operation on the second function control having the viewing angle rotation function (the cancellation of the second viewing angle rotation operation).
Step 43: the terminal determines whether the viewing angle rotation function of the second function control is enabled.
When the terminal determines that the viewing angle rotation function of the second function control is enabled, step 47 is executed; otherwise, step 44 is executed.
Step 44: the terminal determines whether any first function control with the viewing angle rotation function is in the pressed state.
If the terminal determines that a first function control with the viewing angle rotation function is in the pressed state, step 45 is executed; otherwise, step 47 is executed.
Step 45: the terminal finds the first function control corresponding to the maximum order number.
The terminal determines, from the n first function controls in the pressed state, the first function control corresponding to the maximum order number.
Step 46: the terminal enables the viewing angle rotation function of the first function control corresponding to the maximum order number.
Step 47: end.
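The handover in steps 44 to 46 — picking, among the controls still pressed, the one with the maximum order number and re-enabling its rotation function — can be sketched as follows. The function name and the dict representation of "control name to order number" are illustrative assumptions, not from the patent.

```python
def handover_rotation(pressed_controls):
    """Sketch of steps 44-46 above: when the control owning the viewing
    angle rotation is released, find among the still-pressed controls the
    one with the largest order number and hand the rotation response to
    it. Returns None when no control remains pressed (step 47: end).

    `pressed_controls` maps control name -> order number.
    """
    if not pressed_controls:
        return None
    # The most recently pressed control carries the maximum order number,
    # so max() keyed by order number selects the new rotation owner.
    return max(pressed_controls, key=pressed_controls.get)
```

So if a scope control (order 1) and a lean control (order 2) are still pressed when the fire control is released, the lean control takes over the rotation response.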
In some embodiments, the user can customize the trigger mode of the viewing angle rotation operation; for example, the trigger mode of the viewing angle rotation operation can be defined as a click operation, a long-press operation, a touch operation, and so on. The customization of the trigger mode of the viewing angle rotation operation is described, as shown in FIG. 9, with the following steps:
Step 301: display a settings interface of the application.
A settings interface of the application is displayed on the terminal; the settings interface includes at least two mode setting controls, and the mode setting controls are used to set the trigger mode of the viewing angle rotation operation.
Optionally, the mode setting controls include at least two of a click mode setting control, a long-press mode setting control, and a mixed mode setting control.
The viewing angle rotation operation corresponding to the click mode setting control is a click operation; the viewing angle rotation operation corresponding to the long-press mode setting control is a long-press operation; the viewing angle rotation operation corresponding to the mixed mode setting control is a touch operation, whose duration is used to determine the enabling or disabling of the second function of the second function control.
Optionally, the viewing angle rotation operation includes any one of the first viewing angle rotation operation and the second viewing angle rotation operation.
Step 302: receive a selection operation triggered on the settings interface.
The selection operation is used to select, among the at least two mode setting controls, the mode setting control corresponding to the target trigger mode. Optionally, the selection operation may include at least one of a single-click operation, a double-click operation, a long-press operation, and a swipe operation.
Step 303: according to the selection operation, determine the trigger mode of the viewing angle rotation operation as the target trigger mode.
The terminal determines the trigger mode of the viewing angle rotation operation as the target trigger mode according to the selection operation; optionally, the target trigger mode includes at least two of a click operation, a long-press operation, and a touch operation.
Schematically, as shown in FIG. 10, taking the function control being a scope control as an example, the settings interface 51 of the application includes setting buttons for three scope modes: a click mode setting control 52, a long-press mode setting control 53, and a mixed mode setting control 54. The user can select any one of the three scope modes.
As shown in FIG. 11, taking the function control being a lean control as an example, the settings interface 55 of the application includes setting buttons for three lean modes: a click mode setting control 56, a long-press mode setting control 57, and a mixed mode setting control 58. The user can select any one of the three lean modes.
In summary, with the method for setting the viewing angle rotation operation provided by this embodiment, the user can customize the trigger mode of the viewing angle rotation operation to suit the user's own shooting habits and operating characteristics, meeting the independent operating needs of users at different levels, enriching the user's choices, and providing a more personalized combat experience.
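The selection flow in steps 301 to 303 amounts to mapping a chosen mode setting control to a trigger mode. A minimal sketch follows; the dict-based representation and the string identifiers are assumptions for illustration, while the three mode names come from the description.

```python
# Maps each mode setting control to the trigger mode it configures:
# click mode -> click operation, long-press mode -> long-press operation,
# mixed mode -> touch operation (duration decides click vs long press).
MODE_SETTINGS = {
    "click_mode": "click",
    "long_press_mode": "long_press",
    "mixed_mode": "touch",
}

def select_trigger_mode(selected_control):
    """Return the target trigger mode for the chosen mode setting control
    (step 303: the selection fixes the rotation operation's trigger mode)."""
    return MODE_SETTINGS[selected_control]
```

For example, selecting the mixed mode setting control configures the rotation operation to be triggered by a touch operation whose duration is later classified.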
It should also be noted that the first function control further includes the first function, and the second function control further includes the second function; correspondingly, the first viewing angle rotation operation also controls the enabling and disabling of the first function, and the second viewing angle rotation operation also controls the enabling and disabling of the second function. The following takes the second viewing angle rotation operation controlling the enabling and disabling of the second function of the second function control as an example:
In some embodiments, the target trigger mode includes a click operation; the terminal enables the second function of the second function control according to the click operation; when the click operation ends, the second function of the second function control remains enabled.
Optionally, while the second function of the second function control is enabled, the terminal disables the second function of the second function control according to another click operation on the second function control.
Schematically, as shown in FIG. 12, taking the second function control being a scope control and the second function being opening the scope as an example, in click mode, the process by which the scope control controls opening and closing the scope is as follows:
Step 61: start.
Step 62: the terminal receives a click operation on the scope control.
Step 63: the terminal determines whether the scope function is enabled.
When the terminal determines that the scope function is enabled, step 64 is executed; otherwise, step 65 is executed.
Step 64: the terminal closes the scope.
Step 65: the terminal opens the scope.
In some embodiments, the target trigger mode includes a long-press operation; the terminal enables the second function of the second function control according to the long-press operation; when the long-press operation ends, the second function of the second function control remains enabled.
Optionally, while the second function of the second function control is enabled, the terminal disables the second function of the second function control according to a click operation on the second function control.
Schematically, as shown in FIG. 13, taking the second function control being a scope control and the second function being opening the scope as an example, in long-press mode, the process by which the scope control controls opening and closing the scope is as follows:
Step 71: start.
Step 72: the terminal receives a long-press operation on the scope control.
Step 73: the terminal opens the scope.
Step 74: the terminal determines whether the long-press operation on the scope control has ended.
When the terminal determines that the long-press operation on the scope control has ended, step 75 is executed; otherwise, step 76 is executed.
Step 75: the terminal closes the scope.
Step 76: the terminal keeps the scope open.
In some embodiments, the target trigger mode includes a touch operation; the terminal enables the second function of the second function control according to the touch operation; when the touch operation ends, the duration of the touch operation is obtained; when the duration is greater than the time threshold, the second function of the second function control remains enabled. The time threshold is used to determine whether to keep the second function of the second function control enabled after the touch operation ends.
Optionally, when the duration is less than or equal to the time threshold, the second function of the second function control is disabled.
Optionally, while the second function of the second function control is enabled, the terminal disables the second function of the second function control according to a click operation on the second function control.
Schematically, as shown in FIG. 14, taking the second function control being a scope control and the second function being opening the scope as an example, in mixed mode, the process by which the scope control controls opening and closing the scope is as follows:
Step 81: start.
Step 82: the terminal receives a touch operation on the scope control.
Step 83: the terminal determines whether the scope function is enabled.
When the terminal determines that the scope function is enabled, step 84 is executed; otherwise, step 85 is executed.
Step 84: the terminal closes the scope.
Step 85: the terminal opens the scope.
Step 86: the terminal determines whether the operation duration at the end of the touch operation is greater than the time threshold.
When the terminal determines that the operation duration at the end of the touch operation is greater than the time threshold, step 87 is executed; otherwise, step 88 is executed.
Schematically, the time threshold may be 0.2 seconds (s); when the terminal determines that the operation duration at the end of the touch operation is greater than 0.2 s, step 87 is executed; otherwise, step 88 is executed.
Step 87: the terminal determines that the touch operation is a long-press operation and keeps the scope open.
Step 88: the terminal determines that the touch operation is a click operation and closes the scope.
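The mixed-mode classification in steps 86 to 88 can be sketched as a single comparison against the time threshold. The function name is an assumption; the 0.2 s default comes from the example threshold in the text.

```python
def classify_touch(duration_s, time_threshold_s=0.2):
    """Sketch of steps 86-88 (mixed mode): a touch whose duration exceeds
    the time threshold is treated as a long press, so the scope stays
    open; otherwise it is treated as a click, so the scope closes."""
    if duration_s > time_threshold_s:
        return "long_press"  # keep the scope open (step 87)
    return "click"           # close the scope (step 88)
```

Note that, per the text, a duration exactly equal to the threshold is not "greater than" it and therefore falls on the click side.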
It should also be noted that the process by which the first viewing angle rotation operation controls the enabling and disabling of the first function of the first function control is similar to the process by which the second viewing angle rotation operation controls the enabling and disabling of the second function of the second function control, and is not repeated here.
It should also be noted that, in some embodiments, the response logic of viewing angle rotation when the first function control and the second function control are triggered is customized by the user. Schematically, the terminal controls the viewing angle rotation of the virtual character through the first function control according to the custom logic; or controls the viewing angle rotation of the virtual character through the second function control according to the custom logic; where the custom logic is the user-defined response logic for viewing angle rotation operations when the first function control and the second function control are triggered.
For example, the custom logic is: when the first function control and the second function control are triggered at the same time, the terminal enables the viewing angle rotation function of the first function control and disables the viewing angle rotation function of the second function control. Therefore, when the first function control and the second function control are triggered at the same time, the terminal controls the viewing angle rotation of the virtual character through the first function control.
In some embodiments, the terminal also controls the viewing angle rotation of the virtual character through a gyroscope. When the first function control and/or the second control is triggered and the terminal receives a rotation operation of the terminal itself, it controls the viewing angle rotation of the virtual character through the gyroscope.
In summary, the viewing angle rotation method provided by this embodiment offers the user the ability to customize the response logic of viewing angle rotation, allowing the user to define control operation logic with the viewing angle rotation function that better fits the user's own operating habits, improving the user's operating experience in combat.
In addition, the viewing angle rotation of the virtual character can also be controlled through the gyroscope, so that while rotating the viewing angle of the virtual character, the user can also control other operations on the virtual character, improving interaction efficiency in combat.
FIG. 15 shows a viewing angle rotation apparatus provided by an exemplary embodiment of this application. The apparatus may form part or all of a terminal through software, hardware, or a combination of the two, and includes:
a display module 401, configured to display a first viewing angle picture of an application, the first viewing angle picture being a picture of a virtual environment observed from a first viewing angle direction of a virtual character in the virtual environment, with a first function control and a second function control superimposed on the first viewing angle picture, the first function control being used to support a first function and a viewing angle rotation function, and the second function control being used to support a second function and the viewing angle rotation function;
a receiving module 402, configured to receive a first viewing angle rotation operation triggered on the first function control;
a processing module 403, configured to, according to the first viewing angle rotation operation, enable the first function and the viewing angle rotation function of the first function control and switch the first viewing angle picture to a second viewing angle picture, the second viewing angle picture being a picture of the virtual environment observed from a second viewing angle direction of the virtual character in the virtual environment;
the receiving module 402 being further configured to receive, while the first function control is enabled, a second viewing angle rotation operation triggered on the second function control;
the processing module 403 being further configured to, according to the second viewing angle rotation operation, disable the viewing angle rotation function of the first function control, enable the second function and the viewing angle rotation function of the second function control, and switch the second viewing angle picture to a third viewing angle picture, the third viewing angle picture being a picture of the virtual environment observed from a third viewing angle direction of the virtual character in the virtual environment.
In some embodiments, the processing module 403 includes:
a generating submodule 4032, configured to generate a second order number for the second function control according to the second viewing angle rotation operation;
a processing submodule 4034, configured to, when the second order number is greater than the first order number, disable the viewing angle rotation function of the first function control and enable the second function and the viewing angle rotation function of the second function control; where the first order number is the order number of the first function control.
In some embodiments, the first function control includes a first function control in the enabled state, and the enabled first function control corresponds to a first order number;
the generating submodule 4032 is configured to obtain the maximum order number from the first order numbers according to the second viewing angle rotation operation, and determine the order number obtained by adding one to the maximum order number as the second order number.
In some embodiments, the first function control includes a first function control in the enabled state;
the processing submodule 4034 is configured to, when the second viewing angle rotation operation based on the second function control ends, determine the i-th enabled first function control from the first function controls in the enabled state and enable the viewing angle rotation function of the i-th enabled first function control, where i is a positive integer.
In some embodiments, the enabled first function control corresponds to a first order number;
the processing submodule 4034 is configured to, when the first order number of the i-th enabled first function control is the maximum order number, enable the viewing angle rotation function of the i-th enabled first function control.
In some embodiments,
the display module 401 is configured to display a settings interface of the application, the settings interface including at least two mode setting controls, which are used to set the trigger mode of the viewing angle rotation operation;
the receiving module 402 is configured to receive a selection operation triggered on the settings interface, the selection operation being used to select, among the at least two mode setting controls, the mode setting control corresponding to the target trigger mode;
a determining module 404 is configured to determine the trigger mode of the viewing angle rotation operation as the target trigger mode according to the selection operation;
where the viewing angle rotation operation includes any one of the first viewing angle rotation operation and the second viewing angle rotation operation.
In some embodiments, the viewing angle rotation operation includes the second viewing angle rotation operation, and the target trigger mode includes a click operation;
the processing module 403 is configured to enable the second function of the second function control according to the click operation, and keep the second function of the second function control enabled when the click operation ends.
In some embodiments, the viewing angle rotation operation includes the second viewing angle rotation operation, and the target trigger mode includes a long-press operation;
the processing module 403 is configured to enable the second function of the second function control according to the long-press operation, and keep the second function of the second function control enabled when the long-press operation ends.
In some embodiments, the viewing angle rotation operation includes the second viewing angle rotation operation, and the target trigger mode includes a touch operation;
the processing module 403 is configured to enable the second function of the second function control according to the touch operation; obtain the duration of the touch operation when the touch operation ends; and keep the second function of the second function control enabled when the duration is greater than the time threshold.
In some embodiments, the processing module 403 is configured to disable the second function of the second function control when the duration is less than or equal to the time threshold.
In some embodiments, the processing module 403 is configured to control the viewing angle rotation of the virtual character through the first function control according to the custom logic, or control the viewing angle rotation of the virtual character through the second function control according to the custom logic; where the custom logic is the user-defined response logic for viewing angle rotation operations when the first function control and the second function control are triggered.
In summary, with the viewing angle rotation apparatus provided by this embodiment, the terminal displays a first viewing angle picture of an application, on which a first function control and a second function control are superimposed; the first function control supports a first function and a viewing angle rotation function, and the second function control supports a second function and the viewing angle rotation function. The terminal receives a first viewing angle rotation operation triggered on the first function control; according to it, enables the first function and the viewing angle rotation function of the first function control and switches the first viewing angle picture to a second viewing angle picture; while the first function control is enabled, receives a second viewing angle rotation operation triggered on the second function control; and according to it, disables the viewing angle rotation function of the first function control, enables the second function and the viewing angle rotation function of the second function control, and switches the second viewing angle picture to a third viewing angle picture.
While the viewing angle rotation function of the first function control is triggered, the above apparatus can also respond to a viewing angle rotation operation triggered on the second function control; that is, the screen can respond to the viewing angle rotation operations of at least two touch points at the same time, improving interaction efficiency during operation. Moreover, when the first function control is enabled and the viewing angle rotation function is triggered on the second function control, the terminal responds first to the operation triggered on the second function control, ensuring the orderliness and accuracy of the terminal's response to viewing angle rotation operations when multiple function controls with the viewing angle rotation function are enabled.
In games such as FPS, TPS, and MOBA games, providing multiple function controls with the viewing angle rotation function ensures that players can freely complete viewing angle rotation operations in different states, offering more flexibility and room for maneuver in combat.
图16示出了本发明一个示例性实施例提供的终端500的结构框图。该终端500可以是:智能手机、平板电脑、动态影像专家压缩标准音频层面3(Moving Picture Experts Group Audio Layer III,MP3)播放器、动态影像专家压缩标准音频层面4(Moving Picture Experts Group Audio Layer IV,MP4)播放器、笔记本电脑或台式电脑。终端500还可能被称为用户设备、便携式终端、膝上型终端、台式终端等其他名称。
通常,终端500包括有:处理器501和存储器502。
处理器501可以包括一个或多个处理核心,比如4核心处理器、8核心处理器等。处理器501可以采用数字信号处理(Digital Signal Processing,DSP)、现场可编程门阵列(Field-Programmable Gate Array,FPGA)、可编程逻辑阵列(Programmable Logic Array,PLA)中的至少一种硬件形式来实现。处理器501也可以包括主处理器和协处理器,主处理器是用于对在唤醒状态下的数据进行处理的处理器,也称中央处理器(Central Processing Unit,CPU);协处理器是用于对在待机状态下的数据进行处理的低功耗处理器。在一些实施例中,处理器501可以在集成有图像处理器(Graphics Processing Unit,GPU),GPU用于负责显示屏所需要显示的内容的渲染和绘制。一些实施例中,处理器501还可以包括人工智能(Artificial  Intelligence,AI)处理器,该AI处理器用于处理有关机器学习的计算操作。
存储器502可以包括一个或多个计算机可读存储介质,该计算机可读存储介质可以是非暂态的。存储器502还可包括高速随机存取存储器,以及非易失性存储器,比如一个或多个磁盘存储设备、闪存存储设备。在一些实施例中,存储器502中的非暂态的计算机可读存储介质用于存储至少一个指令,该至少一个指令用于被处理器501所执行以实现本申请中方法实施例提供的视角转动的方法。
In some embodiments, the terminal 500 may optionally further include a peripheral device interface 503 and at least one peripheral device. The processor 501, the memory 502, and the peripheral device interface 503 may be connected by a bus or a signal line. Each peripheral device may be connected to the peripheral device interface 503 by a bus, a signal line, or a circuit board. Specifically, the peripheral devices include at least one of a radio frequency circuit 504, a touch display screen 505, a camera 506, an audio circuit 507, a positioning component 508, and a power supply 509.
The peripheral device interface 503 may be used to connect at least one peripheral device related to input/output (I/O) to the processor 501 and the memory 502. In some embodiments, the processor 501, the memory 502, and the peripheral device interface 503 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 501, the memory 502, and the peripheral device interface 503 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 504 is used to receive and transmit radio frequency (RF) signals, also referred to as electromagnetic signals. The radio frequency circuit 504 communicates with a communication network and other communication devices through electromagnetic signals. The radio frequency circuit 504 converts electrical signals into electromagnetic signals for transmission, or converts received electromagnetic signals into electrical signals. Optionally, the radio frequency circuit 504 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so on. The radio frequency circuit 504 may communicate with other terminals through at least one wireless communication protocol. The wireless communication protocol includes, but is not limited to, the World Wide Web, a metropolitan area network, an intranet, generations of mobile communication networks (2G, 3G, 4G, and 5G), a wireless local area network, and/or a Wireless Fidelity (WiFi) network. In some embodiments, the radio frequency circuit 504 may further include circuits related to near field communication (NFC), which is not limited in the present application.
The display screen 505 is used to display a user interface (UI). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 505 is a touch display screen, the display screen 505 is also capable of collecting touch signals on or above its surface. The touch signal may be input to the processor 501 as a control signal for processing. In this case, the display screen 505 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 505, disposed on the front panel of the terminal 500; in other embodiments, there may be at least two display screens 505, disposed on different surfaces of the terminal 500 or in a folded design; in still other embodiments, the display screen 505 may be a flexible display screen disposed on a curved or folded surface of the terminal 500. The display screen 505 may even be set in a non-rectangular irregular shape, that is, a shaped screen. The display screen 505 may be made of materials such as a liquid crystal display (LCD) or an organic light-emitting diode (OLED).
The camera assembly 506 is used to capture images or videos. Optionally, the camera assembly 506 includes a front camera and a rear camera. Generally, the front camera is disposed on the front panel of the terminal, and the rear camera is disposed on the back of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so as to fuse the main camera and the depth-of-field camera to implement a background blur function, fuse the main camera and the wide-angle camera to implement panoramic shooting and virtual reality (VR) shooting functions, or implement other fusion shooting functions. In some embodiments, the camera assembly 506 may further include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash is a combination of a warm-light flash and a cold-light flash and can be used for light compensation at different color temperatures.
The audio circuit 507 may include a microphone and a speaker. The microphone is used to collect sound waves from the user and the environment, convert the sound waves into electrical signals, and input them to the processor 501 for processing or to the radio frequency circuit 504 for voice communication. For stereo collection or noise reduction, there may be multiple microphones disposed at different parts of the terminal 500. The microphone may also be an array microphone or an omnidirectional microphone. The speaker is used to convert electrical signals from the processor 501 or the radio frequency circuit 504 into sound waves. The speaker may be a conventional thin-film speaker or a piezoelectric ceramic speaker. A piezoelectric ceramic speaker can convert electrical signals not only into sound waves audible to humans but also into sound waves inaudible to humans for purposes such as ranging. In some embodiments, the audio circuit 507 may further include a headphone jack.
The positioning component 508 is used to locate the current geographic position of the terminal 500 to implement navigation or a location-based service (LBS). The positioning component 508 may be a positioning component based on the Global Positioning System (GPS) of the United States, the BeiDou system of China, or the GLONASS system of Russia.
The power supply 509 is used to supply power to the components in the terminal 500. The power supply 509 may be an alternating current, a direct current, a disposable battery, or a rechargeable battery. When the power supply 509 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. A wired rechargeable battery is charged through a wired line, and a wireless rechargeable battery is charged through a wireless coil. The rechargeable battery may also support fast charging technology.
In some embodiments, the terminal 500 further includes one or more sensors 510. The one or more sensors 510 include, but are not limited to, an acceleration sensor 511, a gyroscope sensor 512, a pressure sensor 513, a fingerprint sensor 514, an optical sensor 515, and a proximity sensor 516.
The acceleration sensor 511 may detect the magnitudes of acceleration on the three coordinate axes of a coordinate system established with the terminal 500. For example, the acceleration sensor 511 may be used to detect the components of gravitational acceleration on the three coordinate axes. The processor 501 may control the touch display screen 505 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 511. The acceleration sensor 511 may also be used to collect motion data of a game or of the user.
The gyroscope sensor 512 may detect the body direction and rotation angle of the terminal 500, and may cooperate with the acceleration sensor 511 to collect the user's 3D actions on the terminal 500. Based on the data collected by the gyroscope sensor 512, the processor 501 may implement functions such as motion sensing (for example, changing the UI according to a tilt operation of the user), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 513 may be disposed on the side frame of the terminal 500 and/or the lower layer of the touch display screen 505. When the pressure sensor 513 is disposed on the side frame of the terminal 500, it can detect the user's grip signal on the terminal 500, and the processor 501 performs left/right-hand recognition or shortcut operations according to the grip signal collected by the pressure sensor 513. When the pressure sensor 513 is disposed on the lower layer of the touch display screen 505, the processor 501 controls operable controls on the UI according to the user's pressure operation on the touch display screen 505. The operable controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 514 is used to collect the user's fingerprint; the processor 501 identifies the user's identity according to the fingerprint collected by the fingerprint sensor 514, or the fingerprint sensor 514 identifies the user's identity according to the collected fingerprint. When the user's identity is recognized as a trusted identity, the processor 501 authorizes the user to perform related sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, and changing settings. The fingerprint sensor 514 may be disposed on the front, back, or side of the terminal 500. When a physical button or a manufacturer logo is provided on the terminal 500, the fingerprint sensor 514 may be integrated with the physical button or the manufacturer logo.
The optical sensor 515 is used to collect ambient light intensity. In one embodiment, the processor 501 may control the display brightness of the touch display screen 505 according to the ambient light intensity collected by the optical sensor 515: when the ambient light intensity is high, the display brightness of the touch display screen 505 is increased; when the ambient light intensity is low, the display brightness of the touch display screen 505 is decreased. In another embodiment, the processor 501 may also dynamically adjust the shooting parameters of the camera assembly 506 according to the ambient light intensity collected by the optical sensor 515.
The proximity sensor 516, also referred to as a distance sensor, is usually disposed on the front panel of the terminal 500. The proximity sensor 516 is used to collect the distance between the user and the front of the terminal 500. In one embodiment, when the proximity sensor 516 detects that the distance between the user and the front of the terminal 500 gradually decreases, the processor 501 controls the touch display screen 505 to switch from the screen-on state to the screen-off state; when the proximity sensor 516 detects that the distance between the user and the front of the terminal 500 gradually increases, the processor 501 controls the touch display screen 505 to switch from the screen-off state to the screen-on state.
A person skilled in the art may understand that the structure shown in FIG. 16 does not constitute a limitation on the terminal 500, and the terminal may include more or fewer components than shown, combine certain components, or adopt a different component arrangement.
A person of ordinary skill in the art may understand that all or some of the steps of the various methods in the foregoing embodiments may be completed by a program instructing relevant hardware. The program may be stored in a computer-readable storage medium, which may be the computer-readable storage medium included in the memory in the foregoing embodiments, or may be a computer-readable storage medium that exists independently and is not assembled into the terminal. The computer-readable storage medium stores at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the viewing angle rotation method according to any one of FIG. 4 to FIG. 16.
Optionally, the computer-readable storage medium may include a read-only memory (ROM), a random access memory (RAM), a solid-state drive (SSD), an optical disc, or the like. The random access memory may include a resistive random access memory (ReRAM) and a dynamic random access memory (DRAM). The sequence numbers of the foregoing embodiments of the present application are merely for description and do not imply any preference among the embodiments.
A person of ordinary skill in the art may understand that all or some of the steps of the foregoing embodiments may be implemented by hardware, or by a program instructing relevant hardware. The program may be stored in a computer-readable storage medium, and the storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like.
The foregoing descriptions are merely preferred embodiments of the present application and are not intended to limit the present application. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present application shall fall within the protection scope of the present application.

Claims (15)

  1. A viewing angle rotation method, performed by a terminal, the method comprising:
    displaying a first viewing angle picture of an application, the first viewing angle picture being a picture of a virtual environment observed from a first viewing angle direction of a virtual character in the virtual environment, a first function control and a second function control being superimposed on the first viewing angle picture, the first function control supporting a first function and a viewing angle rotation function, and the second function control supporting a second function and the viewing angle rotation function;
    receiving a first viewing angle rotation operation triggered on the first function control;
    enabling, according to the first viewing angle rotation operation, the first function and the viewing angle rotation function of the first function control, and switching the first viewing angle picture to a second viewing angle picture, the second viewing angle picture being a picture of the virtual environment observed from a second viewing angle direction of the virtual character in the virtual environment;
    receiving, while the first function control is in an enabled state, a second viewing angle rotation operation triggered on the second function control; and
    disabling, according to the second viewing angle rotation operation, the viewing angle rotation function of the first function control, enabling the second function and the viewing angle rotation function of the second function control, and switching the second viewing angle picture to a third viewing angle picture, the third viewing angle picture being a picture of the virtual environment observed from a third viewing angle direction of the virtual character in the virtual environment.
  2. The method according to claim 1, wherein the disabling, according to the second viewing angle rotation operation, the viewing angle rotation function of the first function control and enabling the second function and the viewing angle rotation function of the second function control comprises:
    generating a second sequence number of the second function control according to the second viewing angle rotation operation; and
    when the second sequence number is greater than a first sequence number, disabling the viewing angle rotation function of the first function control and enabling the second function and the viewing angle rotation function of the second function control, the first sequence number being the sequence number of the first function control.
  3. The method according to claim 2, wherein the first function control comprises first function controls in an enabled state, each of the first function controls in the enabled state corresponding to a first sequence number;
    the generating a second sequence number of the second function control according to the second viewing angle rotation operation comprises:
    obtaining, according to the second viewing angle rotation operation, a maximum sequence number from the first sequence numbers; and
    determining the sequence number obtained by adding one to the maximum sequence number as the second sequence number.
  4. The method according to claim 2, wherein the first function control comprises first function controls in an enabled state;
    after the disabling the viewing angle rotation function of the first function control and enabling the second function and the viewing angle rotation function of the second function control, the method further comprises:
    when the second viewing angle rotation operation based on the second function control ends, determining an i-th first function control in the enabled state from the first function controls in the enabled state, and enabling the viewing angle rotation function of the i-th first function control in the enabled state, i being a positive integer.
  5. The method according to claim 4, wherein each of the first function controls in the enabled state corresponds to a first sequence number;
    the determining an i-th first function control in the enabled state from the first function controls in the enabled state and enabling the viewing angle rotation function of the i-th first function control in the enabled state comprises:
    when the first sequence number of the i-th first function control in the enabled state is the maximum sequence number, enabling the viewing angle rotation function of the i-th first function control in the enabled state.
  6. The method according to any one of claims 1 to 5, further comprising:
    displaying a setting interface of the application, the setting interface comprising at least two mode setting controls, the mode setting controls being used to set a trigger manner of a viewing angle rotation operation;
    receiving a selection operation triggered on the setting interface, the selection operation being used to select, from the at least two mode setting controls, the mode setting control corresponding to a target trigger manner; and
    determining, according to the selection operation, the trigger manner of the viewing angle rotation operation as the target trigger manner, the viewing angle rotation operation comprising either of the first viewing angle rotation operation and the second viewing angle rotation operation.
  7. The method according to claim 6, wherein the viewing angle rotation operation comprises the second viewing angle rotation operation, and the target trigger manner comprises a tap operation;
    the enabling the second function of the second function control according to the second viewing angle rotation operation comprises:
    enabling the second function of the second function control according to the tap operation; and
    the method further comprises:
    keeping the second function of the second function control in the enabled state when the tap operation ends.
  8. The method according to claim 6, wherein the viewing angle rotation operation comprises the second viewing angle rotation operation, and the target trigger manner comprises a long-press operation;
    the enabling the second function of the second function control according to the second viewing angle rotation operation comprises:
    enabling the second function of the second function control according to the long-press operation; and
    the method further comprises:
    keeping the second function of the second function control in the enabled state when the long-press operation ends.
  9. The method according to claim 6, wherein the viewing angle rotation operation comprises the second viewing angle rotation operation, and the target trigger manner comprises a touch operation;
    the enabling the second function of the second function control according to the second viewing angle rotation operation comprises:
    enabling the second function of the second function control according to the touch operation; and
    the method further comprises:
    obtaining a duration of the touch operation when the touch operation ends; and
    keeping the second function of the second function control in the enabled state when the duration is greater than a time threshold.
  10. The method according to claim 9, further comprising:
    disabling the second function of the second function control when the duration is less than or equal to the time threshold.
  11. The method according to claim 1, further comprising:
    controlling viewing angle rotation of the virtual character through the first function control according to custom logic;
    or,
    controlling viewing angle rotation of the virtual character through the second function control according to the custom logic;
    wherein the custom logic is user-defined logic for responding to viewing angle rotation operations when the first function control and the second function control are triggered.
  12. A viewing angle rotation apparatus, comprising:
    a display module, configured to display a first viewing angle picture of an application, the first viewing angle picture being a picture of a virtual environment observed from a first viewing angle direction of a virtual character in the virtual environment, a first function control and a second function control being superimposed on the first viewing angle picture, the first function control supporting a first function and a viewing angle rotation function, and the second function control supporting a second function and the viewing angle rotation function;
    a receiving module, configured to receive a first viewing angle rotation operation triggered on the first function control; and
    a processing module, configured to enable, according to the first viewing angle rotation operation, the first function and the viewing angle rotation function of the first function control, and switch the first viewing angle picture to a second viewing angle picture, the second viewing angle picture being a picture of the virtual environment observed from a second viewing angle direction of the virtual character in the virtual environment;
    the receiving module being further configured to receive, while the first function control is in an enabled state, a second viewing angle rotation operation triggered on the second function control; and
    the processing module being further configured to disable, according to the second viewing angle rotation operation, the viewing angle rotation function of the first function control, enable the second function and the viewing angle rotation function of the second function control, and switch the second viewing angle picture to a third viewing angle picture, the third viewing angle picture being a picture of the virtual environment observed from a third viewing angle direction of the virtual character in the virtual environment.
  13. A terminal, comprising:
    a memory; and
    a processor connected to the memory,
    wherein the processor is configured to load and execute executable instructions to implement the viewing angle rotation method according to any one of claims 1 to 11.
  14. A computer-readable storage medium storing at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by a processor to implement the viewing angle rotation method according to any one of claims 1 to 11.
  15. A computer program product comprising instructions that, when run on a computer, cause the computer to perform the viewing angle rotation method according to any one of claims 1 to 11.
PCT/CN2020/100873 2019-07-26 2020-07-08 Viewing angle rotation method, apparatus, device, and storage medium WO2021017783A1 (zh)

Priority Applications (7)

Application Number Priority Date Filing Date Title
JP2021565096A JP7309913B2 (ja) 2019-07-26 2020-07-08 Viewpoint rotation method, apparatus, terminal, and computer program
SG11202110279UA SG11202110279UA (en) 2019-07-26 2020-07-08 Viewing angle rotation method, device, apparatus, and storage medium
EP20846755.5A EP3925677A4 (en) 2019-07-26 2020-07-08 VIEWING ANGLE ROTATION METHOD, DEVICE, APPARATUS AND STORAGE MEDIA
KR1020217034151A KR102663747B1 (ko) 2019-07-26 2020-07-08 시야각 회전 방법, 디바이스, 장치, 및 저장 매체
US17/337,279 US11878240B2 (en) 2019-07-26 2021-06-02 Method, apparatus, device, and storage medium for perspective rotation
JP2023110919A JP2023139033A (ja) 2019-07-26 2023-07-05 Viewpoint rotation method, apparatus, terminal, and computer program
US18/540,504 US20240123342A1 (en) 2019-07-26 2023-12-14 Method, apparatus, device, and storage medium for perspective rotation

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910683976.6A 2019-07-26 Viewing angle rotation method, apparatus, device, and storage medium
CN201910683976.6 2019-07-26

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/337,279 Continuation US11878240B2 (en) 2019-07-26 2021-06-02 Method, apparatus, device, and storage medium for perspective rotation

Publications (1)

Publication Number Publication Date
WO2021017783A1 (zh)

Family

ID=68326262

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/100873 WO2021017783A1 (zh) 2019-07-26 2020-07-08 视角转动的方法、装置、设备及存储介质

Country Status (7)

Country Link
US (2) US11878240B2 (zh)
EP (1) EP3925677A4 (zh)
JP (2) JP7309913B2 (zh)
KR (1) KR102663747B1 (zh)
CN (1) CN110393916B (zh)
SG (1) SG11202110279UA (zh)
WO (1) WO2021017783A1 (zh)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110393916B (zh) 2019-07-26 2023-03-14 腾讯科技(深圳)有限公司 Viewing angle rotation method, apparatus, device, and storage medium
CN111111168B * 2019-12-16 2021-03-26 腾讯科技(深圳)有限公司 Virtual prop control method and apparatus, storage medium, and electronic apparatus
US11562615B2 (en) * 2020-04-10 2023-01-24 Igt Symbol substitution system
CN111589132A * 2020-04-26 2020-08-28 腾讯科技(深圳)有限公司 Virtual prop display method, computer device, and storage medium
CN113589992B * 2021-08-17 2023-09-12 网易(杭州)网络有限公司 Game interface interaction method and apparatus, medium, and terminal device
USD1034646S1 (en) * 2021-10-08 2024-07-09 Samsung Electronics Co., Ltd. Display screen or portion thereof with graphical user interface

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150157932A1 (en) * 2012-07-06 2015-06-11 WEMADE ENTERTAINMENT CO., LTD a corporation Method of processing user gesture inputs in online game
CN107694087A * 2017-10-23 2018-02-16 网易(杭州)网络有限公司 Information processing method and terminal device
CN108499105A * 2018-04-16 2018-09-07 腾讯科技(深圳)有限公司 Method, apparatus, and storage medium for adjusting viewing angle in a virtual environment
CN108815851A * 2018-06-05 2018-11-16 腾讯科技(深圳)有限公司 Interface display method, device, and storage medium for shooting in a virtual environment
CN110393916A * 2019-07-26 2019-11-01 腾讯科技(深圳)有限公司 Viewing angle rotation method, apparatus, device, and storage medium

Family Cites Families (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7800592B2 (en) 2005-03-04 2010-09-21 Apple Inc. Hand held electronic device with multiple touch sensing devices
US9327191B2 (en) * 2006-05-08 2016-05-03 Nintendo Co., Ltd. Method and apparatus for enhanced virtual camera control within 3D video games or other computer graphics presentations providing intelligent automatic 3D-assist for third person viewpoints
US7768514B2 (en) * 2006-12-19 2010-08-03 International Business Machines Corporation Simultaneous view and point navigation
US9007379B1 (en) * 2009-05-29 2015-04-14 Two Pic Mc Llc Methods and apparatus for interactive user control of virtual cameras
JP5300777B2 (ja) 2010-03-31 2013-09-25 株式会社バンダイナムコゲームス Program and image generation system
US20140002580A1 (en) 2012-06-29 2014-01-02 Monkeymedia, Inc. Portable proprioceptive peripatetic polylinear video player
US9227141B2 (en) * 2013-12-31 2016-01-05 Microsoft Technology Licensing, Llc Touch screen game controller
JP2016073663A (ja) 2015-11-25 2016-05-12 グリー株式会社 Program and display system
KR20160126848A (ko) 2015-12-22 2016-11-02 주식회사 인챈트인터렉티브 Method for processing a user's gesture input
CN105760076B (zh) 2016-02-03 2018-09-04 网易(杭州)网络有限公司 Game control method and apparatus
US10354446B2 (en) * 2016-04-13 2019-07-16 Google Llc Methods and apparatus to navigate within virtual-reality environments
DE102016211453A1 (de) 2016-06-27 2017-12-28 Conti Temic Microelectronic Gmbh Method and vehicle control system for generating images of an environment model, and corresponding vehicle
US10004991B2 (en) 2016-06-28 2018-06-26 Hothead Games Inc. Systems and methods for customized camera views in virtualized environments
JP6373920B2 (ja) 2016-09-14 2018-08-15 株式会社バンダイナムコエンターテインメント Simulation system and program
JP6539253B2 (ja) 2016-12-06 2019-07-03 キヤノン株式会社 Information processing apparatus, control method therefor, and program
US10460492B2 (en) * 2017-09-13 2019-10-29 Canon Kabushiki Kaisha Method, system and apparatus for navigating a virtual camera using a navigation device
CN116450020B (zh) 2017-09-26 2024-08-09 网易(杭州)网络有限公司 Virtual shooting subject control method and apparatus, electronic device, and storage medium
JP2018060539A (ja) 2017-10-02 2018-04-12 望月 玲於奈 User interface program
CN107890664A (zh) 2017-10-23 2018-04-10 网易(杭州)网络有限公司 Information processing method and apparatus, storage medium, and electronic device
KR102309397B1 (ko) 2017-12-20 2021-10-06 레이아 인코포레이티드 Cross-render multiview camera, system, and method
CN108376424A (zh) 2018-02-09 2018-08-07 腾讯科技(深圳)有限公司 Method, apparatus, device, and storage medium for switching viewing angles in a three-dimensional virtual environment
CN108525294B (zh) 2018-04-04 2021-07-27 网易(杭州)网络有限公司 Control method and apparatus for a shooting game
CN108553891A (zh) 2018-04-27 2018-09-21 腾讯科技(深圳)有限公司 Object aiming method and apparatus, storage medium, and electronic apparatus
CN108771863B (zh) 2018-06-11 2022-04-15 网易(杭州)网络有限公司 Control method and apparatus for a shooting game
WO2020133143A1 (en) * 2018-12-28 2020-07-02 Zhejiang Dahua Technology Co., Ltd. Systems and methods for image display
CN109821237B (zh) 2019-01-24 2022-04-22 腾讯科技(深圳)有限公司 Viewing angle rotation method, apparatus, device, and storage medium
JP2022051972A (ja) 2019-02-06 2022-04-04 ソニーグループ株式会社 Information processing apparatus and method, and program
US11216149B2 (en) * 2019-03-15 2022-01-04 Samsung Electronics Co., Ltd. 360° video viewer control using smart device
CN110038297A (zh) 2019-04-12 2019-07-23 网易(杭州)网络有限公司 Game operation method and apparatus for a mobile terminal, storage medium, and electronic device
JP6829298B1 (ja) 2019-10-18 2021-02-10 株式会社スクウェア・エニックス Program, computer apparatus, and control method

Also Published As

Publication number Publication date
EP3925677A4 (en) 2022-05-04
KR102663747B1 (ko) 2024-05-09
US11878240B2 (en) 2024-01-23
JP7309913B2 (ja) 2023-07-18
EP3925677A1 (en) 2021-12-22
CN110393916A (zh) 2019-11-01
CN110393916B (zh) 2023-03-14
JP2023139033A (ja) 2023-10-03
US20210291053A1 (en) 2021-09-23
SG11202110279UA (en) 2021-10-28
US20240123342A1 (en) 2024-04-18
JP2022531599A (ja) 2022-07-07
KR20210142705A (ko) 2021-11-25

Similar Documents

Publication Publication Date Title
JP7231737B2 (ja) Motion control method, apparatus, electronic device, and program
CN109350964B (zh) Method, apparatus, device, and storage medium for controlling a virtual character
CN108619721B (zh) Distance information display method and apparatus in a virtual scene, and computer device
CN109529319B (zh) Display method, device, and storage medium for interface controls
WO2021017783A1 (zh) Viewing angle rotation method, apparatus, device, and storage medium
WO2019214402A1 (zh) Accessory switching method, apparatus, device, and storage medium in a virtual environment
CN111921197B (zh) Display method, apparatus, terminal, and storage medium for match replay pictures
CN111202975B (zh) Crosshair control method, apparatus, device, and storage medium in a virtual scene
WO2020151594A1 (zh) Viewing angle rotation method, apparatus, device, and storage medium
CN111589132A (zh) Virtual prop display method, computer device, and storage medium
US11675488B2 (en) Method and apparatus for constructing building in virtual environment, device, and storage medium
TWI802978B (zh) Method and apparatus for adjusting control positions in an application, device, and storage medium
WO2021031765A1 (zh) Method for applying a scope in a virtual environment and related apparatus
JP7413563B2 (ja) Virtual object control method, apparatus, device, and computer program
CN110743168A (zh) Virtual object control method in a virtual scene, computer device, and storage medium
CN108744510A (zh) Virtual item display method, apparatus, and storage medium
CN111013137A (zh) Movement control method, apparatus, device, and storage medium in a virtual scene
US12061773B2 (en) Method and apparatus for determining selected target, device, and storage medium
WO2022237076A1 (zh) Virtual object control method, apparatus, device, and computer-readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20846755

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 20846755.5

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2020846755

Country of ref document: EP

Effective date: 20210917

ENP Entry into the national phase

Ref document number: 20217034151

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2021565096

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE