
WO2024221693A1 - Method, device, storage medium and electronic device for adjusting a virtual lens - Google Patents

Method, device, storage medium and electronic device for adjusting a virtual lens

Info

Publication number
WO2024221693A1
Authority
WO
WIPO (PCT)
Prior art keywords
orientation
virtual
virtual lens
area
game
Prior art date
Application number
PCT/CN2023/117086
Other languages
English (en)
French (fr)
Inventor
林羽
Original Assignee
网易(杭州)网络有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 网易(杭州)网络有限公司
Publication of WO2024221693A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/23 Input arrangements for video game devices for interfacing with the game device, e.g. specific interfaces between game controller and console
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present disclosure relates to the field of computer technology, and in particular to a method, device, storage medium and electronic device for adjusting a virtual lens.
  • the perspective adjustment operation performed by the player, i.e., the virtual camera orientation adjustment operation, typically requires dragging a dedicated virtual camera orientation control.
  • the above method of adjusting the virtual camera orientation increases the player's operation difficulty and reduces the game experience.
  • At least some embodiments of the present disclosure provide a method, device, storage medium and electronic device for adjusting a virtual lens, so as to at least solve the technical problem in the related art that the virtual lens direction adjustment operation is difficult, resulting in a poor gaming experience for players.
  • a method for adjusting a virtual lens wherein a graphical user interface is provided through a terminal device, the content displayed by the graphical user interface at least partially includes a game scene and multiple controls, and the displayed content of the game scene at least partially includes a virtual game character, and the method comprises: obtaining a first orientation of the virtual game character and a first orientation area of the virtual lens in the game scene, wherein the first orientation is the initial orientation of the virtual game character in the game scene, and the first orientation area is the initial orientation area corresponding to the first orientation; in response to the virtual game character changing from the first orientation to the second orientation, detecting whether the second orientation exceeds the first orientation area, wherein the second orientation is the orientation of the virtual game character after adjustment in the game scene; in response to the second orientation exceeding the first orientation area, detecting whether a touch operation performed on a target control is received, wherein the target control is any one of the multiple controls, and the target control is used to control the virtual game character to perform a target action; and in response to the touch operation performed on the target control, adjusting the orientation of the virtual lens based on the second orientation.
  • an apparatus for adjusting a virtual lens wherein a graphical user interface is provided through a terminal device, wherein the content displayed by the graphical user interface at least partially includes a game scene and a plurality of controls, and the displayed content of the game scene at least partially includes a virtual game character
  • the apparatus comprises: an acquisition module, for acquiring a first orientation of the virtual game character and a first orientation area of the virtual lens in the game scene, wherein the first orientation is the initial orientation of the virtual game character in the game scene, and the first orientation area is the initial orientation area corresponding to the first orientation; a first detection module, for detecting whether the second orientation exceeds the first orientation area in response to the virtual game character changing from the first orientation to the second orientation, wherein the second orientation is the orientation of the virtual game character after adjustment in the game scene; a second detection module, for detecting whether a touch operation performed on a target control is received in response to the second orientation exceeding the first orientation area, wherein the target control is any one of the plurality of controls, and the target control is used to control the virtual game character to perform a target action; and an adjustment module, for adjusting the orientation of the virtual lens based on the second orientation in response to the touch operation performed on the target control.
  • a computer-readable storage medium in which a computer program is stored, wherein the computer program is configured to execute the method for adjusting a virtual lens according to the first aspect when running.
  • an electronic device including: a memory and a processor, wherein a computer program is stored in the memory, and the processor is configured to run the computer program to execute the method for adjusting a virtual lens according to the first aspect.
  • the present disclosure first obtains a first orientation of a virtual game character and a first orientation area of a virtual lens in a game scene, wherein the first orientation is an initial orientation of the virtual game character in the game scene, and the first orientation area is an initial orientation area corresponding to the first orientation.
  • the second orientation is an orientation of the virtual game character after adjustment in the game scene
  • the target control is any control among a plurality of controls
  • the target control is used to control a virtual game character to perform a target action, and on this basis, responding to the touch operation performed on the target control, adjusting the orientation of the virtual lens based on the second orientation.
  • the method for adjusting the virtual lens achieves the purpose of automatically assisting the adjustment of the virtual lens orientation based on the orientation of the virtual game character, the orientation of the virtual lens, and the touch operation, thereby achieving the technical effect of reducing the difficulty of controlling the orientation of the virtual lens in the game and improving the gaming experience. This solves the technical problem in the related art that the difficulty of adjusting the orientation of the virtual lens leads to a poor gaming experience for the player, and also reduces the response frequency of the hardware device, which helps to save the computing resources of the hardware device.
  • FIG. 1 is a hardware structure block diagram of a mobile terminal for a method for adjusting a virtual lens according to one embodiment of the present disclosure;
  • FIG. 2 is a flow chart of a method for adjusting a virtual lens according to one embodiment of the present disclosure;
  • FIG. 3 is a schematic diagram of an optional initial game scene according to one embodiment of the present disclosure;
  • FIG. 4 is a schematic diagram of an optional change in orientation of a virtual game character according to one embodiment of the present disclosure;
  • FIG. 5 is a schematic diagram of an optional game scene after the virtual camera orientation is adjusted according to one embodiment of the present disclosure;
  • FIG. 6 is a structural block diagram of a device for adjusting a virtual lens according to one embodiment of the present disclosure;
  • FIG. 7 is a structural block diagram of an optional device for adjusting a virtual lens according to one embodiment of the present disclosure;
  • FIG. 8 is a structural block diagram of another optional device for adjusting a virtual lens according to one embodiment of the present disclosure;
  • FIG. 9 is a structural block diagram of another optional device for adjusting a virtual lens according to one embodiment of the present disclosure;
  • FIG. 10 is a schematic diagram of an electronic device according to one embodiment of the present disclosure.
  • in the related art, taking mobile touch-screen game operation as an example, the player adjusts the virtual lens direction in a virtual three-dimensional electronic game by using one finger to drag a virtual lens direction control, which adjusts the real-time direction of the virtual lens.
  • the defects of this method are: the game operation is difficult for the player, because the finger used to control the virtual lens direction control is occupied and cannot perform other operations, and conversely, it is difficult to free a finger to control the virtual lens direction control while performing other game operations; moreover, controlling the virtual lens direction is highly dependent on the player's experience, so the game experience is poor.
  • the method embodiments may be executed in a terminal device, e.g., a mobile terminal, a computer terminal, or a similar computing device.
  • the mobile terminal can be a smart phone, a tablet computer, a PDA, a mobile Internet device, a PAD, a game console, or other terminal devices.
  • FIG1 is a hardware structure block diagram of a mobile terminal for a method of adjusting a virtual lens according to one embodiment of the present disclosure.
  • the mobile terminal may include one or more (only one is shown in FIG1 ) processors 102, a memory 104, a transmission device 106, an input-output device 108, and a display device 110.
  • the processor 102 calls and runs a computer program stored in the memory 104 to execute the method of adjusting the virtual lens, and the adjustment result of the virtual lens orientation in the generated electronic game scene is transmitted to the input-output device 108 and/or the display device 110 through the transmission device 106, and then the adjustment result of the virtual lens orientation is provided to the player.
  • the processor 102 may include, but is not limited to: a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processing (DSP) chip, a microcontroller unit (MCU), a programmable logic device (field-programmable gate array, FPGA), a neural network processor (neural-network processing unit, NPU), a tensor processor (tensor processing unit, TPU), an artificial intelligence (AI) processor, and other processing devices.
  • FIG1 is only for illustration and does not limit the structure of the mobile terminal.
  • the mobile terminal may include more or fewer components than those shown in FIG1 , or may have a different configuration than that shown in FIG1 .
  • the above-mentioned terminal device can also provide a human-computer interaction interface with a touch-sensitive surface, which can sense finger contact and/or gestures to perform human-computer interaction with a graphical user interface (Graphical User Interface, GUI).
  • the human-computer interaction function can include the following interactions: creating web pages, drawing, word processing, making electronic documents, games, video conferencing, instant messaging, sending and receiving emails, call interface, playing digital videos, playing digital music and/or web browsing, etc.
  • the executable instructions for executing the above-mentioned human-computer interaction functions are configured/stored in a computer program product executable by one or more processors or a readable storage medium.
  • the server can be an independent physical server, or a server cluster or distributed system composed of multiple physical servers, or a cloud server that provides basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communications, middleware services, domain name services, security services, content delivery networks (Content Delivery Network, CDN), and big data and artificial intelligence platforms.
  • the electronic game server can generate the adjustment result of the virtual lens orientation in the electronic game scene based on the method of adjusting the virtual lens, and provide the adjustment result of the virtual lens orientation to the player (for example, it can be rendered and displayed on the display screen of the player's terminal, or provided to the player through holographic projection, etc.).
  • an embodiment of a method for adjusting a virtual lens is provided. It should be noted that the steps shown in the flowchart of the accompanying drawings can be executed in a computer system such as a set of computer executable instructions, and although a logical order is shown in the flowchart, in some cases, the steps shown or described can be executed in an order different from that shown here.
  • a method for adjusting a virtual lens running on the mobile terminal is provided.
  • a graphical user interface is provided by a terminal device.
  • the content displayed by the graphical user interface at least partially includes a game scene and a plurality of controls.
  • the displayed content of the game scene at least partially includes a virtual game character.
  • FIG. 2 is a flow chart of a method for adjusting a virtual lens according to one embodiment of the present disclosure. As shown in FIG. 2, the method includes the following steps:
  • Step S21 obtaining a first orientation of the virtual game character and a first orientation area of the virtual camera in the game scene, wherein the first orientation is the initial orientation of the virtual game character in the game scene, and the first orientation area is the initial orientation area corresponding to the first orientation;
  • Step S22 in response to the virtual game character changing from the first orientation to the second orientation, detecting whether the second orientation exceeds the first orientation area, wherein the second orientation is the orientation of the virtual game character after adjustment in the game scene;
  • Step S23 in response to the second orientation exceeding the first orientation area, detecting whether a touch operation performed on a target control is received, wherein the target control is any control among the multiple controls, and the target control is used to control the virtual game character to perform a target action;
  • Step S24 in response to the touch operation performed on the target control, adjusting the orientation of the virtual lens based on the second orientation.
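  • As a purely illustrative aid (not part of the original disclosure), the following sketch expresses the decision logic of steps S21 to S24 in Python, assuming that orientations are represented as yaw angles in degrees; the names OrientationArea, contains, and core_controls are placeholders rather than any engine API described here.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class OrientationArea:
    center_yaw: float   # reference line of the fan-shaped area, in degrees
    half_angle: float   # half of the preset central angle, in degrees

    def contains(self, yaw: float) -> bool:
        # Smallest signed difference between the yaw and the area's center line.
        diff = (yaw - self.center_yaw + 180.0) % 360.0 - 180.0
        return abs(diff) <= self.half_angle


def decide_lens_adjustment(second_yaw: float,
                           first_area: OrientationArea,
                           touched_control: Optional[str],
                           core_controls: set) -> Optional[float]:
    """Return the target yaw for the virtual lens, or None if the lens stays put."""
    if first_area.contains(second_yaw):
        return None                    # S25: second orientation still inside the first area
    if touched_control is None or touched_control not in core_controls:
        return None                    # S27: no touch on a target (core) control received
    return second_yaw                  # S24: adjust the lens based on the second orientation
```

  • For example, with an area centered on 0 degrees and a half-angle of 45 degrees, a character turned to 90 degrees together with a touch on a core control yields 90 degrees as the new lens target, while a turn to only 30 degrees leaves the lens unchanged.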
  • the terminal device may be a computer device (PC) or a mobile terminal device (such as a smart phone, a tablet computer, a game console, or a smart wearable device).
  • the graphical user interface provided by the terminal device may be a user interface (UI) corresponding to the game scene.
  • the graphical user interface includes at least the game scene and the plurality of controls, and the plurality of controls are used to control the scene parameters of the game scene or to control the virtual game characters in the game scene.
  • the virtual game characters displayed in the game scene are the game characters currently controlled by the player.
  • the game types corresponding to the above game scenes may be: action games (for example, first-person or third-person shooting games, two-dimensional or three-dimensional fighting games, war action games, sports action games, etc.), adventure games (for example, exploration games, collection games, puzzle games, etc.), simulation games (for example, sandbox simulation games, development simulation games, strategy simulation games, city-building simulation games, business simulation games, etc.), role-playing games, and casual games (for example, chess and card tabletop games, casual competitive games, music rhythm games, dress-up development games, etc.).
  • the first orientation is the initial orientation of the virtual game character in the game scene.
  • the virtual game character often needs to change its orientation in the game scene. For example, when walking, running, turning or looking left and right during driving, the orientation of the virtual game character changes.
  • the first orientation is the orientation of the virtual game character before the orientation change.
  • the second orientation is the orientation of the virtual game character after adjustment in the game scene.
  • the virtual lens is a pre-set observation lens in the game scene.
  • the game scene area captured by the virtual lens (including the scene and the virtual game character, virtual prop model, etc. displayed in the scene) is displayed in the graphical user interface.
  • the first orientation area is the orientation area of the virtual lens in the game scene when the virtual game character is in the first orientation, that is, the initial orientation area corresponding to the first orientation.
  • the virtual lens orientation area corresponds to the orientation of the virtual game character.
  • the orientation of the virtual lens is consistent with the orientation of the virtual game character and follows the orientation change of the virtual game character, and when the display mode of the virtual game character is set to always display at the center point of the graphical user interface, the orientation area of the virtual lens in the game scene is the central sector area of the current graphical user interface of the game scene.
  • the second orientation is detected to see whether it exceeds the first orientation area of the virtual lens. If the second orientation of the virtual game character after the orientation change does not exceed the first orientation area of the virtual lens, it means that the orientation of the virtual lens does not need to be adjusted.
  • the above-mentioned target control is any control specified in the above-mentioned multiple controls for controlling the virtual game character to perform the target action.
  • the above-mentioned target action can be an action pre-set as a "core operation", for example, the target action includes: normal attack, movement, jumping and releasing skills, etc.
  • the multiple controls include: normal attack button, movement joystick, jump button, skill button, etc.
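  • As an illustrative sketch only, the "target control" check could be expressed as membership in a set of core-control identifiers; the concrete identifier strings below are assumptions, since the disclosure only names the control types.

```python
# Hypothetical identifiers for the controls named above (normal attack button,
# movement joystick, jump button, skill buttons); real control IDs are engine-specific.
CORE_CONTROLS = {"attack", "move_joystick", "jump", "skill_1", "skill_2", "skill_3"}


def is_target_control(control_id: str) -> bool:
    """A touched control counts as a target control if it drives a core operation."""
    return control_id in CORE_CONTROLS
```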
  • if the second orientation of the virtual game character exceeds the first orientation area and a touch operation is received on the target control, this indicates that the orientation of the virtual lens is to be adjusted, and the orientation of the virtual lens is adjusted based on the second orientation.
  • the orientation of the virtual lens is adjusted to be consistent with the second orientation.
  • the method provided by the embodiment of the present disclosure reduces the difficulty of controlling the virtual camera orientation in the game by using an assisted pulling mechanism for the virtual camera orientation.
  • whether to trigger the assisted pulling of the virtual camera orientation is determined based on whether the current orientation of the virtual game character is outside the virtual camera orientation area and whether the virtual game character performs a core operation.
  • a first orientation of a virtual game character and a first orientation area of a virtual lens in a game scene are first obtained, wherein the first orientation is the initial orientation of the virtual game character in the game scene, and the first orientation area is the initial orientation area corresponding to the first orientation; further, in response to the virtual game character changing from the first orientation to the second orientation, it is detected whether the second orientation exceeds the first orientation area, wherein the second orientation is the adjusted orientation of the virtual game character in the game scene; further, in response to the second orientation exceeding the first orientation area, it is detected whether a touch operation performed on a target control is received, wherein the target control is any one of a plurality of controls, and the target control is used to control the virtual game character to perform a target action; on this basis, in response to the touch operation performed on the target control, the orientation of the virtual lens is adjusted based on the second orientation.
  • the method for adjusting the virtual lens achieves the purpose of automatically assisting the adjustment of the virtual lens orientation based on the orientation of the virtual game character, the orientation of the virtual lens and the touch operation, thereby achieving the technical effect of reducing the difficulty of controlling the orientation of the virtual lens in the game and improving the gaming experience. This solves the technical problem in the related art that the difficulty of adjusting the orientation of the virtual lens leads to a poor gaming experience for players, and also reduces the response frequency of the hardware device, which helps to save the computing resources of the hardware device.
  • the following takes the automatic adjustment of the virtual lens orientation for a virtual character in a free perspective scene in a role-playing game as an example to further illustrate the technical solution of the embodiment of the present disclosure.
  • in step S21, obtaining the first orientation area of the virtual lens in the game scene may include the following execution steps:
  • Step S211 obtaining a third orientation of the virtual lens, wherein the third orientation is an initial orientation of the virtual lens in the game scene, and the third orientation corresponds to the first orientation;
  • Step S212 determining a first orientation area based on a preset angle range with the third orientation as a reference line.
  • the third orientation corresponds to the first orientation.
  • the third orientation is the initial orientation of the virtual lens in the game scene.
  • the first orientation is the initial orientation of the virtual game character in the game scene. That is to say, in an optional implementation, in a default state, the initial orientation of the virtual lens in the game scene corresponds to the initial orientation of the virtual game character in the game scene, which is manifested in that the center of the field of vision of the virtual game character is located at the center of the screen of the graphical user interface displaying the game scene.
  • the first orientation area of the virtual lens is determined.
  • the preset angle range is determined by the preset angle parameter and the preset range parameter.
  • the preset angle parameter is the central angle of the fan-shaped area
  • the preset range parameter is the radius of the fan-shaped area.
  • the perpendicular bisector of the chord of the fan-shaped area is the orientation of the virtual lens.
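  • A minimal sketch of steps S211 and S212 under these definitions is shown below, assuming yaw angles in degrees; the concrete values of the preset angle and preset range parameters are illustrative assumptions.

```python
from dataclasses import dataclass

PRESET_ANGLE = 90.0   # central angle of the fan-shaped area, in degrees (assumed value)
PRESET_RANGE = 15.0   # radius of the fan-shaped area, in scene units (assumed value)


@dataclass
class Sector:
    center_yaw: float   # the perpendicular bisector of the chord, i.e. the lens orientation
    half_angle: float   # half of the preset central angle
    radius: float       # how far the area extends into the game scene


def first_orientation_area(third_orientation_yaw: float) -> Sector:
    """S212: build the first orientation area with the third orientation as the reference line."""
    return Sector(center_yaw=third_orientation_yaw,
                  half_angle=PRESET_ANGLE / 2.0,
                  radius=PRESET_RANGE)
```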
  • FIG3 is a schematic diagram of an optional initial game scene according to one embodiment of the present disclosure.
  • the orientation of the virtual character (as shown by the black arrow in FIG3 , which is equivalent to the first orientation mentioned above) is consistent with the orientation of the scene virtual lens (equivalent to the third orientation mentioned above), and the orientation area of the virtual lens (equivalent to the first orientation area mentioned above) is a fan-shaped area in the center of the graphical user interface, and the area boundary is determined by the two radii of the fan-shaped area.
  • the orientation of the virtual lens is the direction of the perpendicular bisector of the chord corresponding to the fan-shaped area. In a free perspective scene, the orientation of the virtual lens is set to always be anchored to the center of the screen.
  • the method for adjusting the virtual lens may further include the following execution steps:
  • Step S25 In response to the second orientation not exceeding the first orientation region, controlling the virtual lens to remain at a third orientation.
  • FIG4 is a schematic diagram of an optional virtual game character orientation change according to one embodiment of the present disclosure.
  • the virtual character performs a game task in the game scene, which often involves the situation of the virtual character orientation change.
  • the black arrow moves out of the fan-shaped area (in FIG4, the virtual character model turns right 90 degrees as an example).
  • the graphical user interface still displays the scene image 1.
  • the scene image observed after the virtual character turns right 90 degrees should also be deflected to the right 90 degrees with the virtual character as the center. Therefore, it is necessary to adjust the virtual lens orientation so that the player can obtain the game scene picture that can be observed by the adjusted orientation of the virtual character from the graphical user interface.
  • step S24 in response to the touch operation performed on the target control, adjusting the orientation of the virtual lens based on the second orientation may include the following execution steps:
  • Step S241 in response to receiving a touch operation on a target control within a preset time period, adjusting the orientation of the virtual lens based on a second orientation.
  • step S241 adjusting the orientation of the virtual lens based on the second orientation may include the following execution steps:
  • Step S2411 based on the second orientation, the virtual lens is controlled to rotate at a preset rate until the virtual lens rotates to a fourth orientation, wherein the fourth orientation is the orientation of the virtual lens after adjustment in the game scene, and the fourth orientation corresponds to the second orientation.
  • the fourth orientation is determined by the adjusted orientation of the virtual game character (i.e., the second orientation).
  • the fourth orientation corresponds to the second orientation, and the fourth orientation points to a position in the game scene corresponding to the center position of the graphical user interface so that the orientation area corresponding to the virtual camera is located at the center position of the area displayed by the graphical user interface.
  • the preset rate is the angular rate at which the virtual lens changes its orientation, and the preset rate can be specified by a technician or set by a player through a game preference setting.
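  • A sketch of step S2411 under the same yaw-angle assumption is given below; the rate value is an assumption, since the disclosure only states that the preset rate may be specified by a technician or set through a game preference.

```python
PRESET_RATE = 120.0   # angular rate of the virtual lens, degrees per second (assumed value)


def step_lens_yaw(current_yaw: float, fourth_yaw: float, dt: float) -> float:
    """Advance the lens yaw toward the fourth orientation by at most PRESET_RATE * dt degrees."""
    diff = (fourth_yaw - current_yaw + 180.0) % 360.0 - 180.0   # shortest signed turn
    max_step = PRESET_RATE * dt
    step = max(-max_step, min(max_step, diff))
    return (current_yaw + step) % 360.0
```

  • Calling this function once per frame with the frame time dt rotates the lens smoothly until it settles on the fourth orientation, at which point the remaining difference is zero and the yaw no longer changes.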
  • the method for adjusting the virtual lens may further include the following execution steps:
  • Step S26 determining a second orientation region based on a preset angle range with the fourth orientation as a reference line, so as to update the first orientation region based on the second orientation region.
  • the fourth orientation is the adjusted virtual lens orientation, and a preset angle range is determined based on the fourth orientation to obtain a second orientation area.
  • the preset angle range is a fan-shaped area with the fourth orientation as the center line
  • the fan-shaped area corresponding to the virtual lens orientation before adjustment is adjusted according to the fourth orientation to obtain the second orientation area, and the perpendicular midline direction of the chord of the fan-shaped area corresponding to the second orientation area is consistent with the fourth orientation.
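  • Continuing the sketch above (again an illustration rather than the disclosed implementation), updating the orientation area in step S26 amounts to re-centering the sector on the fourth orientation while keeping its central angle and radius.

```python
from dataclasses import dataclass, replace


@dataclass
class Sector:              # repeated here so the snippet is self-contained
    center_yaw: float
    half_angle: float
    radius: float


def second_orientation_area(first_area: Sector, fourth_yaw: float) -> Sector:
    """S26: the new reference line coincides with the fourth orientation; angle and radius unchanged."""
    return replace(first_area, center_yaw=fourth_yaw)
```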
  • the visual expression on the graphical user interface changes as follows: the central area displayed by the graphical user interface switches from the first orientation area of the game scene to the second orientation area.
  • FIG. 5 is a schematic diagram of an optional game scene after the virtual lens orientation is adjusted according to one embodiment of the present disclosure.
  • the orientation area of the virtual lens will change at a slow rate and re-anchor to the transformed orientation of the virtual character model (i.e., the black arrow).
  • the adjustment result of the virtual lens orientation is reflected in the graphical user interface as follows: the currently displayed scene screen is changed from scene screen 1 to scene screen 2, wherein scene screen 2 is a scene screen obtained by deflecting scene screen 1 90 degrees to the right with the virtual character as the center. For example, after the virtual character turns from facing due north to facing due east, the scene screen displayed in the graphical user interface should be switched from the due north scene screen to the due east scene screen.
  • the core controls may include skill release controls and attack controls displayed in the graphical user interface
  • the skill release controls include: skill 1 button, skill 2 button, and skill 3 button. That is, when the player controls the virtual character to turn to a direction beyond the fan-shaped area shown in FIG. 3 after the adjustment, and further touches at least one of the skill 1 button, skill 2 button, skill 3 button, and attack control, the direction of the virtual lens is adjusted according to the method provided in the embodiment of the present disclosure.
  • the above method can be used to adjust the virtual lens direction to an angle corresponding to the adjusted direction of the virtual character, that is, to adjust the scene picture currently displayed on the graphical user interface to the scene picture corresponding to the adjusted direction of the virtual character.
  • the height of the virtual lens remains unchanged.
  • the height of the virtual lens remains unchanged, that is, the virtual game character's line of sight height remains unchanged.
  • the virtual models displayed at the same height have the same game space height coordinates in the game scene.
  • the method for adjusting the virtual lens may further include the following execution steps:
  • Step S27 in response to not receiving a touch operation performed on the target control within a preset time period, controlling the virtual game character to remain in the second orientation, and controlling the virtual camera to remain in the first orientation area.
  • the display content of the graphical user interface remains as the scene screen shown in FIG4, that is, the orientation of the virtual character model is still the adjusted orientation (equivalent to the above-mentioned second orientation), and the virtual lens remains in the orientation area before adjustment (equivalent to the above-mentioned first orientation area).
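  • A sketch of the preset time window used in steps S241 and S27 is shown below, assuming a monotonic game clock; the window length is an assumed value, as the disclosure does not specify it.

```python
import time

PRESET_WINDOW = 0.5   # seconds within which a core-control touch must arrive (assumed value)


class AdjustmentWindow:
    """Tracks whether a core-control touch arrives in time after the orientation leaves the area."""

    def __init__(self) -> None:
        self.deadline = None   # no lens adjustment pending

    def orientation_left_area(self) -> None:
        """Called when the second orientation is detected to exceed the first orientation area."""
        self.deadline = time.monotonic() + PRESET_WINDOW

    def core_control_touched(self) -> bool:
        """Return True if the lens should be pulled now, i.e. the touch arrived within the window."""
        if self.deadline is not None and time.monotonic() <= self.deadline:
            self.deadline = None
            return True
        return False
```

  • If core_control_touched is never called before the deadline, the pending adjustment simply expires, which matches step S27: the character keeps the second orientation and the lens keeps the first orientation area.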
  • the technical solution provided by the embodiment of the present disclosure completes the automatic assisted pulling of the virtual camera orientation. Therefore, during the game process, the player no longer needs to allocate a finger to control the virtual camera orientation while performing other game operations.
  • the judgment condition for the virtual lens orientation change provided by the embodiment of the present disclosure can be applied to most game operation processes, thereby realizing assisted pulling of the virtual lens orientation in the game scene and improving the smoothness of the player's operation.
  • the method for adjusting the virtual lens provided by the embodiment of the present disclosure can achieve synchronization between the virtual lens orientation control and the game operation, that is, it reduces the difficulty of the player to control the virtual lens orientation while performing the game operation, reduces the player's operation cost, and enables the player to have a better immersive experience during the game.
  • the focus of the technical solution provided by the embodiment of the present disclosure is to trigger automatic adjustment of the virtual lens orientation through the positional relationship between the orientation of the virtual game character and the virtual lens orientation area, and the touch operation of multiple controls corresponding to the virtual game character.
  • the method according to the above embodiment can be implemented by means of software plus a necessary general hardware platform, and of course can also be implemented by hardware, but in many cases the former is a better implementation method.
  • the technical solution of the present disclosure, or the part that contributes to the prior art can be embodied in the form of a software product, which is stored in a storage medium (such as a disk, an optical disk), and includes a number of instructions for a terminal device (which can be a mobile phone, a computer, a server, or a network device, etc.) to execute the methods of each embodiment of the present disclosure.
  • a device for adjusting a virtual lens is also provided. The device is used to implement the above-mentioned embodiments and preferred implementation modes, and descriptions that have already been given are not repeated.
  • the term "module” may be a combination of software and/or hardware that implements a predetermined function.
  • although the device described in the following embodiments is preferably implemented in software, an implementation in hardware, or in a combination of software and hardware, is also possible and conceivable.
  • FIG6 is a structural block diagram of an apparatus for adjusting a virtual lens according to one embodiment of the present disclosure.
  • a graphical user interface is provided through a terminal device.
  • the content displayed by the graphical user interface at least partially includes a game scene and multiple controls.
  • the displayed content of the game scene at least partially includes a virtual game character.
  • the apparatus includes: an acquisition module 601, which is used to acquire a first orientation of the virtual game character and a first orientation area of the virtual lens in the game scene, wherein the first orientation is the initial orientation of the virtual game character in the game scene, and the first orientation area is the initial orientation area corresponding to the first orientation; a first detection module 602, which is used to detect whether the second orientation exceeds the first orientation area in response to the virtual game character changing from the first orientation to the second orientation, wherein the second orientation is the orientation of the virtual game character after adjustment in the game scene; a second detection module 603, which is used to detect whether a touch operation performed on a target control is received in response to the second orientation exceeding the first orientation area, wherein the target control is any control among multiple controls, and the target control is used to control the virtual game character to perform a target action; an adjustment module 604, which is used to adjust the orientation of the virtual lens based on the second orientation in response to the touch operation performed on the target control.
  • the device first obtains the first orientation of the virtual game character and the first orientation area of the virtual lens in the game scene, wherein the first orientation is the initial orientation of the virtual game character in the game scene, and the first orientation area is the initial orientation area corresponding to the first orientation. Further, in response to the virtual game character changing from the first orientation to the second orientation, it is detected whether the second orientation exceeds the first orientation area, wherein the second orientation is the orientation of the virtual game character after adjustment in the game scene. Further, in response to the second orientation exceeding the first orientation area, it is detected whether a touch operation performed on a target control is received, wherein the target control is any control among a plurality of controls, and the target control is used to control the virtual game character to perform a target action.
  • the method for adjusting the virtual lens achieves the purpose of automatically assisting the virtual lens orientation based on the orientation of the virtual game character, the orientation of the virtual lens, and the touch operation, thereby achieving the technical effect of reducing the difficulty of controlling the virtual lens orientation in the game and improving the game experience, thereby solving the technical problem of the difficulty of adjusting the virtual lens orientation in the related art leading to poor game experience for players, and reducing the response frequency of the hardware device, which helps to save the computing resources of the hardware device.
  • the acquisition module 601 is further used to: acquire a third orientation of the virtual lens, wherein the third orientation is the initial orientation of the virtual lens in the game scene, and the third orientation corresponds to the first orientation; and determine the first orientation area based on a preset angle range with the third orientation as a reference line.
  • FIG. 7 is a structural block diagram of an optional device for adjusting a virtual lens according to one embodiment of the present disclosure.
  • in addition to all the modules shown in FIG. 6, the device shown in FIG. 7 further includes: a maintaining module 605, which is used to control the virtual lens to remain in the third orientation in response to the second orientation not exceeding the first orientation area.
  • the adjustment module 604 is further configured to: in response to receiving a touch operation on a target control within a preset time period, adjust the orientation of the virtual lens based on the second orientation.
  • the adjustment module 604 is further used to: control the virtual lens to rotate at a preset rate based on the second direction until the virtual lens rotates to a fourth direction, wherein the fourth direction is the direction of the virtual lens after adjustment in the game scene, and the fourth direction corresponds to the second direction.
  • FIG. 8 is a structural block diagram of another optional device for adjusting a virtual lens according to one embodiment of the present disclosure.
  • in addition to all the modules shown in FIG. 7, the device shown in FIG. 8 further includes: a determination module 606, which is used to determine a second orientation area based on a preset angle range with the fourth orientation as a reference line, so as to update the first orientation area based on the second orientation area.
  • the height of the virtual lens remains unchanged.
  • FIG. 9 is a structural block diagram of another optional device for adjusting a virtual lens according to one embodiment of the present disclosure.
  • in addition to all the modules shown in FIG. 8, the device shown in FIG. 9 further includes: a touch module 607, which is used to control the virtual game character to remain in the second orientation and control the virtual lens to remain in the first orientation area in response to not receiving a touch operation performed on the target control within a preset time length.
  • the above modules can be implemented by software or hardware. For the latter, this can be achieved in, but is not limited to, the following ways: the above modules are all located in the same processor; or the above modules are located, in any combination, in different processors.
  • An embodiment of the present disclosure further provides a computer-readable storage medium, in which a computer program is stored, wherein the computer program is configured to execute the steps of any of the above method embodiments when running.
  • the above-mentioned computer-readable storage medium may include but is not limited to: a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a mobile hard disk, a magnetic disk or an optical disk, and other media that can store computer programs.
  • the computer-readable storage medium may be located in any one of the computer terminals in a computer terminal group in a computer network, or in any one of the mobile terminals in a mobile terminal group.
  • the computer-readable storage medium may be configured to store a computer program for performing the following steps:
  • the computer-readable storage medium is further configured to store program code for executing the following steps: obtaining a third orientation of the virtual lens, wherein the third orientation is the initial orientation of the virtual lens in the game scene, and the third orientation corresponds to the first orientation; and determining the first orientation area based on a preset angle range with the third orientation as a baseline.
  • the computer-readable storage medium is further configured to store a program code for executing the following steps: in response to the second orientation not exceeding the first orientation region, controlling the virtual lens to remain at a third orientation.
  • the computer-readable storage medium is further configured to store a program code for executing the following steps: in response to receiving a touch operation on a target control within a preset time period, adjusting the orientation of the virtual lens based on the second orientation.
  • the computer-readable storage medium is also configured to store program code for executing the following steps: controlling the virtual lens to rotate at a preset rate based on the second orientation until the virtual lens rotates to a fourth orientation, wherein the fourth orientation is the orientation of the virtual lens after adjustment in the game scene, and the fourth orientation corresponds to the second orientation.
  • the computer-readable storage medium is further configured to store program code for executing the following steps: determining a second orientation area based on a preset angle range with the fourth orientation as a reference line, so as to update the first orientation area based on the second orientation area.
  • the computer-readable storage medium is further configured to store a program code for executing the following steps: in the process of adjusting the orientation of the virtual lens based on the second orientation, the height of the virtual lens remains unchanged.
  • the computer-readable storage medium is further configured to store program codes for executing the following steps: in response to not receiving a touch operation on a target control within a preset time period, controlling the virtual game character to remain in a second orientation, and controlling the virtual lens to remain in a first orientation area.
  • a technical solution for implementing a method for adjusting a virtual lens is provided.
  • the target control is any control among a plurality of controls, and the target control is used to control the virtual game character to perform a target action.
  • the orientation of the virtual lens is adjusted based on the second orientation.
  • the method for adjusting the virtual lens achieves the purpose of automatically assisting the direction of the virtual lens based on the direction of the virtual game character, the direction of the virtual lens and the touch operation, thereby achieving the technical effect of reducing the difficulty of controlling the direction of the virtual lens in the game and improving the gaming experience, thereby solving the technical problem in the related art that the difficulty of adjusting the direction of the virtual lens leads to a poor gaming experience for players, and reduces the response frequency of the hardware device, which helps to save the computing resources of the hardware device.
  • the example implementation described here can be implemented by software, or by combining software with necessary hardware. It is embodied in the form of a software product, which can be stored in a computer-readable storage medium (which can be a CD-ROM, a USB flash drive, a mobile hard disk, etc.) or on a network, and includes a number of instructions to enable a computing device (which can be a personal computer, a server, a terminal device, or a network device, etc.) to execute a method according to an embodiment of the present disclosure.
  • a program product capable of implementing the above method of the present embodiment is stored on a computer-readable storage medium.
  • various aspects of the embodiments of the present disclosure may also be implemented in the form of a program product, which includes a program code, and when the program product is run on a terminal device, the program code is used to enable the terminal device to execute the steps according to various exemplary implementations of the present disclosure described in the above “Exemplary Method” section of the present embodiment.
  • the program product for implementing the above method in the embodiment of the present disclosure may adopt a portable compact disc read-only memory (CD-ROM), include program code, and be run on a terminal device, such as a personal computer.
  • the program product of the embodiment of the present disclosure is not limited to this.
  • the computer-readable storage medium can be any tangible medium containing or storing a program, which can be used by or in combination with an instruction execution system, an apparatus or a device.
  • the program product may be in any combination of one or more computer-readable media.
  • the computer-readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof. More specific examples (a non-exhaustive list) of computer-readable storage media include: an electrical connection with one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination thereof.
  • program code contained in the computer-readable storage medium can be transmitted using any appropriate medium, including but not limited to wireless, wired, optical cable, RF, etc., or any suitable combination of the above.
  • An embodiment of the present disclosure further provides an electronic device, including a memory and a processor, wherein a computer program is stored in the memory, and the processor is configured to run the computer program to execute the steps in any one of the above method embodiments.
  • the electronic device may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
  • the processor may be configured to perform the following steps through a computer program:
  • the processor may also be configured to perform the following steps through a computer program: obtaining a third orientation of the virtual lens, wherein the third orientation is the initial orientation of the virtual lens in the game scene, and the third orientation corresponds to the first orientation; and determining the first orientation area based on a preset angle range with the third orientation as a baseline.
  • the processor may be further configured to execute the following steps through a computer program: in response to the second orientation not exceeding the first orientation area, controlling the virtual lens to remain at a third orientation.
  • the processor may also be configured to perform the following steps through a computer program: in response to receiving a touch operation on a target control within a preset time period, adjusting the orientation of the virtual lens based on the second orientation.
  • the processor may also be configured to perform the following steps through a computer program: controlling the virtual lens to rotate at a preset rate based on the second direction until the virtual lens rotates to a fourth direction, wherein the fourth direction is the direction of the virtual lens after adjustment in the game scene, and the fourth direction corresponds to the second direction.
  • the processor may also be configured to perform the following steps through a computer program: determining a second orientation area based on a preset angle range with the fourth orientation as a reference line, so as to update the first orientation area based on the second orientation area.
  • the processor may be further configured to perform the following steps through a computer program: in the process of adjusting the orientation of the virtual lens based on the second orientation, the height of the virtual lens remains unchanged.
  • the processor may also be configured to execute the following steps through a computer program: in response to not receiving a touch operation on a target control within a preset time period, controlling the virtual game character to remain in the second orientation, and controlling the virtual lens to remain in the first orientation area.
  • a technical solution for implementing a method for adjusting a virtual lens is provided.
  • the target control is any control among a plurality of controls, and the target control is used to control the virtual game character to perform a target action.
  • the orientation of the virtual lens is adjusted based on the second orientation.
  • the method for adjusting the virtual lens achieves the purpose of automatically assisting the direction of the virtual lens based on the direction of the virtual game character, the direction of the virtual lens and the touch operation, thereby achieving the technical effect of reducing the difficulty of controlling the direction of the virtual lens in the game and improving the gaming experience, thereby solving the technical problem in the related art that the difficulty of adjusting the direction of the virtual lens leads to a poor gaming experience for players, and reduces the response frequency of the hardware device, which helps to save the computing resources of the hardware device.
  • Fig. 10 is a schematic diagram of an electronic device according to one embodiment of the present disclosure. As shown in Fig. 10, the electronic device 1000 is only an example and should not bring any limitation to the functions and scope of use of the embodiments of the present disclosure.
  • the electronic device 1000 is presented in the form of a general-purpose computing device.
  • the components of the electronic device 1000 may include, but are not limited to: the at least one processor 1010, the at least one memory 1020, a bus 1030 connecting different system components (including the memory 1020 and the processor 1010), and a display 1040.
  • the memory 1020 stores program codes, which can be executed by the processor 1010, so that the processor 1010 executes the steps described in the method part of the embodiment of the present disclosure according to various exemplary embodiments of the present disclosure.
  • the memory 1020 may include a readable medium in the form of a volatile storage unit, such as a random access memory unit (RAM) 10201 and/or a cache memory unit 10202, and may further include a read-only memory unit (ROM) 10203, and may also include a non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory.
  • the memory 1020 may also include a program/utility 10204 having a set (at least one) of program modules 10205, such program modules 10205 including but not limited to: an operating system, one or more application programs, other program modules, and program data; each of these examples, or some combination thereof, may include an implementation of a network environment.
  • the memory 1020 may further include a memory remotely disposed relative to the processor 1010, and these remote memories may be connected to the electronic device 1000 via a network. Examples of the above-mentioned network include but are not limited to the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
  • the bus 1030 may represent one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, the processor 1010, or a local bus using any of a variety of bus architectures.
  • the display 1040 can be, for example, a touch screen liquid crystal display (LCD), which enables a user to interact with the user interface of the electronic device 1000.
  • the electronic device 1000 may also communicate with one or more external devices 1100 (e.g., keyboards, pointing devices, Bluetooth devices, etc.), may also communicate with one or more devices that enable a user to interact with the electronic device 1000, and/or communicate with any device that enables the electronic device 1000 to communicate with one or more other computing devices (e.g., routers, modems, etc.). Such communication may be performed through an input/output (I/O) interface 1050.
  • the electronic device 1000 may also communicate with one or more networks (e.g., a local area network (LAN), a wide area network (WAN), and/or a public network, such as the Internet) through a network adapter 1060.
  • as shown in FIG. 10, the network adapter 1060 communicates with the other modules of the electronic device 1000 through the bus 1030. It should be understood that although not shown in FIG. 10, other hardware and/or software modules may be used in conjunction with the electronic device 1000, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, redundant array of independent disks (RAID) systems, tape drives, and data backup storage systems.
  • the electronic device 1000 may further include: a keyboard, a cursor control device (such as a mouse), an input/output interface (I/O interface), a network interface, a power supply and/or a camera.
  • the structure shown in FIG. 10 is for illustration only and does not limit the structure of the electronic device described above.
  • the electronic device 1000 may also include more or fewer components than those shown in FIG. 10 , or have a configuration different from that shown in FIG. 10 .
  • the memory 1020 may be used to store computer programs and corresponding data, such as the computer programs and corresponding data corresponding to the method for adjusting the virtual lens in the embodiment of the present disclosure.
  • the processor 1010 executes various functional applications and data processing by running the computer program stored in the memory 1020, that is, implements the method for adjusting the virtual lens described above.
  • the disclosed technical content can be implemented in other ways.
  • the device embodiments described above are merely illustrative.
  • the division of units may be a division by logical function; other division manners are possible in actual implementation.
  • multiple units or components can be combined or integrated into another system, or some features can be ignored or not executed.
  • another point is that the coupling, direct coupling, or communication connections shown or discussed may be implemented through certain interfaces, or as indirect coupling or communication connections between units or modules, and may be electrical or take other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place or distributed on multiple units. Some or all of the units may be selected according to actual needs to achieve the purpose of the present embodiment.
  • each functional unit in each embodiment of the present disclosure may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit may be implemented in the form of hardware or in the form of software functional units.
  • if the integrated unit is implemented in the form of a software functional unit and is sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • based on this understanding, the technical solution of the present disclosure, in essence, or the part of it that contributes to the prior art, or all or part of that technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method described in each embodiment of the present disclosure.
  • the aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

一种调整虚拟镜头的方法,包括:获取虚拟游戏角色的第一朝向和游戏场景内虚拟镜头的第一朝向区域;响应于虚拟游戏角色从第一朝向变化至第二朝向,检测第二朝向是否超出第一朝向区域;响应于第二朝向超出第一朝向区域,检测是否接收到对目标控件执行的触控操作;响应对目标控件执行的触控操作,基于第二朝向调整虚拟镜头的朝向。

Description

调整虚拟镜头的方法、装置、存储介质及电子装置
相关申请的交叉引用
本申请要求于2023年04月26日提交的申请号为202310494541.3、名称为“调整虚拟镜头的方法、装置、存储介质及电子装置”的中国专利申请的优先权,该中国专利申请的全部内容通过引用并入全文。
技术领域
本公开涉及计算机技术领域,具体而言,涉及一种调整虚拟镜头的方法、装置、存储介质及电子装置。
背景技术
对于虚拟三维电子游戏在移动设备上运行的游戏客户端来说,玩家所执行的视角调整操作(也即虚拟镜头朝向调整操作)往往需要依赖对虚拟镜头朝向控件的拖动。当游戏场景的操作较复杂时,例如在动作类、格斗类游戏中,上述对虚拟镜头朝向的调整方法增加了玩家的操作难度,降低了游戏体验。
针对上述的问题,目前尚未提出有效的解决方案。
需要说明的是,在上述背景技术部分公开的信息仅用于加强对本公开的背景的理解,因此可以包括不构成对本领域普通技术人员已知的现有技术的信息。
发明内容
本公开至少部分实施例提供了一种调整虚拟镜头的方法、装置、存储介质及电子装置,以至少解决相关技术中虚拟镜头朝向调整操作难度大导致玩家游戏体验差的技术问题。
根据本公开的第一方面,提供了一种调整虚拟镜头的方法,通过终端设备提供一图形用户界面,图形用户界面所显示的内容至少部分地包含一游戏场景和多个控件,游戏场景的显示内容至少部分地包含一虚拟游戏角色,方法包括:获取虚拟游戏角色的第一朝向和游戏场景内虚拟镜头的第一朝向区域,其中,第一朝向为虚拟游戏角色在游戏场景内的初始朝向,第一朝向区域为第一朝向对应的初始朝向区域;响应于虚拟游戏角色从第一朝向变化至第二朝向,检测第二朝向是否超出第一朝向区域,其中,第二朝向为虚拟游戏角色在游戏场景内经过调整后的朝向;响应于第二朝向超出第一朝向区域,检测是否接收到对目标控件执行的触控操作,其中,目标控件为多个控件中的任一控件,目标控件用于控制虚拟游戏角色执行目标动作;响应对目标控件执行的触控操作,基于第二朝向调整虚拟镜头的朝向。
根据本公开的第二方面,还提供了一种调整虚拟镜头的装置,通过终端设备提供一图形用户界面,图形用户界面所显示的内容至少部分地包含一游戏场景和多个控件,游戏场景的显示内容至少部分地包含一虚拟游戏角色,装置包括:获取模块,用于获取虚拟游戏角色的第一朝向和游戏场景内虚拟镜头的第一朝向区域,其中,第一朝向为虚拟游戏角色在游戏场景内的初始朝向,第一朝向区域为第一朝向对应的初始朝向区域;第一检测模块,用于响应于虚拟游戏角色从第一朝向变化至第二朝向,检测第二朝向是否超出第一朝向区域,其中,第二朝向为虚拟游戏角色在游戏场景内经过调整后的朝向;第二检测模块,用于响应于第二朝向超出第一朝向区域,检测是否接收到对目标控件执行的触控操作,其中,目标控件为多个控件中的任一控件,目标控件用于控制虚拟游戏角色执行目标动作;调整模块,用于响应对目标控件执行的触控操作,基于第二朝向调整虚拟镜头的朝向。
根据本公开的第三方面,还提供了一种计算机可读存储介质,计算机可读存储介质中存储有计算机程序,其中,计算机程序被设置为运行时执行上述第一方面的调整虚拟镜头的方法。
根据本公开的第四方面,还提供了一种电子装置,包括:存储器和处理器,存储器中存储有计算机程序,处理器被设置为运行计算机程序以执行上述第一方面的调整虚拟镜头的方法。
本公开首先获取虚拟游戏角色的第一朝向和游戏场景内虚拟镜头的第一朝向区域,其中,第一朝向为虚拟游戏角色在游戏场景内的初始朝向,第一朝向区域为第一朝向对应的初始朝向区域,进一步地,响应于虚拟游戏角色从第一朝向变化至第二朝向,检测第二朝向是否超出第一朝向区域,其中,第二朝向为虚拟游戏角色在游戏场景内经过调整后的朝向,进一步地,响应于第二朝向超出第一朝向 区域,检测是否接收到对目标控件执行的触控操作,其中,目标控件为多个控件中的任一控件,目标控件用于控制虚拟游戏角色执行目标动作,在此基础上,响应对目标控件执行的触控操作,基于第二朝向调整虚拟镜头的朝向。由此,本公开提供的调整虚拟镜头的方法达到了基于虚拟游戏角色的朝向、虚拟镜头的朝向以及触控操作情况对虚拟镜头朝向进行自动地辅助牵引的目的,从而实现了降低游戏中虚拟镜头朝向控制难度、提升游戏体验的技术效果,进而解决了相关技术中虚拟镜头朝向调整操作难度大导致玩家游戏体验差的技术问题,并且降低了硬件设备的响应频率,有助于节约硬件设备的计算资源。
附图说明
图1是根据本公开其中一实施例的一种调整虚拟镜头的方法的移动终端的硬件结构框图;
图2是根据本公开其中一实施例的一种调整虚拟镜头的方法的流程图;
图3是根据本公开其中一实施例的一种可选的初始游戏场景的示意图;
图4是根据本公开其中一实施例的一种可选的虚拟游戏角色朝向变化的示意图;
图5是根据本公开其中一实施例的一种可选的虚拟镜头朝向调整后的游戏场景的示意图;
图6是根据本公开其中一实施例的一种调整虚拟镜头的装置的结构框图;
图7是根据本公开其中一实施例的一种可选的调整虚拟镜头的装置的结构框图;
图8是根据本公开其中一实施例的另一种可选的调整虚拟镜头的装置的结构框图;
图9是根据本公开其中一实施例的另一种可选的调整虚拟镜头的装置的结构框图;
图10是根据本公开其中一实施例的一种电子装置的示意图。
具体实施方式
现有技术中,玩家在虚拟三维电子游戏中实现虚拟镜头朝向调整操作的方法为:以移动端触控游戏操作为例,使用一根手指控制虚拟镜头朝向控件以调整虚拟镜头的实时朝向。这种方法的缺陷在于:玩家的游戏操作难度较大,用于控制虚拟镜头朝向控件的手指被占用而无法进行其他操作,反之在进行其他游戏操作时难以空余一根手指用于控制虚拟镜头朝向控件;控制虚拟镜头朝向的操作对玩家经验的依赖程度较高,游戏体验差。针对现有技术的上述的问题,在本公开之前尚未提出有效的解决方案。
为了使本技术领域的人员更好地理解本公开方案,下面将结合本公开实施例中的附图,对本公开实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本公开一部分的实施例,而不是全部的实施例。基于本公开中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都应当属于本公开保护的范围。
需要说明的是,本公开的说明书和权利要求书及上述附图中的术语“第一”、“第二”等是用于区别类似的对象,而不必用于描述特定的顺序或先后次序。应该理解这样使用的数据在适当情况下可以互换,以便这里描述的本公开的实施例能够以除了在这里图示或描述的那些以外的顺序实施。此外,术语“包括”和“具有”以及他们的任何变形,意图在于覆盖不排他的包含,例如,包含了一系列步骤或单元的过程、方法、系统、产品或设备不必限于清楚地列出的那些步骤或单元,而是可包括没有清楚地列出的或对于这些过程、方法、产品或设备固有的其它步骤或单元。
需要说明的是,在本公开的说明书中,“例如”一词用来表示“用作例子、例证或说明”。本公开中被描述为“例如”的任何实施例不一定被解释为比其它实施例更优选或更具优势。为了使本领域任何技术人员能够实现和使用本公开,给出了以下描述。在以下描述中,为了解释的目的而列出了细节。应当明白的是,本领域普通技术人员可以认识到,在不使用这些特定细节的情况下也可以实现本公开。在其它实例中,不会对公知的结构和过程进行详细阐述,以避免不必要的细节使本公开的描述变得晦涩。因此,本公开并非旨在限于所示的实施例,而是与符合本公开所公开的原理和特征的最广范围相一致。
本公开涉及到的上述方法实施例,可以在终端设备(例如,移动终端、计算机终端或者类似的运算装置)中执行。以运行在移动终端上为例,该移动终端可以是智能手机、平板电脑、掌上电脑以及移动互联网设备、PAD、游戏机等终端设备。
图1是根据本公开其中一实施例的一种调整虚拟镜头的方法的移动终端的硬件结构框图。如图1所示,移动终端可以包括一个或多个(图1中仅示出一个)处理器102、存储器104、传输设备106、输入输出设备108以及显示设备110。以调整虚拟镜头的方法通过该移动终端应用于电子游戏场景为例,处理器102调用并运行存储器104中存储的计算机程序以执行该调整虚拟镜头的方法,所生成的电子游戏场景中的虚拟镜头朝向的调整结果通过传输设备106传输至输入输出设备108和/或显示设备110,进而将该虚拟镜头朝向的调整结果提供给玩家。
仍然如图1所示,处理器102可以包括但不限于:中央处理器(Central Processing Unit,CPU)、图形处理器(Graphics Processing Unit,GPU)、数字信号处理(Digital Signal Processing,DSP)芯片、微处理器(Microcontroller Unit,MCU)、可编程逻辑器件(Field Programmable Gate Array,FPGA)、神经网络处理器(Neural-Network Processing Unit,NPU)、张量处理器(Tensor Processing Unit,TPU)、人工智能(Artificial Intelligence,AI)类型处理器等的处理装置。
本领域技术人员可以理解,图1所示的结构仅为示意,其并不对上述移动终端的结构造成限定。例如,移动终端还可包括比图1中所示更多或者更少的组件,或者具有与图1所示不同的配置。
在一些以游戏场景为主的可选实施例中,上述终端设备还可以提供具有触摸触敏表面的人机交互界面,该人机交互界面可以感应手指接触和/或手势来与图形用户界面(Graphical User Interface,GUI)进行人机交互,该人机交互功能可以包括如下交互:创建网页、绘图、文字处理、制作电子文档、游戏、视频会议、即时通信、收发电子邮件、通话界面、播放数字视频、播放数字音乐和/或网络浏览等、用于执行上述人机交互功能的可执行指令被配置/存储在一个或多个处理器可执行的计算机程序产品或可读存储介质中。
本公开涉及到的上述方法实施例,还可以在服务器中执行。其中,服务器可以是独立的物理服务器,也可以是多个物理服务器构成的服务器集群或者分布式系统,还可以是提供云服务、云数据库、云计算、云函数、云存储、网络服务、云通信、中间件服务、域名服务、安全服务、内容分发网络(Content Delivery Network,CDN)、以及大数据和人工智能平台等基础云计算服务的云服务器。以调整虚拟镜头的方法通过电子游戏服务器应用于电子游戏场景为例,电子游戏服务器可基于该调整虚拟镜头的方法生成的电子游戏场景中的虚拟镜头朝向的调整结果,并将该虚拟镜头朝向的调整结果提供给玩家(例如,可以渲染显示在玩家终端的显示屏上,或者,通过全息投影提供给玩家等)。
根据本公开其中一实施例,提供了一种调整虚拟镜头的方法的实施例,需要说明的是,在附图的流程图示出的步骤可以在诸如一组计算机可执行指令的计算机系统中执行,并且,虽然在流程图中示出了逻辑顺序,但是在某些情况下,可以以不同于此处的顺序执行所示出或描述的步骤。
在本实施例中提供了一种运行于上述移动终端的一种调整虚拟镜头的方法,通过终端设备提供一图形用户界面,图形用户界面所显示的内容至少部分地包含一游戏场景和多个控件,游戏场景的显示内容至少部分地包含一虚拟游戏角色,图2是根据本公开其中一实施例的一种调整虚拟镜头的方法的流程图,如图2所示,该方法包括如下步骤:
步骤S21,获取虚拟游戏角色的第一朝向和游戏场景内虚拟镜头的第一朝向区域,其中,第一朝向为虚拟游戏角色在游戏场景内的初始朝向,第一朝向区域为第一朝向对应的初始朝向区域;
步骤S22,响应于虚拟游戏角色从第一朝向变化至第二朝向,检测第二朝向是否超出第一朝向区域,其中,第二朝向为虚拟游戏角色在游戏场景内经过调整后的朝向;
步骤S23,响应于第二朝向超出第一朝向区域,检测是否接收到对目标控件执行的触控操作,其中,目标控件为多个控件中的任一控件,目标控件用于控制虚拟游戏角色执行目标动作;
步骤S24,响应对目标控件执行的触控操作,基于第二朝向调整虚拟镜头的朝向。
上述终端设备可以是计算机设备(PC)和移动终端设备(如智能手机、平板电脑、游戏机和智能穿戴设备等)等。上述终端设备提供的图形用户界面可以是游戏场景对应的用户界面(User Interface,UI)。上述图形用户界面中至少包括上述游戏场景和上述多个控件,上述多个控件用于控制上述游戏场景的场景参数或者用于控制上述游戏场景中的虚拟游戏角色。上述游戏场景中所显示的虚拟游戏角色为玩家当前操控的游戏角色。
上述游戏场景对应的游戏类型可以是:动作类(例如:第一人称或第三人称射击游戏、二维或三维格斗游戏、战争动作游戏和体育动作游戏等)、冒险类(例如:探险游戏、收藏游戏、解谜游戏等)、 模拟类(例如:模拟沙盘游戏、模拟养成游戏、策略模拟游戏、城市建造模拟游戏、商业模拟游戏等)、角色扮演类和休闲类(例如:棋牌桌游游戏、休闲竞技游戏、音乐节奏游戏、换装养成游戏等)等。
上述第一朝向为上述虚拟游戏角色在游戏场景内的初始朝向。虚拟游戏角色在游戏场景中经常需要变换朝向,例如,行走、奔跑、驾驶载具过程中转向运动或者左右观望时,虚拟游戏角色的朝向变换。上述第一朝向为朝向变换前虚拟游戏角色的朝向。对应地,上述第二朝向为虚拟游戏角色在游戏场景内经过调整后的朝向。
上述虚拟镜头为上述游戏场景内预先设置的观察镜头。上述虚拟镜头所捕获的游戏场景区域(包括场景和该场景内显示的虚拟游戏角色、虚拟道具模型等)显示在上述图形用户界面内。上述第一朝向区域为虚拟游戏角色处于第一朝向时,游戏场景内虚拟镜头的朝向区域,也即第一朝向对应的初始朝向区域。
在本公开实施例中,虚拟镜头的朝向区域与虚拟游戏角色的朝向相对应。一种可选的实施方式为:虚拟镜头的朝向与虚拟游戏角色的朝向一致并跟随虚拟游戏角色的朝向变化,将虚拟游戏角色的显示方式设置为始终显示在图形用户界面中心点时,游戏场景内虚拟镜头的朝向区域为游戏场景的当前图形用户界面中央扇形区域。
当监测到虚拟游戏角色从第一朝向变化至第二朝向时,监测上述第二朝向是否超出上述虚拟镜头的第一朝向区域。如果虚拟游戏角色朝向变换后的第二朝向未超出虚拟镜头的第一朝向区域,则表示上述虚拟镜头的朝向无需调整。
如果虚拟游戏角色朝向变换后的第二朝向超出了虚拟镜头的第一朝向区域,则检测是否接收对目标控件执行的触控操作。上述目标控件为上述多个控件中指定的用于控制虚拟游戏角色执行目标动作的任一控件。上述目标动作可以是预先设置为“核心操作”的动作,例如,目标动作包括:普通攻击、移动、跳跃和释放技能等。对应地,多个控件包括:普通攻击按钮、移动摇杆、跳跃按钮、技能按钮等。
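One simple way to realize the notion of a "target control" in an implementation is a lookup table from on-screen control identifiers to the target action each one triggers; membership in the table then decides whether a touch counts as a core operation. The snippet below is only an illustrative sketch, and the identifier strings are invented for this example.

```python
# Hypothetical mapping from on-screen control IDs to the target action each one
# triggers; only touches on these controls count as "core" operations that may
# pull the virtual lens.
CORE_CONTROLS = {
    "btn_attack": "normal_attack",
    "joystick_move": "move",
    "btn_jump": "jump",
    "btn_skill_1": "cast_skill",
    "btn_skill_2": "cast_skill",
    "btn_skill_3": "cast_skill",
}


def is_core_touch(control_id: str) -> bool:
    """True if the touched control is one of the designated target controls."""
    return control_id in CORE_CONTROLS
```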
当虚拟游戏角色的第二朝向超出上述第一朝向区域,且接收到对目标控件执行的触控操作时,表示上述虚拟镜头的朝向待调整,此时基于上述第二朝向对上述虚拟镜头的朝向进行调整。在一种可选的实施方式中,将上述虚拟镜头的朝向调整至与上述第二朝向一致。
本公开实施例提供的方法通过虚拟镜头朝向的牵引辅助手段降低玩家在游戏中虚拟镜头朝向控制的操作难度。在游戏过程中,基于虚拟游戏角色的当前朝向是否脱离虚拟镜头朝向区域,以及基于虚拟游戏角色是否进行核心按键操作,判断是否触发虚拟镜头朝向的牵引辅助。
在本公开至少部分实施例中,首先获取虚拟游戏角色的第一朝向和游戏场景内虚拟镜头的第一朝向区域,其中,第一朝向为虚拟游戏角色在游戏场景内的初始朝向,第一朝向区域为第一朝向对应的初始朝向区域,进一步地,响应于虚拟游戏角色从第一朝向变化至第二朝向,检测第二朝向是否超出第一朝向区域,其中,第二朝向为虚拟游戏角色在游戏场景内经过调整后的朝向,进一步地,响应于第二朝向超出第一朝向区域,检测是否接收到对目标控件执行的触控操作,其中,目标控件为多个控件中的任一控件,目标控件用于控制虚拟游戏角色执行目标动作,在此基础上,响应对目标控件执行的触控操作,基于第二朝向调整虚拟镜头的朝向。由此,本公开提供的调整虚拟镜头的方法达到了基于虚拟游戏角色的朝向、虚拟镜头的朝向以及触控操作情况对虚拟镜头朝向进行自动地辅助牵引的目的,从而实现了降低游戏中虚拟镜头朝向控制难度、提升游戏体验的技术效果,进而解决了相关技术中虚拟镜头朝向调整操作难度大导致玩家游戏体验差的技术问题,并且降低了硬件设备的响应频率,有助于节约硬件设备的计算资源。
以下以角色扮演游戏中自由视角场景下为虚拟人物角色进行虚拟镜头朝向自动调整为例,对本公开实施例的技术方案进行进一步说明。
可选地,在步骤S21中,获取游戏场景内虚拟镜头的第一朝向区域,可以包括以下执行步骤:
步骤S211,获取虚拟镜头的第三朝向,其中,第三朝向为虚拟镜头在游戏场景内的初始朝向,且第三朝向与第一朝向相对应;
步骤S212,基于以第三朝向为基准线的预设夹角范围,确定第一朝向区域。
第三朝向与第一朝向相对应,第三朝向为虚拟镜头在游戏场景内的初始朝向,第一朝向为虚拟游戏角色在游戏场景内的初始朝向,也就是说,在一种可选的实施方式中,在默认状态下,虚拟镜头和 虚拟游戏角色在游戏场景内的初始朝向相对应,表现为,虚拟游戏角色的视野中心处于显示有游戏场景的图形用户界面的画面中心。
基于上述虚拟镜头的第三朝向和预设夹角范围,确定上述虚拟镜头的第一朝向区域。上述预设夹角范围由预设夹角参数和预设范围参数确定。例如,预设夹角范围为扇形区域时,预设夹角参数为扇形区域对应的圆心角,预设范围参数为扇形区域的半径。扇形区域的弦的中垂线方向为虚拟镜头的朝向。
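In code, such an orientation area can be represented by its reference orientation together with the preset angle parameter (the sector's central angle) and the preset range parameter (the sector's radius); whether an orientation or a scene point falls inside the area then reduces to an angle-difference test plus a distance test. The following Python sketch is a minimal rendering under these assumptions, with the 90-degree angle and 15-unit radius chosen purely for illustration.

```python
import math
from dataclasses import dataclass


def normalize_angle(deg: float) -> float:
    """Wrap an angle difference into the range (-180, 180]."""
    return (deg + 180.0) % 360.0 - 180.0


@dataclass
class OrientationArea:
    reference_yaw: float   # lens orientation: direction of the chord's perpendicular bisector
    central_angle: float   # preset angle parameter (sector central angle, degrees)
    radius: float          # preset range parameter (sector radius, scene units)

    def contains_direction(self, yaw: float) -> bool:
        """Is the given orientation inside the sector's angular span?"""
        return abs(normalize_angle(yaw - self.reference_yaw)) <= self.central_angle / 2.0

    def contains_point(self, cx: float, cy: float, px: float, py: float) -> bool:
        """Is the scene point (px, py) inside the sector anchored at (cx, cy)?"""
        dx, dy = px - cx, py - cy
        if math.hypot(dx, dy) > self.radius:
            return False
        return self.contains_direction(math.degrees(math.atan2(dy, dx)))


# First orientation area: a 90-degree sector centred on the initial lens yaw.
first_area = OrientationArea(reference_yaw=0.0, central_angle=90.0, radius=15.0)
assert first_area.contains_direction(30.0)        # still inside the area
assert not first_area.contains_direction(90.0)    # a 90-degree turn exceeds the area
```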
图3是根据本公开其中一实施例的一种可选的初始游戏场景的示意图,如图3所示,虚拟人物角色的朝向(如图3中的黑色箭头所示,相当于上述第一朝向)与场景虚拟镜头朝向(相当于上述第三朝向)一致,该虚拟镜头的朝向区域(相当于上述第一朝向区域)为图形用户界面中央的扇形区域,区域边界由该扇形区域的两条半径确定。上述虚拟镜头朝向为扇形区域对应的弦的中垂线方向。在自由视角场景下,上述虚拟镜头朝向被设置为始终吸附在屏幕中心。将虚拟人物角色在如图3所示的初始朝向时,图形用户界面中显示的游戏场景记为场景图像1。
在游戏进行过程中,当监测到虚拟人物角色向右转90度时,监测虚拟人物角色的调整后朝向是否超出如图3所示的扇形区域。
可选地,上述调整虚拟镜头的方法还可以包括以下执行步骤:
步骤S25,响应于第二朝向未超出第一朝向区域,控制虚拟镜头保持在第三朝向。
仍然以角色扮演游戏中自由视角场景下为虚拟人物角色进行虚拟镜头朝向自动调整为例,当虚拟人物角色的调整后朝向未超出如图3所示的扇形区域时,表示图形用户界面当前显示的场景图像无需调整,将虚拟镜头的朝向保持原本朝向,该原本朝向为虚拟人物角色的朝向调整前对应的虚拟镜头的朝向。
图4是根据本公开其中一实施例的一种可选的虚拟游戏角色朝向变化的示意图,如图4所示,在游戏进行过程中,虚拟人物角色在游戏场景中执行游戏任务,经常会涉及虚拟人物角色朝向变换的情况,此时,当虚拟人物角色的调整后朝向超出如图3所示的扇形区域时,如图4所示,黑色箭头移出扇形区域(图4中以虚拟人物模型向右转90度为例)。此时,图形用户界面中仍然显示场景图像1,然而,按照现实世界客观规律,当虚拟人物角色向右转90度后观察到的场景图像也应当以虚拟人物角色为中心向右偏转90度。因此,需要对虚拟镜头朝向进行调整,使得玩家能够从图形用户界面中获取虚拟人物角色的调整后朝向所能观察到的游戏场景画面。
可选地,在步骤S24中,响应对目标控件执行的触控操作,基于第二朝向调整虚拟镜头的朝向,可以包括以下执行步骤:
步骤S241,响应在预设时长内接收到针对目标控件的触控操作,基于第二朝向调整虚拟镜头的朝向。
可选地,在步骤S241中,基于第二朝向调整虚拟镜头的朝向,可以包括以下执行步骤:
步骤S2411,基于第二朝向控制虚拟镜头按照预设速率进行转动,直至虚拟镜头转动至第四朝向,其中,第四朝向为虚拟镜头在游戏场景内经过调整后的朝向,且第四朝向与第二朝向相对应。
上述第四朝向由虚拟游戏角色的调整后朝向(也即第二朝向)确定。在一种可选的实施方式中,第四朝向与第二朝向相对应,第四朝向指向游戏场景中与图形用户界面的中心位置对应的位置以使得虚拟相机对应的朝向区域位于图形用户界面所显示的区域的中心位置。上述预设速率为虚拟镜头进行朝向变换的角速率,该预设速率可以由技术人员指定,也可以由玩家通过游戏偏好设置设定。
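A per-frame sketch of this rotation is given below: the lens yaw advances toward the fourth orientation by at most preset_rate * dt degrees each frame until it arrives, the lens height is never modified, and the orientation area is then re-centred on the new yaw. The rate, frame interval, and height value are assumptions chosen only for the example.

```python
def normalize_angle(deg: float) -> float:
    """Wrap an angle difference into the range (-180, 180]."""
    return (deg + 180.0) % 360.0 - 180.0


def rotate_lens_step(lens_yaw: float, target_yaw: float, preset_rate: float, dt: float):
    """Advance the lens yaw toward target_yaw by at most preset_rate * dt degrees."""
    delta = normalize_angle(target_yaw - lens_yaw)
    max_step = preset_rate * dt
    if abs(delta) <= max_step:
        return target_yaw, True  # the fourth orientation has been reached
    return normalize_angle(lens_yaw + (max_step if delta > 0 else -max_step)), False


# Example: pull the lens from 0 to the fourth orientation at 90 degrees,
# at 120 degrees per second and 60 frames per second; the lens height stays fixed.
lens_yaw, lens_height = 0.0, 1.7
target_yaw, preset_rate, dt = 90.0, 120.0, 1.0 / 60.0
done = False
while not done:
    lens_yaw, done = rotate_lens_step(lens_yaw, target_yaw, preset_rate, dt)
second_area_reference = lens_yaw  # re-centre the orientation area on the new yaw
assert lens_yaw == 90.0 and lens_height == 1.7
```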
可选地,上述调整虚拟镜头的方法还可以包括以下执行步骤:
步骤S26,基于以第四朝向为基准线的预设夹角范围,确定第二朝向区域,以基于第二朝向区域对第一朝向区域进行更新。
上述第四朝向为调整后的虚拟镜头朝向,基于该第四朝向确定预设夹角范围,得到第二朝向区域。例如,当预设夹角范围为以第四朝向为中心线的扇形区域时,按照第四朝向对调整前的虚拟镜头朝向对应的扇形区域进行调整,得到上述第二朝向区域,该第二朝向区域对应的扇形区域的弦的中垂线方向与第四朝向一致。
容易理解的是,基于第二朝向区域对第一朝向区域进行更新的过程中,图形用户界面上的视觉表现变化为:图形用户界面所显示的中心区域由游戏场景的第一朝向区域切换为第二朝向区域。
仍然以角色扮演游戏中自由视角场景下为虚拟人物角色进行虚拟镜头朝向自动调整为例,图5是根据本公开其中一实施例的一种可选的虚拟镜头朝向调整后的游戏场景的示意图,如图5所示,此时,如果操控该虚拟人物角色的玩家对图形用户界面显示的核心控件进行触控操作时,虚拟镜头的朝向区域将以平缓的速率变换,重新吸附至变换后的虚拟人物模型朝向(即黑色箭头)上。如图5所示,上述虚拟镜头朝向的调整结果体现在图形用户界面上为:当前显示的场景画面由场景画面1变换至场景画面2,其中,场景画面2为以虚拟人物角色为中心由场景画面1向右偏转90度得到的场景画面。例如,虚拟人物角色从面向正北方转向面向正东方后,图形用户界面中显示的场景画面应当从正北方场景画面切换为正东方场景画面。
本例中,上述核心控件可以包括图形用户界面中显示的释放技能控件和攻击控件,释放技能控件包括:技能1按钮、技能2按钮和技能3按钮。也就是说,当玩家操控该虚拟人物角色转向至调整后朝向超出如图3所示的扇形区域后,进一步触控上述技能1按钮、技能2按钮、技能3按钮和攻击控件中的至少一个时,按照本公开实施例提供的上述方法调整虚拟镜头的朝向。
同理,当在游戏场景中上述虚拟人物角色朝向偏转任意角度(通常为0至360度之间)时,均可以采用上述方法,将虚拟镜头朝向调整至与虚拟人物角色的调整后朝向相对应的角度,也即,将图形用户界面当前显示的场景画面调整至虚拟人物角色的调整后朝向相对应的场景画面。
可选地,上述调整虚拟镜头的方法中,在基于第二朝向调整虚拟镜头的朝向的过程中,虚拟镜头的高度保持不变。
在虚拟镜头的朝向跟随虚拟游戏角色的朝向变换的过程中,为了保证图形用户界面中显示的场景画面的变换流畅性,在基于虚拟游戏角色的调整后朝向(即第二朝向)调整虚拟镜头的朝向的过程中,虚拟镜头的高度保持不变,也即,虚拟游戏角色的视线高度不变,对应地,在图形用户界面显示的场景画面中,同一高度显示的虚拟模型在游戏场景中的游戏空间高度坐标也一致。
可选地,上述调整虚拟镜头的方法还可以包括以下执行步骤:
步骤S27,响应于在预设时长内未接收到对目标控件执行的触控操作,控制虚拟游戏角色保持在第二朝向,以及控制虚拟镜头保持在第一朝向区域。
仍然以角色扮演游戏中自由视角场景下为虚拟人物角色进行虚拟镜头朝向自动调整为例,如图4所示,当检测到当虚拟人物角色的调整后朝向超出如图3所示的扇形区域,且玩家对图形用户界面中显示的多个控件中的目标控件执行了触控操作时,对虚拟镜头的朝向进行调整,得到如图5所示的场景画面。然而,当检测到当虚拟人物角色的调整后朝向超出如图3所示的扇形区域,但玩家在预设时长内未对图形用户界面中显示的多个控件中的目标控件执行触控操作时,图形用户界面的显示内容保持为如图4所示的场景画面,也即,虚拟人物模型的朝向仍然为调整后朝向(相当于上述第二朝向),虚拟镜头仍然保持在调整前的朝向区域(相当于上述第一朝向区域)。
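The waiting behaviour described above can be implemented with a simple deadline that starts when the character's orientation first leaves the first orientation area: if no touch on a target control arrives before the preset duration elapses, nothing is pulled, the character stays at the second orientation, and the lens stays in the first orientation area. The sketch below assumes such a timer-based design; the class and field names are illustrative.

```python
class PullAssistWindow:
    """Tracks whether a core touch arrives within the preset duration."""

    def __init__(self, preset_duration: float = 2.0):
        self.preset_duration = preset_duration
        self.deadline = None  # set when the second orientation leaves the area

    def on_orientation_left_area(self, now: float) -> None:
        """Start the window the first time the orientation exceeds the area."""
        if self.deadline is None:
            self.deadline = now + self.preset_duration

    def on_core_touch(self, now: float) -> bool:
        """True if the touch arrived in time, i.e. the lens should be pulled."""
        if self.deadline is not None and now <= self.deadline:
            self.deadline = None
            return True
        return False

    def expired(self, now: float) -> bool:
        """True once the window has lapsed: the character keeps the second
        orientation and the lens keeps the first orientation area."""
        return self.deadline is not None and now > self.deadline


window = PullAssistWindow(preset_duration=2.0)
window.on_orientation_left_area(now=10.0)
assert not window.expired(now=11.0)   # still waiting for a touch on a target control
assert window.expired(now=12.5)       # no touch arrived: nothing is pulled
```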
由此,本公开实施例提供的技术方案完成了对虚拟镜头朝向的自动地辅助牵引。由此,在游戏过程中,避免玩家进行在游戏操作的同时需要分配手指控制虚拟镜头朝向的问题。
容易注意到的是,在虚拟三维游戏场景中,尤其是在自由视角游戏场景中,本公开实施例提供的虚拟镜头朝向变换的判断条件能够适用于大部分游戏操作过程,从而实现在游戏场景中对虚拟镜头朝向的辅助牵引,提升玩家操作的流畅性。此外,对于移动端游戏来说,本公开实施例提供的调整虚拟镜头的方法能够实现虚拟镜头朝向控制和游戏操作的同步性,也即,降低了玩家在进行游戏操作的同时对虚拟镜头朝向进行控制的操作难度,降低了玩家操作成本,使得玩家在游戏过程中得到更好的沉浸式体验。
容易理解的是,本公开实施例提供的技术方案的重点在于:通过虚拟游戏角色的朝向与虚拟镜头朝向区域之间的位置关系,以及虚拟游戏角色对应的多个控件的触控操作,触发对虚拟镜头朝向的自动调整。
通过以上的实施方式的描述,本领域的技术人员可以清楚地了解到根据上述实施例的方法可借助软件加必需的通用硬件平台的方式来实现,当然也可以通过硬件,但很多情况下前者是更佳的实施方式。基于这样的理解,本公开的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质(如磁碟、光盘)中,包括若干指令用以使得一台终端设备(可以是手机,计算机,服务器,或者网络设备等)执行本公开各个实施例的方法。
在本实施例中还提供了一种调整虚拟镜头的装置,该装置用于实现上述实施例及优选实施方式,已经进行过说明的不再赘述。如以下所使用的,术语“模块”可以实现预定功能的软件和/或硬件的组合。尽管以下实施例所描述的装置较佳地以软件来实现,但是硬件,或者软件和硬件的组合的实现也是可能并被构想的。
图6是根据本公开其中一实施例的一种调整虚拟镜头的装置的结构框图,通过终端设备提供一图形用户界面,图形用户界面所显示的内容至少部分地包含一游戏场景和多个控件,游戏场景的显示内容至少部分地包含一虚拟游戏角色,如图6所示,该装置包括:获取模块601,用于获取虚拟游戏角色的第一朝向和游戏场景内虚拟镜头的第一朝向区域,其中,第一朝向为虚拟游戏角色在游戏场景内的初始朝向,第一朝向区域为第一朝向对应的初始朝向区域;第一检测模块602,用于响应于虚拟游戏角色从第一朝向变化至第二朝向,检测第二朝向是否超出第一朝向区域,其中,第二朝向为虚拟游戏角色在游戏场景内经过调整后的朝向;第二检测模块603,用于响应于第二朝向超出第一朝向区域,检测是否接收到对目标控件执行的触控操作,其中,目标控件为多个控件中的任一控件,目标控件用于控制虚拟游戏角色执行目标动作;调整模块604,用于响应对目标控件执行的触控操作,基于第二朝向调整虚拟镜头的朝向。
该装置首先获取虚拟游戏角色的第一朝向和游戏场景内虚拟镜头的第一朝向区域,其中,第一朝向为虚拟游戏角色在游戏场景内的初始朝向,第一朝向区域为第一朝向对应的初始朝向区域,进一步地,响应于虚拟游戏角色从第一朝向变化至第二朝向,检测第二朝向是否超出第一朝向区域,其中,第二朝向为虚拟游戏角色在游戏场景内经过调整后的朝向,进一步地,响应于第二朝向超出第一朝向区域,检测是否接收到对目标控件执行的触控操作,其中,目标控件为多个控件中的任一控件,目标控件用于控制虚拟游戏角色执行目标动作,在此基础上,响应对目标控件执行的触控操作,基于第二朝向调整虚拟镜头的朝向。由此,本公开提供的调整虚拟镜头的方法达到了基于虚拟游戏角色的朝向、虚拟镜头的朝向以及触控操作情况对虚拟镜头朝向进行自动地辅助牵引的目的,从而实现了降低游戏中虚拟镜头朝向控制难度、提升游戏体验的技术效果,进而解决了相关技术中虚拟镜头朝向调整操作难度大导致玩家游戏体验差的技术问题,并且降低了硬件设备的响应频率,有助于节约硬件设备的计算资源。
可选地,上述获取模块601,还用于:获取虚拟镜头的第三朝向,其中,第三朝向为虚拟镜头在游戏场景内的初始朝向,且第三朝向与第一朝向相对应;基于以第三朝向为基准线的预设夹角范围,确定第一朝向区域。
可选地,图7是根据本公开其中一实施例的一种可选的调整虚拟镜头的装置的结构框图,如图7所示,该装置除包括图6所示的所有模块外,还包括:保持模块605,用于响应于第二朝向未超出第一朝向区域,控制虚拟镜头保持在第三朝向。
可选地,上述调整模块604,还用于:响应在预设时长内接收到针对目标控件的触控操作,基于第二朝向调整虚拟镜头的朝向。
可选地,上述调整模块604,还用于:基于第二朝向控制虚拟镜头按照预设速率进行转动,直至虚拟镜头转动至第四朝向,其中,第四朝向为虚拟镜头在游戏场景内经过调整后的朝向,且第四朝向与第二朝向相对应。
可选地,图8是根据本公开其中一实施例的另一种可选的调整虚拟镜头的装置的结构框图,如图8所示,该装置除包括图7所示的所有模块外,还包括:确定模块606,用于基于以第四朝向为基准线的预设夹角范围,确定第二朝向区域,以基于第二朝向区域对第一朝向区域进行更新。
可选地,在调整虚拟镜头的装置中,在基于第二朝向调整虚拟镜头的朝向的过程中,虚拟镜头的高度保持不变。
可选地,图9是根据本公开其中一实施例的另一种可选的调整虚拟镜头的装置的结构框图,如图9所示,该装置除包括图8所示的所有模块外,还包括:触控模块607,用于响应于在预设时长内未接收到对目标控件执行的触控操作,控制虚拟游戏角色保持在第二朝向,以及控制虚拟镜头保持在第一朝向区域。
需要说明的是,上述各个模块是可以通过软件或硬件来实现的,对于后者,可以通过以下方式实现,但不限于此:上述模块均位于同一处理器中;或者,上述各个模块以任意组合的形式分别位于不 同的处理器中。
本公开的实施例还提供了一种计算机可读存储介质,该计算机可读存储介质中存储有计算机程序,其中,该计算机程序被设置为运行时执行上述任一项方法实施例中的步骤。
可选地,在本实施例中,上述计算机可读存储介质可以包括但不限于:U盘、只读存储器(Read-Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、移动硬盘、磁碟或者光盘等各种可以存储计算机程序的介质。
可选地,在本实施例中,上述计算机可读存储介质可以位于计算机网络中计算机终端群中的任意一个计算机终端中,或者位于移动终端群中的任意一个移动终端中。
可选地,在本实施例中,上述计算机可读存储介质可以被设置为存储用于执行以下步骤的计算机程序:
S1,获取虚拟游戏角色的第一朝向和游戏场景内虚拟镜头的第一朝向区域,其中,第一朝向为虚拟游戏角色在游戏场景内的初始朝向,第一朝向区域为第一朝向对应的初始朝向区域;
S2,响应于虚拟游戏角色从第一朝向变化至第二朝向,检测第二朝向是否超出第一朝向区域,其中,第二朝向为虚拟游戏角色在游戏场景内经过调整后的朝向;
S3,响应于第二朝向超出第一朝向区域,检测是否接收到对目标控件执行的触控操作,其中,目标控件为多个控件中的任一控件,目标控件用于控制虚拟游戏角色执行目标动作;
S4,响应对目标控件执行的触控操作,基于第二朝向调整虚拟镜头的朝向。
可选地,上述计算机可读存储介质还被设置为存储用于执行以下步骤的程序代码:获取虚拟镜头的第三朝向,其中,第三朝向为虚拟镜头在游戏场景内的初始朝向,且第三朝向与第一朝向相对应;基于以第三朝向为基准线的预设夹角范围,确定第一朝向区域。
可选地,上述计算机可读存储介质还被设置为存储用于执行以下步骤的程序代码:响应于第二朝向未超出第一朝向区域,控制虚拟镜头保持在第三朝向。
可选地,上述计算机可读存储介质还被设置为存储用于执行以下步骤的程序代码:响应在预设时长内接收到针对目标控件的触控操作,基于第二朝向调整虚拟镜头的朝向。
可选地,上述计算机可读存储介质还被设置为存储用于执行以下步骤的程序代码:基于第二朝向控制虚拟镜头按照预设速率进行转动,直至虚拟镜头转动至第四朝向,其中,第四朝向为虚拟镜头在游戏场景内经过调整后的朝向,且第四朝向与第二朝向相对应。
可选地,上述计算机可读存储介质还被设置为存储用于执行以下步骤的程序代码:基于以第四朝向为基准线的预设夹角范围,确定第二朝向区域,以基于第二朝向区域对第一朝向区域进行更新。
可选地,上述计算机可读存储介质还被设置为存储用于执行以下步骤的程序代码:在基于第二朝向调整虚拟镜头的朝向的过程中,虚拟镜头的高度保持不变。
可选地,上述计算机可读存储介质还被设置为存储用于执行以下步骤的程序代码:响应于在预设时长内未接收到对目标控件执行的触控操作,控制虚拟游戏角色保持在第二朝向,以及控制虚拟镜头保持在第一朝向区域。
在上述实施例的计算机可读存储介质中,提供了一种实现调整虚拟镜头的方法的技术方案。首先获取虚拟游戏角色的第一朝向和游戏场景内虚拟镜头的第一朝向区域,其中,第一朝向为虚拟游戏角色在游戏场景内的初始朝向,第一朝向区域为第一朝向对应的初始朝向区域,进一步地,响应于虚拟游戏角色从第一朝向变化至第二朝向,检测第二朝向是否超出第一朝向区域,其中,第二朝向为虚拟游戏角色在游戏场景内经过调整后的朝向,进一步地,响应于第二朝向超出第一朝向区域,检测是否接收到对目标控件执行的触控操作,其中,目标控件为多个控件中的任一控件,目标控件用于控制虚拟游戏角色执行目标动作,在此基础上,响应对目标控件执行的触控操作,基于第二朝向调整虚拟镜头的朝向。由此,本公开提供的调整虚拟镜头的方法达到了基于虚拟游戏角色的朝向、虚拟镜头的朝向以及触控操作情况对虚拟镜头朝向进行自动地辅助牵引的目的,从而实现了降低游戏中虚拟镜头朝向控制难度、提升游戏体验的技术效果,进而解决了相关技术中虚拟镜头朝向调整操作难度大导致玩家游戏体验差的技术问题,并且降低了硬件设备的响应频率,有助于节约硬件设备的计算资源。
通过以上的实施方式的描述,本领域的技术人员易于理解,这里描述的示例实施方式可以通过软件实现,也可以通过软件结合必要的硬件的方式来实现。因此,根据本公开实施方式的技术方案可以 以软件产品的形式体现出来,该软件产品可以存储在一个计算机可读存储介质(可以是CD-ROM,U盘,移动硬盘等)中或网络上,包括若干指令以使得一台计算设备(可以是个人计算机、服务器、终端装置、或者网络设备等)执行根据本公开实施方式的方法。
在本公开的示例性实施例中,计算机可读存储介质上存储有能够实现本实施例上述方法的程序产品。在一些可能的实施方式中,本公开实施例的各个方面还可以实现为一种程序产品的形式,其包括程序代码,当所述程序产品在终端设备上运行时,所述程序代码用于使所述终端设备执行本实施例上述“示例性方法”部分中描述的根据本公开各种示例性实施方式的步骤。
根据本公开的实施方式的用于实现上述方法的程序产品,其可以采用便携式紧凑盘只读存储器(CD-ROM)并包括程序代码,并可以在终端设备,例如个人电脑上运行。然而,本公开实施例的程序产品不限于此,在本公开实施例中,计算机可读存储介质可以是任何包含或存储程序的有形介质,该程序可以被指令执行系统、装置或者器件使用或者与其结合使用。
上述程序产品可以采用一个或多个计算机可读介质的任意组合。该计算机可读存储介质例如可以为但不限于电、磁、光、电磁、红外线、或半导体的系统、装置或器件,或者任意以上的组合。计算机可读存储介质的更具体的例子(非穷举的列举)包括:具有一个或多个导线的电连接、便携式盘、硬盘、随机存取存储器(RAM)、只读存储器(ROM)、可擦式可编程只读存储器(EPROM或闪存)、光纤、便携式紧凑盘只读存储器(CD-ROM)、光存储器件、磁存储器件、或者上述的任意合适的组合。
需要说明的是,计算机可读存储介质上包含的程序代码可以用任何适当的介质传输,包括但不限于无线、有线、光缆、RF等等,或者上述的任意合适的组合。
本公开的实施例还提供了一种电子装置,包括存储器和处理器,该存储器中存储有计算机程序,该处理器被设置为运行计算机程序以执行上述任一项方法实施例中的步骤。
可选地,上述电子装置还可以包括传输设备以及输入输出设备,其中,该传输设备和上述处理器连接,该输入输出设备和上述处理器连接。
可选地,在本实施例中,上述处理器可以被设置为通过计算机程序执行以下步骤:
S1,获取虚拟游戏角色的第一朝向和游戏场景内虚拟镜头的第一朝向区域,其中,第一朝向为虚拟游戏角色在游戏场景内的初始朝向,第一朝向区域为第一朝向对应的初始朝向区域;
S2,响应于虚拟游戏角色从第一朝向变化至第二朝向,检测第二朝向是否超出第一朝向区域,其中,第二朝向为虚拟游戏角色在游戏场景内经过调整后的朝向;
S3,响应于第二朝向超出第一朝向区域,检测是否接收到对目标控件执行的触控操作,其中,目标控件为多个控件中的任一控件,目标控件用于控制虚拟游戏角色执行目标动作;
S4,响应对目标控件执行的触控操作,基于第二朝向调整虚拟镜头的朝向。
可选地,上述处理器还可以被设置为通过计算机程序执行以下步骤:获取虚拟镜头的第三朝向,其中,第三朝向为虚拟镜头在游戏场景内的初始朝向,且第三朝向与第一朝向相对应;基于以第三朝向为基准线的预设夹角范围,确定第一朝向区域。
可选地,上述处理器还可以被设置为通过计算机程序执行以下步骤:响应于第二朝向未超出第一朝向区域,控制虚拟镜头保持在第三朝向。
可选地,上述处理器还可以被设置为通过计算机程序执行以下步骤:响应在预设时长内接收到针对目标控件的触控操作,基于第二朝向调整虚拟镜头的朝向。
可选地,上述处理器还可以被设置为通过计算机程序执行以下步骤:基于第二朝向控制虚拟镜头按照预设速率进行转动,直至虚拟镜头转动至第四朝向,其中,第四朝向为虚拟镜头在游戏场景内经过调整后的朝向,且第四朝向与第二朝向相对应。
可选地,上述处理器还可以被设置为通过计算机程序执行以下步骤:基于以第四朝向为基准线的预设夹角范围,确定第二朝向区域,以基于第二朝向区域对第一朝向区域进行更新。
可选地,上述处理器还可以被设置为通过计算机程序执行以下步骤:在基于第二朝向调整虚拟镜头的朝向的过程中,虚拟镜头的高度保持不变。
可选地,上述处理器还可以被设置为通过计算机程序执行以下步骤:响应于在预设时长内未接收到对目标控件执行的触控操作,控制虚拟游戏角色保持在第二朝向,以及控制虚拟镜头保持在第一朝向区域。
在上述实施例的电子装置中,提供了一种实现调整虚拟镜头的方法的技术方案。首先获取虚拟游戏角色的第一朝向和游戏场景内虚拟镜头的第一朝向区域,其中,第一朝向为虚拟游戏角色在游戏场景内的初始朝向,第一朝向区域为第一朝向对应的初始朝向区域,进一步地,响应于虚拟游戏角色从第一朝向变化至第二朝向,检测第二朝向是否超出第一朝向区域,其中,第二朝向为虚拟游戏角色在游戏场景内经过调整后的朝向,进一步地,响应于第二朝向超出第一朝向区域,检测是否接收到对目标控件执行的触控操作,其中,目标控件为多个控件中的任一控件,目标控件用于控制虚拟游戏角色执行目标动作,在此基础上,响应对目标控件执行的触控操作,基于第二朝向调整虚拟镜头的朝向。由此,本公开提供的调整虚拟镜头的方法达到了基于虚拟游戏角色的朝向、虚拟镜头的朝向以及触控操作情况对虚拟镜头朝向进行自动地辅助牵引的目的,从而实现了降低游戏中虚拟镜头朝向控制难度、提升游戏体验的技术效果,进而解决了相关技术中虚拟镜头朝向调整操作难度大导致玩家游戏体验差的技术问题,并且降低了硬件设备的响应频率,有助于节约硬件设备的计算资源。
图10是根据本公开其中一实施例的一种电子装置的示意图。如图10所示,电子装置1000仅仅是一个示例,不应对本公开实施例的功能和使用范围带来任何限制。
如图10所示,电子装置1000以通用计算设备的形式表现。电子装置1000的组件可以包括但不限于:上述至少一个处理器1010、上述至少一个存储器1020、连接不同系统组件(包括存储器1020和处理器1010)的总线1030和显示器1040。
其中,上述存储器1020存储有程序代码,所述程序代码可以被处理器1010执行,使得处理器1010执行本公开实施例的上述方法部分中描述的根据本公开各种示例性实施方式的步骤。
存储器1020可以包括易失性存储单元形式的可读介质,例如随机存取存储单元(RAM)10201和/或高速缓存存储单元10202,还可以进一步包括只读存储单元(ROM)10203,还可包括非易失性存储器,如一个或者多个磁性存储装置、闪存、或者其他非易失性固态存储器。
在一些实例中,存储器1020还可以包括具有一组(至少一个)程序模块10205的程序/实用工具10204,这样的程序模块10205包括但不限于:操作系统、一个或者多个应用程序、其它程序模块以及程序数据,这些示例中的每一个或某种组合中可能包括网络环境的实现。存储器1020可进一步包括相对于处理器1010远程设置的存储器,这些远程存储器可以通过网络连接至电子装置1000。上述网络的实例包括但不限于互联网、企业内部网、局域网、移动通信网及其组合。
总线1030可以为表示几类总线结构中的一种或多种,包括存储单元总线或者存储单元控制器、外围总线、图形加速端口、处理器1010或者使用多种总线结构中的任意总线结构的局域总线。
显示器1040可以例如触摸屏式的液晶显示器(Liquid Crystal Display,LCD),该液晶显示器可使得用户能够与电子装置1000的用户界面进行交互。
可选地,电子装置1000也可以与一个或多个外部设备1100(例如键盘、指向设备、蓝牙设备等)通信,还可与一个或者多个使得用户能与该电子装置1000交互的设备通信,和/或与使得该电子装置1000能与一个或多个其它计算设备进行通信的任何设备(例如路由器、调制解调器等等)通信。这种通信可以通过输入/输出(I/O)接口1050进行。并且,电子装置1000还可以通过网络适配器1060与一个或者多个网络(例如局域网(Local Area Network,LAN),广域网(Wide Area Network,WAN)和/或公共网络,例如因特网)通信。如图10所示,网络适配器1060通过总线1030与电子装置1000的其它模块通信。应当明白,尽管图10中未示出,可以结合电子装置1000使用其它硬件和/或软件模块,可以包括但不限于:微代码、设备驱动器、冗余处理单元、外部磁盘驱动阵列、磁盘阵列(Redundant Arrays of Independent Disks,RAID)系统、磁带驱动器以及数据备份存储系统等。
上述电子装置1000还可以包括:键盘、光标控制设备(如鼠标)、输入/输出接口(I/O接口)、网络接口、电源和/或相机。
本领域普通技术人员可以理解,图10所示的结构仅为示意,其并不对上述电子装置的结构造成限定。例如,电子装置1000还可包括比图10中所示更多或者更少的组件,或者具有与图10所示不同的配置。存储器1020可用于存储计算机程序及对应的数据,如本公开实施例中的调整虚拟镜头的方法对应的计算机程序及对应的数据。处理器1010通过运行存储在存储器1020内的计算机程序,从而执行各种功能应用以及数据处理,即实现上述的调整虚拟镜头的方法。
上述本公开实施例序号仅仅为了描述,不代表实施例的优劣。
在本公开的上述实施例中,对各个实施例的描述都各有侧重,某个实施例中没有详述的部分,可以参见其他实施例的相关描述。
在本公开所提供的几个实施例中,应该理解到,所揭露的技术内容,可通过其它的方式实现。其中,以上所描述的装置实施例仅仅是示意性的,例如所述单元的划分,可以为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,单元或模块的间接耦合或通信连接,可以是电性或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本公开各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本公开的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可为个人计算机、服务器或者网络设备等)执行本公开各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、只读存储器(ROM)、随机存取存储器(RAM)、移动硬盘、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述仅是本公开的优选实施方式,应当指出,对于本技术领域的普通技术人员来说,在不脱离本公开原理的前提下,还可以做出若干改进和润饰,这些改进和润饰也应视为本公开的保护范围。

Claims (10)

  1. 一种调整虚拟镜头的方法,通过终端设备提供一图形用户界面,所述图形用户界面所显示的内容至少部分地包含一游戏场景和多个控件,所述游戏场景的显示内容至少部分地包含一虚拟游戏角色,所述方法包括:
    获取所述虚拟游戏角色的第一朝向和所述游戏场景内虚拟镜头的第一朝向区域,其中,所述第一朝向为所述虚拟游戏角色在所述游戏场景内的初始朝向,所述第一朝向区域为所述第一朝向对应的初始朝向区域;
    响应于所述虚拟游戏角色从所述第一朝向变化至第二朝向,检测所述第二朝向是否超出所述第一朝向区域,其中,所述第二朝向为所述虚拟游戏角色在所述游戏场景内经过调整后的朝向;
    响应于所述第二朝向超出所述第一朝向区域,检测是否接收到对目标控件执行的触控操作,其中,所述目标控件为所述多个控件中的任一控件,所述目标控件用于控制所述虚拟游戏角色执行目标动作;
    响应对所述目标控件执行的触控操作,基于所述第二朝向调整所述虚拟镜头的朝向。
  2. 根据权利要求1所述的方法,其中,获取所述游戏场景内所述虚拟镜头的所述第一朝向区域包括:
    获取所述虚拟镜头的第三朝向,其中,所述第三朝向为所述虚拟镜头在所述游戏场景内的初始朝向,且所述第三朝向与所述第一朝向相对应;
    基于以所述第三朝向为基准线的预设夹角范围,确定所述第一朝向区域。
  3. 根据权利要求2所述的方法,其中,所述方法还包括:
    响应于所述第二朝向未超出所述第一朝向区域,控制所述虚拟镜头保持在所述第三朝向。
  4. 根据权利要求1所述的方法,其中,响应对所述目标控件执行的触控操作,基于所述第二朝向调整所述虚拟镜头的朝向,包括:
    响应在预设时长内接收到针对所述目标控件的触控操作,基于所述第二朝向调整所述虚拟镜头的朝向。
  5. 根据权利要求1所述的方法,其中,基于所述第二朝向调整所述虚拟镜头的朝向包括:
    基于所述第二朝向控制所述虚拟镜头按照预设速率进行转动,直至所述虚拟镜头转动至第四朝向,其中,所述第四朝向为所述虚拟镜头在所述游戏场景内经过调整后的朝向,且所述第四朝向与所述第二朝向相对应。
  6. 根据权利要求5所述的方法,其中,所述方法还包括:
    基于以所述第四朝向为基准线的预设夹角范围,确定第二朝向区域,以基于所述第二朝向区域对所述第一朝向区域进行更新。
  7. 根据权利要求2所述的方法,其中,在基于所述第二朝向调整所述虚拟镜头的朝向的过程中,所述虚拟镜头的高度保持不变。
  8. 一种调整虚拟镜头的装置,通过终端设备提供一图形用户界面,所述图形用户界面所显示的内容至少部分地包含一游戏场景和多个控件,所述游戏场景的显示内容至少部分地包含一虚拟游戏角色,所述装置包括:
    获取模块,用于获取所述虚拟游戏角色的第一朝向和所述游戏场景内虚拟镜头的第一朝向区域,其中,所述第一朝向为所述虚拟游戏角色在所述游戏场景内的初始朝向,所述第一朝向区域为所述第一朝向对应的初始朝向区域;
    第一检测模块,用于响应于所述虚拟游戏角色从所述第一朝向变化至第二朝向,检测所述第二朝向是否超出所述第一朝向区域,其中,所述第二朝向为所述虚拟游戏角色在所述游戏场景内经过调整后的朝向;
    第二检测模块,用于响应于所述第二朝向超出所述第一朝向区域,检测是否接收到对目标控件执行的触控操作,其中,所述目标控件为所述多个控件中的任一控件,所述目标控件用于控制所述虚拟游戏角色执行目标动作;
    调整模块,用于响应对所述目标控件执行的触控操作,基于所述第二朝向调整所述虚拟镜头的朝向。
  9. 一种计算机可读存储介质,所述计算机可读存储介质中存储有计算机程序,其中,所述计算机程序被设置为被处理器运行时执行权利要求1至7任一项中所述的调整虚拟镜头的方法。
  10. 一种电子装置,包括存储器和处理器,所述存储器中存储有计算机程序,所述处理器被设置为运行所述计算机程序以执行权利要求1至7任一项中所述的调整虚拟镜头的方法。
PCT/CN2023/117086 2023-04-26 2023-09-05 调整虚拟镜头的方法、装置、存储介质及电子装置 WO2024221693A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202310494541.3A CN116549966A (zh) 2023-04-26 2023-04-26 调整虚拟镜头的方法、装置、存储介质及电子装置
CN202310494541.3 2023-04-26

Publications (1)

Publication Number Publication Date
WO2024221693A1 (zh)

Family

ID=87490970

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/117086 WO2024221693A1 (zh) 2023-04-26 2023-09-05 调整虚拟镜头的方法、装置、存储介质及电子装置

Country Status (2)

Country Link
CN (1) CN116549966A (zh)
WO (1) WO2024221693A1 (zh)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116549966A (zh) * 2023-04-26 2023-08-08 网易(杭州)网络有限公司 调整虚拟镜头的方法、装置、存储介质及电子装置

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110215690A (zh) * 2019-07-11 2019-09-10 网易(杭州)网络有限公司 游戏场景中的视角切换方法、装置及电子设备
CN112791404A (zh) * 2021-01-12 2021-05-14 网易(杭州)网络有限公司 游戏中虚拟对象的控制方法、装置以及触控终端
CN112933592A (zh) * 2021-01-26 2021-06-11 网易(杭州)网络有限公司 游戏中的信息处理方法、装置、电子设备及存储介质
CN116549966A (zh) * 2023-04-26 2023-08-08 网易(杭州)网络有限公司 调整虚拟镜头的方法、装置、存储介质及电子装置

Also Published As

Publication number Publication date
CN116549966A (zh) 2023-08-08

Similar Documents

Publication Publication Date Title
US11623142B2 (en) Data processing method and mobile terminal
JP7498362B2 (ja) ゲームにおける仮想オブジェクトの移動の制御方法、装置、電子デバイス及び記憶媒体
US20220334716A1 (en) Adaptive display method and apparatus for virtual scene, electronic device, storage medium, and computer program product
JP2024105265A (ja) 仮想コンソールゲーム用コントローラ
CN113908550A (zh) 虚拟角色控制方法、非易失性存储介质及电子装置
WO2024221693A1 (zh) 调整虚拟镜头的方法、装置、存储介质及电子装置
WO2022156629A1 (zh) 虚拟对象的控制方法、装置、电子设备、存储介质及计算机程序产品
JP2017153772A (ja) 情報処理装置およびゲームプログラム
WO2024007675A1 (zh) 虚拟对象的切换方法、装置、存储介质及电子装置
Pelegrino et al. Creating and designing customized and dynamic game interfaces using smartphones and touchscreen
WO2024001191A1 (zh) 游戏中的操作方法、装置、非易失性存储介质和电子装置
WO2023065949A1 (zh) 虚拟场景中的对象控制方法、装置、终端设备、计算机可读存储介质、计算机程序产品
JP2024026661A (ja) 仮想オブジェクトの制御方法、装置、端末及びコンピュータプログラム
CN114404932A (zh) 技能释放控制方法、装置、存储介质及电子装置
CN113680062A (zh) 一种游戏中的信息查看方法及装置
WO2023002907A1 (ja) 情報処理システム、プログラム及び情報処理方法
JP7286857B2 (ja) 情報処理システム、プログラム及び情報処理方法
JP7163526B1 (ja) 情報処理システム、プログラム及び情報処理方法
CN114504812A (zh) 虚拟角色控制方法及装置
CN115721933A (zh) 信息处理方法、装置、存储介质和处理器
CN115089968A (zh) 一种游戏中的操作引导方法、装置、电子设备及存储介质
CN115105832A (zh) 控制视图显示的方法、装置、存储介质及电子装置
CN116570909A (zh) 信息交互方法、装置、存储介质和电子装置
CN117112094A (zh) 控件交互方法、装置、存储介质及电子装置
WO2024228824A1 (en) Systems and methods for enabling communication between users