
CN113713383B - Throwing prop control method, throwing prop control device, computer equipment and storage medium - Google Patents


Info

Publication number
CN113713383B
CN113713383B (granted publication of application CN202111060411.6A)
Authority
CN
China
Prior art keywords
throwing, virtual, scene, prop, virtual object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111060411.6A
Other languages
Chinese (zh)
Other versions
CN113713383A (en)
Inventor
刘智洪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202111060411.6A priority Critical patent/CN113713383B/en
Publication of CN113713383A publication Critical patent/CN113713383A/en
Application granted granted Critical
Publication of CN113713383B publication Critical patent/CN113713383B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F13/573 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using trajectories of game objects, e.g. of a golf ball according to the point of impact
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/837 Shooting of targets
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308 Details of the user interface
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8076 Shooting
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application discloses a throwing prop control method, a throwing prop control device, computer equipment, and a storage medium, belonging to the technical field of virtual scenes. The method comprises the following steps: displaying a virtual scene interface; displaying a first scene picture in the virtual scene interface, wherein the first scene picture comprises at least two throwing props, and the at least two throwing props are within reach of a first virtual object; and, in response to receiving a throwing operation on the throwing props, displaying a second scene picture in the virtual scene interface, the second scene picture being a picture of the first virtual object throwing at least one of the throwing props. The scheme preserves the realism of the virtual object's actions while improving human-computer interaction efficiency when the user controls the throwing props.

Description

Throwing prop control method, throwing prop control device, computer equipment and storage medium
Technical Field
The present disclosure relates to the field of virtual scenes, and in particular, to a method and apparatus for controlling a throwing prop, a computer device, and a storage medium.
Background
Currently, in some game applications, such as first-person shooter games, a throwing prop, that is, a virtual prop that is used by being thrown, is often provided.
In the related art, to simulate as realistically as possible a virtual object in a virtual scene using a throwing prop, a series of limb actions is generally designed, spanning from the virtual object taking out the throwing prop to throwing it. The virtual object performs this sequence of limb actions each time the user controls it to throw a throwing prop.
However, this sequence of limb actions, from taking out the throwing prop to throwing it, generally takes a long time, so the interval between two consecutive uses of the throwing prop by the virtual object is long, which reduces human-computer interaction efficiency when the user uses the throwing prop.
Disclosure of Invention
The embodiment of the application provides a throwing prop control method, a throwing prop control device, computer equipment, and a storage medium, which can improve the human-computer interaction efficiency when a user uses throwing props. The technical scheme is as follows:
in one aspect, an embodiment of the present application provides a throwing prop control method, including:
displaying a virtual scene interface, wherein the virtual scene interface is used for displaying a scene picture of a virtual scene, and the virtual scene comprises a first virtual object;
displaying a first scene picture in the virtual scene interface, wherein the first scene picture comprises at least two throwing props, and the at least two throwing props are within reach of the first virtual object;
and, in response to receiving a throwing operation on the throwing prop, displaying a second scene picture in the virtual scene interface, the second scene picture being a picture of the first virtual object throwing at least one of the throwing props.
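For ease of understanding, the three steps above can be sketched as a minimal state model. This is an illustrative sketch only, not the patented implementation; the class and method names (`ThrowingPropScene`, `show_first_picture`, `on_throw_operation`) and all default values are assumptions.

```python
from dataclasses import dataclass

@dataclass
class ThrowingPropScene:
    """Hypothetical state model of the claimed method, for illustration only."""
    props_within_reach: int = 0
    picture: str = "scene"  # which scene picture the interface currently shows

    def show_first_picture(self, prop_count: int = 2) -> None:
        # The first scene picture shows at least two throwing props
        # within reach of the first virtual object.
        if prop_count < 2:
            raise ValueError("first scene picture requires at least two throwing props")
        self.props_within_reach = prop_count
        self.picture = "first"

    def on_throw_operation(self, count: int = 1) -> int:
        # On a throwing operation, show the second scene picture: the first
        # virtual object throwing at least one of the throwing props.
        if self.picture != "first":
            return 0
        thrown = min(max(count, 1), self.props_within_reach)
        self.props_within_reach -= thrown
        self.picture = "second"
        return thrown
```

A throwing operation only takes effect while the first scene picture (props within reach) is shown, matching the order of the claimed steps.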
In another aspect, an embodiment of the present application provides a throwing prop control device, the device comprising:
the interface display module is used for displaying a virtual scene interface, wherein the virtual scene interface is used for displaying a scene picture of a virtual scene, and the virtual scene comprises a first virtual object;
the first picture display module is used for displaying a first scene picture in the virtual scene interface, wherein the first scene picture comprises at least two throwing props, and the at least two throwing props are within reach of the first virtual object;
and the second picture display module is used for displaying a second scene picture in the virtual scene interface in response to receiving a throwing operation on the throwing prop, wherein the second scene picture is a picture of the first virtual object throwing at least one throwing prop.
In one possible implementation, the first scene picture is a picture in which the at least two throwing props are suspended within reach of the hand of the first virtual object;
the second scene picture is an animation of the first virtual object throwing at least one throwing prop by hand.
In one possible implementation, the second picture display module is configured to,
in response to receiving a throwing operation on the throwing prop, acquire the operation mode of the throwing operation;
and display the second scene picture in the virtual scene interface based on the operation mode.
In one possible implementation, the second picture display module is configured to,
in response to the operation mode being a click operation, display, in the virtual scene interface, a picture of the first virtual object throwing a single throwing prop.
In one possible implementation, the second picture display module is configured to,
in response to the operation mode being a double-click operation or a slide operation, display, in the virtual scene interface, a picture of the first virtual object continuously throwing the throwing props.
In one possible implementation, the second picture display module is configured to,
in response to the operation mode being a long-press operation or a slide operation, display, in the virtual scene interface, a picture of the first virtual object continuously throwing the throwing props during the operation duration of the throwing operation (namely, the long-press operation or the slide operation).
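The dispatch on operation mode described in the implementations above can be sketched as follows. This is an illustrative sketch, not the patented implementation; here the slide operation is grouped with the long press (the duration-based variant at the end of the list), and the function name and the `throws_per_second` rate are assumptions.

```python
from enum import Enum, auto

class ThrowOp(Enum):
    CLICK = auto()
    DOUBLE_CLICK = auto()
    SLIDE = auto()
    LONG_PRESS = auto()

def props_to_throw(op: ThrowOp, available: int,
                   press_seconds: float = 0.0,
                   throws_per_second: float = 4.0) -> int:
    """Map a throwing-operation mode to the number of suspended props thrown."""
    if op is ThrowOp.CLICK:
        # Click operation: throw a single throwing prop.
        return min(1, available)
    if op is ThrowOp.DOUBLE_CLICK:
        # Double-click: throw the suspended props continuously.
        return available
    if op in (ThrowOp.SLIDE, ThrowOp.LONG_PRESS):
        # Slide / long press: throw continuously for the operation duration.
        return min(available, int(press_seconds * throws_per_second))
    return 0
```

In the double-click variant, all suspended props are thrown; in the duration-based variant, the count is bounded both by the props available and by how long the operation lasts.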
In one possible implementation, the apparatus further includes:
a prop adding module, configured to add throwing props within the reach of the first virtual object in response to a first condition being met.
In one possible implementation, the prop adding module is configured to add a throwing prop within the reach of the first virtual object in response to a thrown throwing prop hitting a target object.
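A minimal sketch of this replenishment variant follows; the hypothetical cap `max_within_reach` on how many props can be suspended at once is an assumption added for illustration, not taken from the application.

```python
def replenish_on_hit(props_within_reach: int, hit_target: bool,
                     max_within_reach: int = 5) -> int:
    """One variant of the first condition: add a throwing prop within reach
    when a thrown prop hits a target object (cap is an assumption)."""
    if hit_target and props_within_reach < max_within_reach:
        return props_within_reach + 1
    return props_within_reach
```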
In one possible implementation, the first picture display module is configured to display the first scene picture in the virtual scene interface in response to receiving an operation to release a target skill.
In one possible implementation, the target skill has a duration;
the second picture display module is configured to display the second scene picture in the virtual scene interface in response to receiving a throwing operation on the throwing prop within the duration of the target skill.
In one possible implementation, the apparatus further includes:
a time increasing module for increasing the duration of the target skill in response to a second condition being met.
In one possible implementation, the time increasing module is configured to increase the duration of the target skill in response to the number of target objects hit by the throwing prop reaching a number threshold within the duration of the target skill.
In one possible implementation, the apparatus further includes:
and the timing information display module is used for displaying timing information in the virtual scene interface within the duration of the target skill, and the timing information is used for indicating the remaining duration of the target skill.
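The target-skill behaviour described above (a limited duration, extended when the number of hit target objects reaches a threshold, with the remaining duration available as timing information) can be sketched as follows. The class name and the threshold and bonus values are assumptions for illustration, not the patented implementation.

```python
class TargetSkill:
    """Hypothetical target-skill timer: throwing is enabled for a limited
    duration, which is extended when enough target objects are hit."""

    def __init__(self, duration: float, hit_threshold: int = 3, bonus: float = 2.0):
        self.remaining = duration           # remaining duration (timing information)
        self.hit_threshold = hit_threshold  # second condition: hits needed for a bonus
        self.bonus = bonus                  # seconds added when the threshold is reached
        self.hits = 0

    def record_hit(self) -> None:
        if not self.active():
            return
        self.hits += 1
        if self.hits == self.hit_threshold:
            self.remaining += self.bonus    # increase the duration of the target skill

    def tick(self, dt: float) -> None:
        self.remaining = max(0.0, self.remaining - dt)

    def active(self) -> bool:
        # A throwing operation only shows the second scene picture while active.
        return self.remaining > 0.0
```

The `remaining` field is what the timing information display module would render in the virtual scene interface.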
In another aspect, embodiments of the present application provide a computer device including a processor and a memory having at least one computer instruction stored therein, the at least one computer instruction loaded and executed by the processor to implement a throwing prop control method as described in the above aspects.
In another aspect, embodiments of the present application provide a computer-readable storage medium having stored therein at least one computer instruction that is loaded and executed by a processor to implement a throwing prop control method as described in the above aspects.
In another aspect, a computer program product or computer program is provided, the computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, causing the computer device to perform the throwing prop control method provided in the various alternative implementations of the above aspects.
The beneficial effects of the technical scheme provided by the embodiment of the application at least comprise:
Before the props are thrown, at least two throwing props are placed within reach of the virtual object controlled by the user. When the user triggers a throwing operation, the virtual object can therefore throw one or more throwing props through consecutive throwing actions alone. This reduces the actions the virtual object performs while consecutively throwing props and shortens the interval between consecutive throws, while keeping the virtual object's throwing actions natural and close to reality. The scheme thus preserves the realism of the virtual object's actions, improves human-computer interaction efficiency when the user controls the throwing props, shortens the duration of a single match, and saves the power and data traffic consumed by the terminal.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
FIG. 1 is a schematic illustration of an implementation environment provided by an exemplary embodiment of the present application;
FIG. 2 is a schematic diagram of a display interface of a virtual scene provided in an exemplary embodiment of the present application;
FIG. 3 is a flow chart of a throwing prop control method provided in an exemplary embodiment of the present application;
FIG. 4 is a schematic illustration of a throwing prop according to the embodiment of FIG. 3;
FIG. 5 is a flow chart illustrating a method of throwing prop control according to an exemplary embodiment of the present application;
FIG. 6 is a schematic illustration of a throwing prop trigger according to the embodiment of FIG. 5;
FIG. 7 is a schematic illustration of a prop throwing in accordance with the embodiment of FIG. 5;
FIG. 8 is a schematic illustration of a remaining duration presentation involved in the embodiment of FIG. 5;
FIG. 9 is a control flow diagram of a projectile weapon shown in an exemplary embodiment of the present application;
FIG. 10 is a block diagram of a throwing prop control device according to an exemplary embodiment of the present application;
FIG. 11 is a block diagram of a computer device provided in an exemplary embodiment of the present application;
Fig. 12 is a block diagram of a computer device according to an exemplary embodiment of the present application.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present application as detailed in the accompanying claims.
It should be understood that references herein to "a number" mean one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist together, or B exists alone. The character "/" generally indicates an "or" relationship between the associated objects.
The embodiment of the application provides a throwing prop control method, which can improve human-computer interaction efficiency when a user controls a virtual object to throw props. For ease of understanding, several terms referred to in this application are explained below.
1) Virtual scene
A virtual scene is a scene that an application program displays (or provides) while running on a terminal. The virtual scene may be a simulated environment of the real world, a semi-simulated and semi-fictional three-dimensional environment, or a purely fictional three-dimensional environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, and a three-dimensional virtual scene; the following embodiments take a three-dimensional virtual scene as an example, but are not limited thereto. Optionally, the virtual scene may also be used for a fight between at least two virtual characters. Optionally, the virtual scene may also be used for a fight between at least two virtual characters using virtual firearms. Optionally, the virtual scene may further be used for a fight between at least two virtual characters using virtual firearms within a target area that shrinks continuously over time.
A virtual scene is typically generated by an application program in a computer device such as a terminal and presented on the basis of hardware (such as a screen) in the terminal. The terminal can be a mobile terminal such as a smartphone, a tablet computer, or an e-book reader; alternatively, the terminal can be a notebook computer or a stationary personal computer.
2) Virtual object
Virtual objects refer to movable objects in a virtual scene. A movable object may be at least one of a virtual character, a virtual animal, and a virtual vehicle. Optionally, when the virtual scene is a three-dimensional virtual scene, the virtual object is a three-dimensional model created based on skeletal animation technology. Each virtual object has its own shape, volume, and orientation in the three-dimensional virtual scene and occupies a portion of the space in the three-dimensional virtual scene.
3) Virtual prop
Virtual props refer to props that virtual objects can use in a virtual environment, including: virtual weapons that can inflict damage on other virtual objects, such as pistols, rifles, sniper rifles, daggers, knives, swords, and axes; supply props such as bullets; accessories installed on a designated virtual weapon, such as quick-loading magazines, scopes, and silencers; virtual pendants that add partial attributes to virtual weapons; and defensive props such as shields, armor, and armored vehicles.
In the embodiment of the application, the virtual props include throwing props, such as virtual flying knives, virtual hatchets, virtual mines, virtual flash bombs, and the like.
4) First person shooting game
A first-person shooter game is a shooting game that a user can play from a first-person perspective; the picture of the virtual environment in the game is a picture of the virtual environment observed from the perspective of a first virtual object. In the game, at least two virtual objects fight in a single-match mode in the virtual environment. A virtual object survives in the virtual environment by avoiding damage initiated by other virtual objects and hazards present in the environment (such as a poison gas circle or a swamp). When a virtual object's health value in the virtual environment drops to zero, its life in the virtual environment ends, and the virtual objects that ultimately survive are the winners. Optionally, a fight may take the moment the first client joins as its start time and the moment the last client exits as its end time, and each client may control one or more virtual objects in the virtual environment. Optionally, the competitive modes of the fight may include a solo mode, a two-person team mode, or a multi-person team mode, which is not limited in the embodiments of the present application.
FIG. 1 illustrates a schematic diagram of an implementation environment provided by an exemplary embodiment of the present application. The implementation environment may include: a first terminal 110, a server 120, and a second terminal 130.
The first terminal 110 has installed and running on it an application 111 supporting a virtual environment, and the application 111 may be a multi-person online fight program. When the first terminal 110 runs the application 111, a user interface of the application 111 is displayed on its screen. The application 111 may be any one of a multiplayer online battle arena (MOBA) game, a battle royale shooting game, or a simulation game (SLG). In the present embodiment, the application 111 is exemplified as a first-person shooter (FPS) game. The first terminal 110 is a terminal used by the first user 112, who uses it to control a first virtual object located in the virtual environment to perform activities; the first virtual object may be referred to as the master virtual object of the first user 112. The activities of the first virtual object include, but are not limited to, at least one of: adjusting body posture, crawling, walking, running, riding, flying, jumping, driving, picking up, shooting, attacking, throwing, and releasing skills. Illustratively, the first virtual object is a first virtual character, such as a simulated character or a cartoon character.
The second terminal 130 has installed and running on it an application 131 supporting a virtual environment, and the application 131 may be a multi-person online fight program. When the second terminal 130 runs the application 131, a user interface of the application 131 is displayed on the screen of the second terminal 130. The client may be any of a MOBA game, a battle royale game, or an SLG game; in this embodiment the application 131 is illustrated as an FPS game. The second terminal 130 is a terminal used by the second user 132, who uses it to control a second virtual object located in the virtual environment to perform activities; the second virtual object may be referred to as the master virtual character of the second user 132. Illustratively, the second virtual object is a second virtual character, such as a simulated character or a cartoon character.
Optionally, the first virtual object and the second virtual object are in the same virtual world. Optionally, the first virtual object and the second virtual object may belong to the same camp, the same team, the same organization, have a friend relationship, or have temporary communication rights. Alternatively, the first virtual object and the second virtual object may belong to different camps, different teams, different organizations, or have hostile relationships.
Optionally, the applications installed on the first terminal 110 and the second terminal 130 are the same, or are the same type of application on different operating system platforms (Android or iOS). The first terminal 110 may refer broadly to one of a plurality of terminals, and the second terminal 130 to another; the present embodiment is illustrated with only the first terminal 110 and the second terminal 130. The device types of the first terminal 110 and the second terminal 130 are the same or different, and include at least one of: a smartphone, a tablet computer, an e-book reader, an MP3 player, an MP4 player, a laptop portable computer, and a desktop computer.
Only two terminals are shown in fig. 1, but in different embodiments there are a number of other terminals that can access the server 120. Optionally, there is one or more terminals corresponding to the developer, on which a development and editing platform for supporting the application program of the virtual environment is installed, the developer may edit and update the application program on the terminal, and transmit the updated application program installation package to the server 120 through a wired or wireless network, and the first terminal 110 and the second terminal 130 may download the application program installation package from the server 120 to implement the update of the application program.
The first terminal 110, the second terminal 130, and other terminals are connected to the server 120 through a wireless network or a wired network.
The server 120 includes at least one of a server, a server cluster formed by a plurality of servers, a cloud computing platform and a virtualization center. The server 120 is used to provide background services for applications supporting a three-dimensional virtual environment. Optionally, the server 120 takes on primary computing work and the terminal takes on secondary computing work; alternatively, the server 120 takes on secondary computing work and the terminal takes on primary computing work; alternatively, a distributed computing architecture is used for collaborative computing between the server 120 and the terminals.
In one illustrative example, server 120 includes memory 121, processor 122, user account database 123, combat service module 124, and user-oriented Input/Output Interface (I/O Interface) 125. Wherein the processor 122 is configured to load instructions stored in the server 120, process data in the user account database 123 and the combat service module 124; the user account database 123 is configured to store data of user accounts used by the first terminal 110, the second terminal 130, and other terminals, such as an avatar of the user account, a nickname of the user account, and a combat index of the user account, where the user account is located; the combat service module 124 is configured to provide a plurality of combat rooms for users to combat, such as 1V1 combat, 3V3 combat, 5V5 combat, etc.; the user-oriented I/O interface 125 is used to establish communication exchanges of data with the first terminal 110 and/or the second terminal 130 via a wireless network or a wired network.
The virtual scene may be a three-dimensional virtual scene, or the virtual scene may be a two-dimensional virtual scene. Taking the case in which the virtual scene is a three-dimensional virtual scene as an example, please refer to fig. 2, which shows a schematic diagram of a display interface of the virtual scene provided in an exemplary embodiment of the present application. As shown in fig. 2, the display interface of the virtual scene includes a scene picture 200, and the scene picture 200 includes a currently controlled virtual object 210, an environment picture 220 of the three-dimensional virtual scene, and a virtual object 240. The virtual object 240 may be a virtual object controlled by a user of another terminal, or a virtual object controlled by the application program.
In fig. 2, the currently controlled virtual object 210 and the virtual object 240 are three-dimensional models in the three-dimensional virtual scene, and the environment picture of the three-dimensional virtual scene displayed in the scene picture 200 comprises objects observed from the perspective of the currently controlled virtual object 210. As illustrated in fig. 2, the environment picture 220 observed from the perspective of the currently controlled virtual object 210 comprises, for example, the ground 224, the sky 225, the horizon 223, the hill 221, and the factory building 222.
The currently controlled virtual object 210 may, under the control of a user, release skills, use virtual props, move, and perform specified actions, and the virtual objects in the virtual scene may exhibit different three-dimensional models under the user's control. For example, if the screen of the terminal supports touch operations and the scene picture 200 of the virtual scene includes a virtual control, then when the user touches the virtual control, the currently controlled virtual object 210 may perform the specified action in the virtual scene and exhibit the corresponding three-dimensional model.
Fig. 3 shows a flowchart of a throwing prop control method provided in an exemplary embodiment of the present application. The throwing prop control method can be executed by computer equipment, the computer equipment can be a terminal, a server or the computer equipment can also comprise the terminal and the server. As shown in fig. 3, the throwing prop control method includes:
Step 310, displaying a virtual scene interface, where the virtual scene interface is used to display a scene picture of a virtual scene, and the virtual scene includes a first virtual object.
In this embodiment of the present application, the first virtual object may be a virtual object controlled by a terminal that exposes the virtual scene interface. Wherein the first virtual object has the ability to use a throwing prop in the virtual scene.
Step 320, displaying a first scene picture in the virtual scene interface, wherein the first scene picture comprises at least two throwing props, and the at least two throwing props are within reach of the first virtual object.
The reach of the first virtual object may refer to the range within which the first virtual object can interact directly without moving. For example, the reach is the range within which the first virtual object can interact directly through its limbs without moving; that is, the first virtual object may directly interact with a virtual prop within reach through a limb (such as a hand or foot).
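Under the common assumption that reach is modelled as a fixed radius around the virtual object, a reach check might look like the following sketch; the function name and radius value are assumptions, not taken from the application.

```python
import math

def within_reach(object_pos: tuple, prop_pos: tuple,
                 reach_radius: float = 1.5) -> bool:
    """A prop is within reach if the first virtual object can interact with it
    directly (e.g. by hand) without moving; here, within a fixed radius."""
    dx = prop_pos[0] - object_pos[0]
    dy = prop_pos[1] - object_pos[1]
    dz = prop_pos[2] - object_pos[2]
    return math.sqrt(dx * dx + dy * dy + dz * dz) <= reach_radius
```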
In an embodiment of the present application, the first virtual object may possess a plurality of throwing props, and when these throwing props are used, they may be placed simultaneously in the virtual scene within the reach of the first virtual object.
In one possible implementation of the embodiments of the present application, the user may trigger placement of at least two throwing props within reach of the first virtual object by a specified operation.
A throwing prop refers to a virtual prop whose flight trajectory or drop point is determined before it is thrown and which takes effect after being thrown. Throwing props include, but are not limited to, virtual flying knives, virtual hatchets, virtual mines, virtual flash bombs, virtual smoke bombs, virtual supply packs, and the like.
For example, please refer to fig. 4, which illustrates a throwing prop presentation schematic according to an embodiment of the present application. As shown in fig. 4, when the first virtual object is ready to use throwing props, at least two throwing props 41 may be displayed in the virtual scene interface, with their positions in the virtual scene within reach of the first virtual object; for example, in fig. 4, at least two throwing props are arranged in a queue in front of and close to the first virtual object.
Referring to the interface shown in fig. 4, when the first virtual object can use a plurality of throwing props, the first virtual object may take out all of the throwing props before the user controls it to throw, placing them simultaneously within its reach, instead of taking out the next throwing prop only after the previous one has been thrown.
Step 330, in response to receiving a throwing operation on the throwing prop, displaying a second scene picture in the virtual scene interface, the second scene picture being a picture of the first virtual object throwing at least one throwing prop.
In embodiments of the present application, when the user decides to throw one or more throwing props, a throwing operation may be triggered, at which point the computer device may control the first virtual object to throw one or more of the throwing props already within its reach. When the first virtual object throws two or more throwing props in succession, it only needs to perform the throwing action repeatedly, without performing any action of taking out a throwing prop. This reduces the amplitude of the first virtual object's motion when throwing continuously, so that its actions during throwing remain as natural and close to reality as possible, while the interval between two consecutive throws can be kept as short as possible.
For example, taking the interface shown in fig. 4 as an example, after receiving the user's throwing operation on the throwing props, the computer device may control the first virtual object to throw one of the displayed throwing props, with the first virtual object performing a throwing action. When the next throwing prop needs to be thrown, the computer device controls the first virtual object to perform the throwing action again to throw another throwing prop. Because the thrown props were already displayed within the reach of the first virtual object in advance, no extra take-out action is needed between two adjacent throws. This reduces the actions performed during continuous throwing and, while keeping the throwing action sufficiently natural and close to reality, greatly shortens the time interval between consecutive throws.
In summary, according to the throwing prop control scheme provided by the embodiments of the present application, at least two throwing props are placed within the reach of the user-controlled virtual object before throwing begins. When the user triggers a throwing operation, the virtual object can therefore throw one or more throwing props through consecutive throwing actions alone, reducing the number of actions performed while continuously throwing props. This shortens the time interval between consecutive throws while keeping the throwing actions natural and close to reality, preserves the realism of the virtual object's actions, improves the efficiency of human-computer interaction when the user controls the throwing props, shortens the duration of a single match, and saves the power and data traffic consumed by the terminal.
Fig. 5 shows a flowchart of a throwing prop control method according to an exemplary embodiment of the present application. The throwing prop control method may be performed by a computer device, which may be a terminal, a server, or may include both the terminal and the server. As shown in fig. 5, the throwing prop control method includes:
step 501, displaying a virtual scene interface, where the virtual scene interface is used to display a scene picture of a virtual scene, and the virtual scene includes a first virtual object.
In this embodiment of the present application, after a user opens an application program (such as an application program of a shooting game class) corresponding to a virtual scene in a terminal and triggers to enter the virtual scene, a computer device may display a virtual scene interface in the terminal through the application program.
In one possible implementation, the virtual scene interface may further include various operation controls besides the scene images of the virtual scene, where the operation controls may be used to control the virtual scene, such as control the first virtual object to act (e.g., throw, move, shoot, interact, etc.) in the virtual scene, turn on or off the thumbnail map of the virtual scene, and exit the virtual scene, etc.
Step 502, displaying a first scene picture in a virtual scene interface, wherein the first scene picture comprises at least two throwing props; at least two throwing props are within reach of the first virtual object.
In one possible implementation, in response to receiving an operation to release the target skills, the computer device presents a first scene screen in the virtual scene interface.
In the embodiment of the application, the first scene picture can be triggered to be displayed by a triggering operation of a user. For example, the user may release the target skills to trigger placement of at least two throwing props within reach of the first virtual object.
The target skill may be a skill of the first virtual object, or the target skill may be a skill of the first virtual object after acquiring or equipping the target virtual prop (for example, acquiring a specific chip prop, or acquiring a specific weapon prop, etc.).
For example, please refer to fig. 6, which illustrates a throwing prop triggering schematic diagram according to an embodiment of the present application. As shown in fig. 6, when the first virtual object has the target skill, a skill release control 62 corresponding to the target skill may be displayed in the virtual scene interface 61. After the user clicks the skill release control 62, the virtual scene interface 61 may switch to a picture showing a plurality of throwing props (such as virtual flying knives) appearing within the reach of the first virtual object; for this picture, refer to fig. 4 above.
In one possible implementation, when the first scene picture is triggered and displayed by the target skill, the target skill may have a certain cooling time or charging time. That is, after the user triggers the placement of at least two throwing props within the reach of the first virtual object via the target skill, a certain period of time (which may be preconfigured by a developer) must elapse before at least two throwing props can be placed within reach again. For example, taking fig. 6 as an example, after the user triggers the skill release control 62, the computer device places at least two throwing props within reach of the first virtual object while setting the skill release control 62 to an inoperable state (or directly canceling its display); after the period of time has elapsed, the skill release control 62 may reenter the operable state (or be displayed again).
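As an illustrative sketch only (the class name, method names, and the cooldown value are assumptions made for this example, not taken from the embodiments above), the cooling-time behavior of the skill release control might be modeled as:

```python
import time


class SkillReleaseControl:
    """Sketch of a skill release control with a preconfigured cooldown."""

    def __init__(self, cooldown_seconds):
        self.cooldown_seconds = cooldown_seconds
        self._last_release = None  # None means the skill has never been used

    def is_operable(self, now=None):
        # The control is operable if it has never been used, or if the
        # preconfigured cooldown period has fully elapsed since last use.
        now = time.monotonic() if now is None else now
        if self._last_release is None:
            return True
        return now - self._last_release >= self.cooldown_seconds

    def release(self, now=None):
        # Place the throwing props and enter the inoperable state.
        now = time.monotonic() if now is None else now
        if not self.is_operable(now):
            return False
        self._last_release = now
        return True
```

Passing an explicit `now` keeps the sketch deterministic; a game would instead query the engine's monotonic clock each frame.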
In another possible implementation, in addition to limiting the target skills by cooling time or charging time, the use of the target skills may be limited by props or resources. For example, as also shown in FIG. 6, after the user triggers the skill release control 62, the computer device sets the skill release control 62 to an inoperable state (or directly cancels the display of the skill release control 62); thereafter, when a user acquires a particular prop, or after a specified number of resources are collected, the computer device may be triggered to control the skill release control 62 to reenter an operational state, or to display the skill release control 62 again.
In embodiments of the present application, the manner in which the first virtual object throws the at least two throwing props within its reach may be determined by the operation mode of the throwing operation. For the process of controlling the throwing manner through the operation mode of the throwing operation, refer to the subsequent steps.
In response to receiving a throwing operation on a throwing prop, an operation mode of the throwing operation is acquired, step 503.
In the embodiment of the application, the throwing action of the throwing prop can be triggered by throwing operations of different operation modes.
Optionally, the triggering manner of the throwing operation may include clicking, double clicking, sliding or long pressing, and the embodiment of the application is not limited to the triggering manner of the throwing operation.
Step 504, based on the operation mode, displaying a second scene picture in the virtual scene interface; the second scene view is a view of the first virtual object throwing at least one throwing prop.
In this embodiment of the present application, when the first virtual object throws the throwing prop, the throwing prop may be thrown in the direction currently aimed at by the crosshair.
In the embodiment of the present application, when the triggering modes of the throwing operation include multiple modes, the multiple modes may trigger different throwing modes.
In one possible implementation manner, the process of displaying the second scene picture in the virtual scene interface based on the operation manner may include:
and responding to the operation mode of clicking operation, and displaying a picture of throwing a single throwing prop by the first virtual object in the virtual scene interface.
In embodiments of the present application, a user may trigger throwing a single throwing prop by a single click operation. For example, a user may trigger throwing a single throwing prop by clicking on a throwing control presented in the virtual scene interface, or may trigger throwing a single throwing prop by clicking on an area of the virtual scene interface where the control is not presented. That is, after the computer device detects that the user clicks the throwing control or the blank area (i.e. the area without the control), the computer device controls the first virtual object to throw one throwing prop within the above-mentioned reach, so that quick single-shot throwing of the throwing prop is realized.
For example, please refer to fig. 7, which illustrates a prop throwing schematic diagram according to an embodiment of the present application. As shown in fig. 7, after the user clicks a throwing control or blank area in the virtual scene interface 71, the first virtual object throws out a throwing prop 73 within reach through the hand 72, in the direction of the current sight aiming.
In one possible implementation manner, the process of displaying the second scene picture in the virtual scene interface based on the operation manner may include:
and responding to the operation mode of double-click operation or sliding operation, and displaying the picture of continuously throwing the throwing prop by the first virtual object in the virtual scene interface.
In embodiments of the present application, multiple throwing props within the above-mentioned reach may also be thrown continuously. For example, a user may trigger continuous throwing of the throwing props within reach by double-clicking a throwing control presented in the virtual scene interface, or by a sliding operation (e.g., a sliding-up or sliding-down operation) performed from the throwing control position; alternatively, the user may trigger the continuous throwing by double-clicking an area of the virtual scene interface where no control is shown, or by a sliding operation performed from such an area.
For example, after the computer device detects that the user double clicks the throwing control or the blank area (i.e. the area without the control), the first virtual object is controlled to throw a plurality of throwing props within the reach range, and the throwing props are thrown continuously at a certain time interval. For example, there are 5 throwing props within the reach, and when the computer device detects that the user double clicks the throwing control, the first virtual object is controlled to throw the 5 throwing props one by one at a smaller time interval.
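The fixed-interval continuous throw described above can be sketched as a simple schedule computation; the function name and parameters are illustrative assumptions, not part of the embodiments:

```python
def continuous_throw_schedule(prop_count, start_time, interval):
    """Return the timestamps at which each remaining prop is thrown.

    When a double-click (or sliding operation) is detected, all props
    remaining within reach are thrown one by one at a fixed, short
    interval, starting from the moment the operation is recognized.
    """
    return [start_time + i * interval for i in range(prop_count)]
```

For the example of 5 props and a short interval, the schedule is simply five evenly spaced throw times from the moment the double-click is detected.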
For example, taking fig. 7 as an example, after the user double-clicks the throwing control or a blank area in the virtual scene interface 71, the first virtual object first throws the throwing prop 73 within reach through the hand 72 in the direction aimed at by the current crosshair; then, without further user operation, it throws the next throwing prop 74, and so on, until no throwing prop remains within reach.
In one possible implementation manner, the process of displaying the second scene picture in the virtual scene interface based on the operation manner may include:
and in response to the operation mode being a long-press operation or a sliding operation, displaying, in the virtual scene interface, a picture of the first virtual object continuously throwing the throwing props within the operation duration of the throwing operation (namely, the long-press operation or the sliding operation).
In the embodiment of the present application, the number of the throwing props that the first virtual object continuously throws may also be controlled by the throwing operation. For example, when the throwing operation is a long press operation or a sliding operation, the computer apparatus may control the throwing prop within the reach of the continuous throwing of the first virtual object for the duration of the long press operation or the sliding operation.
For example, taking fig. 7 as an example, after the user long-presses the throwing control or a blank area in the virtual scene interface 71, the first virtual object first throws the throwing prop 73 within reach through the hand 72 in the direction aimed at by the current crosshair; then, without further user operation, it throws the next throwing prop 74, and so on, until the long-press operation ends or no throwing prop remains within reach.
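A rough sketch of how the number of props thrown during a long press might be bounded by both the press duration and the props available; the function name and the "one throw immediately, then one per interval" counting rule are assumptions for illustration:

```python
import math


def props_thrown_during_press(press_duration, interval, props_available):
    # One prop is thrown immediately when the press begins, then one
    # more each time `interval` elapses while the press is held,
    # stopping early if the reach range is emptied first.
    possible = 1 + math.floor(press_duration / interval)
    return min(possible, props_available)
```

Either termination condition from the text (press ends, or props run out) maps to one of the two operands of `min`.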
In one possible implementation, the first scene picture is a picture in which at least two throwing props are suspended within reach of a hand of the first virtual object.
In an embodiment of the present application, when the first virtual object is a virtual object having a hand, for example, a humanoid virtual object, the computer device may set the at least two throwing props in suspension within reach of the hand of the first virtual object.
The hand accessibility refers to a space that the first virtual object can reach through hand motions without moving. When at least two throwing props are arranged in suspension within the reach of the hand of the first virtual object, the first virtual object can throw the throwing props within the reach through the action of the hand.
In one possible implementation, when the first scene is a scene in which at least two throwing props are hovering within the hand reach of the first virtual object, the second scene is an animated scene in which the first virtual object throws at least one throwing prop by hand.
Wherein, when the computer device hovers at least two throwing props within reach of the hand of the first virtual object, in response to receiving a throwing operation, the computer device may control the first virtual object to throw throwing props within reach using its hand.
Since the throwing props are suspended within the reach of the first virtual object's hand, when the first virtual object throws them with its hand, its hand motion can be minimized, which ensures the realism of the first virtual object's actions and reduces the interval between two consecutive throws.
In response to the first condition being met, a throwing prop is newly added within reach of the first virtual object, step 505.
In this embodiment of the present application, in order to keep the throwing action of the first virtual object sufficiently natural, the above-mentioned reach may be limited, and accordingly the number of throwing props displayed simultaneously within reach may also be limited. As a result, the user may quickly consume the throwing props within reach, and if the user wants to continue throwing, he or she must wait until the display of the first scene picture can be triggered again, which lowers throwing efficiency. To address this, the embodiment of the present application also provides a scheme for automatically replenishing the throwing props within reach: when the user controls the first virtual object to throw the throwing props within reach through the throwing operation, whether to add a new throwing prop within reach may be determined by detecting the first condition.
The first condition may be a condition corresponding to the effect produced by a throwing prop already consumed by the first virtual object, for example, whether the thrown prop hits a valid target object, whether the first virtual object moves within a designated range of a thrown prop that has fallen to the ground, and so on.
In one possible implementation, in response to meeting the first condition, adding a throwing prop within reach of the first virtual object, comprising:
in response to the throwing prop hitting the target object after being thrown, the throwing prop is newly added within reach of the first virtual object.
In the embodiment of the application, taking as an example the condition of whether the throwing prop hits the target object to determine whether to replenish: when the computer device detects that a throwing prop thrown by the first virtual object hits the target object, one or more throwing props are newly added within the reach of the first virtual object.
The solution shown in the foregoing embodiments of the present application is described by taking, as an example, the case where the first condition includes the throwing prop hitting the target object after being thrown. Alternatively, the first condition may also include other conditions; for example, the first condition may include eliminating the target object with the throwing prop, and so on.
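A minimal sketch of the hit-replenishment rule described for the first condition, assuming a capacity-capped reach range; the class and attribute names are illustrative, not from the embodiments:

```python
class ReachRange:
    """Tracks the throwing props currently within the object's reach."""

    def __init__(self, prop_count, capacity):
        self.prop_count = prop_count  # props currently displayed in reach
        self.capacity = capacity      # simultaneous display limit

    def on_prop_thrown(self, hit_target):
        # Throwing always consumes one prop from the reach range.
        self.prop_count -= 1
        # First condition (as described above): if the thrown prop hit a
        # target object, one prop is replenished immediately, never
        # exceeding the display capacity.
        if hit_target and self.prop_count < self.capacity:
            self.prop_count += 1
```

Under this rule, a player who keeps hitting targets never exhausts the reach range, matching the "recover one knife immediately" behavior described in the game example later in this document.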
In one possible implementation, when the target skill has a duration, the process of presenting the second scene view in the virtual scene interface in response to receiving a throwing operation on the throwing prop may include:
in response to receiving a throwing operation on a throwing prop within a duration of the target skill, a second scene screen is presented in the virtual scene interface.
In this embodiment of the present application, the use time of the throwing props may be limited. That is, after the user triggers the target skill, the user may, within the duration of the target skill, trigger the first virtual object through the throwing operation to throw the throwing props within reach. Optionally, if the duration of the target skill ends, the state in which the throwing props within reach can be thrown is exited, and correspondingly the computer device may cancel the throwing props set within reach.
In one possible implementation, the duration of the target skill is increased in response to the second condition being met.
In the embodiment of the application, the computer device may also adjust the duration of the target skill according to the condition of the first virtual object in the virtual scene in combination with the second condition.
In one possible implementation, the increasing the duration of the target skill in response to the second condition being met may include:
the duration of the target skill is increased in response to the number of target objects hit by the throwing prop reaching a number threshold for the duration of the target skill.
In the embodiment of the application, in order to improve the efficiency of the user to control the first virtual object to continuously throw the throwing prop, the computer device may further combine the number of target objects hit by the throwing prop to extend the duration of the target skill.
The solution shown in the foregoing embodiment of the present application is described only by taking, as an example, the case where the second condition includes the number of target objects hit by the throwing props reaching a threshold. Alternatively, the second condition may also include other conditions; for example, the second condition may include the number of target objects eliminated by the throwing props reaching a threshold, and so on.
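The duration-extension rule for the second condition might be sketched as follows; the threshold, bonus amount, and all names are assumptions for illustration only:

```python
class TargetSkillTimer:
    """Sketch of a skill duration that extends when enough targets are hit."""

    def __init__(self, duration, hit_threshold, bonus):
        self.remaining = duration
        self.hit_threshold = hit_threshold
        self.bonus = bonus
        self.hits = 0

    def on_target_hit(self):
        self.hits += 1
        # Second condition: once the number of targets hit within the
        # duration reaches the threshold, the duration is extended.
        if self.hits == self.hit_threshold:
            self.remaining += self.bonus
```

Comparing with `==` rather than `>=` makes the bonus fire exactly once per threshold crossing; a variant could reset `hits` to grant repeated extensions.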
In one possible implementation, the computer device may also present timing information in the virtual scene interface for the duration of the target skill, the timing information being used to indicate the remaining duration of the target skill.
In the embodiment of the application, when the target skill has the duration, in order to enable the user to know the related situation of the duration of the target skill at any time, the first virtual object is better controlled to throw, and the computer device can display the remaining duration of the target skill in the virtual scene interface.
For example, the computer device may display the remaining duration in correspondence with the skill release control of the target skill; for example, please refer to fig. 8, which shows a schematic diagram of displaying the remaining duration according to an embodiment of the present application. As shown in fig. 8, a skill release control 81 is shown in the virtual scene interface, and after the skill release control 81 is triggered, a timing bar 82 may be shown on the periphery of the skill release control 81, where the length of the timing bar 82 indicates the remaining time period. Alternatively, the computer device may also present the value of the remaining time period at the center of the skill release control 81.
The embodiment of the present application is not limited to the manner of displaying the timing information.
In summary, according to the throwing prop control scheme provided by the embodiments of the present application, at least two throwing props are placed within the reach of the user-controlled virtual object before throwing begins. When the user triggers a throwing operation, the virtual object can therefore throw one or more throwing props through consecutive throwing actions alone, reducing the number of actions performed while continuously throwing props. This shortens the time interval between consecutive throws while keeping the throwing actions natural and close to reality, preserves the realism of the virtual object's actions, improves the efficiency of human-computer interaction when the user controls the throwing props, shortens the duration of a single match, and saves the power and data traffic consumed by the terminal.
Taking the application of the scheme shown in the above embodiments to a game scenario as an example, a flying-knife skill weapon can be added to the game through the scheme shown in the embodiments of the present application. For example, the weapon in this scheme is an ultimate-skill weapon: the player must first activate the ultimate weapon before using it, and the activation may be time-cooled, that is, the player can use it only after waiting for a certain time. After the ultimate weapon is activated, the hand of the player-controlled game character may display multiple flying knives, for example 5, all of which can be thrown before the ultimate skill ends. Correspondingly, according to the scheme shown in the embodiments of the present application, a flying-knife operation mode can be added to the game. For example, the flying knives may be operated by clicking a fire key: each click makes the player-controlled game character throw one flying knife toward the crosshair direction, and if a target is hit, one flying knife is immediately replenished. If the player rapidly double-clicks the fire button, the game character releases all remaining flying knives, which fly out one by one at a certain, relatively short interval; this operation suits situations with multiple targets on the scene. The direction of each flying knife as it flies out may change with the crosshair; for example, the player may control the game character to move while continuously releasing flying knives, in which case the position from which each flying knife leaves the hand also moves with the character's current position.
The principle for distinguishing a single click from a double click may be as follows: first, determine whether the fire key button is clicked, each button having a center point and a click radius R; when the player touches the screen with a finger, the computer device obtains the touched position, calculates its distance to the center of the fire key, and if the distance is smaller than the radius R, the fire key is clicked. When the user presses the fire key, the computer device records the click time and the first flying knife flies out; if the player clicks the fire key again within a preset time, a double click is determined, and the remaining flying knives are then thrown out according to the setting.
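The click-radius test and the single/double-click discrimination described above can be sketched as follows; the double-click window value and all names are illustrative assumptions, not values from the embodiments:

```python
import math

DOUBLE_CLICK_WINDOW = 0.3  # preset time for double-click detection; assumed value


class FireButton:
    """Sketch of the fire-key hit test and click classification."""

    def __init__(self, center, radius):
        self.center = center      # button center point (x, y)
        self.radius = radius      # click radius R
        self._last_click_time = None

    def contains(self, point):
        # A touch counts as pressing the fire key if the distance from
        # the touched position to the button's center point is smaller
        # than the click radius R.
        dx = point[0] - self.center[0]
        dy = point[1] - self.center[1]
        return math.hypot(dx, dy) < self.radius

    def classify_click(self, point, now):
        """Return 'miss', 'single' (throw one knife), or 'double'
        (throw all remaining knives)."""
        if not self.contains(point):
            return "miss"
        last = self._last_click_time
        self._last_click_time = now
        if last is not None and now - last <= DOUBLE_CLICK_WINDOW:
            return "double"
        return "single"
```

Note that in this sketch the first press of a double click already throws one knife (classified as `'single'`), matching the text: the first knife flies out on press, and the second press within the window releases the rest.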
The flight trajectory of the throwing prop is determined by parameters such as the direction of the initial velocity and the acceleration. During the flight of the flying knife, a ray is emitted forward to detect an obstacle ahead (such as a target object); when an object is detected, it means the knife will touch the obstacle in the next frame, and a collision point is obtained. If a target object is encountered at that point, the damage can be calculated.
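A hedged sketch of the per-frame trajectory stepping with a forward ray test: the obstacle test is supplied by the caller, and all names are assumptions for this illustration (a real engine would use its own physics raycast here):

```python
def simulate_flight(position, velocity, acceleration, dt,
                    obstacle_hit_test, max_steps=1000):
    """Step the knife along its trajectory frame by frame.

    Each frame, a ray from the current position to the next position
    approximates the forward ray described in the text; if the ray test
    reports an obstacle, the collision point is returned (damage would
    be computed there if the obstacle is a target object).
    """
    for _ in range(max_steps):
        next_position = tuple(p + v * dt for p, v in zip(position, velocity))
        hit = obstacle_hit_test(position, next_position)
        if hit is not None:
            return hit  # collision point for the next frame's contact
        position = next_position
        # Acceleration (e.g., gravity) bends the trajectory each frame.
        velocity = tuple(v + a * dt for v, a in zip(velocity, acceleration))
    return None  # no obstacle within the simulated steps
```

With zero acceleration the knife flies straight; a downward acceleration component would give the arcing drop typical of thrown weapons.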
Taking a game scenario as an example, please refer to fig. 9, which is a control flowchart of a throwing weapon (corresponding to the throwing stage) according to an exemplary embodiment of the present application. As shown in fig. 9, the control flow of the throwing weapon may be as follows:
S901, the player activates the flying-knife ultimate weapon.

S902, determine whether the player clicks to use the flying-knife ultimate weapon; if yes, proceed to S903, otherwise return.

S903, switch to display 5 flying knives (namely, throwing props) around the game character.

S904, determine whether the player presses the fire key; if yes, proceed to S905, otherwise return to S903.

S905, the game character throws one flying knife, and the number of flying knives around the game character decreases by one.

S906, determine whether the flying knife hits the target object; if yes, proceed to S907, otherwise proceed to S908.

S907, replenish one flying knife around the game character.

S908, determine whether the player double-clicks the fire key; if yes, proceed to S909, otherwise return to S904.

S909, the game character throws the remaining flying knives one by one.

S910, determine whether the flying-knife ultimate skill has ended; if yes, proceed to S911, otherwise return to S904.

S911, switch back to a normal weapon, that is, a weapon other than the flying-knife ultimate weapon.
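The S901–S911 flow above can be sketched as an event-driven loop; the event encoding, function name, and return value are assumptions made for this illustration:

```python
def run_flying_knife_ultimate(events, knife_count=5):
    """Replay the control flow of fig. 9 over a list of input events.

    Each event is ('fire', hit) for a single press, where `hit` says
    whether that knife hit a target, or ('double_fire', None) for a
    rapid double click that throws all remaining knives.
    Returns the number of knives left when the inputs are exhausted.
    """
    knives = knife_count              # S903: knives appear around the character
    for event in events:
        if knives == 0:               # nothing left to throw
            break
        if event[0] == "fire":        # S904/S905: one knife is thrown
            knives -= 1
            if event[1]:              # S906/S907: a hit replenishes one knife
                knives = min(knives + 1, knife_count)
        elif event[0] == "double_fire":  # S908/S909: throw all the rest
            knives = 0
    return knives                     # S910/S911 would then switch weapons back
```

The sketch omits the timing checks (cooldown, skill end) handled elsewhere and only exercises the knife-count bookkeeping of the flowchart.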
Fig. 10 shows a block diagram of a throwing prop control device provided in an exemplary embodiment of the present application. The throwing prop control apparatus may be employed in a computer device to perform all or part of the steps of the method shown in fig. 3 or 5. As shown in fig. 10, the throwing prop control device includes:
The interface display module 1001 is configured to display a virtual scene interface, where the virtual scene interface is configured to display a scene picture of a virtual scene, and the virtual scene includes a first virtual object;
a first screen display module 1002, configured to display a first scene screen in the virtual scene interface, where the first scene screen includes at least two throwing props; at least two of the throwing prop are within reach of the first virtual object;
a second scene display module 1003, configured to display a second scene in the virtual scene interface in response to receiving a throwing operation on the throwing prop, where the second scene is a scene of the first virtual object throwing at least one throwing prop.
In one possible implementation, the first scene frame is a frame in which at least two of the throwing prop are suspended within reach of the hand of the first virtual object;
the second scene picture is an animation picture of the first virtual object throwing at least one throwing prop through a hand.
In one possible implementation, the second frame presentation module 1003 is configured to,
Responding to receiving throwing operation of the throwing prop, and acquiring an operation mode of the throwing operation;
and displaying the second scene picture in the virtual scene interface based on the operation mode.
In one possible implementation, the second frame presentation module 1003 is configured to,
and in response to the operation mode being a clicking operation, display, in the virtual scene interface, a picture of the first virtual object throwing a single throwing prop.
In one possible implementation, the second frame presentation module 1003 is configured to,
and responding to the operation mode of double-click operation or sliding operation, and displaying the picture of continuously throwing the throwing prop by the first virtual object in the virtual scene interface.
In one possible implementation, the second frame presentation module 1003 is configured to,
and responding to the operation mode of long-press operation or sliding operation, and displaying the picture of continuously throwing the throwing prop by the first virtual object in the operation duration of the throwing operation in the virtual scene interface.
In one possible implementation, the apparatus further includes:
a prop adding module, configured to add a new throwing prop within the reach of the first virtual object in response to a first condition being met.
In one possible implementation, the prop adding module is configured to add a new throwing prop within the reach of the first virtual object in response to a thrown throwing prop hitting a target object.
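The condition-triggered replenishment can be sketched as follows, using the hit-a-target variant of the first condition described above. The function and field names are hypothetical.

```python
# Hypothetical sketch of prop replenishment: when a thrown prop hits a
# target object (one variant of the "first condition"), a new throwing
# prop is added within the first virtual object's reach.

def on_prop_thrown(virtual_object, hit_target):
    """Resolve one throw; replenish the suspended props on a hit."""
    virtual_object["props"] -= 1
    if hit_target:
        # First condition met: add a new throwing prop within reach.
        virtual_object["props"] += 1
    return virtual_object["props"]
```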
In one possible implementation, the first picture display module 1002 is configured to display the first scene picture in the virtual scene interface in response to receiving an operation of releasing a target skill.
In one possible implementation, the target skill has a duration;
the second picture display module 1003 is configured to display the second scene picture in the virtual scene interface in response to receiving a throwing operation on the throwing prop within the duration of the target skill.
In one possible implementation, the apparatus further includes:
a time increasing module, configured to increase the duration of the target skill in response to a second condition being met.
In one possible implementation, the time increasing module is configured to increase the duration of the target skill in response to the number of target objects hit by the throwing props reaching a number threshold within the duration of the target skill.
In one possible implementation, the apparatus further includes:
a timing information display module, configured to display timing information in the virtual scene interface within the duration of the target skill, where the timing information indicates the remaining duration of the target skill.
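The target-skill timing behavior described by these modules — a limited duration, an extension once enough targets are hit, and timing information reporting the remaining duration — can be sketched as follows. The class, the default duration, the hit threshold, and the bonus amount are all illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch of target-skill timing: the skill has a duration,
# throwing is only accepted while the skill is active, the duration is
# extended once the number of hit targets reaches a threshold (the
# "second condition" variant), and timing information reports the
# remaining duration.

class TargetSkill:
    def __init__(self, duration_s=10.0, hit_threshold=3, bonus_s=5.0):
        self.duration_s = duration_s        # total duration of the skill
        self.hit_threshold = hit_threshold  # number threshold of hit targets
        self.bonus_s = bonus_s              # extension granted at the threshold
        self.hits = 0
        self.bonus_granted = False

    def is_active(self, elapsed_s):
        """Throwing operations are only accepted within the duration."""
        return elapsed_s < self.duration_s

    def record_hit(self):
        """Count a hit target; extend the duration once at the threshold."""
        self.hits += 1
        if not self.bonus_granted and self.hits >= self.hit_threshold:
            self.duration_s += self.bonus_s
            self.bonus_granted = True

    def remaining_s(self, elapsed_s):
        """Timing information: remaining duration of the target skill."""
        return max(0.0, self.duration_s - elapsed_s)
```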
In summary, according to the throwing prop control scheme provided by the embodiments of the present application, at least two throwing props are placed within the reach of the user-controlled virtual object before any prop is thrown. When the user triggers a throwing operation, the virtual object can therefore throw one or more throwing props through a continuous throwing action. This reduces the number of actions the virtual object performs when continuously throwing props, and shortens the interval between consecutive throws while keeping the throwing action natural and close to reality. As a result, the realism of the virtual object's actions is preserved, the efficiency of human-computer interaction when controlling throwing props is improved, the duration of a single match is reduced, and the power and data traffic consumed by the terminal are saved.
Fig. 11 shows a block diagram of a computer device 1100 provided by an exemplary embodiment of the present application. The computer device 1100 may be a portable mobile terminal such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. The computer device 1100 may also be referred to by other names such as user device, portable terminal, laptop terminal, or desktop terminal.
In general, the computer device 1100 includes: a processor 1101 and a memory 1102.
The processor 1101 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1101 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1101 may also include a main processor and a coprocessor. The main processor is a processor for processing data in an awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 1101 may be integrated with a GPU (Graphics Processing Unit) responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1101 may also include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
The memory 1102 may include one or more computer-readable storage media, which may be non-transitory. The memory 1102 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in the memory 1102 is used to store at least one computer instruction, which is executed by the processor 1101 to implement the throwing prop control method provided by the method embodiments of the present application.
In some embodiments, the computer device 1100 may optionally further include a peripheral interface 1103 and at least one peripheral device. The processor 1101, the memory 1102, and the peripheral interface 1103 may be connected by buses or signal lines. Each peripheral device may be connected to the peripheral interface 1103 by a bus, a signal line, or a circuit board. Specifically, the peripheral devices include at least one of a radio frequency circuit 1104, a display screen 1105, a camera assembly 1106, an audio circuit 1107, and a power supply 1109.
The peripheral interface 1103 may be used to connect at least one I/O (Input/Output)-related peripheral device to the processor 1101 and the memory 1102. In some embodiments, the processor 1101, the memory 1102, and the peripheral interface 1103 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1101, the memory 1102, and the peripheral interface 1103 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 1104 is used to receive and transmit RF (Radio Frequency) signals, also known as electromagnetic signals. The radio frequency circuit 1104 communicates with a communication network and other communication devices via electromagnetic signals. The radio frequency circuit 1104 converts an electrical signal into an electromagnetic signal for transmission, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1104 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuit 1104 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocol includes, but is not limited to: the World Wide Web, metropolitan area networks, intranets, mobile communication networks of various generations (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1104 may also include NFC (Near Field Communication) related circuitry, which is not limited in this application.
The display screen 1105 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. When the display screen 1105 is a touch display screen, it also has the ability to collect touch signals on or above its surface. The touch signal may be input to the processor 1101 as a control signal for processing. At this time, the display screen 1105 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1105, disposed on the front panel of the computer device 1100; in other embodiments, there may be at least two display screens 1105, respectively disposed on different surfaces of the computer device 1100 or in a folded design; in still other embodiments, the display screen 1105 may be a flexible display screen disposed on a curved surface or a folded surface of the computer device 1100. The display screen 1105 may even be arranged in a non-rectangular irregular pattern, i.e., a shaped screen. The display screen 1105 may be made of materials such as an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
The camera assembly 1106 is used to capture images or video. Optionally, the camera assembly 1106 includes a front camera and a rear camera. Typically, the front camera is disposed on the front panel of the terminal and the rear camera is disposed on the back surface of the terminal. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so as to realize a background blurring function through fusion of the main camera and the depth-of-field camera, panoramic shooting and VR (Virtual Reality) shooting functions through fusion of the main camera and the wide-angle camera, or other fusion shooting functions. In some embodiments, the camera assembly 1106 may also include a flash. The flash may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash refers to a combination of a warm-light flash and a cold-light flash, and can be used for light compensation under different color temperatures.
The audio circuit 1107 may include a microphone and a speaker. The microphone is used to collect sound waves from the user and the environment, convert the sound waves into electrical signals, and input them to the processor 1101 for processing, or to the radio frequency circuit 1104 for voice communication. For stereo acquisition or noise reduction purposes, there may be multiple microphones disposed at different locations of the computer device 1100. The microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker is used to convert electrical signals from the processor 1101 or the radio frequency circuit 1104 into sound waves. The speaker may be a conventional thin-film speaker or a piezoelectric ceramic speaker. When the speaker is a piezoelectric ceramic speaker, it can convert electrical signals not only into sound waves audible to humans, but also into sound waves inaudible to humans for purposes such as ranging. In some embodiments, the audio circuit 1107 may also include a headphone jack.
The power supply 1109 is used to supply power to the various components in the computer device 1100. The power supply 1109 may be alternating current, direct current, a disposable battery, or a rechargeable battery. When the power supply 1109 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. A wired rechargeable battery is a battery charged through a wired line, and a wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast-charging technology.
In some embodiments, the computer device 1100 also includes one or more sensors 1110. The one or more sensors 1110 include, but are not limited to: acceleration sensor 1111, gyroscope sensor 1112, pressure sensor 1113, optical sensor 1115, and proximity sensor 1116.
The acceleration sensor 1111 may detect the magnitudes of acceleration on the three coordinate axes of a coordinate system established with the computer device 1100. For example, the acceleration sensor 1111 may be configured to detect the components of gravitational acceleration on the three coordinate axes. The processor 1101 may control the display screen 1105 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal acquired by the acceleration sensor 1111. The acceleration sensor 1111 may also be used to acquire motion data of a game or a user.
The gyroscope sensor 1112 may detect the body direction and rotation angle of the computer device 1100, and may cooperate with the acceleration sensor 1111 to collect the user's 3D actions on the computer device 1100. The processor 1101 may implement the following functions based on the data collected by the gyroscope sensor 1112: motion sensing (e.g., changing the UI according to a tilting operation by the user), image stabilization during shooting, game control, and inertial navigation.
The pressure sensor 1113 may be disposed on a side frame of the computer device 1100 and/or a lower layer of the display screen 1105. When the pressure sensor 1113 is disposed on a side frame of the computer device 1100, it may detect the user's grip signal on the computer device 1100, and the processor 1101 performs left-right hand recognition or a shortcut operation according to the grip signal collected by the pressure sensor 1113. When the pressure sensor 1113 is disposed at the lower layer of the display screen 1105, the processor 1101 controls the operability controls on the UI according to the user's pressure operation on the display screen 1105. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The optical sensor 1115 is used to collect the ambient light intensity. In one embodiment, the processor 1101 may control the display brightness of the display screen 1105 based on the ambient light intensity collected by the optical sensor 1115. Specifically, when the ambient light intensity is high, the display brightness of the display screen 1105 is turned up; when the ambient light intensity is low, the display brightness of the display screen 1105 is turned down. In another embodiment, the processor 1101 may also dynamically adjust the shooting parameters of the camera assembly 1106 based on the ambient light intensity collected by the optical sensor 1115.
The proximity sensor 1116, also known as a distance sensor, is typically disposed on the front panel of the computer device 1100. The proximity sensor 1116 is used to collect the distance between the user and the front face of the computer device 1100. In one embodiment, when the proximity sensor 1116 detects that the distance between the user and the front face of the computer device 1100 gradually decreases, the processor 1101 controls the display screen 1105 to switch from the screen-on state to the screen-off state; when the proximity sensor 1116 detects that the distance between the user and the front face of the computer device 1100 gradually increases, the processor 1101 controls the display screen 1105 to switch from the screen-off state to the screen-on state.
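The proximity-driven screen switching described above can be sketched as follows; the distance threshold, units, and state names are assumptions introduced for illustration.

```python
# Hypothetical sketch of proximity-driven screen switching: the display
# turns off as the user approaches the front face, and back on as the
# user moves away. Threshold and state names are assumed.

def update_screen_state(current_state, prev_distance_cm, distance_cm,
                        near_threshold_cm=5.0):
    """Return the new screen state from two successive proximity readings."""
    approaching = distance_cm < prev_distance_cm
    if current_state == "on" and approaching and distance_cm < near_threshold_cm:
        return "off"  # user moving toward the screen: switch off
    if current_state == "off" and not approaching:
        return "on"   # user moving away: switch back on
    return current_state
```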
Those skilled in the art will appreciate that the structure shown in Fig. 11 does not constitute a limitation on the computer device 1100, which may include more or fewer components than shown, combine certain components, or adopt a different arrangement of components.
Fig. 12 shows a block diagram of a computer device 1200 according to an exemplary embodiment of the present application. The computer device can be implemented as the computer device in the above-mentioned schemes of the present application. The computer device 1200 includes a Central Processing Unit (CPU) 1201, a system memory 1204 including a Random Access Memory (RAM) 1202 and a Read-Only Memory (ROM) 1203, and a system bus 1205 connecting the system memory 1204 and the central processing unit 1201. The computer device 1200 also includes a basic Input/Output (I/O) system 1206, which helps to transfer information between the various devices within the computer, and a mass storage device 1207 for storing an operating system 1213, application programs 1214, and other program modules 1215.
The basic input/output system 1206 includes a display 1208 for displaying information and an input device 1209, such as a mouse or keyboard, for the user to input information. The display 1208 and the input device 1209 are both connected to the central processing unit 1201 through an input/output controller 1210 connected to the system bus 1205. The basic input/output system 1206 may also include the input/output controller 1210 for receiving and processing input from a number of other devices, such as a keyboard, a mouse, or an electronic stylus. Similarly, the input/output controller 1210 also provides output to a display screen, a printer, or another type of output device.
The mass storage device 1207 is connected to the central processing unit 1201 through a mass storage controller (not shown) connected to the system bus 1205. The mass storage device 1207 and its associated computer-readable media provide non-volatile storage for the computer device 1200. That is, the mass storage device 1207 may include a computer readable medium (not shown), such as a hard disk or a compact disk-Only (CD-ROM) drive.
Without loss of generality, the computer-readable media may include computer storage media and communication media. Computer storage media include volatile and non-volatile, removable and non-removable media implemented in any method or technology for the storage of information such as computer-readable instructions, data structures, program modules, or other data. Computer storage media include RAM, ROM, Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory or other solid-state memory technology, CD-ROM, Digital Versatile Disc (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage devices. Of course, those skilled in the art will recognize that computer storage media are not limited to the above. The system memory 1204 and the mass storage device 1207 described above may be collectively referred to as memory.
According to various embodiments of the present disclosure, the computer device 1200 may also be connected, through a network such as the Internet, to a remote computer on the network for operation. That is, the computer device 1200 may be connected to the network 1212 through a network interface unit 1211 connected to the system bus 1205, or the network interface unit 1211 may be used to connect to other types of networks or remote computer systems (not shown).
The memory further includes at least one computer instruction stored in the memory, and the central processing unit 1201 implements all or part of the steps of the throwing prop control method described in the above embodiments by executing the at least one computer instruction.
In an exemplary embodiment, a non-transitory computer readable storage medium is also provided, such as a memory, comprising at least one computer instruction executable by a processor to perform all or part of the steps of the method shown in any of the embodiments of fig. 3 or 5 described above. For example, the non-transitory computer readable storage medium may be ROM, RAM, CD-ROM, magnetic tape, floppy disk, optical data storage device, etc.
In an exemplary embodiment, a computer program product or a computer program is also provided, the computer program product or computer program comprising computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer readable storage medium and executes the computer instructions to cause the computer device to perform all or part of the steps of the method shown in any of the embodiments of fig. 3 or 5 described above.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application that follow its general principles and include common knowledge or customary technical means in the art not disclosed herein. The specification and examples are to be regarded as exemplary only, with the true scope and spirit of the application being indicated by the following claims.
It is to be understood that the present application is not limited to the precise structures described above and shown in the drawings, and that various modifications and changes may be made without departing from its scope. The scope of the application is limited only by the appended claims.

Claims (12)

1. A method of throwing prop control, the method comprising:
displaying a virtual scene interface, wherein the virtual scene interface is used for displaying a scene picture of a virtual scene, and the virtual scene comprises a first virtual object;
displaying a first scene picture in the virtual scene interface, wherein the first scene picture is a picture in which at least two throwing props are suspended within reach of the hand of the first virtual object; the at least two throwing props are within the reach of the first virtual object;
in response to receiving a throwing operation on the throwing prop, displaying a second scene picture in the virtual scene interface, the second scene picture being a picture of the first virtual object throwing at least one of the throwing props;
wherein the displaying a second scene picture in the virtual scene interface in response to receiving a throwing operation on the throwing prop includes:
in response to receiving a throwing operation on the throwing prop, acquiring an operation mode of the throwing operation;
in response to the operation mode being a click operation, displaying a picture of the first virtual object throwing a single throwing prop in the virtual scene interface; or, in response to the operation mode being a double-click operation or a sliding operation, displaying a picture of the first virtual object continuously throwing the throwing props in the virtual scene interface; or, in response to the operation mode being a long-press operation or a sliding operation, displaying, in the virtual scene interface, a picture of the first virtual object continuously throwing the throwing props within the operation duration of the throwing operation.
2. The method of claim 1, wherein the second scene picture is an animation picture of the first virtual object throwing at least one of the throwing props by hand.
3. The method according to claim 1, wherein the method further comprises:
in response to a first condition being met, newly adding the throwing prop within the reach of the first virtual object.
4. The method according to claim 3, wherein the newly adding the throwing prop within the reach of the first virtual object in response to a first condition being met comprises:
in response to the throwing prop hitting a target object after being thrown, newly adding the throwing prop within the reach of the first virtual object.
5. The method of claim 1, wherein the displaying a first scene picture in the virtual scene interface comprises:
in response to receiving an operation of releasing a target skill, displaying the first scene picture in the virtual scene interface.
6. The method of claim 5, wherein the target skill has a duration;
the displaying a second scene picture in the virtual scene interface in response to receiving a throwing operation on the throwing prop includes:
in response to receiving a throwing operation on the throwing prop within the duration of the target skill, displaying the second scene picture in the virtual scene interface.
7. The method of claim 6, wherein the method further comprises:
in response to a second condition being met, increasing the duration of the target skill.
8. The method of claim 7, wherein the increasing the duration of the target skill in response to the second condition being met comprises:
in response to the number of target objects hit by the throwing props reaching a number threshold within the duration of the target skill, increasing the duration of the target skill.
9. The method of claim 6, wherein the method further comprises:
displaying timing information in the virtual scene interface within the duration of the target skill, wherein the timing information is used to indicate the remaining duration of the target skill.
10. A throwing prop control device, the device comprising:
the interface display module is used for displaying a virtual scene interface, wherein the virtual scene interface is used for displaying a scene picture of a virtual scene, and the virtual scene comprises a first virtual object;
a first picture display module, configured to display a first scene picture in the virtual scene interface, wherein the first scene picture is a picture in which at least two throwing props are suspended within reach of the hand of the first virtual object; the at least two throwing props are within the reach of the first virtual object;
a second picture display module, configured to display a second scene picture in the virtual scene interface in response to receiving a throwing operation on the throwing prop, wherein the second scene picture is a picture of the first virtual object throwing at least one of the throwing props;
wherein the second picture display module is specifically configured to,
in response to receiving a throwing operation on the throwing prop, acquire an operation mode of the throwing operation;
in response to the operation mode being a click operation, display a picture of the first virtual object throwing a single throwing prop in the virtual scene interface; or, in response to the operation mode being a double-click operation or a sliding operation, display a picture of the first virtual object continuously throwing the throwing props in the virtual scene interface; or, in response to the operation mode being a long-press operation or a sliding operation, display, in the virtual scene interface, a picture of the first virtual object continuously throwing the throwing props within the operation duration of the throwing operation.
11. A computer device comprising a processor and a memory having stored therein at least one computer instruction that is loaded and executed by the processor to implement a throwing prop control method as claimed in any one of claims 1 to 9.
12. A computer readable storage medium having stored therein at least one computer instruction that is loaded and executed by a processor to implement a throwing prop control method as claimed in any one of claims 1 to 9.
CN202111060411.6A 2021-09-10 2021-09-10 Throwing prop control method, throwing prop control device, computer equipment and storage medium Active CN113713383B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111060411.6A CN113713383B (en) 2021-09-10 2021-09-10 Throwing prop control method, throwing prop control device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113713383A CN113713383A (en) 2021-11-30
CN113713383B true CN113713383B (en) 2023-06-27

Family

ID=78683233

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111060411.6A Active CN113713383B (en) 2021-09-10 2021-09-10 Throwing prop control method, throwing prop control device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113713383B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114618163A (en) * 2022-03-21 2022-06-14 北京字跳网络技术有限公司 Driving method and device of virtual prop, electronic equipment and readable storage medium
CN116983630A (en) * 2022-08-19 2023-11-03 腾讯科技(深圳)有限公司 Man-machine interaction method, device, equipment, medium and product based on virtual world
CN118022327A (en) * 2022-11-07 2024-05-14 腾讯科技(深圳)有限公司 Control method, device and equipment for virtual props and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0990458A2 (en) * 1998-09-28 2000-04-05 Konami Co., Ltd. Video game machine, method for switching viewpoint on gamescreen of video game, and computer-readable recording medium containing game-screen-viewpoint switching program
JP2018089120A (en) * 2016-12-02 2018-06-14 株式会社コナミデジタルエンタテインメント Game control device, game system and program
CN110427111A (en) * 2019-08-01 2019-11-08 腾讯科技(深圳)有限公司 The operating method of virtual item, device, equipment and storage medium in virtual environment
CN112121414A (en) * 2020-09-29 2020-12-25 腾讯科技(深圳)有限公司 Tracking method and device in virtual scene, electronic equipment and storage medium
CN112138384A (en) * 2020-10-23 2020-12-29 腾讯科技(深圳)有限公司 Using method, device, terminal and storage medium of virtual throwing prop
CN113069772A (en) * 2021-03-31 2021-07-06 网易(杭州)网络有限公司 Method and device for assembling virtual props in game and electronic equipment



Similar Documents

Publication Publication Date Title
CN108434736B (en) Equipment display method, device, equipment and storage medium in virtual environment battle
CN110917619B (en) Interactive property control method, device, terminal and storage medium
CN110585710B (en) Interactive property control method, device, terminal and storage medium
WO2021203856A1 (en) Data synchronization method and apparatus, terminal, server, and storage medium
WO2021184806A1 (en) Interactive prop display method and apparatus, and terminal and storage medium
CN113713382B (en) Virtual prop control method and device, computer equipment and storage medium
CN110465098B (en) Method, device, equipment and medium for controlling virtual object to use virtual prop
CN113713383B (en) Throwing prop control method, throwing prop control device, computer equipment and storage medium
CN111744184B (en) Control showing method in virtual scene, computer equipment and storage medium
CN111744186B (en) Virtual object control method, device, equipment and storage medium
CN113289331B (en) Display method and device of virtual prop, electronic equipment and storage medium
CN110507990B (en) Interaction method, device, terminal and storage medium based on virtual aircraft
CN112057857B (en) Interactive property processing method, device, terminal and storage medium
CN111921190B (en) Prop equipment method, device, terminal and storage medium for virtual object
TWI849357B (en) Method of displaying virtual props, equipment, electronic device, and storage medium
CN112044084B (en) Virtual item control method, device, storage medium and equipment in virtual environment
CN111475029B (en) Operation method, device, equipment and storage medium of virtual prop
CN112138384A (en) Using method, device, terminal and storage medium of virtual throwing prop
CN111659122B (en) Virtual resource display method and device, electronic equipment and storage medium
CN111249726B (en) Operation method, device, equipment and readable medium of virtual prop in virtual environment
CN110960849B (en) Interactive property control method, device, terminal and storage medium
CN111589102B (en) Auxiliary tool detection method, device, equipment and storage medium
CN114191820B (en) Throwing prop display method and device, electronic equipment and storage medium
CN113144600B (en) Virtual object control method, device, equipment and storage medium
CN112402969B (en) Virtual object control method, device, equipment and storage medium in virtual scene

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant