CN107168616B - Game interaction interface display method and device, electronic equipment and storage medium - Google Patents
- Publication number
- CN107168616B CN107168616B CN201710425829.XA CN201710425829A CN107168616B CN 107168616 B CN107168616 B CN 107168616B CN 201710425829 A CN201710425829 A CN 201710425829A CN 107168616 B CN107168616 B CN 107168616B
- Authority
- CN
- China
- Prior art keywords
- user interface
- interface
- preset
- virtual scene
- static frame
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0483—Interaction with page-structured environments, e.g. book metaphor
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/001—Texturing; Colouring; Generation of texture or colour
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/66—Methods for processing data by generating or executing the game program for rendering three dimensional images
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/807—Role playing or strategy games
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The disclosure provides a game interaction interface display method and apparatus, an electronic device, and a storage medium. The method comprises the following steps: when a preset operation on the graphical user interface is detected, performing a static frame screen capture of the current main interface of the virtual scene to obtain a static frame screenshot, and turning off rendering of the virtual scene; generating a first user interface from the static frame screenshot and displaying it as a pop-up; and when a closing operation on the first user interface is received, closing the first user interface, releasing the static frame screenshot, and restarting rendering of the virtual scene to restore its current main interface. The method and apparatus can reduce the size of the package resources, reduce the memory overhead caused by real-time rendering of the game scene, reduce stuttering and frame drops, improve the sense of immersion in the game scene, and bring a better experience to users.
Description
Technical Field
The present disclosure relates to the field of human-computer interaction technologies, and in particular, to a game interaction interface display method, a game interaction interface display apparatus, and a computer-readable storage medium and an electronic device for implementing the game interaction interface display method.
Background
At present, a game interaction interface often needs to display a game object and various kinds of information about that object. For example, mobile-platform game applications such as Massively Multiplayer Online Role-Playing Games (MMORPG) and Action Role-Playing Games (ARPG) often involve graphical user interfaces (GUI) for game interaction in modes such as a full-screen mode and a pop-up mode.
FIG. 1 shows a GUI of a game application in full-screen mode, in which a base map must be produced, the game scene beneath the UI layer is completely covered, and rendering of the game scene is stopped. FIG. 2 shows a GUI of another game application in pop-up mode, in which the game scene behind the pop-up window is rendered in real time or blurred in real time.
In the prior art, the drawback of the full-screen mode is that a base map must be produced, and if each base map is different, different GUI resources must be produced, which increases the package size. In addition, compared with the pop-up mode, directly opening a full-screen interface in games such as MMORPGs easily disconnects the player from the game environment, weakening immersion. In the pop-up mode, the underlying game scene must be rendered in real time or blurred in real time, which causes very serious memory overhead and can even cause stuttering and frame drops, degrading the experience. This problem is particularly serious for realistic 3D games with high polygon counts.
Therefore, an improved game interaction interface display scheme is highly needed, so that fewer or even no GUI resources need to be produced, the size of the package resources is reduced, the memory overhead caused by real-time rendering of the game scene is reduced, stuttering and frame drops are reduced, and the realism and immersion of the game scene picture are improved.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the present disclosure is to provide a game interaction interface display method, a game interaction interface display apparatus, an electronic device, and a computer-readable storage medium, which overcome, at least to some extent, one or more of the problems due to the limitations and disadvantages of the related art.
Additional features and advantages of the disclosure will be set forth in the detailed description which follows, or in part will be obvious from the description, or may be learned by practice of the disclosure.
According to one aspect of the present disclosure, a game interaction interface display method is provided, in which a graphical user interface is obtained by executing a software application on a processor of a terminal and rendering on a display of the terminal, and content presented by the graphical user interface includes a virtual scene; the method comprises the following steps:
when monitoring a preset operation on the graphical user interface, performing static frame screen capture on the current main interface of the virtual scene to obtain a static frame screen capture image, and closing the rendering of the virtual scene;
generating a first user interface according to the static frame screenshot picture and displaying the first user interface in a popup mode;
and when receiving a closing operation of the first user interface, closing the first user interface, releasing the static frame screenshot, and starting rendering of the virtual scene to recover to a current main interface of the virtual scene.
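For concreteness, the open/close lifecycle defined by these three steps can be sketched in code. The TypeScript fragment below is only a minimal illustration: the engine facade (Texture, Engine, UILayer) and its method names are assumptions introduced for this sketch, not an API defined by the present disclosure.

```typescript
// Hypothetical engine facade; all names below are assumptions for illustration.
interface Texture { release(): void; }

interface Engine {
  captureStillFrame(): Texture;                  // static frame screen capture of the current main interface
  setSceneRenderingEnabled(enabled: boolean): void;
}

interface UILayer {
  show(background: Texture): void;               // pop up the first user interface over the captured frame
  close(): void;
}

class FirstUserInterfaceController {
  private screenshot: Texture | null = null;

  constructor(private engine: Engine, private ui: UILayer) {}

  // Steps 1-2: on the preset operation, capture a still frame, turn off scene
  // rendering, and display the first user interface as a pop-up.
  onPresetOperation(): void {
    this.screenshot = this.engine.captureStillFrame();
    this.engine.setSceneRenderingEnabled(false);
    this.ui.show(this.screenshot);
  }

  // Step 3: on the closing operation, close the interface, release the
  // screenshot, and restart rendering so the main interface is restored.
  onCloseOperation(): void {
    this.ui.close();
    this.screenshot?.release();
    this.screenshot = null;
    this.engine.setSceneRenderingEnabled(true);
  }
}
```

Because scene rendering is switched off while the pop-up is open and the captured texture is freed on close, the underlying scene never needs real-time rendering or real-time blurring during the interaction.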
In an exemplary embodiment of the present disclosure, the preset operation includes any one of the following operations:
a click operation on a preset virtual control;
a sliding touch operation on a preset virtual control;
a long-press operation on a preset virtual control;
a press operation on a preset virtual control;
and the preset virtual control is a function button arranged in the virtual scene.
In an exemplary embodiment of the disclosure, generating the first user interface from the static frame screenshot comprises:
after performing the static frame screen capture of the current main interface of the virtual scene to obtain a static frame screenshot, rendering the static frame screenshot onto a preset base map;
and adding the rendered preset base map to the first user interface.
In an exemplary embodiment of the present disclosure, the method further comprises:
and after the first user interface is closed, removing the rendered preset base map from the first user interface and releasing the rendered preset base map.
In an exemplary embodiment of the present disclosure, the first user interface includes a blur layer interface, a normal user interface, and a pop-up user interface, which are stacked in sequence; and the rendered preset base map is located in the blur layer interface.
According to one aspect of the present disclosure, there is provided a game interaction interface display apparatus, which obtains a graphical user interface by executing a software application on a processor of a terminal and rendering on a display of the terminal, content presented by the graphical user interface including a virtual scene; the device includes:
the static frame screen capturing module is used for performing static frame screen capturing on the current main interface of the virtual scene to obtain a static frame screen capturing image and closing the rendering of the virtual scene when monitoring a preset operation on the graphical user interface;
the interface display module is used for generating a first user interface according to the static frame screenshot picture and displaying the first user interface in a popup mode; and
and the interface recovery module is used for closing the first user interface to release the static frame screenshot when receiving the closing operation of the first user interface, and starting the rendering of the virtual scene to recover the current main interface of the virtual scene.
In an exemplary embodiment of the present disclosure, the preset operation includes any one of the following operations:
a click operation on a preset virtual control;
a sliding touch operation on a preset virtual control;
a long-press operation on a preset virtual control; and the preset virtual control is a function button arranged in the virtual scene.
In an exemplary embodiment of the disclosure, generating the first user interface from the static frame screenshot comprises:
after performing the static frame screen capture of the current main interface of the virtual scene to obtain a static frame screenshot, rendering the static frame screenshot onto a preset base map;
and adding the rendered preset base map to the first user interface.
In an exemplary embodiment of the disclosure, the interface recovery module is further configured to remove the rendered preset base map from the first user interface and release it after the first user interface is closed.
In an exemplary embodiment of the present disclosure, the first user interface includes a blur layer interface, a normal user interface, and a pop-up user interface, which are stacked in sequence; and the rendered preset base map is located in the blur layer interface.
According to an aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the game interaction interface display method of any one of the above.
According to an aspect of the present disclosure, there is provided a computing device comprising:
a processor, and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the steps of any one of the game interaction interface display methods described above via execution of the executable instructions.
According to the game interaction interface display method and apparatus, when a preset operation on the graphical user interface is detected, a static frame screen capture of the current main interface of the game virtual scene is performed to obtain a static frame screenshot and rendering of the virtual scene is turned off; a first user interface is then generated from the static frame screenshot and displayed as a pop-up; and when a closing operation on the first user interface is received, the first user interface is closed, the static frame screenshot is released, and rendering of the virtual scene is restarted to restore its current main interface. Thus no base map needs to be produced, rendering of the game scene is turned off while the first user interface is displayed, and after closing the static frame screenshot is directly released and the game scene is restored. Compared with the full-screen mode, GUI resources can be significantly reduced or even eliminated, and the size of the package resources is reduced; compared with real-time rendering or real-time blurring, the memory overhead caused by real-time rendering or blurring of the game scene is reduced, and stuttering and frame drops are reduced. The realism and immersion of the game scene picture can therefore be improved, bringing a better experience to users.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The above and other features and advantages of the present disclosure will become more apparent by describing in detail exemplary embodiments thereof with reference to the attached drawings. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
FIG. 1 schematically illustrates a game GUI interface diagram in a prior art full screen mode;
FIG. 2 schematically illustrates a game GUI interface diagram in a prior art pop-up mode;
FIG. 3 schematically illustrates a game virtual scene diagram according to an embodiment of the disclosure;
FIG. 4 schematically illustrates a flow chart of a game interaction interface display method according to an embodiment of the present disclosure;
FIG. 5 schematically illustrates a flow diagram of a game interaction interface display method according to another embodiment of the present disclosure;
FIG. 6 schematically illustrates an exploded view of a first user interface according to an embodiment of the present disclosure;
FIG. 7 schematically illustrates a first user interface diagram according to yet another embodiment of the present disclosure;
FIG. 8 schematically illustrates a first user interface diagram according to yet another embodiment of the present disclosure;
FIG. 9 schematically shows a game interaction interface display device according to an embodiment of the present disclosure;
FIG. 10 schematically illustrates a block diagram of an electronic device in an exemplary embodiment of the disclosure;
FIG. 11 schematically illustrates a program product in an exemplary embodiment of the disclosure.
Detailed Description
The principles and spirit of the present disclosure will be described with reference to a number of exemplary embodiments. It is understood that these embodiments are given solely for the purpose of enabling those skilled in the art to better understand and to practice the present disclosure, and are not intended to limit the scope of the present disclosure in any way. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
As will be appreciated by one skilled in the art, embodiments of the present disclosure may be embodied as a system, apparatus, device, method, or computer program product. Accordingly, the present disclosure may be embodied in the form of: entirely hardware, entirely software (including firmware, resident software, micro-code, etc.), or a combination of hardware and software. In this document, it is to be understood that any number of elements in the figures are provided by way of illustration and not limitation, and any nomenclature is used for differentiation only and not in any limiting sense.
According to an embodiment of the disclosure, a game interaction interface display method, a computer-readable storage medium, a game interaction interface display device and an electronic device are provided. The principles and spirit of the present disclosure are explained in detail below with reference to several representative embodiments of the present disclosure.
The inventor found that, when a preset operation on the game graphical user interface is detected in a game virtual scene, a static frame screen capture of the current main interface of the virtual scene can be performed to obtain a static frame screenshot and rendering of the virtual scene can be turned off; a first user interface is then generated from the static frame screenshot and displayed as a pop-up. When a closing operation on the first user interface is received, the first user interface is closed, the static frame screenshot is released, and rendering of the virtual scene is restarted to restore its current main interface. In this way, one-key opening and one-key closing of the first user interface can be realized: no base map needs to be produced as in the prior-art full-screen mode, rendering of the underlying game scene is turned off while the first user interface is displayed, and upon closing the static frame screenshot is directly released from memory and the game scene is restored. GUI resources are therefore significantly reduced or even eliminated and the size of the package resources is reduced; compared with real-time rendering or real-time blurring, the memory overhead caused by real-time rendering or blurring of the game scene is reduced, and stuttering and frame drops are reduced. The realism and immersion of the game scene picture can thus be improved, bringing a better experience to users.
Having described the general principles of the present disclosure, various non-limiting embodiments of the present disclosure are described in detail below.
Referring first to the game virtual scene exemplarily shown in FIG. 3, a user, i.e., a game player, starts a game program and enters the game virtual scene 300. During the game, the player triggers, as required, an interactive interface displaying the game virtual character 320 and various information about the game virtual character 320 (such as character name, strength, physical strength, intelligence, and the like), as shown in FIG. 7. The game interaction interface display method of the embodiments of the present disclosure may be applied to the exemplary game virtual scene 300. It should be noted that the game virtual scene 300 may also be another game virtual scene of a different or similar type, which is not limited herein.
A game interactive interface display method according to an exemplary embodiment of the present disclosure is described below with reference to FIGS. 4 to 8 in conjunction with a game virtual scene shown in FIG. 3. It should be noted that the above application scenarios are merely illustrated for the convenience of understanding the spirit and principles of the present disclosure, and the embodiments of the present disclosure are not limited in this respect. Rather, embodiments of the present disclosure may be applied to any scenario where applicable.
Referring to FIG. 4, the game interaction interface display method may be applied to a terminal, such as a computer or a mobile terminal like a mobile phone. A graphical user interface is obtained by executing a software application on a processor of the terminal and rendering on a display of the terminal; the content presented by the graphical user interface includes a virtual scene and at least partially includes a virtual object, for example the game virtual scene 300 shown in FIG. 3 and the virtual character 320 in that scene. The game interaction interface display method shown in FIG. 4 may include the following steps:
step S401: and when monitoring a preset operation on the graphical user interface, performing static frame screen capture on the current main interface of the virtual scene to obtain a static frame screen capture image, and closing the rendering of the virtual scene.
In the present exemplary embodiment, a static frame is a single still picture; for example, one second of a moving picture may consist of 25 consecutive still pictures. Performing a static frame screen capture of the current main interface of the virtual scene 300 to obtain a static frame screenshot means obtaining a still picture of that current main interface. In addition, the game virtual scene 300 may first be blurred in real time or have its rendering stopped, with the static frame screen capture performed afterwards; the order of these steps affects the performance cost, and performing the static frame screen capture first and then blurring or stopping rendering incurs a lower performance cost, but either order achieves the effect of this example embodiment.
In one embodiment of the present disclosure, the preset operation may include, but is not limited to, any one of the following operations: a click operation on a preset virtual control, a sliding touch operation on a preset virtual control, or a long-press operation on a preset virtual control, where the preset virtual control is a function button arranged in the virtual scene. Of course, in other exemplary embodiments of the present disclosure, the preset operation may also be other operations such as a force press or shaking the mobile terminal, which is not particularly limited in this exemplary embodiment.
For example, the function buttons 310 may be "backpack," "battle group," "soul," "character," "team," "log," "social," and so on, as shown in FIG. 3. The pop-up display of the first user interface is triggered by clicking, sliding, long-pressing, or force-pressing (applicable to a mobile terminal with a pressure-sensitive function) any one of the function buttons 310. A trigger mechanism in which any function entry button responds to a tap operation can thus be realized, which is simple and easy to operate. Taking a click on the "character" button as an example, the first user interface shown in FIG. 7 is popped up. At this point, a static frame screen capture of the current main interface of the game virtual scene under the first user interface is performed to obtain a static frame screenshot, and rendering of the virtual scene is turned off, so that the performance cost is significantly reduced.
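As a concrete illustration of this trigger mechanism, the sketch below registers a single open handler for every supported gesture on every function button. The PresetGesture names and the FunctionButton interface are hypothetical, introduced only for this example; they are not part of the disclosed embodiment.

```typescript
// Hypothetical gesture/event model; names are illustrative assumptions.
type PresetGesture = "tap" | "slide" | "longPress" | "forcePress";

interface FunctionButton {
  readonly label: string;                          // e.g. "backpack", "character", "team"
  on(gesture: PresetGesture, handler: () => void): void;
}

// Any preset operation on any function entry button opens the first user interface.
function bindFunctionButtons(
  buttons: FunctionButton[],
  openFirstUserInterface: () => void
): void {
  const gestures: PresetGesture[] = ["tap", "slide", "longPress", "forcePress"];
  for (const button of buttons) {
    for (const gesture of gestures) {
      button.on(gesture, openFirstUserInterface);  // one-key opening from any entry
    }
  }
}
```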
Step S402: generate a first user interface from the static frame screenshot and display it as a pop-up.
Illustratively, the first user interface may be generated and displayed as a pop-up according to the still picture of the current main interface of the virtual scene 300. Referring to FIG. 5, in an exemplary embodiment of the present disclosure, generating the first user interface may include the following steps:
step S501: and after the current main interface of the virtual scene is subjected to static frame screen capture to obtain a static frame screen capture image, rendering the static frame screen capture image to a preset base image.
Illustratively, as shown in FIG. 3, for example, a still-frame screenshot of a current interface in the game virtual scene 300 may be obtained and rendered onto a predetermined base map, such as the base map of the size of the blur layer interface 610 shown in FIG. 6.
Step S502: add the rendered preset base map to the first user interface and pop up the first user interface. For example, referring to FIG. 7 or FIG. 8, the rendered preset base map may be added to the first user interface, and the first user interface, displaying the character rendering target and the character information associated with the virtual character 320, pops up.
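A minimal sketch of steps S501-S502 follows, assuming a generic scene-graph style API; RenderTexture, Sprite, UINode and their methods are stand-ins invented for this illustration rather than an actual engine interface.

```typescript
// Assumed scene-graph primitives; not an actual engine API.
interface RenderTexture { width: number; height: number; }

interface Sprite {
  setTexture(texture: RenderTexture): void;       // draw the still frame onto this sprite
  setSize(width: number, height: number): void;
}

interface UINode {
  addChild(child: Sprite): void;
}

// Step S501: render the captured static frame screenshot onto the preset base map,
// sized to the blur layer interface; Step S502: add the base map to the first UI.
function attachStillFrame(
  stillFrame: RenderTexture,
  presetBaseMap: Sprite,
  blurLayerWidth: number,
  blurLayerHeight: number,
  firstUserInterface: UINode
): void {
  presetBaseMap.setSize(blurLayerWidth, blurLayerHeight);
  presetBaseMap.setTexture(stillFrame);
  firstUserInterface.addChild(presetBaseMap);
}
```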
Referring to FIG. 6, the first user interface displayed as a pop-up, i.e., the upper-layer scene, may include a blur layer interface 610, a normal user interface 620, and a pop-up user interface 630, which are stacked in sequence. When the pop-up is triggered, the three interfaces may be opened in sequence. A layer containing at least the character rendering target (e.g., the virtual character 320) may be located in the normal user interface 620. The blur layer interface 610, the normal user interface 620, and the pop-up user interface 630 correspond to a blurnode, a normalnode, and a topnode, respectively.
Illustratively, when the program that implements the pop-up first user interface is written, the first user interface may be attached under three primary nodes, namely the normalnode, the topnode, and the blurnode. The normal user interface 620 is attached under the normalnode, the pop-up user interface 630 under the topnode, and the blur layer interface 610 under the blurnode. The blur layer interface 610 is rendered with a blur shader and is not displayed initially, i.e., the blurnode is initially invisible. The three nodes are at the same level by default; after the first user interface is triggered to open, the program automatically determines the hierarchical relationship of the nodes. Specifically, the blur layer interface 610 and the normal user interface 620 may be opened first, and the rendered preset base map is displayed in the blur layer interface 610. Then, depending on the situation, the pop-up user interface 630 may be popped up when the user, for example, performs a click operation on the normal user interface 620, so as to display, for example, character information associated with the character rendering target. In this way, the lower-layer game scene blends naturally with the upper-layer UI, enhancing the sense of immersion.
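The layering just described might be organized as in the sketch below. The node names normalnode, topnode and blurnode come from the text above, while the SceneNode interface and its fields are assumptions made for this illustration.

```typescript
// Assumed scene-graph node; fields and names are illustrative only.
interface SceneNode {
  visible: boolean;
  zOrder: number;
}

interface FirstUserInterfaceNodes {
  blurnode: SceneNode;    // blur layer interface 610: blur shader + rendered preset base map
  normalnode: SceneNode;  // normal user interface 620: character rendering target, etc.
  topnode: SceneNode;     // pop-up user interface 630: character information panel
}

// After the first user interface is triggered, establish the stacking order and
// open the blur layer and normal UI first; the pop-up UI stays hidden for now.
function openFirstUserInterfaceLayers(nodes: FirstUserInterfaceNodes): void {
  nodes.blurnode.zOrder = 0;   // bottom: blur layer with the static frame screenshot
  nodes.normalnode.zOrder = 1; // middle: normal user interface
  nodes.topnode.zOrder = 2;    // top: pop-up user interface

  nodes.blurnode.visible = true;   // blurnode starts invisible; show it now that the UI is triggered
  nodes.normalnode.visible = true;
  nodes.topnode.visible = false;
}

// A click inside the normal user interface pops up the character-information window.
function onNormalUserInterfaceClicked(nodes: FirstUserInterfaceNodes): void {
  nodes.topnode.visible = true;
}
```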
Step S403: when a closing operation on the first user interface is received, close the first user interface, release the static frame screenshot, and restart rendering of the virtual scene to restore its current main interface.
Illustratively, when a closing operation is received on the user interface shown in FIG. 7, for example when the user clicks the "x" button in the upper right corner, the first user interface is closed and the static frame screenshot generated at runtime is released. In one exemplary embodiment, the method may further include: after the first user interface is closed, removing the rendered preset base map from the first user interface and releasing it, for example releasing the rendered preset base map from the blur layer interface. The game interaction interface can thus be opened and closed with one key and its resources released directly, which reduces the memory overhead caused by real-time rendering of the game scene, minimizes the performance cost of scene-rendering calls and loading while the interactive UI is open, combines the advantages of the full-screen and pop-up modes, and improves the visual appeal of the game and the sense of immersion.
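Finally, the one-key close path of step S403, together with the base-map cleanup described above, could look like the following sketch, again using assumed, illustrative types rather than a real engine API.

```typescript
// Assumed types; release() and removeChild() are illustrative stand-ins.
interface ReleasableSprite { release(): void; }

interface BlurLayer {
  visible: boolean;
  removeChild(child: ReleasableSprite): void;
}

function closeFirstUserInterface(
  blurLayer: BlurLayer,
  renderedBaseMap: ReleasableSprite,       // preset base map carrying the static frame screenshot
  resumeSceneRendering: () => void
): void {
  blurLayer.visible = false;               // close the first user interface
  blurLayer.removeChild(renderedBaseMap);  // remove the rendered base map from the blur layer
  renderedBaseMap.release();               // release the screenshot memory directly
  resumeSceneRendering();                  // restart scene rendering to restore the main interface
}
```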
Further, while operations of the disclosed methods are described in the foregoing description and drawings in a particular order, this does not require or imply that all of the operations must be performed in the particular order or achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions.
Having described the method of the exemplary embodiments of the present disclosure, the game interaction interface display device of the exemplary embodiments of the present disclosure will be explained next with reference to FIG. 9.
The game interaction interface display apparatus shown in FIG. 9 is applied to a terminal; a graphical user interface is obtained by executing a software application on a processor of the terminal and rendering on a display of the terminal, and the content presented by the graphical user interface includes a virtual scene and at least partially includes a virtual object. The game interaction interface display device 100 may include a static frame screenshot module 101, an interface display module 102, and an interface recovery module 103. Wherein:
the static frame screenshot module 101 is configured to perform static frame screenshot on the current main interface of the virtual scene to obtain a static frame screenshot image and close rendering of the virtual scene when a preset operation on the graphical user interface is monitored.
The interface display module 102 is configured to generate a first user interface according to the still-frame screenshot and display the first user interface in a pop-up manner.
The interface recovery module 103 is configured to, when receiving a closing operation on the first user interface, close the first user interface to release the static frame screenshot, and start rendering the virtual scene to recover to a current main interface of the virtual scene.
In an exemplary embodiment, the preset operation includes any one of the following operations:
a click operation on a preset virtual control, a sliding touch operation on a preset virtual control, or a long-press operation on a preset virtual control; and the preset virtual control is a function button arranged in the virtual scene.
In an exemplary embodiment, generating the first user interface from the static frame screenshot comprises:
after performing the static frame screen capture of the current main interface of the virtual scene to obtain a static frame screenshot, rendering the static frame screenshot onto a preset base map;
and adding the rendered preset base map to the first user interface.
In an exemplary embodiment, the interface recovery module 103 is further configured to remove the rendered preset base map from the first user interface and release it after the first user interface is closed.
In an exemplary embodiment, the first user interface includes a blur layer interface, a normal user interface, and a pop-up user interface, which are stacked in sequence; and the rendered preset base map is located in the blur layer interface.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
It should be noted that although several modules or units of the device for action execution are mentioned in the above detailed description, such a division is not mandatory. Indeed, according to embodiments of the present disclosure, the features and functionality of two or more modules or units described above may be embodied in one module or unit. Conversely, the features and functions of one module or unit described above may be further divided into and embodied by a plurality of modules or units. The components shown as modules or units may or may not be physical units, i.e., they may be located in one place or distributed over a plurality of network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the disclosed scheme. One of ordinary skill in the art can understand and implement this without inventive effort.
In an exemplary embodiment of the present disclosure, an electronic device capable of implementing the above method is also provided.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.) or an embodiment combining hardware and software aspects, which may all generally be referred to herein as a "circuit," "module," or "system."
An electronic device 600 according to this embodiment of the present disclosure is described below with reference to FIG. 10. The electronic device 600 shown in FIG. 10 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present disclosure.
As shown in FIG. 10, the electronic device 600 is embodied in the form of a general-purpose computing device. The components of the electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one storage unit 620, a bus 630 connecting different system components (including the storage unit 620 and the processing unit 610), and a display unit 640.
Wherein the storage unit stores program code that is executable by the processing unit 610 to cause the processing unit 610 to perform steps according to various exemplary embodiments of the present disclosure as described in the above section "exemplary methods" of this specification.
The storage unit 620 may include readable media in the form of volatile memory units, such as a random access memory unit (RAM) 6201 and/or a cache memory unit 6202, and may further include a read-only memory unit (ROM) 6203.
The storage unit 620 may also include a program/utility 6204 having a set (at least one) of program modules 6205, such program modules 6205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 600, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 600 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 650. Also, the electronic device 600 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 660. As shown, the network adapter 660 communicates with the other modules of the electronic device 600 over the bus 630. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 600, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
Through the above description of the embodiments, those skilled in the art will readily understand that the exemplary embodiments described herein may be implemented by software, or by software in combination with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a removable hard disk, etc.) or on a network, and includes several instructions to enable a computing device (which may be a personal computer, a server, a terminal device, or a network device, etc.) to execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, various aspects of the disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to perform the steps according to various exemplary embodiments of the disclosure described in the "exemplary methods" section above of this specification, when the program product is run on the terminal device.
Referring to FIG. 11, a program product 800 for implementing the above method according to an embodiment of the present disclosure is described; it may employ a portable compact disc read-only memory (CD-ROM), include program code, and be run on a terminal device such as a personal computer. However, the program product of the present disclosure is not limited thereto; in this document, a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java or C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (e.g., through the Internet using an Internet service provider).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.
Claims (12)
1. A game interaction interface display method is characterized in that a graphical user interface is obtained by executing a software application on a processor of a terminal and rendering on a display of the terminal, and content presented by the graphical user interface comprises a virtual scene; the method comprises the following steps:
when monitoring a preset operation on the graphical user interface, performing static frame screen capture on the current main interface of the virtual scene to obtain a static frame screen capture image, and closing the rendering of the virtual scene;
generating a first user interface according to the static frame screenshot picture and displaying the first user interface in a popup mode;
and when receiving a closing operation of the first user interface, closing the first user interface, releasing the static frame screenshot, and starting rendering of the virtual scene to recover to a current main interface of the virtual scene.
2. The game interaction interface display method according to claim 1, wherein the preset operation includes any one of the following operations:
a click operation on a preset virtual control;
a sliding touch operation on a preset virtual control;
a long-press operation on a preset virtual control;
and the preset virtual control is a function button arranged in the virtual scene.
3. The game interaction interface display method of claim 1, wherein generating the first user interface from the static frame screenshot comprises:
after performing the static frame screen capture of the current main interface of the virtual scene to obtain a static frame screenshot, rendering the static frame screenshot onto a preset base map;
and adding the rendered preset base map to the first user interface.
4. A game interaction interface display method as recited in claim 3, further comprising:
and after the first user interface is closed, removing the rendered preset base map from the first user interface and releasing the rendered preset base map.
5. The game interaction interface display method of claim 4, wherein the first user interface comprises a blur layer interface, a normal user interface, and a pop-up user interface, which are stacked in sequence; and the rendered preset base map is located in the blur layer interface.
6. A game interaction interface display device is characterized in that a graphical user interface is obtained by executing a software application on a processor of a terminal and rendering on a display of the terminal, and content presented by the graphical user interface comprises a virtual scene; the device includes:
the static frame screen capturing module is used for performing static frame screen capturing on the current main interface of the virtual scene to obtain a static frame screen capturing image and closing the rendering of the virtual scene when monitoring a preset operation on the graphical user interface;
the interface display module is used for generating a first user interface according to the static frame screenshot picture and displaying the first user interface in a popup mode; and
and the interface recovery module is used for closing the first user interface to release the static frame screenshot when receiving the closing operation of the first user interface, and starting the rendering of the virtual scene to recover the current main interface of the virtual scene.
7. A game interaction interface display device as claimed in claim 6, wherein the preset operation comprises any one of the following operations:
a click operation on a preset virtual control;
a sliding touch operation on a preset virtual control;
a long-press operation on a preset virtual control;
and the preset virtual control is a function button arranged in the virtual scene.
8. A game interaction interface display device as recited in claim 6, wherein generating the first user interface from the static frame screenshot comprises:
after performing the static frame screen capture of the current main interface of the virtual scene to obtain a static frame screenshot, rendering the static frame screenshot onto a preset base map;
and adding the rendered preset base map to the first user interface.
9. The game interaction interface display device of claim 8, wherein the interface recovery module is further configured to remove the rendered preset base map from the first user interface and release the rendered preset base map after the first user interface is closed.
10. The game interaction interface display device of claim 9, wherein the first user interface comprises a blur layer interface, a normal user interface, and a pop-up user interface, which are stacked in sequence; and the rendered preset base map is located in the blur layer interface.
11. A computer-readable storage medium, having stored thereon a computer program which, when executed by a processor, carries out the steps of the game interaction interface display method of any one of claims 1 to 5.
12. A computing device, comprising:
a processor, and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the steps of the game interaction interface display method of any one of claims 1 to 5 via execution of the executable instructions.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710425829.XA CN107168616B (en) | 2017-06-08 | 2017-06-08 | Game interaction interface display method and device, electronic equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710425829.XA CN107168616B (en) | 2017-06-08 | 2017-06-08 | Game interaction interface display method and device, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107168616A CN107168616A (en) | 2017-09-15 |
CN107168616B true CN107168616B (en) | 2020-02-21 |
Family
ID=59824930
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710425829.XA Active CN107168616B (en) | 2017-06-08 | 2017-06-08 | Game interaction interface display method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107168616B (en) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108525309A (en) * | 2018-03-30 | 2018-09-14 | 湖南电灯泡信息技术服务有限公司 | A kind of application method of Games Software and the Games Software |
CN108525301A (en) * | 2018-03-30 | 2018-09-14 | 湖南电灯泡信息技术服务有限公司 | A kind of network game software and its application method |
CN108434746A (en) * | 2018-03-30 | 2018-08-24 | 湖南电灯泡信息技术服务有限公司 | A kind of network game software |
CN108635851B (en) * | 2018-05-16 | 2021-07-27 | 网易(杭州)网络有限公司 | Game picture processing method and device |
CN109542574B (en) * | 2018-11-28 | 2022-04-05 | 北京龙创悦动网络科技有限公司 | Pop-up window background blurring method and device based on OpenGL |
CN109800041B (en) * | 2018-12-24 | 2022-10-18 | 天津字节跳动科技有限公司 | Method and device for realizing small program background blurring, electronic equipment and storage medium |
CN113577770A (en) * | 2021-07-23 | 2021-11-02 | 广州元游信息技术有限公司 | Game rendering method |
CN115228080A (en) * | 2022-05-16 | 2022-10-25 | 网易(杭州)网络有限公司 | Display control method and device of game scene picture and electronic equipment |
CN114898006A (en) * | 2022-05-20 | 2022-08-12 | 北京字跳网络技术有限公司 | Interaction method, device, equipment and storage medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105094345A (en) * | 2015-09-29 | 2015-11-25 | 腾讯科技(深圳)有限公司 | Information processing method, terminal and computer storage medium |
CN105302569A (en) * | 2015-11-18 | 2016-02-03 | 网易(杭州)网络有限公司 | Method and device used for generating special-shaped window |
CN106512402A (en) * | 2016-11-29 | 2017-03-22 | 北京像素软件科技股份有限公司 | Game role rendering method and device |
CN106534733A (en) * | 2015-09-09 | 2017-03-22 | 杭州海康威视数字技术股份有限公司 | Video window displaying method and apparatus |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9594488B2 (en) * | 2013-12-12 | 2017-03-14 | Google Inc. | Interactive display of high dynamic range images |
US20160180441A1 (en) * | 2014-12-22 | 2016-06-23 | Amazon Technologies, Inc. | Item preview image generation |
-
2017
- 2017-06-08 CN CN201710425829.XA patent/CN107168616B/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106534733A (en) * | 2015-09-09 | 2017-03-22 | 杭州海康威视数字技术股份有限公司 | Video window displaying method and apparatus |
CN105094345A (en) * | 2015-09-29 | 2015-11-25 | 腾讯科技(深圳)有限公司 | Information processing method, terminal and computer storage medium |
CN105302569A (en) * | 2015-11-18 | 2016-02-03 | 网易(杭州)网络有限公司 | Method and device used for generating special-shaped window |
CN106512402A (en) * | 2016-11-29 | 2017-03-22 | 北京像素软件科技股份有限公司 | Game role rendering method and device |
Also Published As
Publication number | Publication date |
---|---|
CN107168616A (en) | 2017-09-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107168616B (en) | Game interaction interface display method and device, electronic equipment and storage medium | |
CN107551555B (en) | Game picture display method and device, storage medium and terminal | |
EP3129871B1 (en) | Generating a screenshot | |
US20150082239A1 (en) | Remote Virtualization of Mobile Apps with Transformed Ad Target Preview | |
CN107050850A (en) | The recording and back method of virtual scene, device and playback system | |
WO2017054465A1 (en) | Information processing method, terminal and computer storage medium | |
CN107977141B (en) | Interaction control method and device, electronic equipment and storage medium | |
WO2018126957A1 (en) | Method for displaying virtual reality screen and virtual reality device | |
CN111760272B (en) | Game information display method and device, computer storage medium and electronic equipment | |
CN113559520B (en) | Interaction control method and device in game, electronic equipment and readable storage medium | |
CN113827970B (en) | Information display method and device, computer readable storage medium and electronic equipment | |
CN111467791B (en) | Target object control method, device and system | |
CN111045777B (en) | Rendering method and device, storage medium and electronic equipment | |
CN111467790A (en) | Target object control method, device and system | |
CN113885731B (en) | Virtual prop control method and device, electronic equipment and storage medium | |
CN111782108B (en) | Interface switching control method, device, medium and equipment in game | |
CN107626105B (en) | Game picture display method and device, storage medium and electronic equipment | |
CN106984044B (en) | Method and equipment for starting preset process | |
US20240361837A1 (en) | Ephemeral Artificial Reality Experiences | |
JP7361399B2 (en) | Screen capture methods, devices and storage media | |
CN113680062B (en) | Information viewing method and device in game | |
CN114020396A (en) | Display method of application program and data generation method of application program | |
CN113318441A (en) | Game scene display control method and device, electronic equipment and storage medium | |
CN110665218A (en) | Game interaction method, game interaction device, storage medium and display equipment | |
JP2020529055A (en) | Rule-based user interface generation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |