CN118781242A - Rendering processing method, device, equipment, storage medium and program product
- Publication number: CN118781242A
- Application number: CN202310390253.3A
- Authority: CN (China)
- Prior art keywords: target, game, protocol, rendering, server
- Legal status: Pending
Abstract
The embodiments of the present application disclose a rendering processing method, apparatus, device, storage medium and program product, which relate at least to cloud technology, can respond in a timely manner to a user's control operations on a cloud game, reasonably reuse the system capabilities supported by the terminal device, and improve the gaming experience. The method comprises the following steps: sending a game operation event for the cloud game to a first server, where the game operation event is used by the first server to determine a target call protocol, the target call protocol is the call protocol used when a target function module in the terminal device is called, and the target function module provides the processing capability required to execute the game operation event; receiving the game picture of the cloud game and the target call protocol sent by the first server; calling the target function module based on the target call protocol to generate a target control corresponding to the target function module; rendering the target control and the game picture to obtain a target rendering result of the cloud game; and displaying the target rendering result.
Description
Technical Field
The embodiments of the present application relate to the field of computer technology, and in particular to a rendering processing method, apparatus, device, storage medium and program product.
Background
Rendering is one of the processes of computer graphics (CG); a rendered image conforms more closely to the three-dimensional scene it depicts. For example, after the game picture of a cloud game is rendered, the terminal device can clearly display the rendered game picture.
With the continuous development of cloud technology, a game running mode that relies on cloud technology, namely the cloud game, has been provided for cloud game scenarios. A cloud game is a game form based on cloud computing. The current rendering mode of cloud games is rendering in the running mode of the cloud game: all cloud games run on the server side, and the server compresses the rendered game pictures and transmits them to the terminal device over the network. However, running a cloud game does not depend only on the game picture in the video stream; for the system capabilities of the terminal device that are required when running the cloud game, the terminal device needs to inform the server, so that the server can render a game picture that the terminal device supports. In this rendering mode, the rendering of the cloud game can only be completed by the server after the terminal device has informed it of information such as the system capabilities required to run the cloud game, so the user's control operations on the cloud game cannot be responded to well, and the gaming experience is reduced. In addition, in the existing method, the rendering of the cloud game is completed on the server side and depends on the system capability information sent by the terminal device, so the system capabilities supported by the terminal device cannot be reasonably reused on the server side, and the user experience is poor.
Disclosure of Invention
The embodiments of the present application provide a rendering processing method, apparatus, device, storage medium and program product, which can respond in a timely manner to a user's control operations on a cloud game, reasonably reuse the system capabilities supported by the terminal device, and improve the gaming experience.
In a first aspect, an embodiment of the present application provides a rendering processing method. The method can be applied to a terminal device. The rendering processing method comprises: sending a game operation event for the cloud game to a first server, where the game operation event is used by the first server to determine a target call protocol, the target call protocol is the call protocol used when a target function module in the terminal device is called, and the target function module is used to provide the processing capability required to execute the game operation event; receiving the game picture of the cloud game and the target call protocol sent by the first server; calling the target function module based on the target call protocol to generate a target control corresponding to the target function module; rendering the target control and the game picture to obtain a target rendering result of the cloud game; and displaying the target rendering result.
In a second aspect, an embodiment of the present application provides another rendering processing method. The method can be applied to a first server. The rendering processing method comprises: acquiring a game operation event for a cloud game sent by a terminal device; determining a target call protocol based on the game operation event, where the target call protocol is the call protocol used when a target function module is called, and the target function module is used to provide the processing capability required to execute the game operation event; and sending the game picture of the cloud game and the target call protocol to the terminal device, where the target call protocol is used by the terminal device to call the target function module, so that the terminal device generates a target control corresponding to the target function module and renders the target control and the game picture to obtain a target rendering result of the cloud game.
In a third aspect, an embodiment of the present application provides a terminal device. The terminal device comprises a sending unit, an obtaining unit and a processing unit. The sending unit is configured to send a game operation event for the cloud game to the first server, where the game operation event is used by the first server to determine a target call protocol, the target call protocol is the call protocol used when a target function module is called, and the target function module is used to provide the processing capability required to execute the game operation event. The obtaining unit is configured to receive the game picture of the cloud game and the target call protocol sent by the first server. The processing unit is configured to call the target function module based on the target call protocol to generate a target control corresponding to the target function module; to render the target control and the game picture to obtain a target rendering result of the cloud game; and to display the target rendering result.
In some optional embodiments, the obtaining unit is configured to obtain a first mapping table, where the first mapping table is used to indicate an association between each calling protocol and each function module in the terminal device, the target calling protocol is one or more of each calling protocol, and the target function module is one or more of each function module. The processing unit is used for calling the target function module based on the target calling protocol and the first mapping table.
In other alternative embodiments, the obtaining unit is configured to obtain a target execution parameter, where the target execution parameter includes a target call protocol and a call parameter when the target call protocol is called. The processing unit is used for generating a target control corresponding to the target function module based on the target execution parameters.
In other optional embodiments, the obtaining unit is further configured to obtain the callback parameter corresponding to the target function module after generating the target control corresponding to the target function module based on the target execution parameter. And the processing unit is used for carrying out callback processing on the target control based on the callback parameter so as to call the target control.
In other alternative embodiments, the obtaining unit is configured to obtain bitmap information of a current frame in the game screen, where the bitmap information is used to indicate a frame condition of the current frame in the game screen. The processing unit is used for: determining rendering environment parameters based on bitmap information of the current frame; and rendering the target control and the game picture based on the rendering environment parameters to obtain a target rendering result of the cloud game.
In other alternative embodiments, the processing unit is configured to: create a target shading container when the rendering environment parameters include the RGBA color values of the target control; fill the pixel parameters of the picture vertices of the current frame based on the RGBA color values to obtain a first result; determine a second result based on the motion-effect execution parameters of the current frame, where the second result is used to indicate the relative positions of the picture vertices, and the relative positions indicate the relative situation between the picture vertices and the decoder when the motion effect of the target control changes; and perform shading and drawing on the first result and the second result based on the target shading container to obtain the target rendering result of the cloud game.
In other optional embodiments, the obtaining unit is configured to receive a game frame and a target call protocol of the cloud game sent by the second server, where the game frame and the target call protocol of the cloud game are sent by the first server to the second server.
In other optional embodiments, the obtaining unit is configured to receive a video stream of the cloud game sent by the first server, where the video stream includes a game frame of the cloud game and a target call protocol. The processing unit is used for analyzing the video stream to obtain a game picture of the cloud game and a target call protocol.
In other optional embodiments, the obtaining unit is further configured to receive a game control operation of the cloud game before sending the game operation event for the cloud game to the first server. The processing unit is used for generating game operation events based on game control operations.
In a fourth aspect, an embodiment of the present application provides a first server. The first server comprises an obtaining unit, a processing unit and a sending unit. The obtaining unit is configured to obtain a game operation event for the cloud game sent by the terminal device. The processing unit is configured to determine a target call protocol based on the game operation event, where the target call protocol is the call protocol used when a target function module is called, and the target function module is used to provide the processing capability required to execute the game operation event. The sending unit is configured to send the game picture of the cloud game and the target call protocol to the terminal device, where the target call protocol is used by the terminal device to call the target function module, so that the terminal device generates a target control corresponding to the target function module and renders the target control and the game picture to obtain a target rendering result of the cloud game.
In some optional embodiments, the processing unit is further configured to determine a call protocol corresponding to each function module in the terminal device before determining the target call protocol based on the game operation event; and constructing an association relation between each calling protocol and the corresponding functional module to generate a first mapping table, wherein the first mapping table is used for indicating the association relation between each calling protocol and each functional module in the terminal equipment. The processing unit is used for: determining a function module identifier corresponding to the game operation event; and determining a target calling protocol from the first mapping table based on the identification of the functional module.
In other optional embodiments, the sending unit is configured to send the game frame of the cloud game and the target call protocol to the second server, and the second server is configured to send the game frame of the cloud game and the target call protocol to the terminal device.
A fifth aspect of the embodiments of the present application provides a rendering processing device, comprising: a memory, an input/output (I/O) interface, and a processor. The memory is used to store program instructions. The processor is configured to execute the program instructions in the memory to perform the rendering processing method corresponding to the implementation of the first aspect, or to perform the rendering processing method corresponding to the implementation of the second aspect.
A sixth aspect of the embodiments of the present application provides a computer-readable storage medium having instructions stored therein which, when run on a computer, cause the computer to perform the rendering processing method corresponding to the implementation of the first aspect, or the rendering processing method corresponding to the implementation of the second aspect.
A seventh aspect of the embodiments of the present application provides a computer program product comprising instructions which, when run on a computer or processor, cause the computer or processor to perform the rendering processing method corresponding to the implementation of the first aspect, or the rendering processing method corresponding to the implementation of the second aspect.
From the above technical solutions, the embodiment of the present application has the following advantages:
In the embodiments of the present application, after sending a game operation event for the cloud game to the first server, the terminal device enables the first server to determine a target call protocol based on the game operation event. The target call protocol can be understood as the call protocol used when a target function module in the terminal device is called, and the target function module provides the processing capability required to execute the game operation event. In this way, after receiving the game picture of the cloud game and the target call protocol sent by the first server, the terminal device can call the target function module based on the target call protocol to generate a target control corresponding to the target function module, and then render the target control and the game picture to obtain a target rendering result of the cloud game. Finally, the terminal device displays the target rendering result. In this way, the terminal device does not need to inform the first server of information about its own system capabilities, and the rendering of the cloud game does not need to be completed on the first server side; instead, the first server determines the corresponding target call protocol from the game operation event and sends the required target call protocol to the terminal device, and the corresponding target function module is called on the terminal device side as far as possible based on the target call protocol, so that the user's control operations on the cloud game are responded to in a timely manner and the gaming experience is improved. Compared with the conventional scheme in which the terminal device informs the first server of information such as its system capabilities, the present application completes the rendering of the cloud game on the terminal device side, can reasonably reuse the system capabilities supported by the terminal device, and improves the user experience.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 shows a rendering schematic of a prior art scheme using a cloud game;
FIG. 2 is a schematic diagram of a system architecture according to an embodiment of the present application;
FIG. 3 illustrates a flow chart of a method of rendering processing provided by an embodiment of the present application;
FIG. 4 is a schematic flow chart of SDK registration provided in an embodiment of the present application;
FIG. 5 illustrates a flow diagram of a rendering process provided in an embodiment of the present application;
Fig. 6 shows a schematic diagram of multiplexing of functional modules according to an embodiment of the present application;
FIG. 7A is a schematic diagram of an embodiment of the present application applied to a web page;
FIG. 7B is a schematic diagram of an embodiment of the present application applied to a mobile terminal page;
fig. 8 shows a schematic structural diagram of a terminal device according to an embodiment of the present application;
fig. 9 shows a schematic structural diagram of a first server according to an embodiment of the present application;
Fig. 10 shows a schematic hardware structure of a rendering processing device according to an embodiment of the present application.
Detailed Description
The embodiments of the present application provide a rendering processing method, apparatus, device, storage medium and program product, which can respond in a timely manner to a user's control operations on a cloud game, reasonably reuse the system capabilities supported by the terminal device, and improve the gaming experience.
It will be appreciated that in the specific embodiments of the present application, related data such as user information, personal data of a user, etc. are involved, and when the above embodiments of the present application are applied to specific products or technologies, user permission or consent is required, and the collection, use and processing of related data is required to comply with relevant laws and regulations and standards of relevant countries and regions.
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims and in the above drawings, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments described herein may be implemented in other sequences than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Cloud gaming, which may also be referred to as gaming on demand, is an online gaming technology based on cloud computing. Cloud gaming technology enables thin clients with relatively limited graphics processing and data computing capabilities to run high-quality games. In a cloud gaming scenario, the game does not run on the player's game terminal but on a cloud server; the cloud server renders the game scene into video and audio streams and transmits them to the player's game terminal over the network. The player's game terminal does not need strong graphics and data processing capabilities; it only needs basic streaming media playback capability and the ability to obtain the player's input instructions and send them to the cloud server.
On the basis of cloud gaming, FIG. 1 shows a rendering diagram of the existing scheme when using a cloud game. As shown in FIG. 1, after the terminal device runs the cloud game, it receives a game control operation and generates a game control instruction according to the game control operation. Moreover, running the cloud game does not depend entirely on the rendered game picture; it also depends on system capabilities such as the basic capabilities and components included in the terminal device. Based on this, the terminal device transmits the game control instruction and information such as the system capabilities required to run the cloud game to the server in the form of a data stream. After receiving the game control instruction, the server performs cloud rendering on the corresponding game picture together with the received system capabilities, compresses the rendered game picture and sends it to the terminal device as a video stream. After parsing the video stream to obtain the game picture, the terminal device displays the corresponding game picture. However, in this rendering mode, the server can only complete the rendering of the cloud game after the terminal device has informed it of information such as the system capabilities required to run the cloud game, so the user's control operations on the cloud game cannot be responded to well, and the gaming experience is reduced. In addition, in the existing method, the rendering of the cloud game is completed on the server side and depends on the system capability information sent by the terminal device, so the system capabilities supported by the terminal device cannot be reasonably reused on the server side, and the user experience is poor.
Therefore, in order to solve the above technical problems, an embodiment of the present application provides a rendering processing method. The rendering processing method provided by the application can be applied to the system architecture shown in FIG. 2. As shown in FIG. 2, the system architecture includes a terminal device and a first server. An application program such as a cloud game is installed on the terminal device. After the user operates the cloud game on the display interface of the terminal device, the terminal device can generate a game operation event for the cloud game. The terminal device then sends the game operation event to the first server, so that the first server, after determining a target call protocol based on the game operation event, sends the game picture of the cloud game and the target call protocol to the terminal device. After receiving the target call protocol, the terminal device can call the target function module based on the target call protocol, thereby generating a target control corresponding to the target function module. The terminal device then renders the target control and the game picture to obtain a target rendering result of the cloud game, and displays the target rendering result. In this way, the terminal device does not need to inform the first server of information about its own system capabilities, and the rendering of the cloud game does not need to be completed on the first server side; instead, the first server determines the corresponding target call protocol from the game operation event and sends the required target call protocol to the terminal device, and the corresponding target function module is called on the terminal device side as far as possible based on the target call protocol, so that the user's control operations on the cloud game are responded to in a timely manner and the gaming experience is improved. Compared with the conventional scheme in which the terminal device informs the first server of information such as its system capabilities, the present application completes the rendering of the cloud game on the terminal device side, can reasonably reuse the system capabilities supported by the terminal device, and improves the user experience.
It should be noted that the above-described terminal devices may include, but are not limited to, smartphones, desktop computers, notebook computers, tablet computers, smart speakers, vehicle-mounted devices, smart watches, and the like. The server may be an independent physical server, a server cluster or distributed system formed by multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content delivery networks (CDN), big data and artificial intelligence platforms, which is not specifically limited in the present application. In addition, the terminal device and the server may be connected directly or indirectly through wired or wireless communication, which is not specifically limited in the present application.
In addition, the above rendering processing method can also be applied in the field of cloud technology and the like. Cloud technology refers to a hosting technology that unifies hardware, software, network and other resources in a wide area network or a local area network to realize the computation, storage, processing and sharing of data. Cloud technology is a general term for the network technology, information technology, integration technology, management platform technology, application technology and the like applied under the cloud computing business model; it can form a resource pool that is used on demand, flexibly and conveniently. Cloud computing technology will become an important support. Background services of technical network systems require a large amount of computing and storage resources, such as video websites, picture websites and other portal websites. With the rapid development and application of the internet industry, every article may have its own identification mark in the future, which needs to be transmitted to a background system for logical processing; data of different levels will be processed separately, and all kinds of industry data need strong backend system support, which can only be realized through cloud computing.
Cloud computing is a computing model that distributes computing tasks over a resource pool formed by a large number of computers, enabling various application systems to obtain computing power, storage space and information services as needed. The network that provides the resources is referred to as the "cloud". From the user's point of view, the resources in the cloud can be expanded infinitely, and can be acquired at any time, used on demand, expanded at any time and paid for according to use. As a basic capability provider of cloud computing, a cloud computing resource pool (cloud platform for short, generally referred to as an infrastructure as a service (IaaS) platform) is established, and multiple types of virtual resources are deployed in the resource pool for external customers to select and use. According to a logical functional division, a platform as a service (PaaS) layer may be deployed on the IaaS layer, and a software as a service (SaaS) layer may be deployed above the PaaS layer; SaaS can also be deployed directly on IaaS. PaaS is a platform on which software runs, such as a database or a web container; SaaS is various business software, such as a web portal or an SMS bulk sender.
The cloud games described may include, but are not limited to, action games, adventure games, first-person shooter (FPS) games, fighting games (FTG), music games (MUG), puzzle games (PUZ), racing games (RAC), sports games (SPG), shooting games (STG), table games (TAB), real-time strategy (RTS) games, role-playing games (RPG), simulation games (SLG), collectible card games (CCG), massively multiplayer online role-playing games (MMORPG), multiplayer online battle arena (MOBA) games, and the like, which are not limited in the embodiments of the present application.
In order to facilitate understanding of the technical solution of the present application, the method for rendering processing provided by the embodiment of the present application is described below from the perspective of interaction between the terminal device and the first server with reference to the accompanying drawings. Fig. 3 shows a flowchart of a method of rendering processing provided by an embodiment of the present application. As shown in fig. 3, the method of rendering processing may include the steps of:
301. And the terminal equipment sends a game operation event aiming at the cloud game to the first server.
In this example, after a user performs an operation on the cloud game in the terminal device, the terminal device may generate a game operation event for the cloud game. For example, the user can open, pause, record video, take a photo, share or capture the screen of the cloud game through the terminal device; or the user can control a virtual object or virtual character in the cloud game through the terminal device, for example controlling the virtual object to move, rotate, jump or attack. At this time, the terminal device generates a corresponding game operation event; for example, if the user wants to share some content of the cloud game with other users and selects the sharing operation on the terminal device, the terminal device generates a corresponding sharing game operation event. It should be noted that the game operation event may include a corresponding function module identifier. The function module identifier can be used to identify the system capability that the terminal device can support when executing the game operation event, such as a user interface (UI) canvas, photographing, code scanning, screen capturing, clipboard, keyboard, file storage, map and the like, which is not limited in the embodiments of the present application. An illustrative event payload is sketched below.
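As an illustrative sketch only (not part of the claimed embodiments), the following TypeScript shows one possible shape of such a game operation event; the field names (gameId, eventType, functionModuleId) are assumptions made for illustration.

```typescript
// Hypothetical shape of a game operation event carrying a function module identifier.
interface GameOperationEvent {
  gameId: string;             // identifies the cloud game session
  eventType: "open" | "pause" | "photo" | "share" | "screenshot" | "move";
  functionModuleId: string;   // terminal capability needed to execute the event, e.g. "camera"
  timestamp: number;
}

// Example: the user chooses the photographing operation; the terminal device
// reports the event to the first server.
const event: GameOperationEvent = {
  gameId: "cloud-game-001",
  eventType: "photo",
  functionModuleId: "camera",
  timestamp: Date.now(),
};
```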
In this way, after generating the game operation event of the cloud game, the terminal device can send the game operation event for the cloud game to the first server.
In order to adapt to a cloud game engine shared by a web cloud game application and a terminal cloud game application, in the embodiment of the present application a software development kit (SDK) is further used to connect the terminal device and the first server that use the cloud game. Using an SDK here means that the SDK provided to the terminal device and the SDK of the first server make calls in the form of interfaces. For example, the SDK in the terminal device can obtain interface call information and in turn invoke an application programming interface (API). Similarly, the first server can also obtain interface call information through its built-in SDK so as to call the API. In this way, when the cloud game application is started, the terminal device and the first server can be connected in advance through the SDK. FIG. 4 is a schematic flow chart of SDK registration provided in an embodiment of the present application. As shown in FIG. 4, when a user launches the cloud game application, the first server grants permission to the cloud game application by taking the application package name of the cloud game application as input and generating a corresponding account token. The corresponding cloud game application can be identified by the account token. Similarly, the terminal device may call the initialization interface of the SDK through a startup module, passing the account token, the service uniform resource locator (URL), the cloud game identity (ID) and other service parameters into the initialization interface, so that the SDK of the terminal device registers with the second server. The second server can be understood as the server corresponding to the business application, i.e. the business server. After receiving the initialization parameters or a service request of the cloud game through its receiving module, the business server generates a signature and sends the request to the first server. After receiving the request sent by the second server, the first server checks the application package name, the signature and the account token, and generates a cookie after the check succeeds. The first server then sends the cookie to the second server, which sends the cookie to the terminal device. After receiving the cookie, the SDK in the terminal device can generate a corresponding session, so that each time the user operates the cloud game application, the session is carried for business interaction with the second server. In addition, FIG. 4 also describes the procedure of a service call (dotted line portion), which can be understood with reference to the contents of steps 301 to 305 and is not repeated here. A minimal sketch of this registration flow follows.
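The following TypeScript sketch illustrates the registration flow under stated assumptions: the endpoint paths, function names and the simplification of treating the returned cookie directly as the session are assumptions for illustration, not the actual SDK interface.

```typescript
// A minimal sketch of the SDK registration flow described above (assumed names).
interface InitParams {
  accountToken: string;   // issued by the first server for the registered package name
  serviceUrl: string;     // URL of the second (business) server
  cloudGameId: string;
}

class CloudGameSdk {
  private session: string | null = null;

  // Terminal side: register with the second server, which forwards a signed
  // request to the first server; on success a cookie is returned and a
  // session is derived from it for later business calls.
  async init(params: InitParams): Promise<void> {
    const resp = await fetch(`${params.serviceUrl}/register`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(params),
    });
    const { cookie } = await resp.json();
    this.session = cookie; // simplified: the cookie stands in for the generated session
  }

  // Every subsequent business request carries the session.
  async sendOperationEvent(serviceUrl: string, event: object): Promise<void> {
    await fetch(`${serviceUrl}/game/event`, {
      method: "POST",
      headers: { "Content-Type": "application/json", "X-Session": this.session ?? "" },
      body: JSON.stringify(event),
    });
  }
}
```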
302. The first server determines a target call protocol based on the game operation event, the target call protocol being a call protocol used when a target function module is called, the target function module being used for executing processing capacity required by the game operation event.
In this example, the first server has previously configured a corresponding call protocol for each function module in the terminal device. For example, the first server configures the "loadCamera://" protocol for the "camera function module"; similarly, the first server configures the "loadCavas://" protocol for the "create canvas window function module", and so on, which is not limited in the embodiments of the present application. After configuring a corresponding call protocol for each function module, the first server can generate the first mapping table based on the association between each configured function module and its corresponding call protocol. It should be noted that the association between each call protocol and the corresponding function module can be indicated through the first mapping table.
In this way, after receiving the game operation event sent by the terminal device, the first server can demap the game operation event to determine the function module identifier corresponding to the game operation event. Since the function module identifier can be used to identify the system capability that the terminal device can support when executing the game operation event, this can be understood with reference to the description of step 301 and is not repeated here. After determining the function module identifier corresponding to the game operation event, the first server knows which target function module or modules need to be invoked to support the capability. Moreover, the first server generates and stores the first mapping table after configuring, in advance, the association between each function module and its corresponding call protocol. In this way, after obtaining the function module identifier through demapping, the first server can look up, from the first mapping table, the call protocol associated with the function module identified by that identifier, thereby obtaining the target call protocol. The target call protocol can be understood as the call protocol used when the target function module is called, and the target function module provides the processing capability required to execute the game operation event.
For example, if the function module identifier corresponding to the game operation event is "photographing", this can be implemented by executing the "camera function module". In this way, the first server can determine the target call protocol, i.e. the "loadCamera://" protocol, from the first mapping table based on "photographing". It should be noted that this is merely an illustrative example; other examples may be used in the practice of the present application, which is not limited in the embodiments of the present application. A sketch of this lookup follows.
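The following TypeScript sketch illustrates one possible form of the first mapping table and the protocol lookup; the protocol strings follow the examples above, while the helper function and its name are assumptions for illustration.

```typescript
// Sketch of the first mapping table on the first server: each terminal
// function module is associated with the call protocol used to invoke it.
const firstMappingTable: Record<string, string> = {
  camera: "loadCamera://", // "camera function module"
  canvas: "loadCavas://",  // "create canvas window function module" (protocol string as written above)
};

// Demap the game operation event to a function module identifier, then
// resolve the target call protocol from the mapping table.
function resolveTargetCallProtocol(functionModuleId: string): string | undefined {
  return firstMappingTable[functionModuleId];
}

// Example: a "photographing" event maps to the camera module.
const targetProtocol = resolveTargetCallProtocol("camera"); // "loadCamera://"
```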
303. The first server sends a game picture of the cloud game and a target calling protocol to the terminal equipment.
In this example, the first server may collect a game frame of the cloud game during the operation of the cloud game, in addition to determining the target call protocol. Further, the first server may send the game frame of the cloud game and the target call protocol determined in step 302 to the terminal device. The first server may also send the game screen and the target call protocol to the second server, and the second server may forward the game screen and the target call protocol to the terminal device.
As an exemplary description, the first server may map the game picture and the target call protocol into a video stream and then transmit the video stream to the terminal device. Alternatively, the first server may send the video stream to the second server after mapping the game picture and the target call protocol into it, and the video stream is then forwarded by the second server to the terminal device.
The manner in which the first server sends the game picture and the target call protocol to the terminal device is not limited in the embodiments of the present application. An illustrative message layout is sketched below.
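As an illustrative sketch only, the following TypeScript shows one possible way of packaging a game picture together with the target call protocol into a stream message; the message layout is an assumption, since the embodiments only state that both are mapped into the video stream.

```typescript
// Hypothetical message carried in the stream sent to the terminal device.
interface StreamMessage {
  frame: Uint8Array;             // encoded game picture for the current frame
  targetCallProtocol?: string;   // e.g. "loadCamera://", present when a module call is required
  callParams?: Record<string, unknown>; // call parameters used when the protocol is invoked
}

// Pack one frame together with the (optional) target call protocol.
function packMessage(
  frame: Uint8Array,
  protocol?: string,
  callParams?: Record<string, unknown>,
): StreamMessage {
  return { frame, targetCallProtocol: protocol, callParams };
}
```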
304. The terminal equipment calls the target function module based on the target call protocol to generate a target control corresponding to the target function module.
In this example, after receiving the target call protocol sent by the first server, the terminal device can explicitly know which protocol needs to be adopted to call the target function module. Therefore, the terminal equipment can execute the call of the target function module based on the target call protocol, and further generate the target control corresponding to the target function module. As an exemplary description, in the process of calling the target function module based on the target call protocol, the terminal device may first obtain the first mapping table from the first server, and then call the target function module based on the target call protocol and the first mapping table. It should be noted that the first mapping table described in the foregoing step 302 may be understood with reference to the description of the foregoing step, which is not repeated herein. For example, if the association relationship between "loadCamera://" protocol and "camera function module" and the association relationship between "loadCavas://" protocol and "create canvas window function module" are stored in the first mapping table, the terminal device may explicitly call the target function module, that is, call the "camera function module", if the target call protocol obtained from the first server at this time is "loadCamera:/" protocol.
It should be noted that, the above-described association relationship between "loadCamera://" protocol and "camera function module" and the association relationship between "loadCavas://" protocol and "create canvas window function module" are stored in the first mapping table, which is merely a schematic description. In practical application, the first mapping table may also store association relations between other functional modules and corresponding call protocols, which is not limited in the embodiment of the present application.
The described target control may also be referred to as a component; the specific name is not limited in the embodiments of the application. For example, since the target execution parameters include the target call protocol and the call parameters required when the target call protocol is called, the terminal device may acquire the target execution parameters and generate the target control corresponding to the target function module based on the target execution parameters. In other examples, after generating the target control corresponding to the target function module based on the target execution parameters, the terminal device may further obtain the callback parameters corresponding to the target function module, and then perform callback processing on the target control based on the callback parameters so as to invoke the target control. For example, if the target function module called by the terminal device is the "camera function module", the terminal device may generate the control corresponding to the "camera function module", such as a "camera control", based on the corresponding call parameters. Then, the terminal device can perform callback processing on the camera control based on the callback parameters, thereby completing the invocation of the target control. A sketch of this dispatch-and-callback flow follows.
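The following TypeScript sketch illustrates the terminal-side dispatch described above: resolving the target call protocol, generating the target control with the call parameters, and registering the callback parameters. All names (protocolToModule, createCameraControl and so on) are hypothetical helpers, not an actual platform API.

```typescript
// Terminal-side sketch: resolve the target call protocol against the local
// copy of the first mapping table, invoke the function module, create the
// corresponding control, and register a callback.
interface Control {
  onCallback(handler: (result: unknown) => void): void;
}

type ModuleFactory = (callParams: Record<string, unknown>) => Control;

// Assumed platform helpers that wrap the terminal device's own capabilities.
declare function createCameraControl(p: Record<string, unknown>): Control;
declare function createCanvasControl(p: Record<string, unknown>): Control;

const protocolToModule = new Map<string, ModuleFactory>([
  ["loadCamera://", (p) => createCameraControl(p)], // "camera control"
  ["loadCavas://", (p) => createCanvasControl(p)],
]);

function invokeTargetModule(
  targetProtocol: string,
  callParams: Record<string, unknown>,
  callbackParams: (result: unknown) => void,
): Control {
  const factory = protocolToModule.get(targetProtocol);
  if (!factory) throw new Error(`unsupported call protocol: ${targetProtocol}`);
  const control = factory(callParams);  // generate the target control
  control.onCallback(callbackParams);   // callback processing on the target control
  return control;
}
```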
305. And the terminal equipment performs rendering processing on the target control and the game picture to obtain a target rendering result of the cloud game.
In this example, after the terminal device generates the target control and obtains the game screen, the terminal device may perform rendering processing on the target control and the game screen, so as to render and obtain a target rendering result of the cloud game.
Aiming at the rendering processing process of the terminal equipment on the target control and the game picture, the terminal equipment can firstly acquire the bitmap information of the current frame in the game picture and determine rendering environment parameters based on the bitmap information of the current frame. The described bitmap information can be used to indicate the frame condition of the current frame in the game screen. And then, the terminal equipment performs rendering processing on the target control and the game picture based on the rendering environment parameters to obtain a target rendering result of the cloud game.
For the above rendering process, reference can be made to the flow chart shown in FIG. 5. As shown in FIG. 5, during the rendering process, after acquiring the bitmap information of the current frame, the terminal device determines the rendering environment parameters based on the bitmap information of the current frame. The terminal device then starts the context state machine and clears the buffers to free space for the next frame. The context state machine can be used to save the state of the current buffers, shaders, viewport, vertex arrays and the like. A buffer can be understood as an area that stores image information, such as a region of main memory or of GPU video memory, which is not limited in the embodiments of the present application. In addition, the buffers may include a color buffer, a depth buffer, a stencil buffer and the like, which are not limited in the embodiments of the present application. A per-frame sketch of these preparation steps follows.
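The following TypeScript sketch illustrates the per-frame preparation steps, assuming a WebGL rendering context as the concrete backend; the embodiments do not prescribe a particular graphics API, so this choice and the helper names are assumptions.

```typescript
// Per-frame preparation: derive rendering environment parameters from the
// current frame's bitmap information, then reset the context for drawing.
function prepareFrame(gl: WebGLRenderingContext, bitmapInfo: ImageBitmap) {
  // Rendering environment parameters derived from the current frame.
  const renderEnv = {
    width: bitmapInfo.width,
    height: bitmapInfo.height,
    // RGBA of the target control, if any; absent means: fetch the next frame's bitmap info.
    controlColor: undefined as [number, number, number, number] | undefined,
  };

  // The WebGL context itself acts as the "context state machine", holding the
  // current buffer, shader, viewport and vertex-array state.
  gl.viewport(0, 0, renderEnv.width, renderEnv.height);

  // Clear the color / depth / stencil buffers to free the space for the next frame.
  gl.clear(gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT | gl.STENCIL_BUFFER_BIT);

  return renderEnv;
}
```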
Further, the terminal device may also determine whether the rendering environment parameters include the RGBA color values of the target control. If the rendering environment parameters do not include the RGBA color values of the target control, the terminal device acquires the bitmap information of the next frame so as to update the rendering environment parameters determined for the current frame. If the rendering environment parameters include the RGBA color values of the target control, the terminal device creates a target shading container and fills the pixel parameters of the picture vertices of the current frame based on the RGBA color values to obtain a first result. For example, the terminal device may connect the points in the vertex data into an image through the target shading container and fill each pixel with an RGBA color value. Further, the terminal device also determines a second result based on the motion-effect execution parameters of the current frame. The second result is used to indicate the relative positions of the picture vertices, and the relative positions indicate the relative situation between the picture vertices and the decoder when the motion effect of the target control changes. For example, the terminal device may determine, based on the motion-effect execution parameters of the current frame, the matrix variables of the target control under the translation, rotation and scaling effects, and multiply these matrix variables to obtain a combined matrix variable. The terminal device may then fuse the combined matrix variable into the matrix variable of the next frame so as to obtain the relative position of each picture vertex. In this way, the terminal device performs shading and drawing on the first result and the second result based on the target shading container, obtaining the target rendering result of the cloud game. In this manner, the terminal device completes the filling of the current frame's game picture based on the RGBA color values of the target control and on the second result corresponding to the game picture when the motion effect of the target control changes, and the first result and the second result obtained by the filling can then be rendered in the same created target shading container, thereby completing the same-layer rendering of the target control and the game picture. That is, the application places the target control and the game picture in a target shading container of the same level and renders them in the same layer, which can effectively avoid the hierarchy confusion caused by the target control covering the player, and can increase the smoothness of playing the rendering result. A sketch of this same-layer rendering step follows.
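The following TypeScript sketch illustrates the same-layer rendering step under the same WebGL assumption: the matrix variables for translation, rotation and scaling are multiplied together, and the target control's RGBA color and the combined matrix are then drawn through a single shader program standing in for the target shading container. The matrix helpers and the shader attribute/uniform names are assumptions.

```typescript
type Mat4 = Float32Array; // 4x4 column-major matrix

// Assumed small matrix helpers (e.g. from a utility library).
declare function multiply(a: Mat4, b: Mat4): Mat4;
declare function translationMatrix(dx: number, dy: number): Mat4;
declare function rotationMatrix(rad: number): Mat4;
declare function scalingMatrix(sx: number, sy: number): Mat4;

// Multiply the per-effect matrix variables; the product is fused into the
// next frame to obtain the relative position of each picture vertex.
function composeMotionMatrix(dx: number, dy: number, rad: number, s: number): Mat4 {
  return multiply(multiply(translationMatrix(dx, dy), rotationMatrix(rad)), scalingMatrix(s, s));
}

function renderSameLayer(
  gl: WebGLRenderingContext,
  program: WebGLProgram,                           // stands in for the "target shading container"
  controlColor: [number, number, number, number],  // RGBA color values of the target control
  motion: Mat4,                                    // combined translation / rotation / scaling matrix
) {
  gl.useProgram(program);
  // First result: fill the picture vertices of the current frame with the RGBA color.
  gl.vertexAttrib4f(gl.getAttribLocation(program, "a_color"), ...controlColor);
  // Second result: relative positions of the picture vertices under the motion effect.
  gl.uniformMatrix4fv(gl.getUniformLocation(program, "u_motion"), false, motion);
  // Shade and draw both results in the same container: same-layer rendering
  // of the target control and the game picture.
  gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);
}
```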
306. And the terminal equipment displays the target rendering result.
In this example, after the terminal device renders the target rendering result, the rendering result can be displayed on a display interface of the terminal device, so that a user can check the rendering result conveniently.
In the embodiments of the present application, after sending a game operation event for the cloud game to the first server, the terminal device enables the first server to determine a target call protocol based on the game operation event. The target call protocol can be understood as the call protocol used when a target function module in the terminal device is called, and the target function module provides the processing capability required to execute the game operation event. In this way, after receiving the game picture of the cloud game and the target call protocol sent by the first server, the terminal device can call the target function module based on the target call protocol to generate a target control corresponding to the target function module, and then render the target control and the game picture to obtain a target rendering result of the cloud game. Finally, the terminal device displays the target rendering result. In this way, the terminal device does not need to inform the first server of information about its own system capabilities, and the rendering of the cloud game does not need to be completed on the first server side; instead, the first server determines the corresponding target call protocol from the game operation event and sends the required target call protocol to the terminal device, and the corresponding target function module is called on the terminal device side as far as possible based on the target call protocol, so that the user's control operations on the cloud game are responded to in a timely manner and the gaming experience is improved. Compared with the conventional scheme in which the terminal device informs the first server of information such as its system capabilities, the present application completes the rendering of the cloud game on the terminal device side, can reasonably reuse the system capabilities supported by the terminal device, and improves the user experience.
In addition, when loading the video stream of the cloud game, existing schemes use the relevant player to decode the data, so the video stream sent by the server has to be continuously adjusted to adapt to the web page or the mobile web page. In the embodiment of the present application, however, the corresponding function modules are executed by means of call protocols, so that the supported system capabilities or basic components of the terminal device are multiplexed as much as possible. For example, FIG. 6 shows a schematic diagram of multiplexing a "pop-up window function module". As shown in FIG. 6, whenever the user's operation of the virtual character in the cloud game reaches the same situation (for example, entering the top five), the terminal device repeatedly calls the same "whether to continue" pop-up window function module. Therefore, on the basis of multiplexing the supported system capabilities or basic components of the terminal device, the user experience is improved while both the web page and the mobile web page are supported. Specifically, this can be understood with reference to FIG. 7A, which shows a schematic diagram of the application on a web page, and FIG. 7B, which shows a schematic diagram of the application on a mobile page. For example, as shown in FIG. 7A, the game picture of the cloud game shown in FIG. 6 can be displayed in a web page; or, as shown in FIG. 7B, the game picture of the cloud game shown in FIG. 6 can be displayed in a mobile page.
The foregoing description of the solution provided by the embodiments of the present application has been mainly presented in terms of a method. It should be understood that, in order to implement the above-described functions, hardware structures and/or software modules corresponding to the respective functions are included. Those of skill in the art will readily appreciate that the various illustrative modules and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the application can divide the functional modules of the device according to the method example, for example, each functional module can be divided corresponding to each function, and two or more functions can be integrated in one processing module. The integrated modules may be implemented in hardware or in software functional modules. It should be noted that, in the embodiment of the present application, the division of the modules is schematic, which is merely a logic function division, and other division manners may be implemented in actual implementation.
The terminal device in the embodiment of the present application is described in detail below. Fig. 8 is a schematic diagram of an embodiment of the terminal device provided in the embodiment of the present application. As shown in fig. 8, the terminal device may include a sending unit 801, an obtaining unit 802, and a processing unit 803.

The sending unit 801 is configured to send a game operation event for the cloud game to the first server, where the game operation event is used by the first server to determine a target call protocol, the target call protocol is the call protocol used when a target function module is called, and the target function module is used to execute the processing capability required by the game operation event. For details, refer to the foregoing description of step 301 in fig. 3, which is not repeated here.

The obtaining unit 802 is configured to receive the game picture of the cloud game and the target call protocol sent by the first server. For details, refer to the foregoing description of step 303 in fig. 3, which is not repeated here.

The processing unit 803 is configured to call the target function module based on the target call protocol to generate a target control corresponding to the target function module. For details, refer to the foregoing description of step 304 in fig. 3, which is not repeated here.

The processing unit 803 is further configured to render the target control and the game picture to obtain a target rendering result of the cloud game. For details, refer to the foregoing description of step 305 in fig. 3, which is not repeated here.

The processing unit 803 is further configured to display the target rendering result. For details, refer to the foregoing description of step 306 in fig. 3, which is not repeated here.
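The decomposition of fig. 8 can be pictured as the following TypeScript interfaces; the unit names mirror the description above, while the concrete field and method types are assumptions made only for illustration.

```typescript
// Type-level sketch of the fig. 8 decomposition; the concrete field and
// method types are assumptions made only for illustration.

interface GameOperationEvent { type: string; payload?: unknown; }
interface GameFrame { data: Uint8Array; }
interface TargetControl { element: HTMLElement; }

interface SendingUnit {            // sending unit 801
  send(event: GameOperationEvent): Promise<void>;
}

interface ObtainingUnit {          // obtaining unit 802
  receive(): Promise<{ frame: GameFrame; targetCallProtocol: string }>;
}

interface ProcessingUnit {         // processing unit 803
  invokeModule(protocol: string): TargetControl;
  render(control: TargetControl, frame: GameFrame): ImageData;
  display(result: ImageData): void;
}
```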
In some optional embodiments, the obtaining unit 802 is configured to obtain a first mapping table, where the first mapping table is used to indicate an association relationship between each calling protocol and each function module in the terminal device, the target calling protocol is one or more of each calling protocol, and the target function module is one or more of each function module. The processing unit 803 is configured to call the target function module based on the target call protocol and the first mapping table.
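A first mapping table of this kind can be as simple as a dictionary from call protocols to function-module factories. The sketch below is an assumed shape; the protocol strings and modules are illustrative only.

```typescript
// Assumed shape of the first mapping table: call protocol -> function module.
// The protocol strings and modules below are illustrative only.

type FunctionModule = () => HTMLElement;

const firstMappingTable = new Map<string, FunctionModule>([
  ["popup://confirm", () => {
    const d = document.createElement("dialog");
    d.textContent = "Continue?";
    return d;
  }],
  ["toast://notify", () => {
    const t = document.createElement("div");
    t.className = "toast";
    return t;
  }],
]);

// Calling the target function module based on the target call protocol
// and the first mapping table.
function invokeTarget(protocol: string): HTMLElement {
  const module = firstMappingTable.get(protocol);
  if (!module) throw new Error(`no function module for ${protocol}`);
  return module();
}
```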
In other alternative embodiments, the obtaining unit 802 is configured to obtain target execution parameters, where the target execution parameters include a target calling protocol and a calling parameter when the target calling protocol is called. The processing unit 803 is configured to generate a target control corresponding to the target function module based on the target execution parameter.
In other optional embodiments, the obtaining unit 802 is further configured to obtain the callback parameter corresponding to the target function module after generating the target control corresponding to the target function module based on the target execution parameter. The processing unit 803 is configured to perform callback processing on the target control based on the callback parameter, so as to invoke the target control.
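The target execution parameters and callback parameters described in these embodiments might be pictured as follows; all field names (targetCallProtocol, callParams, onAction) are assumptions.

```typescript
// Assumed shapes for the target execution parameters and callback parameters.

interface TargetExecutionParams {
  targetCallProtocol: string;           // which function module to call
  callParams: Record<string, unknown>;  // call parameters used on invocation
}

interface CallbackParams {
  onAction: (result: unknown) => void;  // invoked when the control is used
}

function generateTargetControl(
  exec: TargetExecutionParams,
  cb: CallbackParams,
): HTMLElement {
  // Generate the target control from the execution parameters (illustrative).
  const control = document.createElement("button");
  control.textContent = String(exec.callParams["label"] ?? "OK");

  // Callback processing: wire the callback parameter so the control
  // can be invoked later.
  control.addEventListener("click", () => cb.onAction(exec.callParams));
  return control;
}
```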
In other alternative embodiments, the obtaining unit 802 is configured to obtain bitmap information of the current frame in the game picture, where the bitmap information indicates the frame condition of the current frame in the game picture. The processing unit 803 is configured to: determine rendering environment parameters based on the bitmap information of the current frame; and render the target control and the game picture based on the rendering environment parameters to obtain a target rendering result of the cloud game.
In other alternative embodiments, the processing unit 803 is configured to: create a target shading container when the rendering environment parameters include RGBA color values of the target control; fill pixel parameters of the picture vertices of the current frame based on the RGBA color values to obtain a first result; determine a second result based on the dynamic effect execution parameters of the current frame, where the second result indicates the relative positions of the picture vertices, and the relative positions indicate the relation between the picture vertices and the decoder when the dynamic effect of the target control changes; and perform shading and drawing on the first result and the second result based on the target shading container to obtain a target rendering result of the cloud game.
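As a rough illustration of this rendering path, the sketch below computes a per-vertex color fill from the control's RGBA value (the first result) and relative vertex positions from an animation parameter (the second result), then hands both to a drawing step. The quad layout, the slide-style animation model, and the shadeAndDraw stub are assumptions; a real implementation would upload these values as attributes of a WebGL program acting as the shading container.

```typescript
// Illustrative sketch of the two intermediate results described above.
// The quad layout, the animation model and the draw step are assumptions.

type RGBA = [number, number, number, number];
type Vec2 = [number, number];

// "First result": fill the picture vertices of the current frame with the
// control's RGBA color (one color per vertex of a simple quad here).
function fillVertexColors(rgba: RGBA, vertexCount = 4): RGBA[] {
  return Array.from({ length: vertexCount }, () => [...rgba] as RGBA);
}

// "Second result": relative positions of the picture vertices while the
// control's dynamic effect plays, expressed relative to the decoded frame's
// quad (assumed model: a horizontal slide driven by progress in [0, 1]).
function relativeVertexPositions(progress: number): Vec2[] {
  const dx = 0.2 * progress; // assumed slide distance
  return [
    [-0.5 + dx, -0.5],
    [ 0.5 + dx, -0.5],
    [ 0.5 + dx,  0.5],
    [-0.5 + dx,  0.5],
  ];
}

// Hand both results to the shading container for drawing. A real
// implementation would upload them as vertex attributes of a WebGL
// program; here the draw step is only logged.
function shadeAndDraw(colors: RGBA[], positions: Vec2[]): void {
  console.log("draw quad", { colors, positions });
}

const first = fillVertexColors([1, 0, 0, 0.5]); // semi-transparent red control
const second = relativeVertexPositions(0.3);
shadeAndDraw(first, second);
```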
In other optional embodiments, the obtaining unit 802 is configured to receive the game picture of the cloud game and the target call protocol sent by a second server, where the game picture of the cloud game and the target call protocol are sent by the first server to the second server.

In other optional embodiments, the obtaining unit 802 is configured to receive a video stream of the cloud game sent by the first server, where the video stream includes the game picture of the cloud game and the target call protocol. The processing unit 803 is configured to parse the video stream to obtain the game picture of the cloud game and the target call protocol.
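When the target call protocol travels inside the video stream, the terminal must split the two on arrival. The sketch below assumes a very simple container format — a 4-byte header length, a JSON header carrying the protocol, then the frame bytes; the actual packaging is not specified by the application.

```typescript
// Assumed container: a 4-byte big-endian header length, a JSON header
// carrying the target call protocol, then the encoded frame bytes.

interface ParsedMessage {
  targetCallProtocol: string;
  frameBytes: Uint8Array;
}

function parseStreamMessage(buf: ArrayBuffer): ParsedMessage {
  const view = new DataView(buf);
  const headerLen = view.getUint32(0);                   // header length
  const headerBytes = new Uint8Array(buf, 4, headerLen);
  const header = JSON.parse(new TextDecoder().decode(headerBytes));
  const frameBytes = new Uint8Array(buf, 4 + headerLen); // game picture bytes
  return { targetCallProtocol: header.targetCallProtocol, frameBytes };
}
```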
In other optional embodiments, the obtaining unit 802 is further configured to receive a game control operation of the cloud game before sending the game operation event for the cloud game to the first server. The processing unit 803 is configured to generate a game operation event based on the game control operation.
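Turning a raw game control operation into a game operation event before it is sent to the first server might look like the following; the event fields (type, x, y, ts) are assumptions.

```typescript
// Turning a raw game control operation (here a click on the game canvas)
// into a game operation event before it is sent to the first server.
// The event shape is an illustrative assumption.

interface TapOperationEvent { type: "tap"; x: number; y: number; ts: number; }

function attachControlListener(
  canvas: HTMLCanvasElement,
  send: (ev: TapOperationEvent) => void,
): void {
  canvas.addEventListener("click", (e: MouseEvent) => {
    const rect = canvas.getBoundingClientRect();
    send({
      type: "tap",
      x: e.clientX - rect.left,
      y: e.clientY - rect.top,
      ts: Date.now(),
    });
  });
}
```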
Fig. 8 describes the terminal device in the embodiment of the present application from the point of view of a modularized functional entity; the first server in the embodiment of the present application is described below from the same point of view. Fig. 9 shows a schematic structural diagram of the first server according to an embodiment of the present application. As shown in fig. 9, the first server includes an obtaining unit 901, a processing unit 902, and a sending unit 903.

The obtaining unit 901 is configured to obtain the game operation event for the cloud game sent by the terminal device. For details, refer to the foregoing description of step 301 in fig. 3, which is not repeated here.

The processing unit 902 is configured to determine a target call protocol based on the game operation event, where the target call protocol is the call protocol used when the target function module is called, and the target function module is used to execute the processing capability required by the game operation event. For details, refer to the foregoing description of step 302 in fig. 3, which is not repeated here.

The sending unit 903 is configured to send the game picture of the cloud game and the target call protocol to the terminal device, where the target call protocol is used by the terminal device to call the target function module, so that the terminal device generates a target control corresponding to the target function module and renders the target control together with the game picture to obtain a target rendering result of the cloud game. For details, refer to the foregoing description of step 303 in fig. 3, which is not repeated here.
In some optional embodiments, the processing unit 902 is further configured to, before determining the target call protocol based on the game operation event: determine the call protocol corresponding to each function module in the terminal device; and construct the association between each call protocol and its corresponding function module to generate a first mapping table, where the first mapping table indicates the association between each call protocol and each function module in the terminal device. The processing unit 902 is configured to: determine the function module identifier corresponding to the game operation event; and determine the target call protocol from the first mapping table based on the function module identifier.
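On the first server side, building the first mapping table and resolving an event to the target call protocol could be sketched as below; the module identifiers and the event-to-module rule are assumptions.

```typescript
// Server-side sketch: build the first mapping table and resolve a game
// operation event to the target call protocol. All identifiers are assumed.

interface GameOperationEvent { type: string; }

// Call protocol registered for each function module of the terminal device.
const firstMappingTable = new Map<string, string>([
  ["popup", "popup://confirm"],
  ["toast", "toast://notify"],
]);

// Assumed rule mapping an operation event to a function module identifier.
function moduleIdForEvent(ev: GameOperationEvent): string {
  return ev.type === "enterTopFive" ? "popup" : "toast";
}

function determineTargetCallProtocol(ev: GameOperationEvent): string {
  const moduleId = moduleIdForEvent(ev);
  const protocol = firstMappingTable.get(moduleId);
  if (!protocol) throw new Error(`no call protocol for module ${moduleId}`);
  return protocol;
}
```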
In other optional embodiments, the sending unit 903 is configured to send the game picture of the cloud game and the target call protocol to a second server, where the second server is configured to send the game picture of the cloud game and the target call protocol to the terminal device.
The rendering processing apparatus in the embodiment of the present application has been described above from the point of view of a modularized functional entity; it is described below from the point of view of hardware processing. The rendering processing apparatus may be the terminal device shown in fig. 8, the first server shown in fig. 9, or the like. Fig. 10 is a schematic structural diagram of a rendering processing apparatus according to an embodiment of the present application. The rendering processing apparatus may vary considerably depending on its configuration or performance, and may include at least one processor 1001, a communication line 1007, a memory 1003, and at least one communication interface 1004.
The processor 1001 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of the program of the present application.
Communication line 1007 may include a pathway to transfer information between the components.
The communication interface 1004 uses any apparatus such as a transceiver to communicate with other devices or communication networks, such as an Ethernet, a radio access network (RAN), or a wireless local area network (WLAN).
The memory 1003 may be a read-only memory (ROM) or another type of static storage device that can store static information and instructions, or a random access memory (RAM) or another type of dynamic storage device that can store information and instructions. The memory may be standalone and coupled to the processor via the communication line 1007, or may be integrated with the processor.
The memory 1003 is configured to store the computer-executable instructions for executing the solution of the present application, and their execution is controlled by the processor 1001. The processor 1001 is configured to execute the computer-executable instructions stored in the memory 1003, thereby implementing the method provided by the above embodiments of the present application.
Alternatively, the computer-executable instructions in the embodiments of the present application may be referred to as application program codes, which are not particularly limited in the embodiments of the present application.
In a specific implementation, as an embodiment, the rendering processing apparatus may include a plurality of processors, such as the processor 1001 and the processor 1002 in fig. 10. Each of these processors may be a single-core (single-CPU) processor or may be a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
In a specific implementation, as an embodiment, the rendering processing apparatus may further include an output device 1005 and an input device 1006. The output device 1005 communicates with the processor 1001 and may display information in a variety of ways. The input device 1006 communicates with the processor 1001 and may receive input from a target object in a variety of ways. For example, the input device 1006 may be a mouse, a touch screen device, a sensing device, or the like.
The rendering processing apparatus may be a general-purpose device or a special-purpose device. In a specific implementation, the rendering processing apparatus may be a server, a terminal device, or the like, or a device having a structure similar to that shown in fig. 10. The embodiment of the application does not limit the type of the rendering processing apparatus.
It should be noted that, the processor 1001 in fig. 10 may cause the rendering processing apparatus to execute the method in the method embodiment corresponding to fig. 2 by calling the computer-executable instructions stored in the memory 1003.
Specifically, the functions/implementation processes of the processing unit 803 in fig. 8 and the processing unit 902 in fig. 9 may be implemented by the processor 1001 in fig. 10 calling the computer-executable instructions stored in the memory 1003. The functions/implementation processes of the obtaining unit 802 and the sending unit 801 in fig. 8, and of the obtaining unit 901 and the sending unit 903 in fig. 9, may be implemented by the communication interface 1004 in fig. 10.
The embodiment of the present application also provides a computer storage medium storing a computer program for electronic data exchange, where the computer program causes a computer to execute some or all of the steps of any rendering processing method described in the above method embodiments.

The embodiment of the present application also provides a computer program product comprising a non-transitory computer-readable storage medium storing a computer program, where the computer program is operable to cause a computer to perform some or all of the steps of any rendering processing method described in the above method embodiments.
In the above embodiments, the implementation may be realized in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized in whole or in part in the form of a computer program product.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided in the present application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of elements is merely a logical functional division, and there may be additional divisions of actual implementation, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be embodied in essence or a part contributing to the prior art or all or part of the technical solution in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods of the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, or other various media capable of storing program codes.
The above-described embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof, and when implemented in software, may be implemented in whole or in part in the form of a computer program product.
The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) manner. The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid-state drive (SSD)), or the like.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application.
Claims (17)
1. A method of rendering processing, applied to a terminal device, the method comprising:
sending a game operation event for a cloud game to a first server, wherein the game operation event is used by the first server to determine a target calling protocol, the target calling protocol is a calling protocol used when a target function module in the terminal device is called, and the target function module is used for executing a processing capability required by the game operation event;
receiving a game picture of the cloud game and the target calling protocol sent by the first server;
calling the target function module based on the target calling protocol to generate a target control corresponding to the target function module;
rendering the target control and the game picture to obtain a target rendering result of the cloud game; and
displaying the target rendering result.
2. The method of claim 1, wherein the calling the target function module based on the target calling protocol comprises:
Acquiring a first mapping table, wherein the first mapping table is used for indicating the association relation between each calling protocol and each functional module in the terminal equipment, the target calling protocol is one or more of the calling protocols, and the target functional module is one or more of the functional modules;
And calling the target function module based on the target calling protocol and the first mapping table.
3. The method according to any one of claims 1 to 2, wherein the calling the target function module based on the target calling protocol to generate a target control corresponding to the target function module comprises:
Acquiring target execution parameters, wherein the target execution parameters comprise the target calling protocol and calling parameters when the target calling protocol is called;
and generating a target control corresponding to the target function module based on the target execution parameter.
4. The method of claim 3, wherein after generating the target control corresponding to the target function module based on the target execution parameter, the method further comprises:
Obtaining callback parameters corresponding to the target function module;
and carrying out callback processing on the target control based on the callback parameter so as to call the target control.
5. The method according to any one of claims 1 to 2, wherein the rendering the target control and the game picture to obtain a target rendering result of the cloud game comprises:
Acquiring bitmap information of a current frame in the game picture, wherein the bitmap information is used for indicating the frame condition of the current frame in the game picture;
determining rendering environment parameters based on bitmap information of the current frame;
and rendering the target control and the game picture based on the rendering environment parameters to obtain a target rendering result of the cloud game.
6. The method of claim 5, wherein the rendering the target control and the game picture based on the rendering environment parameters to obtain a target rendering result of the cloud game comprises:
creating a target shading container when the rendering environment parameters include RGBA color values of the target control;
filling pixel parameters of the picture vertexes of the current frame based on the RGBA color values to obtain a first result;
determining a second result based on the dynamic effect execution parameters of the current frame, wherein the second result is used for indicating the relative position of the picture vertex, and the relative position is used for indicating the relation between the picture vertex and a decoder when the dynamic effect of the target control changes;
and performing shading and drawing on the first result and the second result based on the target shading container to obtain a target rendering result of the cloud game.
7. The method according to any one of claims 1 to 2, wherein the receiving the game picture of the cloud game and the target call protocol sent by the first server comprises:
And receiving the game picture of the cloud game and the target call protocol sent by a second server, wherein the game picture of the cloud game and the target call protocol are sent to the second server by the first server.
8. The method according to any one of claims 1 to 2, wherein the receiving the game picture of the cloud game and the target call protocol sent by the first server comprises:
Receiving a video stream of the cloud game sent by the first server, wherein the video stream comprises a game picture of the cloud game and the target call protocol;
and analyzing the video stream to obtain a game picture of the cloud game and the target call protocol.
9. The method according to any one of claims 1 to 2, wherein before sending a game operation event for a cloud game to the first server, the method further comprises:
Receiving a game control operation of the cloud game;
The game operation event is generated based on the game control operation.
10. A method of rendering processing, applied to a first server, the method comprising:
acquiring a game operation event aiming at a cloud game, which is sent by terminal equipment;
determining a target calling protocol based on the game operation event, wherein the target calling protocol is a calling protocol used when a target functional module is called, and the target functional module is used for executing processing capacity required by the game operation event;
And sending a game picture of the cloud game and the target calling protocol to the terminal equipment, wherein the target calling protocol is used for calling a target functional module by the terminal equipment so that the terminal equipment generates a target control corresponding to the target functional module, and rendering the target control and the game picture to obtain a target rendering result of the cloud game.
11. The method of claim 10, wherein prior to determining a target call protocol based on the game play event, the method further comprises:
determining a calling protocol corresponding to each functional module in the terminal equipment;
Constructing an association relation between each calling protocol and the corresponding functional module to generate a first mapping table, wherein the first mapping table is used for indicating the association relation between each calling protocol and each functional module in the terminal equipment;
the determining a target call protocol based on the game operation event includes:
Determining a function module identifier corresponding to the game operation event;
and determining a target calling protocol from the first mapping table based on the identification of the functional module.
12. The method according to any one of claims 10 to 11, wherein the sending the game picture of the cloud game and the target call protocol to the terminal device comprises:
And sending the game picture of the cloud game and the target call protocol to a second server, wherein the second server is used for sending the game picture of the cloud game and the target call protocol to the terminal equipment.
13. A terminal device, comprising:
a sending unit, configured to send a game operation event for a cloud game to a first server, where the game operation event is used for the first server to determine a target call protocol, where the target call protocol is a call protocol used when a target function module is called, and the target function module is used for executing processing capability required by the game operation event;
the acquisition unit is used for receiving the game picture of the cloud game and the target call protocol, which are sent by the first server;
the processing unit is used for calling the target function module based on the target calling protocol so as to generate a target control corresponding to the target function module;
The processing unit is used for rendering the target control and the game picture to obtain a target rendering result of the cloud game;
The processing unit is used for displaying the target rendering result.
14. A first server, characterized by comprising:
The acquisition unit is used for acquiring game operation events aiming at the cloud game and sent by the terminal equipment;
The processing unit is used for determining a target calling protocol based on the game operation event, wherein the target calling protocol is a calling protocol used when a target functional module is called, and the target functional module is used for executing the processing capacity required by the game operation event;
The sending unit is used for sending the game picture of the cloud game and the target calling protocol to the terminal equipment, wherein the target calling protocol is used for the terminal equipment to call a target function module so that the terminal equipment generates a target control corresponding to the target function module, and rendering is carried out on the target control and the game picture to obtain a target rendering result of the cloud game.
15. A rendering processing apparatus, characterized in that the rendering processing apparatus comprises: an input/output (I/O) interface, a processor, and a memory, the memory having program instructions stored therein;
The processor is configured to execute the program instructions stored in the memory to perform the method of any one of claims 1 to 9, or to perform the method of any one of claims 10 to 12.
16. A computer readable storage medium comprising instructions which, when run on a computer device, cause the computer device to perform the method of any one of claims 1 to 9 or to perform the method of any one of claims 10 to 12.
17. A computer program product, characterized in that the computer program product comprises instructions which, when run on a computer device or a processor, cause the computer device or the processor to perform the method of any one of claims 1 to 9 or to perform the method of any one of claims 10 to 12.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310390253.3A CN118781242A (en) | 2023-04-04 | 2023-04-04 | Rendering processing method, device, equipment, storage medium and program product |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310390253.3A CN118781242A (en) | 2023-04-04 | 2023-04-04 | Rendering processing method, device, equipment, storage medium and program product |
Publications (1)
Publication Number | Publication Date |
---|---|
CN118781242A true CN118781242A (en) | 2024-10-15 |
Family
ID=92994246
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310390253.3A Pending CN118781242A (en) | 2023-04-04 | 2023-04-04 | Rendering processing method, device, equipment, storage medium and program product |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN118781242A (en) |
- 2023-04-04 CN CN202310390253.3A patent/CN118781242A/en active Pending
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP4122568A1 (en) | Data processing method and device and storage medium | |
AU2019233201B2 (en) | Resource configuration method and apparatus, terminal, and storage medium | |
JP6310073B2 (en) | Drawing system, control method, and storage medium | |
US9814979B2 (en) | Data provision system, provision apparatus, execution apparatus, control method, and recording medium | |
CN113457160B (en) | Data processing method, device, electronic equipment and computer readable storage medium | |
CN112791399B (en) | Method, device, system, medium and electronic equipment for displaying cloud game picture | |
CN107566346B (en) | 3D game data transmission method and device, storage medium and electronic device | |
TWI421118B (en) | Online gaming system and method of resources to handle online games | |
CN112316433B (en) | Game picture rendering method, device, server and storage medium | |
CN110213265B (en) | Image acquisition method, image acquisition device, server and storage medium | |
CN108389241A (en) | The methods, devices and systems of textures are generated in scene of game | |
CN112023402B (en) | Game data processing method, device, equipment and medium | |
JP7111822B2 (en) | Group gameplay with users in close proximity using gaming platforms | |
CN112307403A (en) | Page rendering method, device, storage medium and terminal | |
CN111672132B (en) | Game control method, game control device, server, and storage medium | |
CN105597314B (en) | Rendering system and method of 2D game and terminal equipment | |
CN111437610B (en) | Cloud game system processing method and device and cloud server | |
CN115794139A (en) | Mirror image data processing method, device, equipment and medium | |
CN115955590A (en) | Video processing method, video processing device, computer equipment and medium | |
CN113244609A (en) | Multi-picture display method and device, storage medium and electronic equipment | |
CN112245906A (en) | Data synchronization method and device, electronic equipment and storage medium | |
CN118781242A (en) | Rendering processing method, device, equipment, storage medium and program product | |
CN116966546A (en) | Image processing method, apparatus, medium, device, and program product | |
CN113908532B (en) | Data processing method and device, storage medium and electronic equipment | |
JP6054677B2 (en) | Processing system, information processing apparatus, control method, program, and recording medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |