
CN114937103A - Model rendering method and device for dynamic effect, electronic equipment and storage medium - Google Patents

Model rendering method and device for dynamic effect, electronic equipment and storage medium Download PDF

Info

Publication number
CN114937103A
CN114937103A (application CN202210601263.2A)
Authority
CN
China
Prior art keywords
parameter
map
material ball
transparency
dynamic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210601263.2A
Other languages
Chinese (zh)
Inventor
黄旭帆
钱静
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202210601263.2A
Publication of CN114937103A
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00Animation
    • G06T13/203D [Three Dimensional] animation
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/02Non-photorealistic rendering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/04Texture mapping
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/10Geometric effects
    • G06T15/20Perspective computation
    • G06T15/205Image-based rendering
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/66Methods for processing data by generating or executing the game program for rendering three dimensional images

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Multimedia (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Image Generation (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application provides a model rendering method and device for a dynamic effect, an electronic device, and a storage medium. An initial transparency parameter of an initial material ball is obtained according to material information of a target model; the initial transparency parameter of the initial material ball is adjusted based on a pixel parameter of a texture map to obtain a target material ball whose transparency parameter changes over time, wherein the texture map comprises a dynamic texture map used for rendering a dynamic effect on the surface texture of the target model; and the target model is rendered by using the target material ball to obtain the dynamic effect of the target model. Therefore, without producing an animation special effect, a target model with a dynamic effect can be simulated through the time-varying transparency parameter carried by the target material ball, so that, by avoiding animation special effects, the computational consumption of the terminal device during rendering is reduced and the difficulty of producing the dynamic effect is lowered.

Description

Model rendering method and device for dynamic effect, electronic equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method and an apparatus for rendering a dynamic effect model, an electronic device, and a storage medium.
Background
To simulate a picture with a dynamic effect in a game, a special-effect animation is usually produced. Simulating a dynamic effect with a special-effect animation requires calling a relevant model to use a large number of particles, and many special-effect pictures must be made so that the style changes as the dynamic effect continuously changes.
During rendering, as the number of special-effect pictures increases, the computation consumed by the terminal device increases; rendering in this way places extremely high requirements on the performance of the terminal device, and because the number of special-effect pictures is large, rendering takes a long time and the rendering rate is low.
Disclosure of Invention
In view of the above, an object of the present application is to provide a model rendering method and apparatus for a dynamic effect, an electronic device, and a storage medium, which can simulate a target model with a dynamic effect through a time-varying transparency parameter carried by a target material ball without producing an animation special effect, thereby reducing the computational consumption of the terminal device during rendering and lowering the difficulty of producing the dynamic effect by avoiding animation special effects.
The embodiment of the application provides a model rendering method for dynamic effects, which comprises the following steps:
acquiring an initial transparency parameter of an initial material ball according to the material information of the target model;
adjusting an initial transparency parameter of the initial material ball based on a pixel parameter of the texture mapping to obtain a target material ball with the transparency parameter changing along with time; wherein the texture map comprises a dynamic texture map; the dynamic texture mapping is used for performing dynamic effect rendering on the surface texture of the target model;
and rendering the target model by using the target material ball to obtain the dynamic effect of the target model.
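As a rough illustration only (not the claimed implementation), the three steps above can be sketched in Python/NumPy, reducing a "material ball" to a per-pixel transparency array and letting a hypothetical time-varying function stand in for the dynamic texture map; all names here are assumptions for the sketch:

```python
import numpy as np

def get_initial_alpha(material_info, shape=(64, 64)):
    # Step 1 (sketch): derive a uniform initial transparency from material info.
    return np.full(shape, material_info.get("base_alpha", 1.0))

def adjust_alpha(initial_alpha, dynamic_map, t):
    # Step 2 (sketch): sample the dynamic texture map at time t and
    # modulate the initial transparency with its pixel values.
    frame = dynamic_map(t)                  # per-pixel values in [0, 1]
    return np.clip(initial_alpha * frame, 0.0, 1.0)

def render(model_rgb, alpha):
    # Step 3 (sketch): composite the model using the time-varying alpha.
    return model_rgb * alpha[..., None]

# Usage: a scrolling sine pattern stands in for the dynamic texture map.
dyn = lambda t: 0.5 + 0.5 * np.sin(np.linspace(0.0, 6.28, 64)[None, :] + t)
alpha0 = get_initial_alpha({"base_alpha": 0.8})
alpha_t = adjust_alpha(alpha0, dyn, t=1.0)
frame = render(np.ones((64, 64, 3)), alpha_t)
```

Re-evaluating `adjust_alpha` at successive values of `t` is what makes the rendered transparency change over time without any particle system or per-frame special-effect pictures.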
In one possible implementation, the texture map further comprises a static texture map; the adjusting the initial transparency parameter of the initial material ball based on the pixel parameter of the texture map to obtain the target material ball whose transparency parameter changes over time comprises the following steps:
adjusting the initial transparency parameter of the initial material ball by using the first pixel parameter of the static texture map to obtain a first material ball with a first transparency parameter; wherein the static texture map is used for performing static pattern rendering on the surface texture of the target model;
and adjusting a first transparency parameter of the first material ball by using the second pixel parameter of the dynamic texture map changing along with the time to obtain a target material ball of which the transparency parameter changes along with the time.
In one possible implementation, the static texture map comprises a foreground pattern map; the adjusting the initial transparency parameter of the initial material ball by using the first pixel parameter of the static texture map to obtain the first material ball with the first transparency parameter includes:
determining a transparent area and a non-transparent area in the initial material ball according to the surface texture of the target model;
adjusting the initial transparency parameter of the non-transparent area by using the first pixel parameter of the foreground pattern map to obtain a first material ball with a first transparency parameter; wherein the foreground pattern map is used for rendering foreground patterns in the surface texture of the target model.
In a possible implementation manner, the adjusting an initial transparency parameter of the initial material ball by using a first pixel parameter of the foreground pattern map to obtain a first material ball having a first transparency parameter includes:
adjusting the initial transparency parameter in the non-transparent area by using the first pixel parameter of the foreground pattern map to obtain a second material ball with a second transparency parameter;
synthesizing the initial material ball and the second material ball by using the first transparency difference value of each pixel position between the initial transparency parameter and the second transparency parameter to obtain a third material ball with a third transparency parameter;
performing inversion processing on the initial transparency parameter of the initial material ball to obtain a reverse material ball with a reverse transparency parameter;
synthesizing the second material ball and the reverse material ball by using a second transparency difference value of each pixel position between the second transparency parameter and the reverse transparency parameter to obtain a fourth material ball with a fourth transparency parameter;
synthesizing the third material ball and the fourth material ball by using the first transparency maximum value of each pixel position between the third transparency parameter and the fourth transparency parameter to obtain a fifth material ball with a fifth transparency parameter;
and synthesizing the second material ball and the fifth material ball by using the second transparency maximum value of each pixel position between the second transparency parameter and the fifth transparency parameter to obtain the first material ball with the first transparency parameter.
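The compositing chain above can be sketched in NumPy, treating each material ball as a per-pixel transparency array. This is an assumption-laden illustration: the patent does not specify the exact blend operations, so "difference" is taken here as absolute per-pixel difference and "synthesizing by the maximum value" as a per-pixel maximum:

```python
import numpy as np

def composite_chain(initial, foreground, mask):
    # mask: 1 in the non-transparent area, 0 in the transparent area (assumed form).
    # Second ball: foreground pattern applied inside the non-transparent area.
    second = np.where(mask > 0, initial * foreground, initial)
    # Third ball: per-pixel transparency difference of initial and second.
    third = np.abs(initial - second)
    # Reverse ball: inversion of the initial transparency.
    reverse = 1.0 - initial
    # Fourth ball: per-pixel difference of second and reverse.
    fourth = np.abs(second - reverse)
    # Fifth ball: per-pixel maximum of third and fourth.
    fifth = np.maximum(third, fourth)
    # First ball: per-pixel maximum of second and fifth.
    return np.maximum(second, fifth)
```

Each intermediate "ball" is just another transparency array, so the whole chain stays cheap: six element-wise passes over the image, with no particle simulation involved.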
In one possible embodiment, the static texture map comprises an environment map, and the first pixel parameter comprises a gray pixel parameter and/or a pixel parameter of any color channel; the adjusting the initial transparency parameter of the initial material ball by using the first pixel parameter of the static texture map to obtain the first material ball with the first transparency parameter includes:
adjusting the initial transparency parameter of the initial material ball by utilizing the gray pixel parameter of the environment map or the pixel parameter of any color channel to obtain a first material ball with a first transparency parameter; wherein the environment map is used for rendering an environment pattern in the surface texture of the target model.
In a possible embodiment, the second pixel parameter includes a pixel parameter of any one color channel; the adjusting the first transparency parameter of the first material ball by using the second pixel parameter of the dynamic texture map changing along with the time to obtain the target material ball with the transparency parameter changing along with the time comprises the following steps:
and adjusting the first transparency parameter of the first material ball by using the pixel parameter of any color channel of the dynamic texture mapping changing along with the time to obtain a target material ball of which the transparency parameter changes along with the time.
In one possible embodiment, the dynamic texture map comprises a fog dynamic map; obtaining the fog dynamic mapping by the following steps:
acquiring an original noise map according to the surface texture of the target model;
carrying out offset processing on a first original coordinate of each pixel position in the original noise map by using a preset first offset parameter to obtain a first offset coordinate of each pixel position;
sampling the original noise map according to the first original coordinate of each pixel position and the first offset coordinate of each pixel position to obtain the fog dynamic map; and the fog dynamic mapping is used for carrying out fog effect rendering on the surface texture of the target model.
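A minimal sketch of the fog dynamic map construction, under the assumption that the "offset processing" is a wrapped UV shift of the noise map and that the two samples (original and offset coordinates) are blended by averaging; the specific blend is not stated in the source:

```python
import numpy as np

def fog_dynamic_map(noise, offset):
    # noise: H x W array; offset: (dy, dx) first offset parameter in pixels.
    h, w = noise.shape
    ys, xs = np.mgrid[0:h, 0:w]            # first original coordinates
    ys2 = (ys + offset[0]) % h             # first offset coordinates
    xs2 = (xs + offset[1]) % w
    # Sample the original noise map at both coordinate sets and blend.
    return 0.5 * (noise[ys, xs] + noise[ys2, xs2])

# Usage with a random noise map standing in for the original noise map.
fog = fog_dynamic_map(np.random.default_rng(0).random((8, 8)), (1, 2))
```

Animating the offset per frame (e.g. `offset = (0, int(speed * t))`) makes the blended noise drift across the surface, which is the fog-like motion the map is used for.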
In one possible embodiment, the dynamic texture map further comprises a shadow dynamic map; acquiring the dynamic shadow map by the following steps:
acquiring an original light and shadow map according to the surface texture of the target model;
performing offset processing on a second original coordinate of each pixel position in the original light and shadow map by using a preset second offset parameter to obtain a flowing light and shadow map; wherein each pixel location in the flow light map has a second offset coordinate;
according to the flicker speed and the flicker time of the light and shadow dynamic mapping on the texture surface of the target model, carrying out offset processing on a second original coordinate of each pixel position in the original light and shadow mapping to obtain a third offset coordinate of each pixel position;
sampling the original light and shadow map by using the third offset coordinate of each pixel position to obtain a flicker light and shadow map;
synthesizing the flowing light shadow map and the flickering light shadow map according to the second offset coordinate of each pixel position and the third offset coordinate of each pixel position to obtain the light shadow dynamic map; and the light and shadow dynamic mapping is used for rendering the light and shadow effect of the surface texture of the target model.
In one possible embodiment, the texture map comprises a foreground pattern map, an environment map, a fog dynamic map, and a light and shadow dynamic map; the adjusting the initial transparency parameter of the initial material ball based on the pixel parameter of the texture map to obtain the target material ball whose transparency parameter changes over time comprises the following steps:
gradually adjusting the initial transparency parameter of the initial material ball based on the pixel parameter of the foreground pattern map, the pixel parameter of the environment map, the pixel parameter of the fog dynamic map and the pixel parameter of the light and shadow dynamic map respectively to obtain a target material ball after gradual adjustment; wherein, the transparency parameter of the target material ball after gradual adjustment changes along with time;
the rendering processing is performed on the target model by using the target material ball to obtain the dynamic effect of the target model, and the method comprises the following steps:
rendering the target model by using the gradually adjusted target material ball to obtain a target model with a dynamic effect; the foreground pattern map and the environment map are displayed in the dynamic effect, and the dynamic effect has a fog dynamic effect and a light and shadow dynamic effect.
The embodiment of the present application further provides a model rendering device for dynamic effects, where the model rendering device includes:
the parameter acquisition module is used for acquiring an initial transparency parameter of the initial material ball according to the material information of the target model;
the parameter adjusting module is used for adjusting the initial transparency parameter of the initial material ball based on the pixel parameter of the texture map to obtain a target material ball with the transparency parameter changing along with time; wherein the texture map comprises a dynamic texture map; the dynamic texture mapping is used for performing dynamic effect rendering on the surface texture of the target model;
and the effect rendering module is used for rendering the target model by using the target material ball to obtain the dynamic effect of the target model.
In one possible implementation, the texture map further comprises a static texture map; when the parameter adjusting module is configured to adjust the initial transparency parameter of the initial material ball based on the pixel parameter of the texture map to obtain a target material ball whose transparency parameter changes over time, the parameter adjusting module is configured to:
adjusting the initial transparency parameter of the initial material ball by using the first pixel parameter of the static texture map to obtain a first material ball with a first transparency parameter; wherein the static texture map is used for performing static pattern rendering on the surface texture of the target model;
and adjusting a first transparency parameter of the first material ball by using the second pixel parameter of the dynamic texture map changing along with the time to obtain a target material ball of which the transparency parameter changes along with the time.
In one possible implementation, the static texture map comprises a foreground pattern map; when the parameter adjusting module is configured to adjust the initial transparency parameter of the initial material ball by using the first pixel parameter of the static texture map to obtain a first material ball having a first transparency parameter, the parameter adjusting module is configured to:
determining a transparent area and a non-transparent area in the initial material ball according to the surface texture of the target model;
adjusting the initial transparency parameter of the non-transparent area by using the first pixel parameter of the foreground pattern map to obtain a first material ball with a first transparency parameter; wherein the foreground pattern map is used for rendering foreground patterns in the surface texture of the target model.
In a possible implementation manner, when the parameter adjusting module is configured to adjust the initial transparency parameter of the non-transparent region by using the first pixel parameter of the foreground pattern map, so as to obtain the first material ball with the first transparency parameter, the parameter adjusting module is configured to:
adjusting the initial transparency parameter in the non-transparent area by using the first pixel parameter of the foreground pattern map to obtain a second material ball with a second transparency parameter;
synthesizing the initial material ball and the second material ball by using a first transparency difference value of each pixel position between the initial transparency parameter and the second transparency parameter to obtain a third material ball with a third transparency parameter;
performing inversion processing on the initial transparency parameter of the initial material ball to obtain a reverse material ball with a reverse transparency parameter;
synthesizing the second material ball and the reverse material ball by using a second transparency difference value of each pixel position between the second transparency parameter and the reverse transparency parameter to obtain a fourth material ball with a fourth transparency parameter;
synthesizing the third material ball and the fourth material ball by using the first transparency maximum value of each pixel position between the third transparency parameter and the fourth transparency parameter to obtain a fifth material ball with a fifth transparency parameter;
and synthesizing the second material ball and the fifth material ball by using the second transparency maximum value of each pixel position between the second transparency parameter and the fifth transparency parameter to obtain the first material ball with the first transparency parameter.
In one possible embodiment, the static texture map comprises an environment map, and the first pixel parameter comprises a gray pixel parameter and/or a pixel parameter of any color channel; when the parameter adjusting module is configured to adjust the initial transparency parameter of the initial material ball by using the first pixel parameter of the static texture map to obtain the first material ball having the first transparency parameter, the parameter adjusting module is configured to:
adjusting the initial transparency parameter of the initial material ball by utilizing the gray pixel parameter of the environment map or the pixel parameter of any color channel to obtain a first material ball with a first transparency parameter; wherein the environment map is used for rendering an environment pattern in the surface texture of the target model.
In a possible embodiment, the second pixel parameter includes a pixel parameter of any one color channel; when the parameter adjusting module is configured to adjust the first transparency parameter of the first material ball by using the second pixel parameter of the dynamic texture map changing with time to obtain the target material ball with the transparency parameter changing with time, the parameter adjusting module is configured to:
and adjusting the first transparency parameter of the first material ball by using the pixel parameter of any color channel of the dynamic texture mapping changing along with the time to obtain a target material ball with the transparency parameter changing along with the time.
In one possible embodiment, the dynamic texture map comprises a fog dynamic map; the parameter adjusting module is used for obtaining the fog dynamic mapping through the following steps:
acquiring an original noise map according to the surface texture of the target model;
carrying out offset processing on a first original coordinate of each pixel position in the original noise map by using a preset first offset parameter to obtain a first offset coordinate of each pixel position;
sampling the original noise map according to the first original coordinate of each pixel position and the first offset coordinate of each pixel position to obtain the fog dynamic map; and the fog dynamic mapping is used for performing fog effect rendering on the surface texture of the target model.
In one possible embodiment, the dynamic texture map further comprises a shadow dynamic map; the parameter adjusting module is used for obtaining the dynamic light and shadow map through the following steps:
acquiring an original light and shadow map according to the surface texture of the target model;
performing offset processing on a second original coordinate of each pixel position in the original light and shadow map by using a preset second offset parameter to obtain a flowing light and shadow map; wherein each pixel position in the flow light shadow map has a second offset coordinate;
according to the flicker speed and the flicker time of the dynamic light and shadow map on the texture surface of the target model, carrying out offset processing on a second original coordinate of each pixel position in the original light and shadow map to obtain a third offset coordinate of each pixel position;
sampling the original light and shadow map by using the third offset coordinate of each pixel position to obtain a flicker light and shadow map;
synthesizing the flowing light shadow map and the flickering light shadow map according to the second offset coordinate of each pixel position and the third offset coordinate of each pixel position to obtain the light shadow dynamic map; and the light and shadow dynamic mapping is used for rendering the light and shadow effect of the surface texture of the target model.
In one possible embodiment, the texture map comprises a foreground pattern map, an environment map, a fog dynamic map, and a light and shadow dynamic map; when the parameter adjusting module is configured to adjust the initial transparency parameter of the initial material ball based on the pixel parameter of the texture map to obtain a target material ball whose transparency parameter changes over time, the parameter adjusting module is configured to:
gradually adjusting the initial transparency parameter of the initial material ball based on the pixel parameter of the foreground pattern map, the pixel parameter of the environment map, the pixel parameter of the fog dynamic map and the pixel parameter of the light and shadow dynamic map respectively to obtain a target material ball after gradual adjustment; wherein, the transparency parameter of the target material ball after gradual adjustment changes along with time;
when the effect rendering module is used for rendering the target model by using the target material ball to obtain the dynamic effect of the target model, the effect rendering module is used for:
rendering the target model by using the gradually adjusted target material ball to obtain a target model with a dynamic effect; the dynamic effect displays the foreground pattern map and the environment map and has a fog dynamic effect and a light and shadow dynamic effect.
An embodiment of the present application further provides an electronic device, including: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating over the bus when the electronic device is in operation, the machine-readable instructions, when executed by the processor, performing the steps of the method of model rendering of dynamic effects as described above.
Embodiments of the present application further provide a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to perform the steps of the method for rendering a dynamic effect model as described above.
According to the model rendering method and device for the dynamic effect, the electronic device, and the storage medium provided herein, the initial transparency parameter of the initial material ball is obtained according to the material information of the target model; the initial transparency parameter of the initial material ball is adjusted based on the pixel parameter of the texture map to obtain a target material ball whose transparency parameter changes over time; and the target model is rendered by using the target material ball to obtain the dynamic effect of the target model. Therefore, without producing an animation special effect, a target model with a dynamic effect can be simulated through the time-varying transparency parameter carried by the target material ball, so that, by avoiding animation special effects, the computational consumption of the terminal device during rendering is reduced and the difficulty of producing the dynamic effect is lowered.
In order to make the aforementioned objects, features and advantages of the present application comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered as limiting the scope; for those skilled in the art, other related drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a flowchart of a method for rendering a dynamic model according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of a transparency adjustment process according to an embodiment of the present application;
FIG. 3 is a schematic view of a process for making a dynamic fog map according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a manufacturing process of a light and shadow dynamic map provided in an embodiment of the present application;
fig. 5 is a schematic structural diagram of a model rendering apparatus for dynamic effect according to an embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings; it is obvious that the described embodiments are only a part of the embodiments of the present application, not all of them. The components of the embodiments of the present application, as generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as presented in the figures, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. Every other embodiment obtained by a person skilled in the art without creative effort based on the embodiments of the present application falls within the protection scope of the present application.
Research shows that, in order to simulate a picture with a dynamic effect in a game, producing a special-effect animation is usually chosen. Simulating a dynamic effect with a special-effect animation requires calling a relevant model to use a large number of particles, and multi-frame special-effect pictures must be made so that the style changes as the dynamic effect continuously changes. During rendering, as the number of special-effect pictures increases, the computation consumed by the terminal device increases; rendering in this way places extremely high requirements on the performance of the terminal device, and because the number of special-effect pictures is large, rendering takes a long time and the rendering rate is low.
Based on this, the embodiment of the present application provides a model rendering method for a dynamic effect, which can render a target model with a dynamic effect through the change of the transparency parameter carried by a target material ball over time; thus, on the premise of ensuring the realism of the rendered object to be simulated, the computation consumed by the terminal device during rendering can be reduced, and the rendering efficiency of the game picture can be improved.
Referring to fig. 1, fig. 1 is a flowchart illustrating a model rendering method for dynamic effects according to an embodiment of the present disclosure. As shown in fig. 1, a method for rendering a dynamic effect model provided in an embodiment of the present application includes:
S101, obtaining an initial transparency parameter of the initial material ball according to the material information of the target model.
S102, adjusting an initial transparency parameter of the initial material ball based on the pixel parameter of the texture map to obtain a target material ball with the transparency parameter changing along with time.
S103, rendering the target model by using the target material ball to obtain a dynamic effect of the target model.
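The three steps above can be sketched as per-pixel transparency arithmetic. The following Python sketch is illustrative only: the data layout, function names, and the sine-based stand-in for a dynamic texture are assumptions, not the patent's implementation.

```python
import math

def initial_alpha(material_info):
    # S101: derive an initial per-pixel transparency from the material
    # information (a flat grid at a base alpha; illustrative only)
    w, h = material_info["size"]
    return [[material_info["base_alpha"]] * w for _ in range(h)]

def adjust_alpha(alpha, texture, t):
    # S102: modulate each alpha value with a time-varying texture pixel,
    # clamping to [0, 1] so the result remains a valid transparency
    return [[min(1.0, max(0.0, a * texture(x, y, t)))
             for x, a in enumerate(row)] for y, row in enumerate(alpha)]

def render(alpha, t):
    # S103: stand-in for rendering -- returns the adjusted alpha at time t,
    # using a moving sine pattern as the dynamic texture map
    return adjust_alpha(alpha, lambda x, y, t: 0.5 + 0.5 * math.sin(t + x), t)

base = initial_alpha({"size": (4, 2), "base_alpha": 0.8})
frame0 = render(base, 0.0)
frame1 = render(base, 1.0)
# the transparency differs between the frames, which is the dynamic effect
```

Because the dynamic texture depends on `t`, re-evaluating the same pixel at different times yields different transparency values, with no pre-baked animation frames.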
The embodiment of the present application provides a model rendering method for a dynamic effect. When a target model displayed in a game scene has a dynamic effect, an initial transparency parameter of the material ball used for rendering the target model is acquired according to the material information with which the target model is displayed. To render the target model with the dynamic effect, the initial transparency parameter in the initial material ball is adjusted through a dynamic texture map used for rendering the dynamic effect on the surface texture of the target model; because the pixel parameter of the dynamic texture map changes with time, the transparency parameter carried by the adjusted target material ball also changes with time, thereby simulating the dynamic effect of the target model. In this way, a target model with a dynamic effect can be simulated without additionally producing an animated special effect, which reduces the consumption of the terminal device in rendering special-effect pictures and improves the rendering efficiency of the terminal device.
If a target model with a realistic effect is to be simulated in a game, the material of the target model needs to be simulated; here, the material refers to the real attributes of the target object, for example, its color information, reflection effect, transparency information, and the like.
The material ball is a parameter file that includes physical property parameters of the target model, such as color parameters, light-reflection effect, and transparency parameters, and also includes various static maps, such as texture maps, Material Capture (Matcap) maps, and normal maps; the effect a material ball can actually render therefore depends on the physical property parameters and the various map information it contains.
In step S101, an initial transparency parameter of an initial material ball used for rendering the target model may be obtained in advance according to material information exhibited by the target model in the game scene.
Here, while obtaining the initial transparency parameter, other initial parameters in the initial material ball may also be set with reference to the material information of the target model, such as an initial color parameter and a reflection-effect parameter.
For example, taking the target model as the "moon", in order to make the displayed "moon" more beautiful in the game and better match the display effect of a real "moon", the "moon" is usually made to present a semi-transparent state; that is, the outer contour of the "moon" is an opaque boundary, and the inside of the "moon" is completely transparent.
Since the target model with the dynamic effect is obtained in the present application by rendering in a manner in which the transparency parameter changes with time, the other parameters (for example, the color parameters) do not change while the transparency is being adjusted, and they can be added to the material ball directly. Therefore, only the adjustment process of the transparency is described below; the process of adding the other parameters is not repeated.
The above process can only render a statically displayed target model, and in some game scenes, in order to improve the reality of the game picture, some dynamic effects are usually present on the target model, for example, "smoke" floats on the surface of the "moon" and the like.
In step S102, in order to obtain a target model with a dynamic effect, the initial transparency parameter of the initial material ball is adjusted using the pixel parameter of the texture map, so that the transparency parameter in the adjusted material ball changes with time; the dynamic effect can thus be simulated by the change of the transparency parameter over time, without additionally producing a special-effect animation.
The texture mapping comprises a dynamic texture mapping, and pixel parameters in the dynamic texture mapping change along with time; the dynamic texture map is used for rendering dynamic effects on the surface texture of the target model.
Further, for a target model, the surface texture includes not only dynamic effects but also some predetermined patterns displayed in a fixed manner (e.g., foreground patterns and background patterns); however, if the target model is in a completely transparent state, no pattern can be displayed on its surface texture, and only in a non-transparent state, such as an opaque or partially opaque state, can the non-transparent region display the corresponding pattern.
Therefore, in order to enable the surface texture of the rendered target model to display a corresponding pattern, the initial transparency of the initial material ball used for rendering the target model may be adjusted according to the first pixel parameter of the static texture map, so as to ensure that the static texture map presents a non-transparent state at the corresponding display area in the target model.
In one embodiment, the texture map further comprises a static texture map; referring to fig. 2, fig. 2 is a schematic diagram illustrating a transparency adjustment process according to an embodiment of the present disclosure. As shown in fig. 2, step S102 includes:
S1021, adjusting the initial transparency parameter of the initial material ball by using the first pixel parameter of the static texture map to obtain the first material ball with the first transparency parameter.
In this step, the adjustment of the initial transparency parameter of the initial material ball is completed step by step, using the static texture map and the dynamic texture map respectively. Specifically, the initial transparency parameter of the initial material ball is first adjusted using the first pixel parameter carried by the static texture map. A static texture map is composed of a plurality of polygon meshes, each of which corresponds to a pixel position (point) of the map, and each pixel position has a first pixel value; when adjusting the initial transparency parameter, the first pixel value at each pixel position can be used to adjust the transparency value at the corresponding pixel position in the initial material ball, thereby obtaining the adjusted first material ball with the first transparency parameter.
Since transparency is actually a value between 0 and 1, changing the value at a pixel position to a non-zero value within the range of 0 to 1 changes its transparency, so a pixel position that was originally completely transparent can be changed into an opaque or semi-transparent state.
The first pixel parameter of the static texture map (for example, the pixel value of any color channel, the transparency parameter carried by the map itself, or the grayscale pixel value) is in each case a numerical value within the range of 0 to 1. Thus, the first pixel parameter may include a color parameter and a transparency parameter (Alpha, A); the color parameters may include the color parameters of the three color channels Red (R), Green (G), and Blue (B), a grayscale pixel parameter, and the like.
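As a minimal sketch of this first adjustment, assuming a small RGB map and using the pixel value of a single color channel as the first pixel parameter (all sample data and the max-based operation are hypothetical):

```python
# a 2x2 static texture map of 8-bit RGB pixels (hypothetical sample data)
texture = [[(255, 255, 255), (0, 0, 0)],
           [(128, 64, 32), (255, 0, 0)]]

def channel_param(pixel, channel=0):
    # the first pixel parameter: one color channel normalized to [0, 1]
    return pixel[channel] / 255.0

# initial transparency: completely transparent at every pixel position
alpha = [[0.0, 0.0], [0.0, 0.0]]

# raise the alpha at each position with the map's channel value, so an
# originally transparent position becomes semi-transparent or opaque
adjusted = [[max(a, channel_param(texture[y][x]))
             for x, a in enumerate(row)] for y, row in enumerate(alpha)]
```

After the adjustment, positions whose channel value is non-zero are no longer fully transparent, matching the behavior described above.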
In one embodiment, the static texture map comprises a foreground pattern map; step S1021 includes: determining a transparent area and a non-transparent area in the initial material ball according to the surface texture of the target model; and adjusting the initial transparency parameter of the non-transparent area by using the first pixel parameter of the foreground pattern map to obtain a first material ball with a first transparency parameter.
The foreground pattern map is used for rendering foreground patterns in the surface texture of the target model, and the foreground pattern map can be displayed on the surface of the target model obtained through rendering of the material ball by adding the foreground pattern map into the material ball.
In this step, for the target model, a transparent area and a non-transparent area exist in its surface texture; no pattern needs to be displayed in the transparent area, and it is only necessary to ensure that the non-transparent area can display the foreground pattern map. Therefore, to reduce part of the data processing amount, only the foreground pattern map needs to be used to adjust the initial transparency parameter in the non-transparent area of the initial material ball. Specifically, the transparent area and the non-transparent area in the initial material ball are determined according to the surface texture with which the target model is displayed; for the determined non-transparent area, the initial transparency parameter of the non-transparent area in the initial material ball is adjusted, through a corresponding operation, using a first pixel parameter of the foreground pattern map (for example, the pixel value of any color channel, the carried transparency parameter, or the grayscale pixel value), so as to obtain the first material ball with the first transparency parameter. Here, the corresponding operation includes addition, subtraction, OR, XOR, inversion, and the like.
For the determined transparent area, in order to further reduce the amount of calculation in the rendering process, the initial transparency parameter of each pixel point in the transparent area can be adjusted directly by assignment, using a preset transparency parameter.
Continuing the above example, suppose the pattern displayed on the "moon" is a "leaf", and the initial material ball for rendering the "moon" is a semi-transparent, "solid outside, virtual inside" material ball (the rendered "moon" gradually changes from a completely opaque state at the outer contour to a completely transparent state at the center). The center area of the "moon" is then completely transparent, and the "leaf" in the center area cannot be displayed in a completely transparent state. Therefore, the initial transparency parameter of each pixel position at the location of the "leaf" in the initial material ball needs to be adjusted using the first pixel parameter of the foreground pattern map corresponding to the "leaf", so that the transparency at those pixel positions is not 0, that is, those positions are in a non-transparent state.
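A toy version of the "moon" example, with a radial alpha gradient standing in for the "solid outside, virtual inside" material ball and a hypothetical "leaf" mask (all grid sizes and values illustrative):

```python
import math

SIZE = 9           # tiny square alpha grid standing in for the material ball
CENTER = SIZE // 2
RADIUS = CENTER

def moon_alpha(x, y):
    # "solid outside, virtual inside": alpha grows from 0 at the centre
    # to 1 at the outer contour (a radial gradient; illustrative only)
    return min(1.0, math.hypot(x - CENTER, y - CENTER) / RADIUS)

# hypothetical "leaf" foreground mask near the centre of the "moon"
leaf = {(CENTER, CENTER), (CENTER, CENTER + 1), (CENTER + 1, CENTER)}

def adjusted_alpha(x, y):
    # raise the alpha where the leaf must show, so those positions are
    # no longer completely transparent
    base = moon_alpha(x, y)
    return max(base, 0.9) if (x, y) in leaf else base

center_alpha = moon_alpha(CENTER, CENTER)    # fully transparent centre
leaf_alpha = adjusted_alpha(CENTER, CENTER)  # leaf position now visible
```

The outer contour stays opaque while the centre becomes visible only where the leaf mask raises the transparency above 0.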
In order to give the target model a more realistic effect, the outer-ring boundary contour of its surface texture can be given a multi-level gradient effect through multiple rounds of synthesis processing.
In one embodiment, the adjusting the initial transparency parameter of the initial material ball by using the first pixel parameter of the foreground pattern map to obtain the first material ball with the first transparency parameter includes:
step a, adjusting the initial transparency parameter in the nontransparent area by using the first pixel parameter of the foreground pattern map to obtain a second material ball with a second transparency parameter.
In this step, the foreground pattern map is used to adjust the initial transparency parameter in the non-transparent area of the initial material ball to obtain a second material ball with a second transparency parameter; at this time, rendering the target model with the second material ball yields a target model whose surface displays the foreground pattern map.
Step b, synthesizing the initial material ball and the second material ball by using the first transparency difference value of each pixel position between the initial transparency parameter and the second transparency parameter to obtain a third material ball with a third transparency parameter.
In this step, after the second material ball is obtained, a first transparency difference value between the initial transparency value of each pixel position in the initial transparency parameter and the second transparency value of each pixel position in the second transparency parameter is calculated, and the initial material ball and the second material ball are synthesized using the first transparency difference value of each pixel position to obtain a third material ball with a third transparency parameter; at this time, rendering the target model with the third material ball yields a target model whose boundary has the first gradual-change effect and which does not display the foreground pattern map.
Here, the first gradual-change effect means that the outer-ring boundary contour of the target model gradually changes from a non-transparent state to a transparent state.
Step c, performing inversion processing on the initial transparency parameter of the initial material ball to obtain a reverse material ball with a reverse transparency parameter.
In this step, the initial transparency parameter carried by the initial material ball is inverted; specifically, the reverse material ball with the reverse transparency parameter is obtained by calculating, at each pixel position, the difference between "1" and the initial transparency value.
Step d, synthesizing the second material ball and the reverse material ball by using the second transparency difference value of each pixel position between the second transparency parameter and the reverse transparency parameter to obtain a fourth material ball with a fourth transparency parameter.
In this step, after the reverse material ball is obtained, a second transparency difference value between the second transparency value of each pixel position in the second transparency parameter and the reverse transparency value of each pixel position in the reverse transparency parameter is calculated, and the second material ball and the reverse material ball are synthesized using the second transparency difference value of each pixel position to obtain a fourth material ball with a fourth transparency parameter; at this time, rendering the target model with the fourth material ball yields a target model whose boundary has the second gradual-change effect and which does not display the foreground pattern map.
Here, the second gradual-change effect means that the outer-ring boundary contour of the target model gradually changes from a transparent state to a non-transparent state.
Step e, synthesizing the third material ball and the fourth material ball by using the first maximum transparency value of each pixel position between the third transparency parameter and the fourth transparency parameter to obtain a fifth material ball with a fifth transparency parameter.
In this step, a first maximum transparency value between the third transparency value of each pixel position in the third transparency parameter and the fourth transparency value of each pixel position in the fourth transparency parameter is calculated, and the third material ball and the fourth material ball are synthesized using the first maximum transparency value of each pixel position to obtain a fifth material ball with a fifth transparency parameter; at this time, rendering the target model with the fifth material ball yields a target model whose outer-ring boundary contour has a multi-level gradient effect and which does not display the foreground pattern map.
Step f, synthesizing the second material ball and the fifth material ball by using the second maximum transparency value of each pixel position between the second transparency parameter and the fifth transparency parameter to obtain the first material ball with the first transparency parameter.
In this step, a second maximum transparency value between the second transparency value of each pixel position in the second transparency parameter and the fifth transparency value of each pixel position in the fifth transparency parameter is calculated, and the second material ball and the fifth material ball are synthesized using the second maximum transparency value of each pixel position to obtain the first material ball with the first transparency parameter; at this time, rendering the target model with the first material ball yields a target model whose outer-ring boundary contour has a multi-level gradient effect and which displays the foreground pattern map.
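Steps a through f can be sketched as per-pixel operations on one-dimensional alpha arrays. The patent does not spell out the compositing arithmetic, so this sketch assumes the "difference" composite is a per-pixel absolute difference, the inversion is 1 - alpha (as step c states), and the "maximum" composite is a per-pixel max; the sample values are hypothetical:

```python
def per_pixel(op, a, b):
    # apply a binary operation at each pixel position of two alpha maps
    return [op(x, y) for x, y in zip(a, b)]

initial = [0.0, 0.25, 0.5, 0.75, 1.0]    # hypothetical initial alpha values
foreground = [0.9, 0.0, 0.0, 0.0, 0.0]   # "leaf" mask raises alpha at index 0

# step a: adjust the non-transparent-area alpha with the foreground map
second = per_pixel(max, initial, foreground)
# step b: composite via the per-pixel transparency difference
third = per_pixel(lambda x, y: abs(x - y), initial, second)
# step c: invert the initial transparency (1 - alpha)
inverse = [1.0 - a for a in initial]
# step d: composite the second and reverse maps via their difference
fourth = per_pixel(lambda x, y: abs(x - y), second, inverse)
# step e: composite via the per-pixel maximum transparency value
fifth = per_pixel(max, third, fourth)
# step f: final alpha, keeping the foreground and the layered gradient
first = per_pixel(max, second, fifth)
```

The final array keeps the raised alpha where the foreground mask applies while the remaining positions carry the layered gradient produced by the intermediate composites.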
For the target model, the environment it belongs to may also influence its display effect, for example, the lighting present in the environment; in order to further increase the realism of the rendering effect, the rendered target object can be made subject to fake ambient light through the environment map, so that fake illumination information can be created for the rendered target model.
Some static texture maps may not themselves carry a transparency parameter (for example, an environment Matcap map); that is, no portion of such a map has a different transparency value, and every pixel position is completely opaque. In this case, the transparency parameter in the material ball cannot be adjusted according to a transparency parameter carried by the map, and the grayscale pixel value of the map or the pixel parameter of any of its color channels can be used instead to adjust the transparency parameter in the material ball.
In another embodiment, the static texture map comprises an environment map, and the first pixel parameter comprises a grayscale pixel parameter and/or a pixel parameter of any color channel; step S1021 includes: adjusting the initial transparency parameter of the initial material ball by using the grayscale pixel parameter of the environment map or the pixel parameter of any color channel, to obtain the first material ball with the first transparency parameter.
Wherein the environment map is used for rendering an environment pattern in the surface texture of the target model.
In this step, on the premise that the environment map does not carry a transparency parameter, the grayscale pixel value of each pixel position in the environment map, or the pixel parameter of any color channel, is selected and used to adjust the initial transparency parameter of the initial material ball through a corresponding operation.
If the gray pixel parameter of the environment map is used for adjusting the initial transparency parameter, firstly, converting the obtained environment map into a gray environment map to obtain a gray pixel value of each pixel position in the environment map, wherein the gray pixel value of each pixel position is a numerical value within a range of 0-1; since the transparency parameter is actually a value between 0 and 1, the initial transparency parameter of the corresponding pixel position in the initial material ball can be adjusted by using the gray pixel value of each pixel position in the gray environment map through a corresponding operation mode, so as to obtain the first material ball with the first transparency parameter.
Therefore, the illumination information carried by the environment map can be transmitted to the material ball used for rendering the target object, so that the rendered target object is influenced by the fake ambient light and a more realistic target object is obtained.
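A minimal sketch of the environment-map adjustment, assuming a Rec. 601 luma conversion for the grayscale value and a multiplicative operation (both are assumptions; the patent leaves the "corresponding operation" open):

```python
def luminance(r, g, b):
    # Rec. 601 luma of an 8-bit RGB pixel, normalized to [0, 1]
    return (0.299 * r + 0.587 * g + 0.114 * b) / 255.0

# hypothetical 1x3 environment (Matcap-style) map: bright to dark
env = [(255, 255, 255), (128, 128, 128), (0, 0, 0)]

first_alpha = [1.0, 1.0, 0.5]  # alpha values after the earlier adjustment

# modulate the alpha with the map's grayscale value, so brighter regions
# of the fake ambient light leave the model more visible
lit_alpha = [a * luminance(*p) for a, p in zip(first_alpha, env)]
```

Dark regions of the environment map pull the transparency toward 0, which is how the fake illumination information reaches the material ball.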
It should be noted that the foreground pattern map and the environment map may be used to adjust the initial transparency parameter of the initial material ball step by step: the foreground pattern map is first used to adjust the initial transparency parameter, and the environment map is then used to perform a second adjustment on the basis of the transparency parameter adjusted by the foreground pattern map, so as to obtain the first material ball with the first transparency parameter. In practice, the order in which the foreground pattern map and the environment map adjust the transparency parameter of the material ball is not limited; either map may be applied first and the other second, as determined by the actual situation. The transparency parameter is adjusted in the same manner as in the above embodiments, and details are not repeated here.
Step S1022, adjusting the first transparency parameter of the first material ball by using the second pixel parameter of the dynamic texture map changing with time, to obtain a target material ball with a transparency parameter changing with time.
In this step, the second pixel parameter of the dynamic texture map is used to adjust again, on the basis of the first adjustment, the first transparency parameter carried by the first material ball; similarly, the second pixel value at each pixel position in the dynamic texture map can be used to adjust the transparency value at the corresponding pixel position in the first material ball. Because the second pixel parameter carried by the dynamic texture map changes with time, the transparency parameter of the target material ball obtained by this adjustment also changes with time.
Here, the second pixel parameter may include a color parameter and a transparency parameter (Alpha, a); the color parameters may include color parameters of three color channels of Red (Red, R), Green (Green, G), and Blue (Blue, B), and grayscale pixel parameters, etc.
In one embodiment, the second pixel parameter comprises a pixel parameter of any one color channel; step S1022 includes: adjusting the first transparency parameter of the first material ball by using the time-varying pixel parameter of any color channel of the dynamic texture map, to obtain a target material ball with a transparency parameter changing with time.
In this step, for each pixel position in the dynamic texture map, the pixel value of any one color channel at that position is determined; this pixel value changes with time. On this basis, the time-varying pixel value at each position can be used to adjust the first transparency value at the corresponding pixel position in the first material ball, so as to obtain the target material ball whose transparency parameter changes with time.
Here, the dynamic texture map may include a fog dynamic map and a light and shadow dynamic map; wherein the fog dynamic mapping is used for performing fog effect rendering on the surface texture of the target model; and the light and shadow dynamic mapping is used for rendering the light and shadow effect of the surface texture of the target model.
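The time-varying adjustment of step S1022 can be sketched as follows, with a moving sine pattern standing in for the fog dynamic map's color channel (the pattern and the multiplicative operation are illustrative assumptions):

```python
import math

def fog_channel(x, y, t):
    # pixel parameter of one color channel of the dynamic texture map;
    # a moving sine pattern stands in for the fog noise (illustrative)
    return 0.5 + 0.5 * math.sin(0.7 * x + 0.3 * y + t)

def target_alpha(first_alpha, x, y, t):
    # re-adjust the first transparency with the time-varying channel,
    # yielding an alpha that itself changes with time
    return min(1.0, max(0.0, first_alpha * fog_channel(x, y, t)))

a0 = target_alpha(0.8, 2, 3, t=0.0)
a1 = target_alpha(0.8, 2, 3, t=1.5)
# a0 and a1 differ: the same pixel's transparency changes over time
```

Evaluating the same pixel at two times gives two different transparency values, which is the mechanism behind the fog and light-and-shadow effects below.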
It should be noted that the initial transparency parameter of the initial material ball can be adjusted step by step using the static texture map and the dynamic texture map. In the above embodiment, the static texture map first adjusts the initial transparency parameter, and the dynamic texture map then performs a second adjustment on the basis of the resulting first transparency parameter to obtain the target material ball whose transparency parameter changes with time. In fact, the order in which the static texture map and the dynamic texture map adjust the transparency parameter of the material ball is not limited; either may be applied first and the other second, as determined by the actual situation. The transparency parameter is adjusted in the same manner as in the above embodiments, and details are not repeated here.
In one embodiment, please refer to fig. 3, wherein fig. 3 is a schematic diagram illustrating the production process of the fog dynamic map according to an embodiment of the present application. As shown in fig. 3, the fog dynamic map is obtained by the following steps:
S301, acquiring an original noise map according to the surface texture of the target model.
In this step, an original noise map for rendering the fog effect of the surface of the target model may be acquired according to the display form of the surface texture of the target model in the game scene.
Step S302, carrying out offset processing on the first original coordinate of each pixel position in the original noise map by using a preset first offset parameter to obtain a first offset coordinate of each pixel position.
In this step, the originally static noise map can be made to move by means of UV flow. Specifically, the first original coordinate of each pixel position in the original noise map is offset using a preset first offset parameter, so that the first original coordinate of each pixel position changes with time, yielding a first offset coordinate for each pixel position. The first offset coordinate at each pixel position moves in a first specific direction over time; thus, the originally static noise map shifts laterally or longitudinally in the first specific direction as time changes.
Here, the first specific direction may include a lateral direction, a longitudinal direction, an upper left direction, a lower left direction, an upper right direction, a lower right direction, and the like, and may specifically specify an offset angle and the like of the first specific direction.
S303, sampling the original noise map according to the first original coordinate of each pixel position and the first offset coordinate of each pixel position to obtain the fog dynamic map.
In this step, in order to give the original noise map a "distortion" dynamic effect similar to floating smoke, the original noise map is sampled, through corresponding calculation processing, according to the first original coordinate and the first offset coordinate of each pixel position, thereby obtaining the fog dynamic map. For example, the first original coordinate of each pixel position is first added to the first offset coordinate at that position; the coordinate value obtained by the addition is then used to resample the corresponding pixel position in the original noise map, obtaining the fog dynamic map. At this time, the pixel parameter of any color channel of the fog dynamic map changes with time.
Here, the corresponding operation may include addition, subtraction, or, xor, inversion, and the like.
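Steps S301 to S303 can be sketched as follows; the hash-style noise function and the offset speeds are illustrative stand-ins for the original noise map and the preset first offset parameter:

```python
import math

def noise(u, v):
    # stand-in for sampling the original noise map at UV in [0, 1)
    return (math.sin(12.9898 * u + 78.233 * v) * 43758.5453) % 1.0

def fog_sample(u, v, t, speed=(0.05, 0.02)):
    # S302: offset the first original coordinate over time (UV flow)
    du, dv = speed[0] * t, speed[1] * t
    # S303: add the original and offset coordinates, then resample the
    # noise map, giving the drifting "distortion" of floating smoke
    return noise((u + u + du) % 1.0, (v + v + dv) % 1.0)

f0 = fog_sample(0.3, 0.6, t=0.0)
f1 = fog_sample(0.3, 0.6, t=2.0)
# the sampled value changes with time, so the fog map is dynamic
```

Adding the offset coordinate back onto the original coordinate before resampling is what produces the warped, non-uniform drift rather than a rigid translation of the noise.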
In another embodiment, please refer to fig. 4, wherein fig. 4 is a schematic diagram illustrating the production process of the light and shadow dynamic map according to an embodiment of the present application. As shown in fig. 4, the light and shadow dynamic map is obtained by the following steps:
S401, obtaining an original light and shadow map according to the surface texture of the target model.
In this step, an original light and shadow map for rendering the light and shadow effect of the surface of the target model may be acquired according to the display form of the surface texture of the target model in the game scene.
S402, carrying out offset processing on a second original coordinate of each pixel position in the original light and shadow map by using a preset second offset parameter to obtain a flowing light and shadow map.
In this step, the originally static light and shadow map can be made to move by means of UV flow. Specifically, the second original coordinate of each pixel position in the original light and shadow map is offset using a preset second offset parameter, so that the second original coordinate of each pixel position changes with time, yielding a second offset coordinate for each pixel position and thus the flowing light and shadow map. The second offset coordinate at each pixel position moves in a second specific direction over time; thus, the originally static light and shadow map shifts in the second specific direction, such as toward the upper left or upper right, as time changes.
Here, the second specific direction may include a lateral direction, a longitudinal direction, an upper left direction, a lower left direction, an upper right direction, a lower right direction, and the like, and may specifically specify an offset angle and the like of the second specific direction.
S403, according to the flicker speed and the flicker time of the dynamic light and shadow map on the texture surface of the target model, performing offset processing on the second original coordinate of each pixel position in the original light and shadow map to obtain a third offset coordinate of each pixel position.
In this step, for the target model, according to its own design, the light and shadow effect displayed on its surface may have a certain flicker speed and corresponding flicker time; therefore, in order to enable the dynamic light and shadow map to "flash", the second original coordinate at each pixel position in the original light and shadow map is subjected to offset processing according to the flash speed and flash time of the dynamic light and shadow map on the texture surface of the target model, and a third offset coordinate of each pixel position is obtained.
S404, sampling the original light and shadow map by using the third offset coordinate of each pixel position to obtain a flicker light and shadow map.
In this step, in order to give the original light and shadow map a flickering dynamic effect, the original light and shadow map is sampled, through corresponding calculation processing, using the third offset coordinate of each pixel position, thereby obtaining the flickering light and shadow map. For example, first, the second original coordinate of each pixel position and the third offset coordinate at that pixel position are added; then, the coordinate value obtained by the addition at each pixel position is used to re-sample the corresponding pixel position in the original light and shadow map, yielding the flickering light and shadow map.
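The add-and-resample operation just described can be sketched as below. Nearest-neighbour sampling on an integer pixel grid, a single offset shared by all pixels, and wrap-around at the map edges are simplifying assumptions for the illustration:

```python
import numpy as np

def flicker_sample(shadow, third_offset):
    """Add each pixel's original grid coordinate to its third offset
    coordinate and re-sample the original shadow map at the resulting
    (wrapped) position, producing the flickering shadow map.
    Names and the uniform offset are illustrative assumptions."""
    h, w = shadow.shape
    ys, xs = np.mgrid[0:h, 0:w]                 # second original coordinates
    oy, ox = np.rint(third_offset).astype(int)  # third offset, rounded to grid
    return shadow[(ys + oy) % h, (xs + ox) % w]
```

Varying the offset over time according to the flicker speed and flicker time makes the sampled pattern jump frame to frame, which reads as flickering.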
S405, according to the second offset coordinate of each pixel position and the third offset coordinate of each pixel position, the flowing light and shadow map and the flickering light and shadow map are subjected to synthesis processing to obtain the light and shadow dynamic map.
In this step, in order to give the original light and shadow map the dynamic effects of both "flickering" and "flowing", the flowing light and shadow map and the flickering light and shadow map are synthesized, through corresponding calculation processing, using the second offset coordinate and the third offset coordinate of each pixel position, thereby obtaining the "flickering" and "flowing" light and shadow dynamic map. For example, first, a multiplication operation is performed between the second offset coordinate of each pixel position and the third offset coordinate at that pixel position; then, the value obtained by the multiplication at each pixel position is clamped to the range 0-1, yielding the light and shadow dynamic map.
In step S103, the obtained target material ball is used to render the target model, so that a target model with dynamic effect can be rendered according to parameter information such as transparency parameter carried in the target material ball.
In one embodiment, the texture map comprises a foreground pattern map, an environment map, a fog dynamic map, and a light and shadow dynamic map; step S102 includes: and gradually adjusting the initial transparency parameter of the initial material ball based on the pixel parameter of the foreground pattern map, the pixel parameter of the environment map, the pixel parameter of the fog dynamic map and the pixel parameter of the light and shadow dynamic map respectively to obtain a gradually adjusted target material ball.
In the step, when the texture map used for adjusting the transparency parameter of the initial material ball comprises a foreground pattern map, an environment map, a fog dynamic map and a light and shadow dynamic map, the initial transparency parameter of the initial material ball is gradually adjusted by respectively using the pixel parameter of the foreground pattern map, the pixel parameter of the environment map, the pixel parameter of the fog dynamic map and the pixel parameter of the light and shadow dynamic map;
specifically, firstly, adjusting an initial transparency parameter of an initial material ball by using a pixel parameter of a foreground pattern map; secondly, adjusting the transparency parameter of the material ball after being adjusted by the foreground pattern mapping by using the pixel parameter of the environment mapping; then, adjusting the transparency parameter of the material ball after being adjusted by the environment map by using the pixel parameter of the fog dynamic map; finally, adjusting the transparency parameter of the material ball subjected to the fog dynamic mapping adjustment by using the pixel parameter of the light and shadow dynamic mapping to obtain a target material ball subjected to gradual adjustment; wherein, the transparency parameter of the target material ball after gradual adjustment changes along with time.
It should be noted that, for the foreground pattern map, the environment map, the fog dynamic map and the light and shadow dynamic map, there is no limitation on the adjustment sequence of the transparency parameter in the material ball, and in other embodiments, the environment map may be used to adjust the initial transparency parameter in the initial material ball first; secondly, adjusting the transparency parameter of the material ball after the environment map is adjusted by using the foreground pattern map; then, adjusting the transparency parameter of the material ball after the foreground pattern mapping is adjusted by utilizing the fog dynamic mapping; and finally, adjusting the transparency parameter of the material ball after the fog dynamic mapping is adjusted by utilizing the light and shadow dynamic mapping.
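The gradual adjustment by the four maps in turn can be sketched as follows. The embodiment leaves the exact blend open, so per-pixel multiplication with clamping is an assumption here, and, as noted above, the order of the maps is interchangeable:

```python
import numpy as np

def gradually_adjust(initial_alpha, map_params):
    """Apply each texture map's pixel parameter to the material ball's
    transparency in sequence (e.g. foreground, environment, fog,
    light-and-shadow, in any order); multiplication with clamping is an
    illustrative choice of blend, not fixed by the text."""
    alpha = np.asarray(initial_alpha, dtype=float)
    for p in map_params:
        alpha = np.clip(alpha * np.asarray(p, dtype=float), 0.0, 1.0)
    return alpha
```

Because multiplication is commutative, this particular choice also makes the order-independence claimed above hold exactly.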
Step S103 includes: rendering the target model by using the gradually adjusted target material ball to obtain a target model with a dynamic effect; wherein the foreground pattern map and the environment map are displayed in the dynamic effect, and the dynamic effect has a fog effect and a light and shadow effect.
In this step, the target model is rendered by using the target material ball which is gradually adjusted by the foreground pattern map, the environment map, the fog dynamic map and the light and shadow dynamic map, so that the target model which displays the foreground pattern map, the environment map and has the fog dynamic effect and the light and shadow dynamic effect can be obtained.
According to the model rendering method for a dynamic effect provided by the embodiment of the application, the initial transparency parameter of the initial material ball is obtained according to the material information of the target model; the initial transparency parameter of the initial material ball is adjusted based on the pixel parameter of the texture map to obtain a target material ball whose transparency parameter changes with time; and the target model is rendered by using the target material ball to obtain the dynamic effect of the target model. In this way, without producing an animation special effect, a target model with a dynamic effect can be simulated through the time-varying transparency parameter carried by the target material ball; by avoiding the use of animation special effects, the computational cost of rendering on the terminal device is reduced, as is the difficulty of producing the dynamic effect.
Referring to fig. 5, fig. 5 is a schematic structural diagram of a model rendering apparatus with dynamic effect according to an embodiment of the present disclosure. As shown in fig. 5, the model rendering apparatus 500 includes:
a parameter obtaining module 510, configured to obtain an initial transparency parameter of an initial material ball according to the material information of the target model;
a parameter adjusting module 520, configured to adjust an initial transparency parameter of the initial material ball based on a pixel parameter of the texture map, to obtain a target material ball with a transparency parameter changing with time; wherein the texture map comprises a dynamic texture map; the dynamic texture mapping is used for performing dynamic effect rendering on the surface texture of the target model;
and an effect rendering module 530, configured to perform rendering processing on the target model by using the target material ball, so as to obtain a dynamic effect of the target model.
Further, the texture map further comprises a static texture map; when the parameter adjusting module 520 is configured to adjust the initial transparency parameter of the initial material ball based on the pixel parameter of the texture map to obtain the target material ball with the transparency parameter changing with time, the parameter adjusting module 520 is configured to:
adjusting the initial transparency parameter of the initial material ball by using the first pixel parameter of the static texture map to obtain a first material ball with a first transparency parameter; wherein the static texture map is used for static pattern rendering of the surface texture of the target model;
and adjusting the first transparency parameter of the first material ball by using the second pixel parameter of the dynamic texture mapping changing along with the time to obtain a target material ball with the transparency parameter changing along with the time.
Further, the static texture map comprises a foreground pattern map; when the parameter adjusting module 520 is configured to adjust the initial transparency parameter of the initial material ball by using the first pixel parameter of the static texture map, so as to obtain the first material ball with the first transparency parameter, the parameter adjusting module 520 is configured to:
determining a transparent area and a non-transparent area in the initial material ball according to the surface texture of the target model;
adjusting the initial transparency parameter of the non-transparent area by using the first pixel parameter of the foreground pattern map to obtain a first material ball with a first transparency parameter; wherein the foreground pattern map is used for rendering foreground patterns in the surface texture of the target model.
Further, when the parameter adjusting module 520 is configured to adjust the initial transparency parameter of the non-transparent region by using the first pixel parameter of the foreground pattern map to obtain the first material ball having the first transparency parameter, the parameter adjusting module 520 is configured to:
adjusting the initial transparency parameter in the non-transparent area by using the first pixel parameter of the foreground pattern map to obtain a second material ball with a second transparency parameter;
synthesizing the initial material ball and the second material ball by using a first transparency difference value of each pixel position between the initial transparency parameter and the second transparency parameter to obtain a third material ball with a third transparency parameter;
performing inversion processing on the initial transparency parameter of the initial material ball to obtain a reverse material ball with a reverse transparency parameter;
synthesizing the second material ball and the reverse material ball by using a second transparency difference value of each pixel position between the second transparency parameter and the reverse transparency parameter to obtain a fourth material ball with a fourth transparency parameter;
synthesizing the third material ball and the fourth material ball by using the first transparency maximum value of each pixel position between the third transparency parameter and the fourth transparency parameter to obtain a fifth material ball with a fifth transparency parameter;
and synthesizing the second material ball and the fifth material ball by using the second transparency maximum value of each pixel position between the second transparency parameter and the fifth transparency parameter to obtain the first material ball with the first transparency parameter.
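The six-step compositing chain above can be sketched as follows. Since the text does not fix the blend operations, "synthesis using a transparency difference" is read here as the per-pixel absolute difference, "synthesis using a transparency maximum" as the per-pixel maximum, and the foreground adjustment as a clamped multiply; all three readings are assumptions:

```python
import numpy as np

def first_material_alpha(initial, foreground):
    """Sketch of the material-ball compositing chain; every blend choice
    below is an illustrative assumption."""
    initial = np.asarray(initial, dtype=float)
    second = np.clip(initial * np.asarray(foreground, dtype=float), 0.0, 1.0)
    third = np.abs(initial - second)    # initial & second -> third material ball
    reverse = 1.0 - initial             # inverted transparency parameter
    fourth = np.abs(second - reverse)   # second & reverse -> fourth material ball
    fifth = np.maximum(third, fourth)   # third & fourth -> fifth material ball
    return np.maximum(second, fifth)    # second & fifth -> first material ball
```

Whatever the concrete blends, the chain's effect is to combine the foreground-adjusted transparency with its inverse so that pattern and background regions end up with complementary opacity.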
Further, the static texture map comprises an environment map, and the first pixel parameter comprises a gray pixel parameter and/or a pixel parameter of any color channel; when the parameter adjusting module 520 is configured to adjust the initial transparency parameter of the initial material ball by using the first pixel parameter of the static texture map to obtain the first material ball with the first transparency parameter, the parameter adjusting module 520 is configured to:
adjusting the initial transparency parameter of the initial material ball by utilizing the gray pixel parameter of the environment map or the pixel parameter of any color channel to obtain a first material ball with a first transparency parameter; wherein the environment map is used for rendering an environment pattern in the surface texture of the target model.
Further, the second pixel parameter includes a pixel parameter of any color channel; when the parameter adjusting module 520 is configured to adjust the first transparency parameter of the first material ball by using the second pixel parameter of the dynamic texture map changing with time, so as to obtain the target material ball with the transparency parameter changing with time, the parameter adjusting module 520 is configured to:
and adjusting the first transparency parameter of the first material ball by using the pixel parameter of any color channel of the dynamic texture mapping changing along with the time to obtain a target material ball with the transparency parameter changing along with the time.
Further, the dynamic texture mapping comprises a fog dynamic mapping; the parameter adjusting module 520 is configured to obtain the fog dynamic map by:
acquiring an original noise map according to the surface texture of the target model;
carrying out offset processing on a first original coordinate of each pixel position in the original noise map by using a preset first offset parameter to obtain a first offset coordinate of each pixel position;
sampling the original noise map according to the first original coordinate of each pixel position and the first offset coordinate of each pixel position to obtain the fog dynamic map; and the fog dynamic mapping is used for performing fog effect rendering on the surface texture of the target model.
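The fog-map steps above can be sketched as below. Nearest-neighbour grid sampling, a single preset offset shared by all pixels, wrap-around at the map edges, and averaging the two samples are all simplifying assumptions for the illustration:

```python
import numpy as np

def fog_dynamic_map(noise, first_offset):
    """Offset each pixel's first original coordinate by the preset first
    offset parameter, sample the original noise map at both the original
    and the offset (wrapped) positions, and blend the two samples;
    averaging is an illustrative choice of blend."""
    h, w = noise.shape
    ys, xs = np.mgrid[0:h, 0:w]                    # first original coordinates
    oy, ox = first_offset                          # preset first offset parameter
    shifted = noise[(ys + oy) % h, (xs + ox) % w]  # sample at first offset coords
    return 0.5 * (noise + shifted)                 # blend the two samples
```

Animating the offset over time slides the second noise sample across the first, which produces the drifting, churning look expected of fog.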
Further, the dynamic texture map further comprises a light and shadow dynamic map; the parameter adjusting module 520 is configured to obtain the dynamic shadow map by:
acquiring an original light and shadow map according to the surface texture of the target model;
performing offset processing on a second original coordinate of each pixel position in the original light and shadow map by using a preset second offset parameter to obtain a flowing light and shadow map; wherein each pixel position in the flowing light and shadow map has a second offset coordinate;
according to the flicker speed and the flicker time of the light and shadow dynamic mapping on the texture surface of the target model, carrying out offset processing on a second original coordinate of each pixel position in the original light and shadow mapping to obtain a third offset coordinate of each pixel position;
sampling the original light and shadow map by using the third offset coordinate of each pixel position to obtain a flicker light and shadow map;
synthesizing the flowing light and shadow map and the flickering light and shadow map according to the second offset coordinate of each pixel position and the third offset coordinate of each pixel position to obtain the light and shadow dynamic map; and the light and shadow dynamic map is used for rendering the light and shadow effect of the surface texture of the target model.
Further, the texture map comprises a foreground pattern map, an environment map, a fog dynamic map and a light and shadow dynamic map; when the parameter adjusting module 520 is configured to adjust the initial transparency parameter of the initial material ball based on the pixel parameter of the texture map to obtain the target material ball with the transparency parameter changing with time, the parameter adjusting module 520 is configured to:
gradually adjusting the initial transparency parameter of the initial material ball based on the pixel parameter of the foreground pattern map, the pixel parameter of the environment map, the pixel parameter of the fog dynamic map and the pixel parameter of the light and shadow dynamic map respectively to obtain a target material ball after gradual adjustment; wherein, the transparency parameter of the target material ball after gradual adjustment changes along with time;
when the effect rendering module 530 is configured to render the target model by using the target material ball to obtain the dynamic effect of the target model, the effect rendering module 530 is configured to:
rendering the target model by using the gradually adjusted target material ball to obtain a target model with a dynamic effect; the foreground pattern map and the environment map are displayed in the dynamic effect, and the dynamic effect has a fog dynamic effect and a light and shadow dynamic effect.
According to the model rendering device for a dynamic effect provided by the embodiment of the application, the initial transparency parameter of the initial material ball is obtained according to the material information of the target model; the initial transparency parameter of the initial material ball is adjusted based on the pixel parameter of the texture map to obtain a target material ball whose transparency parameter changes with time; and the target model is rendered by using the target material ball to obtain the dynamic effect of the target model. In this way, without producing an animation special effect, a target model with a dynamic effect can be simulated through the time-varying transparency parameter carried by the target material ball; by avoiding the use of animation special effects, the computational cost of rendering on the terminal device is reduced, as is the difficulty of producing the dynamic effect.
Referring to fig. 6, fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure. As shown in fig. 6, the electronic device 600 comprises a processor 610, a memory 620 and a bus 630; the memory 620 stores machine-readable instructions executable by the processor 610. When the electronic device executes the model rendering method for a dynamic effect as in the embodiments, the processor 610 and the memory 620 communicate through the bus 630, and the processor 610 executes the machine-readable instructions to perform the following steps:
acquiring an initial transparency parameter of an initial material ball according to the material information of the target model;
adjusting an initial transparency parameter of the initial material ball based on a pixel parameter of the texture map to obtain a target material ball with the transparency parameter changing along with time; wherein the texture map comprises a dynamic texture map; the dynamic texture mapping is used for performing dynamic effect rendering on the surface texture of the target model;
and rendering the target model by using the target material ball to obtain the dynamic effect of the target model.
In one possible embodiment, the texture map further comprises a static texture map; the processor 610 is configured to execute a pixel parameter based on a texture map, adjust an initial transparency parameter of the initial material ball, and obtain a target material ball with a transparency parameter changing with time, and specifically configured to:
adjusting the initial transparency parameter of the initial material ball by using the first pixel parameter of the static texture map to obtain a first material ball with a first transparency parameter; wherein the static texture map is used for performing static pattern rendering on the surface texture of the target model;
and adjusting the first transparency parameter of the first material ball by using the second pixel parameter of the dynamic texture mapping changing along with the time to obtain a target material ball with the transparency parameter changing along with the time.
In one possible embodiment, the static texture map comprises a foreground pattern map; the processor 610, when configured to perform adjusting the initial transparency parameter of the initial material ball by using the first pixel parameter of the static texture map to obtain a first material ball having a first transparency parameter, is specifically configured to:
determining a transparent area and a non-transparent area in the initial material ball according to the surface texture of the target model;
adjusting the initial transparency parameter of the non-transparent area by using the first pixel parameter of the foreground pattern map to obtain a first material ball with a first transparency parameter; wherein the foreground pattern map is used to render foreground patterns in the surface texture of the target model.
In a possible embodiment, the processor 610, when configured to perform the adjusting of the initial transparency parameter of the initial material ball by using the first pixel parameter of the foreground pattern map, to obtain a first material ball having a first transparency parameter, is specifically configured to:
adjusting the initial transparency parameter in the non-transparent area by using the first pixel parameter of the foreground pattern map to obtain a second material ball with a second transparency parameter;
synthesizing the initial material ball and the second material ball by using a first transparency difference value of each pixel position between the initial transparency parameter and the second transparency parameter to obtain a third material ball with a third transparency parameter;
performing inversion processing on the initial transparency parameter of the initial material ball to obtain a reverse material ball with a reverse transparency parameter;
synthesizing the second material ball and the reverse material ball by using a second transparency difference value of each pixel position between the second transparency parameter and the reverse transparency parameter to obtain a fourth material ball with a fourth transparency parameter;
synthesizing the third material ball and the fourth material ball by using the first transparency maximum value of each pixel position between the third transparency parameter and the fourth transparency parameter to obtain a fifth material ball with a fifth transparency parameter;
and synthesizing the second material ball and the fifth material ball by using the second transparency maximum value of each pixel position between the second transparency parameter and the fifth transparency parameter to obtain the first material ball with the first transparency parameter.
In one possible embodiment, the static texture map comprises an environment map, and the first pixel parameter comprises a gray pixel parameter and/or a pixel parameter of any color channel; the processor 610, when configured to perform adjusting the initial transparency parameter of the initial material ball by using the first pixel parameter of the static texture map to obtain a first material ball having a first transparency parameter, is specifically configured to:
adjusting the initial transparency parameter of the initial material ball by utilizing the gray pixel parameter of the environment map or the pixel parameter of any color channel to obtain a first material ball with a first transparency parameter; wherein the environment map is used for rendering an environment pattern in the surface texture of the target model.
In one possible embodiment, the second pixel parameter includes a pixel parameter of any one color channel; when the processor 610 is configured to adjust the first transparency parameter of the first material ball by using the second pixel parameter of the dynamic texture map changing with time, to obtain a target material ball with a transparency parameter changing with time, specifically:
and adjusting the first transparency parameter of the first material ball by using the pixel parameter of any color channel of the dynamic texture mapping changing along with the time to obtain a target material ball with the transparency parameter changing along with the time.
In one possible embodiment, the dynamic texture map comprises a fog dynamic map; the processor 610 is configured to obtain the fog dynamic map by:
acquiring an original noise map according to the surface texture of the target model;
carrying out offset processing on a first original coordinate of each pixel position in the original noise map by using a preset first offset parameter to obtain a first offset coordinate of each pixel position;
sampling the original noise map according to the first original coordinate of each pixel position and the first offset coordinate of each pixel position to obtain the fog dynamic map; and the fog dynamic mapping is used for carrying out fog effect rendering on the surface texture of the target model.
In one possible embodiment, the dynamic texture map further comprises a shadow dynamic map; the processor 610 is configured to obtain the dynamic shadow map by:
acquiring an original light and shadow map according to the surface texture of the target model;
performing offset processing on a second original coordinate of each pixel position in the original light and shadow map by using a preset second offset parameter to obtain a flowing light and shadow map; wherein each pixel position in the flowing light and shadow map has a second offset coordinate;
according to the flicker speed and the flicker time of the dynamic light and shadow map on the texture surface of the target model, carrying out offset processing on a second original coordinate of each pixel position in the original light and shadow map to obtain a third offset coordinate of each pixel position;
sampling the original light and shadow map by using the third offset coordinate of each pixel position to obtain a flicker light and shadow map;
synthesizing the flowing light and shadow map and the flickering light and shadow map according to the second offset coordinate of each pixel position and the third offset coordinate of each pixel position to obtain the light and shadow dynamic map; and the light and shadow dynamic map is used for rendering the light and shadow effect of the surface texture of the target model.
In one possible embodiment, the texture map comprises a foreground pattern map, an environment map, a fog dynamic map, and a light and shadow dynamic map; the processor 610 is configured to execute pixel parameters based on texture mapping, adjust an initial transparency parameter of the initial material ball, and when a target material ball with a transparency parameter changing with time is obtained, specifically configured to:
gradually adjusting the initial transparency parameter of the initial material ball based on the pixel parameter of the foreground pattern map, the pixel parameter of the environment map, the pixel parameter of the fog dynamic map and the pixel parameter of the light and shadow dynamic map respectively to obtain a target material ball after gradual adjustment; wherein, the transparency parameter of the target material ball after gradual adjustment changes along with time;
the processor 610 is configured to perform rendering processing on the target model by using the target material ball, and when obtaining a dynamic effect of the target model, specifically configured to:
rendering the target model by using the gradually adjusted target material ball to obtain a target model with a dynamic effect; the foreground pattern map and the environment map are displayed in the dynamic effect, and the dynamic effect has a fog dynamic effect and a light and shadow dynamic effect.
In the above manner, the initial transparency parameter of the initial material ball is obtained according to the material information of the target model; the initial transparency parameter of the initial material ball is adjusted based on the pixel parameter of the texture map to obtain a target material ball whose transparency parameter changes with time; and the target model is rendered by using the target material ball to obtain the dynamic effect of the target model. In this way, without producing an animation special effect, a target model with a dynamic effect can be simulated through the time-varying transparency parameter carried by the target material ball; by avoiding the use of animation special effects, the computational cost of rendering on the terminal device is reduced, as is the difficulty of producing the dynamic effect. In addition, in order that the surface texture of the rendered target model displays the corresponding static texture map, the transparency of the material ball may be adjusted according to a first pixel parameter (e.g., a transparency parameter) of the static texture map, ensuring that pixels in the area of the target model displaying the static texture map remain in a non-transparent state until they are displayed. Furthermore, in the process of changing the transparency parameter of the material ball, in order to further increase the realism of the rendering effect, fake ambient light can be created for the surface texture of the rendered target model through the environment map, so that fake illumination information appears on the surface texture of the rendered target model.
An embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the processor performs the following steps:
acquiring an initial transparency parameter of an initial material ball according to the material information of the target model;
adjusting an initial transparency parameter of the initial material ball based on a pixel parameter of the texture map to obtain a target material ball with the transparency parameter changing along with time; wherein the texture map comprises a dynamic texture map; the dynamic texture mapping is used for performing dynamic effect rendering on the surface texture of the target model;
and rendering the target model by using the target material ball to obtain the dynamic effect of the target model.
In one possible embodiment, the texture map further comprises a static texture map; the processor is configured to execute a pixel parameter based on a texture map, adjust an initial transparency parameter of the initial material ball, and obtain a target material ball with a transparency parameter changing with time, and specifically configured to:
adjusting the initial transparency parameter of the initial material ball by using the first pixel parameter of the static texture map to obtain a first material ball with a first transparency parameter; wherein the static texture map is used for performing static pattern rendering on the surface texture of the target model;
and adjusting the first transparency parameter of the first material ball by using the second pixel parameter of the dynamic texture mapping changing along with the time to obtain a target material ball with the transparency parameter changing along with the time.
In one possible embodiment, the static texture map comprises a foreground pattern map; when the processor is configured to execute the adjusting of the initial transparency parameter of the initial material ball by using the first pixel parameter of the static texture map to obtain the first material ball with the first transparency parameter, the processor is specifically configured to:
determining a transparent area and a non-transparent area in the initial material ball according to the surface texture of the target model;
adjusting the initial transparency parameter of the non-transparent area by using the first pixel parameter of the foreground pattern map to obtain a first material ball with a first transparency parameter; wherein the foreground pattern map is used to render foreground patterns in the surface texture of the target model.
In a possible embodiment, the processor, when being configured to perform the step of adjusting an initial transparency parameter of the initial material ball by using the first pixel parameter of the foreground pattern map to obtain a first material ball having a first transparency parameter, is specifically configured to:
adjusting the initial transparency parameter in the non-transparent area by using the first pixel parameter of the foreground pattern map to obtain a second material ball with a second transparency parameter;
synthesizing the initial material ball and the second material ball by using a first transparency difference value of each pixel position between the initial transparency parameter and the second transparency parameter to obtain a third material ball with a third transparency parameter;
performing inversion processing on the initial transparency parameter of the initial material ball to obtain a reverse material ball with a reverse transparency parameter;
synthesizing the second material ball and the reverse material ball by using a second transparency difference value of each pixel position between the second transparency parameter and the reverse transparency parameter to obtain a fourth material ball with a fourth transparency parameter;
synthesizing the third material ball and the fourth material ball by using the first transparency maximum value of each pixel position between the third transparency parameter and the fourth transparency parameter to obtain a fifth material ball with a fifth transparency parameter;
and synthesizing the second material ball and the fifth material ball by using the second transparency maximum value of each pixel position between the second transparency parameter and the fifth transparency parameter to obtain the first material ball with the first transparency parameter.
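The six-step compositing chain above can be traced per pixel. The sketch below is a hedged reading: the embodiment does not define its "synthesize using the transparency difference/maximum" operators, so an absolute difference and a per-pixel maximum are assumed here, and all variable names are illustrative.

```python
def composite_foreground_alpha(initial, foreground_pixel):
    """Trace the second..fifth material balls for one pixel
    (assumed operators: abs-difference and max)."""
    second = initial * foreground_pixel   # adjust alpha in the non-transparent area
    third = abs(initial - second)         # synthesize initial + second by difference
    reverse = 1.0 - initial               # inversion of the initial transparency
    fourth = abs(second - reverse)        # synthesize second + reverse by difference
    fifth = max(third, fourth)            # synthesize third + fourth by maximum
    first = max(second, fifth)            # synthesize second + fifth by maximum
    return first
```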
In one possible embodiment, the static texture map comprises an environment map, and the first pixel parameter comprises a gray pixel parameter and/or a pixel parameter of any color channel; the processor, when configured to perform adjusting the initial transparency parameter of the initial material ball by using the first pixel parameter of the static texture map to obtain a first material ball having a first transparency parameter, is specifically configured to:
adjusting the initial transparency parameter of the initial material ball by utilizing the gray pixel parameter of the environment map or the pixel parameter of any color channel to obtain a first material ball with a first transparency parameter; wherein the environment map is used for rendering an environment pattern in the surface texture of the target model.
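As a sketch of the embodiment above, the environment map's contribution can be taken either from a grayscale value or from a single color channel. The Rec. 601 luminance weights and the RGB tuple layout below are assumptions for illustration, not specified by the embodiment.

```python
def env_alpha(initial_alpha, env_rgb, channel=None):
    """Scale alpha by the environment map's gray value or by one channel."""
    r, g, b = env_rgb
    if channel is None:
        # gray pixel parameter (assumed Rec. 601 luminance weights)
        value = 0.299 * r + 0.587 * g + 0.114 * b
    else:
        # pixel parameter of any one color channel: "r", "g" or "b"
        value = env_rgb["rgb".index(channel)]
    return initial_alpha * value
```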
In one possible embodiment, the second pixel parameter includes a pixel parameter of any one color channel; when the processor is configured to execute adjusting the first transparency parameter of the first material ball by using the second pixel parameter of the dynamic texture map changing with time to obtain a target material ball with a transparency parameter changing with time, the processor is specifically configured to:
and adjusting the first transparency parameter of the first material ball by using the pixel parameter of any color channel of the dynamic texture mapping changing along with the time to obtain a target material ball with the transparency parameter changing along with the time.
In one possible embodiment, the dynamic texture map comprises a fog dynamic map; the processor is used for obtaining the fog dynamic mapping by the following steps:
acquiring an original noise map according to the surface texture of the target model;
carrying out offset processing on a first original coordinate of each pixel position in the original noise map by using a preset first offset parameter to obtain a first offset coordinate of each pixel position;
sampling the original noise map according to the first original coordinate of each pixel position and the first offset coordinate of each pixel position to obtain the fog dynamic map; and the fog dynamic mapping is used for performing fog effect rendering on the surface texture of the target model.
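The three steps above can be sketched as follows. The wrap-around (tiled) sampling and the averaging of the two samples are assumptions; the embodiment only states that the noise map is sampled at both the original and the offset coordinate of each pixel position.

```python
def fog_dynamic_map(noise, offset_u, offset_v):
    """noise: 2D list of floats; returns a same-sized fog map.

    Integer pixel offsets stand in for the preset first offset
    parameter; a real shader would scroll fractional UVs over time.
    """
    h, w = len(noise), len(noise[0])
    fog = []
    for y in range(h):
        row = []
        for x in range(w):
            base = noise[y][x]                                # sample at the first original coordinate
            moved = noise[(y + offset_v) % h][(x + offset_u) % w]  # sample at the first offset coordinate
            row.append(0.5 * (base + moved))                  # combine both samples (assumed average)
        fog.append(row)
    return fog
```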
In one possible embodiment, the dynamic texture map further comprises a light and shadow dynamic map; the processor is configured to obtain the light and shadow dynamic map through the following steps:
acquiring an original light and shadow map according to the surface texture of the target model;
performing offset processing on a second original coordinate of each pixel position in the original light and shadow map by using a preset second offset parameter to obtain a flowing light and shadow map; wherein each pixel position in the flowing light and shadow map has a second offset coordinate;
according to the flicker speed and the flicker time of the light and shadow dynamic map on the surface texture of the target model, performing offset processing on the second original coordinate of each pixel position in the original light and shadow map to obtain a third offset coordinate of each pixel position;
sampling the original light and shadow map by using the third offset coordinate of each pixel position to obtain a flickering light and shadow map;
and synthesizing the flowing light and shadow map and the flickering light and shadow map according to the second offset coordinate of each pixel position and the third offset coordinate of each pixel position to obtain the light and shadow dynamic map; wherein the light and shadow dynamic map is used for rendering the light and shadow effect of the surface texture of the target model.
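For one pixel, the flow-plus-flicker construction above might look like the sketch below. The additive offset formula, the max-combine, and the `shadow_sample` callback are all illustrative assumptions; the embodiment does not fix these operators.

```python
def shadow_dynamic_value(shadow_sample, u, flow_offset, flicker_speed, t):
    """shadow_sample: function mapping a UV coordinate in [0,1) to a shadow value."""
    # flowing map: fixed preset offset (the "second offset coordinate")
    flowing = shadow_sample((u + flow_offset) % 1.0)
    # flickering map: offset driven by flicker speed and time (the "third offset coordinate")
    flicker_offset = (flicker_speed * t) % 1.0
    flickering = shadow_sample((u + flicker_offset) % 1.0)
    # synthesize the two maps (assumed per-pixel maximum)
    return max(flowing, flickering)
```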
In one possible embodiment, the texture map comprises a foreground pattern map, an environment map, a fog dynamic map, and a light and shadow dynamic map; the processor, when configured to perform the adjusting of the initial transparency parameter of the initial material ball based on the pixel parameter of the texture map to obtain a target material ball whose transparency parameter changes with time, is specifically configured to:
gradually adjusting the initial transparency parameter of the initial material ball based on the pixel parameter of the foreground pattern map, the pixel parameter of the environment map, the pixel parameter of the fog dynamic map and the pixel parameter of the light and shadow dynamic map respectively to obtain a target material ball after gradual adjustment; wherein, the transparency parameter of the target material ball after gradual adjustment changes along with time;
the processor is configured to perform rendering processing on the target model by using the target material ball, and when a dynamic effect of the target model is obtained, the processor is specifically configured to:
rendering the target model by using the gradually adjusted target material ball to obtain a target model with a dynamic effect; the foreground pattern map and the environment map are displayed in the dynamic effect, and the dynamic effect has a fog dynamic effect and a light and shadow dynamic effect.
In the above manner, the initial transparency parameter of the initial material ball is obtained according to the material information of the target model; the initial transparency parameter of the initial material ball is adjusted based on the pixel parameter of the texture map to obtain a target material ball whose transparency parameter changes with time; and the target model is rendered by using the target material ball to obtain the dynamic effect of the target model. Therefore, without making an animation special effect, a target model with a dynamic effect can be simulated through the time-varying transparency parameter carried by the target material ball; by avoiding the use of animation special effects, both the computational cost of rendering on the terminal device and the difficulty of producing the dynamic effect are reduced. In addition, so that the surface texture of the rendered target model displays the corresponding static texture map, the transparency of the material ball may be adjusted according to the first pixel parameter (for example, a transparency parameter) of the static texture map, which ensures that pixels in the area of the target model displaying the static texture map remain in a non-transparent state while that map is displayed. Furthermore, while the transparency parameter of the material ball changes, the realism of the rendering effect can be further increased by using the environment map to create simulated ambient light, so that simulated illumination information appears on the surface texture of the rendered target model.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division into units is merely a logical function division, and there may be other division manners in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some communication interfaces, apparatuses or units, and may be in electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solutions of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that: the above embodiments are merely specific implementations of the present application, used to illustrate rather than limit its technical solutions, and the protection scope of the present application is not limited thereto. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person familiar with the art may, within the technical scope disclosed in the present application, still modify the technical solutions described in the foregoing embodiments, readily conceive of changes, or make equivalent substitutions for some of their technical features; such modifications, changes or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the embodiments of the present application, and shall all be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (12)

1. A method for model rendering of dynamic effects, the method comprising:
acquiring an initial transparency parameter of an initial material ball according to the material information of the target model;
adjusting an initial transparency parameter of the initial material ball based on a pixel parameter of the texture mapping to obtain a target material ball with the transparency parameter changing along with time; wherein the texture map comprises a dynamic texture map; the dynamic texture mapping is used for performing dynamic effect rendering on the surface texture of the target model;
and rendering the target model by using the target material ball to obtain the dynamic effect of the target model.
2. The model rendering method of claim 1, wherein the texture map further comprises a static texture map; and the adjusting the initial transparency parameter of the initial material ball based on the pixel parameter of the texture map to obtain the target material ball whose transparency parameter changes with time comprises:
adjusting the initial transparency parameter of the initial material ball by using the first pixel parameter of the static texture mapping to obtain a first material ball with a first transparency parameter; wherein the static texture map is used for performing static pattern rendering on the surface texture of the target model;
and adjusting a first transparency parameter of the first material ball by using the second pixel parameter of the dynamic texture map changing along with the time to obtain a target material ball of which the transparency parameter changes along with the time.
3. The model rendering method of claim 2, wherein the static texture map comprises a foreground pattern map; the adjusting the initial transparency parameter of the initial material ball by using the first pixel parameter of the static texture map to obtain the first material ball with the first transparency parameter includes:
determining a transparent area and a non-transparent area in the initial material ball according to the surface texture of the target model;
adjusting the initial transparency parameter of the non-transparent area by using the first pixel parameter of the foreground pattern map to obtain a first material ball with a first transparency parameter; wherein the foreground pattern map is used to render foreground patterns in the surface texture of the target model.
4. The model rendering method of claim 3, wherein the adjusting the initial transparency parameter of the initial material ball by using the first pixel parameter of the foreground pattern map to obtain the first material ball with the first transparency parameter comprises:
adjusting the initial transparency parameter in the non-transparent area by using the first pixel parameter of the foreground pattern map to obtain a second material ball with a second transparency parameter;
synthesizing the initial material ball and the second material ball by using a first transparency difference value of each pixel position between the initial transparency parameter and the second transparency parameter to obtain a third material ball with a third transparency parameter;
performing inversion processing on the initial transparency parameters of the initial material ball to obtain a reverse material ball with reverse transparency parameters;
synthesizing the second material ball and the reverse material ball by using a second transparency difference value of each pixel position between the second transparency parameter and the reverse transparency parameter to obtain a fourth material ball with a fourth transparency parameter;
synthesizing the third material ball and the fourth material ball by using the first maximum transparency value of each pixel position between the third transparency parameter and the fourth transparency parameter to obtain a fifth material ball with a fifth transparency parameter;
and synthesizing the second material ball and the fifth material ball by using the second transparency maximum value of each pixel position between the second transparency parameter and the fifth transparency parameter to obtain the first material ball with the first transparency parameter.
5. The model rendering method of claim 2, wherein the static texture map comprises an environment map, and the first pixel parameter comprises a grayscale pixel parameter and/or a pixel parameter of any color channel; the adjusting the initial transparency parameter of the initial material ball by using the first pixel parameter of the static texture map to obtain the first material ball with the first transparency parameter includes:
adjusting the initial transparency parameter of the initial material ball by utilizing the gray pixel parameter of the environment map or the pixel parameter of any color channel to obtain a first material ball with a first transparency parameter; wherein the environment map is used for rendering an environment pattern in the surface texture of the target model.
6. The model rendering method of claim 2, wherein the second pixel parameter comprises a pixel parameter of any one color channel; the adjusting the first transparency parameter of the first material ball by using the second pixel parameter of the dynamic texture map changing along with the time to obtain the target material ball with the transparency parameter changing along with the time comprises the following steps:
and adjusting the first transparency parameter of the first material ball by using the pixel parameter of any color channel of the dynamic texture mapping changing along with the time to obtain a target material ball with the transparency parameter changing along with the time.
7. The model rendering method of claim 1, wherein the dynamic texture map comprises a fog dynamic map; obtaining the fog dynamic mapping by the following steps:
acquiring an original noise map according to the surface texture of the target model;
performing offset processing on a first original coordinate of each pixel position in the original noise map by using a preset first offset parameter to obtain a first offset coordinate of each pixel position;
sampling the original noise map according to the first original coordinate of each pixel position and the first offset coordinate of each pixel position to obtain the fog dynamic map; and the fog dynamic mapping is used for performing fog effect rendering on the surface texture of the target model.
8. The model rendering method of claim 1, wherein the dynamic texture map further comprises a light and shadow dynamic map, the light and shadow dynamic map being obtained through the following steps:
acquiring an original light and shadow map according to the surface texture of the target model;
performing offset processing on a second original coordinate of each pixel position in the original light and shadow map by using a preset second offset parameter to obtain a flowing light and shadow map; wherein each pixel position in the flowing light and shadow map has a second offset coordinate;
according to the flicker speed and the flicker time of the light and shadow dynamic map on the surface texture of the target model, performing offset processing on the second original coordinate of each pixel position in the original light and shadow map to obtain a third offset coordinate of each pixel position;
sampling the original light and shadow map by using the third offset coordinate of each pixel position to obtain a flickering light and shadow map;
and synthesizing the flowing light and shadow map and the flickering light and shadow map according to the second offset coordinate of each pixel position and the third offset coordinate of each pixel position to obtain the light and shadow dynamic map; wherein the light and shadow dynamic map is used for rendering the light and shadow effect of the surface texture of the target model.
9. The model rendering method of claim 1, wherein the texture map comprises a foreground pattern map, an environment map, a fog dynamic map, and a light and shadow dynamic map; and the adjusting the initial transparency parameter of the initial material ball based on the pixel parameter of the texture map to obtain the target material ball whose transparency parameter changes with time comprises:
gradually adjusting the initial transparency parameter of the initial material ball based on the pixel parameter of the foreground pattern map, the pixel parameter of the environment map, the pixel parameter of the fog dynamic map and the pixel parameter of the light and shadow dynamic map respectively to obtain a target material ball after gradual adjustment; wherein, the transparency parameter of the target material ball after gradual adjustment changes along with time;
the rendering processing of the target model by using the target material ball to obtain the dynamic effect of the target model comprises the following steps:
rendering the target model by using the gradually adjusted target material ball to obtain a target model with a dynamic effect; the foreground pattern map and the environment map are displayed in the dynamic effect, and the dynamic effect has a fog dynamic effect and a light and shadow dynamic effect.
10. A model rendering apparatus for a dynamic effect, the model rendering apparatus comprising:
the parameter acquisition module is used for acquiring an initial transparency parameter of the initial material ball according to the material information of the target model;
the parameter adjusting module is used for adjusting the initial transparency parameter of the initial material ball based on the pixel parameter of the texture map to obtain a target material ball with the transparency parameter changing along with time; wherein the texture map comprises a dynamic texture map; the dynamic texture mapping is used for performing dynamic effect rendering on the surface texture of the target model;
and the effect rendering module is used for rendering the target model by using the target material ball to obtain the dynamic effect of the target model.
11. An electronic device, comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating over the bus when the electronic device is run, the machine-readable instructions when executed by the processor performing the steps of the method for model rendering of dynamic effects according to any of claims 1 to 9.
12. A computer-readable storage medium, characterized in that a computer program is stored thereon, which, when being executed by a processor, performs the steps of the method for model rendering of dynamic effects according to any of claims 1 to 9.
CN202210601263.2A 2022-05-30 2022-05-30 Model rendering method and device for dynamic effect, electronic equipment and storage medium Pending CN114937103A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210601263.2A CN114937103A (en) 2022-05-30 2022-05-30 Model rendering method and device for dynamic effect, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210601263.2A CN114937103A (en) 2022-05-30 2022-05-30 Model rendering method and device for dynamic effect, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114937103A true CN114937103A (en) 2022-08-23

Family

ID=82866572

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210601263.2A Pending CN114937103A (en) 2022-05-30 2022-05-30 Model rendering method and device for dynamic effect, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114937103A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117541696A (en) * 2023-10-16 2024-02-09 北京百度网讯科技有限公司 Rendering and making method, device, equipment and storage medium of three-dimensional model map

Similar Documents

Publication Publication Date Title
CN110458930B (en) Rendering method and device of three-dimensional map and storage medium
US11734879B2 (en) Graphics processing using directional representations of lighting at probe positions within a scene
CN112316420B (en) Model rendering method, device, equipment and storage medium
US11010956B2 (en) Foveated rendering
CN110533707A (en) Illuminant estimation
CN113658316B (en) Rendering method and device of three-dimensional model, storage medium and computer equipment
CN108765520B (en) Text information rendering method and device, storage medium and electronic device
CN107492139B (en) Floodlight information processing method and device, storage medium, processor and terminal
CN105550973B (en) Graphics processing unit, graphics processing system and anti-aliasing processing method
JP2007066064A (en) Image generating device and image generating program
CN109712226A (en) The see-through model rendering method and device of virtual reality
WO2023098358A1 (en) Model rendering method and apparatus, computer device, and storage medium
WO2023098344A1 (en) Graphic processing method and apparatus, computer device, and storage medium
CN114937103A (en) Model rendering method and device for dynamic effect, electronic equipment and storage medium
CN115526976A (en) Virtual scene rendering method and device, storage medium and electronic equipment
CN108230430A (en) The processing method and processing device of cloud layer shade figure
Min et al. Soft shadow art
CN115272628A (en) Rendering method and device of three-dimensional model, computer equipment and storage medium
CN114266855A (en) Light effect simulation method and device of dot matrix screen and electronic equipment
CN114288671A (en) Method, device and equipment for making map and computer readable medium
CN115880127A (en) Rendering format selection method and related equipment thereof
CN112819929A (en) Water surface rendering method and device, electronic equipment and storage medium
CN117931979B (en) Building display method and related device in electronic map
CA2872653A1 (en) Image-generating system using beta distribution to provide accurate shadow mapping
CN116704098A (en) Method and device for generating directed distance field, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination