CN108159693B - Game scene construction method and device
- Publication number: CN108159693B (application number CN201711270075.1A)
- Authority: CN (China)
- Prior art keywords: scene, layer, data, constructed, channel
- Legal status: Active
Classifications
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
- G06T15/005—General purpose rendering architectures
- G06T15/04—Texture mapping
- G06T2200/04—Indexing scheme for image data processing or generation, in general involving 3D image data
Abstract
The invention provides a game scene construction method and device. The method includes the following steps: reading, for each image layer, the scene terrain data of a scene to be constructed from the single channel corresponding to that layer, and calculating a texture mixing result for the scene to be constructed from the scene terrain data; and reading the vertex color data of a vertex color layer corresponding to the scene to be constructed, and constructing the corresponding game scene environment based on the texture mixing result and the vertex color data. The method can construct a richly layered game scene environment with very few map resources, improve scene construction efficiency, and reduce the impact of the scene construction process on game performance.
Description
Technical Field
The invention relates to the technical field of game scene processing, and in particular to a game scene construction method and device.
Background
When constructing a game scene, in order to ensure that the landform environment of each scene area has rich layers and that the transitions between different scene areas are natural, artists must draw the corresponding scene landform environment with a large number of surface maps, each of which carries the data of four channels. As the scene area grows and the landform environment becomes more layered, an ever greater resource investment is required; moreover, during game running, the larger the number of surface maps and the larger their data volume, the heavier the resource burden placed on the device running the game (for example, a smartphone) and the greater the impact on its game performance.
Disclosure of Invention
In order to overcome the above defects in the prior art, an object of the invention is to provide a game scene construction method and a game scene construction device. The method can construct a richly layered game scene environment with very few map resources, improve scene construction efficiency, and reduce the negative impact of the scene construction process on game performance.
As for a method, a preferred embodiment of the present invention provides a game scene construction method, including:
reading scene terrain data of a scene to be constructed in a single channel corresponding to each image layer, and calculating to obtain a texture mixing result corresponding to the scene to be constructed according to the scene terrain data;
and reading vertex color data of a vertex color layer corresponding to the scene to be constructed, and constructing a corresponding game scene environment based on the texture mixing result and the vertex color data.
As for the apparatus, a preferred embodiment of the present invention provides a game scene constructing apparatus, including:
the data processing module is used for reading scene terrain data of a scene to be constructed in a single channel corresponding to each image layer, and calculating a texture mixing result corresponding to the scene to be constructed according to the scene terrain data;
and the scene construction module is used for reading the vertex color data of the vertex color layer corresponding to the scene to be constructed, and constructing a corresponding game scene environment based on the texture mixing result and the vertex color data.
Compared with the prior art, the game scene construction method and device provided by the preferred embodiments of the invention have the following beneficial effects. The method constructs a richly layered game scene environment with very few map resources, improves scene construction efficiency, and reduces the negative impact of the scene construction process on game performance. First, the method reads, for each image layer, the scene terrain data of the scene to be constructed from the single channel corresponding to that layer, and calculates the texture mixing result for the scene to be constructed from the scene terrain data. It then reads the vertex color data of the vertex color layer corresponding to the scene to be constructed, and constructs the corresponding game scene environment based on the texture mixing result and the vertex color data. By obtaining the scene terrain data from a single channel of each layer, the method greatly reduces the number and data volume of the scene surface maps, which lowers the resource occupation on the device running the game and lightens the workload of the artists; by using the vertex color layer to replace the artistic effect of a large number of surface maps, it improves scene construction efficiency. A richly layered game scene environment is therefore constructed with very few map resources, and the negative influence of the scene construction process on game performance is reduced.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and therefore should not be regarded as limiting the scope of the claims; those skilled in the art can derive other related drawings from these drawings without inventive effort.
Fig. 1 is a block diagram of a running device according to a preferred embodiment of the present invention.
Fig. 2 is a schematic flow chart of a game scene construction method according to a preferred embodiment of the present invention.
Fig. 3 is another flow chart of a game scene construction method according to a preferred embodiment of the invention.
Fig. 4 is a block diagram of the game scene constructing apparatus shown in fig. 1 according to a preferred embodiment of the present invention.
Fig. 5 is a block schematic diagram of the data processing module shown in fig. 4.
Fig. 6 is another block diagram of the game scene constructing apparatus shown in fig. 1 according to a preferred embodiment of the present invention.
Reference numerals: 10 - running device; 11 - memory; 12 - processor; 13 - communication unit; 14 - graphics card unit; 100 - game scene constructing apparatus; 110 - data processing module; 120 - scene construction module; 111 - reading submodule; 112 - mixing submodule; 130 - data import module.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
Some embodiments of the invention are described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
Fig. 1 is a block diagram of a running device 10 according to a preferred embodiment of the present invention. In the embodiment of the present invention, the running device 10 can construct a richly layered game scene environment with very few map resources while the game is running, thereby improving scene construction efficiency and reducing the negative impact of the scene construction process on game performance. In the present embodiment, the running device 10 may be, but is not limited to, a smartphone, a personal computer (PC), a tablet computer, a personal digital assistant (PDA), a server with an image processing function, or the like.
In this embodiment, the running device 10 may include a game scene constructing apparatus 100, a memory 11, a processor 12, a communication unit 13, and a graphics card unit 14. The memory 11, the processor 12, the communication unit 13 and the graphics card unit 14 are electrically connected to one another, directly or indirectly, to realize data transmission or interaction. For example, the components may be electrically connected to one another via one or more communication buses or signal lines. The game scene constructing apparatus 100 includes at least one software function module that can be stored in the memory 11 in the form of software or firmware, and the processor 12 executes various function applications and data processing by running the software programs and modules stored in the memory 11.
In this embodiment, the memory 11 may be, but is not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like. The memory 11 is used for storing a program, and the processor 12 executes the program after receiving an execution instruction. Further, the software programs and modules in the memory 11 may also include an operating system, which may include various software components and/or drivers for managing system tasks (e.g., memory management, storage device control, power management, etc.), and may communicate with various hardware or software components to provide an operating environment for other software components.
In this embodiment, the processor 12 may be an integrated circuit chip having signal processing capabilities. The processor 12 may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like, and may implement or perform the various methods, steps and logical blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
In this embodiment, the communication unit 13 is configured to establish a communication connection between the running device 10 and other external devices through a network, and to transmit data through the network.
In this embodiment, the graphics card unit 14 performs computation on graphics data to relieve the computational load on the processor 12. The core component of the graphics card unit 14 is a GPU (Graphics Processing Unit), which is configured to convert and drive the graphics data information required by the running device 10 and to control a display to present that information.
In this embodiment, through the game scene constructing apparatus 100 stored in the memory 11, the running device 10 constructs the scene to be constructed into a richly layered scene environment with very few map resources, which reduces the resource occupation of the scene construction process in the running device 10, reduces the influence of the scene construction process on game performance, and correspondingly lightens the workload of the artists.
It will be appreciated that the configuration shown in fig. 1 is merely a schematic illustration of the configuration of the running device 10, and that the running device 10 may include more or fewer components than shown in fig. 1, or have a different configuration from that shown in fig. 1. The components shown in fig. 1 may be implemented in hardware, software, or a combination thereof.
Fig. 2 is a schematic flow chart of a game scene construction method according to a preferred embodiment of the present invention. In the embodiment of the present invention, the game scene construction method is applied to the above-mentioned running device 10, and the specific flow and steps of the game scene construction method shown in fig. 2 are described in detail below.
In the embodiment of the invention, the game scene construction method comprises the following steps:
step S210, reading scene terrain data of a scene to be constructed in a single channel corresponding to each image layer, and calculating to obtain a texture mixing result corresponding to the scene to be constructed according to the scene terrain data.
In this embodiment, the scene terrain data represents how the various terrain structures of the scene to be constructed are distributed in the single-channel maps corresponding to the different layers, where the single-channel map corresponding to each layer describes the surface texture of the objects of the scene to be constructed in that layer. The texture mixing result corresponding to the scene to be constructed is obtained by mixing the scene terrain data of all layers at each position, so that the gray value data at each pixel point represents the color distribution at that position.
The scene terrain data includes channel texture data and terrain weight data corresponding to each layer of the scene to be constructed. The channel texture data is the texture information in the corresponding single-channel map, and the terrain weight data may include the distribution of the various terrain structures within the corresponding layer, the proportion that each terrain structure occupies in that layer, and the proportion that the corresponding layer occupies in the whole scene to be constructed. Optionally, the step of reading the scene terrain data of the scene to be constructed from the single channel corresponding to each layer includes:
reading corresponding channel texture data from a single-channel map of the scene to be constructed on a single channel corresponding to each layer, wherein each layer corresponds to one channel;
and obtaining the corresponding terrain weight data of each layer of the scene to be constructed according to the channel texture data corresponding to each layer.
In this embodiment, the pixel software three-dimensional engine running in the running device 10 allocates one channel to each layer, so that the surface map corresponding to each layer is a single-channel map on the corresponding channel. For example, if a layer is allocated the A (Alpha) channel, its surface map is a single-channel map on the A channel; if a layer is allocated the R (Red) channel, its surface map is a single-channel map on the R channel; if a layer is allocated the G (Green) channel, its surface map is a single-channel map on the G channel; and if a layer is allocated the B (Blue) channel, its surface map is a single-channel map on the B channel.
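For illustration only, the following Python sketch (using NumPy and Pillow) shows one way the per-layer scene terrain data could be read from the single channels of a surface map and turned into terrain weight data; the file name, the packing of four layers into one RGBA map, and the normalization step are assumptions, not details taken from the patent.

```python
# Minimal illustrative sketch only -- not the patented engine code.
# Assumptions: the per-layer masks are packed into one RGBA surface map
# ("terrain_masks.png" is a hypothetical file name), and terrain weights
# are derived by per-pixel normalization, which the patent does not specify.
import numpy as np
from PIL import Image


def read_layer_channels(path: str) -> dict:
    """Split one RGBA surface map into four single-channel layer masks."""
    rgba = np.asarray(Image.open(path).convert("RGBA"), dtype=np.float32) / 255.0
    return {"R": rgba[..., 0], "G": rgba[..., 1],
            "B": rgba[..., 2], "A": rgba[..., 3]}


def terrain_weights(channels: dict) -> dict:
    """Turn the channel texture data into per-pixel terrain weight data."""
    stack = np.stack(list(channels.values()), axis=0)
    total = np.clip(stack.sum(axis=0), 1e-6, None)  # avoid division by zero
    return {name: mask / total for name, mask in channels.items()}


# Hypothetical usage:
# channels = read_layer_channels("terrain_masks.png")
# weights = terrain_weights(channels)
```

One reading of this channel-per-layer scheme is that a single RGBA map can carry the masks of up to four layers, one per channel, instead of one full four-channel map per layer, which is where the reduction in map count and data volume comes from.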
In this embodiment, the channel texture data corresponding to each layer includes the color information of that layer, and the pixel software three-dimensional engine running in the running device 10 can obtain the color information corresponding to each layer by reading and analyzing the channel texture data corresponding to each layer, where the color information is the gray value data of that layer in its single-channel map. Optionally, the step of calculating the texture mixing result corresponding to the scene to be constructed according to the scene terrain data of each layer includes:
obtaining a diffuse reflection map of the scene to be constructed based on the channel texture data corresponding to each map layer;
and calculating to obtain the texture mixing effect corresponding to each pixel point in the diffuse reflection map based on the color information corresponding to each map layer and the terrain weight data corresponding to each map layer.
In this embodiment, the pixel software three-dimensional engine running in the running device 10 obtains the diffuse reflection map, which represents information such as the inherent surface color and material characteristics of the scene to be constructed, by performing texture mixing on the channel texture data corresponding to each pixel point in each layer. The pixel software three-dimensional engine can calculate the texture mixing result corresponding to each pixel point in the diffuse reflection map according to the color information corresponding to that pixel point in each layer and the terrain weight data corresponding to that pixel point in each layer, where the calculation formula of the texture mixing result corresponding to each pixel point is:

Ctex = C1 × W1 + C2 × W2 + ... + Cn × Wn = Σ (Ci × Wi), i = 1, ..., n

where n represents the total number of layers, Ci represents the color information of the i-th layer map, Wi represents the weight data of the i-th layer map, and Ctex represents the mixed result of all the layer maps.
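The weighted mixing above can be written as a short Python sketch; the array shapes and floating-point color representation are assumptions for illustration, and this is not the engine's shader code.

```python
# Illustrative sketch of the blend Ctex = sum_i Ci * Wi over n layers.
import numpy as np


def blend_layers(colors: list, weights: list) -> np.ndarray:
    """colors[i]: H x W x 3 float array, color information Ci of layer i.
    weights[i]: H x W float array, terrain weight Wi of layer i."""
    c_tex = np.zeros_like(colors[0], dtype=np.float32)
    for c_i, w_i in zip(colors, weights):
        c_tex += c_i * w_i[..., None]  # broadcast the weight over the color channels
    return c_tex
```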
Step S220, reading vertex color data of a vertex color layer corresponding to the scene to be constructed, and constructing a corresponding game scene environment based on the texture mixing result and the vertex color data.
In this embodiment, the vertex color layer is used to render the vertex colors of the terrain structure and the atmosphere of the scene environment. From the texture mixing result of the scene to be constructed and the vertex color data of the vertex color layer corresponding to the scene to be constructed, the pixel software three-dimensional engine running in the running device 10 can obtain the color distribution condition of the environment corresponding to each pixel point when the scene is constructed, and accordingly construct the corresponding scene environment; a brief sketch of this combination follows the two steps below. Optionally, the step of constructing a corresponding game scene environment based on the texture mixing result and the vertex color data includes:
multiplying the texture mixing effect corresponding to each pixel point in the diffuse reflection map by the vertex color data of the corresponding pixel point in the vertex color layer, to obtain the color distribution condition of the environment corresponding to the scene to be constructed;
and constructing a corresponding scene environment based on the diffuse reflection map and the color distribution condition of the environment corresponding to the scene to be constructed.
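A minimal sketch of this combination step, assuming the texture mixing result and the vertex colors interpolated onto the same pixel grid are available as per-pixel arrays (again illustrative, not the engine's implementation):

```python
# Final environment color = texture mixing result * vertex color, per pixel.
import numpy as np


def apply_vertex_color(c_tex: np.ndarray, vertex_color: np.ndarray) -> np.ndarray:
    """c_tex: H x W x 3 blended diffuse map; vertex_color: H x W x 3 colors
    interpolated from the vertex color layer onto the same pixel grid."""
    return np.clip(c_tex * vertex_color, 0.0, 1.0)
```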
Fig. 3 is another schematic flow chart of the game scene construction method according to a preferred embodiment of the invention. In this embodiment of the present invention, before step S210, the game scene construction method may further include:
step S209, importing a single-channel map corresponding to each layer of the scene to be constructed at the single channel corresponding to each layer, and correspondingly importing the vertex color layer corresponding to the scene to be constructed.
In this embodiment, the pixel software three-dimensional engine running in the running device 10 can receive, at the single channel corresponding to each layer, the single-channel map of the scene to be constructed imported by the game designer, together with the vertex color layer corresponding to the scene to be constructed. The single-channel map of each layer and the vertex color layer may be drawn by an artist with external drawing software (for example, Photoshop) according to the construction requirements of the scene to be constructed.
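On the asset-preparation side, the sketch below packs up to four grayscale layer masks drawn in external software into the channels of one map before it is imported; the helper and file names are hypothetical and only illustrate the single-channel-per-layer idea.

```python
# Illustrative only: pack up to four grayscale layer masks into one RGBA map.
import numpy as np
from PIL import Image


def pack_layer_masks(mask_paths: list, out_path: str) -> None:
    masks = [np.asarray(Image.open(p).convert("L"), dtype=np.uint8) for p in mask_paths]
    height, width = masks[0].shape
    rgba = np.zeros((height, width, 4), dtype=np.uint8)
    for channel, mask in enumerate(masks[:4]):
        rgba[..., channel] = mask  # R, G, B, A in import order
    Image.fromarray(rgba, mode="RGBA").save(out_path)


# Hypothetical usage:
# pack_layer_masks(["grass.png", "rock.png", "sand.png", "snow.png"],
#                  "terrain_masks.png")
```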
Fig. 4 is a block diagram of the game scene constructing apparatus 100 shown in fig. 1 according to a preferred embodiment of the present invention. In the embodiment of the present invention, the game scene constructing apparatus 100 includes a data processing module 110 and a scene constructing module 120.
The data processing module 110 is configured to read scene terrain data of a scene to be constructed in a single channel corresponding to each layer, and calculate a texture mixing result corresponding to the scene to be constructed according to the scene terrain data.
In this embodiment, the data processing module 110 may execute step S210 shown in fig. 2, and the specific execution process may refer to the above detailed description of step S210.
Optionally, please refer to fig. 5, which is a block diagram illustrating the data processing module 110 shown in fig. 4. In this embodiment, the data processing module 110 includes a reading sub-module 111 and a mixing sub-module 112.
The reading submodule 111 is configured to read scene terrain data corresponding to a scene to be constructed in each layer.
In this embodiment, the scene terrain data includes channel texture data and terrain weight data corresponding to the scene to be constructed in each layer, and the manner in which the reading submodule 111 reads the scene terrain data corresponding to the scene to be constructed in each layer includes:
reading corresponding channel texture data from a single-channel map of the scene to be constructed on a single channel corresponding to each layer, wherein each layer corresponds to one channel;
and obtaining the corresponding terrain weight data of each layer of the scene to be constructed according to the channel texture data corresponding to each layer.
The mixing submodule 112 is configured to calculate the texture mixing result corresponding to the scene to be constructed according to the scene terrain data of each layer.
In this embodiment, the channel texture data corresponding to each layer includes the color information of the corresponding layer, and the manner in which the mixing submodule 112 calculates the texture mixing result corresponding to the scene to be constructed according to the scene terrain data of each layer includes:
obtaining a diffuse reflection map of the scene to be constructed based on the channel texture data corresponding to each map layer;
and calculating to obtain the texture mixing effect corresponding to each pixel point in the diffuse reflection map based on the color information corresponding to each map layer and the terrain weight data corresponding to each map layer.
Referring to fig. 4 again, the scene constructing module 120 is configured to read vertex color data of a vertex color layer corresponding to the scene to be constructed, and construct a corresponding game scene environment based on the texture mixing result and the vertex color data.
In this embodiment, the manner in which the scene constructing module 120 constructs the corresponding game scene environment based on the texture mixing result and the vertex color data includes:
multiplying the texture mixing effect corresponding to each pixel point in the diffuse reflection map by the vertex color data of the corresponding pixel point in the vertex color layer, to obtain the color distribution condition of the environment corresponding to the scene to be constructed;
and constructing a corresponding scene environment based on the diffuse reflection map and the color distribution condition of the environment corresponding to the scene to be constructed.
In this embodiment, the scene building module 120 may execute step S220 shown in fig. 2, and the specific execution process may refer to the above detailed description of step S220.
Fig. 6 is another block diagram of the game scene constructing apparatus 100 shown in fig. 1 according to a preferred embodiment of the present invention. In this embodiment of the present invention, the game scene constructing apparatus 100 may further include a data importing module 130.
The data importing module 130 is configured to import a single-channel map corresponding to each layer of a scene to be constructed at a single channel corresponding to each layer, and correspondingly import a vertex color layer corresponding to the scene to be constructed.
In this embodiment, the data importing module 130 may execute step S209 shown in fig. 3, and the specific execution process may refer to the above detailed description of step S209.
In summary, with the game scene construction method and apparatus provided in the preferred embodiments of the present invention, a richly layered game scene environment can be constructed with very few map resources, scene construction efficiency is improved, and the negative impact of the scene construction process on game performance is reduced. First, the method reads, for each image layer, the scene terrain data of the scene to be constructed from the single channel corresponding to that layer, and calculates the texture mixing result for the scene to be constructed from the scene terrain data. It then reads the vertex color data of the vertex color layer corresponding to the scene to be constructed, and constructs the corresponding game scene environment based on the texture mixing result and the vertex color data. By obtaining the scene terrain data from a single channel of each layer, the method greatly reduces the number and data volume of the scene surface maps, which lowers the resource occupation on the device running the game and lightens the workload of the artists; by using the vertex color layer to replace the artistic effect of a large number of surface maps, it improves scene construction efficiency. A richly layered game scene environment is therefore constructed with very few map resources, and the negative influence of the scene construction process on game performance is reduced.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (8)
1. A game scene construction method, characterized in that the method comprises:
reading scene terrain data of a scene to be constructed in a single channel corresponding to each image layer, and calculating to obtain a texture mixing result corresponding to the scene to be constructed according to the scene terrain data;
reading vertex color data of a vertex color layer corresponding to the scene to be constructed, and constructing a corresponding game scene environment based on the texture mixing result and the vertex color data, wherein the vertex color layer is used for rendering the vertex color of the terrain structure and atmosphere of the scene environment;
the scene terrain data comprises channel texture data and terrain weight data corresponding to the scene to be constructed on each layer, and the step of reading the scene terrain data corresponding to the scene to be constructed on each layer comprises the following steps:
reading corresponding channel texture data from a single-channel map of the scene to be constructed on a single channel corresponding to each layer, wherein each layer corresponds to one channel;
and obtaining the corresponding terrain weight data of each layer of the scene to be constructed according to the channel texture data corresponding to each layer.
2. The method according to claim 1, wherein the channel texture data corresponding to each layer includes color information of the corresponding layer, and the step of calculating the texture mixing result corresponding to the scene to be constructed according to the scene terrain data of each layer includes:
obtaining a diffuse reflection map of the scene to be constructed based on the channel texture data corresponding to each map layer;
and calculating to obtain the texture mixing effect corresponding to each pixel point in the diffuse reflection map based on the color information corresponding to each map layer and the terrain weight data corresponding to each map layer.
3. The method of claim 2, wherein the step of constructing a corresponding game scene environment based on the texture mixing result and the vertex color data comprises:
multiplying the texture mixing effect corresponding to each pixel point in the diffuse reflection map by the vertex color data of the corresponding pixel point in the vertex color layer to obtain the color distribution condition of the environment corresponding to the scene to be constructed;
and constructing a corresponding scene environment based on the diffuse reflection map and the color distribution condition of the environment corresponding to the scene to be constructed.
4. The method according to any one of claims 1 to 3, wherein before reading the scene terrain data corresponding to the scene to be constructed in each layer and calculating the texture mixing result corresponding to the scene to be constructed according to the scene terrain data of each layer, the method further comprises:
and importing a single-channel map of the scene to be constructed in each layer at a single channel corresponding to each layer, and correspondingly importing a vertex color layer corresponding to the scene to be constructed.
5. A game scene constructing apparatus, characterized in that the apparatus comprises:
the data processing module is used for reading scene terrain data of a scene to be constructed in a single channel corresponding to each image layer, and calculating a texture mixing result corresponding to the scene to be constructed according to the scene terrain data;
the scene construction module is used for reading vertex color data of a vertex color layer corresponding to the scene to be constructed, and correspondingly constructing a corresponding game scene environment based on the texture mixing result and the vertex color data, wherein the vertex color layer is used for rendering the vertex color of the terrain structure and atmosphere of the scene environment;
the scene terrain data comprises channel texture data and terrain weight data corresponding to the scene to be constructed in each layer, and the data processing module comprises:
the reading submodule is used for reading scene terrain data corresponding to the scene to be constructed in each layer;
the mode of reading the scene terrain data corresponding to the scene to be constructed in each layer by the reading submodule comprises the following steps:
reading corresponding channel texture data from a single-channel map of the scene to be constructed on a single channel corresponding to each layer, wherein each layer corresponds to one channel;
and obtaining the corresponding terrain weight data of each layer of the scene to be constructed according to the channel texture data corresponding to each layer.
6. The apparatus according to claim 5, wherein the channel texture data corresponding to each layer includes color information of the corresponding layer, and the data processing module further includes:
the mixing submodule is used for calculating the texture mixing result corresponding to the scene to be constructed according to the scene terrain data of each layer;
The manner in which the mixing submodule calculates the texture mixing result corresponding to the scene to be constructed according to the scene terrain data of each layer includes:
obtaining a diffuse reflection map of the scene to be constructed based on the channel texture data corresponding to each map layer;
and calculating to obtain the texture mixing effect corresponding to each pixel point in the diffuse reflection map based on the color information corresponding to each map layer and the terrain weight data corresponding to each map layer.
7. The apparatus of claim 6, wherein the manner in which the scene construction module constructs the corresponding game scene environment based on the texture mixing result and the vertex color data comprises:
multiplying the texture mixing effect corresponding to each pixel point in the diffuse reflection map by the vertex color data of the corresponding pixel point in the vertex color layer to obtain the color distribution condition of the environment corresponding to the scene to be constructed;
and constructing a corresponding scene environment based on the diffuse reflection map and the color distribution condition of the environment corresponding to the scene to be constructed.
8. The apparatus of any one of claims 5-7, further comprising:
and the data import module is used for importing a single-channel chartlet corresponding to each layer of the scene to be constructed at the single channel corresponding to each layer, and correspondingly importing the vertex color layer corresponding to the scene to be constructed.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711270075.1A CN108159693B (en) | 2017-12-05 | 2017-12-05 | Game scene construction method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108159693A CN108159693A (en) | 2018-06-15 |
CN108159693B (en) | 2020-11-13
Family
ID=62524414
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201711270075.1A Active CN108159693B (en) | 2017-12-05 | 2017-12-05 | Game scene construction method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108159693B (en) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110363837B (en) * | 2019-07-23 | 2023-05-23 | 网易(杭州)网络有限公司 | Method and device for processing texture image in game, electronic equipment and storage medium |
CN110665232A (en) * | 2019-08-30 | 2020-01-10 | 网易(杭州)网络有限公司 | Rendering method, device and equipment of ground surface in game and storage medium |
CN110975286A (en) * | 2019-12-19 | 2020-04-10 | 福建天晴在线互动科技有限公司 | Method and system for improving resource reusability based on game map |
CN112604273B (en) * | 2020-12-24 | 2021-08-24 | 完美世界(北京)软件科技发展有限公司 | Data-driven game system function loading method, device and storage medium |
CN113032699B (en) * | 2021-03-04 | 2023-04-25 | 广东博智林机器人有限公司 | Model construction method, model construction device and processor of robot |
CN113457125B (en) * | 2021-07-02 | 2024-02-23 | 珠海金山数字网络科技有限公司 | Game scene management method and system, computing device and computer readable storage medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102737097A (en) * | 2012-03-30 | 2012-10-17 | 北京峰盛博远科技有限公司 | Three-dimensional vector real-time dynamic stacking technique based on LOD (Level of Detail) transparent textures |
CN104835202A (en) * | 2015-05-20 | 2015-08-12 | 中国人民解放军装甲兵工程学院 | Quick three-dimensional virtual scene constructing method |
WO2016159884A1 (en) * | 2015-03-30 | 2016-10-06 | Agency For Science, Technology And Research | Method and device for image haze removal |
CN106204735A (en) * | 2016-07-18 | 2016-12-07 | 中国人民解放军理工大学 | Unity3D terrain data using method in Direct3D 11 environment |
CN106384375A (en) * | 2016-08-31 | 2017-02-08 | 北京像素软件科技股份有限公司 | Coloring fusion method and device for vegetation bottom in electronic game scene |
Legal Events
- PB01: Publication
- SE01: Entry into force of request for substantive examination
- GR01: Patent grant