
CN111973988A - Game model processing method, device, equipment and storage medium - Google Patents

Game model processing method, device, equipment and storage medium

Info

Publication number
CN111973988A
Authority
CN
China
Prior art keywords
model
skeleton
bone
transformation data
scaling
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010858322.5A
Other languages
Chinese (zh)
Other versions
CN111973988B (en)
Inventor
周少怀
黄剑武
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202010858322.5A priority Critical patent/CN111973988B/en
Priority to CN202311694353.1A priority patent/CN117861219A/en
Publication of CN111973988A publication Critical patent/CN111973988A/en
Application granted granted Critical
Publication of CN111973988B publication Critical patent/CN111973988B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60 - Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 - Animation
    • G06T13/20 - 3D [Three Dimensional] animation
    • G06T13/40 - 3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 - Manipulating 3D models or images for computer graphics
    • G06T19/20 - Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Architecture (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present application provides a method, an apparatus, a device, and a storage medium for processing a model in a game. The method includes: obtaining a scaling instruction for the model, where the scaling instruction is used to instruct scaling of a first bone in the model; obtaining model space transformation data of the sub-bones of the first bone according to the local transformation data of the first bone before scaling; determining the scaled model space transformation data of the first bone according to the scaled local transformation data of the first bone; and displaying the scaled model on a graphical user interface according to the scaled model space transformation data of the first bone and the model space transformation data of the sub-bones. The shape of the model can thus be controlled flexibly by adjusting its bones, improving the flexibility and accuracy of the model.

Description

Game model processing method, device, equipment and storage medium
Technical Field
The present application relates to computer technologies, and in particular, to a method, an apparatus, a device, and a storage medium for processing an in-game model.
Background
In game or video production, models are usually created by artists with professional tools. A model made by an artist represents a specific character and can be bound to bones, so that the movement of the bones drives the movement of the model.
If a character needs to show different body shapes, for example fatter or thinner, taller or shorter, an artist has to make a separate model for each shape; if a large number of different body or face shapes are required, a large amount of art resources must be produced to show them.
Although existing animation technology can adjust the bones of a model in real time to achieve a certain body- or face-adjustment effect, it cannot correctly handle the transfer of data between a parent bone and its child bones, so unreasonable deformation occurs when the bones are adjusted.
Disclosure of Invention
The present application provides a method, an apparatus, a device, and a storage medium for processing an in-game model, which allow the shape of the model to be controlled flexibly by adjusting its bones and improve the flexibility and accuracy of the model.
In a first aspect, the present application provides a method for processing an in-game model, the method comprising:
obtaining a scaling instruction for the model, wherein a skeleton of the model is composed of a plurality of bones, and the scaling instruction is used for instructing to scale a first bone in the model;
obtaining model space transformation data of the sub-bones of the first bone according to the local transformation data of the first bone before scaling;
determining the scaled model space transformation data of the first bone according to the scaled local transformation data of the first bone;
displaying the scaled model on a graphical user interface according to the scaled model space transformation data of the first bone and the model space transformation data of the sub-bones.
In one possible implementation, the obtaining model space transformation data of the sub-bones of the first bone according to the local transformation data of the first bone before scaling includes:
determining model space transformation data of the first bone before scaling according to the local transformation data of the first bone before scaling;
determining model space transformation data of the sub-bones of the first bone according to the model space transformation data of the first bone before scaling.
In one possible implementation, the determining model space transformation data of the first bone before scaling according to the local transformation data of the first bone before scaling includes:
determining the model space transformation data of the first bone before scaling according to the local transformation data of the first bone before scaling, model space transformation data of a root bone of the model, and local transformation data of bones at all levels between the root bone and the first bone.
In one possible implementation, the determining model space transformation data of a sub-bone of the first bone from model space transformation data of the first bone before scaling comprises:
determining model space transformation data for a sub-bone of the first bone from the pre-scaled model space transformation data for the first bone and local transformation data for the sub-bone of the first bone.
In one possible implementation, before the determining scaled model space transformation data of the first bone according to the scaled local transformation data of the first bone, the method further includes:
modifying the local transformation data of the first bone according to the scaling instruction to obtain the scaled local transformation data of the first bone.
In a possible implementation, the obtaining a scaling instruction for the model includes:
generating the scaling instruction in response to a scaling operation on a first skeleton of the model;
or,
generating the scaling instruction for the first skeleton when the model enters a preset scene or executes a preset action.
In a second aspect, the present application provides a method for processing an in-game model, the method comprising:
obtaining a scaling instruction for the model, wherein a skeleton of the model is composed of a plurality of bones, and the scaling instruction is used for instructing to scale a first bone in the model;
obtaining parameters of the first bone after scaling;
respectively determining parameters of other bones in the scaled model according to the parameters of the first bone and the overall skeleton conditions of the model; wherein the overall skeleton conditions are used for constraining the relative relationships of the bones of the model;
displaying the scaled model on a graphical user interface according to the parameters of the first bone and the parameters of the other bones in the scaled model.
In a possible implementation manner, determining parameters of other bones in the model after scaling according to the parameter of the first bone and the overall skeleton condition of the model respectively includes:
determining parameters of other bones in the model after scaling according to the parameters of the first bone, local transformation data before scaling of the first bone, model space transformation data of a root bone of the model, local transformation data of bones of all levels between the root bone and the first bone, local transformation data of sub-bones of the first bone, and overall skeleton conditions of the model.
In one possible implementation, the overall skeleton conditions include at least one of:
the height of the bone center point of the model is equal to the sum of the length of the lower body of the bone of the model and the height of the foot-sole bone of the model;
a height of a head bone of the model is equal to a sum of the center point height and a length of an upper body of a bone of the model;
the height of the neck bone of the model is equal to the height of the head bone of the model.
In one possible implementation, the parameters of the bone include height and/or length.
In a possible implementation, the obtaining a scaling instruction for the model includes:
generating the scaling instruction in response to a scaling operation on a first skeleton of the model;
or, when the model enters a preset scene or executes a preset action, generating the scaling instruction for the first skeleton.
In a third aspect, the present application provides an apparatus for processing an in-game model, comprising:
an obtaining module, configured to obtain a scaling instruction for the model, where a skeleton of the model is composed of a plurality of bones, and the scaling instruction is used to instruct to scale a first bone in the model;
the processing module is used for obtaining model space transformation data of the sub-bones of the first bone according to the local transformation data of the first bone before scaling, and for determining the scaled model space transformation data of the first bone according to the scaled local transformation data of the first bone;
and the display module is used for displaying the scaled model on the graphical user interface according to the scaled model space transformation data of the first skeleton and the model space transformation data of the sub-skeleton.
In one possible implementation, the processing module is configured to:
determining model space transformation data of the first bone from the local transformation data of the first bone before scaling;
determining model space transformation data for a sub-bone of the first bone from the model space transformation data for the first bone.
In one possible implementation, the processing module is configured to:
determining the model space transformation data of the first bone before scaling according to the local transformation data of the first bone before scaling, model space transformation data of a root bone of the model, and local transformation data of bones at all levels between the root bone and the first bone.
In one possible implementation, the processing module is configured to:
determining model space transformation data for a sub-bone of the first bone from the pre-scaled model space transformation data for the first bone and local transformation data for the sub-bone of the first bone.
In one possible implementation, the processing module is configured to:
and modifying the local transformation data of the first skeleton according to the scaling instruction to obtain the scaled local transformation data of the first skeleton.
In one possible implementation manner, the obtaining module is configured to:
generating the scaling instruction in response to a user's scaling operation on a first skeleton of the model; or, when the model enters a preset scene or executes a preset action, generating the scaling instruction for the first skeleton.
In a fourth aspect, the present application provides an apparatus for processing an in-game model, comprising:
an obtaining module, configured to obtain a scaling instruction for the model, where a skeleton of the model is composed of a plurality of bones, and the scaling instruction is used to instruct to scale a first bone in the model; and to obtain parameters of the first bone after scaling;
the processing module is used for respectively determining parameters of other bones in the scaled model according to the parameters of the first bone and the overall skeleton conditions of the model; wherein the overall skeleton conditions are used for constraining the relative relationships of the bones of the model;
and the display module is used for displaying the scaled model on a graphical user interface according to the parameter of the first skeleton and the scaled parameters of the other skeletons.
In one possible implementation, the processing module is configured to:
determining parameters of other bones in the model after scaling according to the parameters of the first bone, local transformation data before scaling of the first bone, model space transformation data of a root bone of the model, local transformation data of bones of all levels between the root bone and the first bone, local transformation data of sub-bones of the first bone, and overall skeleton conditions of the model.
In one possible implementation, the overall skeleton conditions include at least one of:
a center point height of a bone of the model is equal to a sum of a length of a lower body of the bone of the model and a height of a plantar bone of the model;
the height of the head bone of the model is equal to the sum of the height of the center point and the upper half length of the bone of the model;
the height of the neck bone of the model is equal to the height of the head bone of the model.
In one possible implementation, the parameters of the bone include height and/or length.
In one possible implementation manner, the obtaining module is configured to:
generating the scaling instruction in response to a user's scaling operation on a first skeleton of the model; or, when the model enters a preset scene or executes a preset action, generating the scaling instruction for the first skeleton.
In a fifth aspect, the present application provides an electronic device comprising a memory and a processor, the memory and the processor being connected;
the memory is used for storing a computer program;
the processor is configured to implement the method according to any of the first aspect as described above when the computer program is executed.
In a sixth aspect, the present application provides an electronic device comprising a memory and a processor, the memory and the processor being connected;
the memory is used for storing a computer program;
the processor is configured to implement the method according to any of the second aspect as described above when the computer program is executed.
In a seventh aspect, the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method according to any of the first aspect above.
In an eighth aspect, the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method according to any of the second aspects above.
By improving the way transformation data is transferred between a parent bone and its child bones, the method separates the scaling operation of a bone from the data-transfer calculation, so that the scaling operation of a parent bone is not passed on to its child bones. A parent bone can therefore be edited with arbitrary scaling along different axes without affecting its child bones, which allows the bones of the model to be adjusted freely and improves the flexibility and accuracy of the model.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to these drawings without inventive exercise.
FIG. 1 is a first diagram illustrating scaling of a model according to the prior art;
FIG. 2 is a first flowchart illustrating a method for processing an in-game model according to the present application;
FIG. 3 is a first schematic diagram illustrating scaling of a model provided herein;
FIG. 4 is a second diagram illustrating scaling of a model according to the prior art;
FIG. 5 is a second flowchart illustrating a method for processing an in-game model according to the present application;
FIG. 6 is a second schematic diagram of model scaling provided herein;
FIG. 7 is a first schematic structural diagram of a processing device for in-game models provided in the present application;
FIG. 8 is a second schematic structural diagram of a processing apparatus for in-game models provided in the present application;
FIG. 9 is a first schematic structural diagram of an electronic device provided in the present application;
FIG. 10 is a second schematic structural diagram of an electronic device provided in the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
First, terms of art referred to in the present application will be explained.
Bone (Bone): the method refers to a group of data which is produced by simulating a real person and driving a model vertex in a computer, and mainly has the attributes of movement, rotation, scaling and the like. The skeleton set of a person is in a tree form, the skeleton set of the person has a unique root skeleton (usually at the waist position of the person), and other sub-skeletons are gradually layered from the root skeleton. Each character model has a set of bones.
Root bone (RootBone): only the child nodes have no parent.
Level N sub-skeleton (LevelNBone): the nth layer of bone begins with the root bone.
Local transformation data of the skeleton: the relative relationship of the child skeleton in the spatial coordinate system of the parent skeleton is represented by a matrix.
Model space transformation data of bone: the relative relationship of the bone with respect to the model space.
In typical 3D games, the motion of a character is generally computed through skeletal animation. In a game, bone data is usually expressed as the local transformation data of a child bone relative to its parent bone, represented either by a matrix or by movement/rotation/scaling components; the different data forms are only a matter of project convenience and have no essential difference. For ease of description, the embodiments of the present application use matrices to represent bone data.
When bone data is expressed by matrices, the local transformation data of a child bone relative to its parent bone is called BoneLocalMatrix, the corresponding local transformation data of the root bone is called RootBoneLocalMatrix, and the local transformation data of the level-N bone is called LevelNBoneLocalMatrix. The model space transformation data of a bone relative to the model space is called BoneModelMatrix, the corresponding model space transformation data of the root bone is called RootBoneModelMatrix, and the model space transformation data of the level-N bone is called LevelNBoneModelMatrix. Since the root bone has no parent bone and takes the origin of the model space as its own origin, the BoneLocalMatrix and BoneModelMatrix of the root bone are the same.
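As an illustration only (not part of the original disclosure), the bone hierarchy and the relation between BoneLocalMatrix and BoneModelMatrix described above can be sketched as follows; the class and field names are assumptions made for the example.

```python
import numpy as np

class Bone:
    """Illustrative bone node: stores BoneLocalMatrix relative to its parent."""
    def __init__(self, name, local_matrix, parent=None):
        self.name = name
        self.local_matrix = np.asarray(local_matrix, dtype=float)  # BoneLocalMatrix
        self.parent = parent                                       # None for the root bone
        self.children = []
        if parent is not None:
            parent.children.append(self)

    def model_matrix(self):
        # BoneModelMatrix: for the root bone it equals BoneLocalMatrix;
        # otherwise it is the parent's model matrix times the local matrix.
        if self.parent is None:
            return self.local_matrix
        return self.parent.model_matrix() @ self.local_matrix
```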
In order to drive the vertices of the model by bones, the bone space needs to be transformed into the model space, that is, BoneLocalMatrix needs to be converted into BoneModelMatrix. In the prior art, the LevelNBoneModelMatrix of the level-N bone is usually computed along the chain root bone -> level-1 bone -> level-2 bone -> ... -> level-N bone, that is, by the following formula (1):
LevelNBoneModelMatrix=RootBoneModelMatrix*Level1BoneLocalMatrix*Level2BoneLocalMatrix*...*LevelNBoneLocalMatrix (1)
It can be seen from formula (1) that if a parent bone carries a non-uniform scaling, for example if an extra scaling is applied to the LevelKBoneLocalMatrix of the level-K bone, the bones at level K+1 are also affected by the scaling of level K, because the matrices are transferred from parent node to child node. In other words, the scaling of the parent node acts directly on the space of its child nodes and causes unpredictable deformation of the child bones. For example, as shown in FIG. 1, after the calf bone of the model is shrunk, the scaling of the calf bone is transferred to the foot bone, so the foot of the model is unreasonably shrunk as well.
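A small numeric sketch of this problem (illustrative only, with assumed helper names): a non-uniform scale in the calf bone's local matrix is passed straight through formula (1) to the foot bone.

```python
import numpy as np

def scale_matrix(sx, sy, sz):
    # 4x4 affine scaling matrix
    return np.diag([sx, sy, sz, 1.0])

root_model = np.eye(4)                    # RootBoneModelMatrix
calf_local = scale_matrix(1.0, 0.5, 1.0)  # calf bone shrunk to half along Y
foot_local = np.eye(4)                    # the foot itself is not scaled

# Prior-art transfer per formula (1): the foot inherits the calf's scaling.
foot_model = root_model @ calf_local @ foot_local
print(np.diag(foot_model))  # -> [1. 0.5 1. 1.]: the foot is shrunk as well
```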
To solve this problem and avoid the unwanted influence of a parent bone's scaling on its child bones, the present application improves the prior-art way of transferring transformation data between parent and child bones: the scaling operation of a bone is separated from the bone data-transfer calculation. When the model space transformation data of a child bone is determined, the scaling of the parent bone is not taken into account; the parent bone's data before scaling is transferred to the child bone, and the parent bone's own data is then adjusted according to the scaling operation. As a result, a child bone is not affected by the scaling of its parent bone, so any part of the skeleton can be adjusted freely in the game to change the body shape of the model without causing unwanted deformation of other parts.
The following describes the processing method of the in-game model provided by the present application in detail with reference to specific embodiments. It is to be understood that the following embodiments may be combined with each other, and the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 2 is a first flowchart of the method for processing an in-game model provided by the present application. The method may be executed by a processing apparatus for an in-game model, which is implemented in software and/or hardware and may be an electronic device, for example a terminal device running a game application, such as a mobile phone, a tablet computer, or a personal computer, or a game server. The terminal device obtains a graphical user interface by executing the game application and rendering on a display; the graphical user interface includes the model, and the skeleton of the model is composed of a plurality of bones. The method includes the following steps:
S201, obtaining a scaling instruction for the model, where the scaling instruction is used to instruct scaling of the first bone in the model.
The model in this embodiment may be a character model, an animal model, or a model of other virtual images set in a game, which is not limited in this embodiment. For convenience of explanation, the present embodiment takes a character model as an example.
The scaling instruction for the model may be triggered by a user. For example, the user drags a part of the model in the graphical user interface to scale it, that is, performs a scaling operation on the part corresponding to the first bone, and the electronic device generates the scaling instruction in response to the user's scaling operation on the first bone of the model.
The scaling instruction for the model may also be triggered automatically while the game is running. For example, when the model enters a preset scene or performs a preset action, the generation of a scaling instruction for the first bone is triggered.
It is understood that the first bone may include one or more bones. For example, the bones of the nose of the character model may be added to one group, and when the nose is scaled, all the bones in the group are scaled.
S202, obtaining model space transformation data of the child bones of the first bone according to the local transformation data of the first bone before scaling.
To prevent the scaling of the first bone from being transferred to its child bones, in this embodiment the model space transformation data of the child bones of the first bone is calculated from the local transformation data of the first bone before scaling, so that it is not affected by the scaling of the first bone.
S203, determining the scaled model space transformation data of the first bone according to the scaled local transformation data of the first bone.
After the model space transformation data of the child bones of the first bone has been determined, the model space transformation data of the first bone can be determined according to the scaling operation on the first bone, that is, the scaled model space transformation data of the first bone is determined according to the scaled local transformation data of the first bone.
S204, displaying the scaled model on the graphical user interface according to the scaled model space transformation data of the first bone and the model space transformation data of the child bones.
After the scaled model space transformation data of the first bone and the model space transformation data of the child bones have been obtained through the above steps, the model with the scaled first bone is displayed on the graphical user interface. It can be understood that, since the model space transformation data of the child bones is not affected by the scaling of the first bone, only the part of the displayed model corresponding to the first bone shows the scaled deformation, and the other parts are not deformed.
In the method of this embodiment, by improving the prior-art way of transferring data between a parent bone and its child bones, the scaling operation of a bone is separated from the data-transfer calculation, so that the scaling operation of the parent bone is not transferred to the child bones. A parent bone can therefore be edited with arbitrary scaling along different axes without affecting its child bones, which allows the bones of the model to be adjusted freely.
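The overall flow of S201 to S204 can be sketched as follows (illustrative only; it reuses the Bone class from the sketch above, and the renderer object is an assumption, not an API defined by this application).

```python
def handle_scaling_instruction(first_bone, scaled_local_matrix, renderer):
    # S202: child model-space data is computed from the local data BEFORE scaling,
    # so the children never see the new scale of the first bone.
    pre_scale_model = first_bone.model_matrix()
    child_model_matrices = {child.name: pre_scale_model @ child.local_matrix
                            for child in first_bone.children}

    # S203: only now apply the scaled local data to the first bone itself.
    first_bone.local_matrix = scaled_local_matrix
    scaled_first_bone_model = first_bone.model_matrix()

    # S204: display the model using the updated matrices.
    renderer.draw(first_bone, scaled_first_bone_model, child_model_matrices)
```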
The following describes how the local transformation data and the model space transformation data in the above embodiment are calculated. In S202, obtaining the model space transformation data of the child bones of the first bone according to the local transformation data of the first bone before scaling includes: determining the model space transformation data of the first bone according to the local transformation data of the first bone before scaling, and determining the model space transformation data of the child bones of the first bone according to the model space transformation data of the first bone. This is described below with a specific example.
Assume that the first bone is the level-K bone and that a non-uniform scaling is applied to it. The local transformation data of the level-K bone is LevelKBoneLocalMatrix, which here denotes the local transformation data of the level-K bone after scaling. The scaled local transformation data may be obtained by the electronic device modifying the local transformation data of the first bone according to the scaling instruction.
The scaling data of the level-K bone is called LevelKBoneLocalMatrixScale, and it can be separated out of LevelKBoneLocalMatrix by the following formula, that is, the scaling of the bone is extracted from the scaled local matrix data:
LevelKBoneLocalMatrixScale=GetScale(LevelKBoneLocalMatrix) (2)
where GetScale may be any existing method or function for extracting scaling data from a matrix.
The local transformation data of the level-K bone with the scaling separated out is called NoScaleLevelKBoneLocalMatrix, which can also be regarded as the local transformation data of the level-K bone before scaling, and may be determined by the following formula:
NoScaleLevelKBoneLocalMatrix=RemoveScale(LevelKBoneLocalMatrix) (3)
where RemoveScale may be any existing method or function for removing the scaling data from local transformation data.
The model space transformation data of the level-K bone with the scaling separated out is called NoScaleLevelKBoneModelMatrix; it is model space transformation data that carries no scaling and can be determined by the following formula:
NoScaleLevelKBoneModelMatrix=RootBoneModelMatrix*Level2BoneLocalMatrix*Level3BoneLocalMatrix*...*NoScaleLevelKBoneLocalMatrix (4)
Assuming that a child bone of the level-K bone is at level M, the model space matrix LevelMBoneModelMatrix of the level-M bone can be determined by the following formula:
LevelMBoneModelMatrix=NoScaleLevelKBoneModelMatrix*LevelMBoneLocalMatrix (5)
where LevelMBoneLocalMatrix is the local transformation data of the level-M bone. Since NoScaleLevelKBoneModelMatrix is the model space transformation data of the level-K bone without scaling, the model space transformation data of the level-M bone does not contain the scaling data of its parent level-K bone and is therefore not affected by the scaling of the parent bone.
After the model space transformation data of the level-M bone has been calculated, the scaling data of the level-K bone can be merged back into the level-K bone's own model space transformation data, which is determined by the following formula:
LevelKBoneModelMatrix=LevelKBoneLocalMatrixScale*NoScaleLevelKBoneModelMatrix (6)
In the above method, through the bone-transfer formulas (2) to (6), the model space transformation data of the first bone before scaling is determined from the local transformation data of the first bone before scaling, the model space transformation data of the root bone of the model, and the local transformation data of the bones at all levels between the root bone and the first bone; the model space transformation data of the child bones of the first bone is then determined from the model space transformation data of the first bone before scaling and the local transformation data of the child bones. The scaling of the parent bone is first separated out, the unscaled data is transferred to the child bones, and the parent bone's scaling data is finally merged back, so that non-uniform scaling of a bone can be set flexibly by modifying the local transformation data of the parent bone at will, without affecting the child bones. With this method, an artist only needs to make one model, and different model appearances can be shown by adjusting the bones, which saves workload and resources and improves efficiency.
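A minimal sketch of formulas (2) to (6) is given below (illustrative only). It assumes 4x4 affine matrices whose upper-left 3x3 block carries rotation and scale, and it models GetScale/RemoveScale by factoring out the column norms; the application itself does not prescribe a particular implementation of these two functions.

```python
import numpy as np

def get_scale(local_matrix):                 # formula (2): GetScale
    s = np.linalg.norm(local_matrix[:3, :3], axis=0)
    scale = np.eye(4)
    scale[0, 0], scale[1, 1], scale[2, 2] = s
    return scale

def remove_scale(local_matrix):              # formula (3): RemoveScale
    out = local_matrix.copy()
    out[:3, :3] = local_matrix[:3, :3] / np.linalg.norm(local_matrix[:3, :3], axis=0)
    return out

def transfer(root_model, locals_between, scaled_k_local, child_locals):
    """locals_between: local matrices of the bones between the root bone and the
    level-K bone (excluding K); child_locals: local matrices of K's child bones."""
    k_scale = get_scale(scaled_k_local)
    no_scale_k_local = remove_scale(scaled_k_local)

    # formula (4): model-space data of the level-K bone without its scaling
    no_scale_k_model = root_model
    for local in locals_between:
        no_scale_k_model = no_scale_k_model @ local
    no_scale_k_model = no_scale_k_model @ no_scale_k_local

    # formula (5): the child bones only ever receive the unscaled parent data
    child_models = [no_scale_k_model @ child_local for child_local in child_locals]

    # formula (6): merge the scaling back into the level-K bone's own data
    k_model = k_scale @ no_scale_k_model
    return k_model, child_models
```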
In addition to the deformation of child bones caused by the scaling of a parent bone described in the above embodiments, another problem in adjusting the bones of a model is that scaling some bones causes different parts of the model to separate. For example, as shown in FIG. 4, after the waist of the model is shortened, the neck and head of the model separate from the body, and after the legs of the model are lengthened, the feet of the model no longer touch the ground.
To solve this problem, the present application combines the new bone transformation data transfer method proposed in the above embodiment with specific conditions that the bones of the model satisfy, and recalculates the parameters of each part of the scaled model, thereby avoiding the separation of different parts of the model. This is described by the following embodiments.
Fig. 5 is a second flowchart of the method for processing an in-game model provided by the present application. The method may be executed by a processing apparatus for an in-game model, which is implemented in software and/or hardware and may be an electronic device, for example a terminal device running a game application, such as a mobile phone, a tablet computer, or a personal computer, or a game server. The terminal device obtains a graphical user interface by executing the game application and rendering on a display; the graphical user interface includes the model, and the skeleton of the model is composed of a plurality of bones. The method includes the following steps:
S501, obtaining a scaling instruction for the model, where the scaling instruction is used to instruct scaling of the first bone in the model.
S502, obtaining the parameters of the first bone after scaling.
The model in this embodiment may be a character model, an animal model, or a model of other virtual images set in a game, which is not limited in this embodiment. For convenience of explanation, the present embodiment takes a character model as an example.
The scaling instruction for the model may be triggered by a user. For example, the user drags a part of the model in the graphical user interface to scale it, that is, performs a scaling operation on the part corresponding to the first bone, and the electronic device generates the scaling instruction in response to the user's scaling operation on the first bone of the model.
The scaling instruction for the model may also be triggered automatically while the game is running. For example, when the model enters a preset scene or performs a preset action, the generation of a scaling instruction for the first bone is triggered.
It is understood that the first bone may include one or more bones. For example, the bones of the nose of the character model may be added to one group, and when the nose is scaled, all the bones in the group are scaled.
After acquiring the scaling instruction, the electronic device acquires parameters of the scaled first bone according to the scaling instruction, wherein the parameters of the bone may include height and/or length, such as height of a head bone, length of an upper body bone, and the like.
S503, respectively determining parameters of the other bones in the scaled model according to the parameters of the first bone and the overall skeleton conditions of the model; wherein the overall skeleton conditions are used to constrain the relative relationships of the bones of the model.
After the first bone is scaled, the parameters of the other bones, that is, the heights and/or lengths at which the other bones of the model should be located after the scaling, can be calculated from the scaled parameters of the first bone in combination with the bone-transfer formulas (2) to (6) of the foregoing embodiment, that is, from the parameters of the first bone, the local transformation data of the first bone before scaling, the model space transformation data of the root bone of the model, the local transformation data of the bones at all levels between the root bone and the first bone, the local transformation data of the child bones of the first bone, and the overall skeleton conditions of the model.
S504, displaying the scaled model on the graphical user interface according to the parameters of the first bone and the parameters of the other bones in the scaled model.
After the scaled parameters of the first bone and the parameters of the other bones in the scaled model are obtained through the above steps, the scaled model can be displayed on the graphical user interface according to these parameters.
Because the parameters of the other bones are recalculated from the bone-transfer formulas and the overall skeleton conditions of the model after the first bone is scaled, the parts corresponding to all the bones of the displayed model fit together correctly, and no separation occurs between different parts of the model.
The following describes a method for calculating parameters of the skeleton in S503 with reference to a specific example, which takes a character model as an example.
For a character model, each bone in a standard skeleton has the following parameters: the height of the bone center point, i.e., the point dividing the upper and lower body (CenterHeight), the height of the head bone (HeadHeight), the length of the upper body of the skeleton (UpperLength), the length of the lower body of the skeleton (LowerLength), the height of the foot-sole bone (BottomBoneHeight), the height of the neck bone (TopBoneHeight), and so on. Here, HeadHeight refers to the lower edge of the head bone, and TopBoneHeight refers to the upper edge of the neck bone.
Under a standard skeleton, the above parameters satisfy the following overall skeleton conditions:
CenterHeight=LowerLength
HeadHeight=CenterHeight+UpperLength
BottomBoneHeight=0
TopBoneHeight=HeadHeight
these parameters may change as the bone is adjusted, requiring recalculation. After the human skeleton is adjusted on the basis of the standard skeleton, the following parameters are added: shoe sole height (shoehight), different shoe soles, different heights; scaling and movement of the lower body of the skeleton relative to a standard skeleton (BoneLowerTransformOffset); scaling and movement of the upper body of the skeleton relative to a standard skeleton (BoneUpperpransformOffset). Obviously, the overall skeleton condition satisfied by the adjusted skeleton should be:
the height of the adjusted central point of the skeleton is equal to the sum of the length of the lower half of the adjusted skeleton and the height of the adjusted sole skeleton; the height of the head skeleton after adjustment is equal to the sum of the height of the center point after adjustment and the length of the upper half of the skeleton after adjustment; the height of the adjusted neck bone is equal to the height of the adjusted head bone.
In order to fit the bones correctly, new heights and/or lengths of the bones need to be calculated.
LowerLengthPost=CenterHeight-BottomBoneHeightPost (7)
UpperLengthPost=TopBoneHeightPost-CenterHeight (8)
CenterHeightPost=CenterHeight+ShoeHeight+LowerLengthPost-LowerLength (9)
HeadHeightPost=CenterHeightPost+UpperLengthPost-UpperLength (10)
where LowerLengthPost is the adjusted length of the lower body; BottomBoneHeightPost is the adjusted height of the foot-sole bone; UpperLengthPost is the adjusted length of the upper body; TopBoneHeightPost is the adjusted height of the neck bone; CenterHeightPost is the adjusted height of the bone center point; and HeadHeightPost is the adjusted height of the head bone.
BottomBoneHeightPost and TopBoneHeightPost can be obtained by computing a matrix from the bone adjustment amounts BoneLowerTransformOffset and BoneUpperTransformOffset together with the bone-transfer formulas (2) to (6) of the foregoing embodiment, and then taking the two corresponding heights out of that matrix.
After the calculation according to equations (7) to (10), the body of the character model is placed at CenterHeightPost so that the character's feet stay attached to the ground, and the head of the model is placed at HeadHeightPost so that the head stays attached to the body, as shown for example in FIG. 6. With the new bone-transfer method of this embodiment, the head and the body of the model can be made separately, and artists can make different clothing sets, headwear, and other equipment for the same head, which further saves workload and resources.
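A minimal sketch of equations (7) to (10) (illustrative only; the argument values in the example call are made up, and BottomBoneHeightPost/TopBoneHeightPost are assumed to have been obtained from the bone-transfer calculation described above):

```python
def refit_skeleton(center_height, lower_length, upper_length,
                   bottom_bone_height_post, top_bone_height_post, shoe_height):
    lower_length_post = center_height - bottom_bone_height_post               # (7)
    upper_length_post = top_bone_height_post - center_height                  # (8)
    center_height_post = (center_height + shoe_height
                          + lower_length_post - lower_length)                 # (9)
    head_height_post = center_height_post + upper_length_post - upper_length  # (10)
    return center_height_post, head_height_post

# Example call with made-up values: the foot-sole bone moved down by 0.1 (longer legs)
# and a shoe sole of height 0.05 added.
center_post, head_post = refit_skeleton(0.9, 0.9, 0.7, -0.1, 1.6, 0.05)
print(center_post, head_post)  # the body is placed at center_post, the head at head_post
```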
The following describes a practical application of the method provided by the present application. Since adjusting the transformation range of bones and combining character parts are both supported, there are many ways to provide the data describing the range over which a character's bones can be adjusted, and the approach can be chosen according to the needs of the project, for example by editing and exporting in 3ds Max. To provide as much data as possible and to make editing convenient within the game, an editor may be provided for the author to edit the relevant parameters. The overall workflow is then as follows: an artist makes a character model in a tool such as 3ds Max and binds the bones; the model and skeleton are exported and imported into the editor; the transformation ranges, such as displacement and/or scaling of the bones, are set in the editor; the model, skeleton, and transformation ranges are imported into the game; and the model can then be transformed in real time in the game according to the method of the present application.
The flow of editing and exporting data in the editor is as follows:
and editing a skeleton transformation range and adding a group, selecting a skeleton to be edited in a skeleton list for editing, and editing three axial movement and scaling ranges.
After selecting the bone, the range of movement and scaling of each axial direction of the bone can be set in the bone adjustment interface, and the bone can be mirrored or a plurality of bones can be added into a group at the same time, so that the subsequent adjustment is facilitated.
And editing the groups, and after the transformation and the moving range of each skeleton are set, a plurality of proper skeletons can be programmed into the same group so as to be simultaneously adjusted, for example, the nose of the face is influenced by a plurality of skeletons, and after the skeletons are programmed into the same group, the movement and the scaling of the skeletons can be simultaneously controlled, so that the effect of simultaneously adjusting the nose is achieved.
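As an illustration of the kind of data the editor might export (the field names and values below are assumptions, not a format defined by this application), the per-bone ranges and groups could be laid out as follows:

```python
bone_adjustment_config = {
    "bones": {
        # per-axis (min, max) ranges for movement and scaling of each editable bone
        "nose_bridge": {"move":  {"x": (-0.01, 0.01), "y": (-0.02, 0.02), "z": (0.0, 0.0)},
                        "scale": {"x": (0.8, 1.2),    "y": (0.8, 1.2),    "z": (0.9, 1.1)}},
        "nose_tip":    {"move":  {"x": (-0.01, 0.01), "y": (-0.01, 0.03), "z": (0.0, 0.0)},
                        "scale": {"x": (0.8, 1.3),    "y": (0.8, 1.3),    "z": (0.9, 1.1)}},
    },
    # a group lets several bones (e.g. the whole nose) be moved and scaled together
    "groups": {"nose": ["nose_bridge", "nose_tip"]},
}
```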
Fig. 7 is a schematic structural diagram of a processing device of an in-game model according to the present application. The device obtains a graphical user interface through rendering of the game application on the display, wherein the graphical user interface comprises a model, and a skeleton of the model consists of a plurality of skeletons. As shown in fig. 7, the apparatus 70 includes:
an obtaining module 701, configured to obtain a scaling instruction for a model, where the scaling instruction is used to instruct to scale a first bone in the model;
a processing module 702, configured to obtain model space transformation data of a sub-skeleton of a first skeleton according to local transformation data before scaling of the first skeleton; determining the scaled model space transformation data of the first skeleton according to the scaled local transformation data of the first skeleton;
a display module 703, configured to display the scaled model on the graphical user interface according to the scaled model space transformation data of the first skeleton and the model space transformation data of the sub-skeleton.
In one possible implementation, the processing module 702 is configured to:
determining model space transformation data of the first skeleton according to the local transformation data before the scaling of the first skeleton;
from the model space transformation data of the first bone, model space transformation data of sub-bones of the first bone are determined.
In one possible implementation, the processing module 702 is configured to:
the model space transformation data before the first bone scaling is determined from the local transformation data before the first bone scaling, the model space transformation data of the root bone of the model, and the local transformation data of bones of all levels between the root bone and the first bone.
In one possible implementation, the processing module 702 is configured to:
determining model space transformation data of a sub-bone of the first bone from the model space transformation data of the first bone before scaling and local transformation data of the sub-bone of the first bone.
In one possible implementation, the processing module 702 is configured to:
and modifying the local transformation data of the first skeleton according to the scaling instruction to obtain the scaled local transformation data of the first skeleton.
In one possible implementation, the obtaining module 701 is configured to:
generating a scaling instruction in response to a scaling operation of a first skeleton of a model by a user; alternatively, a zoom instruction for the first skeleton is generated when the model enters a preset scene or performs a preset action.
The apparatus provided in this embodiment may be used to execute the method embodiment shown in fig. 2; the implementation principle and the technical effect are similar and are not described here again.
Fig. 8 is a schematic structural diagram of a processing apparatus of an in-game model according to the present application. The device obtains a graphical user interface through rendering of the game application on the display, wherein the graphical user interface comprises a model, and a skeleton of the model consists of a plurality of skeletons. As shown in fig. 8, the apparatus 80 includes:
an obtaining module 801, configured to obtain a scaling instruction for the model, where the scaling instruction is used to instruct to scale a first bone in the model; and obtaining parameters of the scaled first skeleton;
a processing module 802, configured to determine parameters of other bones in the scaled model according to the parameter of the first bone and an overall skeleton condition of the model; wherein the overall skeleton condition is used for constraining the relative relationship of the skeleton of the model;
a display module 803, configured to display the scaled model on the graphical user interface according to the parameter of the first bone and the scaled parameters of the other bones.
In one possible implementation, the processing module 802 is configured to determine parameters of other bones in the scaled model according to parameters of the first bone, local transformation data before scaling of the first bone, model space transformation data of a root bone of the model, local transformation data of bones at all levels between the root bone and the first bone, local transformation data of sub-bones of the first bone, and overall skeletal conditions of the model.
In one possible implementation, the overall skeleton conditions include at least one of:
the height of the center point of the skeleton of the model is equal to the sum of the length of the lower half of the skeleton of the model and the height of the sole skeleton of the model;
the height of the head bone of the model is equal to the sum of the height of the center point and the length of the upper body of the bone of the model;
the height of the neck bone of the model is equal to the height of the head bone of the model.
In one possible implementation, the parameters of the bone include height and/or length.
In one possible implementation, the obtaining module 801 is configured to:
generating a scaling instruction in response to a scaling operation of a first skeleton of a model by a user; alternatively, a zoom instruction for the first skeleton is generated when the model enters a preset scene or performs a preset action.
The apparatus provided in this embodiment may be used to execute the method embodiment shown in fig. 5; the implementation principle and the technical effect are similar and are not described here again.
Fig. 9 is a first schematic structural diagram of an electronic device provided in the present application. As shown in fig. 9, the electronic device 90 includes a memory 901 and a processor 902, and the memory 901 and the processor 902 are connected by a bus 903.
The memory 901 is used for storing computer programs;
the processor 902 is adapted to carry out the method of the embodiment of fig. 2 as described above when the computer program is executed.
Fig. 10 is a schematic structural diagram of an electronic device according to the present application. As shown in fig. 10, the electronic apparatus 100 includes a memory 1001 and a processor 1002, the memory 1001 and the processor 1002 being connected by a bus 1003;
the memory 1001 is used for storing computer programs;
the processor 1002 is adapted to implement the method of the embodiment of fig. 5 as described above when the computer program is executed.
The present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method as described above in the embodiment of fig. 2.
The present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method as described above in the embodiment of fig. 5.
Those of ordinary skill in the art will understand that: all or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. The program may be stored in a computer-readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the embodiments of the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the embodiments of the present application fall within the scope of the claims of the present application and their equivalents, the present application is also intended to encompass such modifications and variations.
In the present application, the terms "include" and variations thereof may refer to non-limiting inclusions; the term "or" and variations thereof may mean "and/or". The terms "first," "second," and the like in this application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. In the present application, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (15)

1. A method of processing an in-game model, the method comprising:
obtaining a scaling instruction for a model, wherein a skeleton of the model is composed of a plurality of bones, and the scaling instruction is used for instructing to scale a first bone in the model;
obtaining model space transformation data of the sub-bones of the first bone according to the local transformation data of the first bone before scaling;
determining scaled model space transformation data of the first bone according to the scaled local transformation data of the first bone;
displaying the scaled model on a graphical user interface according to the scaled model space transformation data of the first bone and the model space transformation data of the sub-bones.
2. The method of claim 1, wherein the obtaining model space transformation data of the sub-bones of the first bone according to the local transformation data of the first bone before scaling comprises:
determining model space transformation data of the first bone before scaling according to the local transformation data of the first bone before scaling;
determining model space transformation data of the sub-bones of the first bone according to the model space transformation data of the first bone before scaling.
3. The method of claim 2, wherein the determining model space transformation data of the first bone before scaling according to the local transformation data of the first bone before scaling comprises:
determining the model space transformation data of the first bone before scaling according to the local transformation data of the first bone before scaling, model space transformation data of a root bone of the model, and local transformation data of bones at all levels between the root bone and the first bone.
4. The method of claim 2, wherein the determining model space transformation data of the sub-bones of the first bone according to the model space transformation data of the first bone before scaling comprises:
determining the model space transformation data of the sub-bones of the first bone according to the model space transformation data of the first bone before scaling and local transformation data of the sub-bones of the first bone.
5. The method of claim 1, wherein before the determining model space transformation data of the first bone after scaling according to the local transformation data of the first bone after scaling, the method further comprises:
modifying the local transformation data of the first bone according to the scaling instruction, to obtain the local transformation data of the first bone after scaling.
6. The method of any one of claims 1 to 5, wherein the obtaining a scaling instruction for the model comprises:
generating the scaling instruction in response to a scaling operation on the first bone of the model;
or,
generating the scaling instruction for the first bone when the model enters a preset scene or performs a preset action.
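For illustration only, the two trigger paths of claim 6 might look like the following sketch; ScalingInstruction, on_user_scale, on_preset_trigger and the rule table are assumed names and structures, not part of the claims.

```python
from dataclasses import dataclass


@dataclass
class ScalingInstruction:
    bone_name: str
    scale: tuple          # (sx, sy, sz)


def on_user_scale(bone_name, scale):
    """Path 1: generated in response to a scaling operation on the first bone,
    e.g. the player dragging a scale handle in a character editor."""
    return ScalingInstruction(bone_name, scale)


def on_preset_trigger(event_key, preset_rules):
    """Path 2: generated automatically when the model enters a preset scene or
    performs a preset action; preset_rules maps events to bone/scale pairs."""
    rule = preset_rules.get(event_key)
    return ScalingInstruction(rule["bone"], rule["scale"]) if rule else None
```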
7. A method of processing an in-game model, the method comprising:
obtaining a scaling instruction for a model, wherein a skeleton of the model is composed of a plurality of bones, and the scaling instruction is used to instruct scaling of a first bone in the model;
obtaining parameters of the first bone after scaling;
determining, according to the parameters of the first bone and an overall skeleton condition of the model, parameters of other bones in the model after scaling, wherein the overall skeleton condition is used to constrain relative relationships among bones of the skeleton of the model;
displaying the scaled model on a graphical user interface according to the parameters of the first bone and the parameters of the other bones in the model after scaling.
8. The method of claim 7, wherein the determining, according to the parameters of the first bone and the overall skeleton condition of the model, parameters of other bones in the model after scaling comprises:
determining the parameters of the other bones in the model after scaling according to the parameters of the first bone, local transformation data of the first bone before scaling, model space transformation data of a root bone of the model, local transformation data of bones at all levels between the root bone and the first bone, local transformation data of a sub-bone of the first bone, and the overall skeleton condition of the model.
9. The method of claim 7, wherein the overall skeleton condition comprises at least one of the following:
the height of the center point of the skeleton of the model is equal to the sum of the lower-body length of the skeleton of the model and the height of a foot-sole bone of the model;
the height of a head bone of the model is equal to the sum of the height of the center point and the upper-body length of the skeleton of the model;
the height of a neck bone of the model is equal to the height of the head bone of the model.
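By way of a toy illustration only, the conditions enumerated in claim 9 can be propagated as below after a bone parameter changes; the parameter names are assumptions made for this example and do not come from the claims.

```python
def propagate_skeleton_conditions(lower_body_length, foot_sole_height, upper_body_length):
    """Re-derive dependent bone heights from the overall skeleton condition."""
    # Center point height = lower-body length + foot-sole bone height.
    center_height = lower_body_length + foot_sole_height
    # Head bone height = center point height + upper-body length.
    head_height = center_height + upper_body_length
    # Neck bone height = head bone height.
    neck_height = head_height
    return {"center": center_height, "head": head_height, "neck": neck_height}


# Example: scaling a thigh bone changes the lower-body length, and the other
# bone parameters are re-derived so that the figure stays self-consistent.
print(propagate_skeleton_conditions(lower_body_length=0.9,
                                    foot_sole_height=0.1,
                                    upper_body_length=0.7))
```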
10. The method of claim 7, wherein the parameters of a bone comprise a height and/or a length of the bone.
11. The method of any one of claims 7 to 10, wherein the obtaining a scaling instruction for the model comprises:
generating the scaling instruction in response to a scaling operation on the first bone of the model;
or,
generating the scaling instruction for the first bone when the model enters a preset scene or performs a preset action.
12. An apparatus for processing an in-game model, comprising:
an obtaining module, configured to obtain a scaling instruction for a model, wherein a skeleton of the model is composed of a plurality of bones, and the scaling instruction is used to instruct scaling of a first bone in the model;
a processing module, configured to obtain model space transformation data of a sub-bone of the first bone according to local transformation data of the first bone before scaling, and to determine model space transformation data of the first bone after scaling according to local transformation data of the first bone after scaling;
a display module, configured to display the scaled model on a graphical user interface according to the model space transformation data of the first bone after scaling and the model space transformation data of the sub-bone.
13. An apparatus for processing an in-game model, comprising:
an obtaining module, configured to obtain a scaling instruction for a model, wherein a skeleton of the model is composed of a plurality of bones, and the scaling instruction is used to instruct scaling of a first bone in the model, and to obtain parameters of the first bone after scaling;
a processing module, configured to determine, according to the parameters of the first bone and an overall skeleton condition of the model, parameters of other bones in the model after scaling, wherein the overall skeleton condition is used to constrain relative relationships among bones of the skeleton of the model;
a display module, configured to display the scaled model on a graphical user interface according to the parameters of the first bone and the parameters of the other bones after scaling.
14. An electronic device comprising a memory and a processor, the memory and the processor being connected;
the memory is used for storing a computer program;
the processor is configured to implement the method according to any one of claims 1-11 when executing the computer program.
15. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method according to any one of claims 1-11.
CN202010858322.5A 2020-08-24 2020-08-24 Method, device, equipment and storage medium for processing model in game Active CN111973988B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010858322.5A CN111973988B (en) 2020-08-24 2020-08-24 Method, device, equipment and storage medium for processing model in game
CN202311694353.1A CN117861219A (en) 2020-08-24 2020-08-24 Method, device, equipment and storage medium for processing model in game

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010858322.5A CN111973988B (en) 2020-08-24 2020-08-24 Method, device, equipment and storage medium for processing model in game

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202311694353.1A Division CN117861219A (en) 2020-08-24 2020-08-24 Method, device, equipment and storage medium for processing model in game

Publications (2)

Publication Number Publication Date
CN111973988A true CN111973988A (en) 2020-11-24
CN111973988B CN111973988B (en) 2024-02-06

Family

ID=73443984

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202010858322.5A Active CN111973988B (en) 2020-08-24 2020-08-24 Method, device, equipment and storage medium for processing model in game
CN202311694353.1A Pending CN117861219A (en) 2020-08-24 2020-08-24 Method, device, equipment and storage medium for processing model in game

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN202311694353.1A Pending CN117861219A (en) 2020-08-24 2020-08-24 Method, device, equipment and storage medium for processing model in game

Country Status (1)

Country Link
CN (2) CN111973988B (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040021660A1 (en) * 2002-08-02 2004-02-05 Victor Ng-Thow-Hing Anthropometry-based skeleton fitting
US20120058824A1 (en) * 2010-09-07 2012-03-08 Microsoft Corporation Scalable real-time motion recognition
US20170032055A1 (en) * 2015-07-27 2017-02-02 Technische Universiteit Delft Linear Blend Skinning Weight Optimization Utilizing Skeletal Pose Sampling
CN109191570A (en) * 2018-09-29 2019-01-11 网易(杭州)网络有限公司 Method of adjustment, device, processor and the terminal of game role facial model
CN110111417A (en) * 2019-05-15 2019-08-09 浙江商汤科技开发有限公司 Generation method, device and the equipment of three-dimensional partial body's model
CN110456905A (en) * 2019-07-23 2019-11-15 广东虚拟现实科技有限公司 Positioning and tracing method, device, system and electronic equipment
CN111062864A (en) * 2019-12-20 2020-04-24 网易(杭州)网络有限公司 Animation model scaling method and device, electronic equipment and storage medium
CN111563945A (en) * 2020-04-30 2020-08-21 完美世界(北京)软件科技发展有限公司 Generation method, device and equipment of character morphing animation and readable storage medium

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113658319A (en) * 2021-05-17 2021-11-16 海南师范大学 Method and device for gesture migration between heterogeneous frameworks
CN113658319B (en) * 2021-05-17 2023-08-04 海南师范大学 Gesture migration method and device between heterogeneous frameworks
CN115690282A (en) * 2022-12-30 2023-02-03 海马云(天津)信息技术有限公司 Virtual role adjusting method and device

Also Published As

Publication number Publication date
CN117861219A (en) 2024-04-12
CN111973988B (en) 2024-02-06

Similar Documents

Publication Publication Date Title
US10262447B2 (en) Systems and methods for virtual entity animation
CN111714880B (en) Picture display method and device, storage medium and electronic device
US20230334744A1 (en) Method and apparatus for generating walk animation of virtual role, device and storage medium
CN106846499B (en) Virtual model generation method and device
US11816772B2 (en) System for customizing in-game character animations by players
CN112669414B (en) Animation data processing method and device, storage medium and computer equipment
CN111973988A (en) Game model processing method, device, equipment and storage medium
CN108837510B (en) Information display method and device, storage medium and electronic device
JP4936522B2 (en) Image processing method and image processing apparatus
CN104867171A (en) Transition animation generating method for three-dimensional roles
CN103824316A (en) Method and equipment for generating action pictures for object
JP2019204476A (en) Image creation device, image creation method, and program
CN110570500B (en) Character drawing method, device, equipment and computer readable storage medium
CN112348931A (en) Foot reverse motion control method, device, equipment and storage medium
US20230120883A1 (en) Inferred skeletal structure for practical 3d assets
CN116843809A (en) Virtual character processing method and device
US20220172431A1 (en) Simulated face generation for rendering 3-d models of people that do not exist
CN116385605A (en) Method and device for generating flight animation of target object and electronic equipment
JP2012247953A (en) Program, information storage medium, information processing system and information processing method
CN112090076A (en) Game character action control method, device, equipment and medium
WO2024212702A1 (en) Avatar configuration method and apparatus, device, and storage medium
Dovramadjiev Motion capture (MoCAP) and 3D computer design for ergonomics needs
CN113256770B (en) Skeleton-based animation generation method and device and electronic equipment
CN116617663B (en) Action instruction generation method and device, storage medium and electronic equipment
EP4420752A1 (en) Apparatus, systems and methods for animation data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant