CN115546356A - Animation generation method and device, computer equipment and storage medium - Google Patents
- Publication number
- CN115546356A (application number CN202211160234.3A)
- Authority
- CN
- China
- Prior art keywords
- animation
- target
- configuration
- target animation
- template file
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
Abstract
The embodiments of the present application belong to the field of big data and relate to an animation generation method comprising the following steps: determining whether an animation configuration request triggered by a user has been received; if so, obtaining target animation scene information from the request and displaying the animation configuration level information corresponding to that scene information; receiving the target animation configuration level information the user selects from the displayed level information; obtaining the target animation template file corresponding to both the target animation scene information and the target animation configuration level information; displaying a preset parameter configuration page based on the target animation template file and receiving the animation configuration parameters the user enters on that page; and configuring the target animation template file based on the animation configuration parameters to generate the target animation. The present application further provides an animation generation apparatus, a computer device, and a storage medium, and also relates to blockchain technology, in which the target animation can be stored. The method and apparatus improve animation generation efficiency.
Description
Technical Field
The present application relates to the field of big data technologies, and in particular, to an animation generation method and apparatus, a computer device, and a storage medium.
Background
With the rapid development of the information society and of front-end development technology, data visualization is applied ever more widely. Data visualization explains data intuitively through graphical means, enabling clear and effective interaction with the user. Animation is an important data visualization technique and is widely used in business systems such as the e-commerce and transaction systems of internet enterprises.
Animation generation technology can be used to present animation effects in a terminal application. However, existing animation generation technology does not support dynamic configuration in code: once developers export animation resources, those resources are fixed. In some business scenarios the animations differ only slightly, yet a developer must still design a complete set of animation resources for each scenario. This imposes a heavy workload on developers, consumes considerable labor and time, and results in low animation generation efficiency.
Disclosure of Invention
Embodiments of the present application provide an animation generation method and apparatus, a computer device, and a storage medium, so as to solve the technical problems of existing animation generation methods: heavy developer workload, high labor and time cost for generating animations, and low animation generation efficiency.
In order to solve the above technical problem, an embodiment of the present application provides an animation generation method, which adopts the following technical solutions:
judging whether an animation configuration request triggered by a user is received; the animation configuration request carries target animation scene information;
if so, acquiring the target animation scene information from the animation configuration request, and displaying animation configuration level information corresponding to the target animation scene information; wherein a plurality of pieces of animation configuration level information are provided;
receiving target animation configuration level information selected by the user from the animation configuration level information;
acquiring a target animation template file corresponding to the target animation scene information and the target animation configuration level information;
displaying a preset parameter configuration page based on the target animation template file, and receiving animation configuration parameters input by the user on the parameter configuration page; the parameter configuration page is a parameter filling page corresponding to configurable animation elements contained in the target animation template file;
and configuring the target animation template file based on the animation configuration parameters to generate corresponding target animation.
Further, the step of obtaining a target animation template file corresponding to both the target animation scene information and the target animation configuration level information specifically includes:
calling a preset animation template file library;
performing one-to-one data comparison between the target animation scene information and each piece of animation scene information contained in the animation template file library based on a preset data parallel comparison instruction to obtain a plurality of corresponding data comparison results;
determining specified animation scene information which is the same as the target animation scene information from the animation template file library based on the data comparison result;
acquiring first animation template files corresponding to the specified animation scene information from the animation template file library; wherein a plurality of first animation template files are provided;
screening out a second animation template file corresponding to the target animation configuration level information from the first animation template files based on the target animation configuration level information;
and taking the second animation template file as the target animation template file.
Further, the step of configuring the target animation template file based on the animation configuration parameters to generate a corresponding target animation specifically includes:
acquiring configurable animation elements in the target animation template file;
determining a parameter filling position corresponding to the configurable animation element from the target animation template file;
judging whether the parameter filling position is filled with animation parameters corresponding to the configurable animation elements;
if the animation parameters corresponding to the configurable animation elements are not filled, filling the animation configuration parameters into the parameter filling positions to obtain a first target animation file;
and generating the target animation based on the first target animation file.
Further, the step of generating the target animation based on the first target animation file specifically includes:
compiling the first target animation file to obtain a corresponding compiled file;
calling a preset renderer;
and rendering the compiled file based on the renderer to generate the target animation.
Further, after the step of determining whether the parameter filling position is filled with animation parameters corresponding to the configurable animation element, the method further includes:
if the animation parameters corresponding to the configurable animation elements are filled, replacing the animation parameters with the animation configuration parameters to obtain a second target animation file;
and generating the target animation based on the second target animation file.
Further, the animation generation method further includes:
calling a preset video program to record the generation operation steps of the target animation and generate a corresponding initial video;
acquiring an explanation voice corresponding to the generation operation step of the target animation;
calling a preset voice input program to add the explanation voice to the initial video to obtain an operation demonstration video corresponding to the target animation;
and storing the operation demonstration video.
Further, the animation generation method further includes:
acquiring a code programming package corresponding to the target animation template file;
acquiring a communication mode of a target user;
and pushing the code programming package to user equipment corresponding to the target user based on the communication mode.
In order to solve the above technical problem, an embodiment of the present application further provides an animation generation apparatus, which adopts the following technical solutions:
the judging module is used for judging whether an animation configuration request triggered by a user is received or not; the animation configuration request carries target animation scene information;
the first obtaining module is used for, if the animation configuration request is received, obtaining the target animation scene information from the animation configuration request and displaying animation configuration level information corresponding to the target animation scene information; wherein a plurality of pieces of animation configuration level information are provided;
the first receiving module is used for receiving target animation configuration level information selected by the user from the animation configuration level information;
the second acquisition module is used for acquiring a target animation template file corresponding to the target animation scene information and the target animation configuration level information;
the second receiving module is used for displaying a preset parameter configuration page based on the target animation template file and receiving animation configuration parameters input by the user on the parameter configuration page; the parameter configuration page is a parameter filling page corresponding to configurable animation elements contained in the target animation template file;
and the first generation module is used for carrying out configuration processing on the target animation template file based on the animation configuration parameters to generate corresponding target animation.
In order to solve the above technical problem, an embodiment of the present application further provides a computer device, which adopts the following technical solutions:
judging whether an animation configuration request triggered by a user is received; the animation configuration request carries target animation scene information;
if yes, acquiring the target animation scene information from the animation configuration request, and displaying animation configuration level information corresponding to the target animation scene information; wherein a plurality of pieces of animation configuration level information are provided;
receiving target animation configuration level information selected by the user from the animation configuration level information;
acquiring a target animation template file corresponding to the target animation scene information and the target animation configuration level information;
displaying a preset parameter configuration page based on the target animation template file, and receiving animation configuration parameters input by the user on the parameter configuration page; the parameter configuration page is a parameter filling page corresponding to configurable animation elements contained in the target animation template file;
and configuring the target animation template file based on the animation configuration parameters to generate corresponding target animation.
In order to solve the foregoing technical problem, an embodiment of the present application further provides a computer-readable storage medium, which adopts the following technical solutions:
judging whether an animation configuration request triggered by a user is received; the animation configuration request carries target animation scene information;
if so, acquiring the target animation scene information from the animation configuration request, and displaying animation configuration level information corresponding to the target animation scene information; wherein a plurality of pieces of animation configuration level information are provided;
receiving target animation configuration level information selected by the user from the animation configuration level information;
acquiring a target animation template file corresponding to the target animation scene information and the target animation configuration level information;
displaying a preset parameter configuration page based on the target animation template file, and receiving animation configuration parameters input by the user on the parameter configuration page; the parameter configuration page is a parameter filling page corresponding to configurable animation elements contained in the target animation template file;
and configuring the target animation template file based on the animation configuration parameters to generate corresponding target animation.
Compared with the prior art, the embodiment of the application mainly has the following beneficial effects:
when receiving an animation configuration request triggered by a user, acquiring target animation scene information from the animation configuration request, displaying animation configuration level information corresponding to the target animation scene information, receiving target animation configuration level information selected by the user from the animation configuration level information, then acquiring a target animation template file corresponding to the target animation scene information and the target animation configuration level information, subsequently displaying a preset parameter configuration page based on the target animation template file, receiving animation configuration parameters input by the user on the parameter configuration page, and finally configuring the target animation template file based on the animation configuration parameters to generate corresponding target animation. According to the method and the device, the required target animation can be generated only by determining the corresponding target animation template file according to the target animation scene information and the target animation configuration level information input by the user and then correspondingly configuring the target animation template file based on the animation configuration parameters input by the user, so that the generation efficiency of the animation is effectively improved, and the use experience of the user is improved.
Drawings
In order to illustrate the solution of the present application more clearly, the drawings needed for describing its embodiments are briefly introduced below. The drawings described below depict only some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is an exemplary system architecture diagram in which the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of an animation generation method according to the present application;
FIG. 3 is a schematic block diagram of one embodiment of an animation generation apparatus according to the present application;
FIG. 4 is a block diagram of one embodiment of a computer device according to the present application.
Detailed Description
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs; the terminology used in the description of the application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application; the terms "including" and "having," and any variations thereof, in the description and claims of this application and the description of the above figures are intended to cover non-exclusive inclusions. The terms "first," "second," and the like in the description and claims of this application or in the above-described drawings are used for distinguishing between different objects and not for describing a particular order.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
A user may use terminal devices 101, 102, 103 to interact with a server 105 over a network 104 to receive or send messages or the like. The terminal devices 101, 102, 103 may have various communication client applications installed thereon, such as a web browser application, a shopping application, a search application, an instant messaging tool, a mailbox client, social platform software, and the like.
The terminal devices 101, 102, 103 may be various electronic devices having a display screen and supporting web browsing, including but not limited to a smart phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop portable computer, a desktop computer, and the like.
The server 105 may be a server providing various services, such as a background server providing support for pages displayed on the terminal devices 101, 102, 103.
It should be noted that the animation generation method provided in the embodiments of the present application is generally executed by a server/terminal device, and accordingly, the animation generation apparatus is generally disposed in the server/terminal device.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to FIG. 2, a flow diagram of one embodiment of an animation generation method according to the present application is shown. The animation generation method comprises the following steps:
step S201, judging whether an animation configuration request triggered by a user is received; and the animation configuration request carries target animation scene information.
In this embodiment, the electronic device (for example, the server/terminal device shown in fig. 1) on which the animation generation method operates may obtain the animation configuration request through a wired or wireless connection. It should be noted that the wireless connection may include, but is not limited to, a 3G/4G/5G connection, Wi-Fi connection, Bluetooth connection, WiMAX connection, ZigBee connection, UWB (ultra-wideband) connection, and other now known or later developed wireless connection means. For example, if the user wants to create an animation for a financial animation scene whose ID is 008, the target animation scene information is "financial" or "008".
Step S202, if yes, the target animation scene information is obtained from the animation configuration request, and animation configuration level information corresponding to the target animation scene information is displayed; wherein a plurality of pieces of animation configuration level information are provided.
In this embodiment, the animation configuration request may be parsed to extract the target animation scene information from it. The animation configuration level information corresponding to the target animation scene information can be displayed on the current page, and corresponding selection options can further be displayed to prompt the user to choose among the levels. The animation configuration level information may include, for example, a primary level, an intermediate level, and an advanced level.
Step S203, receiving the target animation configuration level information selected by the user from the animation configuration level information.
In this embodiment, after the user selects the required target animation configuration level information from the animation configuration level information, a corresponding information selection instruction is generated, and the target animation configuration level information selected by the user may be determined based on the information selection instruction.
And step S204, acquiring a target animation template file corresponding to the target animation scene information and the target animation configuration level information.
In this embodiment, the specific implementation process of obtaining the target animation template file corresponding to the target animation scene information and the target animation configuration level information is described in further detail in the following specific embodiments, and will not be described in detail herein.
Step S205, displaying a preset parameter configuration page based on the target animation template file, and receiving animation configuration parameters input by the user on the parameter configuration page; and the parameter configuration page is a parameter filling page corresponding to the configurable animation elements contained in the target animation template file.
In this embodiment, after the user inputs the corresponding animation configuration parameters in the parameter configuration page, the user may click an information filling completion button in the parameter configuration page, and then the electronic device may obtain the animation configuration parameters input by the user in the parameter configuration page based on a trigger action of the information filling completion button.
And step S206, configuring the target animation template file based on the animation configuration parameters to generate corresponding target animation.
In this embodiment, the specific implementation process of configuring the target animation template file based on the animation configuration parameters to generate the corresponding target animation is described in further detail in the following specific embodiments, and will not be described in detail herein.
In this embodiment, when an animation configuration request triggered by a user is received, the target animation scene information is first obtained from the request and the corresponding animation configuration level information is displayed. The target animation configuration level information selected by the user is then received, and the target animation template file corresponding to both the target animation scene information and the target animation configuration level information is obtained. A preset parameter configuration page is subsequently displayed based on that template file, the animation configuration parameters input by the user on the page are received, and the template file is finally configured based on those parameters to generate the corresponding target animation. Because the required target animation is generated simply by determining the appropriate template file from the user's scene and level information and then configuring it with the user's parameters, animation generation efficiency and the user experience are effectively improved.
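As an illustrative aid (not part of the original disclosure), the flow of steps S201 through S206 can be sketched as follows; the request layout, library structure, and all names are hypothetical stand-ins:

```python
def generate_animation(request, library, render):
    """Hypothetical sketch of steps S201-S206.

    `library` maps (scene, level) pairs to template dicts; `render` stands
    in for the compile-and-render stage.  All names are illustrative.
    """
    if request is None:                       # S201: no request received
        return None
    scene = request["scene"]                  # S202: extract scene info
    levels = [lv for (sc, lv) in library if sc == scene]  # shown to the user
    level = request["selected_level"]         # S203: user selects one level
    if level not in levels:
        raise ValueError("unknown configuration level for this scene")
    template = library[(scene, level)]        # S204: target template file
    params = request["parameters"]            # S205: page-entered parameters
    configured = {**template, **params}       # S206: configure the template
    return render(configured)
```

A caller would supply its own library and renderer; here a pass-through renderer suffices to show the configured template.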
In some optional implementations, step S204 includes the following steps:
and calling a preset animation template file library.
In this embodiment, the animation template file library is a database storing animation scene information, animation configuration level information, and animation template files, and these three have a one-to-one mapping relationship. A single piece of animation scene information in the library corresponds to a plurality of animation template files, each of which has different animation configuration level information. The animation configuration level information includes a primary level, an intermediate level, and an advanced level.
And performing one-to-one data comparison between the target animation scene information and each piece of animation scene information contained in the animation template file library based on a preset data parallel comparison instruction to obtain a plurality of corresponding data comparison results.
In this embodiment, the data parallel comparison instruction may specifically be a single instruction, multiple data (SIMD) instruction. Each data comparison result indicates whether the compared data are the same or different. The one-to-one comparisons between the target animation scene information and all animation scene information in the animation template file library are performed simultaneously, exploiting the parallel computing capability of the data parallel comparison instruction to further increase the speed of data comparison.
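As a rough analogy to the data parallel comparison instruction described above, NumPy's element-wise equality dispatches to SIMD instructions on most platforms; this sketch assumes numeric scene IDs and is not the patent's actual implementation:

```python
import numpy as np

# Illustrative: compare a target scene ID against every scene ID in the
# template library with one vectorized operation.  NumPy evaluates such
# element-wise comparisons with SIMD instructions where available.
library_scene_ids = np.array([3, 8, 8, 15, 42], dtype=np.int64)
target_scene_id = np.int64(8)

# One comparison per library entry, computed in parallel: the boolean array
# is the set of "data comparison results" from the step above.
comparison_results = library_scene_ids == target_scene_id
matching_indices = np.flatnonzero(comparison_results)
```

The indices where the result is true identify the "specified animation scene information" entries in the library.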
And determining the specified animation scene information, which is the same as the target animation scene information, from the animation template file library based on the data comparison results.
In this embodiment, the target data comparison results indicating identical data may be screened out from all data comparison results, and the animation scene information corresponding to those results is then obtained as the specified animation scene information.
Acquiring first animation template files corresponding to the specified animation scene information from the animation template file library; wherein a plurality of first animation template files are provided.
In this embodiment, the first animation template files may specifically include the animation template file corresponding to the primary animation configuration level information, the animation template file corresponding to the intermediate animation configuration level information, and the animation template file corresponding to the advanced animation configuration level information.
And screening out a second animation template file corresponding to the target animation configuration level information from the first animation template files based on the target animation configuration level information.
In this embodiment, after the target animation configuration level information is obtained, the animation template file among the first animation template files whose animation configuration level information is the same as the target animation configuration level information may be selected as the second animation template file.
And taking the second animation template file as the target animation template file.
In this application, the target animation template file corresponding to both the target animation scene information and the target animation configuration level information is obtained from the animation template file library quickly and accurately through the use of the data parallel comparison instruction, which increases both the speed of data comparison and the speed of obtaining the target animation template file.
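A minimal sketch of this two-stage lookup, with a hypothetical in-memory library in place of the animation template file library (file names and level labels are assumptions):

```python
# Hypothetical two-stage lookup mirroring the steps above: first select all
# templates whose scene matches (the "first animation template files"), then
# screen by configuration level (the "second animation template file").
LIBRARY = [
    {"scene": "finance", "level": "primary",  "file": "finance_primary.json"},
    {"scene": "finance", "level": "advanced", "file": "finance_advanced.json"},
    {"scene": "retail",  "level": "primary",  "file": "retail_primary.json"},
]

def lookup_template(target_scene, target_level, library=LIBRARY):
    first = [t for t in library if t["scene"] == target_scene]    # scene match
    second = [t for t in first if t["level"] == target_level]     # level filter
    if not second:
        raise LookupError("no template for this scene/level combination")
    return second[0]  # used as the target animation template file
```

With the sample library, a "finance"/"advanced" query resolves to the advanced finance template.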
In some optional implementations of this embodiment, step S206 includes the following steps:
and acquiring the configurable animation elements in the target animation template file.
In this embodiment, the animation template file includes configurable animation elements as well as non-configurable animation elements. The animation corresponding to each animation template file is designed by a designer, and the template file is exported through Adobe After Effects. The animation template file may be a JSON file. There may be one or more configurable animation elements.
And determining a parameter filling position corresponding to the configurable animation element from the target animation template file.
In this embodiment, each configurable animation element in the animation template file is provided with an associated parameter filling position.
And judging whether the parameter filling position is filled with animation parameters corresponding to the configurable animation elements.
In this embodiment, in the target animation template file, the parameter filling position corresponding to the configurable animation element may be filled with the original animation parameter, or may be in a state not filled with the animation parameter.
And if the animation parameters corresponding to the configurable animation elements are not filled, filling the animation configuration parameters into the parameter filling positions to obtain a first target animation file.
In this embodiment, when the parameter filling position corresponding to the configurable animation element in the target animation template file is not filled with an animation parameter, the parameter for that element is in a blank state, and the animation configuration parameter is filled into the position to obtain the first target animation file.
And generating the target animation based on the first target animation file.
In this embodiment, the specific implementation of generating the target animation based on the first target animation file is described in further detail in the following specific embodiments and is not set forth here.
In this way, the configurable animation elements in the target animation template file are obtained, and when it is detected that the parameter filling positions corresponding to those elements are not yet filled with animation parameters, the animation configuration parameters are automatically filled into the parameter filling positions to obtain the first target animation file. This allows the required target animation to be generated quickly from the first target animation file, so the use of animation template files effectively improves animation generation efficiency.
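As an illustrative sketch of the filling step (all field names, such as `fillPositions` and `value`, and the template layout are hypothetical assumptions, not taken from the patent), filling the blank parameter positions of a JSON template might look like this:

```python
import json

# Hypothetical After Effects-exported JSON template: each configurable
# element exposes a named parameter filling position under "fillPositions".
TEMPLATE = json.dumps({
    "layers": [{"nm": "title"}, {"nm": "logo"}],
    "fillPositions": {
        "title.text": {"value": None},        # blank: not yet filled
        "title.color": {"value": "#000000"},  # pre-filled original parameter
    },
})

def fill_template(template_json: str, config_params: dict) -> dict:
    """Fill blank parameter positions with user-supplied configuration values."""
    template = json.loads(template_json)
    for key, new_value in config_params.items():
        slot = template["fillPositions"].get(key)
        if slot is None:
            continue  # this element is not configurable in the template
        if slot["value"] is None:
            slot["value"] = new_value  # blank position: fill directly
    return template

# "logo.size" has no filling position here, so it is silently skipped.
first_target = fill_template(TEMPLATE, {"title.text": "Hello", "logo.size": 64})
```

Note that a position already holding an original parameter (`title.color` above) is left untouched by this branch; the replacement branch is handled separately below.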
In some optional implementations, the generating the target animation based on the first target animation file includes:
and compiling the first target animation file to obtain a corresponding compiled file.
In this embodiment, the specific manner of the compiling process is not limited and may be chosen according to actual requirements; it may be either compiled execution or interpreted execution. Compiling the first target animation file by compiled execution ensures a high compilation speed, whereas interpreted execution ensures high compilation accuracy.
And calling a preset renderer.
In this embodiment, the renderer may be a preconfigured local real-time renderer for rendering animation files.
And rendering the compiled file based on the renderer to generate the target animation.
In this embodiment, the compiled first target animation file is further rendered by the renderer, thereby generating the corresponding target animation.
After the first target animation file is obtained, it is compiled into a corresponding compiled file, and the preset renderer is then called to render that file, so that the required target animation can be generated quickly, effectively improving animation generation efficiency.
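The compile-then-render pipeline above can be sketched as follows; the toy `compile_animation` and `LocalRenderer` below are illustrative stand-ins for a real animation compiler and renderer, not part of the patent:

```python
def compile_animation(animation_file: dict) -> list:
    # Toy "compilation": flatten the animation file's layers into an
    # ordered list of draw operations.
    return [f"draw:{layer['nm']}" for layer in animation_file.get("layers", [])]

class LocalRenderer:
    """Stand-in for the preset local real-time renderer."""
    def render(self, compiled_ops: list) -> str:
        # Toy rendering: join the draw operations into a frame description.
        return ";".join(compiled_ops)

def generate_animation(animation_file: dict, renderer=None) -> str:
    renderer = renderer or LocalRenderer()  # call the preset renderer
    compiled = compile_animation(animation_file)  # compiling process
    return renderer.render(compiled)              # rendering process
```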
In some optional implementations, after the step of determining whether the parameter filling position is filled with the animation parameter corresponding to the configurable animation element, the electronic device may further perform the following steps:
and if the animation parameters corresponding to the configurable animation elements are filled, replacing the animation parameters with the animation configuration parameters to obtain a second target animation file.
In this embodiment, if the parameter filling position corresponding to the configurable animation element in the target animation template file is already filled with an animation parameter, that animation parameter is automatically replaced with the animation configuration parameter to obtain a second target animation file.
And generating the target animation based on the second target animation file.
In this embodiment, the specific process of generating the target animation based on the second target animation file may refer to the process of generating the target animation based on the first target animation file, and is not repeated here.
In this way, when it is detected that the parameter filling position corresponding to a configurable animation element in the target animation template file is already filled with an animation parameter, that parameter is automatically replaced with the animation configuration parameter to obtain a second target animation file, from which the required target animation is generated quickly. The use of animation template files thus effectively improves animation generation efficiency.
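The two branches — fill when the position is blank, replace when it already holds an original parameter — can be captured in one small helper. This is a minimal sketch assuming a hypothetical `slot` dictionary with a `value` field, as in the earlier template layout:

```python
def configure_slot(slot: dict, new_value) -> str:
    """Apply a user configuration parameter to one parameter filling position.

    Returns which branch was taken: "filled" corresponds to producing the
    first target animation file, "replaced" to the second.
    """
    if slot["value"] is None:
        slot["value"] = new_value  # blank position: fill it
        return "filled"
    slot["value"] = new_value      # occupied position: replace the original
    return "replaced"
```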
In some optional implementation manners of this embodiment, the electronic device may further perform the following steps:
and calling a preset video recording program to record the generation operation steps of the target animation and generate a corresponding initial video.
In this embodiment, the video recording program may be a program built from service code written for the requirement of recording the animation generation operation steps. The generation operation steps of the target animation specifically refer to all of the operation steps S201 to S206.
And acquiring an explanation voice corresponding to the generation operation step of the target animation.
In this embodiment, during generation of the target animation, the generation operation steps may be explained manually to produce a corresponding explanation voice, which is then stored.
And calling a preset voice recording program to add the explanation voice to the initial video to obtain an operation demonstration video corresponding to the target animation.
In this embodiment, the voice recording program may be a program built from service code written for the requirement of adding explanation voice to a video.
And storing the operation demonstration video.
In this embodiment, the storage manner of the operation demonstration video is not particularly limited and may be set according to actual requirements; for example, the video may be stored in a local database or in a blockchain. Further, after the operation demonstration video is generated, it can be pushed to relevant business personnel to serve as a demonstration and introduction for them. By viewing the operation demonstration video, business personnel can become familiar with the animation generation operation method and get started quickly. In addition, a website link corresponding to the operation demonstration video can be generated and pushed to relevant business personnel.
In this way, the preset video recording program is called to record the generation operation steps of the target animation and produce the corresponding initial video, the explanation voice corresponding to those steps is obtained, and the preset voice recording program is called to add the explanation voice to the initial video, yielding the operation demonstration video for the target animation. This makes it convenient to subsequently push the demonstration video to relevant business personnel for demonstration and introduction. By viewing the video, business personnel can learn the animation generation operation method and get started quickly, which improves their user experience, speeds up the promotion of the animation generation operation method, and in turn improves their animation creation efficiency.
In some optional implementation manners of this embodiment, the electronic device may further perform the following steps:
and acquiring a code programming package corresponding to the target animation template file.
In this embodiment, after a developer creates the animation template file corresponding to each animation scene, the code programming package corresponding to that template file is stored synchronously. The storage manner of the code programming package is not specifically limited and may be set according to actual requirements; for example, it may be stored in a local database or in a blockchain.
And acquiring the communication mode of the target user.
In this embodiment, the target user may be a user at an Internet enterprise, such as a business person at a financial company. The communication mode includes an email address or a telephone number.
And pushing the code programming package to user equipment corresponding to the target user based on the communication mode.
In this embodiment, once the communication mode of the target user is obtained, a push channel matching that mode, for example email or SMS, may be used to push the code programming package to the user equipment corresponding to the target user.
In this way, after the animation template file needed for generating the animation is produced, the corresponding code programming package can be obtained automatically and pushed to the user equipment of the target user, so that the target user can reuse the animation template file based on the code programming package, improving the target user's experience.
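Choosing the push channel from the target user's communication mode can be sketched as below; the dispatch heuristics (presence of `@` for an email address, an all-digit string for a telephone number) and the returned strings are illustrative assumptions only:

```python
def push_package(package_path: str, contact: str) -> str:
    """Pick a push channel matching the target user's communication mode (sketch)."""
    if "@" in contact:
        # Communication mode is an email address: push by mail.
        return f"email {package_path} to {contact}"
    if contact.isdigit():
        # Communication mode is a telephone number: push a link by SMS.
        return f"sms {package_path} to {contact}"
    raise ValueError("unsupported communication mode")
```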
It is emphasized that, to further ensure the privacy and security of the target animation, the target animation may also be stored in a node of a blockchain.
The blockchain referred to in this application is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, consensus mechanisms, and encryption algorithms. A blockchain is essentially a decentralized database: a chain of data blocks linked by cryptographic methods, where each data block contains information about a batch of network transactions and is used to verify the validity (anti-counterfeiting) of that information and to generate the next block. A blockchain may include a blockchain underlying platform, a platform product service layer, an application service layer, and the like.
The embodiments of this application may acquire and process related data based on artificial intelligence technology. Artificial Intelligence (AI) is the theory, method, technology, and application system of using a digital computer, or a machine controlled by a digital computer, to simulate, extend, and expand human intelligence, perceive the environment, acquire knowledge, and use knowledge to obtain the best results.
Artificial intelligence infrastructure generally includes technologies such as sensors, dedicated artificial intelligence chips, cloud computing, distributed storage, big data processing, operation/interaction systems, and mechatronics. Artificial intelligence software technologies mainly include computer vision, robotics, biometric recognition, speech processing, natural language processing, and machine learning/deep learning.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by instructing relevant hardware through computer readable instructions, which can be stored in a computer readable storage medium; when executed, the instructions can include the processes of the method embodiments described above. The storage medium may be a non-volatile storage medium such as a magnetic disk, an optical disk, a Read-Only Memory (ROM), or a Random Access Memory (RAM).
It should be understood that, although the steps in the flowcharts of the figures are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated herein, the steps are not strictly limited to the order shown and may be performed in other orders. Moreover, at least some of the steps in the flowcharts may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be performed at different moments; their execution order is not necessarily sequential, and they may be performed in turn or in alternation with other steps or with at least some of the sub-steps or stages of other steps.
With further reference to fig. 3, as an implementation of the method shown in fig. 2, the present application provides an embodiment of an animation generation apparatus, which corresponds to the embodiment of the method shown in fig. 2, and which is particularly applicable to various electronic devices.
As shown in fig. 3, the animation generation apparatus 300 according to the present embodiment includes: a judging module 301, a first obtaining module 302, a first receiving module 303, a second obtaining module 304, a second receiving module 305, and a first generating module 306. Wherein:
the judging module 301 is configured to judge whether an animation configuration request triggered by a user is received; the animation configuration request carries target animation scene information;
a first obtaining module 302, configured to, if the animation configuration request is received, obtain the target animation scene information from the animation configuration request and display animation configuration level information corresponding to the target animation scene information; wherein there are a plurality of pieces of animation configuration level information;
a first receiving module 303, configured to receive target animation configuration level information selected by the user from the animation configuration level information;
a second obtaining module 304, configured to obtain a target animation template file corresponding to the target animation scene information and the target animation configuration level information;
a second receiving module 305, configured to display a preset parameter configuration page based on the target animation template file, and receive animation configuration parameters input by the user on the parameter configuration page; the parameter configuration page is a parameter filling page corresponding to configurable animation elements contained in the target animation template file;
the first generating module 306 is configured to configure the target animation template file based on the animation configuration parameters, so as to generate a corresponding target animation.
In this embodiment, the operations that the modules or units are respectively configured to execute correspond to the steps of the animation generation method in the foregoing embodiment one to one, and are not described herein again.
In some optional implementations of this embodiment, the second obtaining module 304 includes:
the calling submodule is used for calling a preset animation template file library;
the comparison submodule is used for carrying out one-to-one corresponding data comparison processing on the target animation scene information and all animation scene information contained in the animation template file base based on a preset data parallel comparison instruction to obtain a plurality of corresponding data comparison results;
a first determining submodule, configured to determine, based on the data comparison results, specified animation scene information that is the same as the target animation scene information from the animation template file library;
the first obtaining submodule is used for obtaining a first animation template file corresponding to the appointed animation scene information from the animation template file library; wherein the number of the first animation template files comprises a plurality;
the screening submodule is used for screening out a second animation template file corresponding to the target animation configuration grade information from the first animation template file based on the target animation configuration grade information;
and the second determining submodule is used for taking the second animation template file as the target animation template file.
In this embodiment, the operations that the modules or units are respectively configured to execute correspond to the steps of the animation generation method in the foregoing embodiment one to one, and are not described herein again.
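The scene-and-level lookup performed by the calling, comparison, screening, and determining submodules above can be sketched as follows; the library entries, field names, and file names are hypothetical, and the "parallel comparison" of the patent is simplified to a plain filter:

```python
# Hypothetical preset animation template file library.
TEMPLATE_LIBRARY = [
    {"scene": "promo",  "level": "basic",    "file": "promo_basic.json"},
    {"scene": "promo",  "level": "advanced", "file": "promo_advanced.json"},
    {"scene": "report", "level": "basic",    "file": "report_basic.json"},
]

def find_target_template(scene: str, level: str) -> str:
    # Compare the target scene against every library entry to get the
    # first animation template files (there may be several per scene).
    first_files = [t for t in TEMPLATE_LIBRARY if t["scene"] == scene]
    # Screen by configuration level to select the second (target) template.
    matches = [t["file"] for t in first_files if t["level"] == level]
    if not matches:
        raise LookupError("no template for this scene and configuration level")
    return matches[0]
```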
In some optional implementations of the present embodiment, the first generating module 306 includes:
the second obtaining submodule is used for obtaining configurable animation elements in the target animation template file;
a third determining submodule, configured to determine, from the target animation template file, a parameter filling position corresponding to the configurable animation element;
the judgment submodule is used for judging whether the parameter filling position is filled with animation parameters corresponding to the configurable animation elements;
the filling submodule is used for filling the animation configuration parameters into the parameter filling position to obtain a first target animation file if the animation parameters corresponding to the configurable animation elements are not filled;
and the first generation submodule is used for generating the target animation based on the first target animation file.
In this embodiment, the operations that the modules or units are respectively used to execute correspond to the steps of the animation generation method in the foregoing embodiment one to one, and are not described herein again.
In some optional implementations of this embodiment, the first generation submodule includes:
the compiling unit is used for compiling the first target animation file to obtain a corresponding compiled file;
the calling unit is used for calling a preset renderer;
and the generating unit is used for rendering the compiled file based on the renderer to generate the target animation.
In this embodiment, the operations that the modules or units are respectively configured to execute correspond to the steps of the animation generation method in the foregoing embodiment one to one, and are not described herein again.
In some optional implementations of the present embodiment, the first generating module 306 further includes:
the replacing submodule is used for replacing the animation parameters with the animation configuration parameters if the animation parameters corresponding to the configurable animation elements are filled, so that a second target animation file is obtained;
and the second generation submodule is used for generating the target animation based on the second target animation file.
In this embodiment, the operations that the modules or units are respectively configured to execute correspond to the steps of the animation generation method in the foregoing embodiment one to one, and are not described herein again.
In some optional implementations of this embodiment, the animation generating apparatus further includes:
the second generation module is used for calling a preset video recording program to record the generation operation steps of the target animation and generate a corresponding initial video;
a third obtaining module, configured to obtain an explanation voice corresponding to the generation operation step of the target animation;
the third generation module is used for calling a preset voice recording program to add the explanation voice to the initial video to obtain an operation demonstration video corresponding to the target animation;
and the storage module is used for storing the operation demonstration video.
In this embodiment, the operations that the modules or units are respectively configured to execute correspond to the steps of the animation generation method in the foregoing embodiment one to one, and are not described herein again.
In some optional implementations of this embodiment, the animation generating apparatus further includes:
the fourth acquisition module is used for acquiring a code programming package corresponding to the target animation template file;
a fifth obtaining module, configured to obtain a communication mode of a target user;
and the pushing module is used for pushing the code programming package to user equipment corresponding to the target user based on the communication mode.
In this embodiment, the operations that the modules or units are respectively configured to execute correspond to the steps of the animation generation method in the foregoing embodiment one to one, and are not described herein again.
In order to solve the technical problem, the embodiment of the application further provides computer equipment. Referring to fig. 4 in particular, fig. 4 is a block diagram of a basic structure of a computer device according to the embodiment.
The computer device 4 comprises a memory 41, a processor 42, and a network interface 43, which are communicatively connected to each other via a system bus. It should be noted that only the computer device 4 with components 41-43 is shown, but it should be understood that not all of the shown components need be implemented, and more or fewer components may be implemented instead. As will be understood by those skilled in the art, the computer device here is a device capable of automatically performing numerical calculation and/or information processing according to preset or stored instructions, and its hardware includes, but is not limited to, a microprocessor, an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA), a Digital Signal Processor (DSP), an embedded device, and the like.
The computer device may be a desktop computer, a notebook computer, a palmtop computer, a cloud server, or another computing device. The computer device can interact with a user through a keyboard, a mouse, a remote controller, a touch panel, a voice control device, or the like.
The memory 41 includes at least one type of readable storage medium including a flash memory, a hard disk, a multimedia card, a card type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read Only Memory (ROM), an Electrically Erasable Programmable Read Only Memory (EEPROM), a Programmable Read Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disk, etc. In some embodiments, the memory 41 may be an internal storage unit of the computer device 4, such as a hard disk or a memory of the computer device 4. In other embodiments, the memory 41 may also be an external storage device of the computer device 4, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, provided on the computer device 4. Of course, the memory 41 may also include both internal and external storage devices of the computer device 4. In this embodiment, the memory 41 is generally used for storing an operating system and various application software installed on the computer device 4, such as computer readable instructions of an animation generation method. Further, the memory 41 may also be used to temporarily store various types of data that have been output or are to be output.
The processor 42 may be a Central Processing Unit (CPU), controller, microcontroller, microprocessor, or other data Processing chip in some embodiments. The processor 42 is typically used to control the overall operation of the computer device 4. In this embodiment, the processor 42 is configured to execute computer readable instructions stored in the memory 41 or process data, such as computer readable instructions for executing the animation generation method.
The network interface 43 may comprise a wireless network interface or a wired network interface, and the network interface 43 is generally used for establishing communication connection between the computer device 4 and other electronic devices.
Compared with the prior art, the embodiment of the application mainly has the following beneficial effects:
in the embodiments of this application, when an animation configuration request triggered by a user is received, the target animation scene information is first obtained from the animation configuration request, and the animation configuration level information corresponding to the target animation scene information is displayed. The target animation configuration level information selected by the user from the animation configuration level information is then received, and the target animation template file corresponding to both the target animation scene information and the target animation configuration level information is obtained. A preset parameter configuration page is displayed based on the target animation template file, the animation configuration parameters input by the user on that page are received, and finally the target animation template file is configured based on those parameters to generate the corresponding target animation. The required target animation can thus be generated simply by determining the corresponding target animation template file from the target animation scene information and target animation configuration level information provided by the user, and then configuring that template file based on the animation configuration parameters the user inputs. This effectively improves the generation efficiency of the animation and the user experience.
The present application further provides another embodiment, which is to provide a computer-readable storage medium storing computer-readable instructions executable by at least one processor to cause the at least one processor to perform the steps of the animation generation method as described above.
Compared with the prior art, the embodiment of the application mainly has the following beneficial effects:
in the embodiments of this application, when an animation configuration request triggered by a user is received, the target animation scene information is obtained from the animation configuration request, and the animation configuration level information corresponding to the target animation scene information is displayed. The target animation configuration level information selected by the user from the animation configuration level information is received, the target animation template file corresponding to both the target animation scene information and the target animation configuration level information is obtained, a preset parameter configuration page is displayed based on the target animation template file, the animation configuration parameters input by the user on that page are received, and finally the target animation template file is configured based on those parameters to generate the corresponding target animation. The required target animation can thus be generated simply by determining the corresponding target animation template file from the target animation scene information and target animation configuration level information provided by the user, and then configuring that template file based on the animation configuration parameters the user inputs. This effectively improves the generation efficiency of the animation and the user experience.
Through the description of the foregoing embodiments, it is clear to those skilled in the art that the methods of the foregoing embodiments may be implemented by software plus a necessary general-purpose hardware platform, and certainly also by hardware, but in many cases the former is the better implementation. Based on such understanding, the technical solutions of this application, or the portions thereof that contribute to the prior art, may be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk), including several instructions for enabling a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods of the embodiments of this application.
It is to be understood that the above-described embodiments are merely some, not all, of the embodiments of this application, and that the accompanying drawings show preferred embodiments without limiting the scope of the patent. This application may be embodied in many different forms; these embodiments are provided so that the disclosure of the application will be thorough and complete. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art may still modify the technical solutions described in the foregoing embodiments, or substitute equivalents for some of their features. Any equivalent structure made using the contents of the specification and drawings of this application, applied directly or indirectly in other related technical fields, falls within the protection scope of this application.
Claims (10)
1. An animation generation method, comprising the steps of:
judging whether an animation configuration request triggered by a user is received; the animation configuration request carries target animation scene information;
if so, acquiring the target animation scene information from the animation configuration request, and displaying animation configuration grade information corresponding to the target animation scene information; wherein the number of animation configuration level information includes a plurality;
receiving target animation configuration level information selected by the user from the animation configuration level information;
acquiring a target animation template file corresponding to the target animation scene information and the target animation configuration grade information;
displaying a preset parameter configuration page based on the target animation template file, and receiving animation configuration parameters input by the user on the parameter configuration page; the parameter configuration page is a parameter filling page corresponding to configurable animation elements contained in the target animation template file;
and configuring the target animation template file based on the animation configuration parameters to generate corresponding target animation.
2. The animation generation method according to claim 1, wherein the step of obtaining a target animation template file corresponding to both the target animation scene information and the target animation configuration level information specifically includes:
calling a preset animation template file library;
performing one-to-one corresponding data comparison processing on the target animation scene information and all animation scene information contained in the animation template file base based on a preset data parallel comparison instruction to obtain a plurality of corresponding data comparison results;
determining specified animation scene information which is the same as the target animation scene information from the animation template file base based on the data comparison result;
acquiring a first animation template file corresponding to the appointed animation scene information from the animation template file library; wherein the number of the first animation template files comprises a plurality;
screening out a second animation template file corresponding to the target animation configuration grade information from the first animation template file based on the target animation configuration grade information;
and taking the second animation template file as the target animation template file.
3. The animation generation method according to claim 1, wherein the step of configuring the target animation template file based on the animation configuration parameters to generate the corresponding target animation specifically includes:
acquiring configurable animation elements in the target animation template file;
determining a parameter filling position corresponding to the configurable animation element from the target animation template file;
judging whether the parameter filling position is filled with animation parameters corresponding to the configurable animation elements;
if the animation parameters corresponding to the configurable animation elements are not filled, filling the animation configuration parameters into the parameter filling positions to obtain a first target animation file;
and generating the target animation based on the first target animation file.
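Claim 3's fill-if-empty logic can be sketched with a hypothetical representation in which a template maps each configurable element to its current value, and `None` marks an unfilled parameter position:

```python
def fill_template(template, config_params):
    """Fill each unfilled parameter position; leave filled ones alone."""
    filled = dict(template)
    for element, value in config_params.items():
        if filled.get(element) is None:  # position not yet filled
            filled[element] = value
    return filled

# "speed" is already filled, so only "color" is taken from the user input.
result = fill_template({"color": None, "speed": 2},
                       {"color": "red", "speed": 9})
```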
4. The animation generation method according to claim 3, wherein the step of generating the target animation based on the first target animation file specifically includes:
compiling the first target animation file to obtain a corresponding compiled file;
calling a preset renderer;
and performing rendering processing on the compiled file based on the renderer to generate the target animation.
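Claim 4's compile-then-render pipeline is engine-specific; the sketch below uses toy stand-ins for the compiler and renderer just to show the two-stage structure:

```python
def compile_animation(source):
    """'Compile' a filled template into an ordered operation script."""
    return [line.strip() for line in source.splitlines() if line.strip()]

def render(compiled):
    """'Render' the compiled script into numbered frames."""
    return [f"frame {i}: {op}" for i, op in enumerate(compiled)]

frames = render(compile_animation("move ball\n\nfade out\n"))
```

A production implementation would hand the compiled file to a preset renderer (e.g. a game or web animation engine) rather than formatting strings.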
5. The animation generation method according to claim 3, further comprising, after the step of determining whether the parameter fill position is filled with animation parameters corresponding to the configurable animation element:
if the animation parameters corresponding to the configurable animation elements are filled, replacing the animation parameters with the animation configuration parameters to obtain a second target animation file;
and generating the target animation based on the second target animation file.
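Claim 5 covers the other branch of claim 3's check: when a parameter position already holds a value, the new configuration parameter replaces it. A hypothetical combined fill-or-replace helper:

```python
def fill_or_replace(template, config_params):
    """Replace filled positions and fill empty ones in a single pass."""
    updated = dict(template)
    updated.update(config_params)  # overwrites existing values, fills empty ones
    return updated

# "color" was already filled with "blue"; the user's "red" replaces it.
second_file = fill_or_replace({"color": "blue", "speed": None},
                              {"color": "red"})
```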
6. The animation generation method according to claim 1, further comprising:
calling a preset video recording program to record the generation operation steps of the target animation and generate a corresponding initial video;
acquiring an explanation voice corresponding to the generation operation steps of the target animation;
calling a preset voice input program to add the explanation voice to the initial video to obtain an operation demonstration video corresponding to the target animation;
and storing the operation demonstration video.
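Actual screen capture and audio overlay in claim 6 depend on media libraries, so the sketch below only models the data flow — recorded operation steps paired with narration segments. All structures and function names are hypothetical:

```python
def record_steps(steps):
    """Stand-in for the 'preset video recording program': capture
    the ordered generation operation steps as timestamped frames."""
    return [{"t": i, "step": s} for i, s in enumerate(steps)]

def add_narration(video, narration):
    """Stand-in for the 'preset voice input program': attach one
    narration segment per recorded step (empty when none exists)."""
    return [dict(frame, voice=narration.get(frame["t"], ""))
            for frame in video]

demo = add_narration(record_steps(["select scene", "pick level"]),
                     {0: "First, choose a scene."})
```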
7. The animation generation method according to claim 1, further comprising:
acquiring a code programming package corresponding to the target animation template file;
acquiring a communication mode of a target user;
and pushing the code programming package to user equipment corresponding to the target user based on the communication mode.
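Claim 7's final step — pushing the code programming package over the user's communication mode — amounts to a dispatch on the delivery channel. The channel names and recording list below are hypothetical:

```python
SENT = []  # records simulated deliveries for inspection

def push_package(package, user, mode):
    """Push a code package to the user over their preferred channel."""
    if mode == "email":
        SENT.append(("email", user, package))
    elif mode == "sms":
        SENT.append(("sms", user, package))
    else:
        raise ValueError(f"unsupported communication mode: {mode}")
    return True

push_package("anim_templates.zip", "alice", "email")
```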
8. An animation generation device, comprising:
the judging module is used for judging whether an animation configuration request triggered by a user is received or not; the animation configuration request carries target animation scene information;
a first obtaining module, configured to, if yes, obtain the target animation scene information from the animation configuration request, and display animation configuration level information corresponding to the target animation scene information; wherein there are a plurality of pieces of animation configuration level information;
the first receiving module is used for receiving target animation configuration level information selected by the user from the animation configuration level information;
the second acquisition module is used for acquiring a target animation template file corresponding to the target animation scene information and the target animation configuration level information;
the second receiving module is used for displaying a preset parameter configuration page based on the target animation template file and receiving animation configuration parameters input by the user on the parameter configuration page; the parameter configuration page is a parameter filling page corresponding to configurable animation elements contained in the target animation template file;
and the first generation module is used for configuring the target animation template file based on the animation configuration parameters to generate corresponding target animation.
9. A computer device comprising a memory having computer readable instructions stored therein and a processor which, when executing the computer readable instructions, implements the steps of the animation generation method as claimed in any one of claims 1 to 7.
10. A computer-readable storage medium having computer-readable instructions stored thereon which, when executed by a processor, implement the steps of the animation generation method as claimed in any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211160234.3A CN115546356A (en) | 2022-09-22 | 2022-09-22 | Animation generation method and device, computer equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115546356A true CN115546356A (en) | 2022-12-30 |
Family
ID=84729112
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211160234.3A Pending CN115546356A (en) | 2022-09-22 | 2022-09-22 | Animation generation method and device, computer equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115546356A (en) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |