Disclosure of Invention
The invention provides a Flutter-based video rendering method, video rendering device, video rendering equipment and video rendering medium, which are used for solving the problem that when a client running the existing iOS system renders video in real time, the corresponding playing engine of the iOS system needs to be modified to meet the requirements of rendering and playing the video.
The invention provides a Flutter-based video rendering method, which is applied to a client provided with an iOS operating system; the method comprises the following steps:
acquiring compressed video data to be rendered;
converting the compressed video data into packed data in a preset code stream format;
determining a decoding type, and decoding the packed data according to the decoding type to obtain image frame data;
and transmitting the image frame data to an image cache component of Flutter, and rendering the image frame data into video data through the image cache component.
Optionally, the determining a decoding type, and decoding the packed data according to the decoding type to obtain image frame data includes:
acquiring an interface function of a preset system library framework, and determining a decoding type according to the interface function;
and decoding the packed data according to the decoding type to obtain image frame data.
Optionally, the step of decoding the packed data according to the decoding type to obtain image frame data includes:
when the decoding type is hard decoding, the packed data can be decoded by an image processor to obtain image frame data.
Optionally, the step of decoding the packed data according to the decoding type to obtain image frame data includes:
when the decoding type is soft decoding, converting the packed data into YUV data through a preset decoding program;
and decoding the YUV data by an image processor to obtain image frame data.
Optionally, after the step of transmitting the image frame data to the image cache component of Flutter and rendering the image frame data into video data through the image cache component, the method further includes:
and playing the video data obtained by rendering through a preset player.
The invention also provides a Flutter-based video rendering device, which is applied to a client provided with the iOS operating system; the device comprises:
the acquisition module is used for acquiring compressed video data to be rendered;
the conversion module is used for converting the compressed video data into packed data in a preset code stream format;
the decoding module is used for determining a decoding type and decoding the packed data according to the decoding type to obtain image frame data;
and the rendering module is used for transmitting the image frame data to an image cache component of Flutter, and rendering the image frame data into video data through the image cache component.
Optionally, the decoding module includes:
the decoding type determining submodule is used for acquiring an interface function of a preset system library framework and determining a decoding type according to the interface function;
and the decoding submodule is used for decoding the packed data according to the decoding type to obtain image frame data.
Optionally, the apparatus further comprises:
and the playing module is used for playing the video data obtained by rendering through a preset player.
The invention also provides Flutter-based video rendering equipment, comprising a processor and a memory:
the memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to execute, according to the instructions in the program code, the Flutter-based video rendering method of any one of the above.
The present invention also provides a computer-readable storage medium, characterized in that the computer-readable storage medium is configured to store program code for executing the Flutter-based video rendering method according to any of the above.
According to the technical scheme, the invention has the following advantages: the invention discloses a Flutter-based video rendering method, which specifically includes: acquiring compressed video data to be rendered; converting the compressed video data into packed data in a preset code stream format; determining a decoding type, and decoding the packed data according to the decoding type to obtain image frame data; and transmitting the image frame data to an image cache component of Flutter, and rendering the image frame data into video data through the image cache component. According to the invention, the compressed video data to be rendered is converted into image frame data that Flutter can process, so that the image cache component of Flutter can render the image frame data into video data, achieving Flutter-based video rendering without any modification of the client's engine.
Detailed Description
When a client running the existing iOS system renders video in real time, the corresponding playing engine of the iOS system needs to be modified to meet the requirements of rendering and playing the video.
In view of this, embodiments of the present invention provide a Flutter-based video rendering method, apparatus, device and medium, which are used to solve the problem that when a client running the iOS system renders video in real time, the corresponding playing engine of the iOS system needs to be modified to meet the requirements of rendering and playing the video.
In order to make the objects, features and advantages of the present invention more obvious and understandable, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the embodiments described below are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a flowchart illustrating steps of a video rendering method based on Flutter according to an embodiment of the present invention.
The invention provides a Flutter-based video rendering method, which is applied to a client provided with an iOS operating system. Flutter is a mobile UI framework released by Google; it is an SDK for building cross-platform mobile apps and can quickly build high-quality user interfaces on both Android and iOS.
The method specifically comprises the following steps:
step 101, obtaining compressed video data to be rendered;
the client terminal in the embodiment of the present invention refers to a mobile phone, a tablet, a Mac, and other terminal devices that use the iOS operating system.
The compressed video data to be rendered in the embodiment of the present invention may be compressed video data in the HEVC (High Efficiency Video Coding) format. HEVC, also known as H.265 or MPEG-H Part 2, is a video compression standard.
Further, the compressed video data to be rendered in the embodiment of the present invention may also be MPEG-4 AVC (MPEG-4 Part 10, Advanced Video Coding), i.e. H.264.
Step 102, converting the compressed video data into packed data in a preset code stream format;
In practical applications, the packing mode of the HEVC code stream is generally the Annex B format, also called the MPEG-2 transport stream format, which is the default output format of most encoders; that is, the first 3 to 4 bytes of each frame are the H.265 start_code, 0x00000001 or 0x000001.
However, HEVC compressed video data in the Annex B code stream format cannot be directly decoded by the system library framework provided by the iOS operating system, so the code stream needs to be converted into packed data in the HVCC code stream format supported by the system library framework.
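As a hedged illustration of this conversion step (not the definitive implementation of the invention), the following Swift sketch rewrites Annex B start codes into the 4-byte big-endian NAL-unit length prefixes used by HVCC-style packing. It assumes the data for one access unit is already in memory and omits extraction of the VPS/SPS/PPS parameter sets and error handling.

```swift
import Foundation

/// Sketch: convert an Annex B byte stream (start-code delimited NAL units) into
/// length-prefixed NAL units, the layout expected by HVCC-style packing.
/// Assumes 3- or 4-byte start codes; parameter-set handling is omitted.
func annexBToLengthPrefixed(_ annexB: Data) -> Data {
    let bytes = [UInt8](annexB)
    var units: [(payload: Int, marker: Int)] = []   // (payload start, start-code position)
    var i = 0
    while i + 2 < bytes.count {
        if bytes[i] == 0, bytes[i + 1] == 0, bytes[i + 2] == 1 {
            // A 3-byte start code; a preceding zero byte makes it a 4-byte start code.
            let marker = (i > 0 && bytes[i - 1] == 0) ? i - 1 : i
            units.append((payload: i + 3, marker: marker))
            i += 3
        } else {
            i += 1
        }
    }
    var output = Data()
    for (index, unit) in units.enumerated() {
        // Each NAL unit ends where the next start code begins (or at end of data).
        let end = index + 1 < units.count ? units[index + 1].marker : bytes.count
        let nal = bytes[unit.payload..<end]
        var length = UInt32(nal.count).bigEndian    // 4-byte big-endian length prefix
        output.append(Data(bytes: &length, count: 4))
        output.append(contentsOf: nal)
    }
    return output
}
```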
Step 103, determining a decoding type, and decoding the packed data according to the decoding type to obtain image frame data;
after the code stream format conversion of the compressed video data to be rendered is completed, the compressed video data can be decoded, and different decoding processes can be performed according to different decoding types, so that image frame data can be obtained through decoding.
Here, decoding refers to the process of restoring digital codes to the content they represent, or converting electric pulse signals back into the information or data they represent, by a specific method. Decoding includes hard decoding and soft decoding: hard decoding decodes the video through the video acceleration function of the graphics card (generally the graphics card core, i.e. the GPU), offloading the large-volume, low-complexity video decoding work from the CPU; soft decoding performs the decoding in software and occupies the CPU.
Step 104, transmitting the image frame data to an image cache component of Flutter, and rendering the image frame data into video data through the image cache component.
After the image frame data is obtained through decoding, the image frame data can be bound to the image cache component (the Texture widget) through the binding API of Flutter, so that the image frame data is rendered into video data through the image cache component.
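For illustration only, the sketch below shows how the native iOS side of such a binding might look: decoded CVPixelBuffer frames are exposed through the FlutterTexture protocol of the Flutter iOS embedding, and the Dart side displays them with a Texture widget that uses the returned texture id. The class name VideoFrameTexture is hypothetical; concurrency and lifecycle handling are omitted.

```swift
import Flutter
import CoreVideo

/// Illustrative native-side texture source: Flutter pulls the latest decoded
/// frame through copyPixelBuffer(), which backs the Dart Texture widget.
final class VideoFrameTexture: NSObject, FlutterTexture {
    private var latestFrame: CVPixelBuffer?
    private let registry: FlutterTextureRegistry
    private(set) var textureId: Int64 = 0   // passed to Dart: Texture(textureId: textureId)

    init(registry: FlutterTextureRegistry) {
        self.registry = registry
        super.init()
        textureId = registry.register(self)
    }

    /// Called by the decoder whenever a new image frame is ready.
    func push(_ frame: CVPixelBuffer) {
        latestFrame = frame
        registry.textureFrameAvailable(textureId)   // ask Flutter to fetch the new frame
    }

    /// Flutter calls this on its raster thread to obtain the current frame.
    func copyPixelBuffer() -> Unmanaged<CVPixelBuffer>? {
        guard let frame = latestFrame else { return nil }
        return Unmanaged.passRetained(frame)
    }
}

// Possible registration inside a plugin:
// let texture = VideoFrameTexture(registry: registrar.textures())
// -> send texture.textureId to Dart, where Texture(textureId: ...) renders the video.
```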
According to the invention, the compressed video data to be rendered is converted into image frame data that Flutter can process, so that the image cache component of Flutter can render the image frame data into video data, achieving Flutter-based video rendering without any modification of the client's engine.
Referring to fig. 2, fig. 2 is a flowchart illustrating a video rendering method based on Flutter according to another embodiment of the present invention. The method specifically comprises the following steps:
step 201, obtaining compressed video data to be rendered;
step 202, converting the compressed video data into packed data in a preset code stream format;
the steps 201-202 are the same as the steps 101-102, and the detailed description thereof can be referred to in the description of the steps 101-102, which is not repeated herein.
Step 203, acquiring an interface function of a preset system library framework, and determining a decoding type according to the interface function;
after the code stream format conversion of the compressed video data to be rendered is completed, the compressed video data can be decoded, and different decoding processes can be performed according to different decoding types, so that image frame data can be obtained through decoding.
In a specific implementation, an interface function of a preset system library framework, the VideoToolbox framework, may be obtained to determine the decoding type. GPU hard decoding may be attempted on the packed data through the interface function of the VideoToolbox framework: if the data can be decoded, it indicates that the client's operating system supports hard decoding; if it cannot, the client's operating system may not support hard decoding, and soft decoding may then be selected to decode the packed data.
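One possible form of this probe is sketched below, under the assumption of iOS 11 or later, where the VideoToolbox interface function VTIsHardwareDecodeSupported is available; attempting a trial GPU decode and falling back on failure, as described above, is an equally valid strategy.

```swift
import VideoToolbox

/// Decoding types distinguished by the method.
enum DecodeType {
    case hard   // GPU decoding through VideoToolbox
    case soft   // CPU decoding through a software decoder
}

/// Sketch: choose the decoding type for HEVC via a VideoToolbox interface function.
func chooseDecodeType() -> DecodeType {
    if VTIsHardwareDecodeSupported(kCMVideoCodecType_HEVC) {
        return .hard
    }
    return .soft    // the operating system (or device) does not support hard decoding
}
```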
Step 204, decoding the packed data according to the decoding type to obtain image frame data;
in one example, step 204 may include:
when the decoding type is hard decoding, the packed data can be decoded by an image processor to obtain image frame data.
In a specific implementation, when the operating system supports hard decoding, the packed data may be directly hard-decoded through an interface function of the VideoToolbox framework to obtain the image frame data (a CVPixelBufferRef).
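A hedged Swift sketch of this hard-decoding path follows: a VTDecompressionSession is created from an HEVC format description, and each decoded frame is delivered as a CVPixelBuffer (CVPixelBufferRef). Building the format description from the VPS/SPS/PPS parameter sets and wrapping the length-prefixed data in CMSampleBuffer objects are assumed to happen elsewhere; error handling is kept to a minimum.

```swift
import VideoToolbox
import CoreMedia
import CoreVideo

/// Sketch of GPU (hard) decoding with VideoToolbox.
final class HardDecoder {
    private var session: VTDecompressionSession?
    /// Receives each decoded frame, e.g. to forward it to the Flutter texture layer.
    var onFrame: ((CVPixelBuffer) -> Void)?

    /// Create the decompression session once the HEVC format description is known.
    func start(formatDescription: CMVideoFormatDescription) -> Bool {
        let attrs: [String: Any] = [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
        ]
        let status = VTDecompressionSessionCreate(
            allocator: kCFAllocatorDefault,
            formatDescription: formatDescription,
            decoderSpecification: nil,
            imageBufferAttributes: attrs as CFDictionary,
            outputCallback: nil,                 // frames are delivered via the handler below
            decompressionSessionOut: &session)
        return status == noErr
    }

    /// Decode one sample buffer containing length-prefixed HEVC data.
    func decode(_ sampleBuffer: CMSampleBuffer) {
        guard let session = session else { return }
        _ = VTDecompressionSessionDecodeFrame(
            session,
            sampleBuffer: sampleBuffer,
            flags: [],
            infoFlagsOut: nil) { [weak self] status, _, imageBuffer, _, _ in
                guard status == noErr, let pixelBuffer = imageBuffer else { return }
                self?.onFrame?(pixelBuffer)      // hand the image frame data onward
            }
    }
}
```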
In another example, step 204 may include:
when the decoding type is soft decoding, converting the packed data into YUV data through a preset decoding program;
and decoding the YUV data by an image processor to obtain image frame data.
YUV is a color encoding method that is often used in image processing components. When encoding photos or videos, YUV takes human perception into account and allows the bandwidth of the chrominance components to be reduced. "Y" represents the luminance (luma) or gray-scale value, while "U" and "V" represent the chrominance (chroma), which describes the color and saturation of the image and specifies the color of a pixel.
In a specific implementation, when the operating system does not support hard decoding, the packed data may be decoded by means of soft decoding: the packed data is first converted into data in YUV format, and the YUV data is then decoded through VideoToolbox to obtain the image frame data (a CVPixelBufferRef).
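As one assumed realization of this hand-off, the sketch below wraps the Y/U/V planes produced by the software decoder (the "preset decoding program", e.g. an FFmpeg-based decoder, not shown here) into a planar CVPixelBuffer, so that soft-decoded frames can enter the same pipeline as hard-decoded ones; depending on what the rendering side accepts, a further conversion (for example to NV12 or BGRA) may still be required.

```swift
import Foundation
import CoreVideo

/// Sketch: package decoded 4:2:0 YUV planes into a CVPixelBuffer (CVPixelBufferRef).
/// Assumes even width/height and tightly packed source planes.
func makePixelBuffer(y: [UInt8], u: [UInt8], v: [UInt8],
                     width: Int, height: Int) -> CVPixelBuffer? {
    var buffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                     kCVPixelFormatType_420YpCbCr8Planar,
                                     nil, &buffer)
    guard status == kCVReturnSuccess, let pixelBuffer = buffer else { return nil }

    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

    // Copy each plane row by row, respecting the destination stride.
    let planes: [(data: [UInt8], rows: Int, rowBytes: Int)] = [
        (y, height, width),
        (u, height / 2, width / 2),
        (v, height / 2, width / 2)
    ]
    for (index, plane) in planes.enumerated() {
        guard let dst = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, index) else { return nil }
        let stride = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, index)
        plane.data.withUnsafeBytes { src in
            for row in 0..<plane.rows {
                memcpy(dst + row * stride,
                       src.baseAddress! + row * plane.rowBytes,
                       plane.rowBytes)
            }
        }
    }
    return pixelBuffer
}
```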
Step 205, transmitting the image frame data to an image cache component of Flutter, and rendering the image frame data into video data through the image cache component.
After the image frame data is obtained through decoding, the image frame data can be bound to the image cache component (the Texture widget) through the binding API of Flutter, so that the image frame data is rendered into video data through the image cache component.
Further, in the embodiment of the present invention, after the video rendering is completed, the rendered video data may be played through a preset player.
According to the invention, the compressed video data to be rendered is converted into image frame data that Flutter can process, so that the image cache component of Flutter can render the image frame data into video data, achieving Flutter-based video rendering without any modification of the client's engine.
Referring to fig. 3, fig. 3 is a block diagram of a video rendering apparatus based on Flutter according to an embodiment of the present invention.
The embodiment of the invention provides a Flutter-based video rendering device, which is applied to a client provided with an iOS operating system; the device comprises:
an obtaining module 301, configured to obtain compressed video data to be rendered;
a conversion module 302, configured to convert the compressed video data into packed data in a preset code stream format;
a decoding module 303, configured to determine a decoding type, and decode the packed data according to the decoding type to obtain image frame data;
and a rendering module 304, configured to transmit the image frame data to an image cache component of Flutter, and render the image frame data into video data through the image cache component.
In this embodiment of the present invention, the decoding module 303 includes:
the decoding type determining submodule is used for acquiring an interface function of a preset system library framework and determining a decoding type according to the interface function;
and the decoding submodule is used for decoding the packed data according to the decoding type to obtain image frame data.
In an embodiment of the present invention, the decoding sub-module includes:
and the hard decoding unit is used for decoding the packed data through an image processor to obtain image frame data when the decoding type is hard decoding.
In an embodiment of the present invention, the decoding sub-module includes:
the conversion unit is used for converting the packed data into YUV data through a preset decoding program when the decoding type is soft decoding;
and the decoding unit is used for decoding the YUV data through the image processor to obtain image frame data.
In an embodiment of the present invention, the apparatus further includes:
and the playing module is used for playing the video data obtained by rendering through a preset player.
The embodiment of the invention also provides Flutter-based video rendering equipment, which comprises a processor and a memory, wherein:
the memory is used for storing program codes and transmitting the program codes to the processor;
the processor is configured to execute, according to the instructions in the program code, the Flutter-based video rendering method of the embodiment of the present invention.
The embodiment of the invention also provides a computer-readable storage medium, which is used for storing a program code, and the program code is used for executing the video rendering method based on Flutter in the embodiment of the invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or terminal that comprises the element.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.