
CN117979017A - Video processing method, device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN117979017A
Authority
CN
China
Prior art keywords: video, decoding, data, display, display screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410023744.9A
Other languages
Chinese (zh)
Inventor
梁文博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xiaohongshu Technology Co ltd
Original Assignee
Xiaohongshu Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xiaohongshu Technology Co ltd
Priority to CN202410023744.9A
Publication of CN117979017A
Legal status: Pending

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N 19/42: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation
    • H04N 19/10: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N 19/169: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N 19/186: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, the unit being a colour or a chrominance component
    • H04N 19/44: Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

Embodiments of the present application provide a video processing method, a video processing apparatus, an electronic device, and a storage medium. The method includes: acquiring coding information of a high dynamic range video in the coding process; configuring decoding parameters based on the coding information, and decoding the high dynamic range video according to the decoding parameters to obtain video decoding data, wherein the decoding parameters are related to the coding information; and displaying the video decoding data. With the embodiments of the present application, high dynamic range video can be correctly decoded and displayed.

Description

Video processing method, device, electronic equipment and storage medium
Technical Field
The present application relates to the field of video technologies, and in particular, to a video processing method, apparatus, electronic device, and storage medium.
Background
In the field of mobile devices and video entertainment, users' demand for a high-quality video experience is increasing. High dynamic range (HDR) video brings a more realistic and engaging visual experience to the user by providing a wider color range, higher contrast, and higher brightness.
However, current open source players generally do not support displaying HDR video, and the small number of open source players that do handle HDR video rely on system capabilities, so the HDR video cannot be processed and displayed correctly, which easily causes problems such as color cast and poor display quality.
Disclosure of Invention
The application provides a video processing method, a video processing device, an electronic device and a storage medium, which can correctly decode and display a high dynamic range video.
In a first aspect, an embodiment of the present application provides a video processing method, where the method includes:
Acquiring coding information of a high dynamic range video in a coding process;
configuring decoding parameters based on the coding information, and decoding the high dynamic range video according to the decoding parameters to obtain video decoding data, wherein the decoding parameters are related to the coding information;
And displaying the video decoding data.
By the above embodiment, the terminal device may configure a decoding parameter based on the encoding information of the HDR video in the encoding process, and perform decoding processing on the HDR video according to the decoding parameter to obtain video decoding data, where the decoding parameter is related to the encoding information of the HDR video in the encoding process, and may be used to indicate how to correctly decode the HDR video, so that the terminal device may correctly decode the HDR video according to the decoding parameter, and may further correctly display the HDR video.
In a possible implementation manner of the first aspect, the decoding parameters include: bit depth, color matrix, color conversion relationship, and chromaticity reference;
the decoding processing is performed on the high dynamic range video according to the decoding parameters to obtain video decoding data, including:
performing inverse quantization processing on the high dynamic range video according to the bit depth to obtain first video data;
Mapping the first video data from a YUV color space to a nonlinear RGB color space according to the color matrix to obtain second video data;
Converting the second video data from the nonlinear RGB color space to a linear RGB color space according to the color conversion relationship to obtain third video data;
and performing color correction on the third video data according to the chromaticity reference to obtain the video decoding data.
Through the embodiment, the terminal equipment can decode the HDR video according to the bit depth, the color matrix, the color conversion relation and the chromaticity reference related to the coding information of the HDR video in the coding process, so that the HDR video can be correctly decoded, and correct video decoding data can be obtained.
In a possible implementation manner of the first aspect, the displaying the video decoding data includes:
According to the screen characteristics of a display screen, mapping the video decoding data to obtain fourth video data adaptive to the display screen;
And displaying the fourth video data.
According to the embodiment, before the video is displayed, the terminal device can firstly map the video decoding data according to the screen characteristics of the display screen to obtain the video data matched with the display screen, and then display the video data matched with the display screen, so that the video display effect can be improved, and in addition, the compatibility of the video data can be improved, so that the terminal device can adapt to different devices.
In a possible implementation manner of the first aspect, the screen characteristics of the display screen include: brightness information and color gamut information of the display screen;
The mapping processing is performed on the video decoding data according to the screen characteristics of the display screen to obtain fourth video data adapted to the display screen, including:
Determining tone mapping relation according to the brightness information of the display screen and the brightness information of the high dynamic range video;
determining a color gamut mapping relation according to the color gamut information of the display screen and the color gamut information of the high dynamic range video;
and mapping the video decoding data according to the tone mapping relation and the color gamut mapping relation to obtain fourth video data adaptive to the display screen.
Through the above embodiment, the terminal device may determine the tone mapping relationship according to the luminance information of the display screen and the luminance information of the HDR video, and determine the color gamut mapping relationship according to the color gamut information of the display screen and the color gamut information of the HDR video, and further perform corresponding mapping processing on the video decoding data according to the tone mapping relationship and the color gamut mapping relationship, so that the information such as luminance, tone, color gamut, etc. of the processed video data is adapted to the display screen, thereby improving the video display effect.
In a possible implementation manner of the first aspect, the displaying the video decoding data includes:
And displaying the video decoding data according to a preset display parameter, wherein the display parameter is related to the attribute information of the high dynamic range video.
In a possible implementation manner of the first aspect, the displaying the fourth video data includes:
And displaying the fourth video data according to a preset display parameter, wherein the display parameter is related to the attribute information of the high-dynamic-range video.
By the above embodiment, the terminal device may display decoded data (for example, the video decoding data or the fourth video data) corresponding to the HDR video according to a preset display parameter, where the display parameter is related to attribute information of the HDR video and may be used to indicate how to correctly restore and display the HDR video, so that the terminal device may correctly restore and display the HDR video according to the display parameter.
In a possible implementation manner of the first aspect, the displaying the video decoding data according to a preset display parameter includes:
And displaying the video decoding data according to the preset display parameters and the customized video display effect.
In a possible implementation manner of the first aspect, the displaying the fourth video data according to a preset display parameter includes:
and displaying the fourth video data according to the preset display parameters and the customized video display effect.
Through the embodiment, the terminal equipment expands the customizing capability of video display so that a user can customize the video display effect according to the requirement or preference, and the terminal equipment can display decoded data (such as the video decoding data or the fourth video data) corresponding to the HDR video according to the customized video display effect, so that the customized video display effect can be realized, and the requirement of the user on personalized watching experience can be met.
In a possible implementation manner of the first aspect, the method further includes:
And in the process of displaying the video decoding data, responding to the adjustment instruction of the display parameter, and adjusting the display parameter indicated by the adjustment instruction.
In a possible implementation manner of the first aspect, the method further includes:
and in the process of displaying the fourth video data, responding to the adjustment instruction of the display parameter, and adjusting the display parameter indicated by the adjustment instruction.
Through the embodiment, the terminal equipment expands the dynamic adjustment capability for video display, so that a user can initiate an adjustment instruction in real time according to the needs in the video display process, and the terminal equipment can dynamically adjust the video display effect according to the adjustment instruction, so that the user needs can be flexibly met.
In a second aspect, an embodiment of the present application provides a video processing apparatus, including:
The acquisition unit is used for acquiring the high dynamic range video and the coding information of the high dynamic range video in the coding process;
The decoding unit is used for configuring decoding parameters based on the coding information, and decoding the high dynamic range video according to the decoding parameters to obtain video decoding data, wherein the decoding parameters are related to the coding information;
And the display unit is used for displaying the video decoding data.
By the above embodiment, the video processing device may configure a decoding parameter based on the encoding information of the HDR video in the encoding process, and perform decoding processing on the HDR video according to the decoding parameter to obtain video decoding data, where the decoding parameter is related to the encoding information of the HDR video in the encoding process, and may be used to instruct how to correctly decode the HDR video, so that the video processing device may correctly decode the HDR video according to the decoding parameter, and may further correctly display the HDR video.
In a possible implementation manner of the second aspect, the decoding parameters include: bit depth, color matrix, color conversion relationship, and chromaticity reference;
The decoding unit is specifically configured to, when performing decoding processing on the high dynamic range video according to the decoding parameter to obtain video decoded data:
performing inverse quantization processing on the high dynamic range video according to the bit depth to obtain first video data;
Mapping the first video data from a YUV color space to a nonlinear RGB color space according to the color matrix to obtain second video data;
Converting the second video data from the nonlinear RGB color space to a linear RGB color space according to the color conversion relationship to obtain third video data;
and performing color correction on the third video data according to the chromaticity reference to obtain the video decoding data.
Through the embodiment, the video processing device can decode the HDR video according to the bit depth, the color matrix, the color conversion relation and the chromaticity reference related to the encoding information of the HDR video in the encoding process, so that the HDR video can be correctly decoded, and correct video decoding data can be obtained.
In a possible implementation manner of the second aspect, the display unit is specifically configured to, when displaying the video decoded data:
According to the screen characteristics of a display screen, mapping the video decoding data to obtain fourth video data adaptive to the display screen;
And displaying the fourth video data.
According to the embodiment, before the video is displayed, the video processing device can perform mapping processing on the video decoding data according to the screen characteristics of the display screen to obtain the video data matched with the display screen, and then display the video data matched with the display screen, so that the video display effect can be improved, and in addition, the compatibility of the video data can be improved, so that the video processing device can adapt to different devices.
In a possible implementation manner of the second aspect, the screen characteristics of the display screen include: brightness information and color gamut information of the display screen;
the display unit is specifically configured to, when performing mapping processing on the video decoding data according to a screen characteristic of a display screen to obtain fourth video data adapted to the display screen:
Determining tone mapping relation according to the brightness information of the display screen and the brightness information of the high dynamic range video;
determining a color gamut mapping relation according to the color gamut information of the display screen and the color gamut information of the high dynamic range video;
and mapping the video decoding data according to the tone mapping relation and the color gamut mapping relation to obtain fourth video data adaptive to the display screen.
Through the above embodiment, the video processing device may determine the tone mapping relationship according to the luminance information of the display screen and the luminance information of the HDR video, and determine the color gamut mapping relationship according to the color gamut information of the display screen and the color gamut information of the HDR video, and further perform corresponding mapping processing on the video decoding data according to the tone mapping relationship and the color gamut mapping relationship, so that the information such as the luminance, the tone, the color gamut, and the like of the processed video data is adapted to the display screen, thereby improving the video display effect.
In a possible implementation manner of the second aspect, the display unit is specifically configured to, when displaying the video decoded data:
And displaying the video decoding data according to a preset display parameter, wherein the display parameter is related to the attribute information of the high dynamic range video.
In a possible implementation manner of the second aspect, the display unit is specifically configured to, when displaying the fourth video data:
And displaying the fourth video data according to a preset display parameter, wherein the display parameter is related to the attribute information of the high-dynamic-range video.
By the above embodiment, the video processing device may display decoded data (for example, the video decoding data or the fourth video data) corresponding to the HDR video according to a preset display parameter, where the display parameter is related to attribute information of the HDR video and may be used to indicate how to correctly restore and display the HDR video, so that the video processing device may correctly restore and display the HDR video according to the display parameter.
In a possible implementation manner of the second aspect, the display unit is specifically configured to, when displaying the video decoding data according to a preset display parameter:
And displaying the video decoding data according to the preset display parameters and the customized video display effect.
In a possible implementation manner of the second aspect, the display unit is specifically configured to, when displaying the fourth video data according to a preset display parameter:
and displaying the fourth video data according to the preset display parameters and the customized video display effect.
Through the embodiment, the video processing device expands the customizing capability of video display so that a user can customize the video display effect according to the requirement or preference, and the video processing device can display decoded data (such as the video decoding data or the fourth video data) corresponding to the HDR video according to the customized video display effect, so that the customized video display effect can be realized, and the requirement of the user on personalized watching experience can be met.
In a possible implementation manner of the second aspect, the display unit is further configured to:
And in the process of displaying the video decoding data, responding to the adjustment instruction of the display parameter, and adjusting the display parameter indicated by the adjustment instruction.
In a possible implementation manner of the second aspect, the display unit is further configured to:
and in the process of displaying the fourth video data, responding to the adjustment instruction of the display parameter, and adjusting the display parameter indicated by the adjustment instruction.
Through the embodiment, the video processing device expands the dynamic adjustment capability for video display, so that a user can initiate an adjustment instruction in real time according to the needs in the video display process, and the terminal equipment can dynamically adjust the video display effect according to the adjustment instruction, so that the user needs can be flexibly met.
In a third aspect, an embodiment of the present application provides an electronic device, including a memory and a processor, where the memory stores a computer program, and the processor implements the method in the first aspect and any one of possible implementation manners of the first aspect when executing the computer program.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium having a computer program stored therein, which when executed by a processor, implements the method of the first aspect and any of its possible implementation manners.
In a fifth aspect, embodiments of the present application provide a computer program product comprising a computer program which, when executed by a processor, implements the method of the first aspect and any one of its possible implementation manners.
For the advantages of the third to fifth aspects, reference may be made to the description of the advantages of the first and second aspects, which is not repeated here.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application as claimed.
Drawings
In order to more clearly describe the embodiments of the present application or the technical solutions in the background art, the following description will describe the drawings that are required to be used in the embodiments of the present application or the background art.
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic view of an application environment of a video processing method according to an embodiment of the present application;
fig. 2 is a schematic flow chart of a video processing method according to an embodiment of the present application;
fig. 3 is a schematic diagram of an encoding process and a decoding process of an HDR video according to an embodiment of the present application;
FIG. 4 is a schematic diagram of a rendering link of an HDR video according to an embodiment of the present application;
fig. 5 is a flowchart of another video processing method according to an embodiment of the present application;
Fig. 6 is a schematic structural diagram of a video processing apparatus according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order that those skilled in the art may better understand the present application, the technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments of the application without making any inventive effort shall fall within the scope of the application.
The terms "first," "second," and the like in embodiments of the present application are used for distinguishing between different objects and not for describing a particular sequential order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
It should be understood that in the present application, "at least one (item)" means one or more, "a plurality" means two or more, and "at least two (items)" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may represent: only A exists, only B exists, or both A and B exist, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of the following items" or a similar expression means any combination of these items, including any combination of a single item or a plurality of items. For example, "at least one of a, b or c" may represent: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b and c may be singular or plural.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
In the field of mobile devices and video entertainment, users' demand for a high-quality video experience is increasing. High dynamic range (HDR) video brings a more realistic and engaging visual experience to the user by providing a wider color range, higher contrast, and higher brightness. In an audio and video player application, it is very important to correctly process and render video in various formats; the rendering module is responsible for displaying decoded audio and video frames on the screen, so whether HDR video can be correctly processed and restored is critical.
However, current open source players (such as ijkplayer) generally do not support displaying HDR video, and the small number of open source players that do handle HDR video rely on system capabilities, so the HDR video cannot be processed and displayed correctly, which easily causes problems such as color cast and poor display quality. In addition, current open source players usually focus only on decoding and rendering functions and lack extensibility and compatibility, so they cannot flexibly cope with the requirements of different application scenarios or different users.
Based on the above, the embodiment of the application provides a video processing method, which can correctly display HDR video.
The execution subject of the video processing method may be a video processing apparatus, for example, the video processing method may be executed by a terminal device or other processing device, wherein the terminal device may be, but is not limited to, various personal computers, notebook computers, tablet computers, smart phones, and portable wearable devices. In some possible implementations, the video processing method may be implemented by way of a processor invoking computer readable instructions stored in a memory.
Referring to fig. 1, fig. 1 is a schematic view of an application environment of a video processing method according to an embodiment of the present application. The application environment relates to a terminal device 101 and a server 102, wherein the terminal device 101 can be communicatively connected to the server 102. A player 103 is provided in the terminal device 101.
For example, the user may access a platform capable of providing a video service (referred to as a video platform for short) through the terminal device 101, and the server 102 may be a server where the video platform is located. The terminal device 101 can acquire video from the server 102 and decode and display the video using the player 103.
Referring to fig. 2, fig. 2 is a flowchart of a video processing method according to an embodiment of the present application, where the video processing method can be applied to the terminal device 101 or the player 103 in fig. 1. The following description will take an example in which the video processing method is applied to the terminal apparatus 101 in fig. 1.
As shown in fig. 2, the video processing method includes, but is not limited to, the following steps S201 to S203.
S201, obtaining a high dynamic range video and coding information of the high dynamic range video in a coding process.
High dynamic range (HDR) video here refers to the video to be processed, that is, the format of the video to be processed is the HDR format.
In one example, the terminal device may obtain HDR video to be processed from a video platform server. In another example, the terminal device may also obtain the HDR video to be processed from the local file. It should be appreciated that embodiments of the present application are not limited as to the source of the HDR video to be processed.
The HDR video acquired by the terminal device is an encoded (or compressed) video, and when the terminal device acquires the HDR video, the terminal device can synchronously acquire encoding information of the HDR video in an encoding process, where the encoding information can include parameters used in an encoding process corresponding to the HDR video.
S202, configuring decoding parameters based on the coding information, and decoding the high dynamic range video according to the decoding parameters to obtain video decoding data, wherein the decoding parameters are related to the coding information.
The decoding parameters refer to parameters used for decoding the HDR video. The decoding parameters are related to the encoding information of the HDR video during encoding and may be used to indicate how to correctly decode the HDR video.
Specifically, after the terminal device obtains the HDR video and the encoding information of the HDR video in the encoding process, a decoding parameter is configured based on the encoding information, and decoding processing is performed on the HDR video according to the configured decoding parameter, so that the HDR video can be correctly decoded, thereby obtaining correct video decoding data.
Alternatively, the terminal device may perform decoding processing on the HDR video, which may be hard decoding on the HDR video, that is, performing decoding processing on the HDR video using hardware capabilities of the terminal device (e.g., a hardware resource such as a display card built in the terminal device).
HDR video typically has a high resolution and a high frame rate. Hard decoding the HDR video helps improve playback fluency, so that high-resolution, high-frame-rate HDR video can be played smoothly on the device, improving the user's viewing experience. In addition, hard decoding consumes less power, so the consumption of device resources can be reduced while HDR video playback performance is guaranteed, further reducing the loss of the device's battery life.
S203, displaying the video decoding data.
After the terminal device decodes the HDR video to obtain video decoding data, the video decoding data may be rendered, so that the video decoding data is displayed on a screen for a user to watch.
By the above embodiment, the terminal device may configure a decoding parameter based on the encoding information of the HDR video in the encoding process, and perform decoding processing on the HDR video according to the decoding parameter to obtain video decoding data, where the decoding parameter is related to the encoding information of the HDR video in the encoding process, and may be used to indicate how to correctly decode the HDR video, so that the terminal device may correctly decode the HDR video according to the decoding parameter, and may further correctly display the HDR video.
In one possible implementation, the decoding parameters include: bit depth, color matrix, color conversion relationship, and chromaticity reference. Decoding the high dynamic range video according to the decoding parameters to obtain video decoding data may specifically include: performing inverse quantization processing on the high dynamic range video according to the bit depth to obtain first video data; mapping the first video data from the YUV color space to a nonlinear RGB color space according to the color matrix to obtain second video data; converting the second video data from the nonlinear RGB color space to the linear RGB color space according to the color conversion relationship to obtain third video data; and performing color correction on the third video data according to the chromaticity reference to obtain video decoding data.
Where bit depth refers to the number of binary bits used to store each image pixel. The larger the bit depth, the greater the number of colors that can be supported. For example, assuming that the bit depth of an image is denoted by a, the number of colors that the image can support is 2^a. The bit depth in the decoding parameters may be used to determine the number of colors of the video data. Illustratively, the bit depth of HDR video is 10 bits.
A color matrix (color matrix) is a linear transformation matrix that can be used to adjust the color of an image in a color space. Color matrices can be used to change the contrast, brightness, hue, etc. properties of an image, which can be adjusted for color and appearance by linearly combining the colors of the different channels. For example, a color matrix may be used to determine the conversion matrix used by the mapping of video data from the YUV color space to the RGB color space, or from the RGB color space to the YUV color space. The color matrix in the decoding parameters may be used to map video data from the YUV color space to the non-linear RGB color space.
Color transfer refers to a transfer relationship used in color transfer technology that can be used to transfer the color characteristics of one image to another image. Color transfer techniques adjust the color of a target image to be closer to a source image by analyzing the color distribution and characteristics between the source image and the target image. Illustratively, the color conversion relationship may include gamma values of a gamma curve for determining a conversion relationship used for converting video data from a linear RGB color space to a nonlinear RGB color space or from a nonlinear RGB color space to a linear RGB color space. The color conversion relationship in the decoding parameters may be used to convert video data from a non-linear RGB color space to a linear RGB color space.
The chromaticity reference (color primary) specifies the three primary colors in a color image and their corresponding chromaticity coordinates, and may define the range and characteristics of a color space; for example, the sRGB color space has its own specific chromaticity reference. The chromaticity reference is critical to accurately representing and presenting the colors of an image, and different chromaticity references can affect the color rendering and accuracy of an image. The chromaticity reference in the decoding parameters may be used to determine the final colors of the three RGB primaries when the video data is displayed.
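For illustration only (not part of the claimed method), the following Python sketch shows how decoding parameters of this kind might be assembled from the coding information carried with an HDR stream. The field names and the BT.2020/PQ default strings are assumptions of this example rather than values mandated by the application.

```python
from dataclasses import dataclass

@dataclass
class DecodingParams:
    bit_depth: int          # bits per sample, e.g. 10 for HDR video
    color_matrix: str       # YUV <-> RGB conversion matrix, e.g. "bt2020nc"
    color_transfer: str     # transfer characteristic, e.g. "smpte2084" (PQ) or "arib-std-b67" (HLG)
    color_primaries: str    # chromaticity reference, e.g. "bt2020"

def configure_decoding_params(coding_info: dict) -> DecodingParams:
    """Derive decoding parameters from the metadata written at encoding time."""
    return DecodingParams(
        bit_depth=coding_info.get("bit_depth", 10),
        color_matrix=coding_info.get("color_matrix", "bt2020nc"),
        color_transfer=coding_info.get("color_transfer", "smpte2084"),
        color_primaries=coding_info.get("color_primaries", "bt2020"),
    )
```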
Referring to fig. 3, fig. 3 is a schematic diagram illustrating an encoding process and a decoding process of an HDR video according to an embodiment of the present application, wherein (a) in fig. 3 corresponds to the encoding process of the HDR video, and (b) in fig. 3 corresponds to the decoding process of the HDR video. The encoding information of the HDR video in the encoding process may include: the chromaticity reference (color primary), the color conversion relationship (color transfer), the color matrix (color matrix), and the bit depth (bit depth). After the HDR video encoding is completed, the encoding information may be stored as metadata, from which the decoding parameters used for the subsequent decoding of the HDR video may be derived.
As shown in fig. 3 (a), during the encoding of the HDR video, the scene light captured by a camera may be converted into a video signal via an opto-electronic transfer function (OETF), so as to convert the video data from the linear RGB color space to a nonlinear RGB color space. The video data may then be mapped from the nonlinear RGB color space to the YUV color space by chroma subsampling. The video data may then be quantized to a bit depth of 10 bits to obtain the HDR video.
As shown in fig. 3 (b), during the decoding process of the HDR video, the HDR video may first be inverse quantized to obtain video data in the YUV color space (denoted as first video data). The first video data may then be mapped from the YUV color space to the nonlinear RGB color space by inverse chroma subsampling to obtain second video data. The video signal may then be converted into display light by an electro-optical transfer function (EOTF), so as to convert the second video data from the nonlinear RGB color space to the linear RGB color space and obtain third video data. The third video data may be further color corrected according to the chromaticity reference to obtain the final video decoding data.
Through the embodiment, the terminal equipment can decode the HDR video according to the bit depth, the color matrix, the color conversion relation and the chromaticity reference related to the coding information of the HDR video in the coding process, so that the HDR video can be correctly decoded, and correct video decoding data can be obtained.
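As a minimal sketch of the four decoding stages described above (again for illustration only), the following Python/NumPy code applies inverse quantization, the YUV-to-nonlinear-RGB color matrix, an EOTF, and primary correction. The BT.2020 matrix, the full-range quantization assumption, the plain gamma curve standing in for PQ/HLG, and the identity primary-correction matrix are all assumptions of this example; the real values come from the configured decoding parameters.

```python
import numpy as np

# Example YUV -> RGB matrix (BT.2020, non-constant luminance); the actual matrix
# is selected from the color_matrix field of the coding information.
BT2020_YUV_TO_RGB = np.array([
    [1.0,  0.0,      1.4746],
    [1.0, -0.16455, -0.57135],
    [1.0,  1.8814,   0.0],
])

def dequantize(yuv_int: np.ndarray, bit_depth: int) -> np.ndarray:
    """Step 1: inverse quantization (full-range assumption) -> first video data."""
    max_code = (1 << bit_depth) - 1
    yuv = yuv_int.astype(np.float64) / max_code
    yuv[..., 1:] -= 0.5                     # center the chroma planes around zero
    return yuv

def yuv_to_nonlinear_rgb(yuv: np.ndarray) -> np.ndarray:
    """Step 2: map YUV to non-linear RGB with the color matrix -> second video data."""
    return np.clip(yuv @ BT2020_YUV_TO_RGB.T, 0.0, 1.0)

def nonlinear_to_linear_rgb(rgb: np.ndarray, gamma: float = 2.4) -> np.ndarray:
    """Step 3: EOTF; a plain gamma curve stands in for PQ/HLG here -> third video data."""
    return rgb ** gamma

def correct_primaries(rgb_linear: np.ndarray, primaries_matrix: np.ndarray) -> np.ndarray:
    """Step 4: color correction according to the chromaticity reference -> video decoding data."""
    return np.clip(rgb_linear @ primaries_matrix.T, 0.0, None)

def decode_frame(yuv_int: np.ndarray, params: "DecodingParams") -> np.ndarray:
    """Run the whole pipeline on one frame of shape (H, W, 3) with interleaved Y, U, V."""
    first = dequantize(yuv_int, params.bit_depth)
    second = yuv_to_nonlinear_rgb(first)
    third = nonlinear_to_linear_rgb(second)
    return correct_primaries(third, np.eye(3))  # identity: source already in target primaries
```

The DecodingParams type is the one sketched earlier; a production decoder would of course perform this work in hardware or in optimized native code rather than per pixel in Python.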
In one possible implementation manner, displaying the video decoding data may specifically include: according to the screen characteristics of the display screen, mapping the video decoding data to obtain fourth video data matched with the display screen; and displaying the fourth video data.
Wherein the display screen refers to a screen used by the terminal device to display the HDR video. The screen characteristics of a display screen can be understood as characteristics related to the video display effect of the display screen. For example, the screen characteristics of a display screen may indicate the range of brightness and the range of color gamut that the display screen is capable of supporting.
The properties of the video decoding data (e.g., the brightness range, the color gamut range, etc.) may not match the screen characteristics of the display screen, which may result in poor video display if the video decoding data is directly displayed on the display screen. Therefore, the video decoding data can be mapped according to the screen characteristics of the display screen, so that the video data (marked as fourth video data) matched with the display screen is obtained, and then the fourth video data is displayed, so that the video display effect can be improved.
In addition, the screen characteristics of the display screens of different terminal apparatuses may be different, and the display effect of the same video data on the different display screens may be different. By mapping the video decoding data into the video data matched with the display screen and then displaying the video decoding data, the same video data can obtain better video display effect on different terminal devices, thereby improving the compatibility of the video data.
According to the embodiment, before the video is displayed, the terminal device can firstly map the video decoding data according to the screen characteristics of the display screen to obtain the video data matched with the display screen, and then display the video data matched with the display screen, so that the video display effect can be improved, and in addition, the compatibility of the video data can be improved, so that the terminal device can adapt to different devices.
In one possible implementation, the screen characteristics of the display screen include: displaying brightness information and color gamut information of a screen; according to the screen characteristics of the display screen, mapping the video decoding data to obtain fourth video data adaptive to the display screen, which specifically includes: determining tone mapping relation according to brightness information of a display screen and brightness information of a high dynamic range video; determining a color gamut mapping relation according to the color gamut information of the display screen and the color gamut information of the high dynamic range video; and mapping the video decoding data according to the tone mapping relation and the color gamut mapping relation to obtain fourth video data which is adaptive to the display screen.
The luminance information of the display screen may include a luminance range of the display screen, and a maximum luminance that the display screen can support for display may be determined according to the luminance information of the display screen. The color gamut information of the display screen may include a color gamut range of the display screen, and a maximum color gamut that the display screen can support for display may be determined from the color gamut information of the display screen.
The luminance information of the HDR video may include a luminance range of the HDR video, which may or may not be the same as the luminance range of the display screen. The gamut information of the HDR video may include a gamut range of the HDR video, which may or may not be the same as a gamut range of the display screen.
The tone mapping relationship is used to represent the tone conversion required to map video decoding data to video data adapted to a display screen (i.e., fourth video data). The tone mapping relationship may be determined from luminance information of the display screen and luminance information of the HDR video. Illustratively, the tone mapping relationship may be obtained by fitting a functional relationship between luminance information of the display screen and luminance information of the HDR video.
The color gamut mapping relationship is used to represent the color gamut conversion required to map the video decoding data to video data (i.e., fourth video data) adapted to the display screen. The gamut mapping relationship may be determined from the gamut information of the display screen and the gamut information of the HDR video. Illustratively, the gamut mapping relationship may be obtained by fitting a functional relationship between the gamut information of the display screen and the gamut information of the HDR video.
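The patent does not prescribe how the two mapping relationships are fitted, so the following Python sketch uses stand-ins: an extended Reinhard curve as the tone mapping relationship derived from the two peak-luminance values, and a fixed BT.2020-to-BT.709 matrix as an example color gamut mapping relationship. Both choices are assumptions for illustration.

```python
import numpy as np

def fit_tone_map(video_peak_nits: float, screen_peak_nits: float):
    """Tone mapping relationship determined from the video's and the screen's luminance
    information; an extended Reinhard roll-off stands in for the fitted function."""
    l_white = video_peak_nits / screen_peak_nits    # video peak expressed in screen-peak units
    def tone_map(linear_nits: np.ndarray) -> np.ndarray:
        l = linear_nits / screen_peak_nits
        mapped = l * (1.0 + l / (l_white * l_white)) / (1.0 + l)
        return np.clip(mapped, 0.0, 1.0)            # 1.0 corresponds to the screen's peak
    return tone_map

# Example color gamut mapping relationship: linear BT.2020 RGB -> linear BT.709 RGB.
# The actual matrix would be derived from the video's and the screen's gamut information.
BT2020_TO_BT709 = np.array([
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
])
```

With this construction, content at the video's peak luminance maps exactly to the screen's peak, and everything below rolls off smoothly instead of clipping.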
Referring to fig. 4, fig. 4 is a schematic view of a rendering link of an HDR video according to an embodiment of the present application, where source represents the HDR video and screen represents the display screen. The video data converted by the electro-optical transfer function (EOTF) is used as the video decoding data; optionally, parameter information such as perceptual quantization (PQ) or hybrid log-gamma (HLG) is used in the EOTF conversion process.
As shown in fig. 4, the rendering link may include the following processing: a tone mapping relationship is determined according to the luminance information of the display screen and the luminance information of the HDR video (source/screen luma), and tone mapping (tone map) is performed on the video decoding data according to the tone mapping relationship to obtain first processed data. A color gamut mapping relationship is determined according to the color gamut information of the display screen and the color gamut information of the HDR video (source/screen gamut), and color gamut mapping (gamut map) is performed on the first processed data according to the color gamut mapping relationship to obtain second processed data. Further, the second processed data is color converted (display transfer) according to the color conversion parameter (screen transfer) of the display screen, so that the second processed data is converted from the linear RGB color space to the nonlinear RGB color space, and the final processed data, namely the fourth video data adapted to the display screen, is obtained.
Optionally, the processed data is transferred to a display window (view), where the view refers to the position where the video is finally displayed, and corresponds to a frame buffer. And transmitting the processed data to a frame buffer to display related video pictures.
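A compact sketch of that rendering link follows, chaining tone mapping, gamut mapping, and the display transfer. It assumes the decoded frame is linear RGB expressed in absolute nits, applies the tone curve per channel for simplicity, and uses a plain gamma as the screen transfer; all of these are simplifying assumptions of this example.

```python
import numpy as np

def render_frame(video_decoding_data: np.ndarray,
                 tone_map,                      # callable from fit_tone_map(...)
                 gamut_matrix: np.ndarray,      # e.g. BT2020_TO_BT709 from above
                 screen_gamma: float = 2.2) -> np.ndarray:
    """Rendering link of Fig. 4: tone map -> gamut map -> display transfer."""
    # Tone mapping (per channel here for simplicity) -> first processed data.
    first_processed = tone_map(video_decoding_data)

    # Color gamut mapping with a 3x3 matrix -> second processed data.
    second_processed = np.clip(first_processed @ gamut_matrix.T, 0.0, 1.0)

    # Display transfer: linear RGB -> non-linear RGB using the screen's transfer parameter.
    fourth_video_data = second_processed ** (1.0 / screen_gamma)
    return fourth_video_data                    # handed to the display window / frame buffer
```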
Optionally, after the terminal device displays the HDR video, for example, when the player of the terminal device exits or destroys, the relevant rendering resources, such as view environment, rendering link, etc., may be released in time, so as to ensure that the resources can be released and recovered in time.
Through the above embodiment, the terminal device may determine the tone mapping relationship according to the luminance information of the display screen and the luminance information of the HDR video, and determine the color gamut mapping relationship according to the color gamut information of the display screen and the color gamut information of the HDR video, and further perform corresponding mapping processing on the video decoding data according to the tone mapping relationship and the color gamut mapping relationship, so that the information such as luminance, tone, color gamut, etc. of the processed video data is adapted to the display screen, thereby improving the video display effect.
In one possible implementation manner, displaying the video decoding data may specifically include: and displaying the video decoding data according to a preset display parameter, wherein the display parameter is related to the attribute information of the high dynamic range video.
In a possible implementation manner, displaying the fourth video data may specifically include: and displaying the fourth video data according to a preset display parameter, wherein the display parameter is related to the attribute information of the high dynamic range video.
The display parameter is a parameter used when the fourth video data is displayed in a display window (view). By way of example, the display parameters may include parameters such as color, brightness, and contrast. The attribute information of the HDR video may include, but is not limited to, color, brightness, contrast, and the like of the HDR video. The display parameters are related to attribute information of the HDR video for indicating how to correctly restore and display the HDR video.
Specifically, when the terminal device acquires the HDR video, the terminal device may synchronously acquire attribute information of the HDR video, and configure display parameters according to the attribute information of the HDR video. And the terminal equipment displays the decoded data (such as the video decoding data or the fourth video data) corresponding to the HDR video according to the configured display parameters, so that the HDR video can be correctly restored and displayed.
By the above embodiment, the terminal device may display the decoded data corresponding to the HDR video according to the preset display parameter, where the display parameter is related to the attribute information of the HDR video and may be used to indicate how to correctly restore and display the HDR video, so that the terminal device may correctly restore and display the HDR video according to the display parameter.
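As a hypothetical illustration of such preset display parameters, the sketch below derives a small parameter set from the video's attribute information and applies it to a decoded frame; the field names, defaults, and the brightness/contrast formula are assumptions, not something specified by the application.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class DisplayParams:
    brightness: float = 1.0     # relative brightness gain
    contrast: float = 1.0       # contrast factor around mid-grey

def configure_display_params(attribute_info: dict) -> DisplayParams:
    """Derive preset display parameters from the HDR video's attribute information."""
    return DisplayParams(
        brightness=attribute_info.get("brightness", 1.0),
        contrast=attribute_info.get("contrast", 1.0),
    )

def apply_display_params(frame: np.ndarray, params: DisplayParams) -> np.ndarray:
    """Apply the display parameters to a frame (values in [0, 1]) before it is shown."""
    adjusted = (frame - 0.5) * params.contrast + 0.5
    adjusted = adjusted * params.brightness
    return np.clip(adjusted, 0.0, 1.0)
```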
In one possible implementation manner, the display of the video decoding data according to the preset display parameters may specifically include: and displaying the video decoding data according to the preset display parameters and the customized video display effect.
In a possible implementation manner, the displaying of the fourth video data according to the preset display parameters may specifically include: and displaying the fourth video data according to the preset display parameters and the customized video display effect.
The customized video display effect may be a video display effect customized by a user according to a requirement (or preference). By way of example, user requirements may include, but are not limited to, requirements for brightness, contrast, enhancement processing, and special effects processing.
Specifically, the terminal device may add relevant customizing capability in the video display process to support the user to customize the video display effect. The terminal equipment displays decoded data (such as the video decoding data or the fourth video data) corresponding to the HDR video according to the preset display parameters and the customized video display effect, so that the HDR video can be correctly restored and displayed, the customized video display effect can be realized, and the requirement of a user on personalized watching experience can be met.
Through the embodiment, the terminal equipment expands the customizing capability of video display so that a user can customize the video display effect according to the requirement or preference, and the terminal equipment can display decoded data corresponding to the HDR video according to the customized video display effect, so that the customized video display effect can be realized, and the requirement of the user on personalized watching experience can be met.
In one possible embodiment, in response to an adjustment instruction for a display parameter during display of video decoded data, the adjustment indicated by the adjustment instruction is made for the display parameter.
In one possible embodiment, in the process of displaying the fourth video data, in response to an adjustment instruction for a display parameter, an adjustment indicated by the adjustment instruction is performed for the display parameter.
The adjustment instruction is used to instruct to adjust the display parameter, and may be understood as also being used to instruct to adjust the video display effect. Illustratively, the adjustment indicated by the adjustment instruction may include, but is not limited to, adjusting brightness, contrast, etc. of the video display effect.
Specifically, during video display, a user may initiate an adjustment instruction if he or she wants to change the video display effect. After receiving the adjustment instruction, the terminal device can correspondingly adjust the display parameters, so that dynamic adjustment of the video display effect is realized, and further, the video display effect can be adjusted frame by frame.
Through the embodiment, the terminal equipment expands the dynamic adjustment capability for video display, so that a user can initiate an adjustment instruction in real time according to the needs in the video display process, and the terminal equipment can dynamically adjust the video display effect according to the adjustment instruction, so that the user needs can be flexibly met.
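One possible way to realize this frame-by-frame adjustment, reusing the DisplayParams sketch above, is a small controller that updates its parameters whenever an adjustment instruction arrives and applies the current parameters to the next frame; the dictionary-shaped instruction is an assumption of this example.

```python
class DisplayController:
    """Holds the current display parameters and applies adjustment instructions
    received during playback, so the display effect can change frame by frame."""

    def __init__(self, params: DisplayParams):
        self.params = params

    def handle_adjustment(self, instruction: dict) -> None:
        # Instruction example: {"brightness": 1.2}, initiated by the user during playback.
        for name, value in instruction.items():
            if hasattr(self.params, name):
                setattr(self.params, name, float(value))

    def present(self, frame):
        # Apply whatever the display parameters currently are to this frame.
        return apply_display_params(frame, self.params)
```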
Referring to fig. 5, fig. 5 is a flowchart illustrating another video processing method according to an embodiment of the application. As shown in fig. 5, the video processing method includes, but is not limited to, the following steps S501 to S504.
S501, obtaining coding information of the high dynamic range video in the coding process.
S502, configuring decoding parameters based on the coding information, and decoding the high dynamic range video according to the decoding parameters to obtain video decoding data, wherein the decoding parameters are related to the coding information.
S503, mapping the video decoding data according to the screen characteristics of the display screen to obtain fourth video data adaptive to the display screen.
And S504, displaying the fourth video data according to a preset display parameter, wherein the display parameter is related to attribute information of the high dynamic range video.
It should be understood that, for the specific description of the above steps S501 to S504, reference may be made to the related description in the foregoing embodiments, which is not repeated herein.
By the above embodiment, the terminal device may configure a decoding parameter based on the encoding information of the HDR video in the encoding process, and perform decoding processing on the HDR video according to the decoding parameter to obtain video decoding data, where the decoding parameter is related to the encoding information of the HDR video in the encoding process, and may be used to indicate how to correctly decode the HDR video, so that the terminal device may correctly decode the HDR video according to the decoding parameter, thereby helping to correctly display the HDR video. Before displaying the video, the terminal device can map the video decoding data according to the screen characteristics of the display screen to obtain video data matched with the display screen, and then display the video data matched with the display screen, so that the video display effect can be improved, and in addition, the compatibility of the video data can be improved, so that the terminal device can adapt to different devices. In addition, the terminal device may further display decoded data corresponding to the HDR video according to a preset display parameter, where the display parameter is related to attribute information of the HDR video and may be used to indicate how to correctly restore and display the HDR video, so that the terminal device may correctly restore and display the HDR video according to the display parameter.
The video processing method according to the embodiment of the present application is described below taking a scenario in which an iOS device plays an HDR video of a video recommendation page as an example. The iOS device refers to a device using an iOS operating system, such as an iPad or iPhone device. The video recommendation page may be a video recommendation page in a platform capable of providing video services (e.g., a content sharing platform, a social platform, or a video playback platform, etc.), in which one or more video covers (or thumbnails) may be displayed.
When a user clicks on a certain video cover in the video recommendation page through the iOS device, the iOS device starts to create a player and performs the following steps: acquiring an HDR video corresponding to the video cover, coding information of the HDR video in a coding process and attribute information of the HDR video; configuring decoding parameters based on the encoding information, and configuring display parameters based on the attribute information; decoding the HDR video according to the decoding parameters by utilizing hardware resources of the iOS equipment to obtain video decoding data; mapping the video decoding data according to the screen characteristics of the display screen of the iOS equipment to obtain video data which is matched with the display screen of the iOS equipment; and playing and displaying the video data matched with the display screen of the iOS equipment according to the display parameters.
Through the above embodiments, on the one hand, decoding parameters are configured based on encoding information in the encoding process of the HDR video, and the HDR video can be correctly decoded according to the decoding parameters; on the other hand, the hardware resources of the iOS equipment are utilized to decode the HDR video according to the decoding parameters, so that the hardware acceleration capability of the iOS equipment can be fully exerted, and the playing fluency of the HDR video can be further improved on the basis of correct decoding; in still another aspect, mapping processing is performed on the video decoding data according to the screen characteristics of the display screen of the iOS device, and video data adapted to the display screen of the iOS device is obtained and then displayed, so that the video display effect can be improved, and the video decoding data can be adapted to the iOS device; in yet another aspect, display parameters are configured based on attribute information of an HDR video, according to which the HDR video can be correctly restored.
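Tying the earlier sketches together, the following illustrative Python outlines the playback flow described above. fetch_video_and_metadata(), show(), and the "peak_nits" keys are hypothetical placeholders, and on a real iOS device the per-frame decoding would be done by the platform's hardware decoder rather than in Python.

```python
def play_hdr_video(video_url: str, screen_info: dict) -> None:
    """Illustrative end-to-end flow: fetch, configure, decode, adapt, display."""
    hdr_frames, coding_info, attribute_info = fetch_video_and_metadata(video_url)

    decoding_params = configure_decoding_params(coding_info)            # from coding information
    controller = DisplayController(configure_display_params(attribute_info))
    tone_map = fit_tone_map(coding_info["peak_nits"], screen_info["peak_nits"])

    for yuv_frame in hdr_frames:            # quantized YUV frames; bitstream parsing omitted
        decoded = decode_frame(yuv_frame, decoding_params)              # video decoding data
        linear_nits = decoded * coding_info["peak_nits"]                # scale normalized light to nits
        adapted = render_frame(linear_nits, tone_map, BT2020_TO_BT709)  # fourth video data
        show(controller.present(adapted))                               # hand to the display window
```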
It will be appreciated by those skilled in the art that, in the methods of the above specific embodiments, the order in which the steps are written does not imply a strict order of execution; the actual execution order of the steps should be determined by their functions and possible internal logic.
The description of each of the above embodiments has its own emphasis. For parts that are not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments, and details are not repeated here for brevity.
Referring to fig. 6, fig. 6 is a schematic structural diagram of a video processing apparatus according to an embodiment of the present application, and the video processing apparatus 600 includes: an acquisition unit 601, a decoding unit 602, and a display unit 603, wherein:
an acquisition unit 601, configured to acquire a high dynamic range video and the encoding information of the high dynamic range video in the encoding process;
a decoding unit 602, configured to configure decoding parameters based on the encoding information, and decode the high dynamic range video according to the decoding parameters to obtain video decoding data, where the decoding parameters are related to the encoding information;
and a display unit 603, configured to display the video decoding data.
Through the above embodiment, the video processing device may configure decoding parameters based on the encoding information of the HDR video in the encoding process, and decode the HDR video according to the decoding parameters to obtain video decoding data. Because the decoding parameters are related to the encoding information of the HDR video in the encoding process and can be used to indicate how to correctly decode the HDR video, the video processing device can decode the HDR video correctly according to the decoding parameters, so that the HDR video can be displayed correctly.
In some possible implementations, the decoding parameters include: bit depth, color matrix, color conversion relationship, and chromaticity reference; the decoding unit 602 is specifically configured to, when decoding the high dynamic range video according to the decoding parameters to obtain the video decoding data: performing inverse quantization processing on the high dynamic range video according to the bit depth to obtain first video data; mapping the first video data from a YUV color space to a nonlinear RGB color space according to the color matrix to obtain second video data; converting the second video data from the nonlinear RGB color space to a linear RGB color space according to the color conversion relationship to obtain third video data; and performing color correction on the third video data according to the chromaticity reference to obtain the video decoding data.
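As an illustration of these four steps, the numpy sketch below assumes a 10-bit, limited-range stream that uses the BT.2020 non-constant-luminance color matrix and the PQ (SMPTE ST 2084) transfer function, with chroma already upsampled to the luma resolution; an actual decoder would take these choices from the configured decoding parameters rather than hard-coding them, and the final color-correction matrix is shown only as an identity placeholder.

```python
import numpy as np

def decode_pipeline(y_code, cb_code, cr_code, bit_depth=10):
    """Illustrative decode of one frame given as integer code-value arrays of equal shape."""
    # 1. Inverse quantization: map limited-range code values to normalized floats.
    scale = 2 ** (bit_depth - 8)                                   # 10-bit: 4
    y  = (y_code.astype(np.float64)  - 16 * scale)  / (219 * scale)
    cb = (cb_code.astype(np.float64) - 128 * scale) / (224 * scale)
    cr = (cr_code.astype(np.float64) - 128 * scale) / (224 * scale)

    # 2. Color matrix: YUV -> nonlinear RGB (BT.2020 NCL coefficients assumed here).
    kr, kb = 0.2627, 0.0593
    r = y + 2 * (1 - kr) * cr
    b = y + 2 * (1 - kb) * cb
    g = (y - kr * r - kb * b) / (1 - kr - kb)
    rgb_nonlinear = np.clip(np.stack([r, g, b], axis=-1), 0.0, 1.0)

    # 3. Color conversion relationship: nonlinear -> linear RGB via the PQ EOTF,
    #    giving absolute luminance in nits (cd/m^2).
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    e = rgb_nonlinear ** (1 / m2)
    rgb_linear = 10000.0 * (np.maximum(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

    # 4. Color correction against the chromaticity reference; identity placeholder here,
    #    in practice a 3x3 matrix derived from the signalled primaries.
    correction = np.eye(3)
    return rgb_linear @ correction.T
```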
Through the embodiment, the video processing device can decode the HDR video according to the bit depth, the color matrix, the color conversion relation and the chromaticity reference related to the encoding information of the HDR video in the encoding process, so that the HDR video can be correctly decoded, and correct video decoding data can be obtained.
In some possible embodiments, the display unit 603 is specifically configured to, when displaying the video decoding data: mapping the video decoding data according to the screen characteristics of the display screen to obtain fourth video data adapted to the display screen; and displaying the fourth video data.
Through the above embodiment, before displaying the video, the video processing device can map the video decoding data according to the screen characteristics of the display screen to obtain video data adapted to the display screen, and then display the adapted video data, which can improve the video display effect and, in addition, can improve the compatibility of the video data so that it can be adapted to different devices.
In some possible implementations, the screen characteristics of the display screen include: brightness information and color gamut information of the display screen; the display unit 603 is specifically configured to, when mapping the video decoding data according to the screen characteristics of the display screen to obtain fourth video data adapted to the display screen: determining a tone mapping relationship according to the brightness information of the display screen and the brightness information of the high dynamic range video; determining a color gamut mapping relationship according to the color gamut information of the display screen and the color gamut information of the high dynamic range video; and mapping the video decoding data according to the tone mapping relationship and the color gamut mapping relationship to obtain fourth video data adapted to the display screen.
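For illustration, the sketch below derives a simple tone mapping relationship from the two peak brightness values (an extended Reinhard curve is used purely as an example; the embodiment does not prescribe a particular curve) and a color gamut mapping relationship from a fixed BT.2020-to-BT.709 primaries conversion followed by clipping. A real implementation would build both relationships from the actual brightness and color gamut information of the screen and the video.

```python
import numpy as np

# Linear-light BT.2020 -> BT.709 primaries conversion (one possible gamut mapping target).
BT2020_TO_BT709 = np.array([[ 1.6605, -0.5876, -0.0728],
                            [-0.1246,  1.1329, -0.0083],
                            [-0.0182, -0.1006,  1.1187]])

def adapt_to_display(rgb_nits, display_peak_nits=600.0, video_peak_nits=1000.0):
    """Map decoded linear-light RGB (in nits) to the display's brightness and gamut."""
    # Tone mapping relationship: an extended Reinhard curve that maps the video's
    # peak brightness exactly onto the display's peak brightness.
    l = rgb_nits / display_peak_nits
    l_white = video_peak_nits / display_peak_nits
    tone_mapped = display_peak_nits * l * (1.0 + l / (l_white ** 2)) / (1.0 + l)

    # Color gamut mapping relationship: convert primaries, then clip the components
    # the narrower display gamut cannot represent.
    in_display_gamut = tone_mapped @ BT2020_TO_BT709.T
    return np.clip(in_display_gamut, 0.0, display_peak_nits)
```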
Through the above embodiment, the video processing device may determine the tone mapping relationship according to the brightness information of the display screen and the brightness information of the HDR video, determine the color gamut mapping relationship according to the color gamut information of the display screen and the color gamut information of the HDR video, and then perform the corresponding mapping processing on the video decoding data according to the tone mapping relationship and the color gamut mapping relationship, so that the brightness, tone, color gamut, and other characteristics of the processed video data are adapted to the display screen, thereby improving the video display effect.
In some possible embodiments, the display unit 603 is specifically configured to, when displaying the video decoding data: displaying the video decoding data according to a preset display parameter, wherein the display parameter is related to the attribute information of the high dynamic range video.
In some possible embodiments, the display unit 603 is specifically configured to, when displaying the fourth video data: displaying the fourth video data according to a preset display parameter, wherein the display parameter is related to the attribute information of the high dynamic range video.
By the above embodiment, the video processing device may display decoded data (for example, the video decoding data or the fourth video data) corresponding to the HDR video according to a preset display parameter, where the display parameter is related to attribute information of the HDR video and may be used to indicate how to correctly restore and display the HDR video, so that the video processing device may correctly restore and display the HDR video according to the display parameter.
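As a sketch of what such preconfigured display parameters might look like, the structure below mirrors common HDR static metadata fields (mastering display peak brightness, maximum content light level, transfer characteristic); the field names and default values are illustrative assumptions, not the metadata layout actually used by the embodiment.

```python
from dataclasses import dataclass

@dataclass
class DisplayParams:
    """Display parameters derived from the attribute information of the HDR video."""
    mastering_peak_nits: float   # peak brightness of the mastering display
    max_cll_nits: float          # maximum content light level
    transfer: str                # transfer characteristic, e.g. "PQ" or "HLG"

def configure_display_params(attribute_info: dict) -> DisplayParams:
    # Fall back to conservative defaults when a field is absent from the metadata.
    return DisplayParams(
        mastering_peak_nits=attribute_info.get("mastering_peak", 1000.0),
        max_cll_nits=attribute_info.get("max_cll", 1000.0),
        transfer=attribute_info.get("transfer", "PQ"),
    )

params = configure_display_params({"mastering_peak": 4000.0, "max_cll": 1200.0})
```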
In some possible embodiments, the display unit 603 is specifically configured to, when displaying the video decoding data according to the preset display parameter: displaying the video decoding data according to the preset display parameter and the customized video display effect.
In some possible embodiments, the display unit 603 is specifically configured to, when displaying the fourth video data according to the preset display parameter: displaying the fourth video data according to the preset display parameter and the customized video display effect.
Through the above embodiment, the video processing device extends the customization capability of video display, so that a user can customize the video display effect according to their requirements or preferences, and the video processing device can display the decoded data corresponding to the HDR video (for example, the video decoding data or the fourth video data) according to the customized video display effect, thereby realizing a customized video display effect and meeting the user's demand for a personalized viewing experience.
In some possible implementations, the display unit 603 is further configured to: in the process of displaying the video decoding data, in response to an adjustment instruction for the display parameters, adjust the display parameter indicated by the adjustment instruction.
In some possible implementations, the display unit 603 is further configured to: in the process of displaying the fourth video data, in response to an adjustment instruction for the display parameters, adjust the display parameter indicated by the adjustment instruction.
Through the above embodiment, the video processing device extends the dynamic adjustment capability of video display, so that a user can issue an adjustment instruction at any time during video display as needed, and the video processing device can dynamically adjust the video display effect according to the adjustment instruction, thereby flexibly meeting the user's needs.
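A minimal sketch of this dynamic-adjustment behaviour is given below; the parameter names and the brightness scaling applied in present() are illustrative assumptions, and a real player would route the adjustment instruction into its rendering pipeline.

```python
import numpy as np

class DisplayController:
    """Holds the current display parameters and applies them to frames as they are shown;
    adjustments received during playback take effect on subsequent frames."""

    def __init__(self, params):
        self.params = dict(params)

    def on_adjust_instruction(self, name, value):
        # Adjust only the display parameter indicated by the adjustment instruction.
        if name in self.params:
            self.params[name] = value

    def present(self, frame):
        # Illustrative: scale the frame by the current brightness factor before display.
        return np.asarray(frame) * self.params.get("brightness", 1.0)

controller = DisplayController({"brightness": 1.0, "saturation": 1.0})
controller.on_adjust_instruction("brightness", 1.2)   # user raises brightness mid-playback
frame_out = controller.present(np.ones((4, 4, 3)))    # later frames pick up the new value
```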
For specific limitations on the video processing apparatus, reference may be made to the limitations on the video processing method above, and no further description is given here. The respective units in the video processing apparatus described above may be implemented in whole or in part by software, hardware, or a combination thereof. The units may be embedded in, or independent of, a processor in a computer device in the form of hardware, or may be stored in a memory of the computer device in the form of software, so that the processor can invoke and execute the operations corresponding to the units.
Referring to fig. 7, fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the application. The electronic device 700 includes a memory 701 and a processor 702. Optionally, the electronic device 700 further comprises a communication interface 703 and a bus 704. The memory 701, the processor 702 and the communication interface 703 realize communication connection with each other through a bus 704. The memory 701 stores a computer program, and the processor 702 is configured to execute the computer program stored in the memory 701 to implement the method in the above-described method embodiments.
The embodiment of the application also provides a computer readable storage medium, wherein a computer program is stored in the computer readable storage medium, and the computer program realizes the method in the above method embodiments when being executed by a processor.
The present application also provides a computer program product comprising a computer program which, when executed by a processor, implements the method of the method embodiments described above.
It should be appreciated that the memory/readable storage medium in the embodiments of the present application may be volatile memory or nonvolatile memory, or may include both volatile and nonvolatile memory. The nonvolatile memory may be a hard disk drive (HDD), a solid state disk (SSD), a read-only memory (ROM), a flash memory, or the like. The volatile memory may be a random access memory (RAM) or an external cache memory. By way of illustration and not limitation, the RAM may take various forms, such as a static random access memory (SRAM) or a dynamic random access memory (DRAM).
The processor in the embodiments of the present application may be one or a combination of processing modules such as a central processing unit (CPU), a graphics processing unit (GPU), or a microprocessor unit (MPU), and may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
The technical features of the above embodiments may be combined arbitrarily. For brevity of description, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction in a combination of these technical features, the combination should be considered to fall within the scope of this description.
The above examples express only a few embodiments of the application; although they are described in detail, they are not to be construed as limiting the scope of the application. It should be noted that those skilled in the art can make several variations and modifications without departing from the spirit of the application, and these all fall within the protection scope of the application.

Claims (11)

1. A method of video processing, the method comprising:
acquiring coding information of a high dynamic range video in a coding process;
configuring decoding parameters based on the coding information, and decoding the high dynamic range video according to the decoding parameters to obtain video decoding data, wherein the decoding parameters are related to the coding information;
and displaying the video decoding data.
2. The method of claim 1, wherein the decoding parameters comprise: bit depth, color matrix, color conversion relationship, and chromaticity reference;
wherein decoding the high dynamic range video according to the decoding parameters to obtain the video decoding data comprises:
performing inverse quantization processing on the high dynamic range video according to the bit depth to obtain first video data;
mapping the first video data from a YUV color space to a nonlinear RGB color space according to the color matrix to obtain second video data;
converting the second video data from the nonlinear RGB color space to a linear RGB color space according to the color conversion relationship to obtain third video data;
and performing color correction on the third video data according to the chromaticity reference to obtain the video decoding data.
3. The method of claim 1, wherein displaying the video decoding data comprises:
mapping the video decoding data according to screen characteristics of a display screen to obtain fourth video data adapted to the display screen;
and displaying the fourth video data.
4. A method according to claim 3, wherein the screen characteristics of the display screen include: brightness information and color gamut information of the display screen;
wherein mapping the video decoding data according to the screen characteristics of the display screen to obtain the fourth video data adapted to the display screen comprises:
determining a tone mapping relationship according to the brightness information of the display screen and the brightness information of the high dynamic range video;
determining a color gamut mapping relationship according to the color gamut information of the display screen and the color gamut information of the high dynamic range video;
and mapping the video decoding data according to the tone mapping relationship and the color gamut mapping relationship to obtain the fourth video data adapted to the display screen.
5. A method according to claim 3, wherein said displaying said fourth video data comprises:
and displaying the fourth video data according to a preset display parameter, wherein the display parameter is related to the attribute information of the high dynamic range video.
6. The method of claim 5, wherein displaying the fourth video data according to the preset display parameter comprises:
and displaying the fourth video data according to the preset display parameters and the customized video display effect.
7. The method of claim 6, wherein the method further comprises:
and in the process of displaying the fourth video data, responding to the adjustment instruction of the display parameter, and adjusting the display parameter indicated by the adjustment instruction.
8. A video processing apparatus, the apparatus comprising:
an acquisition unit, used for acquiring the high dynamic range video and the coding information of the high dynamic range video in the coding process;
a decoding unit, used for configuring decoding parameters based on the coding information, and decoding the high dynamic range video according to the decoding parameters to obtain video decoding data, wherein the decoding parameters are related to the coding information;
and a display unit, used for displaying the video decoding data.
9. An electronic device, comprising a memory and a processor, wherein the memory stores a computer program, and the processor implements the method of any one of claims 1 to 7 when executing the computer program.
10. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program which, when executed by a processor, implements the method of any of claims 1 to 7.
11. A computer program product, characterized in that the computer program product comprises a computer program which, when executed by a processor, implements the method of any one of claims 1 to 7.
CN202410023744.9A 2024-01-08 2024-01-08 Video processing method, device, electronic equipment and storage medium Pending CN117979017A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410023744.9A CN117979017A (en) 2024-01-08 2024-01-08 Video processing method, device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410023744.9A CN117979017A (en) 2024-01-08 2024-01-08 Video processing method, device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117979017A true CN117979017A (en) 2024-05-03

Family

ID=90858793

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410023744.9A Pending CN117979017A (en) 2024-01-08 2024-01-08 Video processing method, device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117979017A (en)

Similar Documents

Publication Publication Date Title
EP3465673B1 (en) Transitioning between video priority and graphics priority
US9984446B2 (en) Video tone mapping for converting high dynamic range (HDR) content to standard dynamic range (SDR) content
KR102617258B1 (en) Image processing method and apparatus
JP6946325B2 (en) Methods and devices for encoding high dynamic range pictures, corresponding coding methods and devices
US20180005357A1 (en) Method and device for mapping a hdr picture to a sdr picture and corresponding sdr to hdr mapping method and device
CN108460732B (en) Method and computing device for processing images
US20180005358A1 (en) A method and apparatus for inverse-tone mapping a picture
US8878867B2 (en) Transparency information in image or video format not natively supporting transparency
JP2017522794A (en) Method and apparatus for signaling in a bitstream the picture / video format of an LDR picture and the picture / video format of a decoded HDR picture obtained from the LDR picture and the illumination picture
CN114501023B (en) Video processing method, device, computer equipment and storage medium
US10573279B2 (en) Systems and methods for combining video and graphic sources for display
KR20200095651A (en) An electronic device for playing high dynamic range content and method thereof
US11094286B2 (en) Image processing apparatus and image processing method
US10423587B2 (en) Systems and methods for rendering graphical assets
CN117979017A (en) Video processing method, device, electronic equipment and storage medium
WO2024097135A1 (en) High dynamic range video formats with low dynamic range compatibility
WO2024023008A1 (en) Method for preventing clipping in sl-hdrx systems
CN117014692A (en) Screen projection method and related device
WO2024096931A1 (en) High dynamic range image format with low dynamic range compatibility
WO2023194089A1 (en) Method for correcting sdr pictures in a sl-hdr1 system
WO2023150074A1 (en) Beta scale dynamic display mapping
CN118781033A (en) Image processing method, image processing apparatus, electronic device, and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination