
CN116962688A - Loop filtering method, video encoding and decoding method, device, medium and electronic equipment - Google Patents


Info

Publication number
CN116962688A
Authority
CN
China
Prior art keywords
unit
aps
image frame
video image
filtering
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210444332.3A
Other languages
Chinese (zh)
Inventor
张瀚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202210444332.3A
Priority to PCT/CN2022/137900 (WO2023202097A1)
Publication of CN116962688A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/102 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or selection affected or controlled by the adaptive coding
    • H04N19/117 Filters, e.g. for pre-processing or post-processing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/80 Details of filtering operations specially adapted for video compression, e.g. for pixel interpolation
    • H04N19/82 Details of filtering operations specially adapted for video compression, e.g. for pixel interpolation involving filtering within a prediction loop

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

An embodiment of the application provides a loop filtering method, a video encoding and decoding method, a device, a medium, and electronic equipment. The loop filtering method includes the following steps: obtaining a reconstruction component corresponding to a video image frame; filtering the reconstruction component by a loop filtering unit, where the loop filtering unit includes a deblocking filtering (DF) unit, a specified loop filter, and an adaptive loop filtering (ALF) unit connected in sequence, and the input of the ALF unit further includes at least one of: the signal input to the DF unit and the output signal of the DF unit; and taking the output of the loop filtering unit as a reconstructed image corresponding to the video image frame. The technical scheme of the embodiments of the application can improve the filtering effect and filtering quality of the ALF and thus help improve encoding and decoding performance.

Description

Loop filtering method, video encoding and decoding method, device, medium and electronic equipment
Technical Field
The present application relates to the field of computer and communication technologies, and in particular, to a loop filtering method, a video encoding and decoding method, a device, a medium, and electronic equipment.
Background
ALF (Adaptive Loop Filtering) is a Wiener filter that exploits the correlation between a pixel and the surrounding pixels covered by the filter template to derive optimal coefficients, improving the quality of the ALF-filtered pixel and reducing its difference from the original pixel. However, in the related art the ALF uses a fixed template and takes the output of a single loop filter as its only input, which limits the performance of the ALF itself and restricts further improvement of the filtering effect.
Disclosure of Invention
An embodiment of the application provides a loop filtering method, a video encoding and decoding method, a device, a medium, and electronic equipment, which can improve the filtering effect and filtering quality of the ALF and thus help improve encoding and decoding performance.
Other features and advantages of the application will be apparent from the following detailed description, or may be learned by the practice of the application.
According to an aspect of an embodiment of the present application, there is provided a loop filtering method, including: obtaining a reconstruction component corresponding to a video image frame; filtering the reconstruction component by a loop filtering unit, where the loop filtering unit includes a deblocking filtering (DF) unit, a specified loop filter, and an adaptive loop filtering (ALF) unit connected in sequence, and the input of the ALF unit further includes at least one of: the signal input to the DF unit and the output signal of the DF unit; and taking the output of the loop filtering unit as a reconstructed image corresponding to the video image frame.
According to an aspect of an embodiment of the present application, there is provided a video decoding method including: the method comprises the steps of performing filtering processing on a reconstruction component corresponding to a video image frame through a loop filtering method in the embodiment of the application to obtain a reconstruction image corresponding to the video image frame; and taking the reconstructed image corresponding to the video image frame as a video image obtained by decoding.
According to an aspect of an embodiment of the present application, there is provided a video encoding method including: acquiring a video image frame to be encoded; the reconstruction components corresponding to the video image frames to be encoded are subjected to filtering processing by the loop filtering method in the embodiment of the application, so that the reconstruction images corresponding to the video image frames to be encoded are obtained.
According to an aspect of an embodiment of the present application, there is provided a loop filtering apparatus, including: an acquisition unit configured to acquire a reconstruction component corresponding to a video image frame; and a filtering unit configured to filter the reconstruction component by a loop filtering unit, where the loop filtering unit includes a deblocking filtering (DF) unit, a specified loop filter, and an adaptive loop filtering (ALF) unit connected in sequence, the input of the ALF unit further includes at least one of: the signal input to the DF unit and the output signal of the DF unit, and the output of the loop filtering unit is taken as a reconstructed image corresponding to the video image frame.
In some embodiments of the application, based on the foregoing, if the input of the ALF unit includes the signal input to the DF unit, the filtering unit is configured to: if the video image frame, or the current slice in the video image frame, does not turn on the DF unit, restrict the video image frame or the current slice from referencing a first adaptive parameter set (APS) during filtering; the first APS includes a first flag bit, and the value of the first flag bit included in the first APS is used to indicate that the first APS contains filter coefficients at a first corresponding position, the first corresponding position being the coefficient position corresponding to the signal input to the DF unit.
In some embodiments of the application, based on the foregoing, if the input of the ALF unit includes the signal input to the DF unit, the filtering unit is configured to: if the video image frame or the current slice turns on the DF unit, restrict the video image frame or the current slice from referencing a second APS during filtering; the second APS includes a first flag bit, and the value of the first flag bit included in the second APS is used to indicate that the filter coefficients at the first corresponding position are defaulted (absent) in the second APS.
In some embodiments of the application, based on the foregoing, if the input of the ALF unit includes the signal input to the DF unit, the filtering unit is configured to: if the video image frame, or the current slice in the video image frame, does not turn on the DF unit, and the video image frame or the current slice references a first APS, set the filter coefficients at the first corresponding position in the first APS to 0 when the first APS is applied to the video image frame or the current slice for filtering; the first APS includes a first flag bit, and the value of the first flag bit included in the first APS is used to indicate that the first APS contains filter coefficients at the first corresponding position, the first corresponding position being the coefficient position corresponding to the signal input to the DF unit.
In some embodiments of the application, based on the foregoing, if the input of the ALF unit includes the signal input to the DF unit, the filtering unit is configured to: if the video image frame or the current slice turns on the DF unit, and the video image frame or the current slice references a second APS, fill the filter coefficients at the first corresponding position in the second APS with 0 when the second APS is applied to the video image frame or the current slice for filtering; the second APS includes a first flag bit, and the value of the first flag bit included in the second APS is used to indicate that the filter coefficients at the first corresponding position are defaulted (absent) in the second APS.
In some embodiments of the present application, based on the foregoing, if the input of the ALF unit includes the output signal of the DF unit, the filtering unit is configured to: if the video image frame, or the current slice in the video image frame, does not turn on the specified loop filter, restrict the video image frame or the current slice from referencing a third APS during filtering; the third APS includes a second flag bit, and the value of the second flag bit included in the third APS is used to indicate that the third APS contains filter coefficients at a second corresponding position, the second corresponding position being the coefficient position corresponding to the output signal of the DF unit.
In some embodiments of the present application, based on the foregoing, if the input of the ALF unit includes the output signal of the DF unit, the filtering unit is configured to: if the video image frame or the current slice turns on the specified loop filter, restrict the video image frame or the current slice from referencing a fourth APS during filtering; the fourth APS includes a second flag bit, and the value of the second flag bit included in the fourth APS is used to indicate that the filter coefficients at the second corresponding position are defaulted (absent) in the fourth APS.
In some embodiments of the present application, based on the foregoing, if the input of the ALF unit includes the output signal of the DF unit, the filtering unit is configured to: if the video image frame, or the current slice in the video image frame, does not turn on the specified loop filter, and the video image frame or the current slice references a third APS, set the filter coefficients at the second corresponding position in the third APS to 0 when the third APS is applied to the video image frame or the current slice for filtering; the third APS includes a second flag bit, and the value of the second flag bit included in the third APS is used to indicate that the third APS contains filter coefficients at the second corresponding position, the second corresponding position being the coefficient position corresponding to the output signal of the DF unit.
In some embodiments of the present application, based on the foregoing, if the input of the ALF unit includes the output signal of the DF unit, the filtering unit is configured to: if the video image frame or the current slice turns on the specified loop filter, and the video image frame or the current slice references a fourth APS, fill the filter coefficients at the second corresponding position in the fourth APS with 0 when the fourth APS is applied to the video image frame or the current slice for filtering; the fourth APS includes a second flag bit, and the value of the second flag bit included in the fourth APS is used to indicate that the filter coefficients at the second corresponding position are defaulted (absent) in the fourth APS.
In some embodiments of the present application, based on the foregoing, if the input of the ALF unit includes the signal input to the DF unit, the loop filtering apparatus further includes: a processing unit configured to: if the video image frame, or the current slice in the video image frame, does not turn on the DF unit, set the filter coefficients at the first corresponding position to 0 in the APS containing the filter coefficients of the video image frame or the current slice, the first corresponding position being the coefficient position corresponding to the signal input to the DF unit.
In some embodiments of the application, based on the foregoing, if the input of the ALF unit includes the signal input to the DF unit, the filtering unit is configured to: if the video image frame, or the current slice in the video image frame, does not turn on the DF unit, set the filter coefficients at the first corresponding position in the APS referenced by the video image frame or the current slice to 0 when that APS is applied to the video image frame or the current slice for filtering, the first corresponding position being the coefficient position corresponding to the signal input to the DF unit.
In some embodiments of the present application, based on the foregoing, if the input of the ALF unit includes the output signal of the DF unit, the loop filtering apparatus further includes: a processing unit configured to: if the video image frame, or the current slice in the video image frame, does not turn on the specified loop filter, set the filter coefficients at a second corresponding position to 0 in the APS containing the filter coefficients of the video image frame or the current slice, the second corresponding position being the coefficient position corresponding to the output signal of the DF unit.
In some embodiments of the present application, based on the foregoing, if the input of the ALF unit includes the output signal of the DF unit, the filtering unit is configured to: if the video image frame, or the current slice in the video image frame, does not turn on the specified loop filter, set the filter coefficients at the second corresponding position in the APS referenced by the video image frame or the current slice to 0 when that APS is applied to the video image frame or the current slice for filtering, the second corresponding position being the coefficient position corresponding to the output signal of the DF unit.
In some embodiments of the application, based on the foregoing, the specified loop filter includes at least one of: a bilateral filter, a sample adaptive offset filter, and a cross-component sample adaptive offset filter.
According to an aspect of an embodiment of the present application, there is provided a video decoding apparatus, including: an acquisition unit configured to acquire a reconstruction component corresponding to a video image frame; a filtering unit configured to filter the reconstruction component by a loop filtering unit, where the loop filtering unit includes a deblocking filtering (DF) unit, a specified loop filter, and an adaptive loop filtering (ALF) unit connected in sequence, the input of the ALF unit further includes at least one of: the signal input to the DF unit and the output signal of the DF unit, and the output of the loop filtering unit is taken as a reconstructed image corresponding to the video image frame; and a decoding unit configured to take the reconstructed image corresponding to the video image frame as the decoded video image.
According to an aspect of an embodiment of the present application, there is provided a video encoding apparatus, including: an acquisition unit configured to acquire a video image frame to be encoded and a reconstruction component corresponding to the video image frame to be encoded; and a filtering unit configured to filter the reconstruction component corresponding to the video image frame to be encoded by a loop filtering unit, where the loop filtering unit includes a deblocking filtering (DF) unit, a specified loop filter, and an adaptive loop filtering (ALF) unit connected in sequence, the input of the ALF unit further includes at least one of: the signal input to the DF unit and the output signal of the DF unit, and the output of the loop filtering unit is taken as a reconstructed image corresponding to the video image frame to be encoded.
In some embodiments of the present application, based on the foregoing, if the input of the ALF unit includes the signal input to the DF unit, the video encoding apparatus further includes: an encoding unit configured to add a first flag bit to the APS containing the filter coefficients of the video image frame or the current slice according to whether the video image frame or the current slice in the video image frame turns on the DF unit, where the value of the first flag bit is used to indicate whether the filter coefficients at a first corresponding position are defaulted (absent) in the APS, the first corresponding position being the coefficient position corresponding to the signal input to the DF unit.
In some embodiments of the application, based on the foregoing scheme, the encoding unit is configured to: if the video image frame or the current slice does not turn on the DF unit, set the value of the first flag bit added to the APS to a first value to indicate that the filter coefficients at the first corresponding position are defaulted (absent) in the APS; and if the video image frame or the current slice turns on the DF unit, set the value of the first flag bit added to the APS to a second value to indicate that the APS includes the filter coefficients at the first corresponding position.
In some embodiments of the present application, based on the foregoing, if the input of the ALF unit includes the output signal of the DF unit, the video encoding apparatus further includes: an encoding unit configured to add a second flag bit to the APS containing the filter coefficients of the video image frame or the current slice according to whether the video image frame or the current slice in the video image frame turns on the specified loop filter, where the value of the second flag bit is used to indicate whether the filter coefficients at a second corresponding position are defaulted (absent) in the APS, the second corresponding position being the coefficient position corresponding to the output signal of the DF unit.
In some embodiments of the application, based on the foregoing scheme, the encoding unit is configured to: if the video image frame or the current slice does not turn on the specified loop filter, set the value of the second flag bit added to the APS to a first value to indicate that the filter coefficients at the second corresponding position are defaulted (absent) in the APS; and if the video image frame or the current slice turns on the specified loop filter, set the value of the second flag bit added to the APS to a second value to indicate that the APS includes the filter coefficients at the second corresponding position.
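As an illustration of this signaling, the following is a minimal encoder-side sketch; the flag names and values are hypothetical and do not correspond to any normative syntax.
```python
# Hypothetical sketch of the encoder-side APS flag signaling described above.
# Flag names and values are illustrative, not normative VVC/ECM syntax.

FIRST_VALUE = 0   # filter coefficients at the corresponding position are defaulted (absent)
SECOND_VALUE = 1  # filter coefficients at the corresponding position are present

def write_alf_aps_flags(df_enabled: bool, specified_filter_enabled: bool) -> dict:
    aps = {}
    # First flag bit: coefficients for the signal input to the DF unit (pre-DF ALF input).
    aps["pre_df_coeff_flag"] = SECOND_VALUE if df_enabled else FIRST_VALUE
    # Second flag bit: coefficients for the output signal of the DF unit (post-DF ALF input).
    aps["post_df_coeff_flag"] = SECOND_VALUE if specified_filter_enabled else FIRST_VALUE
    return aps
```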
According to an aspect of the embodiments of the present application, there is provided a computer readable medium having stored thereon a computer program which, when executed by a processor, implements a loop filtering method, a video decoding method or a video encoding method as described in the above embodiments.
According to an aspect of an embodiment of the present application, there is provided an electronic apparatus including: one or more processors; and storage means for storing one or more computer programs which, when executed by the one or more processors, cause the electronic device to implement the loop filtering method, the video decoding method, or the video encoding method as described in the above embodiments.
According to an aspect of an embodiment of the present application, there is provided a computer program product comprising a computer program stored in a computer readable storage medium. The processor of the electronic device reads and executes the computer program from the computer-readable storage medium, so that the electronic device performs the loop filtering method, the video decoding method, or the video encoding method provided in the above-described various alternative embodiments.
In the technical solutions provided in some embodiments of the present application, at least one of the signal input to the DF unit and the output signal of the DF unit is used as an additional input signal of the ALF unit during loop filtering. The correlation of the surrounding pixels can therefore be exploited more fully during filtering, which improves the filtering effect and filtering quality of the ALF and thus helps improve encoding and decoding performance.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application as claimed.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application. It is evident that the drawings in the following description are only some embodiments of the present application and that other drawings may be obtained from these drawings without inventive effort for a person of ordinary skill in the art. In the drawings:
FIG. 1 shows a schematic diagram of an exemplary system architecture to which the technical solution of an embodiment of the application may be applied;
fig. 2 shows a schematic diagram of the placement of a video encoding device and a video decoding device in a streaming system;
FIG. 3 shows a basic flow diagram of a video encoder;
FIG. 4 shows a schematic diagram of the overall structure of VVC and the loop filtering process;
fig. 5 is a schematic diagram showing the module structure of loop filtering in VVC;
FIG. 6 shows a block diagram of loop filtering in an ECM;
FIG. 7 is a schematic diagram showing a module configuration of loop filtering in one embodiment of the present application;
FIG. 8 is a schematic diagram showing a loop filter module structure in one embodiment of the application;
FIG. 9 is a schematic diagram showing a module configuration of loop filtering in one embodiment of the present application;
FIG. 10 shows a flow chart of a loop filtering method according to one embodiment of the application;
FIG. 11 shows a block diagram of a loop filtering apparatus according to one embodiment of the application;
fig. 12 shows a block diagram of a video decoding apparatus according to an embodiment of the present application;
fig. 13 shows a block diagram of a video encoding apparatus according to an embodiment of the present application;
fig. 14 shows a schematic diagram of a computer system suitable for use in implementing an embodiment of the application.
Detailed Description
Example embodiments are now described in a more complete manner with reference being made to the figures. However, the illustrated embodiments may be embodied in various forms and should not be construed as limited to only these examples; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics of the application may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the application. However, it will be recognized by one skilled in the art that the present inventive arrangements may be practiced without all of the specific details of the embodiments, that one or more specific details may be omitted, or that other methods, elements, devices, steps, etc. may be used.
The block diagrams depicted in the figures are merely functional entities and do not necessarily correspond to physically separate entities. That is, the functional entities may be implemented in software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
The flow diagrams depicted in the figures are exemplary only, and do not necessarily include all of the elements and operations/steps, nor must they be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the order of actual execution may be changed according to actual situations.
Fig. 1 shows a schematic diagram of an exemplary system architecture to which the technical solution of an embodiment of the present application may be applied.
As shown in fig. 1, the system architecture 100 includes a plurality of terminal devices that can communicate with each other through, for example, a network 150. For example, the system architecture 100 may include a first terminal device 110 and a second terminal device 120 interconnected by a network 150. In the embodiment of fig. 1, the first terminal apparatus 110 and the second terminal apparatus 120 perform unidirectional data transmission.
For example, the first terminal device 110 may encode video data (e.g., a stream of video pictures collected by the terminal device 110) for transmission over the network 150 to the second terminal device 120, the encoded video data transmitted in one or more encoded video code streams, the second terminal device 120 may receive the encoded video data from the network 150, decode the encoded video data to recover the video data, and display the video pictures in accordance with the recovered video data.
In one embodiment of the application, the system architecture 100 may include a third terminal device 130 and a fourth terminal device 140 that perform bi-directional transmission of encoded video data, such as may occur during a video conference. For bi-directional data transmission, each of the third terminal device 130 and the fourth terminal device 140 may encode video data (e.g., a stream of video pictures collected by the terminal device) for transmission over the network 150 to the other of the third terminal device 130 and the fourth terminal device 140. Each of the third terminal device 130 and the fourth terminal device 140 may also receive encoded video data transmitted by the other of the third terminal device 130 and the fourth terminal device 140, and may decode the encoded video data to restore the video data, and may display a video picture on an accessible display device according to the restored video data.
In the embodiment of fig. 1, the first terminal apparatus 110, the second terminal apparatus 120, the third terminal apparatus 130, and the fourth terminal apparatus 140 may be servers or terminals, and the servers may be independent physical servers, or may be server clusters or distributed systems formed by a plurality of physical servers, or may be cloud servers that provide cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDNs (Content Delivery Network, content delivery networks), and basic cloud computing services such as big data and artificial intelligence platforms. The terminal may be, but is not limited to, a smart phone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, a smart watch, a smart voice interaction device, a smart home appliance, a vehicle-mounted terminal, an aircraft, etc.
Network 150 represents any number of networks that transfer encoded video data between first terminal device 110, second terminal device 120, third terminal device 130, and fourth terminal device 140, including, for example, wired and/or wireless communication networks. The communication network 150 may exchange data in circuit-switched and/or packet-switched channels. The network may include a telecommunications network, a local area network, a wide area network, and/or the internet. For the purposes of the present application, the architecture and topology of network 150 may be irrelevant to the operation of the present disclosure, unless explained below.
In one embodiment of the present application, fig. 2 illustrates the placement of a video encoding device and a video decoding device in a streaming environment. The presently disclosed subject matter is equally applicable to other video-enabled applications including, for example, video conferencing, digital TV (television), storing compressed video on digital media including CDs, DVDs, memory sticks, etc.
The streaming system may include an acquisition subsystem 213, and the acquisition subsystem 213 may include a video source 201, such as a digital camera, that creates an uncompressed video picture stream 202. In an embodiment, the video picture stream 202 includes samples taken by a digital camera. The video picture stream 202 is depicted as a bold line to emphasize a high data volume video picture stream compared to the encoded video data 204 (or the encoded video code stream 204), the video picture stream 202 may be processed by an electronic device 220, the electronic device 220 comprising a video encoding device 203 coupled to the video source 201. The video encoding device 203 may include hardware, software, or a combination of hardware and software to implement or implement aspects of the disclosed subject matter as described in more detail below. The encoded video data 204 (or encoded video stream 204) is depicted as a thin line compared to the video picture stream 202 to emphasize a lower amount of encoded video data 204 (or encoded video stream 204), which may be stored on the streaming server 205 for future use. One or more streaming client subsystems, such as client subsystem 206 and client subsystem 208 in fig. 2, may access streaming server 205 to retrieve copies 207 and 209 of encoded video data 204. Client subsystem 206 may include video decoding device 210, for example, in electronic device 230. Video decoding device 210 decodes an incoming copy 207 of the encoded video data and generates an output video picture stream 211 that may be presented on a display 212 (e.g., a display screen) or another presentation device. In some streaming systems, the encoded video data 204, 207, and 209 (e.g., video streams) may be encoded according to some video encoding/compression standard.
It should be noted that electronic device 220 and electronic device 230 may include other components not shown in the figures. For example, electronic device 220 may comprise a video decoding device, and electronic device 230 may also comprise a video encoding device.
In one embodiment of the present application, taking the international video coding standards HEVC (High Efficiency Video Coding) and VVC (Versatile Video Coding) and the Chinese national video coding standard AVS as examples, after a video frame image is input, it is divided into a number of non-overlapping processing units according to a block size, and each processing unit performs a similar compression operation. This processing unit is called a CTU (Coding Tree Unit) or LCU (Largest Coding Unit). A CTU may be further partitioned to obtain one or more basic coding units (CU, Coding Unit), which are the most basic elements in the coding process.
Some concepts when coding a CU are presented below:
predictive coding (Predictive Coding): the predictive coding comprises modes of intra-frame prediction, inter-frame prediction and the like, and the residual video signal is obtained after the original video signal is predicted by the selected reconstructed video signal. The encoding end needs to decide which predictive coding mode to select for the current CU and inform the decoding end. Wherein, intra-frame prediction refers to that a predicted signal comes from a region which is already coded and reconstructed in the same image; inter prediction refers to a predicted signal from an already encoded picture (referred to as a reference picture) other than the current picture.
Transform & Quantization: the residual video signal is converted into the transform domain by a transform operation such as the DFT (Discrete Fourier Transform) or DCT (Discrete Cosine Transform), yielding transform coefficients. The transform coefficients then undergo a lossy quantization operation that discards some information, so that the quantized signal is easier to compress. In some video coding standards more than one transform mode may be available, so the encoding end also needs to select one of the transform modes for the current CU and inform the decoding end. The fineness of quantization is generally determined by the quantization parameter (Quantization Parameter, QP for short): a larger QP means that coefficients over a larger value range are quantized to the same output, which generally brings more distortion and a lower code rate; conversely, a smaller QP means that coefficients over a smaller value range are quantized to the same output, which generally brings less distortion and a correspondingly higher code rate.
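For intuition, the following sketch uses a plain uniform scalar quantizer; the QP-to-step mapping is a common approximation (the step size roughly doubles every 6 QP) rather than the exact mapping of any particular standard.
```python
# Illustrative uniform scalar quantization: a larger QP gives a larger step,
# so a wider range of transform coefficients collapses to the same level
# (more distortion, lower rate). The step mapping below is approximate.

def quantize(coeff: float, qp: int) -> int:
    step = 2 ** ((qp - 4) / 6)      # approximate quantization step size
    return round(coeff / step)

def dequantize(level: int, qp: int) -> float:
    step = 2 ** ((qp - 4) / 6)
    return level * step

print(quantize(90.0, 22), quantize(90.0, 37))  # e.g. 11 vs 2: coarser at high QP
```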
Entropy Coding or statistical coding: the quantized transform-domain signal is statistically compressed and coded according to the frequency of occurrence of each value, and finally a binary (0 or 1) compressed code stream is output. Meanwhile, encoding generates other information, such as the selected coding mode and motion vector data, which also requires entropy coding to reduce the code rate. Statistical coding is a lossless coding method that can effectively reduce the code rate required to express the same signal; common statistical coding methods are variable length coding (Variable Length Coding, VLC for short) and context-based adaptive binary arithmetic coding (Context-based Adaptive Binary Arithmetic Coding, CABAC for short).
The CABAC process mainly includes three steps: binarization, context modeling, and binary arithmetic coding. After the input syntax element is binarized, the binary data may be encoded in the regular coding mode or the bypass coding mode (Bypass Coding Mode). The bypass coding mode does not need to assign a specific probability model to each binary bit; the input bin value is encoded directly by a simple bypass coder to speed up the overall encoding and decoding. In general, different syntax elements are not completely independent, and the same syntax element has a certain memory. Therefore, according to conditional entropy theory, conditional coding that uses other encoded syntax elements can further improve coding performance compared with independent coding or memoryless coding. The encoded symbol information used as the condition is called the context. In the regular coding mode, the bins of a syntax element enter the context modeler sequentially, and the encoder assigns an appropriate probability model to each input bin based on the values of previously encoded syntax elements or bins; this process is context modeling. The context model corresponding to a syntax element can be located through ctxIdxInc (context index increment) and ctxIdxStart (context start index). After a bin value, together with its assigned probability model, is fed into the binary arithmetic coder for coding, the context model needs to be updated according to the bin value; this is the adaptation process in the coding.
Loop Filtering: the transformed and quantized signal is inverse quantized, inverse transformed, and prediction-compensated to obtain a reconstructed image. Because of the quantization, part of the information in the reconstructed image differs from the original image, i.e. the reconstructed image is distorted (Distortion). Therefore, a filtering operation can be applied to the reconstructed image to effectively reduce the degree of distortion introduced by quantization. Since these filtered reconstructed images are used as references for subsequently encoded images to predict future image signals, the filtering operation is also referred to as loop filtering, i.e. a filtering operation within the encoding loop.
In one embodiment of the application, fig. 3 shows a basic flow diagram of a video encoder, using intra prediction as an example. The original image signal s_k[x, y] and the predicted image signal are differenced to obtain the residual signal u_k[x, y]. The residual signal u_k[x, y] is transformed and quantized to obtain quantized coefficients, which on the one hand are entropy encoded into the coded bit stream, and on the other hand are inverse quantized and inverse transformed to obtain the reconstructed residual signal u'_k[x, y]. The predicted image signal and the reconstructed residual signal u'_k[x, y] are superimposed to generate the reconstructed image signal. The reconstructed image signal is, on the one hand, fed to the intra-mode decision module and the intra prediction module for intra prediction processing, and on the other hand filtered by loop filtering to output the filtered image signal s'_k[x, y]. The filtered image signal s'_k[x, y] can serve as the reference image of the next frame for motion estimation and motion-compensated prediction. The predicted image signal of the next frame is then obtained from the motion-compensated prediction result s'_r[x+m_x, y+m_y] and the intra prediction result, and the process is repeated until encoding is completed.
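The signal flow described above can be summarized by the following sketch, in which all the callables are placeholders for the actual codec operations rather than real implementations.
```python
# Schematic signal flow of the hybrid encoder loop described above.
# All callables are placeholders for the actual codec operations.

def encode_frame(s_k, predict, transform, quantize, dequantize,
                 inv_transform, entropy_code, loop_filter, reference_pictures):
    s_pred = predict(s_k, reference_pictures)      # predicted image signal
    u_k = s_k - s_pred                             # residual signal u_k[x, y]
    levels = quantize(transform(u_k))              # quantized transform coefficients
    bitstream = entropy_code(levels)               # entropy-coded bit stream
    u_rec = inv_transform(dequantize(levels))      # reconstructed residual u'_k[x, y]
    s_rec = s_pred + u_rec                         # reconstructed image signal
    s_filt = loop_filter(s_rec)                    # filtered image signal s'_k[x, y]
    reference_pictures.append(s_filt)              # reference for the next frame's prediction
    return bitstream
```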
In the above coding process, loop filtering is one of the core modules of video coding and serves to effectively remove various coding distortions. The latest-generation international video coding standard VVC supports four different types of loop filters: the deblocking filter (Deblocking Filter, DF for short), sample adaptive offset (Sample Adaptive Offset, SAO for short), adaptive loop filtering (Adaptive Loop Filter, ALF for short), and cross-component adaptive loop filtering (CC-ALF).
Optionally, the overall structure of VVC and the loop filtering process are shown in fig. 4; the overall flow is similar to the encoder flow shown in fig. 3. ALF and CC-ALF are Wiener filters whose filter coefficients can be determined adaptively according to the content of the different video components, so as to reduce the mean square error (Mean Square Error, MSE for short) between the reconstructed component and the original component. The input of ALF is the reconstructed pixel values after DF and SAO filtering, and the output is the enhanced reconstructed luma image and the enhanced reconstructed chroma images; the input of CC-ALF is the luma component after DF and SAO filtering and before ALF processing, and the output is a correction value for the corresponding chroma component. That is, CC-ALF acts only on the chroma components: by exploiting the correlation between the luma component and the chroma components, a correction value for the chroma component is obtained by linear filtering of the luma component, and this correction value is added to the ALF-filtered chroma component as the final reconstructed chroma component. As adaptive filters, Wiener filters can generate different filter coefficients for video content with different characteristics, so ALF and CC-ALF need to classify the video content and use a corresponding filter for each class. In the current VVC design, the ALF for the luma component supports 25 different classes of filters, the chroma components support up to 8 different filters, and CC-ALF supports up to 4 different filters per chroma component.
The ALF-related filter parameters are stored in an APS (Adaptation Parameter Set). One APS may contain up to 25 sets of luma filter coefficients and corresponding clipping value indices, up to 8 sets of chroma filter coefficients and corresponding clipping value indices shared by the two chroma components, and up to 4 sets of CC-ALF filter coefficients for each chroma component.
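For illustration, the APS payload described above can be pictured as the following hypothetical container (not the normative ALF APS syntax elements).
```python
# Hypothetical container mirroring the APS contents described above
# (not the normative ALF APS syntax).
from dataclasses import dataclass, field
from typing import List

@dataclass
class AlfAps:
    luma_filters: List[List[int]] = field(default_factory=list)       # up to 25 coefficient sets
    luma_clip_idx: List[List[int]] = field(default_factory=list)      # corresponding clipping value indices
    chroma_filters: List[List[int]] = field(default_factory=list)     # up to 8 sets, shared by both chroma components
    chroma_clip_idx: List[List[int]] = field(default_factory=list)
    cc_alf_filters_cb: List[List[int]] = field(default_factory=list)  # up to 4 CC-ALF filters for Cb
    cc_alf_filters_cr: List[List[int]] = field(default_factory=list)  # up to 4 CC-ALF filters for Cr
```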
As shown in fig. 5, the loop filtering process in VVC includes five main modules: luma mapping with chroma scaling (Luma Mapping with Chroma Scaling, LMCS for short), the deblocking filter (DF), sample adaptive offset (SAO), adaptive loop filtering (ALF), and cross-component adaptive loop filtering (CC-ALF). The input-output relationship of the five modules is shown in fig. 5: the reconstructed pixels are input to LMCS, the input of DF is the reconstructed pixels after LMCS processing, the input of ALF is the reconstructed pixel values after DF and SAO processing, and the output is the pixel values enhanced by ALF filtering. The ALF filtering process may be represented by the following formula (1):
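A sketch of this filtering, assuming the symmetric clipped-difference structure described in the parameter list below and omitting the fixed-point rounding and normalization, is (R̃(x, y) denotes the filtered pixel):
\[
\tilde{R}(x,y) = R(x,y) + \sum_{i=0}^{11} c_i \left( f_{i,0} + f_{i,1} \right)
\]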
in the case of the formula (1),representing the filtered pixels; r (x, y) represents the current pixel to be filtered; f (f) i,0 ,f i,1 Representing the clipping value of the difference value between the peripheral pixels covered by the ALF filter template used by the VVC and the current pixel; c i I=0, …,11 represents the filter coefficients at the corresponding positions in the filter template, which need to be transmitted to the decoding side. The ALF filter shape used by VVC at present is a centrosymmetric 7 x 7 diamond filter.
Further enhanced compression models (Enhanced Compression Model, ECM for short) have begun to be explored on the basis of VVC. In addition to the loop filters already present in VVC, the loop filtering part of ECM introduces further loop filters such as bilateral filtering (Bilateral Filtering, BIF) and cross-component sample adaptive offset (Cross-Component Sample Adaptive Offset, CCSAO). The loop filtering process of ECM is shown in fig. 6: BIF and CCSAO operate in parallel with SAO, and their correction values, together with the correction value generated by SAO, are added to the deblocking-filtered reconstructed pixels, so that the input of ALF in ECM is the reconstructed value after BIF, SAO, and CCSAO processing and the output is the enhanced pixel value. The ALF filtering process may be represented by the following formula (2):
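Under the same assumptions (an assumed form, with fixed-point rounding and normalization omitted, and with the g terms indexed here by the corresponding coefficient index), formula (2) can be sketched as:
\[
\tilde{R}(x,y) = R(x,y) + \sum_{i=0}^{19} c_i \left( f_{i,0} + f_{i,1} \right) + \sum_{i=20}^{21} c_i \, g_i
\]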
in the formula (2) of the present invention,representing the filtered pixels; r (x, y) represents the current pixel to be filtered; f (f) i,0 ,f i,1 Limiting values representing differences between peripheral pixels covered by an ALF filter template used by the ECM and the current pixel; c i I=0, …,19 denotes the filter coefficients at the corresponding positions in the filter template; g i A clipping value representing a difference value between the intermediate value generated by the fixed filter and the current pixel value to be filtered; c i I=20, 21 denotes a filter coefficient corresponding to the intermediate value. These filter coefficients also need to be transmitted to the decoding side. The ALF filter shape currently used by ECM is a centrosymmetric 9 x 9 diamond filter.
The ALF in ECM still uses only the reconstructed pixels after SAO as its input. Because the ALF in the related art uses a fixed template and takes the output of a single loop filter as its only input, the performance of the ALF itself is limited.
On this basis, embodiments of the present application propose a solution that uses reconstructed pixels prior to deblocking filtering as an additional input to the ALF to achieve further performance improvement. As shown in fig. 7, the reconstructed pixel after the output of LMCS and before the input of DF is taken as an additional input of ALF, and the filtering process of ALF can be shown in the following formula (3):
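Continuing the same sketch (again an assumed form, with fixed-point rounding and normalization omitted), formula (3) extends formula (2) with a clipped-difference term for the reconstructed pixels before deblocking filtering:
\[
\tilde{R}(x,y) = R(x,y) + \sum_{i=0}^{19} c_i \left( f_{i,0} + f_{i,1} \right) + \sum_{i=20}^{21} c_i \, g_i + \sum_{i=22}^{N} c_i \left( h_{i,0} + h_{i,1} \right)
\]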
in the formula (3) of the present application,representing the filtered pixels; r (x, y) represents the current pixel to be filtered; f (f) i,0 ,f i,1 Limiting values representing differences between peripheral pixels covered by an ALF filter template used by the ECM and the current pixel; c i I=0, …,19 denotes the filter coefficients at the corresponding positions in the filter template; g i A clipping value representing a difference value between the intermediate value generated by the fixed filter and the current pixel value to be filtered; c i I=20, 21 denotes a filter coefficient corresponding to the intermediate value; h is a i,0 ,h i,1 Limiting representing the difference between surrounding pixels and current pixel prior to deblocking filteringAmplitude value; c i I=22, …, N represents the filter coefficient corresponding to the pixel difference before deblocking filtering, and the filter coefficient is also required to be transmitted to the decoding end; n-22 indicates that the pixel difference value before deblocking filtering corresponds to the number of coefficients that need to be indexed. The ALF filter shape used in this embodiment may be a centrosymmetric 9 x 9 diamond filter.
In addition to using the reconstructed pixels prior to deblocking filtering as an additional input to the ALF, embodiments of the present application also propose a scheme that uses the reconstructed pixels after deblocking filtering and prior to the next loop filtering as an additional input to the ALF, to achieve a further performance improvement. As shown in fig. 8, the reconstructed pixels after the DF output and before the SAO input are taken as an additional input of the ALF, and the filtering process of the ALF may be represented by the following formula (4):
In formula (4), R̃(x, y) represents the filtered pixel; R(x, y) represents the current pixel to be filtered; f_{i,0} and f_{i,1} represent the clipped values of the differences between the peripheral pixels covered by the ALF filter template used by the ECM and the current pixel; c_i, i = 0, …, 19, denotes the filter coefficients at the corresponding positions in the filter template; g_i represents the clipped value of the difference between the intermediate value generated by the fixed filter and the current pixel to be filtered; c_i, i = 20, 21, denotes the filter coefficients corresponding to the intermediate values; p_{i,0} and p_{i,1} represent the clipped values of the differences between the reconstructed pixels after deblocking filtering and before the next loop filtering and the current pixel; c_i, i = 22, …, K, denotes the filter coefficients corresponding to the pixel differences after deblocking filtering, which also need to be transmitted to the decoding end. The ALF filter shape used in this embodiment may be any shape.
In addition, an embodiment of the application further provides a scheme in which the reconstructed pixels before deblocking filtering and the reconstructed pixels after deblocking filtering and before the next loop filtering are used simultaneously as additional inputs of the ALF, so as to achieve a further performance improvement. As shown in fig. 9, the reconstructed pixels after the LMCS output and before the DF input, and the reconstructed pixels after the DF output and before the SAO input, are both taken as additional inputs of the ALF, and the filtering process of the ALF can be expressed by a formula that combines the additional terms of formula (3) and formula (4).
The meaning of the specific parameters is the same as in formula (3) and formula (4).
In the embodiments shown in figs. 7 to 9, the loop filtering applied after the DF output may be SAO, BIF, and CCSAO processed in parallel, as shown in figs. 7 to 9, or it may instead be a single one of SAO, BIF, and CCSAO, or two of SAO, BIF, and CCSAO processed in parallel.
The implementation details of the technical scheme of the embodiment of the application are described in detail below:
fig. 10 shows a flow chart of a loop filtering method according to an embodiment of the application, which may be performed by a video encoding device or a video decoding device. Referring to fig. 10, the loop filtering method at least includes steps S1010 to S1030, and is described in detail as follows:
in step S1010, a reconstructed component corresponding to a video image frame is acquired.
In one embodiment of the present application, the reconstructed component corresponding to the video image frame may be a reconstructed image signal generated after the reconstructed residual signal is superimposed with the predicted image signal. For the encoding end, the reconstruction component corresponding to the video image frame is the reconstruction component corresponding to the video image frame to be encoded; for the decoding end, the reconstructed component corresponding to the video image frame is a decoded reconstructed component.
In step S1020, the reconstructed component corresponding to the video image frame is filtered by a loop filtering unit, where the loop filtering unit includes a DF unit, a specified loop filter, and an adaptive loop filtering ALF unit that are sequentially connected; wherein the input of the ALF unit further comprises at least one of: a signal input to the DF unit, an output signal of the DF unit.
Optionally, the specified loop filter includes at least one of: a bilateral filter (BIF), a sample adaptive offset (SAO) filter, and a cross-component sample adaptive offset (CCSAO) filter. If two or three of SAO, BIF, and CCSAO are involved, these loop filters may be processed in parallel.
In an embodiment of the present application, if the input of the ALF unit further includes a signal input to the DF unit, a specific loop filtering process may be as shown in fig. 7; if the input of the ALF unit also includes the output signal of the DF unit, a specific loop filtering process may be as shown in fig. 8; if the inputs to the ALF unit also include the signal input to the DF unit and the output signal of the DF unit, then a specific loop filtering process may be as shown in fig. 9.
Of course, figs. 7 to 9 show the case in which the specified loop filter consists of SAO, BIF, and CCSAO processed in parallel, as sketched below. In other embodiments of the present application, the specified loop filter may instead be a single one of SAO, BIF, and CCSAO processed separately, or two of SAO, BIF, and CCSAO processed in parallel.
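The configurations of figs. 7 to 9 can be summarized by the following sketch, in which df, alf, and the entries of specified_filters are placeholder callables rather than real filter implementations, and each specified filter is modeled as returning an additive correction computed from the DF output.
```python
# Schematic loop filtering unit with additional ALF inputs (cf. figs. 7 to 9).
# df, alf and the entries of specified_filters are placeholders, not real filters.

def loop_filter_unit(rec, df, specified_filters, alf,
                     use_pre_df_input=True, use_post_df_input=False):
    pre_df = rec                          # signal input to the DF unit (e.g. after LMCS)
    post_df = df(pre_df)                  # output signal of the DF unit
    x = post_df
    for f in specified_filters:           # one, two, or all of SAO / BIF / CCSAO in parallel
        x = x + f(post_df)                # add each filter's correction value
    extra_inputs = []
    if use_pre_df_input:
        extra_inputs.append(pre_df)       # additional ALF input: pre-deblocking reconstruction
    if use_post_df_input:
        extra_inputs.append(post_df)      # additional ALF input: post-deblocking reconstruction
    return alf(x, extra_inputs)           # reconstructed image for the video image frame
```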
The specific filtering process is described in detail below for the cases in which the input of the ALF unit further includes the signal input to the DF unit, further includes the output signal of the DF unit, or includes both the signal input to the DF unit and the output signal of the DF unit:
the input of the ALF unit comprises a signal input to the DF unit
In one embodiment of the present application, if the input of the ALF unit includes the signal input to the DF unit, then during filtering, if the video image frame, or the current slice in the video image frame, does not turn on the DF unit, the video image frame or the current slice is restricted from referencing the first APS during filtering; the first APS includes a first flag bit, and the value of the first flag bit included in the first APS is used to indicate that the first adaptive parameter set (APS) contains filter coefficients at a first corresponding position, where the first corresponding position is the coefficient position corresponding to the signal input to the DF unit.
In this embodiment, since the video image frame or the current slice does not turn on the DF unit, the filter coefficients corresponding to the signal input to the DF unit are not needed during filtering, so the video image frame or the current slice can be restricted from referencing the first APS, that is, from referencing an APS that contains the filter coefficients corresponding to the signal input to the DF unit.
In one embodiment of the present application, if the input of the ALF unit includes the signal input to the DF unit, then during filtering, if the video image frame or the current slice turns on the DF unit, the video image frame or the current slice is restricted from referencing the second APS during filtering; the second APS includes a first flag bit, and the value of the first flag bit included in the second APS is used to indicate that the filter coefficients at the first corresponding position are defaulted (absent) in the second APS.
In this embodiment, since the video image frame or the current slice turns on the DF unit, the filter coefficients corresponding to the signal input to the DF unit are needed during filtering, so the video image frame or the current slice can be restricted from referencing the second APS, that is, from referencing an APS in which the filter coefficients corresponding to the signal input to the DF unit are defaulted (absent).
In one embodiment of the present application, if the input of the ALF unit includes the signal input to the DF unit, then, during filtering, if the video image frame or the current slice of the video image frame does not turn on the DF unit and the video image frame or the current slice references the first APS, the filter coefficient at the first corresponding position in the first APS is set to 0 when the first APS is applied to the video image frame or the current slice for filtering; the first APS includes a first flag bit, and the value of the first flag bit contained in the first APS is used to indicate that the first APS includes a filter coefficient at the first corresponding position, where the first corresponding position is the coefficient position corresponding to the signal input to the DF unit.
In this embodiment, since the video image frame or the current slice does not turn on the DF unit, the filter coefficient corresponding to the signal input to the DF unit is not needed during the filtering process; therefore, if the video image frame or the current slice references the first APS, which includes the filter coefficient corresponding to the signal input to the DF unit, the filter coefficient at the first corresponding position in the first APS may be set to 0.
In one embodiment of the present application, if the input of the ALF unit includes the signal input to the DF unit, then, during filtering, if the video image frame or the current slice turns on the DF unit and the video image frame or the current slice references the second APS, the filter coefficient at the first corresponding position in the second APS is filled with 0 when the second APS is applied to the video image frame or the current slice for filtering; the second APS includes a first flag bit, and the value of the first flag bit contained in the second APS is used to indicate that the filter coefficient at the first corresponding position is omitted from the second APS.
In this embodiment, since the video image frame or the current slice turns on the DF unit, the filter coefficient corresponding to the signal input to the DF unit is needed during the filtering process; therefore, if the video image frame or the current slice references the second APS, which omits the filter coefficient corresponding to the signal input to the DF unit, the filter coefficient at the first corresponding position in the second APS may be filled with 0.
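A minimal, non-normative sketch of the coefficient-adjustment alternative is given below. It assumes the decoded coefficients are grouped per input position under an illustrative key "pre_df" for the first corresponding position; the function name and the tap count are assumptions of the sketch, not the normative decoding process.

```python
from typing import Dict, List

def adjust_pre_df_coeffs(aps_coeffs: Dict[str, List[int]],
                         aps_has_pre_df_coeffs: bool,
                         df_enabled: bool,
                         num_pre_df_taps: int = 1) -> Dict[str, List[int]]:
    """Zero out or zero-fill the coefficients at the pre-DF position depending on
    the slice's DF switch and the referenced APS's first flag bit (sketch only)."""
    coeffs = {k: list(v) for k, v in aps_coeffs.items()}
    if not df_enabled and aps_has_pre_df_coeffs:
        # DF is off but the APS carries pre-DF coefficients: set them to 0 so the
        # pre-DF taps contribute nothing when the APS is applied to this slice.
        coeffs["pre_df"] = [0] * len(coeffs.get("pre_df", []))
    elif df_enabled and not aps_has_pre_df_coeffs:
        # DF is on but the APS omits the pre-DF coefficients: fill the omitted
        # position with zeros (the tap count here is an assumption of the sketch).
        coeffs["pre_df"] = [0] * num_pre_df_taps
    return coeffs

if __name__ == "__main__":
    aps = {"spatial": [32, -4, 2], "pre_df": [5]}
    print(adjust_pre_df_coeffs(aps, aps_has_pre_df_coeffs=True, df_enabled=False))
    # {'spatial': [32, -4, 2], 'pre_df': [0]}
```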
In one embodiment of the present application, if the input of the ALF unit includes the signal input to the DF unit, then, when the video image frame or the current slice in the video image frame does not turn on the DF unit, the filter coefficient at the first corresponding position may be set to 0 in the APS containing the filter coefficients of the video image frame or the current slice, where the first corresponding position is the coefficient position corresponding to the signal input to the DF unit.
In this embodiment, since the video image frame or the current slice does not turn on the DF unit, the filter coefficient corresponding to the signal input to the DF unit is not needed during the filtering process; the filter coefficient at the first corresponding position in the APS containing the filter coefficients of the video image frame or the current slice may therefore be set to 0 to reduce coding redundancy.
In one embodiment of the present application, if the input of the ALF unit includes the signal input to the DF unit and the video image frame or the current slice in the video image frame does not turn on the DF unit, then, when the APS referenced by the video image frame or the current slice is applied to the video image frame or the current slice for filtering, the filter coefficient at the first corresponding position in the referenced APS may be set to 0, where the first corresponding position is the coefficient position corresponding to the signal input to the DF unit.
In this embodiment, since the video image frame or the current slice does not turn on the DF unit, the filter coefficient corresponding to the signal input to the DF unit is not needed during the filtering process; the filter coefficient at the first corresponding position in the APS referenced by the video image frame or the current slice may therefore be set to 0.
The input of the ALF unit comprises the output signal of the DF unit
In one embodiment of the present application, if the input of the ALF unit includes the output signal of the DF unit, then, during filtering, if the video image frame or the current slice in the video image frame does not turn on the designated loop filter, the video image frame or the current slice is restricted from referring to a third APS during the filtering process; the third APS includes a second flag bit, and the value of the second flag bit contained in the third APS is used to indicate that the third APS includes a filter coefficient at a second corresponding position, where the second corresponding position is the coefficient position corresponding to the output signal of the DF unit.
In this embodiment, since the video image frame or the current slice does not turn on the designated loop filter, the filter coefficient corresponding to the signal input to the designated loop filter (i.e., the output signal of the DF unit) is not needed during the filtering process; the video image frame or the current slice can therefore be restricted from referring to the third APS, that is, from referring to an APS that includes the filter coefficient corresponding to the output signal of the DF unit.
In one embodiment of the present application, if the input of the ALF unit includes the output signal of the DF unit, then, during filtering, if the video image frame or the current slice turns on the designated loop filter, the video image frame or the current slice is restricted from referring to a fourth APS during the filtering process; the fourth APS includes a second flag bit, and the value of the second flag bit contained in the fourth APS is used to indicate that the filter coefficient at the second corresponding position is omitted from the fourth APS.
In this embodiment, since the video image frame or the current slice turns on the designated loop filter, the filter coefficient corresponding to the output signal of the DF unit is needed during the filtering process; the video image frame or the current slice can therefore be restricted from referring to the fourth APS, that is, from referring to an APS that omits the filter coefficient corresponding to the output signal of the DF unit.
In one embodiment of the present application, if the input of the ALF unit includes the output signal of the DF unit, then, during filtering, if the video image frame or the current slice in the video image frame does not turn on the designated loop filter and the video image frame or the current slice references the third APS, the filter coefficient at the second corresponding position in the third APS is set to 0 when the third APS is applied to the video image frame or the current slice for filtering; the third APS includes a second flag bit, and the value of the second flag bit contained in the third APS is used to indicate that the third APS includes a filter coefficient at the second corresponding position, where the second corresponding position is the coefficient position corresponding to the output signal of the DF unit.
In this embodiment, since the video image frame or the current slice does not turn on the designated loop filter, the filter coefficient corresponding to the output signal of the DF unit is not needed during the filtering process; therefore, if the video image frame or the current slice references the third APS, which includes the filter coefficient corresponding to the output signal of the DF unit, the filter coefficient at the second corresponding position in the third APS may be set to 0.
In one embodiment of the present application, if the input of the ALF unit includes the output signal of the DF unit, then, during filtering, if the video image frame or the current slice turns on the designated loop filter and the video image frame or the current slice references the fourth APS, the filter coefficient at the second corresponding position in the fourth APS is filled with 0 when the fourth APS is applied to the video image frame or the current slice for filtering; the fourth APS includes a second flag bit, and the value of the second flag bit contained in the fourth APS is used to indicate that the filter coefficient at the second corresponding position is omitted from the fourth APS.
In this embodiment, since the video image frame or the current slice turns on the designated loop filter, the filter coefficient corresponding to the output signal of the DF unit is needed during the filtering process; therefore, if the video image frame or the current slice references the fourth APS, which omits the filter coefficient corresponding to the output signal of the DF unit, the filter coefficient at the second corresponding position in the fourth APS may be filled with 0.
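The handling for the DF-output position mirrors the pre-DF handling. The non-normative sketch below models the second flag bit as aps_has_post_df_coeffs and the designated-loop-filter switch (e.g. SAO/CCSAO/BIF) as designated_filter_enabled; all names and the tap count are assumptions of the sketch.

```python
from typing import List

def post_df_reference_allowed(aps_has_post_df_coeffs: bool,
                              designated_filter_enabled: bool) -> bool:
    # Designated filter off -> APSs carrying post-DF coefficients must not be
    # referenced; designated filter on -> APSs omitting them must not be referenced.
    return aps_has_post_df_coeffs == designated_filter_enabled

def adjust_post_df_coeffs(post_df_coeffs: List[int],
                          aps_has_post_df_coeffs: bool,
                          designated_filter_enabled: bool,
                          num_post_df_taps: int = 1) -> List[int]:
    """Coefficient-adjustment alternative for the second corresponding position."""
    if not designated_filter_enabled and aps_has_post_df_coeffs:
        return [0] * len(post_df_coeffs)      # zero the unused taps
    if designated_filter_enabled and not aps_has_post_df_coeffs:
        return [0] * num_post_df_taps         # zero-fill the omitted taps
    return list(post_df_coeffs)
```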
In one embodiment of the present application, if the input of the ALF unit includes the output signal of the DF unit, then, when the video image frame or the current slice in the video image frame does not turn on the designated loop filter, the filter coefficient at the second corresponding position may be set to 0 in the APS containing the filter coefficients of the video image frame or the current slice, where the second corresponding position is the coefficient position corresponding to the output signal of the DF unit.
In this embodiment, since the video image frame or the current slice does not turn on the designated loop filter, the filter coefficient corresponding to the output signal of the DF unit is not needed during the filtering process; the filter coefficient at the second corresponding position in the APS containing the filter coefficients of the video image frame or the current slice may therefore be set to 0 to reduce coding redundancy.
In one embodiment of the present application, if the input of the ALF unit includes the output signal of the DF unit and the video image frame or the current slice in the video image frame does not turn on the designated loop filter, then, when the APS referenced by the video image frame or the current slice is applied to the video image frame or the current slice for filtering, the filter coefficient at the second corresponding position in the referenced APS may be set to 0, where the second corresponding position is the coefficient position corresponding to the output signal of the DF unit.
In this embodiment, since the video image frame or the current slice does not turn on the designated loop filter, the filter coefficient corresponding to the output signal of the DF unit is not needed during the filtering process; the filter coefficient at the second corresponding position in the APS referenced by the video image frame or the current slice may therefore be set to 0.
The inputs of the ALF unit include a signal input to the DF unit, and an output signal of the DF unit
In this application scenario, the foregoing embodiments for the case where the input of the ALF unit includes the signal input to the DF unit and the case where the input of the ALF unit includes the output signal of the DF unit are combined; that is, whether the DF unit is turned on and whether the designated loop filter is turned on jointly determine which APSs the video image frame or the current slice may reference, and whether the filter coefficients at the corresponding positions are set to 0 during the filtering process.
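As a compact, non-normative illustration of the combined case, the check below admits an APS only when each flag matches the corresponding filter switch; the flag and switch names are assumptions of the sketch, with a value of 1 (True) meaning the coefficients for that position are carried in the APS.

```python
def aps_reference_allowed(first_flag: bool, second_flag: bool,
                          df_enabled: bool, designated_filter_enabled: bool) -> bool:
    # Both extra ALF inputs are in use: the pre-DF position must match the DF
    # switch and the post-DF position must match the designated-filter switch.
    return first_flag == df_enabled and second_flag == designated_filter_enabled
```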
In step S1030, the output of the loop filter unit is taken as a reconstructed image corresponding to the video image frame.
In one embodiment of the present application, if the loop filtering method shown in fig. 10 is applied to a video decoding method, after obtaining a reconstructed image corresponding to a video image frame, the reconstructed image corresponding to the video image frame may be taken as a decoded video image.
In one embodiment of the present application, if the loop filtering method shown in fig. 10 is applied to the video encoding method, the reconstructed image corresponding to the video image frame to be encoded may be obtained by performing the filtering process on the reconstructed component corresponding to the video image frame to be encoded by the loop filtering method.
Optionally, in the video encoding method according to the embodiment of the present application, if the input of the ALF unit includes the signal input to the DF unit, a first flag bit may be added to the APS containing the filter coefficients of the video image frame or the current slice according to whether the video image frame or the current slice in the video image frame turns on the DF unit, where the value of the first flag bit indicates whether the filter coefficient at a first corresponding position is omitted from the APS, and the first corresponding position is the coefficient position corresponding to the signal input to the DF unit.
Specifically, if the video image frame or the current slice does not turn on the DF unit, the value of the first flag bit added to the APS is set to a first value to indicate that the filter coefficient at the first corresponding position is omitted from the APS; if the video image frame or the current slice turns on the DF unit, the value of the first flag bit added to the APS is set to a second value to indicate that the APS includes the filter coefficient at the first corresponding position.
With this scheme, the encoder can indicate, by setting the first flag bit in the APS, whether the filter coefficient at the first corresponding position is omitted from the APS; the decoder can then determine from the first flag bit whether that coefficient is omitted from the referenced APS and take the corresponding measures during loop filtering, namely the processing described above for the loop filtering method.
Optionally, in the video encoding method according to the embodiment of the present application, if the input of the ALF unit includes the output signal of the DF unit, a second flag bit may be added to the APS containing the filter coefficients of the video image frame or the current slice according to whether the video image frame or the current slice in the video image frame turns on the designated loop filter, where the value of the second flag bit indicates whether the filter coefficient at a second corresponding position is omitted from the APS, and the second corresponding position is the coefficient position corresponding to the output signal of the DF unit.
Specifically, if the video image frame or the current slice does not turn on the designated loop filter, the value of the second flag bit added to the APS is set to a first value to indicate that the filter coefficient at the second corresponding position is omitted from the APS; if the video image frame or the current slice turns on the designated loop filter, the value of the second flag bit added to the APS is set to a second value to indicate that the APS includes the filter coefficient at the second corresponding position.
With this scheme, the encoder can indicate, by setting the second flag bit in the APS, whether the filter coefficient at the second corresponding position is omitted from the APS; the decoder can then determine from the second flag bit whether that coefficient is omitted from the referenced APS and take the corresponding measures during loop filtering, namely the processing described above for the loop filtering method.
It should be noted that, in the embodiment of the present application, if the input of the ALF unit includes both the signal input to the DF unit and the output signal of the DF unit, the encoder may add both the first flag bit and the second flag bit to the APS containing the filter coefficients of the video image frame or the current slice, and set their values according to whether the video image frame or the current slice turns on the DF unit and whether it turns on the designated loop filter.
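For illustration, an encoder-side sketch of writing both flag bits follows. The bitstream-writer interface (write_flag) and the syntax element names used here are hypothetical assumptions of the sketch, not element names from the present application or any standard.

```python
class ApsWriter:
    def __init__(self):
        self.bits = []

    def write_flag(self, name: str, value: int):
        # In a real encoder this would emit one bit into the APS syntax;
        # here it only records the (name, value) pair for inspection.
        self.bits.append((name, value))

def write_alf_extra_input_flags(writer: ApsWriter,
                                df_enabled: bool,
                                designated_filter_enabled: bool):
    # Hypothetical element names. First flag: 1 -> coefficients for the pre-DF
    # position are present, 0 -> that position is omitted.
    writer.write_flag("alf_pre_df_coeff_present_flag", int(df_enabled))
    # Second flag: analogous for the DF-output position.
    writer.write_flag("alf_post_df_coeff_present_flag", int(designated_filter_enabled))

if __name__ == "__main__":
    w = ApsWriter()
    write_alf_extra_input_flags(w, df_enabled=True, designated_filter_enabled=False)
    print(w.bits)  # [('alf_pre_df_coeff_present_flag', 1), ('alf_post_df_coeff_present_flag', 0)]
```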
It can be seen that the embodiment of the present application provides a loop filtering method that uses reconstructed pixels at different positions (such as the reconstructed pixels input to the DF unit and/or the reconstructed pixels output by the DF unit), so as to better exploit the correlation with surrounding pixels and improve the quality after ALF filtering. A corresponding filter-coefficient indexing method is also provided, which further reduces the overhead of coding the ALF filter coefficients and improves the overall coding performance of the ALF. This is described in more detail in two parts below:
1. Embodiments of indexing the filter coefficients when reconstructed pixels from loop-filter outputs at different positions are used as ALF inputs:
As described above, the loop-filter outputs at different positions according to the embodiment of the present application include: the reconstructed pixels after LMCS and before DF (i.e., the reconstructed pixels input to DF), the reconstructed pixels after DF and before the next loop filter (i.e., the reconstructed pixels output by DF), and combinations of the two.
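To illustrate how the extra inputs enter the filtering itself, the following is a simplified, non-normative sketch of an ALF-style filter with co-located extra taps; the filter shape, the wrap-around border handling, and all parameter names are assumptions of the sketch and do not reproduce the patent's or any standard's filter.

```python
import numpy as np

def alf_filter_with_extra_inputs(rec, pre_df=None, post_df=None,
                                 spatial_coeffs=None, pre_df_coeff=0.0, post_df_coeff=0.0):
    """rec, pre_df, post_df: 2-D arrays of reconstructed samples at the respective
    loop-filter positions. spatial_coeffs maps (dy, dx) offsets to coefficients;
    pre_df_coeff / post_df_coeff weight the co-located extra taps (0 disables them,
    matching the coefficient zeroing described above)."""
    spatial_coeffs = spatial_coeffs or {(0, 0): 1.0}
    out = np.zeros_like(rec, dtype=np.float64)
    for (dy, dx), c in spatial_coeffs.items():
        # Simple wrap-around shift stands in for proper border extension.
        out += c * np.roll(np.roll(rec, dy, axis=0), dx, axis=1)
    if pre_df is not None:
        out += pre_df_coeff * pre_df     # extra tap: sample input to the DF unit
    if post_df is not None:
        out += post_df_coeff * post_df   # extra tap: sample output by the DF unit
    return out

if __name__ == "__main__":
    rec = np.arange(16, dtype=np.float64).reshape(4, 4)
    print(alf_filter_with_extra_inputs(rec, pre_df=rec + 1, pre_df_coeff=0.1))
```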
Optionally, a flag bit may be newly added in the APS: whether the reconstructed pixel at a given position is used as an input is determined according to whether the loop filter corresponding to that position is used, and the flag bit in the APS describes the usage of the reconstructed pixel at each position.
In one embodiment, for the method that uses the reconstructed pixels before DF as an additional input to the ALF, if the current image or slice does not turn on DF, the flag bit in the APS containing the filter coefficients of the current image or slice is set to 0, indicating that no corresponding filter coefficients need to be transmitted; if the current image or slice turns on DF, the flag bit in the APS containing the filter coefficients of the current image or slice is set to 1, indicating that the corresponding filter coefficients are transmitted. The corresponding decoding process has the following two schemes:
a) Restrict the APS reference selection. If the current image or slice does not turn on DF, it is restricted from referencing APSs whose flag bit is 1; if the current image or slice uses DF (i.e., turns on DF), it is restricted from referencing APSs whose flag bit is 0.
b) Adjust the coefficient values. If the current image or slice does not turn on DF and the corresponding flag bit in the referenced APS is 1, the filter coefficient at the corresponding position in the APS is set to 0 when the APS is applied to the current image; if the current image or slice uses DF and the corresponding flag bit in the referenced APS is 0, the omitted coefficients at the corresponding position in the APS are filled with 0.
In one embodiment, for the method that uses the reconstructed pixels after DF and before the next loop filtering as an additional input to the ALF, if the current image or slice turns off all of the SAO, CCSAO, and BIF tools, the flag bit in the APS containing the filter coefficients of the current image or slice is set to 0, indicating that no corresponding filter coefficients need to be transmitted; if the current image or slice does not turn off all of these tools (i.e., turns on one or more of SAO, CCSAO, and BIF), the flag bit in the APS containing the filter coefficients of the current image or slice is set to 1. The corresponding decoding process has the following two schemes:
a) Restrict the APS reference selection. If the current image or slice turns off all of the SAO, CCSAO, and BIF tools, it is restricted from referencing APSs whose flag bit is 1; if the current image or slice uses one or more of the SAO, CCSAO, and BIF tools, it is restricted from referencing APSs whose flag bit is 0.
b) Adjust the coefficient values. If the current image or slice turns off all of the SAO, CCSAO, and BIF tools and the corresponding flag bit in the referenced APS is 1, the filter coefficient at the corresponding position is set to 0 when the APS is applied to the current image or slice; if the current image or slice uses one or more of the SAO, CCSAO, and BIF tools and the corresponding flag bit in the referenced APS is 0, the omitted coefficients at the corresponding position in the APS are filled with 0.
In one embodiment, for the method that simultaneously uses the reconstructed pixels before DF and the reconstructed pixels after DF and before the next loop filtering as additional inputs to the ALF, two flag bits may be added in the APS, corresponding respectively to the usage of the reconstructed pixels before DF and the usage of the reconstructed pixels after DF and before the next loop filtering. The corresponding flag bits are set, the corresponding coefficients are transmitted, and the decoding process is adjusted according to the conditions of the two preceding embodiments.
2. Setting the value of the corresponding coefficient according to whether the loop filter corresponding to the position of the reconstructed pixel is used:
In one embodiment, for the method that uses the reconstructed pixels before DF as additional ALF input information, if the current image or slice does not turn on DF, the filter coefficient at the corresponding position in the APS generated by the current frame is set to 0. If filter coefficients in APSs generated by other frames are referenced, the filter coefficients at the corresponding position are set to 0 when decoding the current image or slice, which is equivalent in effect to filtering without using the reconstructed pixels at the corresponding position. This modification does not affect the decoding of other image frames or slices that still turn on DF.
In one embodiment, for the method that uses the reconstructed pixels after DF and before the next loop filtering as additional ALF input information, if the current image or slice turns off all of the SAO, CCSAO, and BIF tools, the filter coefficient at the corresponding position in the APS generated by the current frame is set to 0. If filter coefficients in APSs generated by other frames are referenced, the filter coefficients at the corresponding position are set to 0 when decoding the current image or slice, which is equivalent in effect to filtering without using the reconstructed pixels at the corresponding position. This modification does not affect the decoding of other frames or slices that still use one or more of the SAO, CCSAO, and BIF tools.
In one embodiment, for the method of using reconstructed pixels before DF and reconstructed pixels after DF and before next loop filtering as ALF additional input information at the same time, the setting of the corresponding position coefficients is performed according to the conditions of the foregoing two embodiments.
The technical scheme of the embodiment of the application can better utilize the correlation of the peripheral pixels to carry out the filtering treatment, improves the filtering effect and the filtering quality of ALF, and is further beneficial to improving the coding and decoding performance.
The following describes embodiments of the apparatus of the present application that may be used to perform the methods of the above-described embodiments of the present application. For details not disclosed in the device embodiments of the present application, please refer to the method embodiments of the present application.
Fig. 11 shows a block diagram of a loop filter apparatus according to an embodiment of the application, which may be provided in a video encoding device or a video decoding device.
Referring to fig. 11, a loop filter apparatus 1100 according to an embodiment of the present application includes: an acquisition unit 1102 and a filtering unit 1104.
Wherein, the obtaining unit 1102 is configured to obtain a reconstruction component corresponding to the video image frame; the filtering unit 1104 is configured to perform filtering processing on the reconstructed component by a loop filtering unit including a deblocking filter DF unit, a specified loop filter, and an adaptive loop filter ALF unit, which are sequentially connected; wherein the input of the ALF unit further comprises at least one of: a signal input to the DF unit, an output signal of the DF unit; and taking the output of the loop filtering unit as a reconstructed image corresponding to the video image frame.
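To show how the three units are chained and how the ALF receives the additional signals, a non-normative wiring sketch follows; the placeholder identity filters and the function names are assumptions of the sketch, standing in for real DF, SAO/CCSAO/BIF, and ALF implementations.

```python
from typing import Callable
import numpy as np

Filter = Callable[[np.ndarray], np.ndarray]

def loop_filter_unit(rec: np.ndarray,
                     df: Filter,
                     designated_filter: Filter,
                     alf: Callable[..., np.ndarray],
                     use_pre_df_input: bool = True,
                     use_post_df_input: bool = True) -> np.ndarray:
    pre_df = rec                        # signal input to the DF unit (after LMCS)
    post_df = df(pre_df)                # output signal of the DF unit
    alf_main = designated_filter(post_df)
    return alf(alf_main,
               pre_df if use_pre_df_input else None,
               post_df if use_post_df_input else None)

if __name__ == "__main__":
    identity: Filter = lambda x: x
    def toy_alf(main, pre_df=None, post_df=None):
        out = main.astype(np.float64).copy()
        if pre_df is not None:
            out += 0.0 * pre_df         # extra taps; real coefficients come from the APS
        if post_df is not None:
            out += 0.0 * post_df
        return out
    print(loop_filter_unit(np.ones((2, 2)), identity, identity, toy_alf))
```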
In some embodiments of the present application, based on the foregoing scheme, if the input of the ALF unit includes the signal input to the DF unit, the filtering unit 1104 is configured to: if the video image frame or the current slice in the video image frame does not turn on the DF unit, restrict the video image frame or the current slice from referring to a first adaptive parameter set APS during the filtering process; the first adaptive parameter set APS includes a first flag bit, and the value of the first flag bit contained in the first APS is used to indicate that the first adaptive parameter set APS includes a filter coefficient at a first corresponding position, where the first corresponding position is the coefficient position corresponding to the signal input to the DF unit.
In some embodiments of the present application, based on the foregoing scheme, if the input of the ALF unit includes the signal input to the DF unit, the filtering unit 1104 is configured to: if the video image frame or the current slice turns on the DF unit, restrict the video image frame or the current slice from referring to a second APS during the filtering process; the second APS includes a first flag bit, and the value of the first flag bit contained in the second APS is used to indicate that the filter coefficient at the first corresponding position is omitted from the second APS.
In some embodiments of the present application, based on the foregoing scheme, if the input of the ALF unit includes the signal input to the DF unit, the filtering unit 1104 is configured to: if the video image frame or the current slice of the video image frame does not turn on the DF unit and the video image frame or the current slice references a first APS, set the filter coefficient at a first corresponding position in the first APS to 0 when the first APS is applied to the video image frame or the current slice for filtering; the first APS includes a first flag bit, and the value of the first flag bit contained in the first APS is used to indicate that the first APS includes the filter coefficient at the first corresponding position, where the first corresponding position is the coefficient position corresponding to the signal input to the DF unit.
In some embodiments of the present application, based on the foregoing scheme, if the input of the ALF unit includes the signal input to the DF unit, the filtering unit 1104 is configured to: if the video image frame or the current slice turns on the DF unit and the video image frame or the current slice references a second APS, fill the filter coefficient at the first corresponding position in the second APS with 0 when the second APS is applied to the video image frame or the current slice for filtering; the second APS includes a first flag bit, and the value of the first flag bit contained in the second APS is used to indicate that the filter coefficient at the first corresponding position is omitted from the second APS.
In some embodiments of the present application, based on the foregoing scheme, if the input of the ALF unit includes the output signal of the DF unit, the filtering unit 1104 is configured to: if the video image frame or the current slice in the video image frame does not turn on the designated loop filter, restrict the video image frame or the current slice from referring to a third APS during the filtering process; the third APS includes a second flag bit, and the value of the second flag bit contained in the third APS is used to indicate that the third APS includes a filter coefficient at a second corresponding position, where the second corresponding position is the coefficient position corresponding to the output signal of the DF unit.
In some embodiments of the present application, based on the foregoing scheme, if the input of the ALF unit includes the output signal of the DF unit, the filtering unit 1104 is configured to: if the video image frame or the current slice turns on the designated loop filter, restrict the video image frame or the current slice from referring to a fourth APS during the filtering process; the fourth APS includes a second flag bit, and the value of the second flag bit contained in the fourth APS is used to indicate that the filter coefficient at the second corresponding position is omitted from the fourth APS.
In some embodiments of the present application, based on the foregoing scheme, if the input of the ALF unit includes the output signal of the DF unit, the filtering unit 1104 is configured to: if the video image frame or the current slice in the video image frame does not turn on the designated loop filter and the video image frame or the current slice references a third APS, set the filter coefficient at a second corresponding position in the third APS to 0 when the third APS is applied to the video image frame or the current slice for filtering; the third APS includes a second flag bit, and the value of the second flag bit contained in the third APS is used to indicate that the third APS includes the filter coefficient at the second corresponding position, where the second corresponding position is the coefficient position corresponding to the output signal of the DF unit.
In some embodiments of the present application, based on the foregoing scheme, if the input of the ALF unit includes the output signal of the DF unit, the filtering unit 1104 is configured to: if the video image frame or the current slice turns on the designated loop filter and the video image frame or the current slice references a fourth APS, fill the filter coefficient at the second corresponding position in the fourth APS with 0 when the fourth APS is applied to the video image frame or the current slice for filtering; the fourth APS includes a second flag bit, and the value of the second flag bit contained in the fourth APS is used to indicate that the filter coefficient at the second corresponding position is omitted from the fourth APS.
In some embodiments of the present application, based on the foregoing scheme, if the input of the ALF unit includes the signal input to the DF unit, the loop filtering apparatus 1100 further includes: a processing unit configured to: if the video image frame or the current slice in the video image frame does not turn on the DF unit, set the filter coefficient at the first corresponding position to 0 in the APS containing the filter coefficients of the video image frame or the current slice, where the first corresponding position is the coefficient position corresponding to the signal input to the DF unit.
In some embodiments of the present application, based on the foregoing scheme, if the input of the ALF unit includes the signal input to the DF unit, the filtering unit 1104 is configured to: if the video image frame or the current slice in the video image frame does not turn on the DF unit, when the APS referenced by the video image frame or the current slice is applied to the video image frame or the current slice for filtering, set the filter coefficient at a first corresponding position in the APS referenced by the video image frame or the current slice to 0, where the first corresponding position is the coefficient position corresponding to the signal input to the DF unit.
In some embodiments of the present application, based on the foregoing scheme, if the input of the ALF unit includes the output signal of the DF unit, the loop filtering apparatus 1100 further includes: a processing unit configured to set the filter coefficient at a second corresponding position to 0 in the APS containing the filter coefficients of the video image frame or the current slice if the video image frame or the current slice in the video image frame does not turn on the designated loop filter, where the second corresponding position is the coefficient position corresponding to the output signal of the DF unit.
In some embodiments of the present application, based on the foregoing scheme, if the input of the ALF unit includes the output signal of the DF unit, the filtering unit 1104 is configured to: if the video image frame or the current slice in the video image frame does not turn on the designated loop filter, when the APS referenced by the video image frame or the current slice is applied to the video image frame or the current slice for filtering, set the filter coefficient at a second corresponding position in the APS referenced by the video image frame or the current slice to 0, where the second corresponding position is the coefficient position corresponding to the output signal of the DF unit.
In some embodiments of the application, based on the foregoing, the designated loop filter includes at least one of: a bilateral filter, a sample adaptive offset (SAO) filter, and a cross-component sample adaptive offset (CCSAO) filter.
Fig. 12 shows a block diagram of a video decoding apparatus according to an embodiment of the application, which may be provided within a video decoding device.
Referring to fig. 12, a video decoding apparatus 1200 according to an embodiment of the present application includes: an acquisition unit 1202, a filtering unit 1204, and a decoding unit 1206.
Wherein the obtaining unit 1202 is configured to obtain a reconstruction component corresponding to a video image frame; the filtering unit 1204 is configured to perform filtering processing on the reconstructed component by a loop filtering unit including a deblocking filter DF unit, a specified loop filter, and an adaptive loop filter ALF unit, which are sequentially connected; wherein the input of the ALF unit further comprises at least one of: a signal input to the DF unit, an output signal of the DF unit; and taking the output of the loop filtering unit as a reconstructed image corresponding to the video image frame; the decoding unit 1206 is configured to take the reconstructed image corresponding to the video image frame as a decoded video image.
Fig. 13 shows a block diagram of a video encoding apparatus according to an embodiment of the application, which may be provided within a video encoding device.
Referring to fig. 13, a video encoding apparatus 1300 according to an embodiment of the present application includes: an acquisition unit 1302 and a filtering unit 1304.
Wherein, the obtaining unit 1302 is configured to obtain a video image frame to be encoded and a reconstruction component corresponding to the video image frame to be encoded; the filtering unit 1304 is configured to perform filtering processing on a reconstructed component corresponding to the video image frame to be encoded through a loop filtering unit, where the loop filtering unit includes a deblocking filtering DF unit, a designated loop filter, and an adaptive loop filtering ALF unit that are sequentially connected; wherein the input of the ALF unit further comprises at least one of: a signal input to the DF unit, an output signal of the DF unit; and taking the output of the loop filtering unit as a reconstructed image corresponding to the video image frame to be encoded.
In some embodiments of the present application, based on the foregoing, if the input of the ALF unit includes the signal input to the DF unit, the video encoding apparatus 1300 further includes: an encoding unit configured to add a first flag bit to the APS containing the filter coefficients of the video image frame or the current slice according to whether the video image frame or the current slice in the video image frame turns on the DF unit, where the value of the first flag bit is used to indicate whether the filter coefficient at a first corresponding position is omitted from the APS, and the first corresponding position is the coefficient position corresponding to the signal input to the DF unit.
In some embodiments of the application, based on the foregoing scheme, the encoding unit is configured to: if the video image frame or the current slice does not turn on the DF unit, set the value of the first flag bit added to the APS to a first value to indicate that the filter coefficient at the first corresponding position is omitted from the APS; if the video image frame or the current slice turns on the DF unit, set the value of the first flag bit added to the APS to a second value to indicate that the APS includes the filter coefficient at the first corresponding position.
In some embodiments of the present application, based on the foregoing scheme, if the input of the ALF unit includes the output signal of the DF unit, the video encoding apparatus 1300 further includes: an encoding unit configured to add a second flag bit to the APS containing the filter coefficients of the video image frame or the current slice according to whether the video image frame or the current slice in the video image frame turns on the designated loop filter, where the value of the second flag bit is used to indicate whether the filter coefficient at a second corresponding position is omitted from the APS, and the second corresponding position is the coefficient position corresponding to the output signal of the DF unit.
In some embodiments of the application, based on the foregoing scheme, the encoding unit is configured to: if the video image frame or the current slice does not turn on the designated loop filter, set the value of the second flag bit added to the APS to a first value to indicate that the filter coefficient at the second corresponding position is omitted from the APS; if the video image frame or the current slice turns on the designated loop filter, set the value of the second flag bit added to the APS to a second value to indicate that the APS includes the filter coefficient at the second corresponding position.
Fig. 14 shows a schematic diagram of a computer system suitable for use in implementing an embodiment of the application.
It should be noted that, the computer system 1400 of the electronic device shown in fig. 14 is only an example, and should not impose any limitation on the functions and the application scope of the embodiments of the present application.
As shown in fig. 14, the computer system 1400 includes a central processing unit (Central Processing Unit, CPU) 1401, which can perform various appropriate actions and processes according to a program stored in a Read-Only Memory (ROM) 1402 or a program loaded from a storage section 1408 into a random access Memory (Random Access Memory, RAM) 1403, for example, performing the methods described in the above embodiments. In the RAM 1403, various programs and data required for system operation are also stored. The CPU 1401, ROM 1402, and RAM 1403 are connected to each other through a bus 1404. An Input/Output (I/O) interface 1405 is also connected to bus 1404.
The following components are connected to the I/O interface 1405: an input section 1406 including a keyboard, a mouse, and the like; an output portion 1407 including a Cathode Ray Tube (CRT), a liquid crystal display (Liquid Crystal Display, LCD), and a speaker; a storage section 1408 including a hard disk or the like; and a communication section 1409 including a network interface card such as a LAN (Local Area Network ) card, a modem, or the like. The communication section 1409 performs communication processing via a network such as the internet. The drive 1410 is also connected to the I/O interface 1405 as needed. Removable media 1411, such as magnetic disks, optical disks, magneto-optical disks, semiconductor memory, and the like, is installed as needed on drive 1410 so that a computer program read therefrom is installed as needed into storage portion 1408.
In particular, according to embodiments of the present application, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present application include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising a computer program for performing the method shown in the flowchart. In such an embodiment, the computer program can be downloaded and installed from a network via the communication portion 1409 and/or installed from the removable medium 1411. When executed by a Central Processing Unit (CPU) 1401, performs the various functions defined in the system of the present application.
It should be noted that, the computer readable medium shown in the embodiments of the present application may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-Only Memory (ROM), an erasable programmable read-Only Memory (Erasable Programmable Read Only Memory, EPROM), flash Memory, an optical fiber, a portable compact disc read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, however, a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with a computer-readable computer program embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. A computer program embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wired, etc., or any suitable combination of the foregoing.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. Where each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer programs.
The units involved in the embodiments of the present application may be implemented by software, or may be implemented by hardware, and the described units may also be provided in a processor. Wherein the names of the units do not constitute a limitation of the units themselves in some cases.
As another aspect, the present application also provides a computer-readable medium that may be contained in the electronic device described in the above embodiment; or may exist alone without being incorporated into the electronic device. The computer readable medium carries one or more computer programs which, when executed by the electronic device, cause the electronic device to implement the methods described in the above embodiments.
It should be noted that although in the above detailed description several modules or units of a device for action execution are mentioned, such a division is not mandatory. Indeed, the features and functions of two or more modules or units described above may be embodied in one module or unit in accordance with embodiments of the application. Conversely, the features and functions of one module or unit described above may be further divided into a plurality of modules or units to be embodied.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or may be implemented in software in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present application may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (may be a CD-ROM, a U-disk, a mobile hard disk, etc.) or on a network, and includes several instructions to cause a computing device (may be a personal computer, a server, a touch terminal, or a network device, etc.) to perform the method according to the embodiments of the present application.
Other embodiments of the application will be apparent to those skilled in the art from consideration of the specification and practice of the embodiments disclosed herein. This application is intended to cover any variations, uses, or adaptations of the application following, in general, the principles of the application and including such departures from the present disclosure as come within known or customary practice within the art to which the application pertains.
It is to be understood that the application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (20)

1. A loop filtering method, comprising:
obtaining a reconstruction component corresponding to a video image frame;
the reconstruction component is subjected to filtering processing through a loop filtering unit, wherein the loop filtering unit comprises a deblocking effect filtering DF unit, a designated loop filter and an adaptive loop filtering ALF unit which are connected in sequence; wherein the input of the ALF unit further comprises at least one of: a signal input to the DF unit, an output signal of the DF unit;
And taking the output of the loop filtering unit as a reconstructed image corresponding to the video image frame.
2. The loop filtering method according to claim 1, wherein if the input of the ALF unit includes a signal input to the DF unit, filtering the reconstructed component by a loop filtering unit includes:
if the video image frame or the current slice in the video image frame does not turn on the DF unit, restricting the video image frame or the current slice from referring to a first adaptive parameter set APS during filtering processing; the first adaptive parameter set APS comprises a first flag bit, and the value of the first flag bit contained in the first APS is used to indicate that the first adaptive parameter set APS comprises a filter coefficient at a first corresponding position, wherein the first corresponding position is the coefficient position corresponding to the signal input to the DF unit;
if the video image frame or the current slice turns on the DF unit, restricting the video image frame or the current slice from referring to a second APS during filtering processing; the second APS comprises a first flag bit, and the value of the first flag bit contained in the second APS is used to indicate that the filter coefficient at the first corresponding position is omitted from the second APS.
3. The loop filtering method according to claim 1, wherein if the input of the ALF unit includes a signal input to the DF unit, filtering the reconstructed component by a loop filtering unit includes:
if the video image frame or the current slice of the video image frame does not turn on the DF unit and the video image frame or the current slice references a first APS, setting the filter coefficient at a first corresponding position in the first APS to 0 when the first APS is applied to the video image frame or the current slice for filtering processing; the first APS comprises a first flag bit, and the value of the first flag bit contained in the first APS is used to indicate that the first APS comprises the filter coefficient at the first corresponding position, wherein the first corresponding position is the coefficient position corresponding to the signal input to the DF unit;
if the video image frame or the current slice turns on the DF unit and the video image frame or the current slice references a second APS, filling the filter coefficient at the first corresponding position in the second APS with 0 when the second APS is applied to the video image frame or the current slice for filtering processing; the second APS comprises a first flag bit, and the value of the first flag bit contained in the second APS is used to indicate that the filter coefficient at the first corresponding position is omitted from the second APS.
4. A loop filtering method according to any one of claims 1 to 3, wherein if the input of the ALF unit includes the output signal of the DF unit, the reconstructed component is subjected to filtering processing by a loop filtering unit, comprising:
if the video image frame or the current slice in the video image frame does not turn on the designated loop filter, restricting the video image frame or the current slice from referring to a third APS during filtering processing; the third APS comprises a second flag bit, and the value of the second flag bit contained in the third APS is used to indicate that the third APS comprises a filter coefficient at a second corresponding position, wherein the second corresponding position is the coefficient position corresponding to the output signal of the DF unit;
if the video image frame or the current slice turns on the designated loop filter, restricting the video image frame or the current slice from referring to a fourth APS during filtering processing; wherein the fourth APS comprises a second flag bit, and the value of the second flag bit contained in the fourth APS is used to indicate that the filter coefficient at the second corresponding position is omitted from the fourth APS.
5. A loop filtering method according to any one of claims 1 to 3, wherein if the input of the ALF unit includes the output signal of the DF unit, the reconstructed component is subjected to filtering processing by a loop filtering unit, comprising:
if the video image frame or the current slice in the video image frame does not turn on the designated loop filter and the video image frame or the current slice references a third APS, setting the filter coefficient at a second corresponding position in the third APS to 0 when the third APS is applied to the video image frame or the current slice for filtering processing; the third APS comprises a second flag bit, and the value of the second flag bit contained in the third APS is used to indicate that the third APS comprises the filter coefficient at the second corresponding position, wherein the second corresponding position is the coefficient position corresponding to the output signal of the DF unit;
if the video image frame or the current slice turns on the designated loop filter and the video image frame or the current slice references a fourth APS, filling the filter coefficient at the second corresponding position in the fourth APS with 0 when the fourth APS is applied to the video image frame or the current slice for filtering processing; wherein the fourth APS comprises a second flag bit, and the value of the second flag bit contained in the fourth APS is used to indicate that the filter coefficient at the second corresponding position is omitted from the fourth APS.
6. The loop filtering method of claim 1, wherein if the input of the ALF unit includes a signal input to the DF unit, the method further comprises:
if the video image frame or the current slice in the video image frame does not turn on the DF unit, setting the filter coefficient at the first corresponding position to 0 in the APS containing the filter coefficients of the video image frame or the current slice, wherein the first corresponding position is the coefficient position corresponding to the signal input to the DF unit.
7. The loop filtering method according to claim 1, wherein if the input of the ALF unit includes a signal input to the DF unit, filtering the reconstructed component by a loop filtering unit includes:
if the video image frame or the current slice in the video image frame does not turn on the DF unit, when the APS referenced by the video image frame or the current slice is applied to the video image frame or the current slice for filtering, setting the filter coefficient at a first corresponding position in the APS referenced by the video image frame or the current slice to 0, wherein the first corresponding position is the coefficient position corresponding to the signal input to the DF unit.
8. The loop filtering method of any one of claims 1, 6 to 7, wherein if the input of the ALF unit comprises the output signal of the DF unit, the method further comprises:
and if the specified loop filter is not started by the video image frame or the current stripe in the video image frame, setting a filter coefficient at a second corresponding position to be 0 in APS (analog signal processor) containing the filter coefficient of the video image frame or the current stripe, wherein the second corresponding position is a coefficient position corresponding to an output signal of the DF unit.
9. The loop filtering method according to any one of claims 1, 6 to 7, wherein if the input of the ALF unit includes the output signal of the DF unit, filtering the reconstructed component by a loop filtering unit, comprising:
if the video image frame or the current slice in the video image frame does not turn on the designated loop filter, when the APS referenced by the video image frame or the current slice is applied to the video image frame or the current slice for filtering, setting the filter coefficient at a second corresponding position in the APS referenced by the video image frame or the current slice to 0, wherein the second corresponding position is the coefficient position corresponding to the output signal of the DF unit.
10. The loop filtering method according to any one of claims 1 to 3, 6 to 7, characterized in that the designated loop filter comprises at least one of: a bilateral filter, a sample adaptive offset (SAO) filter, and a cross-component sample adaptive offset (CCSAO) filter.
11. A video decoding method, comprising:
performing filtering processing on a reconstruction component corresponding to a video image frame by the loop filtering method according to any one of claims 1 to 10 to obtain a reconstruction image corresponding to the video image frame;
and taking the reconstructed image corresponding to the video image frame as a video image obtained by decoding.
12. A video encoding method, comprising:
acquiring a video image frame to be encoded;
performing filtering processing on the reconstructed component corresponding to the video image frame to be encoded by the loop filtering method according to any one of claims 1 to 10 to obtain a reconstructed image corresponding to the video image frame to be encoded.
13. The video coding method of claim 12, wherein if the input of the ALF unit includes a signal input to the DF unit, the method further comprises:
adding a first flag bit to the APS containing the filter coefficients of the video image frame or the current slice according to whether the video image frame or the current slice in the video image frame turns on the DF unit, wherein the value of the first flag bit is used to indicate whether the filter coefficient at a first corresponding position is omitted from the APS, and the first corresponding position is the coefficient position corresponding to the signal input to the DF unit.
14. The video encoding method of claim 13, wherein adding the first flag bit in the APS containing the filter coefficients of the video image frame or the current slice according to whether the DF unit is enabled by the video image frame or the current slice in the video image frame comprises:
if the video image frame or the current slice does not enable the DF unit, setting the value of the first flag bit added in the APS to a first value, to indicate that the filter coefficient at the first corresponding position is absent from the APS;
if the video image frame or the current slice enables the DF unit, setting the value of the first flag bit added in the APS to a second value, to indicate that the APS contains the filter coefficient at the first corresponding position.
15. The video encoding method according to any one of claims 12 to 14, wherein if the input of the ALF unit comprises the output signal of the DF unit, the method further comprises:
adding a second flag bit in an APS containing the filter coefficients of the video image frame or a current slice in the video image frame according to whether the specified loop filter is enabled by the video image frame or the current slice, wherein the value of the second flag bit is used to indicate whether the filter coefficient at a second corresponding position is absent from the APS, and the second corresponding position is the coefficient position corresponding to the output signal of the DF unit.
16. The video encoding method according to claim 15, wherein adding the second flag bit in the APS containing the filter coefficients of the video image frame or the current slice according to whether the specified loop filter is enabled by the video image frame or the current slice in the video image frame comprises:
if the video image frame or the current slice does not enable the specified loop filter, setting the value of the second flag bit added in the APS to a first value, to indicate that the filter coefficient at the second corresponding position is absent from the APS;
if the video image frame or the current slice enables the specified loop filter, setting the value of the second flag bit added in the APS to a second value, to indicate that the APS contains the filter coefficient at the second corresponding position.
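Outside the claim text, the sketch below shows one way an encoder could emit the first and second flag bits of claims 13 to 16 and skip the corresponding coefficients when they are absent. The BitWriter class, the chosen flag values, and the fixed-length coefficient coding are all assumptions for illustration, not the actual APS syntax of any codec.

```python
# Minimal sketch, under assumed names, of conditional signalling of the two
# auxiliary ALF coefficients in an APS.

FIRST_VALUE = 0   # assumed: coefficient at the corresponding position is absent
SECOND_VALUE = 1  # assumed: coefficient at the corresponding position is present

class BitWriter:
    def __init__(self):
        self.bits = []

    def write_flag(self, value):
        self.bits.append(1 if value else 0)

    def write_coeff(self, coeff, num_bits=8):
        # Toy fixed-length coding of a non-negative coefficient, for illustration only.
        for i in reversed(range(num_bits)):
            self.bits.append((coeff >> i) & 1)

def write_alf_aps(writer, other_coeffs, first_pos_coeff, second_pos_coeff,
                  df_enabled, specified_filter_enabled):
    # First flag bit: the coefficient tied to the DF input signal is present
    # only if the DF unit is enabled for this frame/slice.
    writer.write_flag(SECOND_VALUE if df_enabled else FIRST_VALUE)
    if df_enabled:
        writer.write_coeff(first_pos_coeff)

    # Second flag bit: the coefficient tied to the DF output signal is present
    # only if the specified loop filter is enabled.
    writer.write_flag(SECOND_VALUE if specified_filter_enabled else FIRST_VALUE)
    if specified_filter_enabled:
        writer.write_coeff(second_pos_coeff)

    # Remaining ALF coefficients are written unconditionally.
    for c in other_coeffs:
        writer.write_coeff(c)

w = BitWriter()
write_alf_aps(w, other_coeffs=[3, 5, 7], first_pos_coeff=2, second_pos_coeff=1,
              df_enabled=True, specified_filter_enabled=False)
print(len(w.bits))  # 2 flag bits + 4 coefficients * 8 bits = 34
```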
17. A loop filter apparatus, comprising:
an acquisition unit configured to acquire a reconstruction component corresponding to a video image frame;
a filtering unit, configured to perform filtering processing on the reconstruction component through a loop filtering unit, wherein the loop filtering unit comprises a deblocking filtering (DF) unit, a specified loop filter, and an adaptive loop filtering (ALF) unit which are connected in sequence; the input of the ALF unit further comprises at least one of: the signal input to the DF unit and the output signal of the DF unit; and the output of the loop filtering unit is taken as a reconstructed image corresponding to the video image frame.
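Outside the claim text, a minimal structural sketch of the claimed connection order, with placeholder identity functions standing in for the real DF, specified loop filter, and ALF implementations:

```python
# Minimal sketch, under assumed names: DF, specified loop filter and ALF in
# sequence, with ALF additionally receiving the DF input and DF output signals.
# The filter bodies are placeholders, not real filter implementations.

def df_filter(x):
    return x            # placeholder for deblocking filtering

def specified_filter(x):
    return x            # placeholder for e.g. bilateral / SAO / CCSAO filtering

def alf_filter(main_input, df_input=None, df_output=None):
    # A real ALF would apply a Wiener-style filter whose taps may also cover
    # the auxiliary signals; here the main input is simply passed through.
    return main_input

def loop_filter_unit(recon_component):
    df_in = recon_component
    df_out = df_filter(df_in)
    spec_out = specified_filter(df_out)
    # Main ALF input is the specified filter output; auxiliary inputs are the
    # signal fed to the DF unit and/or the DF unit's output, per the claims.
    return alf_filter(spec_out, df_input=df_in, df_output=df_out)

print(loop_filter_unit([10, 20, 30]))  # -> [10, 20, 30] with placeholder filters
```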
18. A computer-readable medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the loop filtering method according to any one of claims 1 to 10; or implements the video decoding method of claim 11; or implements the video encoding method according to any one of claims 12 to 16.
19. An electronic device, comprising:
one or more processors;
a memory for storing one or more computer programs that, when executed by the one or more processors, cause the electronic device to implement the loop filtering method according to any one of claims 1 to 10; or implement the video decoding method of claim 11; or implement the video encoding method according to any one of claims 12 to 16.
20. A computer program product, comprising a computer program stored in a computer-readable storage medium, wherein a processor of an electronic device reads and executes the computer program from the computer-readable storage medium, causing the electronic device to perform the loop filtering method according to any one of claims 1 to 10; or the video decoding method of claim 11; or the video encoding method according to any one of claims 12 to 16.
CN202210444332.3A 2022-04-20 2022-04-20 Loop filtering method, video encoding and decoding method, device, medium and electronic equipment Pending CN116962688A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210444332.3A CN116962688A (en) 2022-04-20 2022-04-20 Loop filtering method, video encoding and decoding method, device, medium and electronic equipment
PCT/CN2022/137900 WO2023202097A1 (en) 2022-04-20 2022-12-09 Loop filtering method, video coding method and apparatus, video decoding method and apparatus, medium, program product, and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210444332.3A CN116962688A (en) 2022-04-20 2022-04-20 Loop filtering method, video encoding and decoding method, device, medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN116962688A true CN116962688A (en) 2023-10-27

Family

ID=88419029

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210444332.3A Pending CN116962688A (en) 2022-04-20 2022-04-20 Loop filtering method, video encoding and decoding method, device, medium and electronic equipment

Country Status (2)

Country Link
CN (1) CN116962688A (en)
WO (1) WO2023202097A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9860530B2 (en) * 2011-10-14 2018-01-02 Hfi Innovation Inc. Method and apparatus for loop filtering
KR102276854B1 * 2014-07-31 2021-07-13 Samsung Electronics Co., Ltd. Method and apparatus for video encoding for using in-loop filter parameter prediction, method and apparatus for video decoding for using in-loop filter parameter prediction
US20160241881A1 (en) * 2015-02-13 2016-08-18 Mediatek Inc. Method and Apparatus of Loop Filters for Efficient Hardware Implementation
US11451773B2 (en) * 2018-06-01 2022-09-20 Qualcomm Incorporated Block-based adaptive loop filter (ALF) design and signaling

Also Published As

Publication number Publication date
WO2023202097A1 (en) 2023-10-26

Similar Documents

Publication Publication Date Title
WO2022174660A1 (en) Video coding and decoding method, video coding and decoding apparatus, computer-readable medium, and electronic device
WO2022078304A1 (en) Video decoding method and apparatus, computer readable medium, program, and electronic device
CN112543337B (en) Video decoding method, device, computer readable medium and electronic equipment
WO2022174701A1 (en) Video coding method and apparatus, video decoding method and apparatus, and computer-readable medium and electronic device
WO2022174637A1 (en) Video encoding and decoding method, video encoding and decoding apparatus, computer-readable medium and electronic device
WO2023202097A1 (en) Loop filtering method, video coding method and apparatus, video decoding method and apparatus, medium, program product, and electronic device
WO2021263251A1 (en) State transition for dependent quantization in video coding
CN112449185B (en) Video decoding method, video encoding device, video encoding medium, and electronic apparatus
CN115209157A (en) Video encoding and decoding method and device, computer readable medium and electronic equipment
CN114979656B (en) Video encoding and decoding method and device, computer readable medium and electronic equipment
CN115086664A (en) Decoding method, encoding method, decoder and encoder for unmatched pixels
WO2023051222A1 (en) Filtering method and apparatus, encoding method and apparatus, decoding method and apparatus, computer-readable medium, and electronic device
WO2023130899A1 (en) Loop filtering method, video encoding/decoding method and apparatus, medium, and electronic device
WO2024082632A1 (en) Video coding method and apparatus, video decoding method and apparatus, and computer-readable medium and electronic device
EP4412218A1 (en) Filtering method and apparatus, encoding method and apparatus, decoding method and apparatus, computer-readable medium, and electronic device
WO2024212676A1 (en) Method and apparatus for video encoding and decoding, computer-readable medium and electronic device
WO2022174659A1 (en) Video coding and decoding method and apparatus, computer-readable medium, and electronic device
WO2024109099A1 (en) Video coding method and apparatus, video decoding method and apparatus, and computer-readable medium and electronic device
JP7483029B2 (en) VIDEO DECODING METHOD, VIDEO ENCODING METHOD, DEVICE, MEDIUM, AND ELECTRONIC APPARATUS
WO2024119821A1 (en) Video data processing method and apparatus, storage medium, device, and program product
CN115209138A (en) Video encoding and decoding method and device, computer readable medium and electronic equipment
WO2023168257A2 (en) State transition of dependent quantization for aom enhanced compression model
CN115209141A (en) Video encoding and decoding method and device, computer readable medium and electronic equipment
JP2024514934A (en) Method, computing system and computer program for division-free stochastic regularization for arithmetic coding
CN115209146A (en) Video encoding and decoding method and device, computer readable medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication