
US8228988B2 - Encoding device, encoding method, decoding device, and decoding method - Google Patents

Encoding device, encoding method, decoding device, and decoding method Download PDF

Info

Publication number
US8228988B2
Authority
US
United States
Prior art keywords
encoding
decoding
pictures
encoded information
delay
Prior art date
Legal status
Expired - Fee Related
Application number
US13/046,341
Other versions
US20110158318A1 (en)
Inventor
Hendrikus Markus VELTMAN
Yoichi Yagasaki
Teruhiko Suzuki
Current Assignee
Sony Corp
Original Assignee
Sony Corp
Priority date
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Priority to US13/046,341
Publication of US20110158318A1
Application granted
Publication of US8228988B2
Anticipated expiration
Status: Expired - Fee Related

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N 19/50: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N 19/577: Motion compensation with bidirectional frame interpolation, i.e. using B-pictures
    • H04N 19/114: Adapting the group of pictures [GOP] structure, e.g. number of B-frames between two anchor frames
    • H04N 19/152: Data rate or code amount at the encoder output by measuring the fullness of the transmission buffer
    • H04N 19/156: Availability of hardware or computational resources, e.g. encoding based on power-saving criteria
    • H04N 19/157: Assigned coding mode, i.e. the coding mode being predefined or preselected to be further used for selection of another element or parameter
    • H04N 19/172: Coding unit being an image region, e.g. an object, the region being a picture, frame or field
    • H04N 19/61: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using transform coding in combination with predictive coding

Definitions

  • FIG. 1 is a block diagram showing an image reproduction system according to the first embodiment.
  • FIG. 2 is a functional block diagram showing the processing contents of an encoding control unit.
  • FIG. 3 is a table used for explaining calculation of output timing.
  • FIG. 4 is a flowchart showing a procedure for an output timing notification process.
  • FIG. 5 is a block diagram showing an image reproduction system according to the second embodiment.
  • FIG. 6 is a table used for explaining control of re-output.
  • FIG. 7 is a flowchart showing a procedure for a re-output control process.
  • FIG. 8 is a block diagram showing the construction of an encoding apparatus.
  • FIG. 9 is a block diagram showing the construction of a decoding apparatus.
  • in FIG. 1 , reference numeral 1 denotes an image reproduction system according to the first embodiment as a whole, which is constructed by connecting an encoding apparatus 2 employing the JVT encoding system and a decoding apparatus 3 to each other with a prescribed transmission line.
  • the encoding apparatus 2 includes an encoding unit 10 having the same construction as the encoding apparatus 100 described with reference to FIG. 8 and an encoding control unit 11 for controlling the encoding unit 10 . Similarly to the case described with reference to FIG. 8 , it performs an encoding process on successive image information D 1 through the encoding unit 10 controlled by the encoding control unit 11 to successively create encoded information D 2 (D 2 a , D 2 b , . . . or D 2 n ) corresponding to unit image information (frame data or field data) D 1 a to D 1 n , and then outputs the encoded information D 2 , the successive image information D 1 being supplied from the outside or being read from an internal recording medium (not shown) such as an HDD (Hard Disk Drive).
  • the decoding apparatus 3 includes a decoding unit 20 having the same construction as the decoding apparatus 120 described with reference to FIG. 9 and a decoding control unit 21 for controlling the decoding unit 20 , and similarly to the case described with reference to FIG. 9 , performs a decoding process on the encoded information D 2 successively inputted through the transmission line, through the decoding unit 20 controlled by the decoding control unit 21 to successively create restored image information D 3 (D 3 a , D 3 b , . . . or D 3 n ), and then successively outputs the restored image information D 3 to a display unit (not shown), which results in continuous reproduction.
  • This encoding control unit 11 of the encoding apparatus 2 detects the GOP structure of the successive image information D 1 inputted to the encoding unit 10 and the conditions on the encoding process, such as the encoding order (hereinafter referred to as encoding conditions), based on previously stored programs, table information and so on for the JVT encoding system, and controls the encoding unit 10 according to the encoding conditions.
  • the encoding control unit 11 calculates output timing of the restored image information D 3 which is a result of decoding the encoded information D 2 , anticipating that the encoded information D 2 will be successively decoded on a decoding side (decoding apparatus 3 ), and performs an output timing notification process to notify the decoding apparatus 3 of this output timing before the restored image information D 3 is created.
  • when the processing contents of the output timing notification process by the encoding control unit 11 are divided functionally, they can be divided into a delay calculation unit 11 a for calculating the period of time from when a decoding process of encoded information D 2 is started until the decoded image information D 3 created by the process is outputted (hereinafter, this period of time is referred to as a decode delay), and a header addition unit 11 b for adding the decode delay to the corresponding encoded information D 2 as its header, as shown in FIG. 2 .
  • in FIG. 3 , a column “EI (Encoder Input)” shows the picture types which are assigned to the unit image information D 1 a to D 1 n to be inputted to the encoding unit 10 , according to the GOP structure, that is, the picture-type order before the encoding process.
  • a column “EO (Encoder Output)” shows the picture types of the encoded information D 2 a to D 2 n created by performing the encoding process on the unit image information D 1 a to D 1 n inputted to the encoding unit 10 in a prescribed encoding order, that is, a picture-type order after the encoding process.
  • a column “Ed (Encoder Delay)” shows a period of time after the encoding process of unit image information D 1 a , D 1 b , . . . or D 1 n is started until the encoded information D 2 created by the process is outputted (hereinafter, this period of time is referred to as an encode delay) and the period of time is calculated based on the encoding conditions. Based on encode delays (in a column “Ed”), output timing of the encoded information D 2 a to D 2 n (in a column “EO”) are adjusted if necessary.
  • for example, the first unit image information D 1 a (“I00” in the column “EI”), which is of an I-picture type, has an encode delay of 6 [seconds] (in the column “Ed”).
  • the delay calculation unit 11 a calculates decode delays (in a column “Dd (Decoder Delay)”) for the encoded information D 2 based on the encode delays, and calculates the decode delays of the encoded information D 2 a to D 2 n so that a decode delay of encoded information D 2 (“B13”) having the longest encode delay (10 seconds) is the shortest (0 second).
  • the delay calculation unit 11 a subtracts the value of the encode delay corresponding to each piece of encoded information D 2 a to D 2 n from 10 [seconds] to thereby calculate the values of decode delays (values in the column “Dd”) for the encoded information D 2 a to D 2 n.
  • the delay calculation unit 11 a creates decode delay information D 10 from the decode delays thus calculated and sends it to the header addition unit 11 b.
  • the header addition unit 11 b adds, based on the decode delay information D 10 supplied from the delay calculation unit 11 a , a corresponding decode delay to the encoded information D 2 by placing the delay in the header.
  • the encoded information D 2 a (“I00” in the column “EO”) having such a decode delay added as its header is outputted when the encode delay has elapsed, and is inputted into the decoding apparatus 3 through the transmission line.
  • the transmission time through the transmission line between the encoding apparatus 2 and the decoding apparatus 3 is not considered here, and therefore the input timing of encoded information D 2 to the decoding apparatus 3 is taken to be the same as the output timing (in the column “EO”) of the encoded information D 2 from the encoding unit 10 .
  • the decoding apparatus 3 adjusts the output timing of restored image information D 3 which is a result of decoding encoded information D 2 , based on the decode delays of the encoded information D 2 . That is, the decode delay (4 [seconds] in the column “Dd”) of the first encoded information D 2 a , for example, is recognized by the decoding control unit 21 of the decoding apparatus 3 based on the header added to the encoded information D 2 a before the encoded information D 2 a is decoded.
  • as for the other encoded information D 2 b to D 2 n , similarly to the encoded information D 2 a , they are converted into the restored image information D 3 b to D 3 n by the decoding unit 20 , and these restored image information D 3 b to D 3 n are rearranged, if necessary, by the image rearrangement buffer 126 ( FIG. 9 ) so as to have the same picture-type order as before the encoding process (in a column “DO (Decoder Output)”), and are outputted after their output timing are adjusted based on the corresponding decode delays, which results in continuous reproduction.
  • the encoding control unit 11 performs the output timing notification process as described above, to thereby make the decoding apparatus 3 recognize, via the headers, the decode delays calculated on the assumption that the encoded information D 2 will be decoded by the decoding apparatus 3 .
  • the encoding control unit 11 starts the procedure RT 1 for the output timing notification process from step SP 0 when predetermined operations to execute the encoding process are performed with an input unit (not shown), and finds out the longest encode delay (which corresponds to 10 [seconds] in the column “Ed” in FIG. 3 ) out of encode delays calculated on the encoding conditions at following step SP 1 .
  • the encoding control unit 11 stores, for example, the encoded information D 2 a existing in the reverse encoding unit 106 ( FIG. 8 ) of the encoding unit 10 , in the storage buffer 107 ( FIG. 8 ) at step SP 2 , and judges at step SP 3 whether the storage has been done successfully, and if a negative result is obtained, the process returns back to step SP 2 to re-store the encoded information D 2 a.
  • the encoding control unit 11 subtracts the encode delay (which corresponds to 6 [seconds] in the column “Ed” in FIG. 3 ) for the encoded information D 2 a stored at step SP 2 from the encode delay recognized at step SP 1 , to thereby calculate a decode delay (which corresponds to 4 [seconds] in the column “Dd”) for the encoded information D 2 a , and adds the calculated decode delay to the encoded information D 2 a as a header at step SP 4 .
  • the encoding control unit 11 judges at step SP 5 whether all of the encoded information D 2 have been taken in the storage buffer 107 ( FIG. 8 ), and if a negative result is obtained, it returns back to step SP 2 and repeats the above processes, and on the contrary, if an affirmative result is obtained, it moves on to step SP 6 where the procedure RT 1 for the output timing notification process is terminated.
  • the encoding control unit 11 can execute the output timing notification process following the procedure RT 1 for the output timing notification process.
  • the encoding apparatus 2 calculates output timing (decode delay) for a result (restored image information D 3 ) of decoding the encoded information D 2 , anticipating that the encoded information D 2 obtained as a result of encoding with the JVT encoding system will be decoded on the decoding apparatus 3 side, and adds the calculated output timing as a header.
  • this encoding apparatus 2 can make the decoding apparatus 3 recognize, via the headers and before the decoding process, the decode delays which are obtained assuming that the encoded information D 2 will be decoded by the decoding apparatus 3 , so that even if the encoded information D 2 has been encoded with the JVT encoding system, which does not allow the decoding order to be naturally determined, the output continuousness of the restored image information D 3 can be kept.
  • the encoding apparatus 2 calculates the output timing (decode delays) for the restored image information D 3 so as to immediately output the result of decoding the encoded information D 2 (“B13” in FIG. 3 ) having the longest period of time from the start of the encoding process until the encoded information D 2 is outputted, that is, the longest encode delay.
  • the encoding apparatus 2 can therefore calculate the output timing (decode delays) for the results (restored image information D 3 ) of decoding encoded information D 2 based on the output timing (encode delay) of the encoded information D 2 which needs the longest time before it can be decoded on the decoding side, and as a result, the output timing from the decoding apparatus 3 can be adjusted (offset) so that underflow does not occur.
  • the aforementioned first embodiment has described the case where the JVT encoding system is applied.
  • This invention is not limited to this and another kind of encoding system which can treat at least B-pictures as pictures to be prediction-encoded can be applied.
  • further, the aforementioned first embodiment has described the case where a timing calculation means for calculating output timing for results of decoding a plurality of encoded information D 2 created by performing the encoding process, anticipating that the encoded information will be sequentially decoded on a decoding side, calculates decode delays for the results of decoding the encoded information so that a result of decoding the encoded information D 2 having the longest encode delay out of the encode delays is immediately outputted.
  • This invention is not limited to this and a period of time after a result of decoding encoded information is obtained until its output timing may be calculated instead of the decode delays, or a decode delay for a result of decoding encoded information may be calculated based on a calculation result on an assumed occupation rate of information stored in a buffer of the decoding side. If output timing can be calculated based on such a calculation result, such output timing can be calculated as to make the decoding side perform stable continuous reproduction.
  • still further, the aforementioned first embodiment has described the case where a timing notification means for notifying a decoding side of output timing before a result of decoding corresponding encoded information is obtained adds the output timing (decode delay) to the corresponding encoded information D 2 as its header, based on the decode delay information D 10 supplied from the delay calculation unit 11 a .
  • This invention is not limited to this and the decode delay information D 10 can be directly outputted to the decoding side before an encoding process, without being added as a header.
  • in FIG. 5 , reference numeral 51 denotes an image reproduction system according to the second embodiment as a whole, and the system is constructed by connecting an encoding apparatus 52 employing the JVT encoding system and a decoding apparatus 53 to each other with a prescribed transmission line.
  • the encoding apparatus 52 has an encoding unit 10 and an encoding control unit 61 for controlling the encoding unit 10 , and the encoding control unit 61 does not perform the aforementioned output timing notification process but carries out the other processes performed in the aforementioned first embodiment.
  • the encoding apparatus 52 performs an encoding process on successive image information D 1 with the encoding unit 10 controlled by the encoding control unit 61 , to thereby successively create encoded information D 2 without a decode delay added as a header, and then successively outputs the encoded information D 2 .
  • the decoding control unit 71 of this decoding apparatus 53 temporarily stores the successively inputted encoded information D 2 in a storage buffer 121 ( FIG. 9 ) of the decoding unit 20 , detects conditions on decoding (hereinafter referred to as decoding conditions), such as a decoding order of the encoded information D 2 and a start time of a decoding process, based on the headers of the encoded information D 2 , and thus can control the decoding unit 20 according to the decoding conditions.
  • the decoding control unit 71 watches a storage state of the restored image information D 3 stored in an image rearrangement buffer 126 ( FIG. 9 ) after the decoding process, and when detecting underflow as the storage state, carries out a re-output control process for re-outputting the restored image information D 3 outputted just before the detection (underflow). This re-output control process will be described by using an example shown in FIG. 6 .
  • in FIG. 6 , a column “EI” shows a picture-type order before an encoding process, a column “EO” shows a picture-type order after the encoding process, a column “DO” shows a picture-type order after decoding, a column “Ed” shows encode delays, and a column “Dd” shows decode delays.
  • a transmission time through the transmission line between the encoding apparatus 52 and the decoding apparatus 53 is not considered, and an input time of encoded information D 2 to the decoding apparatus 53 is taken to be the same as an output time (in the column “EO”) of the encoded information D 2 from the encoding unit 10 .
  • a column “Sud (Start-up delay)” shows a period of time after each piece of encoded information D 2 is inputted until a decoding process is started (hereinafter, this period of time is referred to as a start-up delay), and this start-up delay is calculated based on the encoding conditions by the encoding control unit 61 of the encoding apparatus 52 and is added as a header.
  • when the decoding control unit 71 receives the first encoded information D 2 a (“I00” in the column “EO”), it ignores the start-up delay for the encoded information D 2 a (“6” [seconds] in the column “Sud”) and performs control to immediately send the encoded information D 2 a to the reverse decoding unit 122 and start a decoding process (“0” [second] in the column “Dd”). As a result, the decoding control unit 71 can shorten the preparation time (driving time) for continuous reproduction.
  • the decoding control unit 71 has the encoded information D 2 b , D 2 c , . . . successively inputted following the encoded information D 2 a (“P01”, “P02”, . . . in the column “EO”) decoded according to their start-up delays, and manages the storage state of the restored image information D 3 b , D 3 c , . . . (“P01”, “P02”, . . . in the column “DO”) being stored in the image rearrangement buffer 126 ( FIG. 9 ) after the decoding process.
  • when the storage of restored image information D 3 in the image rearrangement buffer 126 has failed (underflow), the decoding control unit 71 re-outputs the restored image information D 3 outputted just before the failure (“P05” and “P10” in the column “DO”), to thereby periodically offset the remaining delay amount to be adjusted.
  • the decoding control unit 71 executes the re-output control process, so as to keep the output continuousness of the restored image information D 3 .
  • the decoding control unit 71 starts the procedure RT 2 for the re-output control process from step SP 10 , and waits for the first encoded information D 2 a to be inputted at next step SP 11 , and when receiving the encoded information D 2 a , moves on to step SP 12 .
  • the decoding control unit 71 ignores a start-up delay for the encoded information D 2 a and immediately starts the decoding process at step SP 12 , and after outputting restored image information D 3 a obtained as a result of the decoding process at next step SP 13 , re-outputs the restored image information D 3 a at step SP 14 to offset a part of a delay amount to be adjusted.
  • the decoding control unit 71 starts decoding of the following encoded information D 2 (D 2 b , D 2 c , . . . D 2 n ) and outputs the restored image information D 3 (D 3 b , D 3 c , . . . D 3 n ) obtained by the process at step SP 15 , and at the next step SP 16 judges whether the storage of restored image information D 3 in the image rearrangement buffer 126 ( FIG. 9 ) has failed, and then if a negative result is obtained, returns back to step SP 15 and repeats the above processes.
  • if an affirmative result is obtained at step SP 16 , the decoding control unit 71 re-outputs the restored image information D 3 outputted just before the failure, and then judges at step SP 18 whether all of the delay amount to be adjusted has been offset; if a negative result is obtained, it returns back to step SP 15 and repeats the above processes, and on the contrary, if an affirmative result is obtained, it moves on to step SP 19 where this procedure RT 2 for the re-output control process is terminated.
  • the decoding control unit 71 can carry out the re-output control process following the procedure RT 2 for the re-output control process.
  • this decoding apparatus 53 temporarily stores the restored image information D 3 successively created by performing the decoding process on the encoded information D 2 which has been encoded with the JVT encoding system, in the image rearrangement buffer 126 ( FIG. 9 ), and if the storage of restored image information D 3 in the image rearrangement buffer 126 ( FIG. 9 ) has failed, re-outputs the restored image information D 3 outputted just before the failure.
  • the decoding apparatus 53 can therefore keep the output continuousness of the restored image information D 3 even if the encoded information D 2 has been subjected to encoding with the JVT encoding system, which does not allow a decoding order to be naturally determined.
  • the decoding apparatus 53 ignores the decoding start time (start-up delay) set for the first encoded information D 2 a stored in the storage buffer 121 ( FIG. 9 ) and immediately starts decoding of the encoded information D 2 a , and when the storage of restored image information D 3 in the image rearrangement buffer 126 ( FIG. 9 ) has failed, offsets the lag (delay amount to be adjusted) from the set decoding start time caused by ignoring the start-up delay, by re-outputting the restored image information outputted just before the failure.
  • the decoding apparatus 53 can thus shorten the preparation time (driving time) for continuous reproduction and can keep the output continuousness of the restored image information while periodically dispersing the lag (delay amount to be adjusted) caused by the shortening.
  • the restored image information D 3 sequentially created by performing the decoding process on the encoded information D 2 which has been subjected to encoding with the JVT encoding system is temporarily stored in the image rearrangement buffer 126 ( FIG. 9 ), and when the storage of restored image information D 3 in the image rearrangement buffer 126 ( FIG. 9 ) has failed, the restored image information D 3 outputted just before the failure is re-outputted, so that even if the encoded information D 2 has been subjected to encoding with the JVT encoding system, which does not allow a decoding order to be naturally determined, the output continuousness of the restored image information D 3 can be kept, which results in continuous reproduction.
  • the decoding control unit 71 serving as an output control means ignores the decoding start time (start-up delay) set for the first encoded information D 2 a being stored in the storage buffer 121 ( FIG. 9 ) and immediately starts decoding of the encoded information D 2 a , and when the storage of restored image information D 3 in the image rearrangement buffer 126 ( FIG. 9 ) has failed, offsets the lag (delay amount to be adjusted) from the set decoding start time caused by ignoring the start-up delay, by re-outputting the restored image information outputted just before the failure.
  • This invention is not limited to this and, in a case where a different order is generated, the restored image information D 3 corresponding to the encoded information D 2 having the different order may be re-outputted.
  • a decode delay which becomes longer when a different order is generated can be filled by re-output, which results in more assured output continuousness of the restored image information D 3 .
  • This invention can be used for a case of transmitting successive image information via a network medium such as satellite broadcasting, cable TV or the Internet, or a case of processing the successive image information on a storage medium such as an optical disc, magnetic disk or flash memory.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

Continuous reproduction can be made possible. An encoding apparatus for executing an encoding process with an encoding system capable of treating at least B-pictures as pictures to be prediction-encoded comprises a timing calculation means for, anticipating that a plurality of encoded information created by performing the encoding process will be sequentially decoded on a decoding side, calculating output timing for results of decoding the encoded information, and a timing notification means for notifying the decoding side of the output timing calculated by the timing calculation means before a result of decoding the corresponding encoded information is obtained.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application is a continuation of U.S. application Ser. No. 10/519,840, filed Jul. 1, 2005, the entire contents of which are incorporated herein by reference. U.S. application Ser. No. 10/519,840 is a national Stage of PCT/JP03/09138 filed Jul. 18, 2003, and claims the benefit of priority under 35 U.S.C. §119 of Japanese Application No. 2002-211829 filed Jul. 19, 2002, and Japanese Application No. 2003-276326 filed Jul. 17, 2003.
TECHNICAL FIELD
This invention relates to an encoding apparatus, encoding method, decoding apparatus and decoding method, and for example, is suitably applied for transmitting image information (hereinafter referred to as successive image information) composed of a plurality of pieces of unit image information in succession, via a network medium such as satellite broadcasting, cable TV or the Internet, or for processing successive image information on a storage medium such as an optical disc, a magnetic disk or a flash memory.
BACKGROUND ART
Recently, devices that handle successive images as digital information and adopt an encoding system such as MPEG (Moving Picture Experts Group) have spread to broadcasting stations and homes; such systems encode (compress) the successive images through orthogonal transformation such as the discrete cosine transform, and through motion compensation, utilizing the redundancy of the successive image information, for efficient information transmission or storage.
Especially, the MPEG2 (ISO/IEC 13818-2) encoding system is defined as a general-purpose image encoding system, and is widely used in applications for both professionals and consumers since it can handle interlaced images and progressively scanned images, as well as standard-resolution images and high-resolution images. By using this MPEG2 encoding system, high encoding efficiency (compression rate) and high image quality can be provided, for example, by assigning interlaced images of a standard resolution of 720×480 pixels an amount of encoding (bit rate) of 4 to 8 [Mbps], or by assigning progressively scanned images of a high resolution of 1920×1088 pixels a bit rate of 18 to 22 [Mbps].
The MPEG2 encoding system is mainly used for encoding high-quality images for broadcasting and does not cope with amounts of encoding (bit rates) lower than those of the MPEG1 encoding system, that is, with an encoding method of higher encoding efficiency. It was expected that the popularization of mobile terminals would create a strong need for such an encoding system, and therefore the MPEG4 encoding system was standardized. The MPEG4 encoding system for images was approved as international standard ISO/IEC 14496-2 in December 1998.
In addition, recently, an encoding system called MPEG4 AVC or H.264 (hereinafter referred to as the JVT encoding system) was standardized by a Joint Video Team composed of the VCEG group and the MPEG group. Compared with MPEG2 and MPEG4, this JVT encoding system can provide higher encoding efficiency, although it requires more operations for encoding and decoding.
Now, FIG. 8 shows a rough construction of an encoding apparatus which realizes an encoding process with any of the encoding systems referred to above. As shown in FIG. 8, the encoding apparatus 100 is composed of an image rearrangement buffer 102, an adder 103, an orthogonal transformation unit 104, a quantization unit 105, a reverse encoding unit 106, a storage buffer 107, a dequantization unit 108, an inverse orthogonal transformation unit 109, a frame memory 110, a motion prediction/compensation unit 111 and a rate control unit 112.
In this case, the encoding apparatus 100 stores successive image information in the image rearrangement buffer 102 to rearrange the successive image information according to GOP (Group of Pictures) structure on a unit-image-information basis (frame by frame or field by field).
The image rearrangement buffer 102 gives the orthogonal transformation unit 104 unit image information, out of the successive image information, which should be intra-prediction-encoded. The orthogonal transformation unit 104 applies an orthogonal transformation such as the discrete cosine transform or the Karhunen-Loève transform to the unit image information and gives the obtained orthogonal transformation coefficient to the quantization unit 105.
The quantization unit 105 performs a quantization process on the orthogonal transformation coefficient given from the orthogonal transformation unit 104, under the control of the rate control unit 112, and supplies obtained quantized information (a quantized orthogonal transformation coefficient) to the reverse encoding unit 106 and the dequantization unit 108. The reverse encoding unit 106 applies variable-length coding or reverse encoding such as arithmetic coding to the quantized information, and stores obtained encoded information (encoded quantized-information) in the storage buffer 107.
The dequantization unit 108 applies a dequantization process to the quantized information and supplies obtained orthogonal transformation coefficient to the inverse orthogonal transformation unit 109. The inverse orthogonal transformation unit 109 applies the inverse orthogonal transformation to the orthogonal transformation coefficient and stores, if necessary, obtained unit image information in the frame memory 110 as reference image information.
On the other hand, the image rearrangement buffer 102 supplies unit image information which should be inter-prediction-encoded, out of the successive image information, to the motion prediction/compensation unit 111. The motion prediction/compensation unit 111 performs a motion prediction/compensation process by using the unit image information and reference image information read from the frame memory 110, and supplies the obtained predicted image information to the adder 103. The adder 103 supplies to the orthogonal transformation unit 104 the difference between the predicted image information and the corresponding unit image information as differential information.
This differential information is subjected to various processes, as in the case of the intra-encoding, and the resultant is stored in the storage buffer 107 as encoded information and is stored, if necessary, in the frame memory 110 as reference image information.
In addition, the motion prediction/compensation unit 111 gives the reverse encoding unit 106 motion vector information which is obtained together with the predicted image information as a result of the motion prediction/compensation process. The reverse encoding unit 106 performs the reverse encoding process on the motion vector information to thereby obtain encoded motion vector information for the header part of the corresponding encoded information.
In such a manner, the encoding apparatus 100 successively creates encoded information on a unit-image-information basis by performing the encoding process on the successive image information, and successively outputs the encoded information via the storage buffer 107.
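To make the data flow of FIG. 8 easier to follow, the pipeline can be outlined in a short sketch. This is only an illustrative outline under simplifying assumptions (placeholder transforms and previous-picture prediction); every name in it is hypothetical and none of it is taken from the patent or from a real codec.

```python
# Illustrative sketch of the FIG. 8 data flow (hypothetical names, heavily simplified).
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Frame:
    pixels: List[float]      # unit image information (one frame or field)
    picture_type: str        # "I", "P" or "B"

def transform(samples: List[float]) -> List[float]:
    # Stands in for the orthogonal transformation unit 104 (DCT or Karhunen-Loeve transform).
    return [sum(samples) / len(samples)] + samples[1:]

def quantize(coeffs: List[float], q: int = 8) -> List[int]:
    # Quantization unit 105; in the real apparatus the step is steered by the rate control unit 112.
    return [round(c / q) for c in coeffs]

def entropy_encode(quantized: List[int]) -> bytes:
    # "Reverse encoding" unit 106 (variable-length or arithmetic coding), reduced to a placeholder.
    return bytes(abs(v) % 256 for v in quantized)

def encode(frames: List[Frame]) -> List[bytes]:
    reference: Optional[Frame] = None        # frame memory 110 (reference image information)
    storage_buffer: List[bytes] = []         # storage buffer 107 (encoded information)
    for frame in frames:
        if frame.picture_type == "I" or reference is None:
            residual = frame.pixels          # intra prediction encoding: the picture itself
        else:
            # Inter prediction encoding: adder 103 forms the difference from the prediction
            # (here simply the previous picture, instead of real motion prediction/compensation).
            residual = [a - b for a, b in zip(frame.pixels, reference.pixels)]
        storage_buffer.append(entropy_encode(quantize(transform(residual))))
        reference = frame                    # the real apparatus stores a locally decoded picture
    return storage_buffer

if __name__ == "__main__":
    gop = [Frame([10.0, 12.0, 11.0], "I"), Frame([11.0, 12.0, 13.0], "P")]
    print([len(chunk) for chunk in encode(gop)])
```

The essential structure is the two paths through the adder 103: intra pictures go straight to the transform, while inter pictures are first reduced to differential information against the reference image information.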
Next, FIG. 9 shows a rough construction of a decoding apparatus which performs a decoding process corresponding to the encoding system of the encoding apparatus 100. As shown in FIG. 9, the decoding apparatus 120 is composed of a storage buffer 121, a reverse decoding unit 122, a dequantization unit 123, an inverse orthogonal transformation unit 124, an adder 125, an image rearrangement buffer 126, a motion prediction/compensation unit 127, and a frame memory 128.
In this case, the decoding apparatus 120 temporarily stores encoded information which is successively inputted, in the storage buffer 121 and then supplies it to the reverse decoding unit 122. In a case where the encoded information have been subjected to the intra-prediction encoding, the reverse decoding unit 122 applies a decoding process, variable-length decoding or arithmetic decoding, to the encoded information, and supplies obtained quantized information to the dequantization unit 123.
The dequantization unit 123 applies a dequantization process to the quantized information given from the reverse decoding unit 122 and supplies obtained orthogonal transformation coefficient to the inverse orthogonal transformation unit 124. The inverse orthogonal transformation unit 124 applies an inverse orthogonal transformation process to the orthogonal transformation coefficient to thereby create the original image information before the encoding process (hereinafter, referred to as restored image information), and stores this in the image rearrangement buffer 126.
On the other hand, in a case where the encoded information have been subjected to the inter prediction encoding, the reverse decoding unit 122 performs a decoding process on both of this encoded information and the encoded motion vector information which has been inserted in the header part of the encoded information, and supplies obtained quantized information to the dequantization unit 123 and supplies the motion vector information to the motion prediction/compensation unit 127. The quantized information is subjected to various processes, as in the case of decoding encoded information intra-encoded, and is then supplied to the adder 125 as differential information.
In addition, the motion prediction/compensation unit 127 creates predicted image information based on the motion vector information and reference image information stored in the frame memory 128, and supplies this to the adder 125. The adder 125 synthesizes the reference image information and the differential information, and stores obtained restored image information to the image rearrangement buffer 126.
In the aforementioned manner, the decoding apparatus 120 successively creates restored image information by performing the decoding process on the encoded information successively inputted, and successively outputs the restored image information via the image rearrangement buffer 126 to, for example, a display unit (not shown) for successive reproduction.
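The decoding side mirrors that flow. The sketch below, again with hypothetical names and placeholder inverse operations, shows how the units of FIG. 9 turn encoded information back into restored image information and accumulate it in a rearrangement buffer.

```python
# Illustrative sketch of the FIG. 9 data flow (hypothetical names, heavily simplified).
from typing import List

def entropy_decode(encoded: bytes) -> List[int]:
    # Reverse decoding unit 122 (variable-length or arithmetic decoding), reduced to a placeholder.
    return list(encoded)

def dequantize(quantized: List[int], q: int = 8) -> List[float]:
    # Dequantization unit 123.
    return [float(v * q) for v in quantized]

def inverse_transform(coeffs: List[float]) -> List[float]:
    # Inverse orthogonal transformation unit 124, reduced to a placeholder.
    return coeffs

def decode(stream: List[bytes]) -> List[List[float]]:
    frame_memory: List[float] = []               # frame memory 128 (last reference picture)
    rearrangement_buffer: List[List[float]] = [] # image rearrangement buffer 126
    for encoded in stream:
        residual = inverse_transform(dequantize(entropy_decode(encoded)))
        if not frame_memory:                     # intra-coded picture: the residual is the picture
            restored = residual
        else:                                    # inter-coded picture: adder 125 adds the prediction
            restored = [a + b for a, b in zip(frame_memory, residual)]
        frame_memory = restored
        rearrangement_buffer.append(restored)    # restored image information, awaiting output
    return rearrangement_buffer

if __name__ == "__main__":
    print(decode([bytes([10, 12, 11]), bytes([1, 0, 2])]))
```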
By the way, since the MPEG2 encoding system prescribes that only I (Intra)-pictures and P (Predictive)-pictures are used as pictures for inter-prediction-encoding, a decoding order for the decoding process is naturally determined.
Therefore, in a case where the decoding apparatus 120 successively reproduces restored image information which is created by performing the decoding process on encoded information with the MPEG2 encoding system, it can appropriately display images based on the restored image information on the display unit without adjusting the output timing of the restored image information at the image rearrangement buffer 126.
On the other hand, as compared with the MPEG2 encoding system, the JVT encoding system has a larger degree of freedom for selection of pictures for prediction-encoding, for example, it can treat not only I- and P-pictures but also B (Bidirectional)-pictures as pictures for inter-prediction-encoding.
However, the JVT encoding system does not prescribe a decoding order in a decoding process and further does not specify output timing of restored image information.
Therefore, if the decoding apparatus 120 successively reproduces restored image information which is created by performing the decoding process on encoded information with the JVT encoding system, it can happen that encoded information is still being decoded at the output timing for the restored image information corresponding to that encoded information, owing to the limited resources of the image rearrangement buffer 126, and as a result, the continuousness of the output is broken.
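The root of the problem is that B-pictures make the decoding order differ from the display order. The toy example below (a purely hypothetical GOP, not one taken from the patent) shows that a decoder which has no output-timing information can reach a display slot whose picture has not been decoded yet, which is exactly the break in continuousness described above.

```python
# Toy illustration (not from the patent): with B-pictures, decoding order differs from display order.
display_order  = ["I00", "B01", "B02", "P03", "B04", "B05", "P06"]
# A B-picture needs the later reference picture first, so a typical decoding order is:
decoding_order = ["I00", "P03", "B01", "B02", "P06", "B04", "B05"]

# If one picture is decoded per display slot, the picture wanted at slot i may not be ready yet.
for slot, picture in enumerate(display_order):
    decoded_so_far = decoding_order[:slot + 1]
    status = "ready" if picture in decoded_so_far else "still being decoded"
    print(f"display slot {slot}: {picture} is {status}")
```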
DISCLOSURE OF THE INVENTION
This invention was made in view of the above points and intends to provide an encoding apparatus and an encoding method that enable a decoding apparatus to perform continuous reproduction, as well as a decoding apparatus and a decoding method capable of continuous reproduction.
To solve such problems, in this invention, an encoding apparatus which carries out an encoding process with an encoding system capable of treating at least B-pictures as pictures to be prediction-encoded comprises timing calculation means for calculating the output timing of the results of decoding a plurality of pieces of encoded information, on the assumption that the encoded information created through the encoding process will be successively decoded on the decoding side, and timing notification means for notifying the decoding side of each output timing calculated by the timing calculation means before the result of decoding the corresponding encoded information is obtained.
Therefore, the encoding apparatus of this invention can make the decoding side recognize the output timing calculated on the assumption that the encoded information will be decoded there, so that the output continuity of the restored image information can be kept even if the encoded information has been encoded with an encoding system that does not allow the decoding order to be naturally determined; the decoding side can thus perform continuous reproduction.
Further, in an encoding method for performing an encoding process with an encoding system capable of treating at least B-pictures as pictures to be prediction-encoded, it is assumed that a plurality of pieces of encoded information created by the encoding process will be successively decoded on the decoding side; the output timing of the results of decoding the encoded information is calculated, and the decoding side is notified of the calculated output timing before the result of decoding the corresponding encoded information is obtained.
Therefore, in the encoding method according to this invention, by making the decoding side recognize the output timing calculated on the assumption that the encoded information will be decoded there, the output continuity of the restored image information can be kept even if the encoded information has been encoded with an encoding system that does not allow the decoding order to be naturally determined; the decoding side can thus perform continuous reproduction.
Still further, a decoding apparatus for performing a decoding process on a plurality of pieces of encoded information encoded with an encoding system capable of treating at least B-pictures as pictures to be prediction-encoded comprises storage means for temporarily storing the restored image information successively created as a result of the decoding process, and output control means for controlling the output of the restored image information stored in the storage means. If restored image information fails to be stored in the storage means, the output control means re-outputs the restored image information outputted just before the failure.
Therefore, the decoding apparatus of this invention can keep the output continuity of the restored image information even if the encoded information has been encoded with an encoding system that does not allow the decoding order to be naturally determined, and thus can perform continuous reproduction.
Still further, in this invention, a decoding method for performing a decoding process on a plurality of pieces of encoded information encoded with an encoding system capable of treating at least B-pictures as pictures to be prediction-encoded comprises a first step of temporarily storing the restored image information sequentially created as a result of the decoding process, a second step of outputting the stored restored image information, and a third step of, in a case where restored image information has failed to be stored, re-outputting the restored image information outputted just before the failure.
Therefore, in the decoding method of this invention, even if the encoded information has been encoded with an encoding system that does not allow the decoding order to be naturally determined, the output continuity of the restored image information can be kept, and thus continuous reproduction can be performed.
DETAILED DESCRIPTION OF THE DRAWINGS
FIG. 1 is a block diagram showing an image reproduction system according to the first embodiment.
FIG. 2 is a functional block diagram showing the processing contents of an encoding control unit.
FIG. 3 is a table used for explaining calculation of output timing.
FIG. 4 is a flowchart showing a procedure for an output timing notification process.
FIG. 5 is a block diagram showing an image reproduction system according to the second embodiment.
FIG. 6 is a table used for explaining control of re-output.
FIG. 7 is a flowchart showing a procedure for a re-output control process.
FIG. 8 is a block diagram showing the construction of an encoding apparatus.
FIG. 9 is a block diagram showing the construction of a decoding apparatus.
BEST MODE FOR CARRYING OUT THE INVENTION
Embodiment 1
(1) Construction of Image Reproduction System 1
Referring to FIG. 1, reference numeral 1 denotes an image reproduction system according to the first embodiment as a whole, which is constructed by connecting an encoding apparatus 2 employing the JVT encoding system and a decoding apparatus 3 to each other via a prescribed transmission line.
The encoding apparatus 2 includes an encoding unit 10 having the same construction as the encoding apparatus 100 described with reference to FIG. 8 and an encoding control unit 11 for controlling the encoding unit 10. Similarly to the case described with reference to FIG. 8, the encoding apparatus 2 performs an encoding process on successive image information D1, which is supplied from the outside or read from an internal recording medium (not shown) such as an HDD (Hard Disk Drive), through the encoding unit 10 controlled by the encoding control unit 11, to successively create encoded information D2 (D2a, D2b, . . . , D2n) from the unit image information (frame data or field data) D1a to D1n, and then outputs the encoded information D2.
The decoding apparatus 3, on the other hand, includes a decoding unit 20 having the same construction as the decoding apparatus 120 described with reference to FIG. 9 and a decoding control unit 21 for controlling the decoding unit 20. Similarly to the case described with reference to FIG. 9, the decoding apparatus 3 performs a decoding process on the encoded information D2 successively inputted through the transmission line, through the decoding unit 20 controlled by the decoding control unit 21, to successively create restored image information D3 (D3a, D3b, . . . , D3n), and then successively outputs the restored image information D3 to a display unit (not shown), which results in continuous reproduction.
(2) Construction of Encoding Control Unit 11
The encoding control unit 11 of the encoding apparatus 2 detects the GOP structure of the successive image information D1 inputted to the encoding unit 10 and the conditions of the encoding process, such as the encoding order (hereinafter referred to as encoding conditions), based on previously stored programs, table information and so on for the JVT encoding system, and controls the encoding unit 10 according to these encoding conditions.
In addition, the encoding control unit 11 calculates the output timing of the restored image information D3, which is the result of decoding the encoded information D2, on the assumption that the encoded information D2 will be successively decoded on the decoding side (decoding apparatus 3), and performs an output timing notification process to notify the decoding apparatus 3 of this output timing before the restored image information D3 is created.
Now, when the processing contents of the output timing notification process performed by the encoding control unit 11 are divided functionally, they can be divided, as shown in FIG. 2, into a delay calculation unit 11a for calculating the period of time from when the decoding process of encoded information D2 is started until the restored image information D3 created by the process is outputted (hereinafter, this period of time is referred to as a decode delay), and a header addition unit 11b for adding the decode delay to the corresponding encoded information D2 as its header. The processes performed by the delay calculation unit 11a and the header addition unit 11b will be described using the example shown in FIG. 3.
Referring to FIG. 3, a column “EI (Encoder Input)” shows the picture types assigned to the unit image information D1a to D1n to be inputted to the encoding unit 10, according to the GOP structure, that is, the picture-type order before the encoding process. A column “EO (Encoder Output)” shows the picture types of the encoded information D2a to D2n created by performing the encoding process on the unit image information D1a to D1n inputted to the encoding unit 10 in a prescribed encoding order, that is, the picture-type order after the encoding process.
A column “Ed (Encoder Delay)” shows the period of time from when the encoding process of unit image information D1a, D1b, . . . , D1n is started until the encoded information D2 created by the process is outputted (hereinafter, this period of time is referred to as an encode delay); this period of time is calculated based on the encoding conditions. Based on the encode delays (in the column “Ed”), the output timing of the encoded information D2a to D2n (in the column “EO”) is adjusted if necessary. Specifically, as to the first unit image information D1a (“I00” in the column “EI”), which is of an I-picture type, for example, if the period of time from when it is inputted (t=0) until it is converted into the encoded information D2a (“I00” in the column “EO”) is shorter than the corresponding encode delay (6 [seconds] in the column “Ed”), its output timing is adjusted by storage in the storage buffer 107 (FIG. 8) of the encoding unit 10, and the information is outputted when the encode delay has elapsed (t=6).
The delay calculation unit 11a calculates the decode delays (in a column “Dd (Decoder Delay)”) for the encoded information D2 based on the encode delays; it calculates the decode delays of the encoded information D2a to D2n so that the decode delay of the piece of encoded information D2 (“B13”) having the longest encode delay (10 [seconds]) becomes the shortest (0 [seconds]).
Specifically, since the longest encode delay is 10 [seconds], the delay calculation unit 11a subtracts the encode delay of each piece of encoded information D2a to D2n from 10 [seconds], to thereby calculate the decode delay (the value in the column “Dd”) for each piece of encoded information D2a to D2n.
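A minimal sketch of this subtraction, assuming the delays are expressed in seconds and keyed by picture name (only the “I00” and “B13” values below appear in the FIG. 3 example; the function itself is illustrative):

    def calc_decode_delays(encode_delays):
        """Decode delay of each picture = longest encode delay minus that
        picture's own encode delay, so the slowest picture waits no longer."""
        longest = max(encode_delays.values())
        return {pic: longest - delay for pic, delay in encode_delays.items()}

    # From the FIG. 3 example: "I00" has a 6 s encode delay and "B13" the
    # longest one of 10 s, giving decode delays of 4 s and 0 s respectively.
    print(calc_decode_delays({"I00": 6, "B13": 10}))  # {'I00': 4, 'B13': 0}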
Then, the delay calculation unit 11a creates decode delay information D10 from the decode delays thus calculated and sends it to the header addition unit 11b.
Every time the encoding unit 10 creates encoded information D2, the header addition unit 11b adds the corresponding decode delay to the encoded information D2 by placing the delay in its header, based on the decode delay information D10 supplied from the delay calculation unit 11a.
In this case, for example, the encoded information D2a (“I00” in the column “EO”) having such a decode delay added as its header is outputted when the encode delay has elapsed and is inputted into the decoding apparatus 3 through the transmission line. Note that, in FIG. 3, the transmission time through the transmission line between the encoding apparatus 2 and the decoding apparatus 3 is not considered, and therefore the input timing of encoded information D2 to the decoding apparatus 3 is regarded as the same as the output timing (in the column “EO”) of the encoded information D2 from the encoding unit 10.
The decoding apparatus 3 adjusts the output timing of the restored image information D3, which is the result of decoding the encoded information D2, based on the decode delays of the encoded information D2. That is, the decode delay (4 [seconds] in the column “Dd”) of the first encoded information D2a, for example, is recognized by the decoding control unit 21 of the decoding apparatus 3 from the header added to the encoded information D2a before the encoded information D2a is decoded. Then, the encoded information D2a is converted into restored image information D3a by the decoding unit 20, and if the time elapsed since its input (t=6) is shorter than the corresponding decode delay (4 [seconds]), its output timing is adjusted by storage in the image rearrangement buffer 126 (FIG. 9) as necessary, and the restored image information D3a is outputted when the decode delay has elapsed (4 [seconds], that is, at t=10), to be displayed on the display unit.
In addition, the other encoded information D2b to D2n are, similarly to the encoded information D2a, converted into the restored image information D3b to D3n by the decoding unit 20; these restored image information D3b to D3n are rearranged, if necessary, by the image rearrangement buffer 126 (FIG. 9) into the same picture-type order as before the encoding process (in a column “DO (Decoder Output)”), and are outputted after their output timing is adjusted based on the corresponding decode delays, which results in continuous reproduction.
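Viewed from the decoding side, the image rearrangement buffer 126 behaves like a simple scheduler: each restored picture is released once its decode delay has elapsed since the corresponding encoded information was inputted, which also yields the display order. The sketch below assumes a list of (name, input time, decode delay) tuples; only the “I00” figures (input at t=6, decode delay 4, output at t=10) come from the FIG. 3 example, and the second entry is made up for illustration.

    def display_schedule(pictures):
        """Return (output_time, name) pairs in the order in which the
        rearrangement buffer would release them: input time plus decode
        delay, in ascending order of output time."""
        return sorted((t_in + decode_delay, name)
                      for name, t_in, decode_delay in pictures)

    # "I00" is input at t=6 with a 4 s decode delay and is output at t=10.
    print(display_schedule([("I00", 6, 4), ("P01", 7, 4)]))
    # -> [(10, 'I00'), (11, 'P01')]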
The encoding control unit 11 performs the output timing notification process as described above, to thereby make the decoding apparatus 3 recognize, via the headers, the decode delays calculated on the assumption that the encoded information D2 will be decoded by the decoding apparatus 3.
Now, the output timing notification process described above is carried out following the procedure RT1 for the output timing notification process shown in FIG. 4.
Specifically, the encoding control unit 11 starts the procedure RT1 for the output timing notification process from step SP0 when predetermined operations for executing the encoding process are performed with an input unit (not shown), and at the following step SP1 finds the longest encode delay (which corresponds to 10 [seconds] in the column “Ed” in FIG. 3) out of the encode delays calculated on the encoding conditions.
Then the encoding control unit 11 stores, for example, the encoded information D2a existing in the reverse encoding unit 106 (FIG. 8) of the encoding unit 10 in the storage buffer 107 (FIG. 8) at step SP2, and judges at step SP3 whether the storage has been done successfully; if a negative result is obtained, the process returns to step SP2 to re-store the encoded information D2a.
If an affirmative result is obtained at step SP3, on the contrary, the encoding control unit 11 subtracts the encode delay (which corresponds to 6 [seconds] in the column “Ed” in FIG. 3) of the encoded information D2a stored at step SP2 from the longest encode delay found at step SP1, to thereby calculate the decode delay (which corresponds to 4 [seconds] in the column “Dd”) for the encoded information D2a, and adds the calculated decode delay to the encoded information D2a as a header at step SP4.
Then, the encoding control unit 11 judges at step SP5 whether all of the encoded information D2 has been taken into the storage buffer 107 (FIG. 8); if a negative result is obtained, it returns to step SP2 and repeats the above processes, and if an affirmative result is obtained, it moves on to step SP6, where the procedure RT1 for the output timing notification process is terminated.
As described above, the encoding control unit 11 can execute the output timing notification process following the procedure RT1.
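A rough Python rendering of procedure RT1 is shown below. The storage-buffer interface (a store method returning a success flag and an add_header method) and the shape of the pictures mapping are assumptions made for this sketch; the control flow mirrors steps SP1 to SP5 described above.

    def output_timing_notification(pictures, storage_buffer):
        """Sketch of procedure RT1: find the longest encode delay, then store
        each piece of encoded information (retrying on failure) and attach
        its decode delay as header information.
        `pictures` maps a picture name to (encoded_payload, encode_delay);
        `storage_buffer` is an assumed interface, not a real API."""
        longest = max(delay for _, delay in pictures.values())       # step SP1
        for name, (payload, encode_delay) in pictures.items():
            while not storage_buffer.store(name, payload):           # steps SP2-SP3
                pass                                                 # re-store on failure
            decode_delay = longest - encode_delay
            storage_buffer.add_header(name, {"decode_delay": decode_delay})  # step SP4
        # Step SP5: the loop ends once all encoded information has been
        # taken into the storage buffer.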
In the aforementioned construction, the encoding apparatus 2 calculates the output timing (decode delay) for the result (restored image information D3) of decoding the encoded information D2, on the assumption that the encoded information D2 obtained as a result of encoding with the JVT encoding system will be decoded on the decoding apparatus 3 side, and adds the calculated output timing as a header.
Therefore, this encoding apparatus 2 can make the decoding apparatus 3 recognize, via the headers and before the decoding process, the decode delays obtained on the assumption that the encoded information D2 will be decoded by the decoding apparatus 3, so that the output continuity of the restored image information D3 can be kept even if the encoded information D2 has been encoded with the JVT encoding system, which does not allow the decoding order to be naturally determined.
In this case, the encoding apparatus 2 calculates the output timing (decode delays) for the restored image information D3 so that the result of decoding the piece of encoded information D2 (“B13” in FIG. 3) having the longest period of time from the start of the encoding process to the output of the encoded information D2 is output immediately.
Therefore, the encoding apparatus 2 can calculate the output timing (decode delays) for the results (restored image information D3) of decoding the encoded information D2 with reference to the piece of encoded information D2 that takes the longest time to become available on the decoding side, and as a result the output timing of the decoding apparatus 3 can be adjusted (offset) so that underflow does not occur.
According to the above construction, the output timing (decode delays) for the restored image information D3 obtained by decoding the encoded information D2 is calculated on the assumption that the encoded information D2 obtained through encoding with the JVT encoding system will be decoded on the decoding apparatus 3 side, and the calculated output timing is added as headers; thereby, even if the encoded information D2 has been encoded with the JVT encoding system, which does not allow the decoding order to be naturally determined, the output continuity of the restored image information D3 can be kept, and thus the decoding apparatus 3 can perform continuous reproduction.
(3) Other Embodiments
Note that the aforementioned first embodiment has described the case where the JVT encoding system is applied. This invention, however, is not limited to this, and any other encoding system that can treat at least B-pictures as pictures to be prediction-encoded can be applied.
Further, the aforementioned first embodiment has described the case where the timing calculation means, which calculates the output timing for the results of decoding a plurality of pieces of encoded information D2 on the assumption that they will be sequentially decoded on the decoding side, calculates the decode delays so that the result of decoding the piece of encoded information D2 having the longest encode delay is output immediately. This invention, however, is not limited to this; instead of the decode delays, the period of time from when the result of decoding a piece of encoded information is obtained until its output timing may be calculated, or the decode delay for the result of decoding a piece of encoded information may be calculated based on a calculation of an assumed occupation rate of the information stored in a buffer on the decoding side. As long as the output timing can be calculated from such a calculation result, it can be calculated so as to make the decoding side perform stable continuous reproduction.
Further, the aforementioned first embodiment has described the case where the timing notification means, which notifies the decoding side of the output timing before the result of decoding the corresponding encoded information is obtained, adds the output timing (decode delay) to the corresponding encoded information D2 as its header, based on the decode delay information D10 supplied from the delay calculation unit 11a. This invention, however, is not limited to this; the decode delay information D10 may be outputted directly to the decoding side before the encoding process, without being added as a header.
Embodiment 2
(1) Construction of Image Reproduction System 51
In FIG. 5, in which the same reference numerals are applied to the parts corresponding to those in FIG. 1, reference numeral 51 denotes an image reproduction system according to the second embodiment as a whole, which is constructed by connecting an encoding apparatus 52 employing the JVT encoding system and a decoding apparatus 53 to each other via a prescribed transmission line.
The encoding apparatus 52 has an encoding unit 10 and an encoding control unit 61 for controlling the encoding unit 10; the encoding control unit 61 does not perform the aforementioned output timing notification process but carries out the other processes performed in the first embodiment. In this case, the encoding apparatus 52 performs the encoding process on the successive image information D1 with the encoding unit 10 controlled by the encoding control unit 61, to thereby successively create encoded information D2 without a decode delay added as a header, and then successively outputs the encoded information D2.
The decoding apparatus 53, on the other hand, has a decoding unit 20 and a decoding control unit 71 for controlling the decoding unit 20, and performs a decoding process on the encoded information D2 successively inputted through the transmission line, through the decoding unit 20 controlled by the decoding control unit 71, to thereby create restored image information D3, and then successively outputs the restored image information D3 to a display unit (not shown) for successive reproduction.
(2) Construction of Decoding Control Unit 71
Actually, the decoding control unit 71 of this decoding apparatus 53 temporarily stores the successively inputted encoded information D2 in the storage buffer 121 (FIG. 9) of the decoding unit 20, detects the conditions of decoding (hereinafter referred to as decoding conditions), such as the decoding order of the encoded information D2 and the start time of the decoding process, based on the headers of the encoded information D2, and can thus control the decoding unit 20 according to the decoding conditions.
In addition, the decoding control unit 71 monitors the storage state of the restored image information D3 stored in the image rearrangement buffer 126 (FIG. 9) after the decoding process, and when it detects underflow as the storage state, carries out a re-output control process for re-outputting the restored image information D3 outputted just before the detection (underflow). This re-output control process will be described using the example shown in FIG. 6.
In FIG. 6, as in FIG. 3, the column “EI” shows the picture-type order before the encoding process, the column “EO” shows the picture-type order after the encoding process, the column “DO” shows the picture-type order after decoding, the column “Ed” shows the encode delays, and the column “Dd” shows the decode delays. In addition, as in FIG. 3, the transmission time through the transmission line between the encoding apparatus 52 and the decoding apparatus 53 is not considered, and the input time of encoded information D2 to the decoding apparatus 53 is regarded as the same as the output time (in the column “EO”) of the encoded information D2 from the encoding unit 10.
In addition, a column “Sud (Start-up delay)” shows the period of time from when each piece of encoded information D2 is inputted until its decoding process is started (hereinafter, this period of time is referred to as a start-up delay); this start-up delay is calculated based on the encoding conditions by the encoding control unit 61 of the encoding apparatus 52 and is added as a header.
When the decoding control unit 71 receives the first encoded information D2a (“I00” in the column “EO”), it ignores the start-up delay for the encoded information D2a (“6” [seconds] in the column “Sud”) and performs control so that the encoded information D2a is immediately sent to the reverse decoding unit 122 and the decoding process is started (“0” [seconds] in the column “Dd”). As a result, the decoding control unit 71 can shorten the preparation time (driving time) for continuous reproduction.
In this case, due to this control, the decoding control unit 71 has a lag between the actual start time of the decoding process and the original start time (this lag has to be adjusted; specifically, it is 6 [seconds] in FIG. 6 and is hereinafter referred to as a delay amount to be adjusted). Accordingly, the restored image information D3a, which is the result of decoding the encoded information D2a, is outputted via the image rearrangement buffer 126 and then re-outputted (at the item “t=8” in the column “DO”) so as to offset a part of the delay amount to be adjusted (which corresponds to “2” [seconds] at “t=8” in the column “Dd”).
Then, the decoding control unit 71 has the encoded information D2b, D2c, . . . following the encoded information D2a (“P01”, “P02”, . . . in the column “EO”) successively inputted and decoded according to their start-up delays, and manages the storage state of the restored image information D3b, D3c, . . . (“P01”, “P02”, . . . in the column “DO”) stored in the image rearrangement buffer 126 (FIG. 9) after the decoding process.
Under this state, every time restored image information D3 fails to be stored in the image rearrangement buffer 126 (underflow), the decoding control unit 71 re-outputs the restored image information D3 outputted just before the failure (“P05” and “P10” in the column “DO”), to thereby periodically offset the rest of the delay amount to be adjusted.
In this manner, the decoding control unit 71 executes the re-output control process so as to keep the output continuity of the restored image information D3.
Now, the re-output control process described above is executed following the procedure RT2 for the re-output control process shown in FIG. 7.
That is, when predetermined operations for executing the decoding process are performed with an input unit (not shown), for example, the decoding control unit 71 starts the procedure RT2 for the re-output control process from step SP10, waits at the next step SP11 for the first encoded information D2a to be inputted, and when receiving the encoded information D2a, moves on to step SP12.
Then, the decoding control unit 71 ignores the start-up delay for the encoded information D2a and immediately starts the decoding process at step SP12, and after outputting the restored image information D3a obtained as a result of the decoding process at the next step SP13, re-outputs the restored image information D3a at step SP14 to offset a part of the delay amount to be adjusted.
Next, the decoding control unit 71 starts decoding the following encoded information D2 (D2b, D2c, . . . , D2n) and outputs the restored image information D3 (D3b, D3c, . . . , D3n) obtained by the process at step SP15; at the next step SP16 it judges whether restored image information D3 has failed to be stored in the image rearrangement buffer 126 (FIG. 9), and if a negative result is obtained, returns to step SP15 and repeats the above processes.
If an affirmative result is obtained, on the contrary, the decoding control unit 71 re-outputs the restored image information D3 (“P05” and “P10” in the column “DO”) outputted at step SP15, to offset a part (or all) of the remaining delay amount to be adjusted, and then moves on to the next step SP18.
Then, the decoding control unit 71 judges at step SP18 whether all of the delay amount to be adjusted has been offset; if a negative result is obtained, it returns to step SP15 and repeats the above processes, and if an affirmative result is obtained, it moves on to step SP19, where this procedure RT2 for the re-output control process is terminated.
In this manner, the decoding control unit 71 can carry out the re-output control process following the procedure RT2.
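For reference, the control flow of procedure RT2 can be sketched as follows. The callables decode, display and underflowed, the iterable of encoded pictures, and the idea of counting the remaining delay amount in picture intervals are all assumptions of this sketch rather than details taken from this description.

    def re_output_control(encoded_pictures, decode, display, underflowed,
                          delay_to_adjust, picture_interval):
        """Sketch of procedure RT2: decode the first picture at once, ignoring
        its start-up delay, output and re-output it, then re-output the most
        recently displayed picture whenever the rearrangement buffer
        underflows, until the delay amount to be adjusted has been offset."""
        stream = iter(encoded_pictures)
        remaining = delay_to_adjust
        picture = decode(next(stream))            # step SP12: start immediately
        display(picture)                          # step SP13
        display(picture)                          # step SP14: offset a part
        remaining -= picture_interval
        for encoded in stream:                    # step SP15
            picture = decode(encoded)
            display(picture)
            if remaining > 0 and underflowed():   # step SP16
                display(picture)                  # re-output the last picture shown
                remaining -= picture_interval     # step SP18: track the offset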
According to the above construction, this decoding apparatus 53 temporarily stores, in the image rearrangement buffer 126 (FIG. 9), the restored image information D3 successively created by performing the decoding process on the encoded information D2 encoded with the JVT encoding system, and if restored image information D3 fails to be stored in the image rearrangement buffer 126 (FIG. 9), re-outputs the restored image information D3 outputted just before the failure.
Therefore, the decoding apparatus 53 can keep the output continuity of the restored image information D3 even if the encoded information D2 has been encoded with the JVT encoding system, which does not allow a decoding order to be naturally determined.
In this case, the decoding apparatus 53 ignores the decoding start time (start-up delay) set for the first encoded information D2a stored in the storage buffer 121 (FIG. 9) and immediately starts decoding the encoded information D2a, and when restored image information D3 fails to be stored in the image rearrangement buffer 126 (FIG. 9), it offsets the lag (delay amount to be adjusted) from the set decoding start time, which arose from this ignoring, by re-outputting the restored image information outputted just before the failure.
As a result, the decoding apparatus 53 can shorten the preparation time (driving time) for continuous reproduction and can keep the output continuity of the restored image information while periodically dispersing the lag (delay amount to be adjusted) that arises from this shortening.
According to the aforementioned construction, the restored image information D3 sequentially created by performing the decoding process on the encoded information D2 encoded with the JVT encoding system is temporarily stored in the image rearrangement buffer 126 (FIG. 9), and when restored image information D3 fails to be stored in the image rearrangement buffer 126 (FIG. 9), the restored image information D3 outputted just before the failure is re-outputted; thus, even if the encoded information D2 has been encoded with the JVT encoding system, which does not allow a decoding order to be naturally determined, the output continuity of the restored image information D3 can be kept, which results in continuous reproduction.
(3) Other Embodiment
Note that the aforementioned second embodiment has described the case where the JVT encoding system is applied. This invention, however, is not limited to this, and any other encoding system that can treat at least B-pictures as pictures to be prediction-encoded can be applied.
Further, the aforementioned second embodiment independently uses the storage buffer 121 and the image rearrangement buffer 126 as storage means for temporarily storing the encoded information and the restored image information sequentially created by performing the decoding process on the encoded information. This invention, however, is not limited to this, and the storage buffer 121 and the image rearrangement buffer 126 may be used in common. In this case, the number of buffers and the manner of storage can be changed as necessary.
Still further, the aforementioned second embodiment has described the case where the decoding control unit 71, serving as output control means, ignores the decoding start time (start-up delay) set for the first encoded information D2a stored in the storage buffer 121 (FIG. 9) and immediately starts decoding the encoded information D2a, and when restored image information D3 fails to be stored in the image rearrangement buffer 126 (FIG. 9), offsets the lag (delay amount to be adjusted) from the set decoding start time, which arose from this ignoring, by re-outputting the restored image information outputted just before the failure. In addition to this, if the storing order (in the column “EO”) of any of the encoded information D2a to D2n stored in the storage buffer 121 differs from the order before encoding (in the column “EI”) (for example, at the item “t=32” in FIG. 6), the restored image information D3 corresponding to the encoded information D2 having the different order may be re-outputted. By doing so, a decode delay that becomes longer when such a difference in order arises can be filled by the re-output, which results in more assured output continuity of the restored image information D3.
Industrial Applicability
This invention can be used in the case of transmitting successive image information via a network medium such as satellite broadcasting, cable TV or the Internet, or in the case of processing successive image information on a storage medium such as an optical disc, magnetic disk or flash memory.
Description of Reference Numerals
  • 1, 51 . . . IMAGE REPRODUCTION SYSTEM, 2, 52 . . . ENCODING APPARATUS, 3, 53 . . . DECODING APPARATUS, 10 . . . ENCODING UNIT, 11, 61 . . . ENCODING CONTROL UNIT, 11a . . . DELAY CALCULATION UNIT, 11b . . . HEADER ADDITION UNIT, 20 . . . DECODING UNIT, 21, 71 . . . DECODING CONTROL UNIT

Claims (7)

1. An encoding apparatus for executing an encoding process with an encoding system capable of treating at least B-pictures as pictures for inter-prediction-encoding, the encoding apparatus comprising:
longest delay calculation means for calculating a longest picture encoding delay for encoding the pictures in the encoding apparatus based on encode conditions, wherein each of the pictures is successively inputted to the encoding apparatus, each of the pictures is successively encoded in an encoding process, a separate encoding delay value is calculated for each of the pictures which indicates a difference in time between a respective picture being inputted and the respective picture being output from the encoding process, and the longest picture encoding delay is the encoding delay value having the highest value from among the plurality of pictures;
timing calculation means for calculating output timing for results of decoding the encoded information based on the longest picture encoding delay; and
timing notification means for notifying said decoding side of output timing calculated by said timing calculation means before a result of decoding corresponding encoded information is obtained.
2. The encoding apparatus according to claim 1, wherein
said timing calculation means calculates each unit of output timing for the results of decoding the encoded information based on the longest picture encoding delay.
3. An encoding method, implemented on an encoding apparatus, of executing an encoding process with an encoding system capable of treating at least B-pictures as pictures for inter-prediction-encoding, said encoding method comprising:
calculating a longest picture encoding delay for encoding the pictures in the encoding apparatus based on encode conditions, wherein each of the pictures is successively inputted to the encoding apparatus, each of the pictures is successively encoded in an encoding process, a separate encoding delay value is calculated for each of the pictures which indicates a difference in time between a respective picture being inputted and the respective picture being output from the encoding process, and the longest picture encoding delay is the encoding delay value having the highest value from among the plurality of pictures;
calculating output timing for results of decoding the encoded information based on the longest picture encoding delay; and
notifying said decoding side of output timing calculated in said first step before a result of decoding corresponding encoded information is obtained.
4. The encoding method according to claim 3, further comprising:
calculating each unit of output timing for the results of decoding the encoded information based on the longest picture encoding delay.
5. An encoding apparatus for executing an encoding process with an encoding system capable of treating at least B-pictures as pictures for inter-prediction-encoding, the encoding apparatus comprising:
a longest encoding delay calculator that calculates a longest encoding delay for encoding the pictures in the encoding apparatus based on encode conditions, wherein each of the pictures is successively inputted to the encoding apparatus, each of the pictures is successively encoded in an encoding process, a separate encoding delay value is calculated for each of the pictures which indicates a difference in time between a respective picture being inputted and the respective picture being output from the encoding process, and the longest picture encoding delay is the encoding delay value having the highest value from among the plurality of pictures;
a timing calculator that calculates output timing for results of decoding the encoded information based on the longest picture encoding delay; and
a timing notifier that notifies said decoding side of output timing calculated by said timing calculator before a result of decoding corresponding encoded information is obtained.
6. The encoding apparatus according to claim 5, wherein
said timing calculator calculates each unit of output timing for the results of decoding the encoded information based on the longest picture encoding delay.
7. The encoding apparatus according to claim 1, wherein the timing calculation means calculates the output timing for the results of decoding the encoded information by subtracting the calculated encoding delay value for each picture from the longest picture encoding delay value.
US13/046,341 2002-07-19 2011-03-11 Encoding device, encoding method, decoding device, and decoding method Expired - Fee Related US8228988B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US13/046,341 US8228988B2 (en) 2002-07-19 2011-03-11 Encoding device, encoding method, decoding device, and decoding method

Applications Claiming Priority (7)

Application Number Priority Date Filing Date Title
JP2002-211829 2002-07-19
JP2002211829 2002-07-19
JP2003276326A JP4806888B2 (en) 2002-07-19 2003-07-17 Decoding device and decoding method
JP2003-276326 2003-07-17
PCT/JP2003/009138 WO2004010707A1 (en) 2002-07-19 2003-07-18 Encoding device, encoding method, decoding device, and decoding method
US10/519,840 US8259799B2 (en) 2002-07-19 2003-07-18 Encoding device, encoding method, decoding device, and decoding method
US13/046,341 US8228988B2 (en) 2002-07-19 2011-03-11 Encoding device, encoding method, decoding device, and decoding method

Related Parent Applications (3)

Application Number Title Priority Date Filing Date
PCT/JP2003/009138 Continuation WO2004010707A1 (en) 2002-07-19 2003-07-18 Encoding device, encoding method, decoding device, and decoding method
US10519840 Continuation 2003-07-18
US10/519,840 Continuation US8259799B2 (en) 2002-07-19 2003-07-18 Encoding device, encoding method, decoding device, and decoding method

Publications (2)

Publication Number Publication Date
US20110158318A1 US20110158318A1 (en) 2011-06-30
US8228988B2 true US8228988B2 (en) 2012-07-24

Family

ID=30772222

Family Applications (2)

Application Number Title Priority Date Filing Date
US10/519,840 Expired - Fee Related US8259799B2 (en) 2002-07-19 2003-07-18 Encoding device, encoding method, decoding device, and decoding method
US13/046,341 Expired - Fee Related US8228988B2 (en) 2002-07-19 2011-03-11 Encoding device, encoding method, decoding device, and decoding method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US10/519,840 Expired - Fee Related US8259799B2 (en) 2002-07-19 2003-07-18 Encoding device, encoding method, decoding device, and decoding method

Country Status (6)

Country Link
US (2) US8259799B2 (en)
EP (1) EP1536651A4 (en)
JP (1) JP4806888B2 (en)
KR (1) KR100960821B1 (en)
CN (1) CN1669331A (en)
WO (1) WO2004010707A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8369402B2 (en) 2004-06-17 2013-02-05 Canon Kabushiki Kaisha Apparatus and method for prediction modes selection based on image formation
JP4247680B2 (en) * 2004-07-07 2009-04-02 ソニー株式会社 Encoding apparatus, encoding method, encoding method program, and recording medium recording the encoding method program
US8483289B2 (en) 2008-04-17 2013-07-09 Broadcom Corporation Method and system for fast channel change
JP2006157481A (en) 2004-11-30 2006-06-15 Canon Inc Image coding apparatus and method thereof
CN101888550A (en) * 2010-06-28 2010-11-17 中兴通讯股份有限公司 Encoding method and device of quantization parameters in slice head information
BR122020013609B1 (en) * 2011-03-11 2023-02-23 Sony Corporation DEVICE AND METHOD OF IMAGE PROCESSING
US10817224B2 (en) 2016-06-23 2020-10-27 Qualcomm Incorporated Preemptive decompression scheduling for a NAND storage device

Patent Citations (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5127000A (en) 1989-08-09 1992-06-30 Alcatel N.V. Resequencing system for a switching node
US5481543A (en) * 1993-03-16 1996-01-02 Sony Corporation Rational input buffer arrangements for auxiliary information in video and audio signal processing systems
JPH0818953A (en) 1994-07-01 1996-01-19 Hitachi Ltd Dynamic picture decoding display device
JPH08149464A (en) 1994-11-25 1996-06-07 Graphics Commun Lab:Kk Image decoder provided with frame rate conversion function
JPH08205146A (en) 1995-01-25 1996-08-09 Hitachi Denshi Ltd Time stamp value computing method in encoding transmission system
US6320909B1 (en) * 1995-05-09 2001-11-20 Mitsubishi Denki Kabushiki Kaisha Picture decoding and display unit including a memory having reduce storage capacity for storing pixel data
JPH099258A (en) 1995-06-20 1997-01-10 Hitachi Ltd Decoding device for encoded video data
JPH0937249A (en) 1995-07-20 1997-02-07 Hitachi Ltd Decoding processing method for coded video signal and decoder using it
JPH09247670A (en) 1996-03-13 1997-09-19 Toshiba Corp Information multiplexer
US6584152B2 (en) * 1997-04-04 2003-06-24 Avid Technology, Inc. Computer system and process for capture, editing and playback of motion video compressed using interframe and intraframe techniques
US6504576B2 (en) 1997-06-25 2003-01-07 Sony Corporation Digital signal coding method and apparatus, signal recording medium, and signal transmission method for recording a moving picture signal and an acoustic signal
US20010001023A1 (en) 1997-09-25 2001-05-10 Sony Corporation Encoded stream generating apparatus and method, data transmission system and method, and editing system and method
JPH11122113A (en) 1997-10-09 1999-04-30 Sony Corp Data decoder and its method
JP2000069477A (en) 1997-10-31 2000-03-03 Matsushita Electric Ind Co Ltd Picture signal data structure, picture encoding method and picture decoding method
US20020012399A1 (en) 1997-10-31 2002-01-31 Takahiro Nishi Image signal, decoding method, decoding apparatus, data storage medium, and computer program implementing a display cycle identifier
US6963608B1 (en) * 1998-10-02 2005-11-08 General Instrument Corporation Method and apparatus for providing rate control in a video encoder
US6285405B1 (en) 1998-10-14 2001-09-04 Vtel Corporation System and method for synchronizing data signals
JP2000228768A (en) 1999-02-05 2000-08-15 Sony Corp Digital signal transmitter, its method and served medium
US6795498B1 (en) 1999-05-24 2004-09-21 Sony Corporation Decoding apparatus, decoding method, encoding apparatus, encoding method, image processing system, and image processing method
JP2000350183A (en) 1999-06-03 2000-12-15 Canon Inc Remote control system for image pickup device
JP2001238110A (en) 2000-02-24 2001-08-31 Sony Corp Imaging apparatus, image pickup method and recording medium
US6654421B2 (en) 2000-03-02 2003-11-25 Hideyoshi Tominaga Apparatus, method and computer program product for transcoding a coded multiplexed sound and moving picture sequence
WO2001089227A1 (en) 2000-05-15 2001-11-22 Nokia Corporation Video coding
JP2002091424A (en) 2000-09-20 2002-03-27 Matsushita Electric Ind Co Ltd Device and method for displaying picture
EP1195996A2 (en) 2000-10-05 2002-04-10 Kabushiki Kaisha Toshiba Apparatus, method and computer program product for decoding and reproducing moving images, time control method and multimedia information receiving apparatus
US7706445B2 (en) 2001-05-31 2010-04-27 Sanyo Electric Co., Ltd. Image processing employing picture type conversion
JP2004056232A (en) 2002-07-16 2004-02-19 Nippon Telegr & Teleph Corp <Ntt> Moving picture encoding method, moving picture decoding method, image encoder, image decoder, moving picture encoding program, moving picture decoding program, and recording medium for the programs
JP2004180266A (en) 2002-10-03 2004-06-24 Ntt Docomo Inc Video decoding method, video decoding apparatus, and video decoding program

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
European Office Action dated Mar. 22, 2012, in corresponding application 03 765 323.5.
Fujiwara, Hiroshi, "Saishin MPEG Kyokasho," Ascii Corp., pp. 236-237, Aug. 1, 1994.
Japanese Office Action dated Jan. 17, 2012.
Supplementary European Search Report dated Nov. 30, 2010.

Also Published As

Publication number Publication date
KR20050028017A (en) 2005-03-21
WO2004010707A1 (en) 2004-01-29
KR100960821B1 (en) 2010-06-07
US20050238099A1 (en) 2005-10-27
US20110158318A1 (en) 2011-06-30
EP1536651A4 (en) 2010-12-29
CN1669331A (en) 2005-09-14
EP1536651A1 (en) 2005-06-01
JP2004056827A (en) 2004-02-19
JP4806888B2 (en) 2011-11-02
US8259799B2 (en) 2012-09-04

Similar Documents

Publication Publication Date Title
US8228988B2 (en) Encoding device, encoding method, decoding device, and decoding method
JP3593988B2 (en) Moving picture signal compression apparatus and method
KR100289852B1 (en) Image coding method, image coding apparatus and image recording medium
KR100950743B1 (en) Image information coding device and method and image information decoding device and method
EP1083750A2 (en) Method and apparatus for transcoding coded video image data
US20070291131A1 (en) Apparatus and Method for Controlling Image Coding Mode
US20020057739A1 (en) Method and apparatus for encoding video
JPH08223577A (en) Moving image coding method and device therefor and moving image decoding method and device therefor
JP2003116104A (en) Information processing apparatus and information processing method
EP0878967A2 (en) Signal coding apparatus and method
JPH0818979A (en) Image processor
JP3852366B2 (en) Encoding apparatus and method, decoding apparatus and method, and program
US6271774B1 (en) Picture data processor, picture data decoder and picture data encoder, and methods thereof
JP4799547B2 (en) Encoding method and encoding apparatus for picture sequence using predictive picture and non-predictive picture each including multi-macroblock
JPH0715729A (en) Image coding method and circuit, device therefor and optical disk
JPH06339111A (en) Compressed moving picture reproduction device
KR20010075389A (en) Device for encoding motion picture signals and encoding method
JPH06276481A (en) Picture signal coding and decoding method and recording medium
JP2002218470A (en) Method for converting image encoded data rate and device for converting image encoding rate
JP5007759B2 (en) Encoding apparatus, encoding method, decoding apparatus, and decoding method
JP3480980B2 (en) Image signal transmission method and apparatus, and image signal decoding method and apparatus
JP3481207B2 (en) Image signal transmission method and apparatus, and image signal decoding method and apparatus
WO2004030367A1 (en) Moving picture data stream conversion device and method
JP2000115777A (en) Image processing method and image processing unit
JPH06276504A (en) Method and device for picture signal coding, method and device for picture decoding

Legal Events

Date Code Title Description
STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20240724